Marvell Blogs

Latest Marvell Blog Articles

  • December 02, 2025

    5 Times More Queries per Second: What CXL Compute Accelerators Can Do for AI

    By Khurram Malik, Senior Director of Marketing, Custom Cloud Solutions, Marvell

    Near-memory compute technologies have always been compelling. They can offload tasks from CPUs to boost utilization and revenue opportunities for cloud providers. They can reduce data movement, one of the primary contributors to power consumption,1 while also increasing memory bandwidth for better performance.

    They have also only been deployed sporadically; thermal problems, a lack of standards, cost and other issues have kept these ideas from delivering the Goldilocks combination of features needed to jumpstart commercial adoption.2

    This picture is now changing with CXL compute accelerators, which leverage open standards, familiar technologies and a broad ecosystem. In a demonstration at OCP 2025, Samsung Electronics, software-defined composable solution provider Liqid, and Marvell showed how CXL accelerators can deliver outsized gains in performance.

    The Liqid EX5410C, featured in the demonstration, is a CXL memory pooling and sharing appliance capable of scaling up to 20TB of additional memory. Five of the 4RU appliances can then be integrated into a pod for a whopping 100TB of memory and 5.1Tbps of additional memory bandwidth. The CXL fabric is managed by Liqid’s Matrix software, which enables real-time and precise memory deployment based on workload requirements.
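    To make the memory-pooling idea concrete: on Linux, CXL-attached (Type 3) expansion memory is typically surfaced to software as a CPU-less, memory-only NUMA node, so applications can target it with standard NUMA tooling rather than CXL-specific code. The sketch below shows how a host might discover such nodes by reading sysfs; it is illustrative only, assumes a stock Linux kernel, and does not model the EX5410C appliance or Liqid’s Matrix software.

    # Minimal sketch, assuming a Linux host where CXL expansion memory appears
    # as a CPU-less ("memory-only") NUMA node. Nothing here is specific to the
    # Liqid EX5410C or Matrix software; it only shows how such memory is
    # typically visible to ordinary applications.
    from pathlib import Path

    def memory_only_nodes():
        """Return NUMA node IDs that expose memory but no CPUs."""
        nodes = []
        for node_dir in sorted(Path("/sys/devices/system/node").glob("node[0-9]*")):
            cpulist = (node_dir / "cpulist").read_text().strip()
            # First meminfo line looks like: "Node 2 MemTotal: 1048576 kB"
            mem_kb = int((node_dir / "meminfo").read_text().splitlines()[0].split()[-2])
            if cpulist == "" and mem_kb > 0:
                nodes.append(int(node_dir.name[len("node"):]))
        return nodes

    if __name__ == "__main__":
        # A process could then bind allocations to these nodes, for example with
        # `numactl --membind=<id>` or libnuma, to use the pooled capacity.
        print("Memory-only (likely CXL) NUMA nodes:", memory_only_nodes())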

  • December 02, 2025

    Marvell Makes Inaugural 100 Best Companies in Southeast Asia List

    By Vienna Alexander, Marketing Content Professional, Marvell

    Marvell Forbes Southeast Asia Award

    Great Place to Work, in combination with Fortune, launched the first-ever 100 Best Companies to Work For in Southeast Asia list. Marvell made the list based on its 2025-2026 Great Place to Work Trust Index™ Survey results for Vietnam, where it earned an impressive 95% rating and strongly positive employee sentiment around community.

    Marvell continues to invest in being a great place to work. Recently, Marvell expanded its operations in Vietnam with three new offices that opened in September 2025. Vietnam is a fast-growing innovation hub at Marvell, alongside another Marvell Southeast Asia location, Singapore.

    The “100 Best Companies to Work For” accolade recognizes organizations that build cultures where trust, innovation, and performance can thrive. It was compiled from comprehensive survey data gathered from employees in the region. The list is presented by Great Place to Work, the global authority on organizational culture and employee experience, and Fortune, a global leader in recognizing innovation in workplaces.

  • November 30, 2025

    Marvell Earns “Fittest Firm” Title in Silicon Valley Turkey Trot for 10th Consecutive Year

    By Vienna Alexander, Marketing Content Professional, Marvell

    Marvell is proud to celebrate its 10th consecutive year as the “Fittest Firm” in the Silicon Valley Turkey Trot. Since 2016, Marvell has sponsored the competition and consistently earned this distinction for having the highest employee participation among large firms.

    The Silicon Valley Turkey Trot is the largest Thanksgiving Day race in the United States. Embracing the spirit of giving, the event donates all proceeds to four local non-profit organizations, including Healthier Kids Foundation, HomeFirst, and Second Harvest of Silicon Valley. The race has contributed more than $13 million and provided more than 10 million meals to these causes since its inception in 2005.

    This year, on Thanksgiving morning, more than 700 Marvell employees and their families joined the race to support these local organizations and stay active during the holiday season.

    “We’re incredibly proud to have so many employees participate in this meaningful event year after year,” said Chris Koopmans, Marvell President and Chief Operating Officer. “Marvell has supported the Silicon Valley Turkey Trot for a long time, and we’re honored to contribute to such worthwhile organizations in our community. We care deeply about promoting physical and mental well-being, and it’s inspiring to see our team come together in support of such an important cause.”

  • November 20, 2025

    The Next Step for AI Storage: GPU-initiated and CPU-initiated Storage

    By Chander Chadha, Director of Marketing, Flash Storage Products, Marvell

    AI is all about dichotomies. Distinct computing architectures and processors have been developed for training and inference workloads. In the past two years, scale-up and scale-out networks have emerged.

    Soon, the same will happen in storage.

    The needs of AI infrastructure are prompting storage companies to develop SSDs, controllers, NAND and other technologies fine-tuned to support GPUs—with an emphasis on higher IOPS (input/output operations per second) for AI inference—that will be fundamentally different from those for CPU-connected drives, where latency and capacity are the bigger focus points. This bifurcation also likely won’t be the last; expect to see drives further optimized specifically for training or for inference.

    As in other technology markets, the changes are being driven by the rapid growth of AI and the equally rapidly growing need to boost the performance, efficiency and TCO of AI infrastructure. The total amount of SSD capacity inside data centers is expected to double to approximately 2 zettabytes by 2028, with the growth primarily fueled by AI.1 By that year, SSDs will account for 41% of the installed base of data center drives, up from 25% in 2023.1

    Greater storage capacity, however, also potentially means more storage network complexity, latency, and storage management overhead. It also potentially means more power. In 2023, SSDs accounted for 4 terawatt hours (TWh) of data center power, or around 25% of the 16 TWh consumed by storage. By 2028, SSDs are slated to account for 11 TWh, or 50%, of storage’s expected total for the year.1

    While storage represents less than five percent of total data center power consumption, the total remains large and provides an incentive for savings. Reducing storage power by even 1 TWh, or less than 10%, would save enough electricity to power 90,000 US homes for a year.2 Finding the precise balance between capacity, speed, power and cost will be critical for both AI data center operators and customers. Creating different categories of technologies is the first step toward optimizing products in a way that scales.
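    A rough way to see the IOPS-versus-latency distinction drawn above between GPU-oriented and CPU-oriented drives is to time many small random reads and report both the operations completed per second and the average time each read takes. The sketch below is illustrative only: the file path, block size and duration are assumptions, not Marvell parameters, and because it does not bypass the page cache (as tools like fio do with O_DIRECT), its absolute numbers will flatter the drive.

    # Illustrative sketch: measure random-read IOPS and average per-read latency
    # on a scratch file. All parameters are assumptions for demonstration,
    # not figures from the blog post or any Marvell product.
    import os
    import random
    import time

    PATH = "/tmp/iops_scratch.bin"    # hypothetical scratch file
    BLOCK = 4096                      # 4 KiB reads, a common IOPS test size
    DURATION = 5.0                    # seconds to run the measurement

    def make_scratch_file(size=256 * 1024 * 1024):
        """Create a 256 MiB file of random data if it is not already there."""
        if not os.path.exists(PATH) or os.path.getsize(PATH) < size:
            with open(PATH, "wb") as f:
                f.write(os.urandom(size))

    def random_read_benchmark():
        """Issue random 4 KiB reads for DURATION seconds; report IOPS and latency."""
        blocks = os.path.getsize(PATH) // BLOCK
        fd = os.open(PATH, os.O_RDONLY)
        ops, total_latency = 0, 0.0
        deadline = time.perf_counter() + DURATION
        try:
            while time.perf_counter() < deadline:
                offset = random.randrange(blocks) * BLOCK
                t0 = time.perf_counter()
                os.pread(fd, BLOCK, offset)          # one small random read
                total_latency += time.perf_counter() - t0
                ops += 1
        finally:
            os.close(fd)
        print(f"~{ops / DURATION:,.0f} IOPS, "
              f"~{1e6 * total_latency / ops:.1f} us average read latency")

    if __name__ == "__main__":
        make_scratch_file()
        random_read_benchmark()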

  • November 20, 2025

    Video Series: The Future of Optical Technology

    By Vienna Alexander, Marketing Content Professional, Marvell

    Optical connectivity is the backbone of AI servers and an expanding opportunity where Marvell shines, given its comprehensive optical connectivity portfolio.

    Marvell showcased its notable developments at ECOC, the European Conference on Optical Communication, alongside various companies contributing to the hardware needed for this AI era.

    Learn more about the optical innovations enabling AI infrastructure, plus the trends shaping the market.
