govindhtech · 4 months
SK Hynix Platinum P41 SSD Sparks Up COMPUTEX Taipei 2024
SK Hynix Platinum P41 SSD
Beginning in June, SK Hynix showcased its cutting-edge AI memory solutions at COMPUTEX Taipei 2024. Held under the theme “Connecting AI,” COMPUTEX Taipei 2024, one of Asia’s leading IT exhibitions, drew about 1,500 international exhibitors, including tech firms, venture capital firms, and accelerators. SK Hynix made its debut at the event, highlighting through its array of next-generation products that it is a first mover and leader in AI memory.
“Connecting AI” with top AI memory solutions
SK Hynix’s booth showcased its cutting-edge AI server solutions, innovative technologies for on-device AI PCs, and exceptional consumer SSD products, all under the theme “Memory, The Power of AI.”
High bandwidth and capacity memory solutions from SK Hynix are tailored for AI systems.
Among the AI server solutions on show was HBM3E, the fifth generation of HBM. With its enormous capacity, superior heat dissipation, and industry-leading data processing speed of 1.18 terabytes (TB) per second, HBM3E is designed to satisfy the demands of AI servers and other applications.
High Bandwidth Memory
A high-performance, high-value product that connects numerous DRAMs vertically using through-silicon via (TSV) to achieve substantially faster data processing speeds than current DRAMs.
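As a rough check on the 1.18 TB/s figure quoted above, the headline bandwidth follows from the per-pin data rate times the interface width. The 9.2 Gb/s pin speed and 1,024-bit interface below are assumptions based on typical published HBM3E specifications, not figures stated in this article.

```python
# Back-of-the-envelope check on HBM3E's 1.18 TB/s bandwidth figure.
# PIN_SPEED_GBPS and IO_PINS are assumed values (typical HBM3E specs),
# not numbers taken from this article.

PIN_SPEED_GBPS = 9.2   # data rate per I/O pin, in gigabits per second (assumed)
IO_PINS = 1024         # width of the HBM stack interface, in bits (assumed)

bandwidth_gb_s = PIN_SPEED_GBPS * IO_PINS / 8   # convert bits to bytes
bandwidth_tb_s = bandwidth_gb_s / 1000          # GB/s -> TB/s

print(f"{bandwidth_tb_s:.2f} TB/s")  # prints "1.18 TB/s"
```

Under these assumptions the arithmetic lands almost exactly on the quoted figure, which is why the wide TSV-stacked interface, rather than raw pin speed alone, is what sets HBM apart from conventional DRAM.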
CXL is another technology that has proven essential for AI servers, as it can boost processing power and system bandwidth. SK Hynix demonstrated the strength of its CXL portfolio by introducing its CXL Memory Module-DDR5 (CMM-DDR5), which dramatically increases system bandwidth and capacity compared to systems equipped only with DDR5. The server DRAM products DDR5 RDIMM and MCR DIMM were among the other AI server solutions on exhibit. In particular, SK Hynix debuted its high-capacity 128-gigabyte (GB) MCR DIMM at an exhibition for the first time.
Compute Express Link
High-performance computing systems are built using this next-generation PCIe-based interconnect technology.
Double Data Rate 5
A server DRAM that provides improved bandwidth and power efficiency over the previous generation, DDR4, to successfully manage the growing needs of larger and more complex data workloads.
Registered Dual In-line Memory Module
The Registered Dual In-line Memory Module (RDIMM) is a server memory module that places a register (buffer) between the DRAM and the memory controller, improving signal integrity and enabling higher memory densities.
Multiplexer Combined Ranks Dual In-line Memory Module
The Multiplexer Combined Ranks Dual In-line Memory Module (MCR DIMM) is a module product that combines multiple DRAMs on a board and improves speed by operating the module’s two ranks, its basic information-processing units, simultaneously.
Additionally, several of the company’s enterprise SSDs (eSSDs), such as the PS1010 and PE9010, were on display at the stand. In particular, the fast sequential read speed of the PCIe Gen5-based PS1010 makes it well suited to AI, big data, and machine learning applications. Furthermore, SK Hynix’s U.S. subsidiary Solidigm enhanced the eSSD range by introducing its flagship devices, the D5-P5430 and D5-P5336, which offer ultra-high capacities of up to 30.72 TB and 61.44 TB, respectively.
Peripheral Component Interconnect Express 
PCIe, or Peripheral Component Interconnect Express, is a high-speed serial input/output interface used on the motherboards of digital devices.
SK Hynix demonstrated its ground-breaking memory solutions for on-device AI PCs, in keeping with the expanding on-device AI trend. Among them was PCB01, a PCIe Gen5 client SSD for on-device AI that has the fastest sequential read and write speeds in the industry at 14 gigabytes per second (GB/s) and 12 GB/s, respectively. In addition to PCB01, SK Hynix showcased other products designed for on-device AI PCs, such as GDDR7, the next-generation graphics DRAM, and LPCAMM2, a module solution that can replace two DDR5 SODIMMs while delivering the same performance.
On-device AI
Unlike cloud-based AI services, which depend on a distant cloud server, on-device AI performs AI computation and inference directly on devices such as PCs or smartphones.
Low Power Compression Attached Memory Module 2
The Low Power Compression Attached Memory Module 2 (LPCAMM2) is an LPDDR5X-based module solution that provides high performance, power efficiency, and space savings.
Small Outline Dual In-line Memory Module
A Small Outline Dual In-line Memory Module (SODIMM) is a DRAM module smaller than the standard DIMM used in desktop computers, commonly found in laptops and other space-constrained systems.
Platinum P41 SSD
Attendees could also get a look at SK Hynix’s consumer SSDs. The highly reliable consumer SSDs Platinum P51 and Platinum P41, which provide great speed to improve PC performance, were on exhibit at the event. The Platinum P51, which is scheduled for mass production later in 2024, makes use of SK Hynix’s “Aries,” the industry’s first high-performance in-house controller. With sequential read and write speeds of up to 14 GB/s and 12 GB/s, respectively, the Platinum P51 roughly doubles the speed of its predecessor, the Platinum P41. At these speeds, the LLMs used for AI training and inference can be loaded in about a second.
Large language model
Large language models (LLMs) are trained on massive data sets to generate, summarise, and translate text for generative AI tasks.
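To see why those read speeds translate into near-instant model loading, a quick estimate helps. The 7-billion-parameter FP16 model below is an illustrative assumption, not a model named in the article.

```python
# Rough estimate of LLM load time at the Platinum P51's sequential read speed.
# The model size is a hypothetical example (7B parameters stored as FP16).

params = 7e9               # number of model parameters (assumed example)
bytes_per_param = 2        # FP16 weights take 2 bytes each
model_gb = params * bytes_per_param / 1e9   # 14 GB of weights

read_speed_gb_s = 14       # P51 sequential read speed (from the article)
load_seconds = model_gb / read_speed_gb_s

print(f"~{model_gb:.0f} GB model loads in about {load_seconds:.1f} s")
```

For this hypothetical model the weights alone saturate the drive for roughly one second, so load time scales linearly with model size: a model twice as large would take about two seconds.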
At the presentation, a revised version of the portable SSD Beetle X31 was also revealed. With a USB 3.2 Gen 2 interface, this small and fashionable SSD can operate at a speedy 10 gigabits per second (Gbps). In the third quarter of 2024, SK Hynix intends to release the higher-capacity 2 TB version in addition to the current 512 GB and 1 TB variants.
SK Hynix Platinum P41
Earlier in May, SK Hynix’s stick-type SSD Tube T31 and Haechi H02, a heat sink for the Platinum P41, received prominent 2024 Red Dot Design Awards, highlighting the SSD lineup’s exceptional design.
Finally, SK Hynix’s ESG strategy, which includes enhancing the energy efficiency of its products, was showcased at the show. The company’s selection of energy-efficient products is especially well-suited to enhancing the sustainability of AI applications, which have high power requirements.
Maintaining the Advancement of AI Memory
In keeping with the COMPUTEX Taipei 2024 theme, SK Hynix is working to advance its technology and contribute to realising a world where “Connecting AI” is commonplace. To strengthen its AI memory capabilities, the company will continue taking part in international conferences that showcase the newest developments in the sector.
Read more on Govindhtech.com
govindhtech · 6 months
Samsung Unveils CXL Memory Pooling Technology
CXL memory pooling
Samsung unveiled its state-of-the-art CXL DRAM memory pooling product, the CXL Memory Module – Box (CMM-B), to highlight the expanding momentum in the CXL ecosystem. The Samsung CMM-B has a capacity of up to two terabytes (TB) and can hold eight CMM-D devices in the E3.S form factor. AI, in-memory databases (IMDB), data analytics, and other applications requiring large memory capacities may benefit from this enormous memory capacity combined with high performance: up to 60 gigabytes per second (GB/s) of bandwidth and a latency of 596 nanoseconds (ns).
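A quick sanity check on the figures above, assuming the 2 TB box capacity is split evenly across the eight E3.S modules:

```python
# Per-module capacity implied by the CMM-B figures quoted above,
# assuming the 2 TB total is divided evenly across eight CMM-D devices.

box_capacity_tb = 2    # up to 2 TB per CMM-B (from the article)
modules = 8            # eight E3.S CMM-D devices (from the article)

per_module_gb = box_capacity_tb * 1024 / modules
print(f"{per_module_gb:.0f} GB per CMM-D module")  # prints "256 GB per CMM-D module"
```

That works out to 256 GB per CMM-D device, well beyond what a single conventional DIMM slot typically provides, which is the point of pooling memory behind CXL.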
Samsung also showcased the first Rack-Level memory solution for highly scalable and composable disaggregated infrastructure in collaboration with Supermicro, a pioneer in Plug and Play Rack-Scale IT solutions worldwide. Unlike typical designs, which lack the flexibility and efficiency required for current applications, this innovative solution makes use of Samsung’s CMM-B to enhance memory capacity and bandwidth, allowing data centres to manage demanding workloads. Applications like AI, IMDB, data analytics, and more that need high-capacity memory may benefit from the enhanced memory capacity and high-performance of up to 60GB/s bandwidth per server.
Samsung CXL memory expander
Reiterating its leadership in high-performance and high-capacity solutions for AI applications, Samsung Electronics, a global leader in advanced semiconductor technology, announced the extension of its Compute Express Link (CXL) memory module portfolio and demonstrated its most recent HBM3E technology.
What is CXL memory?
Unlike DDR5 or HBM2E, CXL memory, usually referred to as CXL-attached memory, isn’t a particular kind of memory. Rather, it is a concept made possible by the Compute Express Link (CXL) interconnect standard.
CXL memory controller
SangJoon Hwang, Corporate Executive Vice President, Head of DRAM Product and Technology at Samsung Electronics, and Jin-Hyeok Choi, Corporate Executive Vice President, Device Solutions Research America Memory at Samsung Electronics, took centre stage to announce the latest memory solutions and discuss how Samsung Electronics is advancing HBM and Compute Express Link (CXL) in the AI era. The event was held in front of a full house at Santa Clara’s Computer History Museum. Gunnar Hellekson, vice president and general manager at Red Hat, and Paul Turner, vice president, product team, VCF division at VMware by Broadcom, joined Samsung on stage to discuss how their software solutions, paired with Samsung’s hardware technology, are pushing the envelope in memory innovation.
According to Choi, “innovation in memory technology is essential for the advancement of AI.” As the leader in the memory market, Samsung is pleased to keep pushing innovation with products like the industry’s most sophisticated CMM-B technology and powerful memory solutions like HBM3E for demanding AI applications and high-performance computing. The company is dedicated to working with its partners and serving its clients to jointly realise the full potential of the AI era.
CXL memory sharing
The world’s first FPGA (Field Programmable Gate Array)-based tiered memory solution for hypervisors, known as the CXL Memory Module Hybrid for Tiered Memory (CMM-H TM), was also unveiled on stage by Samsung and VMware by Broadcom as part of Project Peaberry. This hybrid system combines DRAM and NAND storage in an Add-in Card (AIC) form factor to address memory management issues, minimise downtime, optimise scheduling for tiered memory, and optimise performance, all while drastically lowering total cost of ownership (TCO).
Paul Turner said, “VMware by Broadcom is happy to collaborate with Samsung to bring new innovations in memory.” “A new innovation in CXL and a compelling value-proposition with significant TCO benefits, better utilization of expensive DRAM resources, and improved consolidation of server resources while delivering the same great performance are made possible by Samsung’s leadership in memory technologies and VMware‘s leadership in software memory tiering.”
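The tiering idea behind a DRAM-plus-NAND hybrid module can be sketched in a few lines: keep recently touched pages in the small fast tier and demote the coldest pages to the large slow tier. This is a minimal illustrative model, not Samsung’s or VMware’s actual placement algorithm.

```python
from collections import OrderedDict

class TieredMemory:
    """Toy two-tier memory: a small fast tier (DRAM) backed by a large slow tier (NAND)."""

    def __init__(self, fast_capacity):
        self.fast = OrderedDict()   # page -> data, kept in LRU order (fast DRAM tier)
        self.slow = {}              # page -> data (large, slower NAND tier)
        self.fast_capacity = fast_capacity

    def access(self, page):
        if page in self.fast:
            self.fast.move_to_end(page)     # mark as most recently used
            return "fast"                   # fast-tier hit
        # Miss: promote the page from the slow tier (or fault in a new page).
        data = self.slow.pop(page, None)
        self.fast[page] = data
        if len(self.fast) > self.fast_capacity:
            victim, victim_data = self.fast.popitem(last=False)  # coldest page
            self.slow[victim] = victim_data                      # demote it
        return "slow"

mem = TieredMemory(fast_capacity=2)
print([mem.access(p) for p in ["a", "b", "a", "c", "b"]])
# ['slow', 'slow', 'fast', 'slow', 'slow']
```

Every fast-tier miss promotes the page and demotes the least recently used one; deciding when to promote and demote without thrashing is the scheduling problem a tiered-memory hypervisor has to optimise.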
CXL memory expansion
Furthermore, Samsung demonstrated its CXL Memory Module DRAM (CMM-D) technology, which combines the CXL open standard interface with Samsung’s DRAM technology to provide effective, low-latency communication between the CPU and memory expansion devices. Last year, Red Hat, a pioneer in open-source software solutions, successfully verified Samsung’s CMM-D devices with its enterprise software, a first in the market. Through the Samsung Memory Research Center (SMRC), the two businesses will continue working together to create CXL reference and open-source models, in addition to collaborating on a variety of other storage and memory solutions.
Additionally, Samsung provided 2024 Memcon guests with a demo of its most recent HBM3E 12H chip, which is the first 12-stack HBM3E DRAM in the world and represents a breakthrough with the largest capacity in HBM technology. By using the company’s cutting-edge thermal compression non-conductive film (TC NCF) technology, the HBM3E 12H improves both product yield and vertical density of the chip by more than 20% when compared to its predecessor. Samsung intends to begin mass manufacturing of the HBM3E 12H during the first half of this year, and is presently providing samples to customers.
Enabling CXL memory expansion for in-memory database management systems
In-memory database management systems (IMDBMS) could potentially circumvent the constraints of standard memory architectures through CXL memory expansion. Here’s how the idea breaks down:
Traditional memory’s drawbacks for IMDBMS
Capacity restrictions: The main memory (DRAM) of a server can easily be overloaded by the large datasets an IMDBMS uses, forcing data to be swapped to slower storage (SSD/HDD) and hurting performance.
Latency bottlenecks: Accessing data from slower storage adds latency, which erodes the performance benefit of an in-memory database.
About Samsung Electronics
Samsung’s revolutionary concepts and innovations inspire people all across the globe and help to build the future. Through its SmartThings ecosystem and open cooperation with partners, the firm is transforming the worlds of TVs, smartphones, wearables, tablets, home appliances, network systems, memory, system LSI, foundry, and LED solutions, creating a seamless connected experience.
govindhtech · 9 months
Highlights of SK Hynix’s AI Memory Leadership Exhibition at CES 2024
Highlights of SK Hynix’s AI Memory
SK Hynix will emphasize the significance of memory chips in the AI era
HBM3E’s ultra-high performance and an interactive AI fortuneteller will be on show
The company, which aims to accelerate business turnaround while solidifying its leadership in AI memory, announced today that it will present the technology for ultra-high performance memory products, which form the basis of future AI infrastructure, at CES 2024, the world’s most important tech event, which will take place in Las Vegas from January 9 to 12.
According to SK Hynix, the company will showcase its future vision at the event through its Memory Centric approach and emphasize the value of memory products in driving technical innovation in the AI era and maintaining its competitiveness in the global memory markets.
Along with other significant SK Group affiliates including SK Inc., SK Innovation, and SK Telecom, the business will operate a venue called SK Wonderland where it will highlight its key AI memory technologies, such as HBM3E.
SK Hynix intends to commence mass production of HBM3E, the world’s best-performing memory product, in the first half of 2024 and supply it to the biggest AI technology businesses worldwide. The company successfully developed HBM3E in August.
SK Hynix will showcase an AI Fortuneteller at its amusement park-themed area, powered by HBM3E-based generative AI technology. By drawing cartoon characters based on visitors’ faces and telling their New Year’s fortunes, the AI fortuneteller is intended to offer visitors new entertainment value.
The top AI technology from SK Hynix will also be on show in the SK ICT Family Demo Room, which is co-run by other SK ICT enterprises.
The company will demonstrate two products:
Accelerator-in-Memory based Accelerator (AiMX), a low-cost, high-efficiency accelerator card for generative AI built on processing-in-memory chips;
And Compute Express Link (CXL), a next-generation interface, alongside a test product of Computational Memory Solution (CMS), a memory solution that integrates computational functions into CXL memory.
In particular, with the growth of AI technology, CXL memory and HBM are two of the key items in the spotlight. In the second half, SK Hynix intends to offer DDR5-based CXL 2.0 memory solutions in sizes of 96GB and 128GB to AI clients.
Justin Kim, President (Head of AI Infra) at SK Hynix, said, “We are thrilled to showcase our technology, which has risen to the core of the AI infrastructure in the U.S., home to AI technology.” “SK Hynix aims to accelerate a business turnaround through its leadership in the AI memory space and will intensify its collaboration with global players.”
Notice of Disclaimer
These publications are not meant to constitute an offer to purchase or sell any SK Hynix, Inc. securities in the United States. The securities may not be offered or sold in the United States absent registration with the U.S. Securities and Exchange Commission or an exemption from registration under the U.S. Securities Act of 1933, as amended. SK Hynix Inc. does not intend to register an offering or carry out a public offering of securities in the United States.
About SK Hynix Inc.
Headquartered in Korea, SK Hynix Inc. is a leading international supplier of semiconductor products, specializing in dynamic random access memory (DRAM) chips, flash memory, and CMOS image sensors for a wide range of prominent customers worldwide. The company’s shares are traded on the Korea Exchange, and its global depositary shares are listed on the Luxembourg Stock Exchange.