#AI_accelerators
But there is no shortage of gaming video cards

The shortage of gaming video cards has given way to a shortage of GPUs for working with artificial intelligence, and this deficit will also drag on.

According to TSMC, which produces the current generation of graphics processors for Nvidia, the shortage of Nvidia AI accelerators will continue for about another year and a half. The problem is due to limited capacity for chip packaging with the CoWoS technology, which cannot be expanded quickly.

"It's not a lack of AI chips, but a lack of CoWoS capacity. Currently, we cannot satisfy 100% of our customers' needs, but we try to maintain around 80%. We think this is a temporary phenomenon that will last about a year and a half."

TSMC notes that demand for CoWoS packaging unexpectedly increased at the beginning of this year, when the boom in generative AI began and, with it, demand for Nvidia accelerators. TSMC is currently installing additional CoWoS tooling in its existing factories, but this takes time.
Almost all parameters are unknown

The Chinese company Loongson produces not only some of the most modern Chinese processors, but also GPUs. Its new development is designed to compete with Nvidia's AI accelerators, although it is far from the most powerful or modern of them.

[Image: Loongson introduced the LG200 AI accelerator]

The accelerator (or its GPU) is called the LG200. Its detailed characteristics are, unfortunately, unknown. The block diagram shows that the GPU consists of 16 small ALUs, four large ALUs, and one huge ALU or special-purpose unit.

The performance is known, albeit only for the whole node: from 256 GFLOPS to 1 TFLOPS. Here, unfortunately, the details are again lacking, so it is unclear for which precision this figure is given; but even if it is FP64, it is quite modest, since modern flagships from Nvidia and AMD offer 50-60 TFLOPS or more. At the same time, Loongson's solution is a GPGPU, meaning it supports general-purpose computing. Unfortunately, there are no details here yet either.

Separately, recall that Loongson has promised to release a video card next year that can compete with the Radeon RX 550, whose FP32 performance is just over 1.1 TFLOPS. It is possible that the LG200 is a direct relative of that adapter.
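To put the quoted figures side by side, here is a quick back-of-the-envelope comparison. All numbers come from the article itself; the 55 TFLOPS value is simply the midpoint of the "50-60 TFLOPS" range given for modern Nvidia/AMD flagships, and the LG200's best case assumes its 1 TFLOPS upper bound:

```python
# All figures normalized to TFLOPS, taken from the article's reported numbers.
lg200_low, lg200_high = 0.256, 1.0   # LG200 node: 256 GFLOPS .. 1 TFLOPS
flagship = 55.0                      # Nvidia/AMD flagship: midpoint of ~50-60 TFLOPS
rx550_fp32 = 1.1                     # Radeon RX 550 peak FP32

# Ratio of each reference part to the LG200's best-case throughput.
print(f"Flagship vs. LG200 (best case): {flagship / lg200_high:.0f}x")
print(f"RX 550 vs. LG200 (best case):   {rx550_fp32 / lg200_high:.1f}x")
print(f"LG200 spread (high / low):      {lg200_high / lg200_low:.1f}x")
```

Even granting the LG200 its upper bound, a current flagship accelerator is roughly 55 times faster on paper, while the RX 550 comparison is nearly one-to-one, consistent with the suggestion that the two parts may be related.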