At the heart of Titans' design is a concerted effort to more closely emulate the functioning of the human brain.
Daniel Yoo of Yuanta Securities Korea says the legacy memory business is in decline, but he is optimistic about SK Hynix, whose HBM business continues to grow and is expected to retain over 50% market share.
Recent innovations in artificial intelligence (AI) have brought both advantages and disadvantages. On the positive side, the technology can help medical professionals make more accurate diagnoses sooner.
Digital storage and memory play a crucial role in modern computing, including AI. CES 2025 showcased advances in NAND flash controllers, non-volatile memory, and various AI applications.
Numem leverages MRAM and patented technologies to deliver nonvolatile, high-speed, ultra-low-power solutions from the data center to the edge. The company's advanced 22nm test chip sets the foundation for its 12nm NuRAM solution.
Meta's chief AI scientist, Yann LeCun, says that a "new paradigm of AI architectures" will emerge in the next three to five years, going far beyond the
Microsoft is planning to develop an AI therapist at some point, as the company has patented a technology for it.
South Korea's SK Hynix is one of the world's largest memory chipmakers, supported by strong sales of high-bandwidth memory (HBM) used in generative AI.
In a separate post, Behrouz claimed that, based on internal testing on the BABILong benchmark (a needle-in-a-haystack evaluation), Titans (MAC) models outperformed larger AI models such as GPT-4, Llama 3 + RAG, and Llama 3 70B.
SK Hynix CFO Kim Woohyun said that the outlook for memory demand in 2025 was clouded by inventory adjustments, protective trade policies and geopolitical risks.
Micron (NASDAQ: MU) makes memory chips and storage for phones and computers, but its chips are also used in data centers, making it one of the key players in the AI business. The stock is down 24% from its peak, and analysts are eyeing a massive rebound.
The rapid evolution of artificial intelligence (AI) has been marked by the rise of large language models (LLMs) with ever-growing numbers of parameters. From early iterations with millions of parameters to today’s tech giants boasting hundreds of billions or even trillions, the sheer scale of these models is staggering.
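To give a sense of where those parameter counts come from, here is a minimal back-of-the-envelope sketch of how a decoder-only transformer's parameters scale with depth and width. The formula and the GPT-2-like configuration are illustrative assumptions (biases and layer norms are ignored), not a description of any specific model named above.

```python
def transformer_params(n_layers, d_model, vocab_size, d_ff=None):
    """Rough parameter estimate for a decoder-only transformer.

    Per layer: attention projections (~4 * d_model^2) plus the
    feed-forward block (~2 * d_model * d_ff), with the token
    embedding matrix added once. Biases and norms are omitted.
    """
    d_ff = d_ff or 4 * d_model  # common convention: d_ff = 4 * d_model
    per_layer = 4 * d_model**2 + 2 * d_model * d_ff
    return n_layers * per_layer + vocab_size * d_model

# A GPT-2-small-like configuration: 12 layers, d_model=768, ~50k vocab
print(f"{transformer_params(12, 768, 50257):,}")   # roughly 124 million
```

Scaling the same formula to 96 layers and d_model=12288 already lands in the hundreds of billions, which is how today's largest models reach the scales the paragraph describes.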