Here’s why 100TB+ SSDs will play a huge role in ultra-large language models in the near future

Posted by:
James Thompson
Mon, 20 Jan

Kioxia, the Japanese memory giant, has introduced a project named AiSAQ that aims to change how AI systems handle retrieval data. Instead of holding the vector data and indices used for Retrieval-Augmented Generation (RAG) in DRAM, AiSAQ stores them on high-capacity SSDs. RAG grounds a large language model’s answers in retrieved documents, which helps curb the inaccuracies these models often produce, so it is increasingly relied on for tasks where reliability is critical.

Moving the vector data and indices to SSD-based storage makes it far cheaper to scale to much larger datasets than a RAM-resident index allows, which in turn makes large AI deployments more affordable and accessible. Kioxia has not disclosed a launch date, but competitors such as Micron and SK Hynix are expected to follow with similar technology. In a rapidly evolving AI landscape, Kioxia’s AiSAQ project shows promise in advancing the capabilities of AI applications.
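To make the idea concrete, here is a minimal sketch of the general principle behind SSD-resident retrieval. This is not Kioxia’s AiSAQ implementation or API; the file path, embedding dimension, and corpus size are hypothetical. The document embeddings live in a memory-mapped file on the SSD, and a query only pages in the data it touches instead of requiring the whole index to sit in RAM.

```python
# Illustrative sketch only (not Kioxia's AiSAQ): keep RAG vector data on SSD
# via a memory-mapped file, so RAM holds just the pages a query actually reads.
import numpy as np

DIM, N_DOCS = 384, 100_000          # hypothetical embedding size / corpus size
PATH = "vectors.mmap"               # file stored on the SSD

# One-time build step: write document embeddings to the SSD-backed file.
vectors = np.memmap(PATH, dtype=np.float32, mode="w+", shape=(N_DOCS, DIM))
vectors[:] = np.random.rand(N_DOCS, DIM).astype(np.float32)  # stand-in embeddings
vectors.flush()

# Query time: memory-map the same file read-only; vectors are paged in from
# storage on demand rather than loaded entirely into RAM.
index = np.memmap(PATH, dtype=np.float32, mode="r", shape=(N_DOCS, DIM))

def retrieve(query: np.ndarray, k: int = 5) -> np.ndarray:
    """Return indices of the k most similar documents by inner product."""
    scores = index @ query          # streams pages from the SSD as needed
    return np.argsort(scores)[-k:][::-1]

top_docs = retrieve(np.random.rand(DIM).astype(np.float32))
print(top_docs)
```

A real deployment would use a disk-resident approximate-nearest-neighbour index rather than this brute-force scan, but the storage trade-off is the same one AiSAQ targets: the bulk of the vector data stays on cheap, high-capacity SSDs while RAM holds only the working set.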
