Liquid AI, a startup co-founded by former MIT researchers from the Computer Science and Artificial Intelligence Laboratory (CSAIL), has introduced its first multimodal AI models, the “Liquid Foundation Models (LFMs).” These models represent a bold departure from the transformer architecture that has dominated AI development since the publication of the 2017 paper “Attention Is All You Need.”
Unlike the companies behind the current wave of generative AI models built on the transformer architecture, Liquid AI aims to develop foundation models from “first principles,” taking an engineering approach akin to building engines, cars, or airplanes. This fundamental shift has produced models that, according to the company, outperform transformer-based alternatives of similar size, such as Meta’s Llama 3.1-8B and Microsoft’s Phi-3.5 3.8B.
