
LinearCosine: When AI Researchers Decided Multiplication Was Too Mainstream
Hey there, optimization seekers and efficiency enthusiasts! 📊🧮 Today, we’re diving into a world where even basic arithmetic operations are up for debate. Buckle up as we explore LinearCosine, an experiment that asks: “Do we really need multiplication for AI?”

Quick Links to skip the talk: Project Website - Linear Cosine | GitHub Repo | Original Paper

The Paper That Started It All

During my fall break, while I was supposed to be relaxing, my roommate Yash Maurya forwarded me a fascinating paper by Hongyin Luo and Wei Sun titled “Addition is All You Need for Energy-efficient Language Models”. I was immediately intrigued by their approach to modifying one of the most fundamental computations in AI: multiplication.

This project builds upon my previous work on in-browser vanilla JS semantic search, such as YC-Dendrolinguistics, where I implemented a cosine similarity-based information retrieval system for YC startups. LinearCosine takes that a step further by exploring ways to make those fundamental calculations more energy-efficient.
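To make the question concrete before we dig in, here’s a rough sketch of where the multiplications actually live in cosine similarity, plus a classic bit-level trick (related to Mitchell’s logarithmic multiplication) that lets a single integer addition stand in for a floating-point multiply. Fair warning: the function names are mine, not from the LinearCosine repo, and this is not the paper’s exact L-Mul algorithm, which adds a small correction term; it just shows the flavor of the idea.

```typescript
// Standard cosine similarity over dense embedding vectors. Every term of
// the dot product and the two norms is a floating-point multiply, which is
// exactly the operation the "Addition is All You Need" paper targets.
function cosineSimilarity(a: number[], b: number[]): number {
  if (a.length !== b.length) throw new Error("dimension mismatch");
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];   // one multiply per dimension...
    normA += a[i] * a[i]; // ...plus two more for the norms
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// The bit trick: an IEEE-754 float's bit pattern is (roughly) a scaled
// log2 of its value, so adding two bit patterns and subtracting the bit
// pattern of 1.0f approximates a multiplication with one integer addition.
// Positive, normal inputs only -- this is an illustration, not L-Mul.
const f32 = new Float32Array(1);
const u32 = new Uint32Array(f32.buffer);
const BIAS = 0x3f800000; // bit pattern of 1.0f

function approxMul(x: number, y: number): number {
  f32[0] = x; const bx = u32[0];
  f32[0] = y; const by = u32[0];
  u32[0] = (bx + by - BIAS) >>> 0;
  return f32[0];
}

console.log(3.0 * 5.0);       // 15
console.log(approxMul(3, 5)); // 14 -- dropping the mantissa-product term
                              // costs a few percent of accuracy
```

Why would anyone accept that error? Because in hardware an integer addition consumes far less energy than a floating-point multiplication, which is the efficiency argument at the heart of the paper.

...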