Together’s $20M seed funding to build open-source AI and cloud platform
We’re excited to announce our $20M seed funding round led by Lux Capital. It’s an honor to reach this milestone with the support of incredible investors who believe in our mission: to empower innovation and creativity by providing leading open-source generative AI models and a cloud platform that makes AI accessible to anyone, anywhere.
When Chris, Percy, Ce and I first got together last year, we all felt it was clear that foundation models represented a generational shift in technology, maybe the most significant since the invention of the transistor. At the time, we could see the trend toward centralization of these models within a small number of corporations due to the vast expense of high-end GPU clusters needed for training. Meanwhile, the open community that had led the innovations in AI over the prior decades had limited agency in shaping the coming world of AI. In founding Together, we were driven by the belief that open and decentralized alternatives to closed systems were going to be important — and possibly critical for business and society.
Several venture funds and prominent entrepreneurs have backed our vision of an open ecosystem for AI and invested along with our leads in this funding round, including Factory, SV Angel, First Round Capital, Long Journey Ventures, A Capital, Robot Ventures, Common Metal, Definition Capital, Susa Ventures, Cadenza Ventures, and SCB 10x; Scott Banister, the co-founder of PayPal; Jeff Hammerbacher, the co-founder of Cloudera; Dawn Song, the co-founder of Oasis Labs; Alex Atallah, the co-founder of OpenSea; MC Lader, the COO of Uniswap; Lip-Bu Tan, the founder of Cadence Systems; Jakob Uszkoreit, the co-inventor of the Transformer architecture; as well as angel investors Marc Bhargava, Jennifer Campbell, Chafic Kazoun, Sabrina Hahn, SongYee Yoon, Chase Lochmiller, Yi Sun, Dave Eisenberg, Panos Madamopoulos-Moraris, and Zach Frankel.
Since our founding, we have brought together an incredible team of researchers, engineers, and AI practitioners, and created collaborations with decentralized infrastructure providers, open-source groups, and both academic and corporate research labs to further this mission. We have released several projects that have garnered support from hundreds of thousands of AI developers, including GPT-JT, OpenChatKit, and RedPajama. This is just the beginning. Our aim is to help create open models that rival closed models and establish open source as the default way to incorporate AI.
Leveraging research in distributed optimization, we have built a specialized cloud platform for large models that efficiently scales training and inference. In the coming months we plan to open up access to this platform, enabling rapid customization and coupling of foundation models with production tasks. Open models will give developers and organizations greater ability to understand, inspect, and utilize AI, without vendor lock-in and with strong privacy protections.
We are at the beginning of a new era of AI. I am so excited for what the future holds and humbled to be a part of the incredible open-source AI movement.
– Vipul Ved Prakash, Co-founder and CEO
- 20% lower cost
- 4x faster training
- 117x network compression
Q: Should I use the RedPajama-V2 Dataset out of the box?
RedPajama-V2 is conceptualized as a pool of data that serves as a foundation for creating high-quality datasets. The dataset is thus not intended to be used out of the box; depending on the application, data should be filtered using the quality signals that accompany it. With this dataset, we take the view that the optimal filtering of data depends on the intended use. Our goal is to provide all the signals and tooling that enable this.
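As a minimal sketch of what such filtering could look like: each document is assumed to carry a `quality_signals` mapping, and records are kept only if they clear use-case-specific thresholds. The signal names (`language_score`, `word_count`), the thresholds, and the sample records below are all hypothetical, not RedPajama-V2's exact schema.

```python
# Hypothetical sketch: keep only documents whose quality signals clear
# thresholds chosen for the intended use. Field names and thresholds
# here are illustrative, not the dataset's actual schema.

def passes_filters(doc, min_lang_score=0.8, min_words=50):
    """Return True if the document's quality signals meet the thresholds."""
    signals = doc["quality_signals"]
    return (signals["language_score"] >= min_lang_score
            and signals["word_count"] >= min_words)

# Toy records standing in for dataset entries.
docs = [
    {"text": "A long, fluent document about distributed training ...",
     "quality_signals": {"language_score": 0.95, "word_count": 1200}},
    {"text": "shrt nsy txt",
     "quality_signals": {"language_score": 0.40, "word_count": 3}},
]

filtered = [d for d in docs if passes_filters(d)]
print(len(filtered))  # only the first document survives
```

In practice the thresholds would be tuned per application — a code-model corpus and a chat-model corpus would keep very different slices of the same pool.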