MosaicML launches MPT-7B-8K, a 7B-parameter open-source LLM


MosaicML has unveiled MPT-7B-8K, an open-source large language model (LLM) with 7 billion parameters and an 8k context length. 

According to the company, the model was trained on the MosaicML platform, with pretraining resuming from the MPT-7B checkpoint. The pretraining phase was conducted on Nvidia H100s, with an additional three days of training on 256 H100s,…
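For readers who want to experiment with the released checkpoint, the sketch below shows one way to load an open-source MPT model with the Hugging Face transformers library. The repository id "mosaicml/mpt-7b-8k", the prompt, and the generation settings are illustrative assumptions and are not taken from MosaicML's announcement.

from transformers import AutoModelForCausalLM, AutoTokenizer

# "mosaicml/mpt-7b-8k" is an assumed Hugging Face repo id for the release
# described above; adjust it to wherever the checkpoint is actually hosted.
model_name = "mosaicml/mpt-7b-8k"

tokenizer = AutoTokenizer.from_pretrained(model_name)
# MPT checkpoints ship custom modeling code, so trust_remote_code is required
# when loading through the Auto classes.
model = AutoModelForCausalLM.from_pretrained(model_name, trust_remote_code=True)

prompt = "Summarize the key points of a long document:"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))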



