News

MosaicML has unveiled MPT-7B-8K, an open-source large language model (LLM) with 7 billion parameters and an 8K context length. According to the company, the model was trained on the MosaicML ...