News
MosaicML has unveiled MPT-7B-8K, an open-source large language model (LLM) with 7 billion parameters and an 8K context length. According to the company, the model was trained on the MosaicML ...