March 2024
BioMedLM: A 2.7B Parameter Language Model Trained On Biomedical Text
Elliot Bolton, Abhinav Venigalla, Michihiro Yasunaga, David Hall, Betty Xiong...
TL;DR: BioMedLM, a 2.7 billion parameter GPT-style autoregressive model trained on PubMed, achieves competitive performance on biomedical NLP tasks, highlighting the potential of smaller, domain-targeted models as efficient and environmentally friendly alternatives to large general-purpose models.