Merge pull request #116 from nhsengland/minor_issue106
Update Publications.md
amaiaita authored May 30, 2024
2 parents d822e00 + bf907d4 commit b65509e
Showing 1 changed file with 10 additions and 1 deletion.
docs/our_work/Publications.md
@@ -8,6 +8,15 @@ tags: ['PUBLICATIONS']

List of pre-releases and publications connected to our work

[6] [https://arxiv.org/abs/2403.19802](https://arxiv.org/abs/2403.19802)

**Developing Healthcare Language Model Embedding Spaces**

**Niall Taylor**, **Dan Schofield**, Andrey Kormilitzin, Dan W Joyce, Alejo Nevado-Holgado

*Pre-trained Large Language Models (LLMs) often struggle on out-of-domain datasets like healthcare-focused text. We explore specialized pre-training to adapt smaller LLMs to different healthcare datasets. Three methods are assessed: traditional masked language modeling, Deep Contrastive Learning for Unsupervised Textual Representations (DeCLUTR), and a novel pre-training objective utilizing metadata categories from the healthcare settings. These schemes are evaluated on downstream document classification tasks for each dataset, with additional analysis of the resultant embedding spaces. Contrastively trained models outperform other approaches on the classification tasks, delivering strong performance from limited labeled data and with fewer model parameter updates required. While metadata-based pre-training does not further improve classifications across the datasets, it yields interesting embedding cluster separability. All domain-adapted LLMs outperform their publicly available general base LLM, validating the importance of domain specialization. This research illustrates efficient approaches to instill healthcare competency in compact LLMs even under tight computational budgets, an essential capability for responsible and sustainable deployment in local healthcare settings. We provide pre-training guidelines for specialized healthcare LLMs, motivate continued inquiry into contrastive objectives, and demonstrate adaptation techniques to align small LLMs with privacy-sensitive medical tasks.*
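
A minimal sketch of the first of the three schemes described above, continued pre-training with a masked language modeling objective, using the Hugging Face `transformers` API. This is not the paper's code: the base model, data file, and hyperparameters are illustrative assumptions.

```python
# Illustrative sketch of domain-adaptive pre-training with a masked language
# modeling (MLM) objective. Model name, corpus file, and hyperparameters are
# assumptions, not the authors' configuration.
from datasets import load_dataset
from transformers import (
    AutoModelForMaskedLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

model_name = "roberta-base"  # a compact base LLM, as the paper targets smaller models
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForMaskedLM.from_pretrained(model_name)

# Placeholder corpus: swap in the in-domain healthcare text.
corpus = load_dataset("text", data_files={"train": "healthcare_notes.txt"})["train"]

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=512)

tokenized = corpus.map(tokenize, batched=True, remove_columns=["text"])

# Dynamic masking: 15% of tokens are masked in each batch for the MLM objective.
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm_probability=0.15)

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="mlm-healthcare",
        per_device_train_batch_size=16,
        num_train_epochs=1,
        learning_rate=5e-5,
    ),
    train_dataset=tokenized,
    data_collator=collator,
)
trainer.train()
```

The contrastive (DeCLUTR-style) and metadata-based objectives evaluated in the paper would replace the masked-language-modeling head and data collator above with their own training objectives; the surrounding adaptation workflow stays the same.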

---

[5] [https://link.springer.com/chapter/10.1007/978-3-031-56107-8_21](https://link.springer.com/chapter/10.1007/978-3-031-56107-8_21) - Conference Paper

@@ -61,4 +70,4 @@ representation of multimorbidity and leveraging the growing availability of elec
---

[comment]: <> (The below header stops the title from being rendered (as mkdocs adds it to the page from the "title" attribute) - this way we can add it in the main.html, along with the summary.)
#
