This Artificial Intelligence (AI) Study From the UK Presents A GPT-3 Like Language Model That Can Simulate A Patient’s Health Predictions Using EHRs
Most of the information about each patient’s health status and clinical history kept in Electronic Health Records (EHRs) is recorded as unstructured text in clinical notes. Such data could be used to build temporal models that reconstruct a patient’s health trajectory, anticipate illnesses and treatments, compute risk scores, and much more. Most earlier work on prediction and forecasting relies on structured datasets, or on the structured data found in EHRs, and targets events occurring within a specified time window. Structured datasets have the drawback of not always being accessible, and even when they are, they may offer only a partial picture of a patient’s experience (roughly 80% of a patient’s data is in free text).
Other earlier efforts build on top of BERT. One such model is BEHRT, which uses only a small fraction of diseases (301 conditions) from the structured portion of EHRs. The data must be organized into patient visits, since BEHRT can only forecast conditions that will appear during the patient’s next hospital visit or within a predetermined time frame. The authors also note that the approach relies on a multi-label setup, which becomes problematic as the number of predicted concepts grows. Another example is G-BERT, whose inputs are all single-visit samples and are therefore inadequate for capturing long-term context in the EHR. Like BEHRT, it uses only structured data.
Med-BERT is trained on structured diagnosis data coded with the International Classification of Diseases (ICD). The target task of predicting a new disease is not introduced into the model directly; instead, the model is pretrained with the conventional Masked Language Modelling (MLM) objective and then fine-tuned. It works only with ICD-10 codes and has been tested on just a small selection of illnesses, which may be insufficient to assess its general predictive performance. Beyond BERT-based models, the authors also highlight Long Short-Term Memory (LSTM) models, such as the LM-LSTM model proposed by Steinberg et al. Like the other models, it is fine-tuned to predict specific future events and uses only structured data.
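The MLM pretraining step mentioned above amounts to hiding a fraction of the codes in a patient’s record and asking the model to recover them. The following is a minimal, framework-free sketch of that masking step; the ICD-10 codes and the 15% default masking rate are illustrative choices, not taken from the Med-BERT paper:

```python
import random

def mask_tokens(tokens, mask_prob=0.15, mask_token="[MASK]", seed=0):
    """Randomly replace tokens with a mask symbol, as in MLM pretraining.

    Returns the masked sequence and a parallel list of labels: the
    original token at masked positions, None everywhere else.
    """
    rng = random.Random(seed)
    masked, labels = [], []
    for tok in tokens:
        if rng.random() < mask_prob:
            masked.append(mask_token)   # model must predict this token
            labels.append(tok)
        else:
            masked.append(tok)          # token passes through unchanged
            labels.append(None)
    return masked, labels

# Illustrative ICD-10 diagnosis codes for one patient record.
codes = ["E11.9", "I10", "I25.10", "E78.5"]
masked, labels = mask_tokens(codes, mask_prob=0.5)
print(masked, labels)
```

A real implementation would feed `masked` into a transformer encoder and compute a cross-entropy loss only at the positions where `labels` is not None.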
In this study, the researchers develop a novel model called Foresight for forecasting biomedical concepts using free-text data from the EHR. The work follows the approach popularized by GPT-3, where many tasks are implicit in the dataset; for instance, a single GPT-3 model can generate HTML code, answer questions, write stories, and much more. The same holds for Foresight: the same model can be used to estimate disease risk, generate differentials for upcoming events or treatments, and much more.
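Setting the transformer machinery aside, the generative setup described here reduces to next-token prediction over temporal sequences of biomedical concepts. The toy sketch below uses a bigram count model as the simplest stand-in for a decoder, with made-up concept timelines (none of the names come from the paper’s dataset):

```python
from collections import Counter, defaultdict

# Toy patient timelines: ordered sequences of biomedical concepts.
timelines = [
    ["hypertension", "statin", "chest_pain", "myocardial_infarction"],
    ["hypertension", "chest_pain", "angina"],
    ["diabetes", "metformin", "hypertension", "chest_pain", "angina"],
]

# Count how often each concept follows another across all timelines.
counts = defaultdict(Counter)
for seq in timelines:
    for prev, nxt in zip(seq, seq[1:]):
        counts[prev][nxt] += 1

def predict_next(concept, k=2):
    """Return the k most frequent successors of `concept` (a crude
    analogue of ranking candidate next events by model probability)."""
    return [c for c, _ in counts[concept].most_common(k)]

print(predict_next("chest_pain"))
```

A GPT-style model replaces the bigram counts with a learned distribution conditioned on the entire preceding timeline, which is what lets one model serve risk estimation, differentials, and event forecasting at once.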
Their main contributions include:
- A transformer-based approach that generates temporal sequences of biomedical concepts from clinical narratives.
- An evaluation of the model’s performance across multiple hospitals, covering both physical and mental health facilities.
- A model trained on over 800,000 patients from a major UK hospital, representing a diverse population, made publicly available through a web application.
- A publicly accessible dataset (currently under review for submission to the PhysioNet database).
Check out the Paper and Website. All credit for this research goes to the researchers on this project.
Aneesh Tickoo is a consulting intern at MarktechPost. He is currently pursuing his undergraduate degree in Data Science and Artificial Intelligence at the Indian Institute of Technology (IIT), Bhilai. He spends most of his time working on projects aimed at harnessing the power of machine learning. His research interest is image processing, and he is passionate about building solutions around it. He loves to connect with people and collaborate on interesting projects.