Week 9: Root Access
April 27, 2024
Hi everyone, and welcome to Week 9 of my senior project! I’m so excited to share this week’s updates, as my custom GPT is almost complete. This week, I fed my data into the large language model so that it can provide assistance in context. So, let’s get into it!
Experimenting with the Dataset
My first step toward putting the data in context was to feed my sample dataset into the custom GPT. However, this proved unsuccessful, as OpenAI restricts GPTs from reading uploaded datasets directly due to confidentiality concerns. This led me to manually input my dataset and have the model provide an overview of each patient’s medical history in a readable format. (I can’t provide pictures right now, as I have reached my GPT-4 limit, but I’ll make sure to add them once the limit resets.)
As I mentioned a couple of weeks ago, I configured the model to ask the user for their patient ID number. However, it had no way to confirm whether that ID was valid, since there was no dataset for it to refer to. That’s why, when I fed the data into the chatbot, I ensured that it would look up the patient’s ID number within the patient information I provided. Now my model is able to put this information in context and keep each patient’s medical history in mind while suggesting treatments.
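To illustrate the ID check described above, here is a minimal Python sketch. The record format, field names, and sample patients are hypothetical, not my actual dataset; the real lookup happens inside the GPT’s instructions rather than in code.

```python
# Hypothetical sample records -- the real data lives in the GPT's context.
patients = {
    "P-1042": {"name": "Jane Doe", "history": ["amalgam filling (2021)"]},
    "P-2077": {"name": "John Roe", "history": ["gingivitis, treated 2023"]},
}

def lookup_patient(patient_id: str):
    """Return the patient's record if the ID is valid, else None."""
    return patients.get(patient_id.strip().upper())

record = lookup_patient("p-1042")
if record is None:
    print("ID not found -- falling back to the new-patient questions.")
else:
    print(f"Found {record['name']}; keeping history in mind: {record['history']}")
```

If the lookup returns nothing, the chatbot can fall back to the new-patient flow described in the next section.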
NaN
This raises the question: what if a patient’s ID number or medical history isn’t in the system? To further my project’s goal of making oral care accessible to more communities, my model still helps such users by asking follow-up questions about their issue and suggesting immediate remedies.
During my last talk with Dr. Kalai, I gained insight into how she handles new patients: if there is no data on their medical history, they must fill out a medical examination form. I plan to use a similar system by asking the user questions such as “Have you had any surgeries?” or “Do you have, or plan to have, any metal objects in your body?”
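A rough sketch of what that intake flow could look like in code. The question list is just an example modeled on the two questions above; the real questionnaire would follow Dr. Kalai’s examination form.

```python
# Example intake questions -- placeholders, not the actual examination form.
INTAKE_QUESTIONS = [
    "Have you had any surgeries?",
    "Do you have, or plan to have, any metal objects in your body?",
    "Are you currently taking any medications?",
]

def run_intake(answer_fn):
    """Ask each question and collect the answers into a mini medical history."""
    return {question: answer_fn(question) for question in INTAKE_QUESTIONS}

# Demo with canned answers instead of real user input:
history = run_intake(lambda question: "no")
print(history)
```

The collected answers would then stand in for the missing medical history when the model recommends remedies.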
SQL
Manually inputting the dataset would be ineffective: implementing this project in the real world would involve large amounts of information and require a lot of manpower. To ease this potential problem, I plan on feeding in the data using either LlamaIndex or PostgreSQL and then connecting it to ChatGPT using my API key.
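The PostgreSQL route could look roughly like the sketch below: pull one patient’s row from the database and pass it to ChatGPT as context. This assumes a hypothetical `patients` table (with columns like `medical_history`), a `DATABASE_URL` environment variable, and the `psycopg2` and `openai` packages; none of these names come from my actual setup yet.

```python
import os

def build_prompt(row: dict) -> str:
    """Turn one patient row into context for the chatbot."""
    lines = [f"{key}: {value}" for key, value in row.items()]
    return "Patient record:\n" + "\n".join(lines) + "\nSuggest appropriate dental care."

def fetch_patient(patient_id: str) -> dict:
    """Fetch one patient's row from a hypothetical PostgreSQL 'patients' table."""
    import psycopg2  # requires a running PostgreSQL instance
    conn = psycopg2.connect(os.environ["DATABASE_URL"])
    with conn, conn.cursor() as cur:
        cur.execute(
            "SELECT medical_history, intraoral_exam, extraoral_exam "
            "FROM patients WHERE patient_id = %s",
            (patient_id,),
        )
        history, intra, extra = cur.fetchone()
    return {"medical_history": history, "intraoral_exam": intra, "extraoral_exam": extra}

def ask_gpt(prompt: str) -> str:
    """Send the prompt to ChatGPT; reads the API key from OPENAI_API_KEY."""
    from openai import OpenAI
    client = OpenAI()
    response = client.chat.completions.create(
        model="gpt-4",
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content
```

LlamaIndex would replace the `fetch_patient` step with an index over the documents, but the overall shape (retrieve context, then prompt the model) stays the same.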
After selecting the database, I plan on training and testing my model to predict the dental charting. The model will look at each patient’s information (medical history plus intraoral and extraoral exams) to predict the treatment they will need.
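The train/test workflow I have in mind could be sketched with scikit-learn like this. The features and treatment labels below are made up purely to show the shape of the pipeline; the real model would use the actual exam fields from the dataset.

```python
# Toy example of the planned train/test split -- data is invented.
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Each row: [has_cavity, gum_inflammation, prior_surgery] -> treatment label
X = [[1, 0, 0], [0, 1, 0], [1, 1, 0], [0, 0, 1], [1, 0, 1], [0, 1, 1]]
y = ["filling", "cleaning", "filling", "checkup", "filling", "cleaning"]

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.33, random_state=0
)
model = DecisionTreeClassifier(random_state=0).fit(X_train, y_train)
print("held-out accuracy:", model.score(X_test, y_test))
```

Whichever classifier I end up using, the idea is the same: hold out part of the patient records to check that the predicted treatments generalize.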
Follow along to see which database works and whether I am able to successfully connect my model with the dataset. Thank you all so much for reading and see you next week! Please let me know if you have any questions.
Citations
Seabra, Antony. “Unleashing the Power of Knowledge: Connecting ChatGPT to Databases for Advanced Question-Answering…” Medium, 10 Aug. 2023, medium.com/@antonyseabra/unleashing-the-power-of-knowledge-connecting-chatgpt-to-databases-for-advanced-question-answering-8dfe5f140b1b.