An AI assistant misdiagnosed a patient and prescribed wrong medicines because the doctor did not double-check. The system should require explicit doctor review and approval before finalizing any AI-generated diagnoses or prescriptions to prevent patient harm.
So, the day before yesterday, I had an allergic reaction and went to the doctor for a check-up. They've switched to an AI assistant called Freed that writes notes for them, so my doctor just talked to me while the assistant listened and wrote down my symptoms and my medication. The doctor handed me the form with the medicines written on it without double-checking what the assistant had put down.

I went out to the store, got everything, applied the lotions, took the medicine as directed, and slept. I woke up an hour later and my allergy had literally worsened. I immediately called the doctor and paid them another visit to get checked again. I was already furious, and then they admitted their mistake: apparently their assistant, Mr. Freed, had misdiagnosed me, written down the wrong symptoms, decided I had atopic dermatitis, and prescribed the wrong medicines and lotions. They wanted to do the check-up again and give me the correct medicines after that, and on top of it they wanted me to pay again, so I refused and went to another doc. It's seriously becoming a scary world out there with AI literally everywhere.

TL;DR: My doctor trusted their AI assistant, and it worsened my allergy by misdiagnosing me.