Once a wide-ranging advisor, ChatGPT is now strictly an educational tool: under its updated policy, the chatbot has stopped offering specific legal, medical, and financial advice.

ChatGPT’s Role is Now Limited
People often turned to ChatGPT for advice on everything from relationship questions to complex legal matters. Following that advice, however, sometimes caused harm, and OpenAI has now made significant changes to the AI chatbot.
OpenAI announced that ChatGPT will no longer give medical, financial, or legal advice. According to reports, the change took effect on October 29, when ChatGPT stopped providing specific guidance on treatments, legal cases, and money matters. No longer a consultant, the chatbot will now function purely as an educational tool.
What Changes for Users?
Under the new rules, ChatGPT will not advise users on drug names or dosages, provide templates for lawsuits or legal strategies, or recommend specific investments. Instead, it will share only general principles and basic mechanisms, and it will strongly recommend that users consult professionals such as doctors, lawyers, and financial advisors.
Why Is This Change Happening?
Several cases of users being harmed by following ChatGPT’s advice have surfaced. In one case in August, a 60-year-old man took ChatGPT’s advice and began consuming sodium bromide instead of table salt. This led to serious mental health issues, and he required immediate hospitalization.
In another incident, a 37-year-old American man who was having difficulty swallowing asked ChatGPT about his symptoms. The chatbot incorrectly said that cancer was an unlikely cause. Satisfied with this answer, the man delayed contacting a doctor; by the time he finally sought medical help, the cancer had progressed to stage four.
