ChatGPT will no longer provide medical, legal, or financial advice

OpenAI has updated its ChatGPT usage policy, prohibiting the use of the AI system to provide medical, legal, or any other advice that requires professional licensing. 

The changes are detailed in the company’s official Usage Policies and took effect on October 29.

Under the new rules, users are forbidden from using ChatGPT for:

- consultations that require professional certification (including medical or legal advice);
- facial or personal recognition without a person’s consent;
- making critical decisions in areas such as finance, education, housing, migration, or employment without human oversight;
- academic misconduct or manipulation of evaluation results.

OpenAI states that the updated policy aims to enhance user safety and prevent potential harm that could result from using the system beyond its intended capabilities. 

As reported by NEXTA, the bot will no longer give specific medical, legal, or financial advice. 

ChatGPT is now officially an “educational tool,” not a “consultant.”

The reason for the change has been chalked up to “regulations and liability fears,” with OpenAI looking to avoid lawsuits.

Now, instead of providing direct advice, ChatGPT will “only explain principles, outline general mechanisms and tell you to talk to a doctor, lawyer or financial professional.” 

Based on the new explicit rules, there will be “no more naming medications or giving dosages… no lawsuit templates… no investment tips or buy/sell suggestions.” 

This clampdown directly addresses the fears that have long surrounded the technology.
