If you've ever thought about asking ChatGPT for legal advice instead of paying for a lawyer, it turns out you're not alone. A new study found that people are actually more likely to trust advice from AI than from real lawyers.
Researchers ran experiments and found that even when people knew which advice came from a human and which came from AI, they were just as willing to act on ChatGPT's advice. Why? AI tends to sound more complex and confident, while real lawyers use simpler, easier-to-understand language that can come across as less "official."
But here's the catch: AI, including ChatGPT, can "hallucinate," meaning it can make up wrong or even dangerous information that sounds super convincing. And most people aren't good at spotting the difference.
Experts say this is a big wake-up call: if you're getting advice about important things like law or healthcare, you need to verify that it's actually correct, because trusting the wrong advice could lead to serious consequences.
Moral of the story: AI is smart, but it’s not a lawyer... and it definitely doesn’t have a law degree.