ChatGPT is just a name now: The 'technology revolution' scam that, after nearly two years, is still mostly a tool for homework and cheating on tests, and could cost OpenAI $5 billion

ChatGPT once promised to change the world, but after almost two years it has done little in practice, and it may even cause OpenAI to lose up to $5 billion.


Sources: BI, WSJ, Washington Post

 


When ChatGPT launched in late 2022, it was quickly put to work helping with homework and even cheating on tests by writing essays for students. Even so, the chatbot's fame grew on promises of a new technology revolution.

After investing $10 billion in OpenAI to develop ChatGPT, Microsoft saw its total market value rise sharply to above $3 trillion. Meanwhile, rivals from Google and Facebook to Apple and Tesla also jumped into artificial intelligence (AI).

But nearly two years later, ChatGPT still hasn't brought Microsoft any profit beyond boosting its stock price.

The New York Post reports that OpenAI could lose up to $5 billion this year because running ChatGPT costs so much. The company expects to spend up to $7 billion to run the chatbot, not including $1.5 billion a year in salaries for its 1,500 employees.

The returns ChatGPT delivers, however, don't measure up to that spending.

In fact, a survey by The Washington Post shows 21% of ChatGPT users use it to write scripts or creative content, and 18% use it for homework or writing essays.

The rest use it for translation, finding information, and coding.

Business Insider (BI) states plainly that ChatGPT hasn't brought about many changes, let alone a major revolution, beyond mainly doing homework and writing essays for students.

Although OpenAI said it would develop a tool to detect writing produced with ChatGPT, no such product has been released.

That delay has led many to question OpenAI's motives, suspecting the company of tacitly encouraging students to cheat on tests with AI.


"The Scam"

When the issue of users cheating on tests with AI came up, OpenAI said it would develop a tool to watermark content generated by ChatGPT. The marks would be invisible to users but could be detected by another OpenAI tool with 99.99% accuracy.

However, BI says OpenAI is divided internally over the tool, since most ChatGPT users are precisely the people who rely on the AI for homework and other creative writing.

An OpenAI spokesperson argued that a tool to identify ChatGPT-generated content could "harm" users whose first language is not English.

"The content detection tool we're developing has potential but also hidden risks, so we're looking for another option," and OpenAI spokesperson told the Wall Street Journal (WSJ).

Ironically, The Washington Post survey shows nearly 30% of ChatGPT users would stop using it if OpenAI rolled out such watermarking software while other companies in the field did not do the same.

According to the Washington Post, most of the data OpenAI uses to train ChatGPT is public content, including essays, research papers, and reference books. That is why students are so fond of using it to summarize homework, write essays, or even answer test questions.

A major weakness, however, is that ChatGPT doesn't truly understand what it is saying. Many answers are rehashed from existing articles, or the chatbot simply makes up incorrect information.

In 2023, a lawyer was fired for using ChatGPT to draft a court filing in which the chatbot had made up laws that don't exist.

Clearly, the "revolution" many people still expect from ChatGPT is still just a buggy chatbot that hasn't made a profit and is mainly used for cheating or "cooking up" content.


Other features

The Washington Post survey shows only about 5% of ChatGPT users use it for personal matters, such as asking for dating advice or how to tell whether a partner is cheating.

However, many experts warn that the chatbot can easily give incorrect or harmful information, and that it should be used only as a reference rather than trusted completely.

In addition, about 7% of users turn to ChatGPT to write code, find bugs, or learn programming. But this capability is mostly useful to beginners and is far from able to fully replace programmers, as many reports have claimed.

Professor Hatim Rahman of Northwestern University says ChatGPT can't replace humans; it simply helps workers approach programming and coding more easily, even without a grounding in the basics.

"It's like TurboTax for taxes," Mr. Rahman said. "People can now do basic tax returns with this app, but tax experts and accountants won't disappear just because of one app."

Meanwhile, only 2% of ChatGPT users use it to look for jobs or write job applications.

With results like these, more investors are doubting AI's prospects in tech: can this technology really create a new wave the way the iPhone did for phones?

CREDIT: antt.nguoiduatin.vn

