ChatGPT and Other AI Must Pay For the News They Consume: News Corp Australia CEO

According to Michael Miller, generative AI is the latest attempt by digital corporations to use other people’s creative content “without compensating them for their original effort.”

According to News Corp Australia’s CEO, makers of artificial intelligence (AI) powered applications should pay for the news and content used to improve their products.

In an editorial published on April 2 in The Australian, Michael Miller urged “creators of original news and material” to avoid the mistakes that he said “decimated their businesses,” allowing digital companies to benefit from utilizing their stories and information without recompense.

Chatbots are software that consumes news, data, and other material to generate responses to queries that imitate written or spoken human speech. ChatGPT, from AI firm OpenAI, is the best-known example. According to Miller, the rapid emergence of generative AI is yet another attempt by big digital firms to create “a new pot of gold to maximize revenues and profit by absorbing the creative content of others without compensating them for their original effort.”

Miller used OpenAI as an example, claiming that the corporation “quickly developed a business” worth $30 billion by “taking the original content and ingenuity of others without remuneration or acknowledgement.” In 2021, the Australian federal government implemented the News Media Bargaining Code, which requires Australian tech platforms to pay news publishers for news material made available or linked on their platforms.

Miller believes that similar laws are required for AI so that all content creators are fairly compensated for their efforts. “Creators ought to be recognized for their original work being used by AI engines that are stealing the style and tone of not only journalists, but also musicians, novelists, poets, historians, painters, filmmakers, and photographers, to mention a few.”

More than 2,600 technology professionals and researchers recently signed an open letter advocating a temporary halt to further AI development, citing “deep hazards to society and mankind.” Meanwhile, Italy’s data protection agency ordered a temporary ban on ChatGPT and opened an inquiry into alleged violations of data privacy rules.

Miller believes that an agreement, rather than outright prohibitions or bans on the technology, can benefit both content providers and AI businesses. He explained that with “proper guardrails,” AI has the potential to become a significant journalistic resource, assisting in content creation and “gathering information faster,” as well as helping publish across many platforms and accelerating video production.

The crypto industry is also seeing more projects that use AI, though these are still in their early stages. Miller believes AI engines risk their future success if they cannot persuade the public that their information is trustworthy and reliable; “to achieve this, they will have to fairly recompense people who give the content for their success.”
