Latest News

Regulate AI Like Nuclear Power, Says UK Labour Party

According to a recent report in the Guardian, the UK Labour Party has proposed regulating and licensing artificial intelligence (AI) technology, drawing parallels to industries such as pharmaceuticals and nuclear power. Lucy Powell, the party's digital spokesperson, expressed support for this approach, stating that it provides a viable model for the future. Powell emphasized the importance of regulating AI during the developmental stage rather than banning it outright, in contrast to Italy's temporary ban on ChatGPT over privacy concerns.

Powell stressed the need for regulation of large language models, which can be widely applied across various AI tools. Her sentiment aligns with that of U.S. Senator Lindsey Graham, who has suggested establishing an agency to grant and revoke licenses for AI developers—an idea supported by OpenAI CEO Sam Altman. Altman went further, recommending the creation of a federal agency responsible for setting industry standards and ensuring safety compliance.

The comparison between AI and nuclear technology is not new, as famed investor Warren Buffett has previously drawn a parallel between AI and the atomic bomb. Furthermore, Geoffrey Hinton, a renowned AI pioneer, recently resigned from Google to actively voice his concerns about the potential dangers of AI. These sentiments were echoed by the Center for AI Safety, which stressed the need for global attention to mitigate the risks of AI, placing it on par with other existential threats such as pandemics and nuclear war.

The rapid advancement and deployment of AI technology have also raised concerns about bias, discrimination, and surveillance. Powell believes these issues can be addressed by requiring developers to be transparent about their data usage. She argues that, given the fast pace of AI progress, a proactive, interventionist government approach is necessary to ensure safe and responsible development, as opposed to a laissez-faire one.

As AI continues to transform various aspects of society, the proposal for regulation and licensing presents a critical step toward a safer future. By implementing measures to monitor AI development and usage, policymakers aim to strike a balance between innovation and the protection of individuals and society.
