
California lawmaker proposes legislation to shield actors from AI clones


California Assembly Member Ash Kalra has described as a “common sense requirement” a proposed bill designed to protect actors, artists, and entertainers from the encroachment of artificial intelligence. The bill mandates informed consent clauses in employment contracts that involve the digital replication of individuals. Kalra argues that generative AI poses a genuine threat to professionals working in the entertainment industry, and contends that its use should be permitted only under a collective bargaining agreement negotiated between the parties involved.

The bill, formally known as Assembly Bill 459, will next be referred to a committee of lawmakers, who will research, debate, and amend Kalra’s proposal. The legislation will then proceed to a vote in the legislative chamber.

Kalra explained the rationale behind the bill in a September 13 statement, saying such “common sense requirements” are a necessary shield for these workers. He emphasized that informed consent and due representation are paramount because they prevent individuals from unwittingly signing away ownership of their digital likenesses, and with it their careers and livelihoods.

The bill has been endorsed by the Screen Actors Guild – American Federation of Television and Radio Artists (SAG-AFTRA), a labor union representing more than 100,000 media professionals worldwide. Duncan Crabtree-Ireland, the union’s national executive director and chief negotiator, underscores the need to protect an actor’s digital likeness through consent-based legal provisions. He asserts that members must retain full control over how their digital identities are used, as this autonomy is pivotal to sustaining and advancing their careers. Crabtree-Ireland warns that AI-generated replicas could give rise to “abusive” and “exploitative” practices, and he contends that legislation plays an indispensable role in mitigating these perils. In his words, “We perceive the protection against unwarranted transfers of these rights as an imperative measure against potential misconduct or exploitative practices. The proliferation of AI-generated audio and video content without explicit consent deeply concerns us, and this legislation is a pivotal step in eradicating these perilous practices.”

SAG-AFTRA has been engaged in a nearly four-month-long strike in Hollywood, demanding better compensation, improved working conditions, and resolutions to other contentious issues. Central to its concerns are the use of artificial intelligence, the need for more stringent safeguards, and greater royalties for creative work, commonly referred to as residuals.

In a recent interview with Variety, acclaimed U.S. actor Sean Penn criticized the eagerness of many studios to exploit the likenesses and vocal signatures of actors for future AI applications. He wryly remarked, “So, you desire access to my scans and vocal data? Well, here’s a notion of fairness: I request access to your daughter’s data, for I aspire to craft a virtual replica of her and host virtual soirées with my acquaintances at our leisure.”
