philip lelyveld The world of entertainment technology

22Jul/24

Will.i.am on vulnerability, AI, and the future of music

... The 49-year-old hitmaker emphasised that it was down to musicians to shape AI's use proactively, but insisted that no amount of AI in the industry would take away from the vulnerability required to make good music. ...

See the full story here: https://www.independent.co.uk/arts-entertainment/music/news/william-ai-black-eyed-peas-exit-festival-b2582668.html

21Jul/24

The Data That Powers A.I. Is Disappearing Fast

...

Yacine Jernite, a machine learning researcher at Hugging Face, a company that provides tools and data to A.I. developers, characterized the consent crisis as a natural response to the A.I. industry’s aggressive data-gathering practices.

“Unsurprisingly, we’re seeing blowback from data creators after the text, images and videos they’ve shared online are used to develop commercial systems that sometimes directly threaten their livelihoods,” he said.

But he cautioned that if all A.I. training data needed to be obtained through licensing deals, it would exclude “researchers and civil society from participating in the governance of the technology.”

...

A.I. companies have claimed that their use of public web data is legally protected under fair use. But gathering new data has gotten trickier. Some A.I. executives I’ve spoken to worry about hitting the “data wall” — their term for the point at which all of the training data on the public internet has been exhausted, and the rest has been hidden behind paywalls, blocked by robots.txt or locked up in exclusive deals. ...

But there’s also a lesson here for big A.I. companies, who have treated the internet as an all-you-can-eat data buffet for years, without giving the owners of that data much of value in return. Eventually, if you take advantage of the web, the web will start shutting its doors. ...

See the full story here: https://www.nytimes.com/2024/07/19/technology/ai-data-restrictions.html

18Jul/24

Why Olympic venues are using digital twins

See the 4-minute video here: https://www.bbc.com/reel/video/p0jbsxq6/why-olympic-venues-are-using-digital-twins

17Jul/24

Hacker group says it leaked Disney data over the company’s ‘approach to AI’

A group of hackers says it recently leaked internal communications at Walt Disney Co. over the company’s handling of “artist contracts, its approach to AI, and its pretty blatant disregard for the consumer.” ...

California lawmakers are also trying to regulate AI through legislation, and tech companies have responded by urging caution against overregulation. ...

See the full story here: https://www.latimes.com/entertainment-arts/business/story/2024-07-16/disney-leak-hack-nullbulge-ai-artificial-intelligence

16Jul/24

These IATSE Artists Are Voting ‘No’ on Their Next Contract, and AI Is to Blame

...

While actors “were able to maintain the rights to their own images and identities and to choose to be replicated or not,” the memo read, “in the new contract language, we have not been given any protections relating to our individual processes when designing, building models, illustrating or creating documents.” ...

Zach Berger, the lead creature designer on “Avatar: The Way of Water,” also published a social media thread outlining why he was voting against the contract. Berger explained he was part of the ADG’s AI task force, which constructed “proposals that both acknowledged AI’s proliferation, but attempted to protect members’ jobs.”

To his dismay, he did not find any trace of the proposals the task force had assembled in the tentative agreement. ...

“If a producer wants to effectively pre-design the movie for themselves before it even gets to an art department…they want to have the power to do that,” Saunders said. “It’s a huge, huge, huge cost savings. I don’t believe for a second that it’s an accident that stuff was left out.” ...

“We can’t say, ‘Well, this work would normally have taken 10 illustrators three months, and now you’re having two illustrators do it in three weeks,’” Saunders said. “The displacement doesn’t come from the prompts. The selling point of these AI systems is that it means fewer people need to be hired to start with, and that will lead to the credits lists getting shorter on films.” ...

See the full story here: https://www.yahoo.com/entertainment/iatse-artists-voting-no-next-132200635.html

15Jul/24

IT’S THE CREATOR ECONOMY

... By Hollywood standards, Dax Shepard is not a huge star. He started this podcast in his attic, on a lark, on his own, after a recent movie, CHiPS, failed. I don’t italicize “on his own” because starting a podcast is either hard or brave. Five million available podcasts is proof that starting a podcast is neither of those. I use italics because podcasting is the perfect example of a large and influential Media segment that almost no one thinks is part of the Creator Economy. And that proves the core thesis of this piece: The Creator Economy is both very different and much bigger than most people think. ...

Yes, Joe Rogan is a millionaire. But he wasn’t when he started his podcast - now the biggest in the world - and he was making $30 million per year on that indie podcast long before Spotify came along. Which is precisely why Spotify wanted him so badly, and why so many huge Media companies are clamoring to do enormous deals with similar Creators like the Smartless guys. They know something I’ve been professing for a while: Increasingly, Creators are generating audiences just as large as, and far more engaged than, gatekeeper-led content. ...

See the full story here: https://eshap.substack.com/p/its-the-creator-economy

12Jul/24

Here’s how OpenAI will determine how powerful its AI systems are

OpenAI has created an internal scale to track the progress its large language models are making toward artificial general intelligence, or AI with human-like intelligence, a spokesperson told Bloomberg.

Today’s chatbots, like ChatGPT, are at Level 1. OpenAI claims it is nearing Level 2, defined as a system that can solve basic problems at the level of a person with a PhD. Level 3 refers to AI agents capable of taking actions on a user’s behalf. Level 4 involves AI that can create new innovations. Level 5, the final step to achieving AGI, is AI that can perform the work of entire organizations of people. ...

In May, OpenAI dissolved its safety team after the group’s leader, OpenAI cofounder Ilya Sutskever, left the company. Jan Leike, a key OpenAI researcher, resigned shortly after claiming in a post that “safety culture and processes have taken a backseat to shiny products” at the company. While OpenAI denied that was the case, some are concerned about what this means if the company does in fact reach AGI. ...

See the full story here: https://www.theverge.com/2024/7/11/24196746/heres-how-openai-will-determine-how-powerful-its-ai-systems-are

12Jul/24

The AI-focused COPIED Act would make removing digital watermarks illegal

PhilNote: I like that it appears to have teeth, but there is a good chance that, as a slow-moving NIST process, by the time it is developed and deployed, industry will say the market is too big to implement it and it is unenforceable anyway. Oh yeah, also it will "stifle innovation."

A bipartisan group of senators introduced a new bill to make it easier to authenticate and detect artificial intelligence-generated content and protect journalists and artists from having their work gobbled up by AI models without their permission.

The Content Origin Protection and Integrity from Edited and Deepfaked Media Act (COPIED Act) would direct the National Institute of Standards and Technology (NIST) to create standards and guidelines that help prove the origin of content and detect synthetic content, like through watermarking. It also directs the agency to create security measures to prevent tampering and requires AI tools for creative or journalistic content to let users attach information about their origin and prohibit that information from being removed. Under the bill, such content also could not be used to train AI models.

Content owners, including broadcasters, artists, and newspapers, could sue companies they believe used their materials without permission or tampered with authentication markers. State attorneys general and the Federal Trade Commission could also enforce the bill, which its backers say prohibits anyone from “removing, disabling, or tampering with content provenance information” outside of an exception for some security research purposes. ...
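For illustration only: the bill itself does not prescribe a mechanism, but the kind of tamper-evident provenance scheme NIST might standardize typically binds a signed origin manifest to the content, so that either editing the content or altering the provenance information breaks verification. A minimal Python sketch of that idea, using a keyed HMAC in place of the public-key signatures a real standard would use (all names and the key are hypothetical):

```python
import hashlib
import hmac
import json

# Hypothetical signing key; a real provenance standard would use PKI,
# not a shared secret.
SECRET_KEY = b"demo-signing-key"

def attach_provenance(content: bytes, origin: dict) -> dict:
    """Build a manifest binding origin info to a digest of the content."""
    manifest = {
        "origin": origin,
        "sha256": hashlib.sha256(content).hexdigest(),
    }
    payload = json.dumps(manifest, sort_keys=True).encode()
    manifest["sig"] = hmac.new(SECRET_KEY, payload, hashlib.sha256).hexdigest()
    return manifest

def verify_provenance(content: bytes, manifest: dict) -> bool:
    """Return False if the content or its provenance info was tampered with."""
    unsigned = {k: v for k, v in manifest.items() if k != "sig"}
    payload = json.dumps(unsigned, sort_keys=True).encode()
    expected = hmac.new(SECRET_KEY, payload, hashlib.sha256).hexdigest()
    return (
        hmac.compare_digest(manifest.get("sig", ""), expected)
        and unsigned.get("sha256") == hashlib.sha256(content).hexdigest()
    )
```

Under a scheme like this, the conduct the bill targets (stripping or altering provenance information) is detectable: a verifier sees a signature mismatch rather than silently accepting the modified content.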

See the full story here: https://www.theverge.com/2024/7/11/24196769/copied-act-cantwell-blackburn-heinrich-ai-journalists-artists?mc_cid=442e730f2e&mc_eid=3ce5196977

12Jul/24

OpenAI’s new superalignment

...

OpenAI's new superalignment team, which (over the next four years) will dedicate 20% of OpenAI’s compute resources to solving alignment challenges, will be co-led by Ilya Sutskever and Jan Leike. The team will focus on developing scalable training methods, validating alignment models, and conducting adversarial testing to ensure the AI systems align with human intent and do not go rogue.

Additionally, OpenAI is collaborating with industry leaders like Anthropic, Google, and Microsoft through the Frontier Model Forum. This initiative aims to advance AI safety research, identify best practices, and facilitate information sharing among policymakers, academia, and civil society. The Forum will focus on developing standardized evaluations and benchmarks for frontier AI models to ensure their responsible development and deployment. ...

July 12, 2024 issue https://shellypalmer.com/blog/

12Jul/24

Will K-pop’s AI experiment pay off?

...

The music video features an AI-generated scene, and the record might well include AI-generated lyrics too. At the launch of the album in Seoul, one of the band members, Woozi, told reporters he was "experimenting" with AI when songwriting.

“We practised making songs with AI, as we want to develop along with technology rather than complain about it," he said.

...

Her worry, though, is that a whole album of AI-generated lyrics means fans will lose touch with their favourite musicians.

"I love it when music is a reflection of an artist and their emotions," she says. "K-pop artists are much more respected when they’re hands on with choreographing, lyric writing and composing, because you get a piece of their thoughts and feelings. ...

“What I've learned by hanging out in Seoul is that Koreans are big on innovation, and they're very big on ‘what's the next thing?’, and asking, ‘how can we be one step ahead?’ It really hit me when I was there,” he says.

“So, to me, it's no surprise that they're implementing AI in lyric writing, it's about keeping up with technology.” ...

See the full story here: https://www.bbc.com/news/articles/c4ngr3r0914o