
AI

U.S. President Donald Trump hosts his first cabinet meeting with Elon Musk in attendance, in Washington, D.C., U.S., on Feb. 26, 2025.

REUTERS/Brian Snyder

What happens when you ask artificial intelligence to create a video of gilded Trump statues (straight out of Turkmenistan) and new Trump Hotels (straight out of Atlantic City) featuring an up-tempo, pro-Trump track (straight from the J6 Prison Choir’s club remix album)? You get the US president’s Truth Social post advertising his postwar Gaza proposal, of course.

While Donald Trump’s rhetoric on redeveloping Gaza has been absent from headlines recently, this AI music video troll serves as a win-win-win for him: It reinvigorates his base, enrages his opposition, and leaves his true intentions up for debate.

What isn’t up for debate? The included belly dancers with female bodies and bearded male heads wouldn’t appreciate his slew of executive orders enforcing a strict gender binary. Don’t forget to always double-check your AI outputs …

Courtesy of Midjourney

A federal district court judge in Delaware issued the first major ruling on whether using copyrighted materials to train artificial intelligence systems constitutes copyright infringement.

On Feb. 11, Judge Stephanos Bibas granted a summary judgment to Thomson Reuters, which makes the legal research service Westlaw, against a company named Ross Intelligence. The judge found that Ross infringed on Reuters’ copyrights by using Westlaw headnotes — essentially case summaries — to train its own legal research AI.


An image of a firefly from Adobe Firefly.

Adobe Firefly

A floppy-eared, brown-eyed beagle turns her head. A sunbeam shines through the driver’s side window. The dog is outfitted in the finest wide-brimmed sun hat, which fits perfectly atop her little head.

In case the hat-wearing dog weren’t clue enough, I’m describing an AI-generated video. There are other hints, too: If you look closely, the dog is sitting snugly between two black-leather seats, which are way too close together. Outside, cornfields and mountains start to blur, and the road contorts behind the car.

Despite these problems, this is still one of the better text-to-video generation models I’ve encountered. And it’s not from a major AI startup, but rather from Adobe, the company behind Photoshop.

Adobe first released its AI model, Firefly, for image generation in March 2023 and followed it up this month with a video generator, which is still in beta. (You can try out the program for free, but we paid $10 after quickly hitting a limit on how many videos we could generate.)

Firefly’s selling point isn’t just that it makes high-quality video clips or that it integrates with the rest of the Adobe Creative Cloud. Adobe also promises that its AI tools are all extremely copyright-safe. “As part of Adobe’s effort to design Firefly to be commercially safe, we are training our initial commercial Firefly model on Adobe Stock images, openly licensed content, and public domain content where copyright has expired,” the company writes on its website.

In the past, Adobe has also offered to pay the legal bills of any enterprise user of Firefly’s image model that is sued for copyright violations — “as a proof point that we stand behind the commercial safety and readiness of these features,” Adobe’s Claude Alexandre said in 2023. (It’s unclear if any users have taken the company up on the offer.)

eMarketer’s Gadjo Sevilla said that Adobe has a clear selling point amid a fresh crop of video tools from the likes of OpenAI, ByteDance, and Luma: its copyright promises. “Major brands like Dentsu, Gatorade, and Stagwell are already testing Firefly, signaling wider enterprise adoption,” Sevilla said. “Making IP-safe AI available in industry-standard tools can help Firefly, and by extension Adobe, gain widespread adoption in copyright-friendly AI image generation.”

But Adobe’s track record isn’t spotless. The company had a mea culpa last year after AI images from rival Midjourney were found in Firefly’s training set, according to Bloomberg; the images were likely submitted to the Adobe Stock program and slipped past content moderation guardrails.

Firefly’s video model is still new, so public testing will bear out how well it’s received and what exactly users get it to spit out. For our trial, we prompted it with “an extreme close-up of a flower” and tried out camera settings for an aerial shot and an extreme close-up.

We also asked Firefly to show us President Donald Trump waving to a crowd. It wouldn’t show us Trump because of content rules around politics but gave us some other guy.

And, of course, we asked to see if Mickey Mouse — who is at least partly in the public domain — could ride a bicycle. At least on that front, it’s copyright-safe. You’re welcome, Disney.

When compared to OpenAI’s Sora video generator, Firefly takes longer (about 30 seconds vs. 15 for Sora) and is not quite as polished. But if I get into trouble using Adobe’s products, well, at least a quick call to their general counsel’s office should solve my problems.

Security cameras representing surveillance.

Photo by Lianhao Qu on Unsplash

On Friday, OpenAI announced that it had uncovered a Chinese AI surveillance tool. The tool, which OpenAI called Peer Review, was developed to gather real-time data on anti-Chinese posts on social media.


Then-Republican presidential candidate Donald Trump gestures and declares "You're fired!" at a rally in New Hampshire in 2015.

REUTERS/Dominick Reuter

Sweeping cuts are expected to come to the US National Institute of Standards and Technology, or NIST, the federal lab housed within the Department of Commerce. NIST oversees, among other things, chips and artificial intelligence technology. The Trump administration is reportedly preparing to terminate as many as 500 of NIST’s probationary employees.


President Joe Biden signs an executive order about artificial intelligence as Vice President Kamala Harris looks on at the White House on Oct. 30, 2023.

REUTERS/Leah Millis

US President Joe Biden on Monday signed an expansive executive order about artificial intelligence, ordering a bevy of government agencies to set new rules and standards for developers with regard to safety, privacy, and fraud. Under the Defense Production Act, the administration will require AI developers to share safety and testing data for the models they’re training, in the name of protecting national and economic security. The government will also develop guidelines for watermarking AI-generated content and fresh standards to protect against “chemical, biological, radiological, nuclear, and cybersecurity risks.”


Capitol Hill, Washington, D.C.

On Thursday, Meta public policy director Chris Yiu told attendees at a tech event in Stockholm, Sweden, that the Meta AI-enabled Ray-Ban smart glasses have been slow to come out in Europe because of stringent regulations on the continent.
