
Volume Two

Welcome

If 2025 was the year of generation, January 2026 has officially kicked off the year of control. Things are slowing down; we’re thinking a bit harder about what AI will look like, and how we can keep it safe.

Here is what has happened so far this month.



The Tech

The Consumer Electronics Show (Jan 6–9) wrapped up in Vegas. If you aren’t aware, this is like the Oscars for tech nerds. The headline wasn’t a shiny new model, but where AI is living: a massive pivot to ‘Invisible AI’, with companies putting their agents directly into hardware. Samsung and Lenovo both showcased ‘Device Experience’ ecosystems. Samsung’s is built around Multi Control. If you own a Galaxy Book and an S26, the devices now share a clipboard and mouse so seamlessly they feel like one computer. For example, if you’re on a location scout and take photos on your phone, you don’t need to email them to yourself. You just sit at your laptop, move your mouse cursor off your screen and onto your phone, and drag the files directly into your edit timeline. It’s seamless, but it only works if you stay 100% Samsung.

Lenovo, on the other hand, showcased its new Smart Connect ecosystem (featured on the Aura Edition laptops), which allows drag-and-drop file sharing between its PCs, iPhones, and Macs. The Smart Share feature is particularly impressive: you just tap your phone against the side of the laptop screen to instantly trigger a transfer. Once the footage is there, the laptop’s NPU (Neural Processing Unit) lets you use AI tools to sort, tag, or rename the files locally at lightning speed, without bogging down the cloud.

January saw the wider rollout of Adobe’s ‘Firefly Video Editor’, powered by Runway’s Aleph model. Previously, if you generated a shot and the hands were weird (seven fingers and no thumb), you had to re-roll the dice and generate a whole new clip. This month we finally have a functional ‘Prompt-to-Edit’ option: circle a glitchy hand, type ‘fix fingers’, and it keeps the rest of the shot exactly the same.

It was confirmed that Netflix’s new production Las Muertas was graded using the new DaVinci Resolve AI workflow. For example, if the colorist clicks an actor’s face once, the AI (specifically the Magic Mask tool) understands "OK, this is a human face" and tracks it perfectly through shadows, head turns, and hair flips. We’re also seeing features like ‘IntelliScript’ (which builds a timeline from a script PDF) being used on major sets. You upload the script PDF and the video files into DaVinci Resolve; the AI "listens" to the audio in the video files, matches the spoken dialogue to the text in your PDF, and automatically builds the timeline for you.

The Music

The line between art and content keeps blurring. Meet Sienna Rose and Jacub. Rose, an anonymous Spotify musician, recently hit 5 million streams on a single track, reportedly pulling in roughly AU$4,000 a week in royalties. In Sweden, a chart-topper named "Jacub" was recently banned after it was revealed the artist didn’t exist. But this is where it gets tricky. The audience reaction suggests that ‘ignorance is bliss’ might be a legitimate business strategy. Before Sienna was outed as AI, she was slipping into playlists next to Norah Jones, and even got shared by Selena Gomez. But the second the truth came out, fans branded her music soulless slop and felt cheated. It raises a massive question for 2026: do audiences actually care about the sound, or do they care about the soul? The data suggests that while people hate the idea of AI music, they might not be able to hear the difference until you tell them.

The Regulation

On January 9, the European Commission closed the door on the biggest loophole in AI: the ‘ignorance defense.’ For years, AI companies have operated on a ‘don't ask, don't tell’ basis regarding their training data. They scrape billions of images from the internet, and when a specific artist complains, they essentially say, "We have so much data, we couldn't possibly know your specific work was in there." The EU’s new machine-readable protocols change this legally, forcing companies to keep "receipts" for everything they scrape. They can no longer claim they "didn't know" they were using copyrighted work. If they can’t list it, they can’t use it.

While we talk about copyright and money, January also highlighted the darker danger of unregulated AI. On January 22, Paris Hilton appeared on Capitol Hill alongside Alexandria Ocasio-Cortez to push for the DEFIANCE Act. Hilton, who described herself as "the original target of non-consensual media" (referencing her 2004 leaked tape), is using her platform to fight the explosion of AI-generated deepfake pornography. Her point: AI isn’t just stealing art; it’s stealing identity. The DEFIANCE Act would allow victims to sue those who create or distribute "digital forgeries" of them. If you use "wild west" AI tools with no guardrails (like the recently under-fire Grok), you are supporting an ecosystem that exploits real people.

Lastly, we are watching the SAG-AFTRA negotiations closely this month as they begin to enforce ‘Digital Replica’ riders in commercial contracts. If you are casting talent for a February shoot, double-check your rider: if you plan to use AI to fix their dubbing later, you now need explicit, separate consent for that. Technically, SAG is American, but since its 'Global Rule One' follows actors home to Australia, and our local union (the MEAA) typically adopts its standards, these US rules effectively set the 'premium' standard for our industry.

The Producers © 2026
