
Volume One

Welcome

Hello! I’m Pablo Alfierri, the Creative AI Lead here at The Producers. Welcome to the very first edition of THE LOWDOWN. We’re launching this newsletter to give you a clear, honest look at how artificial intelligence is actually evolving in our world. As a production company deeply rooted in live-action projects, we’re seeing the landscape shift every single week. This is our space to share what’s working (and what isn’t), and how things are changing.

Since it’s January, let’s cover the year that changed everything. Here is the 2025 AI year-in-review.


The Regulation

In 2025, we saw over 50 major federal lawsuits filed against AI developers. The biggest shockwave? A group of 14 major publishers (including The Guardian and The Atlantic) sued Cohere, resulting in a $1.5 billion settlement. Locally, the Australian Government officially ruled out any "copyright exemptions" for AI training, confirming a creatives-first approach. For us, this means the industry is finally moving toward mandatory licensing, protecting the human artists we work with.

We also saw new language added to our vernacular. The term “AI Slop” officially entered the Macquarie Dictionary this year. It refers to the flood of low-quality, unpolished AI content filling our feeds. We saw major brands like Coca-Cola face heat for "glitchy" AI-generated holiday ads where trucks morphed and lighting didn't match. The Lesson: Audiences have a high "BS-meter" for lazy AI. If it doesn't have a director’s vision behind it, it’s often just noise.

And lastly, the NO FAKES Act (which stands for Nurture Originals, Foster Art, and Keep Entertainment Safe) was introduced. Think of it as a digital right to your own voice and likeness that lasts for your entire life plus 70 years. For a production company like ours, this is actually a huge win for clarity: it establishes a clear "Notice and Takedown" system similar to how copyright works on YouTube. If someone uses an unauthorised AI replica of a performer or a brand ambassador without making an effort to get consent, the penalties run up to $1 million in damages. It means the "Wild West" of using famous voices for scratch tracks or mood reels without a licence is officially coming to an end.

Europe has taken the lead here. From August 2, 2026, it will be mandatory to explicitly label any AI-generated content that could be mistaken for real life, especially deepfakes. This is pushing the industry toward a global standard called C2PA (Content Credentials).

The Tech

The "Big Three" of video generation evolved into production-ready platforms. After months of anticipation, Sora 2 officially launched in September. The big takeaway for us? It wasn't just the 1080p quality; it was the "Character Cameo" feature, allowing us to maintain consistent characters across different shots.

In December, Runway dropped Gen-4.5. It’s significantly faster and, more importantly, introduces advanced Speech-to-Speech and SFX generation, meaning we can now generate synchronised sound and foley alongside the visuals in a single workflow. In practice, you (or a voice actor) record a "scratch track" with the exact cadence, emphasis, and emotion you want, and the S2S tool then "skins" that performance with a high-end brand voice. It preserves the human performance (the timing of a joke, the breathiness of a luxury ad) while giving you professional "final" voice quality instantly. No more robotic delivery.

Luma’s new "Modify" feature (released Dec 18) changed the game for VFX. Instead of hours of manual masking, we can now use text prompts to swap outfits, remove objects, or restyle specific parts of a shot. It tracks the original movement and lighting perfectly, it’s basically "Text-to-VFX."

In a massive shift for IP and the world of production, The Walt Disney Company announced a $1 billion partnership with OpenAI in December. This deal allows for the generation of iconic characters (Marvel, Star Wars, Pixar) within the Sora ecosystem under strict licensing. This signals a future where "official" brand assets are integrated directly into generative tools, making high-end commercial remixes much more accessible.

The Agent

We heard a lot about “Generative AI” in 2025. I predict we’ll hear a lot more about “Agentic AI” in 2026. While Generative AI is like a very smart intern who waits for you to tell them exactly what to write or draw, Agentic AI is the workmate who understands the goal and goes off to make it happen. An "agent" doesn't just respond to a prompt; it reasons, plans, and executes multi-step workflows. It can access your calendar, browse the web for location permits, cross-reference them with weather reports, and then draft an email to the crew, all without you having to babysit every step. In short, we’re moving from an era where we prompt AI to an era where we task it.

2025 was the year the training wheels came off. We saw a massive shift from chatbots to "Agentic Ecosystems," where multiple AI agents talk to each other to solve complex problems. In 2026 and beyond, Agentic AI will only continue to evolve. Imagine a dedicated "Spam Agent"; let’s call it Rex. Rex reads all of your emails in seconds, separates the spam from the important ones, and files everything accordingly. It can even flag anything you’ve missed and send you reminders, all in the background.
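To make the "Rex" idea concrete, here is a minimal, hypothetical sketch of that triage loop in Python. Everything here is illustrative: in a real agentic setup the classify step would call a language model and the inbox would come from a mail API, but the stubbed-out version shows the shape of "task it, then let it run in the background."

```python
# Hypothetical sketch of a "Rex"-style spam agent (not a real product or API).
# classify() is a stand-in for an LLM call; the inbox is hard-coded sample data.

SPAM_SIGNALS = ("you have won", "limited offer", "act now")

def classify(email: dict) -> str:
    """Stand-in for the model call: label an email 'spam' or 'important'."""
    text = (email["subject"] + " " + email["body"]).lower()
    return "spam" if any(signal in text for signal in SPAM_SIGNALS) else "important"

def run_agent(inbox: list[dict]) -> dict:
    """One background pass: sort the inbox and surface reminders for unread mail."""
    folders = {"spam": [], "important": []}
    reminders = []
    for email in inbox:
        label = classify(email)
        folders[label].append(email)
        # Rex nudges you about anything important you haven't opened yet.
        if label == "important" and not email.get("read", False):
            reminders.append(f"Unread: {email['subject']}")
    return {"folders": folders, "reminders": reminders}

inbox = [
    {"subject": "You have WON a prize", "body": "Act now!", "read": False},
    {"subject": "Shoot schedule update", "body": "Call time is 7am.", "read": False},
]
result = run_agent(inbox)
```

The point of the sketch is the division of labour: you define the goal (keep my inbox clean, don’t let me miss anything), and the agent loops over the work without further prompting.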

As we head into 2026, the trend is clear: AI is not going anywhere, but the "Wild West" era is officially closing. We’re stepping into a much more mature, regulated, and, honestly, more exciting chapter. Not only will we see harsher laws and regulations surrounding AI content, but we’ll also see a massive leap in what these tools can actually do for us in a professional production environment.

Stay tuned for the next issue.

THE LOWDOWN - VOL.2

The Producers © 2026
