1) Runway launched Characters for real-time brand avatars (Mar 9, 2026)
What happened: Runway announced Runway Characters, a real-time video agent API that can generate conversational avatars from a single image, with controls for voice, personality, and behavior.
Why it matters for ads and film: This lowers the barrier for interactive brand spokespeople, product explainers, and campaign microsites that need visual consistency and live conversational capability.
Source: Runway Characters announcement →
2) Runway created Labs to incubate next-gen video products (Mar 11, 2026)
What happened: Runway introduced Runway Labs, an internal incubator focused on new applications for generative video and world models across sectors including film and advertising.
Why it matters for ads and film: Teams should expect faster product experimentation and more workflow-specific tools for pre-vis, concepting, and campaign iteration.
Source: Runway Labs announcement →
3) OpenAI accelerated the Sora transition (updated this week)
What happened: OpenAI Help Center documentation now states that the Sora 1 web experience is being deprecated while Sora 2 and the Sora app workflows continue to roll forward.
Why it matters for ads and film: If your creative pipeline depends on Sora 1 web behavior, update internal workflows now to avoid delivery delays and output that no longer matches your assumptions.
Source: OpenAI Sora help update →
4) IAB Tech Lab opened CoMP for public comment (Mar 10, 2026)
What happened: IAB Tech Lab released the CoMP v1.0 specification for public comment, describing a standard way for AI systems and content owners to define commercial terms before crawling and reuse.
Why it matters for ads and film: This could become important infrastructure for IP licensing, training permissions, and content monetization as generative workflows scale across agencies, studios, and publishers.
Source: IAB Tech Lab CoMP press release →
5) Arcads and ElevenLabs reported AI ad voice at scale (late last week, Mar 6)
What happened: Arcads and ElevenLabs reported over 1 billion ad impressions generated with AI UGC videos using ElevenLabs voices.
Why it matters for ads and film: This is a clear signal that synthetic voice is now production-grade in performance marketing, especially for multilingual direct-response video pipelines.
Source: ElevenLabs Arcads case study →
Bottom line for creative teams
- Video: real-time, conversational video is moving from demo to deployable API products.
- Image: image-to-avatar and image-seeded generation remain central to workflow speed and visual consistency.
- Sound: AI voice quality and ad-scale distribution are now proven in high-volume campaign contexts.
We can map model shifts to your campaign roadmap, then package testable concepts you can ship fast.