1) NVIDIA highlighted gen-AI media workflows at GTC (Mar 17, 2026)
What happened: NVIDIA's GTC live updates outlined new AI media workflows across color grading, video localization, and production pipelines, naming tooling partners including Moonvalley, WPP Open, and ElevenLabs.
Why it matters for ads and film: This is a practical signal that multimodal AI stacks are converging into real production paths for studio and agency teams, not just isolated model demos.
Source: NVIDIA GTC 2026 live updates →
2) Runway Characters pushed real-time image-to-avatar video agents (Mar 9, 2026)
What happened: Runway launched Runway Characters, a real-time video agent API that generates conversational avatars from a single image, with control over voice, personality, and actions.
Why it matters for ads and film: It enables branded characters and campaign spokespeople that can stay visually consistent while interacting live in owned channels and interactive experiences.
Source: Runway Characters announcement →
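To make the "single image plus voice, personality, and actions" control surface concrete, here is a minimal sketch of how a team might assemble such a request. Every field name, the helper function, and the request shape are assumptions for illustration only; they are not taken from Runway's announcement, so check the actual Characters API documentation before building against it.

```python
# Hypothetical request builder for a real-time avatar agent session.
# Field names and structure are illustrative assumptions, not Runway's API.

def build_character_request(image_path: str, voice: str,
                            personality: str, actions: list[str]) -> dict:
    """Assemble a request body pairing one reference image with the
    voice, personality, and allowed-action controls the launch describes."""
    return {
        "reference_image": image_path,   # single source image for the avatar
        "voice": voice,                  # e.g. a brand voice preset
        "personality": personality,      # short persona description
        "actions": actions,              # gestures the live agent may perform
        "mode": "realtime",              # live conversational session
    }

request = build_character_request(
    "brand_mascot.png",
    "warm-energetic",
    "Friendly product expert for the spring campaign",
    ["wave", "nod", "point_left"],
)
```

The point of centralizing this in one builder is operational: a campaign team can keep the persona and action set versioned alongside other creative assets, so the character stays consistent across channels.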
3) Runway launched Labs to accelerate next-gen video products (Mar 11, 2026)
What happened: Runway introduced Runway Labs, an internal incubator dedicated to discovering new product applications for generative video and world models.
Why it matters for ads and film: Teams should expect a faster cadence of specialized video tooling for pre-vis, concepting, iterative campaign production and interactive brand formats.
Source: Runway Labs announcement →
4) Google Ads made VRC Non-Skips generally available (week of Mar 16, 2026)
What happened: Google announced global GA for VRC Non-Skip ads in Google Ads and DV360, with AI-driven optimization across 6s, 15s, and 30s CTV non-skippable formats.
Why it matters for ads and film: This increases practical distribution options for AI-assisted creative variations, especially where full-message completion and CTV reach are key campaign goals.
Source: Google Ads VRC Non-Skip announcement →
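For teams generating many AI-assisted cuts, the practical question is which of the three announced non-skip durations each cut should target. The snapping helper below is a planning sketch, an assumption of ours rather than anything in the Google Ads or DV360 APIs; only the 6s/15s/30s durations come from the announcement.

```python
# Illustrative routing of creative cuts to the three VRC non-skip
# durations (6s, 15s, 30s). The helper is an assumption for planning,
# not Google Ads API behavior.

def nearest_vrc_format(duration_s: float) -> int:
    """Snap a cut to the closest supported non-skippable length."""
    return min((6, 15, 30), key=lambda fmt: abs(fmt - duration_s))

cut_lengths = [7.2, 14.0, 25.0]
assigned = [nearest_vrc_format(d) for d in cut_lengths]  # → [6, 15, 30]
```

In practice a team would trim or extend each cut toward its assigned target rather than hard-snapping, but the mapping is a useful first pass when batching variations.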
5) ElevenLabs shipped Scribe v2 Realtime for live transcription (Mar 14, 2026)
What happened: ElevenLabs introduced Scribe v2 Realtime, a low-latency speech-to-text model positioned for live agents, meeting assistants, and real-time captioning workflows.
Why it matters for ads and film: Faster, more accurate live transcription shortens post-production loops for captioning, localization and rapid creative review across distributed teams.
Source: ElevenLabs Scribe v2 Realtime announcement →
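Low-latency transcription pipelines typically stream audio in small fixed-duration frames rather than whole files. The framing helper below is generic plumbing we are assuming for illustration; it shows the chunking math only and does not use the actual Scribe v2 Realtime protocol or the ElevenLabs SDK, whose parameters may differ.

```python
# Illustrative framing of raw PCM audio for a realtime speech-to-text
# stream. Sample rate, frame duration, and sample width are assumptions.

def frame_pcm(audio: bytes, sample_rate: int = 16_000,
              frame_ms: int = 100, sample_width: int = 2) -> list[bytes]:
    """Split raw mono PCM into fixed-duration frames for low-latency
    streaming; the final partial frame is kept rather than dropped."""
    frame_bytes = sample_rate * sample_width * frame_ms // 1000
    return [audio[i:i + frame_bytes]
            for i in range(0, len(audio), frame_bytes)]

# One second of 16 kHz / 16-bit mono audio → ten 100 ms frames.
frames = frame_pcm(b"\x00" * 32_000)
```

Smaller frames lower end-to-end caption latency at the cost of more round trips, which is the core tuning trade-off in any live captioning or localization loop.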
Bottom line for creative teams
- Video: avatar-based real-time video and enterprise media workflows are moving from experimentation to deployment.
- Image: single-image character generation is becoming an operational format for branded interactive experiences.
- Sound: low-latency transcription and voice infrastructure continue to compress turnaround time in campaign production.
We can map model shifts to your campaign roadmap, then package testable concepts you can ship fast.