Technology

Instagram cracks the code on iPhone video brilliance

MUMBAI: Every iPhone video carries a secret ingredient: tiny packets of data that tell screens exactly how bright and vivid to display footage. For three years, Instagram binned them. Not anymore.

Chris Ellsworth, Cosmin Stejerean and Hassene Tmar—engineers at Meta—have cracked a thorny problem that made high dynamic range (HDR) videos look drab on Instagram's iOS app, particularly when viewed in dim light or at low screen brightness. The culprits were two pieces of metadata embedded in iPhone recordings: Dolby Vision, which enhances colour, brightness and contrast, and ambient viewing environment (amve), which adjusts rendering based on lighting conditions.

Since 2022, Instagram has supported HDR video. But its encoding pipeline, built on FFmpeg, an open-source tool, stripped out both metadata types. The result? Pictures that looked nothing like what their creators intended.

The fix required surgical precision across three stages of video processing. First, the team convinced FFmpeg's developers to add amve support in 2024. The data proved remarkably consistent—every frame carried identical values—allowing a quick workaround whilst proper support landed.
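The workaround hinged on that consistency: if every frame carries the same ambient-viewing-environment values, the metadata can be read once and applied to the whole stream. A minimal sketch of that check, in Python — the field name `amve` and the sample values (roughly D65 chromaticity plus an illuminance figure) are illustrative, not Meta's actual code or data:

```python
def hoist_amve(frames):
    """Return the single amve record shared by all frames, or None.

    `frames` is a list of per-frame dicts; the hypothetical 'amve' key holds
    an (ambient_illuminance, chromaticity_x, chromaticity_y) tuple.
    """
    values = {f.get("amve") for f in frames}
    if len(values) == 1:
        return values.pop()  # identical on every frame: safe to hoist stream-wide
    return None  # inconsistent metadata: would need per-frame handling

# Illustrative clip: 90 frames, all carrying the same amve values
frames = [{"amve": (314, 15635, 16450)} for _ in range(90)]
print(hoist_amve(frames))  # (314, 15635, 16450)
```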

Dolby Vision proved trickier. iPhones encode video in HEVC format using profile 8.4, but Instagram delivers AV1 and VP9 codecs instead. Meta collaborated with Dolby and FFmpeg developers to implement profile 10, allowing Dolby Vision metadata to travel within AV1 streams. They also built custom extraction tools to feed metadata into Apple's display layer, since Instagram decodes video independently rather than using Apple's standard player.
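The core of the mismatch is that each Dolby Vision profile is tied to a base codec. A toy lookup, assuming only what the article states (profile 8.4 rides on HEVC, profile 10 on AV1) — the dict layout, function name, and VP9 fallback behaviour are assumptions for illustration:

```python
# Hypothetical mapping: Dolby Vision profile -> base codec it travels in.
DOLBY_VISION_PROFILES = {
    "8.4": {"base_codec": "hevc"},  # iPhone capture format
    "10":  {"base_codec": "av1"},   # lets the metadata ride in AV1 streams
}

def profile_for_codec(codec):
    """Pick a Dolby Vision profile whose base codec matches the delivery codec."""
    for profile, info in DOLBY_VISION_PROFILES.items():
        if info["base_codec"] == codec:
            return profile
    return None  # no matching profile: metadata cannot be carried

print(profile_for_codec("av1"))  # 10
print(profile_for_codec("vp9"))  # None
```

Before profile 10 existed in the pipeline, the AV1 lookup would also have come back empty — which is why transcoding from HEVC silently dropped the metadata.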

Then came a nasty surprise. Initial tests showed viewers watched less video with Dolby Vision enabled. The metadata added roughly 100 kilobits per second to file sizes—enough to slow loading times and send impatient scrollers elsewhere. Meta's solution: implement a compressed format that slashed overhead to 25 kbps, requiring another 2,000 lines of code for compression and decompression.
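A back-of-the-envelope check of those figures: per-frame metadata cost at 100 kbps versus the compressed 25 kbps. The 30 fps frame rate is an assumption — the article gives only the bitrates:

```python
def bytes_per_frame(bitrate_kbps, fps=30):
    """Metadata bytes carried per frame at a given overhead bitrate."""
    return bitrate_kbps * 1000 / 8 / fps  # kilobits/s -> bytes -> per frame

uncompressed = bytes_per_frame(100)  # ~417 bytes of metadata per frame
compressed = bytes_per_frame(25)     # ~104 bytes per frame
print(round(uncompressed), round(compressed))  # 417 104
```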

The second test vindicated the effort. Viewers spent more time watching HDR videos, particularly in low-light conditions where proper metadata reduced eye strain. Instagram for iOS now delivers Dolby Vision metadata on all AV1 encodings derived from iPhone HDR uploads, making it Meta's first app with full support. Facebook Reels is next in line.

The broader web remains a problem child. Browser and display support for Dolby Vision is patchy, meaning most readers cannot see the difference on this page. For that, you will need an iPhone and Instagram.

Three years to fix discarded data. But for anyone squinting at their screen in bed, scrolling through Reels at 2am, it matters rather a lot.
