
HeyGen just open-sourced HyperFrames, and it hit 13,000 stars in seven weeks. That's fast for a video rendering framework.
The thing that makes it different: it's built for AI agents, not humans. You describe what you want in plain text, and the agent writes HTML. Then HyperFrames renders that HTML into video. No React. No build step. Just index.html that plays as-is.
The Remotion comparison matters here. Remotion requires React components: if you want to make a video, you write TSX, bundle it, and wait for the build. HyperFrames skips all that. You paste in an HTML snippet with data attributes, hit preview, and it renders. The same input always gives the same output, and that deterministic rendering is useful when you're automating video pipelines at scale.
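To make that concrete, here is a minimal sketch of what such a snippet could look like. The attribute names (data-duration, data-fps) are hypothetical placeholders, not HyperFrames' documented API:

```html
<!-- Hypothetical sketch: data-duration and data-fps are placeholder
     attribute names, not confirmed HyperFrames API. -->
<div class="scene" data-duration="10" data-fps="30">
  <h1 class="title">Product Intro</h1>
</div>
```

The point is the shape of the workflow: plain markup plus declarative metadata, with no component model or build step between you and the renderer.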
The license is also a real difference. Remotion is source-available, not open source: you need a paid company license once your team grows past Remotion's free-tier size threshold. HyperFrames is Apache 2.0. OSI-approved. Use it commercially at any scale with no per-render fees or seat caps.
HeyGen ships skills for Claude Code, Cursor, Gemini CLI, and Codex. The agent loads the skill and learns the framework's patterns. Then you can say "create a 10-second product intro with fade-in title" and it produces valid HTML with GSAP animations.
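For that fade-in prompt, the generated file might look like the sketch below. It is hand-written for illustration, not actual agent output: the data attributes are hypothetical placeholders, while gsap.from is genuine GSAP API (it animates from the given values to the element's natural state).

```html
<!-- Illustrative sketch, not real agent output. The data-* attributes are
     hypothetical; gsap.from is real GSAP API. -->
<!doctype html>
<html>
  <head>
    <script src="gsap.min.js"></script> <!-- GSAP runtime, local or via CDN -->
  </head>
  <body data-duration="10" data-fps="30">
    <h1 class="title">Product Intro</h1>
    <script>
      // Fade the title in over the first second of the 10-second clip.
      gsap.from('.title', { opacity: 0, duration: 1, ease: 'power2.out' });
    </script>
  </body>
</html>
```

This is exactly the kind of output an LLM produces reliably: flat markup, one selector, one animation call, no state to manage.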
The repo has 50+ ready-to-use blocks. Social overlays, shader transitions, data visualizations, cinematic effects. You install them with npx hyperframes add flash-through-white and they drop into your composition.
HeyGen makes AI video products. They used Remotion in production for years. They learned from it. The source code has attribution comments for patterns Remotion pioneered. But they switched to HTML because agents already speak HTML fluently. Teaching an agent to write valid React components with proper state management takes longer. HTML is simpler for the agent to generate correctly.
The repo was created on March 10, 2026. Last commit was today. 33 open issues, 1,142 forks. Active development.
Technical requirements: Node.js 22+ and FFmpeg. The engine drives headless Chrome through Puppeteer to capture frames and uses FFmpeg to encode them. GSAP and Anime.js animations are captured frame-accurately, or you can bring your own animation runtime.
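The capture-and-encode idea can be sketched in a few lines. This is the generic technique (seek the animation timeline to each frame's timestamp, screenshot it, hand the numbered PNGs to FFmpeg), not HyperFrames' actual internals; the helper names are made up, the Puppeteer calls appear only in comments, and the FFmpeg flags are standard FFmpeg, not framework-specific.

```javascript
// Sketch of a deterministic capture pipeline. Helper names are hypothetical;
// only the FFmpeg flags and the commented Puppeteer/GSAP calls are real APIs.

// Total frames for a composition of `seconds` at `fps`.
function frameCount(seconds, fps) {
  return Math.round(seconds * fps);
}

// Timestamp to seek the animation timeline to before capturing frame i.
// Seeking (instead of letting wall-clock time run) is what makes the
// same input always produce the same output.
function frameTime(i, fps) {
  return i / fps;
}

// FFmpeg arguments to encode numbered frames into an H.264 MP4.
function encodeArgs(frameDir, fps, outFile) {
  return [
    '-y',
    '-framerate', String(fps),
    '-i', `${frameDir}/frame_%05d.png`,
    '-c:v', 'libx264',
    '-pix_fmt', 'yuv420p',
    outFile,
  ];
}

// Conceptual capture loop (Puppeteer page, GSAP timeline in the page):
// for (let i = 0; i < frameCount(10, 30); i++) {
//   await page.evaluate(t => tl.seek(t), frameTime(i, 30));
//   await page.screenshot({ path: `frames/frame_${String(i).padStart(5, '0')}.png` });
// }
// Then: spawn('ffmpeg', encodeArgs('frames', 30, 'out.mp4'))

console.log(frameCount(10, 30)); // 300
```

A 10-second clip at 30 fps is 300 captures, which is why deterministic seeking matters: any dropped or drifted frame would change the output between runs.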
If you're building automated video pipelines, this is worth a look. The HTML-first approach is simpler than React for agent-driven workflows.
Sources:
https://github.com/heygen-com/hyperframes
https://hyperframes.heygen.com/introduction
https://www.reddit.com/r/LovingOpenSourceAI/comments/1k8lfg/say_goodbye_to_remotion_heygen_just_open_sourced/
https://www.npmjs.com/package/hyperframes