Luma Labs AI's "Modify Video" tool, part of its Dream Machine platform, recently gained attention through a tweet by Jon Finger demonstrating its generative AI capabilities with "Shark and Capybara tests." The tweet, which included a video of existing footage being transformed, highlighted the company's focus on reimagining environments and styles within a video while preserving the original motion and performance.
The "Modify Video" feature lets users alter elements such as environments, lighting, and textures in a video without compromising the performance, motion, camera work, or characters. The technology is powered by Luma's Ray2, a large-scale generative video model designed for realistic visuals, coherent motion, and logical event sequences. Luma Labs AI asserts that Modify Video outperforms competitors such as Runway's video-to-video tools in preserving motion, facial animation, and temporal consistency.
San Francisco-based Luma Labs AI, founded in 2021 by Amit Jain, Alberto Taiuti, and Alex Yu, is a generative AI startup specializing in multimodal AI intended to expand human imagination and capabilities. The company has raised substantial funding, reported at roughly $173 million to $200 million across multiple rounds, from investors including Andreessen Horowitz, Matrix Partners, Amazon, Nvidia, Amplify Partners, and Constructor Capital. Its most recent rounds include a $90 million Series C in December 2024 and a $10 million round in April 2025.
Luma Labs AI's ambitions extend beyond video modification: the company aims to make photorealistic 3D capture from smartphone photos practical for industries such as e-commerce, real estate, and gaming. Its Dream Machine platform, built on Ray2, is designed to turn text descriptions and images into high-quality videos and 3D models. This positions Luma Labs AI as a significant player in the rapidly evolving generative AI market, offering creative professionals and developers tools to produce immersive digital content efficiently.