Meshcapade, a leader in 3D human modeling, has significantly advanced its markerless motion capture technology, MoCapade, with the release of version 3.0. The update enables simultaneous tracking of 3D camera movement and detailed human pose estimation from a single video source. Tech enthusiast Bilawal Sidhu highlighted the capability in a recent tweet: "Meshcapade can now pull apart both 3D camera tracking + human pose estimation data. Effectively like Wonder Dynamics / Autodesk Flow Studio at this point."
The new capabilities position Meshcapade's offering alongside established industry tools, making it a powerful AI model for 3D artists and for integration into video-to-video workflows. MoCapade 3.0 introduces multi-person capture, allowing processing of up to four individuals, and supports the export of estimated 3D camera trajectories. This feature, highly requested by 3D animation artists and AI researchers, exports the estimated camera trajectory in world coordinates alongside the captured human motion.
Meshcapade's official announcement for MoCapade 3.0 detailed additional enhancements, including more detailed hand articulation and gestures, and new 3D and video export options such as GLB and MP4. The company also introduced a new .SMPL file format, making its industry-standard parametric 3D body model more accessible to commercial users and enabling integration with platforms such as Unreal Engine, Roblox, and Fortnite.
The technology is built on Meshcapade's foundational SMPL (Skinned Multi-Person Linear Model) and SMPL-X models, which are widely recognized in academia and industry for creating realistic 3D avatars with accurate body shape and motion. These advancements streamline the creation of digital humans for applications ranging from gaming and fashion to robotics and healthcare. The company also announced new investors, including HV Capital, LBBW VC, Luc Vincent, and Jeff Dean, signaling strong confidence in its continued innovation in the 3D embodiment space.
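The "linear" in SMPL's name refers to how the model deforms a template mesh: vertex positions are a linear function of learned shape (and pose) coefficients. The following is a minimal NumPy sketch of that idea only, with made-up dimensions and random placeholder data; the real SMPL model ships learned blend shapes, a 6,890-vertex template, and pose-dependent corrective terms not shown here.

```python
import numpy as np

# Toy illustration of a linear parametric body model in the spirit of SMPL.
# All values are random placeholders; real SMPL uses learned parameters,
# 6,890 vertices, and ~10 shape coefficients (betas).
N_VERTS = 5   # toy vertex count
N_BETAS = 3   # toy number of shape coefficients

rng = np.random.default_rng(0)
template = rng.normal(size=(N_VERTS, 3))              # mean body shape T
shape_dirs = rng.normal(size=(N_VERTS, 3, N_BETAS))   # shape blend shapes S

def shaped_vertices(betas):
    """V(beta) = T + S @ beta -- vertices are linear in the shape coefficients."""
    return template + shape_dirs @ betas

# Zero coefficients recover the mean (template) shape.
v_mean = shaped_vertices(np.zeros(N_BETAS))
assert np.allclose(v_mean, template)
```

Because the mapping is linear, shape offsets scale and add predictably, which is what makes fitting such a model to video observations tractable.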