DragAnything: Motion Control for Anything Using Entity Representation
European Conference on Computer Vision, 2025 · Springer
Abstract
We introduce DragAnything, which utilizes an entity representation to achieve motion control for any object in controllable video generation. Compared to existing motion control methods, DragAnything offers several advantages. Firstly, trajectory-based control is more user-friendly for interaction, whereas acquiring other guidance signals (e.g., masks, depth maps) is labor-intensive; users only need to draw a line (trajectory) during interaction. Secondly, our entity representation serves as an open-domain embedding capable of representing any object, enabling motion control for diverse entities, including the background. Lastly, our entity representation allows simultaneous and distinct motion control for multiple objects. Extensive experiments demonstrate that DragAnything achieves state-of-the-art performance in FVD, FID, and user studies, particularly for object motion control, where our method surpasses previous methods (e.g., DragNUWA) in human voting. The project website is at: DragAnything.