Motion Control transfers body movements from a reference video onto a character image. You supply a portrait or full-body image of a character and a short video clip containing the movements you want to replicate. The AI reads the body motion from the reference clip and animates your character to perform those same movements. This is distinct from other video modes: you are not generating a scene from a prompt, and you are not animating an image with added effects. Motion Control is specifically about motion transfer — taking human movement data from one source and applying it to a different subject.

How to use Motion Control

1. Open Motion Control

Navigate to Video in the left sidebar. Scroll through the available modes and select Motion Control. The input panel opens, accepting a character image and a reference video.
2. Upload your character image

Upload the image of the character you want to animate. The quality of motion transfer depends significantly on the source image. For best results:
  • Show the full body or at least the upper body clearly in the frame
  • Use a front-facing or near-front-facing pose (slight angles work; extreme side profiles reduce accuracy)
  • Avoid heavily cropped images where limbs are cut off
  • Use a clean, simple background — complex backgrounds can produce visual noise around the character boundary
  • Ensure the image is well-lit and in focus
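The checklist above can be framed as a simple pre-flight check before you upload. The function below is purely an illustrative sketch: the thresholds (512 px minimum, 45° pose angle) are assumptions of mine, not values published by ImagineArt.

```python
def check_character_image(width_px, height_px, pose_angle_deg, limbs_cropped):
    """Return a list of warnings for a character image, following the
    guidelines above. Thresholds are illustrative assumptions only."""
    warnings = []
    if min(width_px, height_px) < 512:   # assumed minimum for a sharp result
        warnings.append("resolution may be too low for clean motion transfer")
    if abs(pose_angle_deg) > 45:         # extreme side profiles reduce accuracy
        warnings.append("pose is far from front-facing; expect reduced accuracy")
    if limbs_cropped:                    # cut-off limbs give the model less to map
        warnings.append("cropped limbs reduce motion-mapping quality")
    return warnings
```

A front-facing, well-framed 1024 px image passes with no warnings; a low-resolution side profile with cropped limbs trips all three checks.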
3. Upload a motion reference video

Upload the video containing the movements you want to transfer to your character. This is the driving video — the AI reads body position, joint angles, and motion timing from this clip and maps them to your character. For best results:
  • Use a video with a single person as the primary subject
  • Ensure the subject in the reference video is clearly visible and well-lit
  • Minimal background clutter improves motion tracking accuracy
  • The subject should be performing the movement you want to transfer — avoid clips where the subject is partially obscured or moving out of frame
You can also choose from the built-in motion reference presets provided in the interface. These are curated clips covering a range of movements including walking, dancing, waving, and more — a useful starting point if you don’t have a specific reference video.
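The key idea in the step above — the system takes joint angles and timing from the reference clip but keeps your character's own proportions — can be sketched as toy forward kinematics. This is a conceptual illustration of motion retargeting in general, not ImagineArt's actual algorithm.

```python
import math

def apply_pose(bone_lengths, joint_angles):
    """Toy 2D forward kinematics: the joint angles come from a frame of
    the reference video, while the bone lengths belong to the character
    being animated. Conceptual sketch only, not ImagineArt's pipeline."""
    x, y, angle = 0.0, 0.0, 0.0
    points = [(x, y)]                      # root joint at the origin
    for length, theta in zip(bone_lengths, joint_angles):
        angle += theta                     # pose read from the reference clip
        x += length * math.cos(angle)      # proportions from the character
        y += length * math.sin(angle)
        points.append((x, y))
    return points
```

Running this per frame over the whole clip is, in spirit, how the reference motion drives a differently proportioned character: the same bend at each joint, applied to different limb lengths.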
4. Generate the animation

Once both inputs are in place, click Generate. ImagineArt processes the motion transfer and produces an animated video of your character performing the movements from the reference clip. Generation time varies based on the length of the reference video and the current processing queue.
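If you script generations rather than using the web interface, the two inputs map naturally onto a single request body. The field names below ("mode", "character_image", "reference_video") are hypothetical placeholders to illustrate the input contract; consult ImagineArt's own API documentation for the real endpoint and schema.

```python
import json

def build_motion_control_request(character_image_url, reference_video_url):
    """Assemble a Motion Control request body. All field names here are
    hypothetical illustrations, not ImagineArt's documented schema."""
    payload = {
        "mode": "motion_control",
        "character_image": character_image_url,   # portrait or full-body image
        "reference_video": reference_video_url,   # driving clip with the motion
    }
    return json.dumps(payload)
```

Whatever the real schema looks like, the shape is the same: one character image, one driving video, one mode selector.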

What affects output quality

Factor | Impact
Character image pose | Front-facing images produce more accurate motion mapping than profiles
Character image clarity | Sharp, well-lit images with visible limbs give the model more to work with
Reference video quality | Clear, well-lit subjects with minimal occlusion improve motion tracking
Reference video complexity | Single-person clips with simple backgrounds outperform crowded scenes
Reference video length | Longer reference clips generate longer animations; processing time scales accordingly

Common use cases

Transfer natural human movement onto an illustrated character, game asset, or AI-generated portrait. This is useful for previewing how a character design moves before investing in full animation production.
Use a dance performance video as the reference to make a character image perform the same choreography. Works well for music content, social media videos, and promotional material.
Apply a presenter’s gestures and body language from a reference video to an avatar or illustrated character, creating an animated spokesperson without additional filming.
Only use reference videos of people whose motion you have the right to use. Do not upload recordings of individuals without their consent for commercial or public-facing projects.

What to do next

Lipsync

Add speech and lip animations to a character after applying motion.

Edit Video

Modify the background or environment of your animated output.

Extend Video

Add 5 more seconds of content to the generated animation.

Video Credits

Understand credit costs for Motion Control generations.