How to use Motion Control
Open Motion Control
Navigate to Video in the left sidebar. Scroll through the available modes and select Motion Control. The input panel opens, accepting a character image and a reference video.
Upload your character image
Upload the image of the character you want to animate. The quality of the motion transfer depends significantly on the source image. For best results:
- Show the full body or at least the upper body clearly in the frame
- Use a front-facing or near-front-facing pose (slight angles work; extreme side profiles reduce accuracy)
- Avoid heavily cropped images where limbs are cut off
- Use a clean, simple background — complex backgrounds can produce visual noise around the character boundary
- Ensure the image is well-lit and in focus
Upload a motion reference video
Upload the video containing the movements you want to transfer to your character. This is the driving video — the AI reads body position, joint angles, and motion timing from this clip and maps them to your character. For best results:
- Use a video with a single person as the primary subject
- Ensure the subject in the reference video is clearly visible and well-lit
- Minimal background clutter improves motion tracking accuracy
- The subject should be performing the movement you want to transfer — avoid clips where the subject is partially obscured or moving out of frame
Generate the animation
Once both inputs are in place, click Generate. ImagineArt processes the motion transfer and produces an animated video of your character performing the movements from the reference clip. Generation time varies based on the length of the reference video and the current processing queue.
What affects output quality
| Factor | Impact |
|---|---|
| Character image pose | Front-facing images produce more accurate motion mapping than profiles |
| Character image clarity | Sharp, well-lit images with visible limbs give the model more to work with |
| Reference video quality | Clear, well-lit subjects with minimal occlusion improve motion tracking |
| Reference video complexity | Single-person clips with simple backgrounds outperform crowded scenes |
| Reference video length | Longer reference clips generate longer animations; processing time scales accordingly |
Common use cases
Animating illustrated characters
Transfer natural human movement onto an illustrated character, game asset, or AI-generated portrait. This is useful for previewing how a character design moves before investing in full animation production.
Dance and choreography
Use a dance performance video as the reference to make a character image perform the same choreography. Works well for music content, social media videos, and promotional material.
Presentation and speech animation
Apply a presenter’s gestures and body language from a reference video to an avatar or illustrated character, creating an animated spokesperson without additional filming.
What to do next
Lipsync
Add speech and lip animations to a character after applying motion.
Edit Video
Modify the background or environment of your animated output.
Extend Video
Add 5 more seconds of content to the generated animation.
Video Credits
Understand credit costs for Motion Control generations.