
Kling 2.6 Motion Control API


Transfer motion from a reference video onto a character image. Perfect for dance and movement replication.

Motion reference video upload: MP4, WEBM, MOV · Max 100MB
Character image upload: PNG, JPG, WEBP, GIF · Max 100MB
Keep Audio option: preserve audio from the motion reference video

Features

What the Kling Motion Control API offers

Motion transfer from a reference video onto a character image
Reference motion clips from 3 to 30 seconds
Optional prompt text for scene wording
Character orientation: Video (up to 30 seconds of motion) or Image (up to 10 seconds of motion)
Keep audio from the motion clip when you want the soundtrack in the result
Standard (720p) or Pro (1080p) output
REST API with JSON request and response bodies
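
A request body covering the options above might look like the sketch below. The field names (`orientation`, `keep_audio`, `quality`, and so on) and URLs are illustrative assumptions, not the documented schema; check the actual API reference for the real field names.

```python
import json

# Hypothetical request body mirroring the playground options above.
# All field names and values here are assumptions for illustration.
payload = {
    "model": "kling-2.6-motion-control",
    "orientation": "video",        # "video" (<= 30 s motion) or "image" (<= 10 s motion)
    "motion_reference_url": "https://example.com/reference.mp4",
    "character_image_url": "https://example.com/character.png",
    "prompt": "studio lighting, dance stage",  # optional scene wording
    "keep_audio": True,            # preserve the reference clip's soundtrack
    "quality": "pro",              # "standard" (720p) or "pro" (1080p)
}

body = json.dumps(payload)
print(body)
```

The JSON string in `body` is what a client would send as the request body; the response is likewise a JSON document per the feature list above.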

Use cases

Built for

1. Trend choreography - Rerecord a reference take onto a new character still
2. Animated mascots - Drive a brand still through human reference motion
3. Short performance clips - Social posts that need timing borrowed from a study clip
4. Indie games and teasers - Rapid character motion without a full rig
5. Fitness and coaching - Show form from a template clip applied to another presenter image

FAQ

About Kling Motion Control API

Kling Motion Control generates a video in which your character image follows the movement from a supplied reference clip. Orientation and clip-length limits depend on the mode you pick.

Reference uploads can be up to 30 seconds. Video orientation can use that full range for rich body motion; Image orientation targets camera-style moves and caps usable motion length at 10 seconds.
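
Those limits can be expressed as a small pre-flight check before uploading. The 3-second minimum comes from the feature list above and the per-orientation caps from this answer; the helper itself is purely illustrative and assumes the minimum applies to both orientations.

```python
# Illustrative pre-flight check on reference clip length, based on the
# limits stated above: 3 s minimum, 30 s max for Video orientation,
# 10 s max for Image orientation. Assumes the 3 s floor applies to both.
LIMITS = {"video": (3.0, 30.0), "image": (3.0, 10.0)}

def clip_length_ok(orientation: str, seconds: float) -> bool:
    lo, hi = LIMITS[orientation]
    return lo <= seconds <= hi

print(clip_length_ok("video", 25.0))  # True: within the 3-30 s Video range
print(clip_length_ok("image", 25.0))  # False: Image orientation caps at 10 s
```

Running this check client-side avoids a round trip for clips the service would reject anyway.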

Video orientation fits dance and full-body motion. Image orientation suits shots where framing and camera motion matter more than articulated movement.

The 3.0 endpoint emphasizes stronger facial consistency. The 2.6 flow matches older projects that already use these timings and assets.

Provide the motion reference as a video and the subject as an image through the API fields that your integration maps from the playground.
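
Wiring that into an HTTP call might look like the following sketch using Python's standard library. The endpoint URL, header values, and JSON field names are placeholders, not the service's real schema; substitute the values from your actual integration.

```python
import json
import urllib.request

# Placeholder endpoint and credentials; substitute your real values.
API_URL = "https://api.example.com/v1/kling/motion-control"

payload = {
    "motion_reference_url": "https://example.com/reference.mp4",  # motion source (video)
    "character_image_url": "https://example.com/character.png",   # subject (image)
    "orientation": "video",
}

# Build a POST request with a JSON body; urlopen(req) would send it
# (omitted here, since the endpoint above is a placeholder).
req = urllib.request.Request(
    API_URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        "Authorization": "Bearer YOUR_API_KEY",
    },
    method="POST",
)

print(req.get_method(), req.get_header("Content-type"))
```

The same shape applies whatever HTTP client you use: motion reference and character image go in as fields of a JSON body on a POST request.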