## Documentation Index

Fetch the complete documentation index at https://docs.varg.ai/llms.txt and use it to discover all available pages before exploring further.
The varg SDK (vargai) lets you create AI videos programmatically using JSX syntax. One API key, one gateway, all providers.
## Installation

### Setup
Only one key is needed:

```sh
# .env
VARG_API_KEY=varg_xxx
```

Get your key at app.varg.ai or run `bunx vargai login`.
## Two Ways to Use varg

### 1. Gateway Mode (Recommended)

Use `createVarg()` for one-key access to all providers:
```tsx
/** @jsxImportSource vargai */
import { Render, Clip, Image, Video, Music } from "vargai/react"
import { createVarg } from "vargai/ai"

const varg = createVarg({ apiKey: process.env.VARG_API_KEY! })

const character = Image({
  model: varg.imageModel("nano-banana-pro"),
  prompt: "cute robot character",
  aspectRatio: "9:16",
})

export default (
  <Render width={1080} height={1920}>
    <Music prompt="upbeat electronic" model={varg.musicModel()} volume={0.3} duration={5} />
    <Clip duration={5}>
      <Video
        prompt={{ text: "robot waves hello", images: [character] }}
        model={varg.videoModel("kling-v3")}
      />
    </Clip>
  </Render>
)
```
### 2. Direct Provider Mode (BYOK)

Use individual provider keys directly. See BYOK for details.
```tsx
import { render, Render, Clip, Image, Video } from "vargai/react"
import { fal, elevenlabs } from "vargai/ai"

const character = Image({ prompt: "cute robot character" })

await render(
  <Render width={1080} height={1920}>
    <Clip duration={5}>
      <Video
        prompt={{ text: "robot waves hello", images: [character] }}
        model={fal.videoModel("kling-v3")}
      />
    </Clip>
  </Render>,
  { output: "robot.mp4" }
)
```
Requires individual provider keys in `.env`:

```sh
FAL_KEY=fal_xxx
ELEVENLABS_API_KEY=xxx
```
## Core Concepts

### Elements vs Components
In varg, media generation happens via element functions that return references:
```tsx
// Element function: generates AI content, returns a reference
const image = Image({ prompt: "sunset" }) // not rendered yet

// Component: uses the reference in a composition
<Clip duration={3}>
  <Video prompt={{ images: [image] }} /> {/* reference used here */}
</Clip>
```
**Critical**: `Image()`, `Video()`, `Speech()` are function calls. `<Music>`, `<Captions>`, `<Title>` are JSX components. Never write `<Image prompt="..." />`.
### Automatic Caching

Same props = instant cache hit at $0:

```tsx
// First run: ~60 seconds (AI generation)
const cat = Image({ prompt: "orange cat" })

// Second run: instant (cached)
const cat = Image({ prompt: "orange cat" })

// Different prompt: regenerates
const cat = Image({ prompt: "orange cat sitting" })
```
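Conceptually, this caching is memoization keyed by the serialized props. A minimal TypeScript sketch of that idea (illustrative only, not the vargai internals, which also persist the cache across runs):

```typescript
// Memoization-by-props sketch: identical props produce the same cache key,
// so repeated calls never trigger a second "generation".
type Props = Record<string, unknown>;

function makeCachedGenerator(generate: (props: Props) => string) {
  const cache = new Map<string, string>();
  let misses = 0;
  return {
    run(props: Props): string {
      const key = JSON.stringify(props); // same props => same key
      let result = cache.get(key);
      if (result === undefined) {
        misses += 1;                     // cache miss: generate
        result = generate(props);
        cache.set(key, result);
      }
      return result;                     // cache hit: instant, $0
    },
    get misses() {
      return misses;
    },
  };
}

const gen = makeCachedGenerator((p) => `asset:${JSON.stringify(p)}`);
gen.run({ prompt: "orange cat" });         // generates
gen.run({ prompt: "orange cat" });         // cached
gen.run({ prompt: "orange cat sitting" }); // different props: regenerates
console.log(gen.misses); // 2
```

One caveat of this naive sketch: `JSON.stringify` makes property order significant, so a robust cache needs a canonical serialization of the props.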
### Parallel Generation

Independent elements generate simultaneously:

```tsx
// These generate in parallel, not sequentially
const scene1 = Image({ prompt: "forest" })
const scene2 = Image({ prompt: "ocean" })
const scene3 = Image({ prompt: "mountain" })
// Total time = longest single generation, not the sum of all
```
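The timing model is the same as awaiting independent promises together: everything starts immediately and you wait only for the slowest one. A small runnable sketch in plain TypeScript (the delays stand in for generation time; this is not the SDK):

```typescript
// Simulate three independent "generations" with different durations.
const generate = (ms: number, name: string) =>
  new Promise<string>((resolve) => setTimeout(() => resolve(name), ms));

async function main() {
  const start = Date.now();
  // All three start at once; Promise.all resolves when the slowest finishes.
  const scenes = await Promise.all([
    generate(100, "forest"),
    generate(150, "ocean"),
    generate(120, "mountain"),
  ]);
  const elapsed = Date.now() - start;
  console.log(scenes.join(", "));         // forest, ocean, mountain
  console.log(elapsed < 100 + 150 + 120); // true: ~150 ms, not 370 ms
}

main();
```

Note that `Promise.all` preserves input order in its result regardless of which promise settles first.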
## Package Exports

```tsx
// React/JSX components
import { Render, Clip, Image, Video, Speech, Music, Captions, ... } from "vargai/react"

// Gateway client (recommended — routes through api.varg.ai)
import { createVarg } from "vargai/ai"

// Direct providers (BYOK)
import { fal, elevenlabs, higgsfield, openai, replicate } from "vargai/ai"
```
## Environment Variables

```sh
# Required
VARG_API_KEY=varg_xxx        # Gateway key (one key for everything)

# Optional (BYOK — see /byok for details)
FAL_KEY=fal_xxx              # Direct Fal access
ELEVENLABS_API_KEY=xxx       # Direct ElevenLabs access
HIGGSFIELD_API_KEY=hf_xxx    # Direct Higgsfield access
REPLICATE_API_TOKEN=r8_xxx   # Direct Replicate access
```
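To make the gateway/BYOK split concrete, here is a hypothetical helper (not part of the SDK) that maps these variables to a mode; the precedence shown, gateway key first, is purely illustrative:

```typescript
// Hypothetical helper, not part of vargai: inspect an environment object
// and report which mode its keys support.
type Env = Record<string, string | undefined>;

const BYOK_KEYS = [
  "FAL_KEY",
  "ELEVENLABS_API_KEY",
  "HIGGSFIELD_API_KEY",
  "REPLICATE_API_TOKEN",
];

function resolveMode(env: Env): "gateway" | "byok" | "none" {
  if (env.VARG_API_KEY) return "gateway";           // one key, all providers
  if (BYOK_KEYS.some((k) => env[k])) return "byok"; // direct provider keys
  return "none";                                    // no credentials found
}

console.log(resolveMode({ VARG_API_KEY: "varg_xxx" })); // gateway
console.log(resolveMode({ FAL_KEY: "fal_xxx" }));       // byok
console.log(resolveMode({}));                           // none
```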
## Next Steps

- **Components**: all JSX components and their props
- **AI Models**: all supported models with pricing
- **CLI Reference**: command-line interface
- **Templates**: copy-paste examples