I got early access to LTX Studio to make AI short films — here are the results
It can add images, video, sounds and voice
AI video platform LTX Studio is now open for users to get stuck in and make short films, storyboards and other generative productions all from a simple text prompt.
I was given early access to try LTX Studio out and in a few minutes was able to turn an idea into a full minute-long short, with narration, background music and sound effects.
While all you need to make it work is an idea, the platform has been built by its owner, Lightricks, to be customizable, even down to changing the camera angles of individual clips.
It is still in beta and there are a few bugs, with some videos flat-out refusing to generate. It also isn't as photorealistic as Sora, but OpenAI's video tool won't be available until later this year. Overall, being able to turn text into a full video production is an impressive achievement.
How does LTX Studio work?
When you open LTX Studio for the first time, the first thing you'll see is a simple text box asking you to type a film idea or a full synopsis of your desired creation. You can also see a list of your recent projects and suggestions for potential concepts.
In what feels a little like a Reddit writing prompt, I gave it the suggestion: "In a quiet suburban town, a young girl's vivid imagination brings her dolls to life, creating a magical world that only she can see" and was quickly taken to a character creation screen.
From here you can set the visual aesthetic, aspect ratio and inspiration, as well as the virtual casting for a selection of AI-generated characters. Or, if you don't like what LTX Studio came up with, you can have it rewritten.
There are dozens of AI models at play in the background to generate the script, add voice narration, background music, sound effects and the image and video elements.
You can accept it as is, customize any single element or add entirely new scenes to your production. It is presented like a real production, with multiple shots within a scene, multiple scenes across the project and consistent characters that you can tag into any shot.
Creating productions with LTX Studio
I had about a day to play with the LTX Studio platform before it was put on general release, but I'd had a tour of it from Lightricks CEO Zeev Farbman when it was first announced, so I had some inkling of what to expect. It performed better than I imagined a first version would.
While there are still some issues to work out, overall the process was impressive and it is a novel approach to making longer AI videos without losing control or consistency across characters and environments.
I made nearly a dozen videos of varying quality, from an anime about searching for ghosts in the woods to a sci-fi drama about a strange phenomenon stalking a space cruise ship, though for the latter I struggled to keep the cruise ship consistent.
My favorite production was called Trails West, about a throuple traveling to California during the gold rush in search of their fortune, only to find the true fortune was in themselves.
This required very little editing beyond a couple of tweaks to the images and adding some sound effects at various points to bring more life into the scenes.
One lesson I learned making it, which I applied in the next video about a sentient AI, was the need to give the story more breathing room. That meant adding entire scenes without voiceover to help the narrative flow better.
Final thoughts on LTX Studio
Overall this is an impressive platform that I'm sure will continue to improve as Lightricks upgrades the underlying models. There is also so much customization possible that, even as it is, with enough time and patience you could make something remarkable.
There are other AI video tools that create more realistic footage, speech tools with more natural-sounding voices, and lip sync is already available in both Pika Labs and Runway. But with each of those you still have to stitch together a series of short clips, and character consistency is poor.
LTX Studio needs lip sync and more granular control over the voiceover, some generation bugs need to be resolved, and it would be good to have a way to hold a single shot for longer; right now it feels jumpy and rushed.
But those are minor issues in the bigger picture, given the benefit of having a single place to make a video with multiple shots, audio content and consistent characters across each scene.