The generative AI world is obsessed with the wrong thing. We are drowning in academic benchmarks, chasing incremental gains on metrics that have almost no connection to the actual work of creating something someone wants to watch. It is a massive, industry-wide distraction from the real goal: making tools that help people create.
At 2nd Set AI, our customers are creative enterprises. They don't care about our scores on a leaderboard. They care if our platform can help them make a finished product that looks good, respects their IP, and gets done on time. So we decided to build our own benchmark, a test that mirrors the pressures our customers face every day. We gave ourselves a deadline.
The challenge: take a micro-anime from a completely blank page to a finished, three-minute short in just 48 hours with a team of two people. The result is "A Memory of Sun," a 2D animated video with two original characters across two settings. The process of making it taught us more than a hundred abstract tests ever could.
Why Vertical Video?
We chose to make our short in a 9:16 vertical format for a simple reason: that's where a huge, underserved audience is. Vertical drama isn't a gimmick; it's a new and distinct medium for storytelling that is fundamentally changing content consumption. The market is already worth billions and is projected to more than double in the coming years. This is a massive opportunity to build new studios and new franchises on natively vertical platforms like DramaBox and ReelShort.
Working in 9:16 is different. It's not cinema with the sides cropped. It's a format that demands compositional clarity and focus, and the storytelling has to be direct. We saw this as the perfect place to pressure-test our platform's ability to follow compositional intent, forcing us to get to the point, visually and narratively, in every single shot.
The 48-Hour Pipeline: From Script to Screen
Our workflow was a radically compressed version of a traditional animation pipeline. We had to perform all the same creative steps, just orders of magnitude faster.
It started with the script. We broke it down scene by scene into narrative beats, which we then translated into detailed prompts for our AI platform. From those initial text descriptions, we generated the first concepts for our two main characters and the settings they would inhabit. This was an iterative process. We prompted, reviewed, and refined until we had a consistent visual language that matched the story's tone.
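To make the breakdown concrete, here is a minimal sketch of the beat-to-prompt step, assuming a simple structure for narrative beats. The `Beat` class, `beat_to_prompt` function, and the example beat are illustrative, not our actual schema or a real API:

```python
from dataclasses import dataclass

@dataclass
class Beat:
    """One narrative beat from the scene-by-scene script breakdown."""
    scene: str   # which scene this beat belongs to
    action: str  # what happens in this beat
    tone: str    # the emotional register the visuals should match

def beat_to_prompt(beat: Beat, style: str) -> str:
    """Translate a narrative beat into a detailed generation prompt.

    `style` carries the consistent visual language converged on after
    rounds of prompting, reviewing, and refining.
    """
    return f"{style}. {beat.scene}. {beat.action}. Mood: {beat.tone}."

# A hypothetical beat, not taken from the actual script
beat = Beat(
    scene="Scene 1: a rooftop at golden hour",
    action="The protagonist watches the sun dip below the skyline",
    tone="wistful, quiet",
)
prompt = beat_to_prompt(beat, style="2D anime, soft warm palette, 9:16 vertical")
print(prompt)
```

The point of the structure is the iteration loop: when a concept misses the tone, you adjust one field and regenerate, rather than rewriting the whole prompt from scratch.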
With the core designs approved, we moved into shot production. For each shot in the script, we wrote a new, more specific prompt that used our character and setting designs as key references. This is a critical step. We weren't just asking for a "character in a forest"; we were composing specific shots, controlling the composition to ensure character consistency and narrative clarity. The platform then generated the keyframes and animated video sequences from those composed shots.
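A rough sketch of what composing a shot-level request can look like, assuming reference IDs for approved designs. The field names, reference IDs, and `compose` function here are hypothetical, included only to show why passing explicit design references matters for consistency:

```python
from dataclasses import dataclass, field

@dataclass
class ShotPrompt:
    """A shot-level prompt pinned to approved character and setting designs."""
    description: str                          # the composed shot, not just a subject
    character_refs: list = field(default_factory=list)  # approved character designs
    setting_ref: str = ""                     # approved setting design
    aspect_ratio: str = "9:16"                # native vertical framing

def compose(shot: ShotPrompt) -> dict:
    """Assemble a generation request from a composed shot.

    Explicit reference IDs keep characters consistent from shot to shot,
    instead of re-describing them in every prompt and hoping they match.
    """
    refs = shot.character_refs + ([shot.setting_ref] if shot.setting_ref else [])
    return {
        "prompt": shot.description,
        "references": refs,
        "aspect_ratio": shot.aspect_ratio,
    }

request = compose(ShotPrompt(
    description=("Medium close-up, character centered in the lower third, "
                 "looking up toward a shaft of sunlight through the canopy"),
    character_refs=["char_A_v3"],
    setting_ref="forest_v2",
))
print(request["references"])
```

The contrast with "character in a forest" is the whole step: the description specifies framing and focal point, and the references lock identity, so every generated keyframe starts from the same approved designs.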
The final stage was post-production. The video sequences were assembled in an editing timeline. Then we generated the full soundscape. This included the character voices, the specific sound effects, and the musical score. For this project, a final human edit was required to bring it all together. This experience was invaluable, as it showed us exactly where the friction remains in the creative process. We're already using what we learned to build out more integrated AI-driven editing tools, with the goal of making a seamless script-to-video workflow a reality for our users.
A Foundational Commitment to Original IP
As I've written before, we believe the common industry practice of training models on scraped, copyrighted data makes them fundamentally unsuitable for professional creative work. That's why for "A Memory of Sun," we built our own characters and our own world. There was no other option.
Generating Ghosts in the Machine
Anime NYC is just around the corner, and the entire world of manga and anime is about to descend on Manhattan. It's a celebration of original creation. It’s also the perfect time to talk about the ghosts in the machine.
Our commitment to original IP is more than just a philosophy. It is built into our platform. All of our prompts explicitly forbid any attempt to copy an existing IP or a specific artist's style. Our system enforces this at a technical level, running checks to ensure our users' work remains clean and preventing other companies' IP from accidentally appearing. You will find no stray Totoros wandering through the background of our anime. We see this as the bare minimum for responsible practice, and it is why we also run a free IP Risk Audit service for the community, and why we regularly publish our findings here and beyond.
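As a toy illustration of the simplest layer of such a check, here is a prompt-side blocklist filter. The term list and `check_prompt` function are hypothetical; a real enforcement layer would pair a curated list like this with model-side and output-side checks, not just string matching:

```python
import re

# Deliberately tiny, illustrative blocklist of protected names
PROTECTED_TERMS = ["totoro", "pikachu", "studio ghibli"]

def check_prompt(prompt: str) -> list:
    """Return any protected terms found in the prompt (case-insensitive,
    whole-word), so the request can be rejected before generation."""
    lowered = prompt.lower()
    return [
        term for term in PROTECTED_TERMS
        if re.search(rf"\b{re.escape(term)}\b", lowered)
    ]

violations = check_prompt("A Totoro-like creature in a sunlit forest")
print(violations)  # the request would be rejected, not generated
```

Prompt filtering alone is easy to evade with paraphrase, which is why it is only the first gate rather than the whole defense.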
What We Learned and What's Next
This project solidified our perspective. Real-world workflows are the only true test of a platform's utility. We learned more from two days of building "A Memory of Sun" than we could from months of chasing abstract metrics.
For decades, the enormous costs, technical hurdles, and slow timelines of animation have been brutal gatekeepers. They have limited the kinds of stories that get told and who can afford to tell them. Our goal is to tear those gates down. Generative AI allows a small, focused team of creatives to go from concept to a final, entertaining product in days, not years. It empowers them to focus on their unique strengths: storytelling and creative vision.
When you combine this radical efficiency with the explosive growth of new distribution channels like vertical drama, the path forward becomes clear. The opportunity to build the next generation of studios and franchises has never been more accessible. We are building the tools to make it happen. That's the only benchmark that matters.
If you are part of a creative team looking to use generative visual media within your work, we’d love to hear from you.