
NPCx has raised $3 million in funding to improve motion capture for non-player characters (NPCs) in games.

Kakao Investment led the round to strengthen NPCx’s position within the gaming and entertainment industry and to help it create more realistic character movements using AI-powered products.

St. Petersburg, Florida-based NPCx launched its flagship product, TrackerX, in March. The motion capture processing tool disrupts the conventional, labor-intensive process of tracking raw 3D point-cloud data, the company said. By integrating with any optical or sensor-based motion capture system, TrackerX simplifies the workflow by applying the captured data directly onto the TrackerX character skeleton.

Cameron Madani, CEO of NPCx, said in a statement, “TrackerX disrupts this costly manual process that’s been around for nearly thirty years by cleaning raw motion capture with AI and proprietary biomechanical models, saving companies thousands of labor hours and significant financial resources per project.”

The new round will fuel the development of NPCx’s pioneering product, BehaviorX. This technology aims to enhance gaming experiences by capturing and utilizing real-time data from players to create lifelike behavioral clones in NPCs. By analyzing player behavior and translating it into realistic NPC actions, BehaviorX promises to level up the immersion and engagement in gaming.

In addition to BehaviorX, NPCx plans to launch two other products: RetargetX and AIMX. Both harness neural networks — RetargetX to retarget motion capture animation across character skeletons, and AIMX to predict the next animation frame — resulting in smoother, more lifelike character movements.
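NPCx’s retargeting approach is neural and proprietary, but the classical baseline it improves on is easy to illustrate. The sketch below (illustrative only, not NPCx’s method; the function name and data are invented) shows the simplest non-neural form of retargeting: scaling each bone’s offset by the ratio of target to source bone length so a clip captured on one skeleton plays back with another skeleton’s proportions.

```python
# Illustrative sketch of classical (non-neural) motion retargeting.
# Each bone's offset vector is scaled by the ratio of the target
# skeleton's bone length to the source skeleton's bone length.

def retarget_offsets(offsets, source_lengths, target_lengths):
    """Scale per-bone offset vectors by target/source bone-length ratios."""
    out = []
    for (x, y, z), src, tgt in zip(offsets, source_lengths, target_lengths):
        scale = tgt / src
        out.append((x * scale, y * scale, z * scale))
    return out

# One frame of per-bone offsets for a two-bone chain (e.g. upper/lower arm),
# captured on a source skeleton with bone lengths 0.5 and 0.4 meters:
frame = [(0.0, 0.5, 0.0), (0.0, 0.4, 0.0)]
scaled = retarget_offsets(frame, [0.5, 0.4], [0.6, 0.5])
# Each offset now matches the target skeleton's longer limbs.
```

A neural retargeter like RetargetX presumably learns corrections beyond this uniform scaling (for example, preserving foot contacts and joint limits), which naive length scaling cannot do.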

Prior to securing the investment from Kakao Investment, NPCx raised over $540,000 through the crowdfunding platform Republic. The raise not only demonstrates NPCx’s potential but also highlights the public’s keen interest in the integration of AI technologies within the gaming industry.

“We are honored to have Kakao Investment lead our seed round,” Madani said. “As a prominent startup investor in Asia, securing Kakao Investment’s investment during these challenging times in the capital markets is a tremendous vote of confidence in our team, technology, and product roadmap. This partnership marks a significant milestone for NPCx and will greatly expedite the development of our AI-powered products that will revolutionize the industry.”


Cameron Madani is CEO of NPCx.

NPCx was founded in 2020 by Madani (CEO), Michael Puscar (CTO), and Alberto Menache (CPO). Before starting the firm, Madani founded a game development studio and later a motion capture and animation company that worked with major studios and publishers in the gaming, film, and XR industries.

Having recognized for years AI’s potential to revolutionize animation pipelines, Madani sought out AI specialists and crossed paths with Puscar in 2019. Drawing on Puscar’s entrepreneurial background and his successful applications of neural networks in other domains, the pair set out to use AI and machine learning to streamline animation processes by automating manual tasks.

Soon after, the cofounders brought in Alberto Menache, a well-known pioneer in animation and motion capture pipelines. Menache has authored a definitive book on motion capture and has been a leading figure in the field for nearly 30 years.

“What I particularly value about this founding team is our extensive industry experience and expertise,” Madani said in an email to GamesBeat. “This allows us to create numerous AI-driven products that seamlessly integrate into existing animation pipelines, resulting in immediate and substantial time and cost savings for the same customers we’ve been working with throughout our professional careers.”

In 2008, Madani began working as a business development director with a third-party developer and publisher involved in 16 title releases across Sony, Microsoft and Nintendo platforms. In 2010, he co-developed the top-selling game Torchlight for Microsoft/Runic Games (Xbox 360 and PC/Mac). In 2014, Madani co-founded Motion Burner, an award-winning motion capture and animation studio that has provided motion capture, rigging, modeling and animation services for 24 clients and 71 projects.

Puscar has been programming since the mid-1980s, when at 11 years old he found a Commodore 64 under the Christmas tree. His work as a teenager was noticed by the U.S. government, and he was recruited out of university to work for DARPA via Lockheed Martin with a top-secret security clearance.

Puscar’s expertise as a technologist is in the area of artificial intelligence and machine learning, including natural language processing, computer vision and the development of neural networks.

Menache is known as one of the fathers of motion capture. He has spent the last seven years solving “impossible” technical challenges for Lightstorm Entertainment and James Cameron on the Avatar films (2 through 5). His credits include top film franchises such as Superman, Spider-Man and Mission: Impossible. He is the author of two definitive books on motion capture and holds nine patents covering animation and motion capture innovations.

Revamping mocap

NPCx is bringing AI and machine learning to NPCs.

NPCx primarily focuses on two aspects of video gaming: character movement and character intelligence. For character movement, NPCx is creating a suite of products that vastly reduces the time and cost of creating and deploying motion capture and key-framed animations in video games, film, XR, and the metaverse. Its first product, TrackerX, launched in March 2023 and uses neural networks and biomechanical models to substantially reduce the processing time of motion capture data, a task that until now has been done manually.

For character intelligence, NPCx is modeling humans (real-world players) rather than creating god-like AIs or robotic LLM/GPT-driven procedural animation engines. The company aims to virtually “clone” players into games, XR, and the metaverse to such an extent that distinguishing between an NPC and a human player becomes nearly impossible.

Traditionally, motion capture performances relying on optical or sensor-based hardware systems require painstaking manual “cleaning” to prepare them for the final product. This process, done with a mouse and keyboard, corrects issues such as feet going through the floor or limbs penetrating other characters and objects.

TrackerX transforms this by combining biomechanical modeling and neural networks to automate this cleaning process. Currently, TrackerX speeds up manual cleaning by nearly 50 times, with ongoing neural network training for sustained improvements. This efficiency not only saves time and money for studios but also enables more extensive motion capture content creation within the same budget.
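To make the “cleaning” step concrete, here is a deliberately simplified sketch of the kind of correction such tools automate — clamping a foot joint that dips through the floor, then smoothing the fixed channel so the correction doesn’t pop. The function name and data are illustrative assumptions, not NPCx’s proprietary pipeline, which combines neural networks with biomechanical models rather than hand-coded rules like these.

```python
# Illustrative sketch (NOT NPCx's actual algorithm): fix a classic mocap
# artifact, a foot joint penetrating the floor, by clamping the height
# channel and then applying a small moving average to hide the snap.

def clean_foot_heights(heights, floor=0.0, window=3):
    """Clamp below-floor samples, then smooth with a moving average."""
    clamped = [max(h, floor) for h in heights]
    smoothed = []
    for i in range(len(clamped)):
        lo = max(0, i - window // 2)
        hi = min(len(clamped), i + window // 2 + 1)
        segment = clamped[lo:hi]
        smoothed.append(sum(segment) / len(segment))
    return smoothed

# A captured foot-height track (meters) with a tracking glitch at frame 2:
track = [0.10, 0.04, -0.03, 0.02, 0.08]
cleaned = clean_foot_heights(track)
# After cleaning, no frame leaves the foot below the floor plane.
```

Real cleanup handles far more than floor contact (limb interpenetration, marker swaps, jitter), which is why a learned model that generalizes across rigs and capture setups is valuable.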

One big rival is Inworld AI, which recently raised $50 million at a $500 million valuation. Madani said, “Since we are developing lifelike NPCs for video games, XR, and the metaverse, Inworld AI would be a close competitor of ours. However, like many other similar competitors, they primarily use advanced Large Language Models (LLMs) and Generative Pre-Trained Transformers (GPTs) as their engine to bring characters to life, along with a procedural animation shell. We believe using LLMs and GPTs with generative animation systems is a ‘red ocean’ strategy, meaning anybody can deploy a GPT engine at fairly low cost and train it within a generic animation wrapper. We believe this approach will create a very crowded field in a short amount of time, basically making it a commodity.”

Other close competitors are attempting to create super NPCs, utilizing Generative Adversarial Networks (GANs) and Generative Adversarial Imitation Learning (GAIL), similar to OpenAI’s approach with Dota 2 in 2019 — although it’s worth noting that OpenAI’s methods at the time could technically be considered cheating, according to an article by Vice.

“What distinguishes us from the rest of the pack, and where we hold a competitive advantage, is that while they either create a GPT system with an animation shell, or aim to generate super NPCs that are excessively lethal, our technology fine-tunes the NPCs to achieve a highly lifelike quality,” Madani said. “In fact, we can replicate various character play styles. Our secret lies in how we replicate these characters. We believe our methodology will result in more lifelike characters that exhibit human-like behaviors, avoiding the extremes of behaving solely like an animated GPT or, on the other end of the spectrum, being godlike.”

The company has 22 employees and plans to hire an additional five by the end of 2023.
