How Sony Is Using AI Behind the Scenes of Spider-Man 2 and Horizon’s Aloy
Inside Sony’s 2025 AI Strategy: Ethics, Upscaling, and Enterprise Language Models
Sony has quietly sharpened its AI ambitions. In its 2025 Corporate Report, the company lays out how it is deploying an internal tool, the Enterprise Large Language Model (Enterprise LLM), across game development and entertainment, and just how far those experiments have already gone.
What Sony is doing
Since rolling out Enterprise LLM in 2023, Sony reports that over 50,000 employees, spanning 210 teams, are now using the system in their daily workflows.
This isn’t just chat assistance or auto-written memos. More than 300 AI-powered internal projects have been trialed, and over 50 of them have made it into regular use.
Standout uses include Marvel’s Spider-Man 2, where voice recognition is helping automate subtitles in multiple languages. And Aloy (from the Horizon franchise) has featured in a demonstration of what AI can do: voice, facial animation, even conversational responses.
Other work includes upscaling visual fidelity, especially on the PlayStation 5 and its Pro variant, using PSSR (PlayStation Spectral Super Resolution), and improving the sound and image quality of older films.
Balancing creativity, ethics, and human cost
Importantly, Sony doesn’t present this as a power grab or a replacement scheme for developers. The report emphasizes that AI should assist, not substitute, human creativity.
To that end, it’s also building in guardrails: legal, privacy, and ethical teams are involved in shaping how the tool is used. One concern is avoiding copyright or IP infringements—especially when music, art, or voice might be involved. Sony is also developing systems to detect unauthorized reuse of creative content.
Still, not everyone is totally relaxed. Some developers and actors remain worried that these tools could erode roles or, over time, reduce the need for human-led asset creation. An AI demo of Aloy (using synthesized voice and motion) sparked controversy in part because it diverged from the character’s original voice actress and showed how uncanny these systems can feel.
Game development has been growing more complex and expensive. AAA titles demand massive teams, long timelines, high fidelity, and broad localization. Sony is clearly hoping that AI tools like its Enterprise LLM can help shave off inefficiency, especially in repetitive or technical work, so creative teams can focus on narrative, design, and polish.
From consumers’ perspective, this could mean smoother localization, better visuals on new and older titles, maybe faster updates, and fewer “rough edges.” But it also raises questions: how much human touch will remain? What happens to voice actors or artists whose roles overlap with what AI can mimic? How transparent will Sony and others be about which assets are human-made vs AI-aided?
Sony is not alone in saying “AI’s here to help, not replace.” But unlike many vague promises, the 2025 report gives concrete details: how many people are using the Enterprise LLM, which tasks it handles, and which protections are in place. The key going forward will be how well Sony (and the industry broadly) maintains that balance: pushing creative boundaries while respecting the people who make stories, art, and games possible.
If it gets this right, AI could be an amplifier; if not, it risks becoming a crutch or, worse, quietly edging out human artisans. For now, Sony seems keenly aware of both paths.