Ant Group has entered the trillion-parameter AI model arena with Ling-1T, a newly open-sourced language model that the Chinese fintech giant positions as a breakthrough in balancing computational efficiency with advanced reasoning capabilities.
The October 9 announcement marks a significant milestone for the Alipay operator, which has been rapidly building out its artificial intelligence infrastructure across multiple model architectures.
The trillion-parameter AI model demonstrates competitive performance on complex mathematical reasoning tasks, achieving 70.42% accuracy on the 2025 American Invitational Mathematics Examination (AIME) benchmark, a standard used to evaluate AI systems' problem-solving abilities.
According to Ant Group's technical specifications, Ling-1T maintains this performance level while consuming an average of over 4,000 output tokens per problem, placing it alongside what the company describes as "best-in-class AI models" in terms of result quality.
Dual-pronged approach to AI advancement
The trillion-parameter AI model release coincides with Ant Group's launch of dInfer, a specialised inference framework engineered for diffusion language models. This parallel release strategy reflects the company's bet on multiple technological approaches rather than a single architectural paradigm.
Diffusion language models represent a departure from the autoregressive systems that underpin widely used chatbots like ChatGPT. Unlike sequential text generation, diffusion models produce outputs in parallel, an approach already prevalent in image and video generation tools but less common in language processing.
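The control-flow difference can be sketched with a toy decoder. An autoregressive model makes one forward pass per token, each conditioned on the prefix; a diffusion-style model starts from a fully masked sequence and commits tokens at many positions per parallel refinement pass. The target sentence and the per-position confidence scores below are invented stand-ins for a real model's predictions, not how LLaDA-MoE or dInfer actually work.

```python
# Toy contrast: sequential (autoregressive) vs parallel (diffusion-style) decoding.
TARGET = ["the", "cat", "sat", "on", "mat"]   # stand-in for the model's output
CONF = [0.9, 0.4, 0.8, 0.3, 0.7]              # invented per-position confidences

def autoregressive_decode(length):
    seq = []
    for i in range(length):        # one model call per token: 5 calls for 5 tokens
        seq.append(TARGET[i])      # stand-in for "sample the next token"
    return seq

def diffusion_decode(length, steps=2):
    seq = ["<mask>"] * length      # start fully masked
    threshold = 0.6
    for _ in range(steps):
        # one parallel pass: commit every masked position whose
        # confidence clears the current threshold, all at once
        seq = [TARGET[i] if seq[i] == "<mask>" and CONF[i] >= threshold else seq[i]
               for i in range(length)]
        threshold -= 0.4           # accept lower-confidence tokens on later passes
    return seq

print(autoregressive_decode(5))   # 5 sequential steps
print(diffusion_decode(5))        # same output in 2 parallel passes
```

The throughput appeal is that each diffusion pass touches every position simultaneously, so the number of model invocations can be far smaller than the sequence length.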
Ant Group's performance metrics for dInfer suggest substantial efficiency gains. Testing on the company's LLaDA-MoE diffusion model yielded 1,011 tokens per second on the HumanEval coding benchmark, versus 91 tokens per second for Nvidia's Fast-dLLM framework and 294 for Alibaba's Qwen-2.5-3B model running on vLLM infrastructure.
"We believe that dInfer provides both a practical toolkit and a standardised platform to accelerate research and development in the rapidly growing field of dLLMs," researchers at Ant Group noted in accompanying technical documentation.
Ecosystem expansion beyond language models
The Ling-1T trillion-parameter AI model sits within a broader family of AI systems that Ant Group has assembled over recent months.

The company's portfolio now spans three primary series: the Ling non-thinking models for standard language tasks, Ring thinking models designed for complex reasoning (including the previously released Ring-1T-preview), and Ming multimodal models capable of processing images, text, audio, and video.
This diversified approach extends to an experimental model designated LLaDA-MoE, which employs a Mixture-of-Experts (MoE) architecture, a technique that routes each input to only the relevant expert sub-networks within a large model, theoretically improving efficiency.
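The routing idea behind MoE can be sketched in a few lines: a learned gate scores every expert for a given input, only the top-k experts actually run, and their outputs are mixed by normalised gate weights. The expert functions and scores below are purely illustrative stand-ins, not Ant Group's architecture.

```python
# Minimal sketch of Mixture-of-Experts (MoE) top-k routing.
# Only k of the n experts are evaluated per input, so most of the
# model's parameters stay inactive for any single token.

def top_k(scores, k):
    """Indices of the k highest-scoring experts."""
    return sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)[:k]

def moe_forward(x, experts, gate_scores, k=2):
    """Run only the k gate-selected experts and mix their outputs,
    weighted by their normalised gate scores."""
    chosen = top_k(gate_scores, k)
    total = sum(gate_scores[i] for i in chosen)
    return sum((gate_scores[i] / total) * experts[i](x) for i in chosen)

# Eight tiny "experts"; a real model would use large neural sub-networks
# and a learned router producing the gate scores.
experts = [lambda x, s=s: s * x for s in range(1, 9)]
gate_scores = [0.05, 0.1, 0.02, 0.4, 0.03, 0.3, 0.06, 0.04]

out = moe_forward(10.0, experts, gate_scores, k=2)
# Only experts 3 and 5 (scores 0.4 and 0.3) are evaluated for this input.
```

With k=2 of 8 experts active, roughly three quarters of the expert parameters are skipped per input, which is the efficiency argument for MoE at trillion-parameter scale.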
He Zhengyu, chief technology officer at Ant Group, articulated the company's positioning around these releases. "At Ant Group, we believe Artificial General Intelligence (AGI) should be a public good, a shared milestone for humanity's intelligent future," He stated, adding that the open-source releases of both the trillion-parameter AI model and Ring-1T-preview represent steps toward "open and collaborative advancement."
Competitive dynamics in a constrained environment
The timing and nature of Ant Group's releases illuminate strategic calculations within China's AI sector. With access to cutting-edge semiconductor technology limited by export restrictions, Chinese technology firms have increasingly emphasised algorithmic innovation and software optimisation as competitive differentiators.
ByteDance, parent company of TikTok, similarly introduced a diffusion language model called Seed Diffusion Preview in July, claiming five-fold speed improvements over comparable autoregressive architectures. These parallel efforts suggest industry-wide interest in alternative model paradigms that might offer efficiency advantages.
However, the practical adoption trajectory for diffusion language models remains uncertain. Autoregressive systems continue dominating commercial deployments due to proven performance in natural language understanding and generation, the core requirements for customer-facing applications.
Open-source strategy as market positioning
By making the trillion-parameter AI model publicly available alongside the dInfer framework, Ant Group is pursuing a collaborative development model that contrasts with the closed approaches of some competitors.
This strategy potentially accelerates innovation while positioning Antβs technologies as foundational infrastructure for the broader AI community.
The company is simultaneously developing AWorld, a framework intended to support continual learning in autonomous AI agents: systems designed to complete tasks independently on behalf of users.
Whether these combined efforts can establish Ant Group as a significant force in global AI development depends partly on real-world validation of the performance claims and partly on adoption rates among developers seeking alternatives to established platforms.
The trillion-parameter AI model's open-source nature may facilitate this validation process while building a community of users invested in the technology's success.
For now, the releases demonstrate that major Chinese technology firms view the current AI landscape as fluid enough to accommodate new entrants willing to innovate across multiple dimensions simultaneously.
See also: Ant Group uses domestic chips to train AI models and cut costs
