LongCat-Flash-Chat performs on a par with models from DeepSeek, Alibaba and Moonshot AI, according to its technical report
According to the model’s technical report, LongCat-Flash-Chat features 560 billion total parameters – the internal values a model learns during training to map inputs to outputs – and the so-called Mixture-of-Experts (MoE) architecture. MoE architecture divides the model into separate sub-networks, or “experts”, each specialising in a subset of the input data; only a few experts are activated for any given input, so the experts jointly perform a task while most parameters sit idle, cutting computing costs.
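The routing idea behind MoE can be illustrated in a few lines of code. The sketch below is a toy illustration of the general technique, not Meituan's implementation: the expert and gate sizes, the top-k value and the linear "experts" are all simplifying assumptions made for clarity.

```python
# Toy Mixture-of-Experts routing sketch (NOT LongCat-Flash's actual code).
# A gating function scores each "expert" sub-network for a given input and
# activates only the top-scoring ones, so most parameters stay idle per input.
import math
import random

random.seed(0)

NUM_EXPERTS = 4   # assumption: real MoE models use far more experts
TOP_K = 2         # assumption: number of experts activated per input
DIM = 3           # assumption: tiny feature dimension for illustration

# Each "expert" here is a tiny linear map, standing in for a full sub-network.
experts = [[[random.uniform(-1, 1) for _ in range(DIM)] for _ in range(DIM)]
           for _ in range(NUM_EXPERTS)]
# Gating weights score how relevant each expert is to a given input.
gate = [[random.uniform(-1, 1) for _ in range(DIM)] for _ in range(NUM_EXPERTS)]

def matvec(matrix, vec):
    return [sum(w * x for w, x in zip(row, vec)) for row in matrix]

def softmax(xs):
    mx = max(xs)
    exps = [math.exp(x - mx) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def moe_forward(x):
    # 1. Score every expert, 2. keep the top-k, 3. mix their outputs
    #    weighted by the (re-normalised) gate scores.
    scores = [sum(g * xi for g, xi in zip(row, x)) for row in gate]
    top = sorted(range(NUM_EXPERTS), key=lambda i: scores[i], reverse=True)[:TOP_K]
    weights = softmax([scores[i] for i in top])
    out = [0.0] * DIM
    for w, i in zip(weights, top):
        y = matvec(experts[i], x)
        out = [o + w * yi for o, yi in zip(out, y)]
    return out, top

output, active = moe_forward([1.0, 0.5, -0.2])
print(f"active experts: {active}")  # only TOP_K of NUM_EXPERTS ran
```

Because only `TOP_K` of the experts run for each input, the cost of a forward pass scales with the activated parameters rather than the full parameter count, which is why a 560-billion-parameter MoE model can be far cheaper to run than a dense model of the same size.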
Meituan’s foray into open-source models reflects its efforts to build a viable AI business since acquiring Light Year, an AI start-up established by a co-founder of the food delivery services giant.
It also shows how Chinese AI companies continue to narrow the gap with their US peers through the open-source approach, which makes the source code of AI models available for third-party developers to use, modify and distribute.