Wan2.2 claims to be the industry’s ‘first open-source large video generation models incorporating the Mixture-of-Experts architecture’
Alibaba Group Holding and Zhipu AI have launched new open-source models as China’s rivalry with the US in artificial intelligence heats up.
On Tuesday, Alibaba released Wan2.2, which it claimed was the industry’s “first open-source large video generation models incorporating the Mixture-of-Experts (MoE) architecture”. Alibaba owns the South China Morning Post.
MoE is a machine-learning approach that divides an AI model into separate sub-networks, or experts – each specialising in a subset of the input data – which jointly perform a task. Because only a few experts are activated for any given input, the approach lets very large models be pre-trained and run with far less computing power than a comparably sized dense model.
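The routing idea behind MoE can be sketched in a few lines of Python. This is an illustrative toy, not Wan2.2's actual implementation: a gating network scores every expert for a given input, and only the top-scoring experts do any work.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy dimensions, chosen only for illustration.
n_experts, d_in, d_out, top_k = 4, 8, 8, 2

# Each "expert" here is just a small linear layer.
experts = [rng.standard_normal((d_in, d_out)) * 0.1 for _ in range(n_experts)]
gate_w = rng.standard_normal((d_in, n_experts)) * 0.1  # gating network weights

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def moe_forward(x):
    """Route input x to the top-k experts and mix their outputs."""
    scores = softmax(x @ gate_w)             # gating probability per expert
    chosen = np.argsort(scores)[-top_k:]     # indices of the top-k experts
    weights = scores[chosen] / scores[chosen].sum()  # renormalise over top-k
    # Only the chosen experts compute anything; the rest stay idle,
    # which is why MoE models need less compute per input.
    return sum(w * (x @ experts[i]) for w, i in zip(weights, chosen))

x = rng.standard_normal(d_in)
y = moe_forward(x)
print(y.shape)
```

The key property is that compute per input scales with `top_k`, not with `n_experts`, so total parameter count can grow without a matching growth in per-token cost.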