New model announcements, capabilities, and key improvements - tracked as they ship.
Total: 267
Open source: 133 (open-source models have publicly released weights: anyone can download and run them locally without an API key; often available on Hugging Face)
Proprietary: 134 (proprietary models are closed-source: the weights aren't public, and you access them only through the company's API or products)
Labs: 26 (distinct AI labs and companies whose model releases are tracked here)

Google · Apr 16, 2026 (the date the model was publicly announced or its weights became available to download) · Proprietary
Nano Banana 2
Key improvements (as highlighted in the model's official release notes or announcement):
- Uses personal context
- Integrates with Google Photos
- Creates personalized images reflecting the user's unique life

Alibaba · Apr 16, 2026 · Proprietary
Qwen3-Reranker-4B
4B params (parameters are the learnable numerical weights inside a neural network: more parameters mean more capacity to store knowledge and handle complex tasks, but also more compute and memory to run; 4B = 4 billion weights)
Key improvements:
- State-of-the-art performance across a wide range of downstream application evaluations
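The parameter counts attached to these entries translate roughly into memory requirements. As a back-of-envelope sketch (my own, not from the tracker): weight memory is simply parameter count times bytes per parameter, ignoring activations, optimizer state, and KV cache.

```python
def model_memory_gb(num_params: float, bytes_per_param: int = 2) -> float:
    """Estimate raw weight memory for a model, in GB.

    bytes_per_param: 2 for fp16/bf16, 4 for fp32, 1 for int8.
    Ignores activations, optimizer state, and KV cache.
    """
    return num_params * bytes_per_param / 1e9

# A 4B-parameter model in fp16 needs roughly 8 GB just for its weights.
print(model_memory_gb(4e9))    # → 8.0
# The 0.6B variant fits comfortably on consumer hardware.
print(model_memory_gb(0.6e9))  # → 1.2
```

This is why the same model family often ships in several sizes: the smaller variants trade capacity for the ability to run on modest GPUs.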

Alibaba · Apr 16, 2026 · Open source
Qwen3-Reranker-0.6B
0.6B params
Key improvements:
- Achieved state-of-the-art performance in text embedding and ranking tasks, with the 8B embedding model ranking No. 1 on the MTEB multilingual leaderboard
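To illustrate what embedding-and-ranking models of this kind do (a toy sketch with hand-made vectors, not the actual Qwen3 pipeline), candidate documents can be ranked by cosine similarity between a query embedding and each document embedding:

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def rank(query_vec, doc_vecs):
    """Return document indices sorted by similarity to the query, best first."""
    scores = [cosine(query_vec, d) for d in doc_vecs]
    return sorted(range(len(doc_vecs)), key=lambda i: scores[i], reverse=True)

# Toy embeddings: doc 1 points the same way as the query, doc 0 is orthogonal.
query = [1.0, 0.0, 1.0]
docs = [[0.0, 1.0, 0.0], [2.0, 0.0, 2.0], [1.0, 1.0, 0.0]]
print(rank(query, docs))  # → [1, 2, 0]
```

A real retrieval stack replaces the hand-made vectors with model embeddings, and a reranker then rescores the top candidates with a more expensive query-document comparison.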

Alibaba · Apr 16, 2026 · Open source
Qwen3-VL-Embedding-8B
8B params
Key improvements:
- Multimodal versatility
- Unified representation learning

Alibaba · Apr 16, 2026 · Open source
Qwen3-VL-Embedding-2B
2B params
Key improvements:
- Offers comprehensive flexibility with a full spectrum of sizes (0.6B to 8B), flexible vector definitions, and support for user-defined instructions
- Provides robust multilingual capabilities, supporting over 100 languages (including programming languages) for multilingual, cross-lingual, and code retrieval

OpenAI · Apr 16, 2026 · Proprietary
GPT-Rosalind

Google · Apr 15, 2026 · Proprietary
Gemini 3.1 Flash TTS

Google · Apr 14, 2026 · Open source
tipsv2-g14
1.1B params
Key improvements:
- Giant variant with 1.1B vision parameters and 389M text parameters

Google · Apr 14, 2026 · Open source
TIPSv2 — L/14
Key improvements:
- Produces spatially rich image features aligned with text embeddings

Microsoft · Apr 14, 2026 · Proprietary
Skala-1.0
Key improvements:
- Scalable neural network for the DFT exchange-correlation functional
- Learns non-local representations from inexpensive input features
- Achieves chemical accuracy for atomization energies
- Competitive accuracy with state-of-the-art functionals at lower cost