AI21 Labs has released two new open AI models, Jamba 1.5 Mini and Jamba 1.5 Large, built on a hybrid SSM-Transformer architecture that combines the strengths of Transformers with Structured State Space Models (the Mamba architecture). The Jamba 1.5 models excel at handling long context windows, offering faster processing and lower resource usage, which makes them well suited to complex, data-heavy AI tasks. On several benchmarks they outperform comparable models such as Llama 3.1 and Mistral, giving developers powerful, efficient AI tools for a wide range of applications. #ai #opensource

AI News, AI Updates, AI Revolution, AI, AI21 Labs, Jamba 1.5, open-source AI, AI models, hybrid AI, SSM-Transformer, machine learning, natural language processing, long context windows, Transformers, deep learning, neural networks, Mamba architecture, AI benchmarks, language models, generative AI, cloud AI, NVIDIA NIM, AI for business, context-aware AI, quantization, ExpertsInt8, AI innovation, AI development, enterprise AI, AI research