New algorithms open possibilities for training AI models on analog chips
Research
Peter Hess