Hugging Face sees small AI models advancing robotics
Hugging Face, the AI startup known for open-source innovations, is developing smaller language models aimed at enhancing robotics and on-device AI.
Speaking at Web Summit in Lisbon, Thomas Wolf, Co-Founder and Chief Science Officer of Hugging Face, emphasised that smaller models are key for real-time applications.
"We want to deploy models in robots that are smarter, so we can start having robots that are not only on assembly lines, but also in the wild," Wolf said, noting the importance of low latency.
"You cannot wait two seconds so that your robots understand what's happening, and the only way we can do that is through a small language model," he added.
Wolf highlighted that these smaller models can handle many tasks previously thought to require larger models and can be deployed directly on devices like laptops and smartphones.
"If you think about this kind of game changer, you can have them running on your laptop," he said.
Earlier this year, Hugging Face introduced SmolLM, a small-scale language model that performs efficiently with fewer parameters.
Wolf explained that a one-billion parameter model could match the performance of larger, ten-billion parameter models from last year.
"You have a 10 times smaller model that can reach roughly similar performance," he pointed out.
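A rough back-of-envelope calculation illustrates why that tenfold reduction matters for on-device deployment. The sketch below assumes fp16 weights (2 bytes per parameter), which is a common but not universal choice; the figures cover raw weight storage only.

```python
# Hedged sketch: approximate weight-storage footprint at fp16.
# Assumption: 2 bytes per parameter; activations and caches add more in practice.

BYTES_PER_PARAM_FP16 = 2

def weights_gb(n_params: float) -> float:
    # Raw weight storage in gigabytes (decimal GB).
    return n_params * BYTES_PER_PARAM_FP16 / 1e9

small = weights_gb(1e9)    # a one-billion parameter model: 2.0 GB
large = weights_gb(10e9)   # a ten-billion parameter model: 20.0 GB

print(small, large)
```

At roughly 2 GB, the smaller model's weights fit comfortably in a laptop's or phone's memory, whereas the 20 GB larger model exceeds what most consumer devices can hold.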
According to Wolf, training these models on tailored datasets enhances their utility for specific tasks such as data processing and speech recognition.
Adaptations include embedding "very tiny, tiny neural nets" to further refine model specialisation, which Wolf likened to "putting a hat for a specific task."
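The article does not name the technique, but one plausible reading of these "tiny neural nets" is a low-rank adapter attached to a frozen base layer, in the style of LoRA. The NumPy sketch below is an illustrative assumption, not Hugging Face's stated method; all shapes and names are hypothetical.

```python
import numpy as np

# Hedged sketch: a LoRA-style low-rank adapter on a frozen weight matrix.
# This is one possible interpretation of "tiny neural nets" for specialisation.

rng = np.random.default_rng(0)
d_model, rank = 64, 4                     # rank << d_model keeps the adapter tiny

# Frozen base weight, standing in for a pretrained layer.
W = rng.standard_normal((d_model, d_model))

# Trainable adapter: two small matrices whose product perturbs W.
A = rng.standard_normal((rank, d_model)) * 0.01
B = np.zeros((d_model, rank))             # zero init: the adapter starts as a no-op

def forward(x):
    # Base path plus low-rank correction; only A and B would be trained.
    return W @ x + B @ (A @ x)

x = rng.standard_normal(d_model)

# Parameter count: the "hat" is a small fraction of the base layer it adapts.
base_params = d_model * d_model           # 4096
adapter_params = 2 * rank * d_model       # 512, i.e. 12.5% of the layer
print(adapter_params / base_params)
```

Because the base weights stay frozen, a different adapter can be swapped in per task — consistent with the "hat for a specific task" analogy — while the shared model remains small enough for on-device use.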