H2O AI launches Danube, a tiny LLM for mobile applications

Today, H2O AI, the company working to democratize AI with a range of open source and proprietary tools, announced the release of Danube, a new super-small large language model (LLM) for mobile devices.

Named after the second-largest river in Europe, the open source model comes with 1.8 billion parameters and is said to match or surpass similarly sized models across a range of natural language tasks. This puts it in the same category as strong offerings from Microsoft, Stability AI and EleutherAI.

The timing of the announcement makes perfect sense. Companies building consumer devices are racing to explore the potential of offline generative AI, where models run locally on the product, giving users fast assistance across functions and eliminating the need to send information out to the cloud.

"We are excited to release H2O-Danube-1.8B as a portable LLM on small devices like your smartphone… The proliferation of smaller, lower-cost hardware and more efficient training now allows modestly sized models to be accessible to a wider audience… We believe H2O-Danube-1.8B will be a game changer for mobile offline applications," Sri Ambati, CEO and co-founder of H2O, said in a statement.


While Danube has only just been announced, H2O claims it can be fine-tuned to handle a range of natural language applications on small devices, including common sense reasoning, reading comprehension, summarization and translation.

To train the mini model, the company collected a trillion tokens from diverse web sources and used techniques refined from the Llama 2 and Mistral models to improve its generation capabilities.

"We adjusted the Llama 2 architecture for a total of around 1.8 billion parameters. We (then) used the original Llama 2 tokenizer with a vocabulary size of 32,000 and trained our model up to a context length of 16,384. We incorporated the sliding window attention from Mistral with a size of 4,096," the company notes in its description of the model architecture on Hugging Face.
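The sliding window attention borrowed from Mistral can be illustrated with a small mask sketch: each token attends only to itself and the previous `window - 1` tokens, rather than the full causal history. This is a toy illustration, not H2O's actual implementation; the function name and the tiny sizes below are our own, with Danube's real values (window 4,096 over a 16,384-token context) noted in a comment:

```python
import numpy as np

def sliding_window_causal_mask(seq_len: int, window: int) -> np.ndarray:
    """Boolean mask where position i may attend to position j iff
    j <= i (causal) and i - j < window (sliding window)."""
    i = np.arange(seq_len)[:, None]
    j = np.arange(seq_len)[None, :]
    return (j <= i) & (i - j < window)

# Toy sizes for display; Danube reportedly uses window=4096, seq_len up to 16384.
mask = sliding_window_causal_mask(seq_len=6, window=3)
print(mask.astype(int))
```

Each row of the printed matrix shows which earlier positions a token can attend to: early tokens see everything behind them, while later tokens see only the most recent three, which keeps attention cost linear in window size rather than quadratic in context length.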

When tested on benchmarks, the model was found to perform on par with or better than most models in the 1-2B parameter category.

For example, on the HellaSwag test aimed at evaluating common sense natural language inference, it performed with an accuracy of 69.58%, sitting just behind Stability AI's Stable LM 2, a 1.6 billion parameter model pre-trained on 2 trillion tokens. Similarly, on the ARC benchmark for advanced question answering, it ranks third behind Microsoft's Phi 1.5 (a 1.3 billion parameter model) and Stable LM 2, with an accuracy of 39.42%.

H2O has released Danube-1.8B under an Apache 2.0 license for commercial use. Any team looking to deploy the model for a mobile use case can download it from Hugging Face and perform application-specific fine-tuning.

To make this process easier, the company also plans to release additional tools soon. It has also released a chat-tuned version of the model (

