Orca 2: Enhancing Reasoning in Smaller Language Models

  • 📰 hackernoon

Orca 2 enhances small language models' reasoning by teaching diverse strategies for tasks, outperforming models up to 10x larger in complex benchmarks.

Authors: Arindam Mitra; Luciano Del Corro (work done while at Microsoft); Shweti Mahajan (work done while at Microsoft); Andres Codas (equal contribution); Clarisse Simoes (equal contribution); Sahaj Agarwal; Xuxi Chen (work done while at Microsoft); Anastasia Razdaibiedina (work done while at Microsoft); Erik Jones (work done while at Microsoft); Kriti Aggarwal (work done while at Microsoft); Hamid Palangi; Guoqing Zheng; Corby Rosset; Hamed Khanpour; Ahmed Awadallah.

The evaluation covers AGIEval, BigBench-Hard, DROP, RACE, GSM8K, and CRASS. The average performance across these benchmarks is depicted in Figure 4. When comparing Orca 2 to other models, we observe the following:

  • Surpassing models of the same size: Orca-2-13B significantly outperforms models of the same size on zero-shot reasoning tasks, with a relative improvement of 47.54% over LLaMA-2-Chat-13B and 28.15% over WizardLM-13B.
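The relative-improvement percentages quoted above follow from a simple formula; the sketch below shows the computation, using placeholder average scores (hypothetical values for illustration, not the paper's actual benchmark averages):

```python
def relative_improvement(model_score: float, baseline_score: float) -> float:
    """Relative improvement of a model over a baseline, as a percentage."""
    return (model_score - baseline_score) / baseline_score * 100.0

# Hypothetical average benchmark scores for illustration only:
orca2_13b_avg = 66.9
llama2_chat_13b_avg = 45.3

print(f"{relative_improvement(orca2_13b_avg, llama2_chat_13b_avg):.2f}%")
```

With the paper's actual per-benchmark averages plugged in, this formula yields the reported 47.54% and 28.15% figures.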






