gaoqiong / lm-evaluation-harness

Commit e1ae8a2f, authored Nov 26, 2023 by Herbie Bradley

Merge remote-tracking branch 'origin/big-refactor' into calibration

Parents: 50e99bd7, 30936bc7
Changes: 1000
Showing 20 changed files with 217 additions and 0 deletions (+217 −0)
lm_eval/tasks/mmlu/flan_cot_fewshot/mmlu_us_foreign_policy.yaml   +40 −0
lm_eval/tasks/mmlu/flan_cot_fewshot/mmlu_virology.yaml   +31 −0
lm_eval/tasks/mmlu/flan_cot_fewshot/mmlu_world_religions.yaml   +27 −0
lm_eval/tasks/mmlu/flan_cot_zeroshot/_mmlu.yaml   +6 −0
lm_eval/tasks/mmlu/flan_cot_zeroshot/_mmlu_flan_cot_zeroshot_template_yaml   +23 −0
lm_eval/tasks/mmlu/flan_cot_zeroshot/mmlu_abstract_algebra.yaml   +6 −0
lm_eval/tasks/mmlu/flan_cot_zeroshot/mmlu_anatomy.yaml   +6 −0
lm_eval/tasks/mmlu/flan_cot_zeroshot/mmlu_astronomy.yaml   +6 −0
lm_eval/tasks/mmlu/flan_cot_zeroshot/mmlu_business_ethics.yaml   +6 −0
lm_eval/tasks/mmlu/flan_cot_zeroshot/mmlu_clinical_knowledge.yaml   +6 −0
lm_eval/tasks/mmlu/flan_cot_zeroshot/mmlu_college_biology.yaml   +6 −0
lm_eval/tasks/mmlu/flan_cot_zeroshot/mmlu_college_chemistry.yaml   +6 −0
lm_eval/tasks/mmlu/flan_cot_zeroshot/mmlu_college_computer_science.yaml   +6 −0
lm_eval/tasks/mmlu/flan_cot_zeroshot/mmlu_college_mathematics.yaml   +6 −0
lm_eval/tasks/mmlu/flan_cot_zeroshot/mmlu_college_medicine.yaml   +6 −0
lm_eval/tasks/mmlu/flan_cot_zeroshot/mmlu_college_physics.yaml   +6 −0
lm_eval/tasks/mmlu/flan_cot_zeroshot/mmlu_computer_security.yaml   +6 −0
lm_eval/tasks/mmlu/flan_cot_zeroshot/mmlu_conceptual_physics.yaml   +6 −0
lm_eval/tasks/mmlu/flan_cot_zeroshot/mmlu_econometrics.yaml   +6 −0
lm_eval/tasks/mmlu/flan_cot_zeroshot/mmlu_electrical_engineering.yaml   +6 −0
Too many changes to show. To preserve performance, only 1000 of 1000+ files are displayed.
lm_eval/tasks/mmlu/flan_cot_fewshot/mmlu_us_foreign_policy.yaml (new file, 0 → 100644)

"dataset_name": "us_foreign_policy"
"description": "The following are multiple choice questions (with answers) about us foreign policy.\n\nQ: How did Donald Trump attack globalization in the 2016 campaign?\n(A) Globalization had made men like him too rich (B) Globalization only benefited certain American states, such as New York (C) Liberal elites had encouraged globalization, while 'ordinary Americans' lost jobs because of it (D) Globalization encouraged damaging trade wars\nA: Let's think step by step. We refer to Wikipedia articles on us foreign policy for help. Trump attacked globalization because he believed ordinary Americans lost jobs due to it, and so he wanted to blame liberals who had encouraged it. The answer is (C).\n\nQ: How did NSC-68 change U.S. strategy?\n(A) It globalized containment. (B) It militarized containment. (C) It called for the development of the hydrogen bomb. (D) All of the above\nA: Let's think step by step. We refer to Wikipedia articles on us foreign policy for help. NSC-68 outlined a variety of courses of action, including globalization of containment, militarization of containment, and the development of the hydrogen bomb. The answer is (D).\n\nQ: How do Defensive Realism and Offensive Realism differ in their explanation of state behaviour?\n(A) Defensive realists place greater emphasis on the role of international institutions (B) Defensive realists place less emphasis on geographical factors (C) Offensive realists give more priority to the national interest than Defensive realists. (D) Defensive realists believe states are security maximizers, while Offensive realists believe states to be power maximizers\nA: Let's think step by step. We refer to Wikipedia articles on us foreign policy for help. While defensive realism advocates that states are security maximizers, offensive realists think of states as power maximizers. The answer is (D).\n\nQ: The realm of policy decisions concerned primarily with relations between the United States and the rest of the world is known as\n(A) terrorism policy. (B) economic policy. (C) foreign policy. (D) international policy.\nA: Let's think step by step. We refer to Wikipedia articles on us foreign policy for help. The realm of policy decisions concerned with relations between the US and the rest of the world is known as foreign policy. The answer is (C).\n\nQ: How did the 2008 financial crisis affect America's international reputation?\n(A) It damaged support for the US model of political economy and capitalism (B) It created anger at the United States for exaggerating the crisis (C) It increased support for American global leadership under President Obama (D) It reduced global use of the US dollar\nA: Let's think step by step. We refer to Wikipedia articles on us foreign policy for help. The 2008 financial crisis damaged the international reputation of the American model of political economy and capitalism. The answer is (A)."
"group": "mmlu_flan_cot_fewshot_social_sciences"
"include": "_mmlu_flan_cot_fewshot_template_yaml"
"task": "mmlu_flan_cot_fewshot_us_foreign_policy"
lm_eval/tasks/mmlu/flan_cot_fewshot/mmlu_virology.yaml (new file, 0 → 100644)

"dataset_name": "virology"
"description": "The following are multiple choice questions (with answers) about virology.\n\nQ: The median survival time to AIDS and death was established by following:\n(A) Seroprevalent HIV-infected individuals (B) Seronegatives (C) Seroconverters (D) High-risk seronegatives\nA: Let's think step by step. We refer to Wikipedia articles on virology for help. The median survival time to AIDS and death was established as a result of the development of seroconverters. The answer is (C).\n\nQ: Which of the following is a morphological characteristic of the paramyxoviruses.\n(A) Fragile viruses often visualised with RNA spewing from the inside (B) Elongate viruses (C) Icosahedral viruses with envelope (D) Very large viruses\nA: Let's think step by step. We refer to Wikipedia articles on virology for help. Paramyxoviruses are fragile viruses often visualised with RNA spewing from the inside. The answer is (A).\n\nQ: The most important goal of a behavioral intervention is:\n(A) Change in behavior (B) Comprehensive coverage (C) Effective use of behavioral theory (D) Sustained behavior change\nA: Let's think step by step. We refer to Wikipedia articles on virology for help. The primary goal of a behavioral intervention is to cause sustained behavior change. The answer is (D).\n\nQ: A key factor facilitating the application of nested case-control studies from the MACS was:\n(A) Data collection (B) Establishment of a repository of biologic specimens (C) Participant interest (D) Administration of the questionnaire by staff\nA: Let's think step by step. We refer to Wikipedia articles on virology for help. The Multicenter AIDS Cohort Study's use of nested case-control studies was facilitated by the establishment of a repository of biologic specimens. The answer is (B).\n\nQ: Why are parvoviruses a highly impactful parasite?\n(A) Because they have no nucleic acid (B) They require a helper virus (C) Only replicate in dividing cells (D) Can integrate into host chromosomes\nA: Let's think step by step. We refer to Wikipedia articles on virology for help. Parvoviruses are highly impactful because they do not have nucleic acid. The answer is (A)."
"group": "mmlu_flan_cot_fewshot_other"
"include": "_mmlu_flan_cot_fewshot_template_yaml"
"task": "mmlu_flan_cot_fewshot_virology"
lm_eval/tasks/mmlu/flan_cot_fewshot/mmlu_world_religions.yaml (new file, 0 → 100644)

"dataset_name": "world_religions"
"description": "The following are multiple choice questions (with answers) about world religions.\n\nQ: How can the Upanishads be characterized?\n(A) Ritual texts (B) Philosophical texts (C) Hymns (D) Origin stories\nA: Let's think step by step. We refer to Wikipedia articles on world religions for help. The Upanishads are the most recent part of Vedas (the oldest scriptures in Hinduism) and supplied the basis of later Hindu philosophy. So they are philosophical texts. The answer is (B).\n\nQ: What is the Second Gem in Buddhism?\n(A) The Dharma (B) The Sangha (C) The Buddha (D) The Bodhisattva\nA: Let's think step by step. We refer to Wikipedia articles on world religions for help. The Second Gem in Buddhism is The Dharma. The answer is (A).\n\nQ: Which Japanese government promoted a kind of national cult based on the emperor and his associations with kami?\n(A) Honen (B) Tanaka (C) Tokugawa (D) Meiji\nA: Let's think step by step. We refer to Wikipedia articles on world religions for help. The promotion of a national cult based on the emperor and his associations with Kami happened during the reign of Emperor Meiji (1852-1912). The answer is (D).\n\nQ: In which dynasty was the \"Mandate of Heaven\" developed to legitimatize the new rulers?\n(A) Shang (B) Zhou (C) Han (D) Xia\nA: Let's think step by step. We refer to Wikipedia articles on world religions for help. The \"Mandate of Heaven\" was developed as an ancient Chinese philosophical concept during the Zhou Dynasty (1046-256 BCE). The answer is (B).\n\nQ: What is the sign of the covenant for Jewish males?\n(A) The rainbow (B) Circumcision (C) A son (D) Bar mitzvah\nA: Let's think step by step. We refer to Wikipedia articles on world religions for help. In Judaism, the most distinctive sign of the covenant is circumcision (brit milah). The answer is (B)."
"group": "mmlu_flan_cot_fewshot_humanities"
"include": "_mmlu_flan_cot_fewshot_template_yaml"
"task": "mmlu_flan_cot_fewshot_world_religions"
lm_eval/tasks/mmlu/flan_cot_zeroshot/_mmlu.yaml (new file, 0 → 100644)

group: mmlu_flan_cot_zeroshot
task:
  - mmlu_flan_cot_zeroshot_stem
  - mmlu_flan_cot_zeroshot_other
  - mmlu_flan_cot_zeroshot_social_sciences
  - mmlu_flan_cot_zeroshot_humanities
lm_eval/tasks/mmlu/flan_cot_zeroshot/_mmlu_flan_cot_zeroshot_template_yaml (new file, 0 → 100644)

dataset_path: hails/mmlu_no_train # a copy of `cais/mmlu` with no auxiliary_train split
validation_split: validation
fewshot_split: dev
output_type: generate_until
doc_to_text: "Q: {{question.strip()}}\n(A) {{choices[0]}} (B) {{choices[1]}} (C) {{choices[2]}} (D) {{choices[3]}}\nA: Let's think step by step."
doc_to_target: "{{['(A)', '(B)', '(C)', '(D)'][answer]}}"
filter_list:
  - name: "get-answer"
    filter:
      - function: "regex"
        regex_pattern: "((?<=The answer is )(.*)(?=.)|(?<=the answer is )(.*)(?=.)|(?<=The answer: )(.*)(?=.)|(?<=The final answer: )(.*)(?=.))"
      - function: "take_first"
generation_kwargs:
  until:
    - "</s>"
  do_sample: false
  temperature: 0.0
metric_list:
  - metric: exact_match
    aggregation: mean
    higher_is_better: true
    ignore_case: true
    ignore_punctuation: true
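The `get-answer` filter above extracts the final choice from a chain-of-thought generation before `exact_match` scoring. A minimal standalone Python sketch of how that regex behaves (the sample generation is made up; this runs outside the harness):

```python
import re

# The regex_pattern from the "get-answer" filter, copied verbatim. Each
# alternation branch is a lookbehind for one answer cue; the trailing (?=.)
# forces the greedy (.*) to leave one character (the closing period) behind.
ANSWER_RE = re.compile(
    r"((?<=The answer is )(.*)(?=.)|(?<=the answer is )(.*)(?=.)"
    r"|(?<=The answer: )(.*)(?=.)|(?<=The final answer: )(.*)(?=.))"
)

def extract_answer(generation: str):
    """Return the text after a recognized answer cue, or None if no cue matched."""
    m = ANSWER_RE.search(generation)
    return m.group(0) if m else None

print(extract_answer(
    "A: Let's think step by step. NSC-68 did all of these. The answer is (D)."
))  # -> (D)
```

Note the pattern only strips a single trailing character, so a generation ending "...The answer is (D)" with no period would yield "(D" — one reason `ignore_punctuation: true` is set on the metric.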
lm_eval/tasks/mmlu/flan_cot_zeroshot/mmlu_abstract_algebra.yaml (new file, 0 → 100644)

"dataset_name": "abstract_algebra"
"description": "The following are multiple choice questions (with answers) about abstract algebra.\n\n"
"group": "mmlu_flan_cot_zeroshot_stem"
"include": "_mmlu_flan_cot_zeroshot_template_yaml"
"task": "mmlu_flan_cot_zeroshot_abstract_algebra"

lm_eval/tasks/mmlu/flan_cot_zeroshot/mmlu_anatomy.yaml (new file, 0 → 100644)

"dataset_name": "anatomy"
"description": "The following are multiple choice questions (with answers) about anatomy.\n\n"
"group": "mmlu_flan_cot_zeroshot_stem"
"include": "_mmlu_flan_cot_zeroshot_template_yaml"
"task": "mmlu_flan_cot_zeroshot_anatomy"

lm_eval/tasks/mmlu/flan_cot_zeroshot/mmlu_astronomy.yaml (new file, 0 → 100644)

"dataset_name": "astronomy"
"description": "The following are multiple choice questions (with answers) about astronomy.\n\n"
"group": "mmlu_flan_cot_zeroshot_stem"
"include": "_mmlu_flan_cot_zeroshot_template_yaml"
"task": "mmlu_flan_cot_zeroshot_astronomy"

lm_eval/tasks/mmlu/flan_cot_zeroshot/mmlu_business_ethics.yaml (new file, 0 → 100644)

"dataset_name": "business_ethics"
"description": "The following are multiple choice questions (with answers) about business ethics.\n\n"
"group": "mmlu_flan_cot_zeroshot_other"
"include": "_mmlu_flan_cot_zeroshot_template_yaml"
"task": "mmlu_flan_cot_zeroshot_business_ethics"

lm_eval/tasks/mmlu/flan_cot_zeroshot/mmlu_clinical_knowledge.yaml (new file, 0 → 100644)

"dataset_name": "clinical_knowledge"
"description": "The following are multiple choice questions (with answers) about clinical knowledge.\n\n"
"group": "mmlu_flan_cot_zeroshot_other"
"include": "_mmlu_flan_cot_zeroshot_template_yaml"
"task": "mmlu_flan_cot_zeroshot_clinical_knowledge"

lm_eval/tasks/mmlu/flan_cot_zeroshot/mmlu_college_biology.yaml (new file, 0 → 100644)

"dataset_name": "college_biology"
"description": "The following are multiple choice questions (with answers) about college biology.\n\n"
"group": "mmlu_flan_cot_zeroshot_stem"
"include": "_mmlu_flan_cot_zeroshot_template_yaml"
"task": "mmlu_flan_cot_zeroshot_college_biology"

lm_eval/tasks/mmlu/flan_cot_zeroshot/mmlu_college_chemistry.yaml (new file, 0 → 100644)

"dataset_name": "college_chemistry"
"description": "The following are multiple choice questions (with answers) about college chemistry.\n\n"
"group": "mmlu_flan_cot_zeroshot_stem"
"include": "_mmlu_flan_cot_zeroshot_template_yaml"
"task": "mmlu_flan_cot_zeroshot_college_chemistry"

lm_eval/tasks/mmlu/flan_cot_zeroshot/mmlu_college_computer_science.yaml (new file, 0 → 100644)

"dataset_name": "college_computer_science"
"description": "The following are multiple choice questions (with answers) about college computer science.\n\n"
"group": "mmlu_flan_cot_zeroshot_stem"
"include": "_mmlu_flan_cot_zeroshot_template_yaml"
"task": "mmlu_flan_cot_zeroshot_college_computer_science"

lm_eval/tasks/mmlu/flan_cot_zeroshot/mmlu_college_mathematics.yaml (new file, 0 → 100644)

"dataset_name": "college_mathematics"
"description": "The following are multiple choice questions (with answers) about college mathematics.\n\n"
"group": "mmlu_flan_cot_zeroshot_stem"
"include": "_mmlu_flan_cot_zeroshot_template_yaml"
"task": "mmlu_flan_cot_zeroshot_college_mathematics"

lm_eval/tasks/mmlu/flan_cot_zeroshot/mmlu_college_medicine.yaml (new file, 0 → 100644)

"dataset_name": "college_medicine"
"description": "The following are multiple choice questions (with answers) about college medicine.\n\n"
"group": "mmlu_flan_cot_zeroshot_other"
"include": "_mmlu_flan_cot_zeroshot_template_yaml"
"task": "mmlu_flan_cot_zeroshot_college_medicine"

lm_eval/tasks/mmlu/flan_cot_zeroshot/mmlu_college_physics.yaml (new file, 0 → 100644)

"dataset_name": "college_physics"
"description": "The following are multiple choice questions (with answers) about college physics.\n\n"
"group": "mmlu_flan_cot_zeroshot_stem"
"include": "_mmlu_flan_cot_zeroshot_template_yaml"
"task": "mmlu_flan_cot_zeroshot_college_physics"

lm_eval/tasks/mmlu/flan_cot_zeroshot/mmlu_computer_security.yaml (new file, 0 → 100644)

"dataset_name": "computer_security"
"description": "The following are multiple choice questions (with answers) about computer security.\n\n"
"group": "mmlu_flan_cot_zeroshot_stem"
"include": "_mmlu_flan_cot_zeroshot_template_yaml"
"task": "mmlu_flan_cot_zeroshot_computer_security"

lm_eval/tasks/mmlu/flan_cot_zeroshot/mmlu_conceptual_physics.yaml (new file, 0 → 100644)

"dataset_name": "conceptual_physics"
"description": "The following are multiple choice questions (with answers) about conceptual physics.\n\n"
"group": "mmlu_flan_cot_zeroshot_stem"
"include": "_mmlu_flan_cot_zeroshot_template_yaml"
"task": "mmlu_flan_cot_zeroshot_conceptual_physics"

lm_eval/tasks/mmlu/flan_cot_zeroshot/mmlu_econometrics.yaml (new file, 0 → 100644)

"dataset_name": "econometrics"
"description": "The following are multiple choice questions (with answers) about econometrics.\n\n"
"group": "mmlu_flan_cot_zeroshot_social_sciences"
"include": "_mmlu_flan_cot_zeroshot_template_yaml"
"task": "mmlu_flan_cot_zeroshot_econometrics"

lm_eval/tasks/mmlu/flan_cot_zeroshot/mmlu_electrical_engineering.yaml (new file, 0 → 100644)

"dataset_name": "electrical_engineering"
"description": "The following are multiple choice questions (with answers) about electrical engineering.\n\n"
"group": "mmlu_flan_cot_zeroshot_stem"
"include": "_mmlu_flan_cot_zeroshot_template_yaml"
"task": "mmlu_flan_cot_zeroshot_electrical_engineering"
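Each subject file supplies only a `dataset_name`, `description`, `group`, and `task`; the actual prompt and target come from the shared template's `doc_to_text` and `doc_to_target`. A plain-Python sketch of that rendering, using a hypothetical document in the `cais/mmlu` schema (the harness itself renders these fields with Jinja templating; this is only an illustrative equivalent):

```python
# Hypothetical MMLU-style document: question text, four choices, answer index.
doc = {
    "question": "  How can the Upanishads be characterized?  ",
    "choices": ["Ritual texts", "Philosophical texts", "Hymns", "Origin stories"],
    "answer": 1,
}

def doc_to_text(doc: dict) -> str:
    # Mirrors the template string:
    # "Q: {{question.strip()}}\n(A) {{choices[0]}} ... \nA: Let's think step by step."
    q = doc["question"].strip()
    a, b, c, d = doc["choices"]
    return f"Q: {q}\n(A) {a} (B) {b} (C) {c} (D) {d}\nA: Let's think step by step."

def doc_to_target(doc: dict) -> str:
    # Mirrors: "{{['(A)', '(B)', '(C)', '(D)'][answer]}}"
    return ["(A)", "(B)", "(C)", "(D)"][doc["answer"]]

print(doc_to_text(doc).splitlines()[0])  # -> Q: How can the Upanishads be characterized?
print(doc_to_target(doc))                # -> (B)
```

The model's generation is then cut at `</s>`, passed through the `get-answer` regex filter, and compared to the target with case- and punctuation-insensitive `exact_match`.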