gaoqiong / lm-evaluation-harness · Commits

Commit 4f5b72bc authored Sep 04, 2023 by lintangsutawika

fixed template

parent 4a752867
Changes: 109 · Showing 20 changed files with 20 additions and 20 deletions (+20 −20)
Changed files on this page (each +1 −1):

lm_eval/tasks/bbh/flan_fewshot/formal_fallacies.yaml
lm_eval/tasks/bbh/flan_fewshot/geometric_shapes.yaml
lm_eval/tasks/bbh/flan_fewshot/hyperbaton.yaml
lm_eval/tasks/bbh/flan_fewshot/logical_deduction_five_objects.yaml
lm_eval/tasks/bbh/flan_fewshot/logical_deduction_seven_objects.yaml
lm_eval/tasks/bbh/flan_fewshot/logical_deduction_three_objects.yaml
lm_eval/tasks/bbh/flan_fewshot/movie_recommendation.yaml
lm_eval/tasks/bbh/flan_fewshot/multistep_arithmetic_two.yaml
lm_eval/tasks/bbh/flan_fewshot/navigate.yaml
lm_eval/tasks/bbh/flan_fewshot/object_counting.yaml
lm_eval/tasks/bbh/flan_fewshot/penguins_in_a_table.yaml
lm_eval/tasks/bbh/flan_fewshot/reasoning_about_colored_objects.yaml
lm_eval/tasks/bbh/flan_fewshot/ruin_names.yaml
lm_eval/tasks/bbh/flan_fewshot/salient_translation_error_detection.yaml
lm_eval/tasks/bbh/flan_fewshot/snarks.yaml
lm_eval/tasks/bbh/flan_fewshot/sports_understanding.yaml
lm_eval/tasks/bbh/flan_fewshot/temporal_sequences.yaml
lm_eval/tasks/bbh/flan_fewshot/tracking_shuffled_objects_five_objects.yaml
lm_eval/tasks/bbh/flan_fewshot/tracking_shuffled_objects_seven_objects.yaml
lm_eval/tasks/bbh/flan_fewshot/tracking_shuffled_objects_three_objects.yaml
lm_eval/tasks/bbh/flan_fewshot/formal_fallacies.yaml

"dataset_name": "formal_fallacies"
"description": "Distinguish deductively valid arguments from formal fallacies.\n\n"
"doc_to_text": "Q: \"It is not always easy to see who is related to whom -- and in which ways. The following argument pertains to this question: To begin with, Lesley is a close friend of Fernando. Moreover, being a close friend of Fernando or a schoolmate of Lowell is sufficient for being a great-grandfather of Leroy. It follows that Lesley is a great-grandfather of Leroy.\"\nIs the argument, given the explicitly stated premises, deductively valid or invalid?\nOptions:\n- valid\n- invalid\nA: valid\n\nQ: \"It is not always easy to see who is related to whom -- and in which ways. The following argument pertains to this question: Whoever is not a great-grandfather of Clyde is a stepbrother of Brian. Being an ancestor of Dana is sufficient for not being a great-grandfather of Clyde. We may conclude: Everyone who is an ancestor of Dana is a stepbrother of Brian, too.\"\nIs the argument, given the explicitly stated premises, deductively valid or invalid?\nOptions:\n- valid\n- invalid\nA: valid\n\nQ: \"It is not always easy to grasp who is consuming which products. The following argument pertains to this question: Every infrequent user of Paul Mitchell shampoo is either a rare consumer of Nioxin shampoo or a loyal buyer of Caress soap, or both. No regular consumer of Lush soap is a rare consumer of Nioxin shampoo and, in the same time, a loyal buyer of Caress soap. It follows that whoever is an infrequent user of Paul Mitchell shampoo is not a regular consumer of Lush soap.\"\nIs the argument, given the explicitly stated premises, deductively valid or invalid?\nOptions:\n- valid\n- invalid\nA: invalid\n\nQ: {{input}}\nA:"
-"include": "_template_yaml"
+"include": "_flan_fewshot_template_yaml"
"task": "bbh_flan_fewshot_formal_fallacies"
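The `doc_to_text` value above is a prompt template: at evaluation time the harness substitutes each document's input field for the `{{input}}` placeholder (the harness renders these with Jinja2). A minimal sketch of that substitution, using plain string replacement rather than the harness's actual renderer, with illustrative function and field names:

```python
# Sketch: substitute a document's "input" field into the {{input}}
# placeholder of a doc_to_text prompt template. The real harness uses
# Jinja2; plain replacement is enough to show the shape of the final
# prompt fed to the model.

def render_prompt(template: str, doc: dict) -> str:
    # Only handles the single {{input}} placeholder used by these tasks.
    return template.replace("{{input}}", doc["input"])

template = "Q: {{input}}\nA:"
doc = {"input": "Is the argument deductively valid or invalid?"}
print(render_prompt(template, doc))
```

The few-shot examples are baked directly into the template string, so the model sees them verbatim before the rendered question.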
lm_eval/tasks/bbh/flan_fewshot/geometric_shapes.yaml

"dataset_name": "geometric_shapes"
"description": "Name geometric shapes from their SVG paths.\n\n"
"doc_to_text": "Q: This SVG path element <path d=\"M 31.00,73.00 L 32.00,59.00 L 44.00,50.00 L 49.00,41.00 L 64.00,37.00 L 71.00,55.00 L 64.00,76.00 L 52.00,61.00 L 31.00,73.00\"/> draws a\nOptions:\n(A) circle\n(B) heptagon\n(C) hexagon\n(D) kite\n(E) line\n(F) octagon\n(G) pentagon\n(H) rectangle\n(I) sector\n(J) triangle\nA: (F)\n\nQ: This SVG path element <path d=\"M 14.19,26.04 L 51.43,39.21 L 58.44,36.69 L 56.63,30.17 L 48.53,26.66 L 14.19,26.04\"/> draws a\nOptions:\n(A) circle\n(B) heptagon\n(C) hexagon\n(D) kite\n(E) line\n(F) octagon\n(G) pentagon\n(H) rectangle\n(I) sector\n(J) triangle\nA: (G)\n\nQ: This SVG path element <path d=\"M 41.00,43.00 L 37.00,34.00 L 41.00,33.00 L 45.00,34.00 L 41.00,43.00\"/> draws a\nOptions:\n(A) circle\n(B) heptagon\n(C) hexagon\n(D) kite\n(E) line\n(F) octagon\n(G) pentagon\n(H) rectangle\n(I) sector\n(J) triangle\nA: (D)\n\nQ: {{input}}\nA:"
-"include": "_template_yaml"
+"include": "_flan_fewshot_template_yaml"
"task": "bbh_flan_fewshot_geometric_shapes"
lm_eval/tasks/bbh/flan_fewshot/hyperbaton.yaml

"dataset_name": "hyperbaton"
"description": "Order adjectives correctly in English sentences.\n\n"
"doc_to_text": "Q: Which sentence has the correct adjective order:\nOptions:\n(A) rubber terrible ship\n(B) terrible rubber ship\nA: (B)\n\nQ: Which sentence has the correct adjective order:\nOptions:\n(A) repulsive small Brazilian exercise ship\n(B) Brazilian repulsive exercise small ship\nA: (A)\n\nQ: Which sentence has the correct adjective order:\nOptions:\n(A) blue gold wonderful square shoe\n(B) wonderful square blue gold shoe\nA: (B)\n\nQ: {{input}}\nA:"
-"include": "_template_yaml"
+"include": "_flan_fewshot_template_yaml"
"task": "bbh_flan_fewshot_hyperbaton"
lm_eval/tasks/bbh/flan_fewshot/logical_deduction_five_objects.yaml

"dataset_name": "logical_deduction_five_objects"
"description": "A logical deduction task which requires deducing the order of a sequence of objects.\n\n"
"doc_to_text": "Q: The following paragraphs each describe a set of three objects arranged in a fixed order. The statements are logically consistent within each paragraph. In a golf tournament, there were three golfers: Amy, Eli, and Eve. Eve finished above Amy. Eli finished below Amy.\nOptions:\n(A) Amy finished last\n(B) Eli finished last\n(C) Eve finished last\nA: (B)\n\nQ: The following paragraphs each describe a set of three objects arranged in a fixed order. The statements are logically consistent within each paragraph. On a shelf, there are three books: a white book, a green book, and an orange book. The green book is to the right of the white book. The orange book is the rightmost.\nOptions:\n(A) The white book is the leftmost\n(B) The green book is the leftmost\n(C) The orange book is the leftmost\nA: (A)\n\nQ: The following paragraphs each describe a set of three objects arranged in a fixed order. The statements are logically consistent within each paragraph. On a shelf, there are three books: a red book, a gray book, and a white book. The white book is to the left of the gray book. The red book is the second from the left.\nOptions:\n(A) The red book is the leftmost\n(B) The gray book is the leftmost\n(C) The white book is the leftmost\nA: (C)\n\nQ: {{input}}\nA:"
-"include": "_template_yaml"
+"include": "_flan_fewshot_template_yaml"
"task": "bbh_flan_fewshot_logical_deduction_five_objects"
lm_eval/tasks/bbh/flan_fewshot/logical_deduction_seven_objects.yaml

"dataset_name": "logical_deduction_seven_objects"
"description": "A logical deduction task which requires deducing the order of a sequence of objects.\n\n"
"doc_to_text": "Q: The following paragraphs each describe a set of three objects arranged in a fixed order. The statements are logically consistent within each paragraph. In a golf tournament, there were three golfers: Amy, Eli, and Eve. Eve finished above Amy. Eli finished below Amy.\nOptions:\n(A) Amy finished last\n(B) Eli finished last\n(C) Eve finished last\nA: (B)\n\nQ: The following paragraphs each describe a set of three objects arranged in a fixed order. The statements are logically consistent within each paragraph. On a shelf, there are three books: a white book, a green book, and an orange book. The green book is to the right of the white book. The orange book is the rightmost.\nOptions:\n(A) The white book is the leftmost\n(B) The green book is the leftmost\n(C) The orange book is the leftmost\nA: (A)\n\nQ: The following paragraphs each describe a set of three objects arranged in a fixed order. The statements are logically consistent within each paragraph. On a shelf, there are three books: a red book, a gray book, and a white book. The white book is to the left of the gray book. The red book is the second from the left.\nOptions:\n(A) The red book is the leftmost\n(B) The gray book is the leftmost\n(C) The white book is the leftmost\nA: (C)\n\nQ: {{input}}\nA:"
-"include": "_template_yaml"
+"include": "_flan_fewshot_template_yaml"
"task": "bbh_flan_fewshot_logical_deduction_seven_objects"
lm_eval/tasks/bbh/flan_fewshot/logical_deduction_three_objects.yaml

"dataset_name": "logical_deduction_three_objects"
"description": "A logical deduction task which requires deducing the order of a sequence of objects.\n\n"
"doc_to_text": "Q: The following paragraphs each describe a set of three objects arranged in a fixed order. The statements are logically consistent within each paragraph. In a golf tournament, there were three golfers: Amy, Eli, and Eve. Eve finished above Amy. Eli finished below Amy.\nOptions:\n(A) Amy finished last\n(B) Eli finished last\n(C) Eve finished last\nA: (B)\n\nQ: The following paragraphs each describe a set of three objects arranged in a fixed order. The statements are logically consistent within each paragraph. On a shelf, there are three books: a white book, a green book, and an orange book. The green book is to the right of the white book. The orange book is the rightmost.\nOptions:\n(A) The white book is the leftmost\n(B) The green book is the leftmost\n(C) The orange book is the leftmost\nA: (A)\n\nQ: The following paragraphs each describe a set of three objects arranged in a fixed order. The statements are logically consistent within each paragraph. On a shelf, there are three books: a red book, a gray book, and a white book. The white book is to the left of the gray book. The red book is the second from the left.\nOptions:\n(A) The red book is the leftmost\n(B) The gray book is the leftmost\n(C) The white book is the leftmost\nA: (C)\n\nQ: {{input}}\nA:"
-"include": "_template_yaml"
+"include": "_flan_fewshot_template_yaml"
"task": "bbh_flan_fewshot_logical_deduction_three_objects"
lm_eval/tasks/bbh/flan_fewshot/movie_recommendation.yaml

"dataset_name": "movie_recommendation"
"description": "Recommend movies similar to the given list of movies.\n\n"
"doc_to_text": "Q: Find a movie similar to Star Wars Episode IV - A New Hope, Indiana Jones and the Last Crusade, Star Wars Episode V - The Empire Strikes Back, The Big Lebowski:\nOptions:\n(A) Tetsuo\n(B) the Ironman\n(C) The Princess Bride\n(D) The Barkley Marathons The Race That Eats Its Young\n(E) Bug\nA: (C)\n\nQ: Find a movie similar to Twister, The Silence of the Lambs, Independence Day, Braveheart:\nOptions:\n(A) They Shoot Horses\n(B) Don't They\n(C) Forrest Gump\n(D) The Salton Sea\n(E) Extreme Days\nA: (C)\n\nQ: Find a movie similar to Minority Report, Total Recall, Inside Out, Forrest Gump:\nOptions:\n(A) Phenomena\n(B) Lilting\n(C) Catwoman\n(D) Edge of Tomorrow\nA: (D)\n\nQ: {{input}}\nA:"
-"include": "_template_yaml"
+"include": "_flan_fewshot_template_yaml"
"task": "bbh_flan_fewshot_movie_recommendation"
lm_eval/tasks/bbh/flan_fewshot/multistep_arithmetic_two.yaml

"dataset_name": "multistep_arithmetic_two"
"description": "Solve multi-step arithmetic problems.\n\n"
"doc_to_text": "Q: ((-5 + 9 * -4 - 0) * (4 + -7 + 0 * -5)) =\nA: 123\n\nQ: ((-9 * 7 * 7 * -9) + (4 * -9 - 8 - -4)) =\nA: 3929\n\nQ: ((-3 + 5 * 8 * -4) - (9 - 8 * -7 + -9)) =\nA: -219\n\nQ: {{input}}\nA:"
-"include": "_template_yaml"
+"include": "_flan_fewshot_template_yaml"
"task": "bbh_flan_fewshot_multistep_arithmetic_two"
lm_eval/tasks/bbh/flan_fewshot/navigate.yaml

"dataset_name": "navigate"
"description": "Given a series of navigation instructions, determine whether one would end up back at the starting point.\n\n"
"doc_to_text": "Q: If you follow these instructions, do you return to the starting point? Turn left. Turn around. Turn left. Take 7 steps. Take 2 steps. Take 4 steps. Take 8 steps.\nOptions:\n- Yes\n- No\nA: No\n\nQ: If you follow these instructions, do you return to the starting point? Turn around. Take 1 step. Take 6 steps. Turn around. Take 6 steps. Take 9 steps. Take 1 step.\nOptions:\n- Yes\n- No\nA: No\n\nQ: If you follow these instructions, do you return to the starting point? Always face forward. Take 2 steps right. Take 9 steps left. Take 7 steps right.\nOptions:\n- Yes\n- No\nA: Yes\n\nQ: {{input}}\nA:"
-"include": "_template_yaml"
+"include": "_flan_fewshot_template_yaml"
"task": "bbh_flan_fewshot_navigate"
lm_eval/tasks/bbh/flan_fewshot/object_counting.yaml

"dataset_name": "object_counting"
"description": "Questions that involve enumerating objects and asking the model to count them.\n\n"
"doc_to_text": "Q: I have a blackberry, a clarinet, a nectarine, a plum, a strawberry, a banana, a flute, an orange, and a violin. How many fruits do I have?\nA: 6\n\nQ: I have an orange, a raspberry, two peaches, a blackberry, an apple, a grape, a nectarine, and three plums. How many fruits do I have?\nA: 11\n\nQ: I have a lettuce head, a head of broccoli, an onion, a stalk of celery, two carrots, a garlic, and a yam. How many vegetables do I have?\nA: 8\n\nQ: {{input}}\nA:"
-"include": "_template_yaml"
+"include": "_flan_fewshot_template_yaml"
"task": "bbh_flan_fewshot_object_counting"
lm_eval/tasks/bbh/flan_fewshot/penguins_in_a_table.yaml

"dataset_name": "penguins_in_a_table"
"description": "Answer questions about a table of penguins and their attributes.\n\n"
"doc_to_text": "Q: Here is a table where the first line is a header and each subsequent line is a penguin: name, age, height (cm), weight (kg) Louis, 7, 50, 11 Bernard, 5, 80, 13 Vincent, 9, 60, 11 Gwen, 8, 70, 15 For example: the age of Louis is 7, the weight of Gwen is 15 kg, the height of Bernard is 80 cm. We now add a penguin to the table:\nJames, 12, 90, 12\nHow many penguins are less than 8 years old?\nOptions:\n(A) 1\n(B) 2\n(C) 3\n(D) 4\n(E) 5\nA: (B)\n\nQ: Here is a table where the first line is a header and each subsequent line is a penguin: name, age, height (cm), weight (kg) Louis, 7, 50, 11 Bernard, 5, 80, 13 Vincent, 9, 60, 11 Gwen, 8, 70, 15 For example: the age of Louis is 7, the weight of Gwen is 15 kg, the height of Bernard is 80 cm. Which is the youngest penguin?\nOptions:\n(A) Louis\n(B) Bernard\n(C) Vincent\n(D) Gwen\n(E) James\nA: (B)\n\nQ: Here is a table where the first line is a header and each subsequent line is a penguin: name, age, height (cm), weight (kg) Louis, 7, 50, 11 Bernard, 5, 80, 13 Vincent, 9, 60, 11 Gwen, 8, 70, 15 For example: the age of Louis is 7, the weight of Gwen is 15 kg, the height of Bernard is 80 cm. What is the name of the second penguin sorted by alphabetic order?\nOptions:\n(A) Louis\n(B) Bernard\n(C) Vincent\n(D) Gwen\n(E) James\nA: (D)\n\nQ: {{input}}\nA:"
-"include": "_template_yaml"
+"include": "_flan_fewshot_template_yaml"
"task": "bbh_flan_fewshot_penguins_in_a_table"
lm_eval/tasks/bbh/flan_fewshot/reasoning_about_colored_objects.yaml

"dataset_name": "reasoning_about_colored_objects"
"description": "Answer extremely simple questions about the colors of objects on a surface.\n\n"
"doc_to_text": "Q: On the nightstand, there is a red pencil, a purple mug, a burgundy keychain, a fuchsia teddy bear, a black plate, and a blue stress ball. What color is the stress ball?\nOptions:\n(A) red\n(B) orange\n(C) yellow\n(D) green\n(E) blue\n(F) brown\n(G) magenta\n(H) fuchsia\n(I) mauve\n(J) teal\n(K) turquoise\n(L) burgundy\n(M) silver\n(N) gold\n(O) black\n(P) grey\n(Q) purple\n(R) pink\nA: (E)\n\nQ: On the table, you see a bunch of objects arranged in a row: a purple paperclip, a pink stress ball, a brown keychain, a green scrunchiephone charger, a mauve fidget spinner, and a burgundy pen. What is the color of the object directly to the right of the stress ball?\nOptions:\n(A) red\n(B) orange\n(C) yellow\n(D) green\n(E) blue\n(F) brown\n(G) magenta\n(H) fuchsia\n(I) mauve\n(J) teal\n(K) turquoise\n(L) burgundy\n(M) silver\n(N) gold\n(O) black\n(P) grey\n(Q) purple\n(R) pink\nA: (F)\n\nQ: On the nightstand, you see the following items arranged in a row: a teal plate, a burgundy keychain, a yellow scrunchiephone charger, an orange mug, a pink notebook, and a grey cup. How many non-orange items do you see to the left of the teal item?\nOptions:\n(A) zero\n(B) one\n(C) two\n(D) three\n(E) four\n(F) five\n(G) six\nA: (A)\n\nQ: {{input}}\nA:"
-"include": "_template_yaml"
+"include": "_flan_fewshot_template_yaml"
"task": "bbh_flan_fewshot_reasoning_about_colored_objects"
lm_eval/tasks/bbh/flan_fewshot/ruin_names.yaml

"dataset_name": "ruin_names"
"description": "Select the humorous edit that 'ruins' the input movie or musical artist name.\n\n"
"doc_to_text": "Q: Which of the following is a humorous edit of this artist or movie name: 'whitesnake'?\nOptions:\n(A) whitesnape\n(B) whitesnapke\n(C) whitesnuake\n(D) mwhitesnake\nA: (A)\n\nQ: Which of the following is a humorous edit of this artist or movie name: 'one of our dinosaurs is missing'?\nOptions:\n(A) ofne of our dinosaurs is missing\n(B) one af our dinosaurs is missing\n(C) one of our dinosaurs is pissing\n(D) one of our dinosaur is missing\nA: (C)\n\nQ: Which of the following is a humorous edit of this artist or movie name: 'counting crows'?\nOptions:\n(A) countingy crows\n(B) counting cows\n(C) courting crows\n(D) coutnting crows\nA: (B)\n\nQ: {{input}}\nA:"
-"include": "_template_yaml"
+"include": "_flan_fewshot_template_yaml"
"task": "bbh_flan_fewshot_ruin_names"
lm_eval/tasks/bbh/flan_fewshot/salient_translation_error_detection.yaml

"dataset_name": "salient_translation_error_detection"
"description": "Detect the type of error in an English translation of a German source sentence.\n\n"
"doc_to_text": "Q: The following translations from German to English contain a particular error. That error will be one of the following types: Named Entities: An entity (names, places, locations, etc.) is changed to a different entity. Numerical Values: Numerical values (ordinals or cardinals), dates, and/or units are changed. Modifiers or Adjectives: The modifiers and adjectives pertaining to a noun are changed. Negation or Antonyms: Introduce or remove a negation or change comparatives to their antonyms. Facts: Trivial factual errors not pertaining to the above classes are introduced in the translations. Dropped Content: A significant clause in the translation is removed. Please identify that error. Source: In der Liste der Baudenkmale in Lenzen (Elbe) sind alle Baudenkmale der brandenburgischen Stadt Lenzen (Elbe) und ihrer Ortsteile aufgelistet.\nTranslation: In the list of architectural monuments in Lenzen all architectural monuments of the Brandenburg city of Lenzen and its districts are listed.\nThe translation contains an error pertaining to\nOptions:\n(A) Modifiers or Adjectives\n(B) Numerical Values\n(C) Negation or Antonyms\n(D) Named Entities\n(E) Dropped Content\n(F) Facts\nA: (D)\n\nQ: The following translations from German to English contain a particular error. That error will be one of the following types: Named Entities: An entity (names, places, locations, etc.) is changed to a different entity. Numerical Values: Numerical values (ordinals or cardinals), dates, and/or units are changed. Modifiers or Adjectives: The modifiers and adjectives pertaining to a noun are changed. Negation or Antonyms: Introduce or remove a negation or change comparatives to their antonyms. Facts: Trivial factual errors not pertaining to the above classes are introduced in the translations. Dropped Content: A significant clause in the translation is removed. Please identify that error. Source: Auf dieser Seite sind die Baudenkmäler der oberbayerischen Großen Kreisstadt Landsberg am Lech zusammengestellt.\nTranslation: On this page are compiled the architectural monuments of the town of Landsberg am Lech.\nThe translation contains an error pertaining to\nOptions:\n(A) Modifiers or Adjectives\n(B) Numerical Values\n(C) Negation or Antonyms\n(D) Named Entities\n(E) Dropped Content\n(F) Facts\nA: (E)\n\nQ: The following translations from German to English contain a particular error. That error will be one of the following types: Named Entities: An entity (names, places, locations, etc.) is changed to a different entity. Numerical Values: Numerical values (ordinals or cardinals), dates, and/or units are changed. Modifiers or Adjectives: The modifiers and adjectives pertaining to a noun are changed. Negation or Antonyms: Introduce or remove a negation or change comparatives to their antonyms. Facts: Trivial factual errors not pertaining to the above classes are introduced in the translations. Dropped Content: A significant clause in the translation is removed. Please identify that error. Source: Łeba ist eine Kleinstadt und ein Badeort im Powiat Lęborski der polnischen Woiwodschaft Pommern.\nTranslation: Eba is not a small town and seaside resort in the Powiat Léborski county of the Pomeranian Voivodeship of Poland.\nThe translation contains an error pertaining to\nOptions:\n(A) Modifiers or Adjectives\n(B) Numerical Values\n(C) Negation or Antonyms\n(D) Named Entities\n(E) Dropped Content\n(F) Facts\nA: (C)\n\nQ: {{input}}\nA:"
-"include": "_template_yaml"
+"include": "_flan_fewshot_template_yaml"
"task": "bbh_flan_fewshot_salient_translation_error_detection"
lm_eval/tasks/bbh/flan_fewshot/snarks.yaml

"dataset_name": "snarks"
"description": "Determine which of two sentences is sarcastic.\n\nAccording to Cambridge University Dictionary, sarcasm is \"the use of remarks that clearly mean the opposite of what they say, made in order to hurt someone's feelings or to criticize something in a humorous way.\" Sarcastic sentences often contain satirical or ironic utterances, hyperboles, ambivalent or witty remarks.\n\n"
"doc_to_text": "Q: Which statement is sarcastic?\nOptions:\n(A) Yes, because having interests and actively researching them is a huge waste\n(B) Yes, because having interests and actively researching them is a huge deal\nA: (A)\n\nQ: Which statement is sarcastic?\nOptions:\n(A) No one is going to disagree with you on this. Avoiding ad hominem attacks really help your case\n(B) No one is going to disagree with you on this. Ad hominem attacks really help your case\nA: (B)\n\nQ: Which statement is sarcastic?\nOptions:\n(A) Consistency in the league's punishments? What do you think this is supposed to be, politics?\n(B) Consistency in the league's punishments? What do you think this is supposed to be, moral?\nA: (A)\n\nQ: {{input}}\nA:"
-"include": "_template_yaml"
+"include": "_flan_fewshot_template_yaml"
"task": "bbh_flan_fewshot_snarks"
lm_eval/tasks/bbh/flan_fewshot/sports_understanding.yaml

"dataset_name": "sports_understanding"
"description": "Determine whether an artificially constructed sentence relating to sports is plausible or not.\n\n"
"doc_to_text": "Q: Is the following sentence plausible? \"Bam Adebayo scored a reverse layup in the Western Conference Finals.\"\nA: yes\n\nQ: Is the following sentence plausible? \"Santi Cazorla scored a touchdown.\"\nA: no\n\nQ: Is the following sentence plausible? \"DeMar DeRozan was called for the goal tend.\"\nA: yes\n\nQ: {{input}}\nA:"
-"include": "_template_yaml"
+"include": "_flan_fewshot_template_yaml"
"task": "bbh_flan_fewshot_sports_understanding"
lm_eval/tasks/bbh/flan_fewshot/temporal_sequences.yaml

"dataset_name": "temporal_sequences"
"description": "Task description: Answer questions about which times certain events could have occurred.\n\n"
"doc_to_text": "Q: Today, Emily went to the museum. Between what times could they have gone?\nWe know that:\nEmily woke up at 1pm.\nElizabeth saw Emily reading at the library from 2pm to 4pm.\nJessica saw Emily watching a movie at the theater from 4pm to 5pm.\nLeslie saw Emily waiting at the airport from 5pm to 6pm.\nWilliam saw Emily buying clothes at the mall from 6pm to 7pm.\nThe museum was closed after 7pm.\nBetween what times could Emily have gone to the museum?\nOptions:\n(A) 1pm to 2pm\n(B) 6pm to 7pm\n(C) 5pm to 6pm\n(D) 2pm to 4pm\nA: (A)\n\nQ: Today, Elizabeth went to the amusement park. Between what times could they have gone?\nWe know that:\nElizabeth woke up at 7am.\nDavid saw Elizabeth fixing their computer at the electronic store from 1pm to 2pm.\nSarah saw Elizabeth playing tennis at the tennis court from 2pm to 3pm.\nSusan saw Elizabeth walking towards the Statue of Liberty from 3pm to 6pm.\nAndrew saw Elizabeth taking photos near the Eiffel Tower from 6pm to 9pm.\nEmily saw Elizabeth getting a coffee at the cafe from 9pm to 10pm.\nThe amusement park was closed after 10pm.\nBetween what times could Elizabeth have gone to the amusement park?\nOptions:\n(A) 7am to 1pm\n(B) 9pm to 10pm\n(C) 1pm to 2pm\n(D) 3pm to 6pm\nA: (A)\n\nQ: Today, Tiffany went to the beach. Between what times could they have gone?\nWe know that:\nTiffany woke up at 5am.\nBetty saw Tiffany getting a coffee at the cafe from 5am to 6am.\nJessica saw Tiffany working at the office from 6am to 9am.\nJohn saw Tiffany stretching at a yoga studio from 9am to 12pm.\nSean saw Tiffany sitting on a rooftop from 12pm to 2pm.\nSarah saw Tiffany playing tennis at the tennis court from 2pm to 3pm.\nThe beach was closed after 4pm.\nBetween what times could Tiffany have gone to the beach?\nOptions:\n(A) 9am to 12pm\n(B) 12pm to 2pm\n(C) 5am to 6am\n(D) 3pm to 4pm\nA: (D)\n\nQ: {{input}}\nA:"
-"include": "_template_yaml"
+"include": "_flan_fewshot_template_yaml"
"task": "bbh_flan_fewshot_temporal_sequences"
lm_eval/tasks/bbh/flan_fewshot/tracking_shuffled_objects_five_objects.yaml

"dataset_name": "tracking_shuffled_objects_five_objects"
"description": "A task requiring determining the final positions of a set of objects given their initial positions and a description of a sequence of swaps.\n\n"
"doc_to_text": "Q: Alice, Bob, and Claire are playing a game. At the start of the game, they are each holding a ball: Alice has a yellow ball, Bob has a blue ball, and Claire has a pink ball.\nAs the game progresses, pairs of players trade balls. First, Claire and Alice swap balls. Then, Alice and Bob swap balls. Finally, Claire and Bob swap balls. At the end of the game, Bob has the\nOptions:\n(A) yellow ball\n(B) blue ball\n(C) pink ball\nA: (A)\n\nQ: Alice, Bob, and Claire are playing a game. At the start of the game, they are each holding a ball: Alice has a white ball, Bob has a purple ball, and Claire has a pink ball.\nAs the game progresses, pairs of players trade balls. First, Bob and Alice swap balls. Then, Bob and Claire swap balls. Finally, Bob and Alice swap balls. At the end of the game, Alice has the\nOptions:\n(A) white ball\n(B) purple ball\n(C) pink ball\nA: (C)\n\nQ: Alice, Bob, and Claire are dancers at a square dance. At the start of a song, they each have a partner: Alice is dancing with Lola, Bob is dancing with Rodrigo, and Claire is dancing with Patrick.\nThroughout the song, the dancers often trade partners. First, Alice and Bob switch partners. Then, Claire and Bob switch partners. Finally, Bob and Alice switch partners. At the end of the dance, Alice is dancing with\nOptions:\n(A) Lola\n(B) Rodrigo\n(C) Patrick\nA: (C)\n\nQ: {{input}}\nA:"
-"include": "_template_yaml"
+"include": "_flan_fewshot_template_yaml"
"task": "bbh_flan_fewshot_tracking_shuffled_objects_five_objects"
lm_eval/tasks/bbh/flan_fewshot/tracking_shuffled_objects_seven_objects.yaml

"dataset_name": "tracking_shuffled_objects_seven_objects"
"description": "A task requiring determining the final positions of a set of objects given their initial positions and a description of a sequence of swaps.\n\n"
"doc_to_text": "Q: Alice, Bob, and Claire are playing a game. At the start of the game, they are each holding a ball: Alice has a yellow ball, Bob has a blue ball, and Claire has a pink ball.\nAs the game progresses, pairs of players trade balls. First, Claire and Alice swap balls. Then, Alice and Bob swap balls. Finally, Claire and Bob swap balls. At the end of the game, Bob has the\nOptions:\n(A) yellow ball\n(B) blue ball\n(C) pink ball\nA: (A)\n\nQ: Alice, Bob, and Claire are playing a game. At the start of the game, they are each holding a ball: Alice has a white ball, Bob has a purple ball, and Claire has a pink ball.\nAs the game progresses, pairs of players trade balls. First, Bob and Alice swap balls. Then, Bob and Claire swap balls. Finally, Bob and Alice swap balls. At the end of the game, Alice has the\nOptions:\n(A) white ball\n(B) purple ball\n(C) pink ball\nA: (C)\n\nQ: Alice, Bob, and Claire are dancers at a square dance. At the start of a song, they each have a partner: Alice is dancing with Lola, Bob is dancing with Rodrigo, and Claire is dancing with Patrick.\nThroughout the song, the dancers often trade partners. First, Alice and Bob switch partners. Then, Claire and Bob switch partners. Finally, Bob and Alice switch partners. At the end of the dance, Alice is dancing with\nOptions:\n(A) Lola\n(B) Rodrigo\n(C) Patrick\nA: (C)\n\nQ: {{input}}\nA:"
-"include": "_template_yaml"
+"include": "_flan_fewshot_template_yaml"
"task": "bbh_flan_fewshot_tracking_shuffled_objects_seven_objects"
lm_eval/tasks/bbh/flan_fewshot/tracking_shuffled_objects_three_objects.yaml

"dataset_name": "tracking_shuffled_objects_three_objects"
"description": "A task requiring determining the final positions of a set of objects given their initial positions and a description of a sequence of swaps.\n\n"
"doc_to_text": "Q: Alice, Bob, and Claire are playing a game. At the start of the game, they are each holding a ball: Alice has a yellow ball, Bob has a blue ball, and Claire has a pink ball.\nAs the game progresses, pairs of players trade balls. First, Claire and Alice swap balls. Then, Alice and Bob swap balls. Finally, Claire and Bob swap balls. At the end of the game, Bob has the\nOptions:\n(A) yellow ball\n(B) blue ball\n(C) pink ball\nA: (A)\n\nQ: Alice, Bob, and Claire are playing a game. At the start of the game, they are each holding a ball: Alice has a white ball, Bob has a purple ball, and Claire has a pink ball.\nAs the game progresses, pairs of players trade balls. First, Bob and Alice swap balls. Then, Bob and Claire swap balls. Finally, Bob and Alice swap balls. At the end of the game, Alice has the\nOptions:\n(A) white ball\n(B) purple ball\n(C) pink ball\nA: (C)\n\nQ: Alice, Bob, and Claire are dancers at a square dance. At the start of a song, they each have a partner: Alice is dancing with Lola, Bob is dancing with Rodrigo, and Claire is dancing with Patrick.\nThroughout the song, the dancers often trade partners. First, Alice and Bob switch partners. Then, Claire and Bob switch partners. Finally, Bob and Alice switch partners. At the end of the dance, Alice is dancing with\nOptions:\n(A) Lola\n(B) Rodrigo\n(C) Patrick\nA: (C)\n\nQ: {{input}}\nA:"
-"include": "_template_yaml"
+"include": "_flan_fewshot_template_yaml"
"task": "bbh_flan_fewshot_tracking_shuffled_objects_three_objects"
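The one-line change repeated across all 20 files points each task at `_flan_fewshot_template_yaml` instead of `_template_yaml`. The `include` key pulls shared defaults from a base template file, with the including file's own keys taking precedence. A minimal sketch of that merge semantics (the key names in `base_template` are illustrative placeholders, not the harness's actual template contents, and the loader here is simplified):

```python
# Sketch of YAML-include semantics: the base template supplies
# defaults, and keys in the including task file override them.
# (Illustrative only; the harness's real config loader differs.)

def resolve_include(config: dict, templates: dict) -> dict:
    # Look up the named base template, then layer the task's own
    # keys on top, dropping the "include" directive itself.
    resolved = dict(templates.get(config.get("include"), {}))
    resolved.update({k: v for k, v in config.items() if k != "include"})
    return resolved

base_template = {"output_type": "generate_until"}  # hypothetical default
task_config = {
    "include": "_flan_fewshot_template_yaml",
    "dataset_name": "formal_fallacies",
    "task": "bbh_flan_fewshot_formal_fallacies",
}

resolved = resolve_include(
    task_config, {"_flan_fewshot_template_yaml": base_template}
)
print(resolved["task"], resolved["output_type"])
```

Under this reading, the commit's fix simply retargets every task at the flan_fewshot-specific base template so that the shared defaults come from the right file.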