chenpangpang / transformers · Commits

Commit a6b4d1ad (unverified), authored Jun 20, 2023 by Sylvain Gugger

Remove print statement

Parent: 6c134444
Showing 1 changed file with 0 additions and 2 deletions.
src/transformers/models/whisper/tokenization_whisper.py (+0, -2)
@@ -904,7 +904,6 @@ def _decode_asr(tokenizer, model_outputs, *, return_timestamps, return_language,
         if current_tokens:
             previous_tokens.append(current_tokens)
         elif not (any(p for p in previous_tokens)):
-            # print("Flushing previous tokens (END)")
             chunk = new_chunk()
             previous_tokens = []
             current_tokens = []
@@ -917,7 +916,6 @@ def _decode_asr(tokenizer, model_outputs, *, return_timestamps, return_language,
             )
         # Happens when we don't use timestamps
         resolved_tokens = _find_longest_common_sequence(previous_tokens)
-        # print("Flushing previous tokens (FINAL)")
         resolved_text = tokenizer.decode(resolved_tokens)
         chunk["text"] = resolved_text
         chunks.append(chunk)
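For context (not part of this commit): the lines touched above sit in the chunk-merging path of Whisper's long-form decoding, where overlapping chunks are stitched together via _find_longest_common_sequence before the final text is decoded. A minimal sketch of how that path is typically exercised through the ASR pipeline follows; the checkpoint name "openai/whisper-tiny" and the file "audio.wav" are placeholder assumptions, not part of the commit.

# Sketch only: "audio.wav" and the openai/whisper-tiny checkpoint are placeholders.
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="openai/whisper-tiny",
    chunk_length_s=30,  # chunked long-form decoding triggers the merge logic shown in the diff
)

result = asr("audio.wav", return_timestamps=True)
print(result["text"])    # full transcript stitched from overlapping chunks
print(result["chunks"])  # per-segment text with (start, end) timestamps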