OpenDAS / Megatron-LM · Commits

Commit 80047314, authored Apr 28, 2021 by mpatwary
Commit message: removed commnets
Parent: b9fcb7b4
Showing 1 changed file with 2 additions and 8 deletions (+2 -8)

megatron/model/biencoder_model.py
@@ -88,16 +88,10 @@ class BiEncoderModel(MegatronModule):
     def set_input_tensor(self, input_tensor):
         """See megatron.model.transformer.set_input_tensor()"""
+        #this is just a placeholder and will be needed when model
+        #parallelism will be used
         #self.language_model.set_input_tensor(input_tensor)
         return
-        # #if self._model_key is not None:
-        # #     print("_model_key {}".format(self._model_key), flush=True)
-        # print(input_tensor)
-        # if self._query_key is not None:
-        #     print("_query_key {}".format(self._query_key), flush=True)
-        # if self._context_key is not None:
-        #     print("_context_key {}".format(self._context_key), flush=True)
-        # exit()

     def forward(self, query_tokens, query_attention_mask, query_types,
                 context_tokens, context_attention_mask, context_types):
...
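For context on the placeholder this commit leaves behind: the docstring points at megatron.model.transformer's set_input_tensor, which pipeline model parallelism uses to hand a stage the activations produced by the previous stage. Below is a minimal, self-contained sketch of that pattern; ToyLanguageModel and ToyBiEncoderModel are hypothetical stand-ins, not Megatron-LM's actual classes, and the doubling "computation" is a placeholder for real model math.

```python
# Hypothetical sketch of the set_input_tensor pattern referenced in the
# diff (NOT Megatron-LM's implementation). With pipeline parallelism,
# non-first stages ignore forward()'s input and instead consume the
# tensor injected by the scheduler via set_input_tensor().

class ToyLanguageModel:
    def __init__(self):
        self.input_tensor = None

    def set_input_tensor(self, input_tensor):
        # Stash the activations handed over by the pipeline scheduler.
        self.input_tensor = input_tensor

    def forward(self, tokens):
        # First stage: use the real input. Later stages: use the
        # injected tensor from the previous stage.
        hidden = self.input_tensor if self.input_tensor is not None else tokens
        return [h * 2 for h in hidden]  # stand-in for real computation


class ToyBiEncoderModel:
    def __init__(self):
        self.language_model = ToyLanguageModel()

    def set_input_tensor(self, input_tensor):
        # What the commented-out line in the diff would do once model
        # parallelism is enabled: forward to the wrapped language model.
        self.language_model.set_input_tensor(input_tensor)


model = ToyBiEncoderModel()
model.set_input_tensor([1, 2, 3])
print(model.language_model.forward([9, 9, 9]))  # → [2, 4, 6], injected tensor wins
```

This shows why the method can safely be a no-op for now: without pipeline parallelism there is no upstream stage to inject a tensor, so forward() simply uses its own arguments.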