"git@developer.sourcefind.cn:chenpangpang/transformers.git" did not exist on "d5848a574a3990c95f20512673ecef9f57e0fe81"
Unverified Commit 9459d821 authored by Chi, committed by GitHub

Remove a redundant variable. (#27288)

* Removed the redundant SiLUActivation class and now use nn.functional.silu directly.

* I apologize for adding torch.functional.silu. I have replaced it with nn.SiLU.

* Remove redundant variable in feature_extraction file
parent 88832c01
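For context on the first two commit-message items: SiLU is simply x · sigmoid(x), which is exactly what `torch.nn.functional.silu` (and the `nn.SiLU` module) computes, so a custom `SiLUActivation` class adds nothing. A minimal pure-Python sketch of the function (not the transformers code, just an illustration of the definition):

```python
import math


def silu(x: float) -> float:
    """SiLU (a.k.a. swish): x * sigmoid(x).

    Illustrative sketch of the function that nn.functional.silu
    provides, which is why a hand-rolled activation class is redundant.
    """
    return x * (1.0 / (1.0 + math.exp(-x)))
```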
@@ -77,8 +77,7 @@ class FeatureExtractionPipeline(Pipeline):
         return preprocess_params, {}, postprocess_params
 
     def preprocess(self, inputs, **tokenize_kwargs) -> Dict[str, GenericTensor]:
-        return_tensors = self.framework
-        model_inputs = self.tokenizer(inputs, return_tensors=return_tensors, **tokenize_kwargs)
+        model_inputs = self.tokenizer(inputs, return_tensors=self.framework, **tokenize_kwargs)
         return model_inputs
 
     def _forward(self, model_inputs):
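The diff above only inlines a single-use temporary, so the refactor is behavior-preserving. A small sketch with stub objects (hypothetical stand-ins, not the transformers classes) showing both versions of `preprocess` produce identical model inputs:

```python
class StubTokenizer:
    """Stand-in for a tokenizer: echoes back what it was called with."""

    def __call__(self, inputs, return_tensors=None, **kwargs):
        return {"inputs": inputs, "return_tensors": return_tensors, **kwargs}


class StubPipeline:
    """Stand-in pipeline carrying the two variants of preprocess()."""

    framework = "pt"
    tokenizer = StubTokenizer()

    def preprocess_old(self, inputs, **tokenize_kwargs):
        return_tensors = self.framework  # single-use temporary (removed by the commit)
        return self.tokenizer(inputs, return_tensors=return_tensors, **tokenize_kwargs)

    def preprocess_new(self, inputs, **tokenize_kwargs):
        # temporary inlined; same call, same result
        return self.tokenizer(inputs, return_tensors=self.framework, **tokenize_kwargs)
```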