Adds an example about FunctionTransformer #1042
Conversation
Signed-off-by: Xavier Dupre <[email protected]>
operator.outputs[0].type = input_type([input_dim, 1])

def growth_converter(scope, operator, container):
@xadupre what do these function arguments mean? And is there any documentation that explains the structure/attributes they have?
I should have added annotations: scope: Scope, operator: Operator, container: ModelComponentContainer. scope is a class that stores names and hands out unique ones, operator is a container holding the scikit-learn operator being converted, and container stores the created ONNX nodes. The documentation contains examples and describes some of these methods: http://onnx.ai/sklearn-onnx/api_summary.html#.
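To make the three arguments concrete, here is a minimal sketch of how a converter typically uses them, assuming the standard sklearn-onnx signature; the seed name "growth" and the Identity node are illustrative placeholders, not code from this PR:

```python
def growth_converter(scope, operator, container):
    # scope hands out names that are unique within the whole model
    op_name = scope.get_unique_operator_name("growth")
    # operator.inputs / operator.outputs hold the variables the converter
    # must connect; full_name is the name of the ONNX tensor
    input_name = operator.inputs[0].full_name
    output_name = operator.outputs[0].full_name
    # container records the ONNX node(s) implementing the transformer;
    # a real converter would emit nodes computing the transform here
    container.add_node("Identity", [input_name], [output_name], name=op_name)
```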
Thanks @xadupre. I have the following transformer:
import numpy as np
from sklearn.base import BaseEstimator, TransformerMixin

class Clipper(BaseEstimator, TransformerMixin):
    def __init__(self, lower_limit, upper_limit):
        self.lower_limit = lower_limit
        self.upper_limit = upper_limit

    def clip(self, data):
        return np.clip(data, self.lower_limit, self.upper_limit)

    def fit(self, X, y=None):
        return self

    def transform(self, X, y=None):
        # X is expected to be a pandas DataFrame; clip row by row
        x = X.apply(lambda x: self.clip(x), axis=1)
        return x.values.reshape((-1, 1))

    def get_feature_names_out(self):
        pass
Questions:
- Is there a way to write a converter that accepts the lower-limit and upper-limit arguments (similar to my __init__ method)?
- In your example below, you use operator.inputs[0]. What other contents does the inputs list contain?
The converter gets access to the model it converts, so it has access to any of its attributes. If the information you need is not stored in the transformer, you can use the option mechanism: https://onnx.ai/sklearn-onnx/parameterized.html.
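As a hedged sketch of that answer (not code from this PR), a converter for the Clipper above could read both limits from the fitted instance via operator.raw_operator, so no extra converter arguments are needed; Clip (opset 11 and later) takes min/max as inputs, so they are stored as initializers. The inlined FLOAT = 1 stands in for onnx.TensorProto.FLOAT to keep the sketch dependency-free:

```python
FLOAT = 1  # value of onnx.TensorProto.FLOAT, inlined to avoid the import

def clipper_converter(scope, operator, container):
    clipper = operator.raw_operator  # the fitted Clipper instance
    input_name = operator.inputs[0].full_name
    output_name = operator.outputs[0].full_name
    # store the two limits as scalar initializers with unique names
    low = scope.get_unique_variable_name("lower_limit")
    high = scope.get_unique_variable_name("upper_limit")
    container.add_initializer(low, FLOAT, [], [float(clipper.lower_limit)])
    container.add_initializer(high, FLOAT, [], [float(clipper.upper_limit)])
    # Clip(X, min, max) bounds every element of X to [min, max]
    container.add_node(
        "Clip", [input_name, low, high], [output_name],
        name=scope.get_unique_operator_name("ClipperClip"))
```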
Answer to issue #609.