Fix Dense init bug where initialisation of kernel weights with a numpy array raises a TypeError #3592

Open
wants to merge 4 commits into master
Conversation

@delzac (Contributor) commented on Mar 10, 2019

Allow the kernel weights of the Dense layer to be initialised directly with a numpy array, in line with the Dense layer documentation.

Before this change, there was a bug (#2913) where initialising the kernel weights with a numpy array raised
TypeError: in method 'random_initializer_with_rank', argument 1 of type 'CNTK::ParameterInitializer const &'

This way of allowing the Dense kernel weights to be initialised with a numpy array is the same as the one already used in all the Convolution layer APIs.
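
For context, here is a minimal sketch of the usage this fix enables. The weight values and dimensions are hypothetical; it assumes the array's shape matches the (input_dim, output_dim) kernel shape expected by the Dense layer.

```python
import numpy as np
import cntk as C

# Hypothetical pre-trained kernel, e.g. exported from another framework.
# Assumed shape: (input_dim, output_dim) = (784, 128).
pretrained_kernel = np.random.rand(784, 128).astype(np.float32)

x = C.input_variable(784)

# With this fix, `init` may be a numpy array instead of a CNTK initializer
# such as C.glorot_uniform(), mirroring the behaviour of the Convolution layers.
dense = C.layers.Dense(128, init=pretrained_kernel)
y = dense(x)
```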

@delzac (Contributor, Author) commented on Apr 2, 2019

@liqunfu @KeDengMS @BowenBao I would like to request a review. It's a trivial change, but it will allow weights from models pre-trained in other frameworks to be instantiated in CNTK.
