Enhance MLP to support multi-hidden-layers in an easy way #29

Open
gutouyu wants to merge 2 commits into master
Conversation

@gutouyu commented Sep 21, 2017

Multi-hidden-layers MLP

The original MLP has only one hidden layer. I modified it to support multiple hidden layers in an easy way: you just pass the hidden layer sizes to the MLP as a Python list. This makes it very convenient to experiment with multi-layer networks.

At the end of the file, I built an MLP with three hidden layers, and the test passed:

[[  5.17868684e-05   9.99948213e-01]
 [  9.99878554e-01   1.21445770e-04]
 [  9.99889287e-01   1.10712980e-04]
 [  1.34578242e-04   9.99865422e-01]]
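
For illustration, here is a minimal NumPy sketch of the list-based idea; the names (`MLP`, `hidden_layer_sizes`, `n_in`, `n_out`) and the toy data are assumptions for this example, not necessarily the exact API in the PR:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

class MLP:
    """MLP whose hidden architecture is given as a list of layer sizes."""
    def __init__(self, n_in, hidden_layer_sizes, n_out, rng=None):
        if rng is None:
            rng = np.random.RandomState(1234)
        sizes = [n_in] + list(hidden_layer_sizes) + [n_out]
        # One (W, b) pair per layer transition, e.g. [2, 3, 3, 3, 2] -> 4 pairs.
        self.W = [rng.uniform(-1, 1, (a, b)) / np.sqrt(a)
                  for a, b in zip(sizes[:-1], sizes[1:])]
        self.b = [np.zeros(b) for b in sizes[1:]]

    def forward(self, x):
        # tanh hidden layers, softmax output; returns all activations for backprop.
        acts = [x]
        for W, b in zip(self.W[:-1], self.b[:-1]):
            acts.append(np.tanh(acts[-1].dot(W) + b))
        acts.append(softmax(acts[-1].dot(self.W[-1]) + self.b[-1]))
        return acts

    def train(self, x, y, lr=0.1, epochs=10000):
        for _ in range(epochs):
            acts = self.forward(x)
            delta = acts[-1] - y  # gradient of cross-entropy w.r.t. softmax input
            for i in reversed(range(len(self.W))):
                grad_W = acts[i].T.dot(delta)
                grad_b = delta.sum(axis=0)
                if i > 0:
                    # Backpropagate through the tanh of the previous layer.
                    delta = delta.dot(self.W[i].T) * (1.0 - acts[i] ** 2)
                self.W[i] -= lr * grad_W
                self.b[i] -= lr * grad_b

    def predict(self, x):
        return self.forward(x)[-1]

if __name__ == "__main__":
    # Toy XOR-style data (an assumption for this sketch, not the repo's exact test).
    x = np.array([[0, 1], [1, 0], [1, 1], [0, 0]], dtype=float)
    y = np.array([[0, 1], [0, 1], [1, 0], [1, 0]], dtype=float)
    mlp = MLP(n_in=2, hidden_layer_sizes=[3, 3, 3], n_out=2)  # three hidden layers
    mlp.train(x, y)
    print(mlp.predict(x))
```

The key point is that the list `[3, 3, 3]` fully describes the hidden architecture, so adding or removing layers only means editing that list.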

Also, I don't know whether the owner still watches this project, but I am very interested in making the project better by implementing some other features, such as CNN, RNN, and optimizers like Momentum/SGD-momentum/Adam/RMSProp.

Please let me know if you are happy for me to do this. I would be very glad to hear from you. Thank you. @yusugomori

@gutouyu (Author) commented Sep 23, 2017

Is anyone still maintaining this project? Sad to see no response :(

@rwolst commented Sep 23, 2017

I'm not sure if anyone is maintaining it, but I suggest forking the project if not.

@gutouyu (Author) commented Sep 23, 2017

@rwolst Thanks, I have already forked the project and made this PR. I just want to contribute to the project as well as learn deep learning by implementing it. I'd be very glad if the code were accepted.
