
Add sweep displays #106

Open · rcurtin wants to merge 11 commits into master

Conversation

rcurtin (Member) commented Aug 18, 2017

This is not the prettiest JS ever written, but it adds two views: one plots runtimes over a parameter sweep, and the other plots metrics vs. runtime over a sweep. Here is an image of each view:

[Screenshots: benchmark_1, benchmark_2]
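
To give an idea of how the plots are built (this is only a simplified sketch, not the actual code in the PR; the data shape and values are made up, and it assumes D3 v4), each view is essentially a D3 line chart of the swept parameter against the measured quantity:

```js
// Hypothetical data: one point per sweep step for a single library.
var data = [
  { param: 1, runtime: 0.42 },
  { param: 2, runtime: 0.67 },
  { param: 4, runtime: 1.31 }
];

var width = 600, height = 300, margin = 40;

// Scales mapping the swept parameter and the runtime onto the SVG.
var x = d3.scaleLinear()
    .domain(d3.extent(data, function(d) { return d.param; }))
    .range([margin, width - margin]);
var y = d3.scaleLinear()
    .domain([0, d3.max(data, function(d) { return d.runtime; })])
    .range([height - margin, margin]);

var line = d3.line()
    .x(function(d) { return x(d.param); })
    .y(function(d) { return y(d.runtime); });

var svg = d3.select("body").append("svg")
    .attr("width", width)
    .attr("height", height);

// Axes.
svg.append("g")
    .attr("transform", "translate(0," + (height - margin) + ")")
    .call(d3.axisBottom(x));
svg.append("g")
    .attr("transform", "translate(" + margin + ",0)")
    .call(d3.axisLeft(y));

// Runtime-vs-parameter curve.
svg.append("path")
    .datum(data)
    .attr("fill", "none")
    .attr("stroke", "steelblue")
    .attr("d", line);
```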

Iron-Stark (Contributor) commented

@rcurtin

Sorry for the late response on this one. I had no prior HTML/CSS/JS experience, so it is taking me time to learn them and understand the code. I am currently watching a series of YouTube lectures on D3.js and learning how to build webpages using JavaScript. It should take me around 1-2 more days. I know that this is after the GSoC period ends, but I will continue to contribute and complete the webpage. In the meantime, I have worked on benchmarking the Shark C++ ML library and will open a PR for it by tomorrow.

rcurtin (Member, Author) commented Aug 28, 2017

Sure, no worries. If you have any comments on this view, that would be great; if not, that's ok too. :)

rcurtin (Member, Author) commented Sep 21, 2017

Ok, here is the actual display I was aiming for:

http://orange.ratml.org:8000/

Use the 'All metric plots for parameter sweeps' view. The database there only contains results from logistic regression runs. Before I merge this, I think I will fix some of the logistic regression code and add/modify some datasets (not all of the ones there give interesting results for logistic regression).

Basically, the display plots a learning curve for parameter sweeps over all libraries for a given method. This means that if we benchmark different optimizers, they will always show up on the same plot.
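
The grouping behind that works roughly like this (just a sketch with made-up field names, not the actual code here; assumes D3 v4, whose `d3.nest()` comes from d3-collection):

```js
// Hypothetical flat sweep results: one row per (library, parameter value).
var results = [
  { library: "mlpack", param: 1, metric: 0.91 },
  { library: "mlpack", param: 2, metric: 0.93 },
  { library: "scikit", param: 1, metric: 0.88 },
  { library: "scikit", param: 2, metric: 0.90 }
];

// Group by library so that every library (and any alternate optimizer
// benchmarked as its own entry) gets its own series on the same plot.
var series = d3.nest()
    .key(function(d) { return d.library; })
    .entries(results);

// 'series' is now:
//   [ { key: "mlpack", values: [...] }, { key: "scikit", values: [...] } ]
// and the plotting code appends one <path> per entry, using the same scales
// and line generator as in the sketch further up the thread.
```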

zoq (Member) commented Sep 22, 2017

This looks great to me; I also really like the updated tooltip. Not really related to this PR, but do you think we should update the library color scheme? I think it's sometimes difficult to distinguish the different libraries.

rcurtin (Member, Author) commented Sep 22, 2017

Sure, I would agree with changing the colors. Let me look into the colormap we are using and see if I can come up with something better.
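
For example, one simple option would be an ordinal scale with a hand-picked, higher-contrast palette (just a sketch; the library list and colors below are placeholders, not a decision):

```js
// Placeholder list of library names; in the real code this would come from
// the benchmark database.
var libraries = ["mlpack", "scikit", "shogun", "weka", "mlpy"];

// One fixed, easy-to-distinguish color per library. The palette here is just
// an example (ColorBrewer's Dark2); d3.schemeCategory10 or any other
// categorical scheme could be dropped into the range instead.
var libraryColor = d3.scaleOrdinal()
    .domain(libraries)
    .range(["#1b9e77", "#d95f02", "#7570b3", "#e7298a", "#66a61e"]);

// Usage: libraryColor("mlpack") returns "#1b9e77".
```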
