Progress of Wrapping APIs of ReverseDiff: Part 2 #17

Open
1 of 3 tasks
Non-Contradiction opened this issue May 31, 2018 · 2 comments

Comments

@Non-Contradiction
Owner

Non-Contradiction commented May 31, 2018

This is a continuation of #3. The non-mutating APIs of ReverseDiff have now all been implemented, and basic documentation and tests have been written, although some basic tests are still failing.

Note that, differently from the original plan, APIs related to AbstractTape have been implemented, and supporting functions of multiple arguments turned out to be an unexpected challenge.

In sum, the result of wrapping the APIs of ReverseDiff is satisfactory and can be considered basically finished, with potential further improvements:

  • More documentation, especially examples.
  • Wrapping DiffResults related mutating APIs, which is also related to ForwardDiff.
  • Writing more tests.
@nashjc
Collaborator

nashjc commented Jun 6, 2018

Is there an explanation of "mutating" or "non-mutating" somewhere? This may be important for maintenance of the code later.

@Non-Contradiction
Owner Author

An easy but not very rigorous explanation would be:

x <- something
f(x)

f is non-mutating if x is still the same thing after executing f.

Since most R functions are non-mutating (apart from the <- ones), autodiffr assumes the target function func in grad(func, x, ...) and hessian(func, x, ...) to be non-mutating, so the APIs of ForwardDiff and ReverseDiff that take a mutating target function func are NOT wrapped in autodiffr (in Julia, functions with names ending in ! are expected to mutate their arguments).
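The distinction can be illustrated with a small sketch. This is not R or Julia but a Python analogy (the function names here are made up for illustration): a non-mutating function returns a new object and leaves its input untouched, while a mutating one modifies the input in place, as Julia's ! convention signals.

```python
def append_nonmutating(xs, v):
    """Return a new list; the input list is left unchanged."""
    return xs + [v]

def append_mutating(xs, v):
    """Modify the input list in place (the Julia `!` convention)."""
    xs.append(v)
    return xs

a = [1, 2, 3]
b = append_nonmutating(a, 4)
# a is still [1, 2, 3]; only b holds [1, 2, 3, 4]

c = [1, 2, 3]
append_mutating(c, 4)
# c itself has become [1, 2, 3, 4]
```

A differentiation API that assumed func was non-mutating would silently misbehave if handed something like append_mutating, which is why autodiffr restricts itself to the non-mutating target-function APIs.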

ForwardDiff and ReverseDiff also provide another set of mutating APIs, which look like:

result = DiffResults.GradientResult(input)  # the object that stores the differentiation results
result = ForwardDiff.gradient!(result, func, input)
# things like the gradient and the original function value are then stored in the result object

I'm wondering whether to wrap this set of APIs and treat DiffResults as a JuliaObject rather than a normal R object. It has the advantage of calculating the function value, the gradient value, etc. all in the same run, so it could improve performance if used appropriately. I'm also wondering how to incorporate this into packages like optim and optimr.
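The appeal of the DiffResults pattern is reusing one preallocated result object to collect the value and the gradient in a single call. A rough Python analogy (value_and_grad and the dict layout here are hypothetical, just to show the shape of the pattern):

```python
def value_and_grad(f, grad_f, x, out):
    """Fill a preallocated container with both the function value and
    the gradient in one call, mirroring the DiffResults pattern of
    reusing a single result object across evaluations."""
    out["value"] = f(x)
    out["gradient"] = grad_f(x)
    return out

# Example: f(x, y) = x^2 + y^2 with its analytic gradient.
f = lambda x: x[0] ** 2 + x[1] ** 2
grad_f = lambda x: [2 * x[0], 2 * x[1]]

result = {}  # reused across calls, like a DiffResults object
value_and_grad(f, grad_f, [3.0, 4.0], result)
```

For an optimizer that needs both the objective value and the gradient at every iterate, getting them from one shared evaluation rather than two separate calls is exactly where the performance gain would come from.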
