
FMIFlux.jl

What is FMIFlux.jl?

FMIFlux.jl is a free-to-use software library for the Julia programming language that lets you place your FMU (fmi-standard.org) anywhere inside your ML topologies while keeping the resulting models trainable with a standard (or custom) FluxML training process. This includes, for example:

  • NeuralODEs including FMUs, so-called Neural Functional Mock-up Units (NeuralFMUs): you can place FMUs inside your ML topology (see the sketch below).
  • PINNs including FMUs, so-called Functional Mock-up Unit informed Neural Networks (FMUINNs): you can evaluate FMUs inside your loss function.
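
The following is a minimal sketch of such a NeuralFMU, loosely following the simple_hybrid_ME example from the documentation. The FMU name (taken from FMIZoo.jl), the initial state and the layer sizes are illustrative, and function names may differ slightly between package versions:

using FMI, FMIFlux, FMIZoo, Flux
using DifferentialEquations: Tsit5

tStart, tStop = 0.0, 5.0
tSave = collect(tStart:0.01:tStop)

# load a model-exchange FMU shipped with FMIZoo.jl (illustrative model choice)
fmu = fmiLoad("SpringFrictionPendulum1D", "Dymola", "2022x"; type=:ME)
numStates = length(fmu.modelDescription.stateValueReferences)

# the FMU is placed as a regular layer inside a Flux chain: it computes the
# state derivative, and the ANN learns a correction on top of it
net = Chain(x -> fmu(x=x, dx_refs=:all),
            Dense(numStates, 16, tanh),
            Dense(16, numStates))

# wrap FMU and ANN into a trainable ME-NeuralFMU (a NeuralODE containing the FMU)
neuralFMU = ME_NeuralFMU(fmu, net, (tStart, tStop), Tsit5(); saveat=tSave)

x0 = [0.5, 0.0]           # illustrative initial state
solution = neuralFMU(x0)  # simulate the hybrid model like an ODE solve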


How can I use FMIFlux.jl?

1. Open a Julia REPL, switch to package mode using ] and activate your preferred environment.

2. Install FMIFlux.jl:

(@v1) pkg> add FMIFlux

3. If you want to check that everything works correctly, you can run the tests bundled with FMIFlux.jl:

(@v1) pkg> test FMIFlux

4. Have a look inside the examples folder on the examples branch or at the examples section of the documentation. All examples are available as Julia scripts (.jl), Jupyter notebooks (.ipynb) and Markdown (.md).

What is currently supported in FMIFlux.jl?

  • building and training ME-NeuralFMUs (NeuralODEs) with support for event-handling (DiffEqCallbacks.jl) and discontinuous sensitivity analysis (SciMLSensitivity.jl)
  • building and training CS-NeuralFMUs
  • building and training NeuralFMUs consisting of multiple FMUs
  • building and training FMUINNs (PINNs)
  • different AD-frameworks: ForwardDiff.jl (CI-tested), ReverseDiff.jl (CI-tested, default setting), FiniteDiff.jl (not CI-tested) and Zygote.jl (not CI-tested)
  • using Flux.jl optimizers as well as the ones from Optim.jl (see the training sketch after this list)
  • using the entire DifferentialEquations.jl solver suite (autodiff=false for implicit solvers; not all are tested, see the following section)
  • ...
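
As a rough sketch of such a training run (building on the NeuralFMU snippet above; the reference data posData is made up for illustration, and the exact train!/loss signatures have changed between FMIFlux.jl versions):

# illustrative reference trajectory the NeuralFMU should reproduce
posData = sin.(tSave)

# loss over the NeuralFMU solution; recent examples pass the parameters p explicitly
function lossSum(p)
    solution = neuralFMU(x0; p=p)
    posNet = collect(u[1] for u in solution.states.u)  # first state over time
    return FMIFlux.Losses.mse(posData, posNet)
end

optim = Adam(1e-3)  # any Flux.jl (or Optim.jl) optimizer

# 100 training steps; gradients are computed with ReverseDiff.jl by default
FMIFlux.train!(lossSum, neuralFMU, Iterators.repeated((), 100), optim)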

(Current) Limitations

  • Not all implicit solvers work for challenging hybrid models (stiff FMUs with events); currently tested is Rosenbrock23(autodiff=false).

  • Implicit solvers using autodiff=true are not supported (for now), but you can use implicit solvers with autodiff=false (see the sketch after this list).

  • Sensitivity information over state changes through events, $\partial x^{+} / \partial x^{-}$, cannot be accessed via FMI. These sensitivities are sampled if the FMU supports fmiXGet/SetState; if this feature is not available, wrong sensitivities are computed, which may influence your optimization (depending on the use case). This issue is also part of the OpenScaling research project.

  • Applying continuous adjoints instead of automatic differentiation through the ODE solver (discrete adjoints) might lead to issues, because FMUs are, by design, not required to support simulation backward in time; in practice, however, many FMUs are capable of doing so. This issue is also part of the OpenScaling research project.

  • For now, only FMI version 2.0 is supported, but FMI 3.0 support is coming with the OpenScaling research project.
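
For the implicit-solver case from the list above, the tested configuration looks as follows (a sketch reusing the FMU, net and time grid from the earlier snippet):

using DifferentialEquations: Rosenbrock23

# stiff hybrid model: use the tested implicit solver with autodiff disabled
solver = Rosenbrock23(autodiff=false)
neuralFMU = ME_NeuralFMU(fmu, net, (tStart, tStop), solver; saveat=tSave)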

What is under development in FMIFlux.jl?

  • performance optimizations
  • multi threaded CPU training
  • improved documentation
  • more examples
  • FMI3 integration
  • ...

What Platforms are supported?

FMIFlux.jl is continuously tested under Julia versions v1.6 (LTS) and v1 (latest) on Windows (latest) and Ubuntu (latest). macOS should work, but is untested. All shipped examples are automatically tested under Julia version v1 (latest) on Windows (latest).

What FMI.jl-Library should I use?

To keep dependencies nice and clean, the original package FMI.jl has been split into multiple packages:

  • FMI.jl: High level loading, manipulating, saving or building entire FMUs from scratch
  • FMIImport.jl: Importing FMUs into Julia
  • FMIExport.jl: Exporting stand-alone FMUs from Julia Code
  • FMIBase.jl: Common concepts for import and export of FMUs
  • FMICore.jl: C-code wrapper for the FMI-standard
  • FMISensitivity.jl: Static and dynamic sensitivities over FMUs
  • FMIBuild.jl: Compiler/Compilation dependencies for FMIExport.jl
  • FMIFlux.jl: Machine Learning with FMUs
  • FMIZoo.jl: A collection of testing and example FMUs
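
For the machine-learning workflow described here, a typical set of imports looks like this (FMIZoo.jl is only needed if you want the bundled example FMUs):

# FMIFlux.jl for the ML layer, FMI.jl for loading/simulating FMUs,
# FMIZoo.jl for ready-made test FMUs
using FMI, FMIFlux, FMIZoo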

Video-Workshops

JuliaCon 2024 (Eindhoven University of Technology, Netherlands)

YouTube Video of Workshop

JuliaCon 2023 (Massachusetts Institute of Technology, United States)

YouTube Video of Workshop

How to cite?

Tobias Thummerer, Johannes Stoljar and Lars Mikelsons. 2022. NeuralFMU: presenting a workflow for integrating hybrid NeuralODEs into real-world applications. Electronics 11, 19, 3202. DOI: 10.3390/electronics11193202

Tobias Thummerer, Lars Mikelsons and Josef Kircher. 2021. NeuralFMU: towards structural integration of FMUs into neural networks. Martin Sjölund, Lena Buffoni, Adrian Pop and Lennart Ochel (Ed.). Proceedings of 14th Modelica Conference 2021, Linköping, Sweden, September 20-24, 2021. Linköping University Electronic Press, Linköping (Linköping Electronic Conference Proceedings ; 181), 297-306. DOI: 10.3384/ecp21181297

Related publications?

Tobias Thummerer, Johannes Tintenherr and Lars Mikelsons. 2021. Hybrid modeling of the human cardiovascular system using NeuralFMUs. Journal of Physics: Conference Series 2090, 1, 012155. DOI: 10.1088/1742-6596/2090/1/012155
