Currently, I am trying to use `FMUParameterRegistrator` for parameter calibration/estimation, but I have encountered some issues.
I ran experiments on both ME-type and CS-type FMUs:
For the ME-type FMU
Model: SpringPendulum1D.mo (from FMIZoo.jl)
Exporting tool: Dymola (I directly use the FMU from FMIZoo.jl)
Result: good, the parameter is tuned correctly after training
Model: SpringPendulum1D.mo (from FMIZoo.jl)
Exporting tool: OpenModelica v1.23.0 (64-bit)
Result: the loss does not change during training, so the parameter is not tuned correctly. Below is the loss function I use and part of the info printed during training:
```julia
function lossSum(p)
    global neuralFMU, x₀, posData, params, w, counter

    w = p[1].value

    # Map the trainable parameter into the FMU parameter dictionary;
    # the parameter ordering differs between the two exporters.
    if FMU_SOURCE == "OpenModelica"
        params = Dict(zip(initStates, [w, 0.0, 1.0, 0.5, 0.0, 10.0, 1.0])) # for OM
    elseif FMU_SOURCE == "Dymola"
        params = Dict(zip(initStates, [0.5, 0.0, w, 10.0, 1.0, 1.0, 0.0])) # for Dymola
    end

    solution = neuralFMU(x₀; parameters=params, p=p, showProgress=true, saveat=tSave)
    # solution = neuralFMU(x₀; p=p, showProgress=true, saveat=tSave)

    if !solution.success
        return Inf
    end

    posNet = fmi2GetSolutionState(solution, 1; isIndex=true)
    velNet = fmi2GetSolutionState(solution, 2; isIndex=true)

    loss_value = FMIFlux.Losses.mse(posData, posNet)

    if counter % 100 == 0 || counter == 1
        @info "LossSum[$counter] - Loss: $(round(loss_value.value, digits=5)), p: $(p[1].value)"
    end

    return loss_value
end
```
I found that this issue comes from a wrong return value of `neuralFMU(x₀; parameters=params, p=p, showProgress=true, saveat=tSave)` inside `lossSum`: `posNet` is an array with the same value at every time step, and `velNet` is an array of linearly increasing values. For example, `posNet = [0.5, 0.5, 0.5, ..., 0.5]` and `velNet = [0.0, 0.1, 0.2, ..., 40.0]`, whereas both should oscillate like a sine wave.
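For reference, an undamped spring pendulum should produce sinusoidal trajectories. A minimal, self-contained sketch (with assumed toy parameters, not the FMU's actual values) of what `posNet` and `velNet` should roughly look like:

```julia
# Closed-form solution of an undamped spring-mass oscillator:
#   s(t) = s_eq + A*cos(ω t),  v(t) = -A*ω*sin(ω t)
# All parameter values below are assumptions for illustration only.
m, c = 1.0, 10.0              # mass [kg], spring constant [N/m]
ω = sqrt(c / m)               # natural frequency [rad/s]
s_eq, A = 0.5, 0.25           # equilibrium position and amplitude

tSave = 0.0:0.01:5.0
pos = [s_eq + A * cos(ω * t) for t in tSave]
vel = [-A * ω * sin(ω * t) for t in tSave]

# A healthy trajectory oscillates: neither constant nor a linear ramp.
println(extrema(pos))   # position swings around the equilibrium
```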
By the way, this only happens inside `FMIFlux.train!`. The output is normal if I run `neuralFMU(x₀; parameters=params, p=p, showProgress=true, saveat=tSave)` on its own.
I am not sure whether `canGetAndSetFMUstate="false"` could be the reason for the weird FMU solution. OpenModelica does not seem to support this capability for ME-type FMUs, even when I follow the OpenModelica guideline to enable it.
```
#################### Begin information for FMU ####################
Model name: SpringPendulum1D
FMI-Version: 2.0
GUID: {07765a18-d4d2-45bf-9416-dceca504316f}
Generation tool: OpenModelica Compiler OpenModelica v1.23.0 (64-bit)
Generation time: 2024-08-02T16:27:51Z
Var. naming conv.: structured
Event indicators: 0
Inputs: 0
Outputs: 0
States: 2
0 ["mass.stateSelect", "mass.s"]
1 ["mass.v"]
Supports Co-Simulation: true
Model identifier: SpringPendulum1D
Get/Set State: true
Serialize State: true
Dir. Derivatives: true
Var. com. steps: true
Input interpol.: true
Max order out. der.: 1
Supports Model-Exchange: true
Model identifier: SpringPendulum1D
Get/Set State: false
Serialize State: false
Dir. Derivatives: true
##################### End information for FMU #####################
```
For the CS-type FMU
Model: SpringPendulumExtForce1D.mo (from FMIZoo.jl)
Exporting tool: Dymola (I directly use the FMU from FMIZoo.jl)
Result: the loss keeps increasing, so the training moves in exactly the opposite direction.
Model: SpringPendulumExtForce1D.mo (from FMIZoo.jl)
Exporting tool: OpenModelica v1.23.0 (64-bit)
Result: good, the parameter is tuned correctly after training
Conclusion
There are two issues when using `FMUParameterRegistrator`:
1. Wrong FMU output during `FMIFlux.train!` for ME-type FMUs
2. Opposite loss-update direction during training for CS-type FMUs
If more information is needed, please let me know, thank you!
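The second symptom, the loss only ever going up, is exactly what a sign-flipped gradient produces in gradient descent. A toy illustration (all values hypothetical, unrelated to FMIFlux internals):

```julia
# Gradient descent on a toy quadratic loss. With the correct gradient
# sign the loss shrinks; with a flipped sign it grows monotonically,
# matching the "loss keeps increasing" symptom.
loss(w) = (w - 10.0)^2
grad(w) = 2 * (w - 10.0)

function descend(w, sgn; η=0.1, steps=50)
    for _ in 1:steps
        w -= η * sgn * grad(w)
    end
    return w
end

println(loss(descend(0.0, +1)))   # correct sign: loss shrinks toward 0
println(loss(descend(0.0, -1)))   # flipped sign: loss grows without bound
```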
Thanks for the issue, this is very interesting!
For `canGetAndSetFMUstate="false"`, the default fallback is to use sampling and finite differences.
We will check that!
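For the record, the sampling/finite-difference fallback mentioned above can be pictured with a toy sketch. The `simulate_loss` function below is a hypothetical stand-in for a full FMU re-simulation, not FMIFlux's actual implementation:

```julia
# Without fmi2Get/SetFMUState, every gradient sample requires a full
# re-simulation from t = 0; `simulate_loss` stands in for that here.
simulate_loss(w) = (w - 10.0)^2      # toy loss with its minimum at w = 10

# Central finite difference of a scalar loss w.r.t. one parameter.
function fd_gradient(loss, w; h=1e-6)
    (loss(w + h) - loss(w - h)) / (2h)
end

g = fd_gradient(simulate_loss, 8.0)  # analytic gradient is 2*(8 - 10) = -4
println(g)
```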
Hi @ThummeTo, I suspect the first issue is not limited to `FMUParameterRegistrator`.
I also ran an experiment on learning an unknown effect, following the hybrid_ME example on SpringPendulum, and got a worse result: it seems to get stuck in a local optimum easily.
The difference between calling `neuralFMU(x₀; p=p, showProgress=true, saveat=tSave)` inside and outside `FMIFlux.train!()` is shown below, where "inside" means the call happens in the loss function during the training process, and "outside" means calling it outside training, e.g. `solutionBefore = neuralFMU(x₀; saveat=tSave)`.
The plot shows that the FMU output during the training process is linear, which I find weird.
NOTE: the FMU used in the results above was exported by OpenModelica v1.23.0 (64-bit), which may be the main reason, but I am not sure why.
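One way to confirm this symptom programmatically is to check whether the trajectory returned inside the loss function is affine (constant or linear in the index). A small helper; the `is_affine` name and the sample data are mine, for illustration only:

```julia
# true if `x` is affine in its index (all second differences ≈ 0);
# this flags both the constant posNet and the linear-ramp velNet
# observed during FMIFlux.train!.
is_affine(x; atol=1e-8) = all(abs.(diff(diff(x))) .< atol)

velNet_bad  = collect(0.0:0.1:40.0)                      # linear ramp, as observed
posNet_good = [0.5 + 0.25 * cos(2t) for t in 0:0.01:5]   # oscillating, as expected

println(is_affine(velNet_bad))    # suspicious if true
println(is_affine(posNet_good))
```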
Regarding the CS-type issue: if I negate the return value of the loss function, the training behaves relatively normally, but I do not think that is a proper fix.