
Session requires model path? #15464

Answered by wschin
lilhoser asked this question in General
Apr 11, 2023 · 5 comments · 1 reply
Hmm, it's a protobuf limitation. There are several potential solutions/workarounds:

  • Use PrePackedWeightsContainer.
  • Move every entry in onnx_model.graph.initializer to onnx_model.graph.input and feed those initializers as regular inputs when running the InferenceSession.
  • Implement a new API that takes the model bytes plus a folder path containing the external data (which seems equivalent to just loading the model from its path directly).

I'm not sure which one fits your scenario best.

Answer selected by lilhoser