Xenova, I need your help.

#5 opened by shadowT

I'm trying to run mistralai/Ministral-3-3B-Instruct-2512-ONNX in Electron, but I'm getting the following error:

An error occurred during model execution: "Error: invalid data location: undefined for input "input_ids"".

Inputs given to the model:

    input_ids: { type: 'int64', dims: Array(2), location: undefined, data: BigInt64Array(827) }

I followed the demo here: https://huggingface.co/spaces/mistralai/Ministral_3B_WebGPU
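For reference, my setup follows the standard transformers.js text-generation pattern (a simplified, text-only sketch rather than the demo's actual source; the model id and options are from my project and may differ):

```js
import { pipeline } from '@huggingface/transformers';

// Simplified sketch of the loading/generation pattern; the real app may differ.
const generator = await pipeline(
  'text-generation',
  'mistralai/Ministral-3-3B-Instruct-2512-ONNX',
  { device: 'webgpu' },
);

const messages = [{ role: 'user', content: 'Hello!' }];
const output = await generator(messages, { max_new_tokens: 128 });
console.log(output);
```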

This is most likely an issue with how Electron handles parameter passing. https://github.com/microsoft/onnxruntime/issues/20431#issuecomment-3392906560 seems like a related/relevant issue.
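If the tensors cross an Electron IPC boundary, one plausible mechanism is structured cloning: it copies own data properties but not the class prototype, so getters such as `location` are lost. A minimal sketch (not the real onnxruntime-web class) illustrating the effect:

```js
// Sketch only: a stand-in for a Tensor-like class, not the real onnxruntime-web Tensor.
class FakeTensor {
  constructor(data) {
    this.cpuData = data;        // own data property: survives structured cloning
    this.dataLocation = 'cpu';  // own data property: survives structured cloning
  }
  get location() {              // defined on the prototype: lost after cloning
    return this.dataLocation;
  }
}

const t = new FakeTensor(new BigInt64Array(4));
console.log(t.location);        // "cpu"

// structuredClone uses the same algorithm Electron's IPC applies to arguments.
const cloned = structuredClone(t);
console.log(cloned.location);      // undefined -> matches "invalid data location: undefined"
console.log(cloned.dataLocation);  // "cpu" (the plain data is still there)
```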

Thank you for your reply. I have checked the related comment at the link you shared: https://github.com/microsoft/onnxruntime/issues/20431#issuecomment-3392906560.

I tried creating a Tensor using onnxruntime-web, and in that case the returned object has a valid dataLocation set to "cpu", so everything works correctly:

    { pixel_values: ..., input_ids: ..., attention_mask: ... }
      attention_mask: { cpuData: BigInt64Array(...), dataLocation: "cpu", ... }
      input_ids:      { cpuData: BigInt64Array(...), dataLocation: "cpu", ... }
      pixel_values:   { cpuData: Float32Array(...), dataLocation: "cpu", ... }
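For reference, constructing such a tensor by hand with onnxruntime-web looks roughly like this (sketch; `tokenIds` is a placeholder for the ids my tokenizer actually produces):

```js
import { Tensor } from 'onnxruntime-web';

// Sketch: building input_ids by hand. `tokenIds` is a placeholder, not my real data.
const tokenIds = [1n, 2n, 3n, 4n];
const inputIds = new Tensor('int64', BigInt64Array.from(tokenIds), [1, tokenIds.length]);

console.log(inputIds.location); // "cpu"
```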

However, when I use a Tensor created from @huggingface/transformers, the dataLocation becomes undefined, which leads to the error.

I also tested passing a Tensor created with onnxruntime-web directly to the model, but it seems incompatible. I get the following error:

TypeError: input_ids.tolist is not a function

It appears that the onnxruntime-web Tensor and the @huggingface/transformers Tensor expose different interfaces; for example, the Transformers Tensor has a tolist() method that the ONNX Runtime Tensor lacks.
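As an illustration of what bridging the two might look like, I imagine something along these lines, assuming the (type, data, dims) constructor exported by @huggingface/transformers (untested sketch):

```js
import { Tensor as TransformersTensor } from '@huggingface/transformers';

// Untested sketch: rebuild a Transformers Tensor from the raw fields of an
// onnxruntime-web Tensor (type, data, dims), so the model sees the type it expects.
function toTransformersTensor(ortTensor) {
  return new TransformersTensor(ortTensor.type, ortTensor.data, ortTensor.dims);
}
```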

Do you have any suggestions on how to bridge this, or any recommended workaround?

Thanks again!
