How do I export a model that Android can actually use?

I trained and exported the model following this guide, and verified on a PC that the exported model works correctly:
https://tensorflow-object-detect ... ing-a-trained-model

Then I converted the model to tflite like this:
import tensorflow as tf

converter = tf.lite.TFLiteConverter.from_saved_model(saved_model_dir)
converter.target_spec.supported_ops = [
    tf.lite.OpsSet.TFLITE_BUILTINS,  # enable TensorFlow Lite ops.
    tf.lite.OpsSet.SELECT_TF_OPS     # enable TensorFlow ops.
]
tflite_model = converter.convert()
open("converted_model.tflite", "wb").write(tflite_model)
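Before handing the file to any Android runtime, it can help to sanity-check the output tensor shapes with tf.lite.Interpreter (some runtimes are picky about output dimensions). A minimal sketch; inspect_outputs is just an illustrative helper name, and the import is done lazily so the function can be defined without TensorFlow loaded:

```python
def inspect_outputs(tflite_path):
    """Print the index, name and shape of every output tensor in a .tflite model.

    Hypothetical helper: useful for checking whether the model's outputs
    match what a given runtime (e.g. ML Kit) expects.
    """
    import tensorflow as tf  # lazy import so the file parses without TF

    interpreter = tf.lite.Interpreter(model_path=tflite_path)
    interpreter.allocate_tensors()
    for detail in interpreter.get_output_details():
        print(detail["index"], detail["name"], detail["shape"])

# Example: inspect_outputs("converted_model.tflite")
```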
Then I dropped it into the Custom Object Detection part of this demo:
https://github.com/googlesamples ... d/vision-quickstart

It fails with this error:

Object detection failed!
    h3.a: Failed to initialize detector. Unexpected number of dimensions for output index 0: got 3D, expected either 2D (BxN with B=1) or 4D (BxHxWxN with B=1, W=1, H=1).
        at q5.f.d(:66)
        at y4.d.b(:3)
        at m3.b.f(:3)
        at y4.e.b(:2)
        at l3.k.f(:4)
        at l3.z.run(Unknown Source:10)
        at l3.c0.run(:2)
        at l3.j.e(:4)
        at l3.j.c(:1)
        at l3.v.run(Unknown Source:2)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1162)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:636)
        at l3.j.d(:2)
        at l3.w.run(Unknown Source:2)
        at java.lang.Thread.run(Thread.java:764)


People online say ML Kit doesn't support custom object-detection models, but the page above literally says "Custom Object Detection".
Never mind. If ML Kit won't work, surely plain TensorFlow Lite will?

So next I tried putting the tflite file into
https://github.com/tensorflow/ex ... t_detection/android

But this time it reports:

java.lang.AssertionError: Error occurred when initializing ObjectDetector: Could not build model from the provided pre-loaded flatbuffer: Unsupported builtin op: STRIDED_SLICE, version: 6


My guess is that the tflite file uses some TensorFlow ops that TFLite itself doesn't have,
so I tried deleting the line tf.lite.OpsSet.SELECT_TF_OPS,
but then the conversion itself fails.
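Worth noting: STRIDED_SLICE is actually a TFLite builtin op, so the error above reads more like a version mismatch (the demo app bundles a TFLite runtime too old for op version 6) than a missing Select TF op. To see exactly which ops a converted model contains, newer TF releases (2.7+) ship tf.lite.experimental.Analyzer; a sketch, with the wrapper name being mine:

```python
def list_model_ops(tflite_path):
    """Dump the ops used by a .tflite file to stdout.

    Relies on tf.lite.experimental.Analyzer, available in TF 2.7+.
    """
    import tensorflow as tf  # lazy import so the file parses without TF

    tf.lite.experimental.Analyzer.analyze(model_path=tflite_path)

# Example: list_model_ops("converted_model.tflite")
```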

People online say adding this line fixes it:
converter.experimental_new_converter = True

But after adding it, the converted file's SHA-1 is byte-for-byte identical (presumably because the new converter is already the default in recent TF 2.x releases).


So: what is the right way to export a model that Android can actually use?
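For reference, the TF Object Detection API documents a TFLite-specific export path for SSD models: first export a TFLite-friendly SavedModel with export_tflite_graph_tf2.py, then convert that SavedModel (no SELECT_TF_OPS needed, since the exported graph ends in TFLite's detection post-processing op). A sketch under those assumptions; the paths and helper name are illustrative, not from my setup:

```python
# Step 1 (run in a shell, assuming the TF Object Detection API is installed
# and the model is an SSD; other architectures are not supported):
#
#   python object_detection/export_tflite_graph_tf2.py \
#       --pipeline_config_path pipeline.config \
#       --trained_checkpoint_dir checkpoint/ \
#       --output_directory tflite_export/
#
# Step 2: convert the resulting SavedModel.

def convert_exported_model(saved_model_dir, out_path):
    """Convert the TFLite-friendly SavedModel produced in step 1."""
    import tensorflow as tf  # lazy import so the file parses without TF

    converter = tf.lite.TFLiteConverter.from_saved_model(saved_model_dir)
    tflite_model = converter.convert()
    with open(out_path, "wb") as f:
        f.write(tflite_model)

# Example: convert_exported_model("tflite_export/saved_model", "detect.tflite")
```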