Vision AI Model Export Failure

I followed the tutorial for creating a custom AI model to the letter on this Google Colab doc: Google Colab

(my doc is here: Google Colab)

The only thing I changed was the code block dedicated to downloading a custom dataset from Roboflow.

Running the example succeeds until code block 7, where I get the following error and no .tflite file is produced: “TensorFlow Lite: export failure: EndVector() missing 1 required positional argument: ‘vectorNumElems’”

I understand this is because FlatBuffers was updated in a way that breaks compatibility. However, reinstalling a later version of FlatBuffers does not take effect, and the error keeps appearing.
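
As a side note for anyone debugging the same thing: a plain pip command (nothing specific to this tutorial) shows which FlatBuffers version the Colab runtime actually has installed, which makes it easy to tell whether an upgrade took effect:

# show the FlatBuffers version the runtime actually has installed;
# if it still reports 1.12 after upgrading, restart the Colab runtime
# (Runtime -> Restart runtime) and check again before re-running the export
!pip show flatbuffers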

I managed to make it work again, but I haven’t been able to get my Grove AI module to accept that model or any other - I only get “Invoke Failed”.
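
For anyone else hitting “Invoke Failed”: before blaming the module, it’s worth checking that the exported .tflite file loads and invokes at all on the Colab side. This is just the standard tf.lite.Interpreter API, not part of the tutorial, and the model path is the one used in the conversion snippet below:

# minimal sanity check: load the exported model and run one dummy inference
import numpy as np
import tensorflow as tf

model_path = "/content/yolov5-swift/runs/train/yolov5n6_results/weights/best-int8.tflite"
interpreter = tf.lite.Interpreter(model_path=model_path)
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()
print("input :", input_details[0]["shape"], input_details[0]["dtype"])

# feed zeros of the expected shape/dtype; if invoke() fails here, the problem
# is in the exported model rather than on the Grove side
dummy = np.zeros(input_details[0]["shape"], dtype=input_details[0]["dtype"])
interpreter.set_tensor(input_details[0]["index"], dummy)
interpreter.invoke()
print("output:", interpreter.get_tensor(output_details[0]["index"]).shape)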

Here’s the code snippet that must replace step 7:

# convert TF to tflite, may have to run this manually
# upgrade FlatBuffers first so the converter does not hit the EndVector() bug
!pip install -U flatbuffers

import tensorflow as tf
import flatbuffers  # not used directly; just confirms the upgraded package imports cleanly

export_dir = "/content/yolov5-swift/runs/train/yolov5n6_results/weights/best_saved_model"
converter = tf.lite.TFLiteConverter.from_saved_model(export_dir)
tflite_model = converter.convert()

outpath = "/content/yolov5-swift/runs/train/yolov5n6_results/weights/best-int8.tflite"
with open(outpath, 'wb') as f:
    f.write(tflite_model)
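
One caveat about the snippet above: with no extra converter settings, converter.convert() produces a float32 model even though the file is named best-int8.tflite. If the Grove module expects a fully quantized model, the standard TensorFlow Lite options for int8 look roughly like this; this is generic TFLite converter API rather than anything from the tutorial, and the input shape and random calibration tensors are placeholders you’d replace with real training images at your model’s input size:

import numpy as np
import tensorflow as tf

export_dir = "/content/yolov5-swift/runs/train/yolov5n6_results/weights/best_saved_model"
converter = tf.lite.TFLiteConverter.from_saved_model(export_dir)

# calibration data for int8 quantization; replace the random tensors with a few
# real training images resized to the model's input size (the shape below is a placeholder)
def representative_dataset():
    for _ in range(100):
        yield [np.random.rand(1, 192, 192, 3).astype(np.float32)]

converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_dataset
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.int8
converter.inference_output_type = tf.int8

tflite_model = converter.convert()
outpath = "/content/yolov5-swift/runs/train/yolov5n6_results/weights/best-int8.tflite"
with open(outpath, 'wb') as f:
    f.write(tflite_model)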

Another solution is to modify export.py from yolov5-swift. On line 496 there is a requirements check that reinstalls the buggy FlatBuffers v1.12; we need v21 of FlatBuffers instead, so change that conditional to:


    if int8 or edgetpu:  # TFLite --int8 bug https://github.com/ultralytics/yolov5/issues/5707
        pass
        # check_requirements(('flatbuffers==1.12',))  # required before `import tensorflow`
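
After commenting that check out, re-run the export step. Assuming yolov5-swift keeps the same command-line flags as upstream yolov5's export.py (I haven't verified every flag in the fork), something like this should regenerate the .tflite:

# re-run the export with the patched export.py; the --include/--int8 flags are
# the upstream yolov5 interface, and the image-size flag (if needed) should
# match whatever resolution the model was trained at
!cd /content/yolov5-swift && python export.py --weights runs/train/yolov5n6_results/weights/best.pt --include tflite --int8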