My understanding is that tflite was developed to avoid installing the full TensorFlow package on embedded systems such as the Raspberry Pi, saving processing and power resources.
Following the directions from https://pypi.org/project/tflite/, the documentation links to https://ai.google.dev/edge/litert/models/convert_tf, but the examples show the usage of the full TensorFlow package.
Additionally, the TensorFlow API exposes tf.lite (https://www.tensorflow.org/api_docs/python/tf/lite), so what would be the point of installing tflite if the whole TensorFlow package needs to be installed?
What am I missing or misunderstanding? Does using TensorFlow Lite require installing the full TensorFlow package anyway? If so, what host resources would be saved?
The expected flow for your use case is to use TensorFlow to train your model on hardware with sufficient processing power, and then, once the model is trained, convert it to TFLite and install only an interpreter (tflite-runtime) on your Raspberry Pi.
TensorFlow contains the ability to convert trained TensorFlow models into .tflite files, in the format documented by the tflite package you have linked. Those files are then understood by tflite-runtime.
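To make the first half of that flow concrete, here is a minimal sketch of the conversion step, assuming the full TensorFlow package is installed on the training machine and the model is a (here, untrained throwaway) Keras model:

```python
import tensorflow as tf

# A tiny placeholder model; in practice this would be your trained model.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(8,)),
    tf.keras.layers.Dense(1),
])

# Convert the model to the TFLite flatbuffer format using tf.lite.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_bytes = converter.convert()

# Write the flatbuffer to disk; this file is what you copy to the Pi.
with open("model.tflite", "wb") as f:
    f.write(tflite_bytes)
```

The resulting model.tflite is a self-contained flatbuffer, so the Pi never needs the original model or TensorFlow itself.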
See:
- PyPI
- GitHub
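The Pi-side half of the flow can be sketched as below. On the Pi you would install only tflite-runtime and use `from tflite_runtime.interpreter import Interpreter`; its interpreter has the same API as `tf.lite.Interpreter`, which is used here only so the sketch is self-contained (the inline model conversion is likewise just a stand-in for the model.tflite file you would copy over):

```python
import numpy as np
import tensorflow as tf  # on the Pi: from tflite_runtime.interpreter import Interpreter

# Stand-in for the converted model file produced on the training machine.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(8,)),
    tf.keras.layers.Dense(1),
])
tflite_bytes = tf.lite.TFLiteConverter.from_keras_model(model).convert()

# Load the flatbuffer and allocate input/output tensors.
interpreter = tf.lite.Interpreter(model_content=tflite_bytes)
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]

# Feed one dummy input and run inference.
x = np.zeros(inp["shape"], dtype=inp["dtype"])
interpreter.set_tensor(inp["index"], x)
interpreter.invoke()
y = interpreter.get_tensor(out["index"])
```

Because tflite-runtime ships only this interpreter and not the training or conversion machinery, the install footprint on the Pi is a small fraction of the full TensorFlow package.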