# Pre-trained models for TensorFlow Lite

There are a variety of already trained, open source models you can use
immediately with TensorFlow Lite to accomplish many machine learning tasks.
Using pre-trained TensorFlow Lite models lets you add machine learning
functionality to your mobile and edge device application quickly, without
having to build and train a model. This guide helps you find and decide on
trained models for use with TensorFlow Lite.

You can start browsing TensorFlow Lite models right away based on general use
cases in the [TensorFlow Lite Examples](../../examples) section, or browse a
larger set of models on
[TensorFlow Hub](https://tfhub.dev/s?deployment-format=lite).

**Important:** TensorFlow Hub lists both regular TensorFlow models and
TensorFlow Lite format models. These model formats are not interchangeable.
TensorFlow models can be converted into TensorFlow Lite models, but that
process is not reversible.


## Find a model for your application

Finding an existing TensorFlow Lite model for your use case can be tricky
depending on what you are trying to accomplish. Here are a few recommended
ways to discover models for use with TensorFlow Lite:

**By example:** The fastest way to find and start using models with TensorFlow
Lite is to browse the [TensorFlow Lite Examples](../../examples) section for
models that perform a task similar to your use case. This short catalog of
examples provides models for common use cases, with explanations of the models
and sample code to get you started running and using them.

**By data input type:** Aside from looking at examples similar to your use
case, another way to discover models is to consider the type of data you want
to process, such as audio, text, image, or video data. Machine learning models
are frequently designed for one of these data types, so looking for models
that handle the data type you want to use can help you narrow down the models
to consider. On [TensorFlow Hub](https://tfhub.dev/s?deployment-format=lite),
you can use the **Problem domain** filter to view model data types and narrow
your list.

Note: Processing video with machine learning models can frequently be
accomplished with models designed to process single images, depending on how
fast and how many inferences you need to perform for your use case. If you
intend to work with video, consider sampling single frames from the video and
running them through a model built for fast processing of individual images.

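As a concrete sketch of single-frame sampling, the helper below (a hypothetical
function, not part of TensorFlow Lite) computes which frame indices to pull
from a clip so that an image model runs at a fixed, lower rate than the video's
frame rate:

```python
# Hypothetical helper: choose frame indices so a single-image model
# runs at `target_fps` instead of on every frame of the video.
def sample_frame_indices(total_frames, source_fps, target_fps):
    """Return indices of the frames to feed to an image model."""
    if target_fps >= source_fps:
        return list(range(total_frames))  # keep every frame
    step = source_fps / target_fps  # source frames per sampled frame
    indices = []
    position = 0.0
    while round(position) < total_frames:
        indices.append(round(position))
        position += step
    return indices

# A 2-second clip at 30 fps, sampled at 5 fps, yields 10 frames:
print(sample_frame_indices(60, 30, 5))  # [0, 6, 12, 18, 24, 30, 36, 42, 48, 54]
```

Each returned index can then be decoded with a video library such as OpenCV and
passed to the image model as an ordinary single-image inference.
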
The following list links to TensorFlow Lite models on
[TensorFlow Hub](https://tfhub.dev/s?deployment-format=lite) for common use
cases:

-   [Image classification](https://tfhub.dev/s?deployment-format=lite&module-type=image-classification)
    models
-   [Object detection](https://tfhub.dev/s?deployment-format=lite&module-type=image-object-detection)
    models
-   [Text classification](https://tfhub.dev/s?deployment-format=lite&module-type=text-classification)
    models
-   [Text embedding](https://tfhub.dev/s?deployment-format=lite&module-type=text-embedding)
    models
-   [Audio speech synthesis](https://tfhub.dev/s?deployment-format=lite&module-type=audio-speech-synthesis)
    models
-   [Audio embedding](https://tfhub.dev/s?deployment-format=lite&module-type=audio-embedding)
    models


## Choose between similar models

If your application follows a common use case such as image classification or
object detection, you may find yourself deciding between multiple TensorFlow
Lite models with varying binary size, data input size, inference speed, and
prediction accuracy ratings. When deciding between a number of models, you
should first narrow your options based on your most limiting constraint:
model size, data size, inference speed, or accuracy.

Key Point: Generally, when choosing between similar models, pick the smallest
model to allow for the broadest device compatibility and fast inference times.

If you are not sure what your most limiting constraint is, assume it is the
size of the model and pick the smallest model available. Picking a small model
gives you the most flexibility in terms of the devices where you can
successfully deploy and run the model. Smaller models also typically produce
faster inferences, and speedier predictions generally create better end-user
experiences. However, smaller models typically have lower accuracy rates, so
you may need to pick a larger model if prediction accuracy is your primary
concern.

## Sources for models

Use the [TensorFlow Lite Examples](../../examples) section and
[TensorFlow Hub](https://tfhub.dev/s?deployment-format=lite) as your first
destinations for finding and selecting models for use with TensorFlow Lite.
These sources generally have up-to-date, curated models for use with
TensorFlow Lite, and frequently include sample code to accelerate your
development process.

### TensorFlow models

It is possible to [convert](https://www.tensorflow.org/lite/models/convert)
regular TensorFlow models to TensorFlow Lite format. For more information
about converting models, see the [TensorFlow Lite
Converter](https://www.tensorflow.org/lite/models/convert) documentation. You
can find TensorFlow models on [TensorFlow Hub](https://tfhub.dev/) and in the
[TensorFlow Model Garden](https://github.com/tensorflow/models).

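A minimal end-to-end sketch of that conversion path, assuming TensorFlow is
installed; the tiny untrained model here is just a placeholder for a real
trained model:

```python
import numpy as np
import tensorflow as tf

# Placeholder model; in practice, load your trained TensorFlow model.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(3,)),
    tf.keras.layers.Dense(2),
])

# Convert to TensorFlow Lite format (a FlatBuffer, returned as bytes).
converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_model = converter.convert()

# Run one inference with the TensorFlow Lite interpreter.
interpreter = tf.lite.Interpreter(model_content=tflite_model)
interpreter.allocate_tensors()
input_index = interpreter.get_input_details()[0]["index"]
output_index = interpreter.get_output_details()[0]["index"]

interpreter.set_tensor(input_index, np.zeros((1, 3), dtype=np.float32))
interpreter.invoke()
result = interpreter.get_tensor(output_index)
print(result.shape)  # (1, 2)
```

On-device you would typically write `tflite_model` to a `.tflite` file and load
it with the platform interpreter instead of running it in the same process.
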