FiftyOne Model Zoo#
The FiftyOne Model Zoo provides a powerful interface for downloading models and applying them to your FiftyOne datasets.
It provides native access to hundreds of pre-trained models, and it also supports downloading arbitrary public or private models whose definitions are provided via GitHub repositories or URLs.
Note
Zoo models may require additional packages such as PyTorch or TensorFlow (or specific versions of them) in order to be used. See this section for more information on viewing/installing package requirements for models.
If you try to load a zoo model without the proper packages installed, you will receive an error message that will explain what you need to install.
Depending on your compute environment, some package requirement failures may be erroneous. In such cases, you can suppress error messages.
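For example, here is a minimal sketch of loading a model while downgrading requirement failures to warnings; the error_level values shown (0 = raise, 1 = warn, 2 = ignore) are an assumption to verify against your installed FiftyOne version:
import fiftyone.zoo as foz

# Assumption: `error_level` controls how unsatisfied package requirements
# are handled (0 = raise an error, 1 = log a warning, 2 = ignore them)
model = foz.load_zoo_model(
    "faster-rcnn-resnet50-fpn-coco-torch",
    error_level=1,
)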
Built-in models#
The Model Zoo provides built-in access to hundreds of pre-trained models that you can apply to your datasets with a few simple commands.
Note
Did you know? You can also pass custom models to methods like apply_model() and compute_embeddings()!
Remotely-sourced models#
The Model Zoo also supports downloading and applying models whose definitions are provided via GitHub repositories or URLs.
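As a minimal sketch, assuming a GitHub repository that contains a zoo model definition (the repository URL and model name below are illustrative), you can register the remote source and then load its models by name:
import fiftyone.zoo as foz

# Register a remote model source (illustrative repository URL)
foz.register_zoo_model_source("https://github.com/voxel51/openai-clip")

# Load a model declared by that source (illustrative model name)
model = foz.load_zoo_model("voxel51/clip-vit-base32-torch")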
Model interface#
All models in the Model Zoo are exposed via the Model class, which defines a common interface for loading models and generating predictions with defined input and output data formats.
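As a rough sketch of that interface, assuming an image model whose predict() method accepts an HWC uint8 numpy array and returns a FiftyOne label object:
import numpy as np
import fiftyone.zoo as foz

model = foz.load_zoo_model("faster-rcnn-resnet50-fpn-coco-torch")

# Assumption: image models accept HWC uint8 numpy arrays
img = np.random.randint(255, size=(480, 640, 3), dtype=np.uint8)

# The model returns its predictions as FiftyOne label objects
labels = model.predict(img)
print(type(labels))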
API reference#
The Model Zoo can be accessed via the Python library and the CLI. Consult the API reference below to see how to download, apply, and manage zoo models.
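For instance, a brief sketch of common management calls in Python (the CLI exposes analogous fiftyone zoo models commands):
import fiftyone.zoo as foz

# List the models available in the zoo, and those already downloaded
print(foz.list_zoo_models())
print(foz.list_downloaded_zoo_models())

# Download a model (a no-op if it is already downloaded)
foz.download_zoo_model("faster-rcnn-resnet50-fpn-coco-torch")

# Delete the local copy of a downloaded model
foz.delete_zoo_model("faster-rcnn-resnet50-fpn-coco-torch")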
Basic recipe#
Methods for working with the Model Zoo are conveniently exposed via the Python library and the CLI. The basic recipe is that you load a model from the zoo and then apply it to a dataset (or a subset of the dataset specified by a DatasetView) using methods such as apply_model() and compute_embeddings().
Prediction#
The Model Zoo provides a number of convenient methods for generating predictions with zoo models for your datasets.
For example, the self-contained code sample below loads a Faster R-CNN model from the Model Zoo and adds its predictions to samples from the COCO-2017 dataset in the Dataset Zoo:
import fiftyone as fo
import fiftyone.zoo as foz

# List available zoo models
print(foz.list_zoo_models())

# Download and load a model
model = foz.load_zoo_model("faster-rcnn-resnet50-fpn-coco-torch")

# Load some samples from the COCO-2017 validation split
dataset = foz.load_zoo_dataset(
    "coco-2017",
    split="validation",
    dataset_name="coco-2017-validation-sample",
    max_samples=50,
    shuffle=True,
)

#
# Choose some samples to process. This can be the entire dataset, or a
# subset of the dataset. In this case, we'll choose some samples at
# random
#
samples = dataset.take(25)

#
# Generate predictions for each sample and store the results in the
# `faster_rcnn` field of the dataset, discarding all predictions with
# confidence below 0.5
#
samples.apply_model(model, label_field="faster_rcnn", confidence_thresh=0.5)
print(samples)

# Visualize predictions in the App
session = fo.launch_app(view=samples)
Embeddings#
Many models in the Model Zoo expose embeddings for their predictions:
import fiftyone.zoo as foz

# Load zoo model
model = foz.load_zoo_model("inception-v3-imagenet-torch")

# Check if model exposes embeddings
print(model.has_embeddings)  # True
For models that expose embeddings, you can generate embeddings for all samples in a dataset (or a subset of it specified by a DatasetView) by calling compute_embeddings():
import fiftyone.zoo as foz

# Load zoo model
model = foz.load_zoo_model("inception-v3-imagenet-torch")
print(model.has_embeddings)  # True

# Load zoo dataset
dataset = foz.load_zoo_dataset("imagenet-sample")

# Select some samples to process
samples = dataset.take(10)

#
# Option 1: Generate embeddings for each sample and return them in a
# `num_samples x dim` array
#
embeddings = samples.compute_embeddings(model)

#
# Option 2: Generate embeddings for each sample and store them in an
# `embeddings` field of the dataset
#
samples.compute_embeddings(model, embeddings_field="embeddings")
You can also use compute_patch_embeddings() to generate embeddings for image patches defined by another label field, e.g., the detections generated by a detection model.
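For example, a minimal sketch that embeds the patches defined by a detections field (the quickstart dataset and field names below are illustrative):
import fiftyone.zoo as foz

dataset = foz.load_zoo_dataset("quickstart")
model = foz.load_zoo_model("inception-v3-imagenet-torch")

# Embed the image patches defined by the `ground_truth` detections and
# store the embeddings on each detection in a `gt_embeddings` attribute
dataset.compute_patch_embeddings(
    model,
    "ground_truth",
    embeddings_field="gt_embeddings",
)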
Logits#
Many classifiers in the Model Zoo can optionally store logits for their predictions.
Note
Storing logits for predictions enables you to run Brain methods such as label mistakes and sample hardness on your datasets!
You can check if a model exposes logits via the has_logits property:
import fiftyone.zoo as foz

# Load zoo model
model = foz.load_zoo_model("inception-v3-imagenet-torch")

# Check if model has logits
print(model.has_logits)  # True
For models that expose logits, you can store logits for all predictions generated by apply_model() by passing the optional store_logits=True argument:
import fiftyone.zoo as foz

# Load zoo model
model = foz.load_zoo_model("inception-v3-imagenet-torch")
print(model.has_logits)  # True

# Load zoo dataset
dataset = foz.load_zoo_dataset("imagenet-sample")

# Select some samples to process
samples = dataset.take(10)

# Generate predictions and populate their `logits` fields
samples.apply_model(model, store_logits=True)
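With logits stored, you can then run Brain methods over the predictions; a hedged sketch, assuming the predictions above landed in the default predictions field and that the dataset has ground truth labels in a ground_truth field:
import fiftyone.brain as fob

# Rank samples by how difficult they were for the model
fob.compute_hardness(samples, "predictions")

# Estimate the likelihood that each ground truth label is mistaken
fob.compute_mistakenness(samples, "predictions", label_field="ground_truth")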