# FiftyOne documentation

> FiftyOne is an open-source tool for building high-quality datasets and computer vision models. It supercharges machine learning workflows by enabling you to visualize datasets, interpret models, evaluate performance, and identify data quality issues faster and more effectively. The platform provides powerful capabilities for dataset curation, model evaluation, and annotation mistake detection, and it integrates seamlessly with popular ML tools such as PyTorch, TensorFlow, and Hugging Face.

## FiftyOne Enterprise

- [FiftyOne Enterprise 🚀](https://docs.voxel51.com/enterprise/index.md)
- [Overview](https://docs.voxel51.com/enterprise/overview.md)
- [Installation](https://docs.voxel51.com/enterprise/installation.md)
- [Getting Started](https://docs.voxel51.com/enterprise/getting_started.md)
- [API connection](https://docs.voxel51.com/enterprise/api_connection.md)
- [Cloud-backed media](https://docs.voxel51.com/enterprise/cloud_media.md)
- [Roles and permissions](https://docs.voxel51.com/enterprise/roles_and_permissions.md)
- [Dataset Versioning](https://docs.voxel51.com/enterprise/dataset_versioning.md)
- [App](https://docs.voxel51.com/enterprise/app.md)
- [Auto-Labeling (New)](https://docs.voxel51.com/enterprise/verified_auto_labeling.md)
- [Data Lens (New)](https://docs.voxel51.com/enterprise/data_lens.md)
- [Data Quality (New)](https://docs.voxel51.com/enterprise/data_quality.md)
- [Query Performance (New)](https://docs.voxel51.com/enterprise/query_performance.md)
- [Plugins](https://docs.voxel51.com/enterprise/plugins.md)
- [Secrets](https://docs.voxel51.com/enterprise/secrets.md)
- [Management SDK](https://docs.voxel51.com/enterprise/management_sdk.md)
- [Migrations](https://docs.voxel51.com/enterprise/migrations.md)
- [Pluggable Auth](https://docs.voxel51.com/enterprise/pluggable_auth.md)

## FiftyOne Installation

- [Installation](https://docs.voxel51.com/installation/index.md)
- [Usage
environments](https://docs.voxel51.com/installation/environments.md)
- [Python venvs](https://docs.voxel51.com/installation/virtualenv.md)
- [Upgrading MongoDB](https://docs.voxel51.com/installation/upgrading-mongodb.md)
- [Troubleshooting](https://docs.voxel51.com/installation/troubleshooting.md)

## Getting Started Guides

- [Getting Started](https://docs.voxel51.com/getting_started/index.md)
- [Auto Labeling Guide](https://docs.voxel51.com/getting_started/auto_labeling/index.md)
- [Prepare Your Dataset and Delegated Operators](https://docs.voxel51.com/getting_started/auto_labeling/01_preparation.md)
- [Configure Auto Labeling Run](https://docs.voxel51.com/getting_started/auto_labeling/02_configure_run.md)
- [Analyze Predictions](https://docs.voxel51.com/getting_started/auto_labeling/03_analyze_results.md)
- [Visualize Embeddings](https://docs.voxel51.com/getting_started/auto_labeling/04_visualize_embeddings.md)
- [Finalize Approvals](https://docs.voxel51.com/getting_started/auto_labeling/05_finalize.md)
- [Guide Summary](https://docs.voxel51.com/getting_started/auto_labeling/summary.md)
- [Annotation Guide (New)](https://docs.voxel51.com/getting_started/annotation/index.md)
- [Quickstart: Multimodal Annotation](https://docs.voxel51.com/getting_started/annotation/01_quickstart.md)
- [Setup Data Splits](https://docs.voxel51.com/getting_started/annotation/02_setup_splits.md)
- [Smart Sample Selection](https://docs.voxel51.com/getting_started/annotation/03_smart_selection.md)
- [2D Annotation + QA](https://docs.voxel51.com/getting_started/annotation/04_annotation_2d.md)
- [3D Annotation](https://docs.voxel51.com/getting_started/annotation/05_annotation_3d.md)
- [Train + Evaluate](https://docs.voxel51.com/getting_started/annotation/06_train_evaluate.md)
- [Iteration Loop](https://docs.voxel51.com/getting_started/annotation/07_iteration.md)
- [Guide Summary](https://docs.voxel51.com/getting_started/annotation/summary.md)
- [Object Detection Guide](https://docs.voxel51.com/getting_started/object_detection/index.md)
- [Loading Detection Datasets](https://docs.voxel51.com/getting_started/object_detection/01_loading_datasets.md)
- [Adding Object Detections](https://docs.voxel51.com/getting_started/object_detection/02_adding_detections.md)
- [Finding Detection Mistakes](https://docs.voxel51.com/getting_started/object_detection/03_finding_mistakes.md)
- [Evaluating Detections](https://docs.voxel51.com/getting_started/object_detection/04_evaluating_detections.md)
- [Guide Summary](https://docs.voxel51.com/getting_started/object_detection/summary.md)
- [Medical Imaging Guide](https://docs.voxel51.com/getting_started/medical_imaging/index.md)
- [Getting Started with Medical Imaging](https://docs.voxel51.com/getting_started/medical_imaging/01_getting_started.md)
- [Guide Summary](https://docs.voxel51.com/getting_started/medical_imaging/summary.md)
- [Self-Driving Guide](https://docs.voxel51.com/getting_started/self_driving/index.md)
- [Loading Self-Driving Datasets](https://docs.voxel51.com/getting_started/self_driving/01_loading_datasets.md)
- [Advanced Self-Driving Techniques](https://docs.voxel51.com/getting_started/self_driving/02_advanced_techniques.md)
- [Guide Summary](https://docs.voxel51.com/getting_started/self_driving/summary.md)
- [3D Visual AI Guide](https://docs.voxel51.com/getting_started/threed_visual_ai/index.md)
- [Getting Started with 3D Datasets](https://docs.voxel51.com/getting_started/threed_visual_ai/01_getting_started_3d.md)
- [Loading 3D Annotations](https://docs.voxel51.com/getting_started/threed_visual_ai/02_loading_annotations.md)
- [Guide Summary](https://docs.voxel51.com/getting_started/threed_visual_ai/summary.md)
- [Model Evaluation Guide](https://docs.voxel51.com/getting_started/model_evaluation/index.md)
- [Basic Model Evaluation](https://docs.voxel51.com/getting_started/model_evaluation/01_basic_evaluation.md)
- [Advanced Evaluation Analysis](https://docs.voxel51.com/getting_started/model_evaluation/02_advanced_analysis.md)
- [Guide Summary](https://docs.voxel51.com/getting_started/model_evaluation/summary.md)
- [Segmentation Guide](https://docs.voxel51.com/getting_started/segmentation/index.md)
- [Loading Segmentation Datasets](https://docs.voxel51.com/getting_started/segmentation/01_intro.md)
- [Adding Instance Segmentations](https://docs.voxel51.com/getting_started/segmentation/02_explore.md)
- [Segment Anything 2 in FiftyOne](https://docs.voxel51.com/getting_started/segmentation/03_sam2.md)
- [Guide Summary](https://docs.voxel51.com/getting_started/segmentation/summary.md)
- [Depth Estimation Guide](https://docs.voxel51.com/getting_started/depth_estimation/index.md)
- [Loading Depth Data](https://docs.voxel51.com/getting_started/depth_estimation/01_loading_depth_data.md)
- [Using Depth Estimation Models](https://docs.voxel51.com/getting_started/depth_estimation/02_depth_estimation.md)
- [Guide Summary](https://docs.voxel51.com/getting_started/depth_estimation/summary.md)
- [Model Dataset Zoo Guide](https://docs.voxel51.com/getting_started/model_dataset_zoo/index.md)
- [Exploring the Dataset Zoo](https://docs.voxel51.com/getting_started/model_dataset_zoo/01_intro.md)
- [Exploring the Model Zoo](https://docs.voxel51.com/getting_started/model_dataset_zoo/02_explore.md)
- [Exploring Remote Zoo Models](https://docs.voxel51.com/getting_started/model_dataset_zoo/03_remote_models.md)
- [Guide Summary](https://docs.voxel51.com/getting_started/model_dataset_zoo/summary.md)
- [Manufacturing Guide](https://docs.voxel51.com/getting_started/manufacturing/index.md)
- [Manufacturing Datasets](https://docs.voxel51.com/getting_started/manufacturing/01_intro.md)
- [Understanding and Using Embeddings](https://docs.voxel51.com/getting_started/manufacturing/02_embeddings.md)
- [Clustering and Labeling with Embeddings](https://docs.voxel51.com/getting_started/manufacturing/03_clustering.md)
- [Custom Embeddings for Industrial Data](https://docs.voxel51.com/getting_started/manufacturing/04_custom_embeddings.md)
- [Model Evaluation and Integration](https://docs.voxel51.com/getting_started/manufacturing/05_evaluation.md)
- [Data Augmentation for Manufacturing](https://docs.voxel51.com/getting_started/manufacturing/06_augmentation.md)
- [3D Visualization for Defect Inspection](https://docs.voxel51.com/getting_started/manufacturing/07_3d_visualization.md)
- [Extended Dataset Exploration](https://docs.voxel51.com/getting_started/manufacturing/08_extended_exploration.md)
- [Valeo Anomaly Dataset](https://docs.voxel51.com/getting_started/manufacturing/09_vad_dataset.md)
- [PPE Detection and Safety Monitoring](https://docs.voxel51.com/getting_started/manufacturing/10_ppe_detection.md)
- [Video Analytics for Safety](https://docs.voxel51.com/getting_started/manufacturing/11_video_analytics.md)
- [Guide Summary](https://docs.voxel51.com/getting_started/manufacturing/summary.md)

## FiftyOne Tutorials

- [Tutorials](https://docs.voxel51.com/tutorials/index.md)
- [Using FiftyOne Skills with Gemini CLI (New)](https://docs.voxel51.com/tutorials/gemini_fiftyone_skills.md)
- [Integrating NVIDIA Cosmos-Transfer with FiftyOne (New)](https://docs.voxel51.com/tutorials/cosmos-transfer-integration.md)
- [Google Gemini Vision in FiftyOne (New)](https://docs.voxel51.com/tutorials/gemini_vision.md)
- [Exploring Kaputt Dataset](https://docs.voxel51.com/tutorials/kaputt_dataset.md)
- [DINOv3 visual search](https://docs.voxel51.com/tutorials/dinov3.md)
- [pandas and FiftyOne](https://docs.voxel51.com/tutorials/pandas_comparison.md)
- [Evaluating object detections](https://docs.voxel51.com/tutorials/evaluate_detections.md)
- [Evaluating a classifier](https://docs.voxel51.com/tutorials/evaluate_classifications.md)
- [Using image embeddings](https://docs.voxel51.com/tutorials/image_embeddings.md)
- [Annotating with CVAT](https://docs.voxel51.com/tutorials/cvat_annotation.md)
- [Annotating with Labelbox](https://docs.voxel51.com/tutorials/labelbox_annotation.md)
- [Working with Open Images](https://docs.voxel51.com/tutorials/open_images.md)
- [Training with Detectron2](https://docs.voxel51.com/tutorials/detectron2.md)
- [Exploring image uniqueness](https://docs.voxel51.com/tutorials/uniqueness.md)
- [Finding class mistakes](https://docs.voxel51.com/tutorials/classification_mistakes.md)
- [Finding detection mistakes](https://docs.voxel51.com/tutorials/detection_mistakes.md)
- [Embeddings with Qdrant](https://docs.voxel51.com/tutorials/qdrant.md)
- [Fine-tuning YOLOv8 models](https://docs.voxel51.com/tutorials/yolov8.md)
- [3D point clouds with Point-E](https://docs.voxel51.com/tutorials/pointe.md)
- [Monocular depth estimation](https://docs.voxel51.com/tutorials/monocular_depth_estimation.md)
- [Dimensionality reduction](https://docs.voxel51.com/tutorials/dimension_reduction.md)
- [Zero-shot classification](https://docs.voxel51.com/tutorials/zero_shot_classification.md)
- [Data augmentation](https://docs.voxel51.com/tutorials/data_augmentation.md)
- [Clustering images](https://docs.voxel51.com/tutorials/clustering.md)
- [Detecting small objects](https://docs.voxel51.com/tutorials/small_object_detection.md)
- [Anomaly detection](https://docs.voxel51.com/tutorials/anomaly_detection.md)

## FiftyOne Recipes

- [Recipes](https://docs.voxel51.com/recipes/index.md)
- [Data Loading with Torch Datasets](https://docs.voxel51.com/recipes/fiftyone_torch_dataloader.md)
- [Training on MNIST with Torch](https://docs.voxel51.com/recipes/torch-dataset-examples/simple_training_example.md)
- [Speeding up with cached fields](https://docs.voxel51.com/recipes/torch-dataset-examples/the_cache_field_names_argument.md)
- [Creating views](https://docs.voxel51.com/recipes/creating_views.md)
- [Removing duplicate images](https://docs.voxel51.com/recipes/image_deduplication.md)
- [Removing duplicate objects](https://docs.voxel51.com/recipes/remove_duplicate_annos.md)
-
[Adding classifier predictions](https://docs.voxel51.com/recipes/adding_classifications.md)
- [Adding object detections](https://docs.voxel51.com/recipes/adding_detections.md)
- [Draw labels on samples](https://docs.voxel51.com/recipes/draw_labels.md)
- [Convert dataset formats](https://docs.voxel51.com/recipes/convert_datasets.md)
- [Merging datasets](https://docs.voxel51.com/recipes/merge_datasets.md)
- [Custom dataset importers](https://docs.voxel51.com/recipes/custom_importer.md)
- [Custom dataset exporters](https://docs.voxel51.com/recipes/custom_exporter.md)
- [Custom sample parsers](https://docs.voxel51.com/recipes/custom_parser.md)

## FiftyOne Cheat Sheets

- [Cheat Sheets](https://docs.voxel51.com/cheat_sheets/index.md)
- [FiftyOne terminology](https://docs.voxel51.com/cheat_sheets/fiftyone_terminology.md)
- [Filtering cheat sheet](https://docs.voxel51.com/cheat_sheets/filtering_cheat_sheet.md)
- [Views cheat sheet](https://docs.voxel51.com/cheat_sheets/views_cheat_sheet.md)
- [pandas vs FiftyOne](https://docs.voxel51.com/cheat_sheets/pandas_vs_fiftyone.md)

## FiftyOne User Guide

- [User Guide](https://docs.voxel51.com/user_guide/index.md)
- [FiftyOne basics](https://docs.voxel51.com/user_guide/basics.md)
- [Importing data](https://docs.voxel51.com/user_guide/import_datasets.md)
- [Using datasets](https://docs.voxel51.com/user_guide/using_datasets.md)
- [Using the App](https://docs.voxel51.com/user_guide/app.md)
- [Dataset views](https://docs.voxel51.com/user_guide/using_views.md)
- [Grouped datasets](https://docs.voxel51.com/user_guide/groups.md)
- [Annotating datasets (New)](https://docs.voxel51.com/user_guide/annotation.md)
- [Evaluating models (New)](https://docs.voxel51.com/user_guide/evaluation.md)
- [Using aggregations](https://docs.voxel51.com/user_guide/using_aggregations.md)
- [Interactive plots](https://docs.voxel51.com/user_guide/plots.md)
- [Exporting datasets](https://docs.voxel51.com/user_guide/export_datasets.md)
- [Drawing labels on samples](https://docs.voxel51.com/user_guide/draw_labels.md)
- [Using sample parsers](https://docs.voxel51.com/user_guide/sample_parsers.md)
- [Configuring FiftyOne](https://docs.voxel51.com/user_guide/config.md)

## Dataset Zoo

- [Dataset Zoo](https://docs.voxel51.com/dataset_zoo/index.md)
- [Overview](https://docs.voxel51.com/dataset_zoo/overview.md)
- [Remote datasets](https://docs.voxel51.com/dataset_zoo/remote.md)
- [API reference](https://docs.voxel51.com/dataset_zoo/api.md)
- [ActivityNet 100](https://docs.voxel51.com/dataset_zoo/datasets/activitynet_100.md)
- [ActivityNet 200](https://docs.voxel51.com/dataset_zoo/datasets/activitynet_200.md)
- [BDD100K](https://docs.voxel51.com/dataset_zoo/datasets/bdd100k.md)
- [Caltech-101](https://docs.voxel51.com/dataset_zoo/datasets/caltech101.md)
- [Caltech-256](https://docs.voxel51.com/dataset_zoo/datasets/caltech256.md)
- [CIFAR-10](https://docs.voxel51.com/dataset_zoo/datasets/cifar10.md)
- [CIFAR-100](https://docs.voxel51.com/dataset_zoo/datasets/cifar100.md)
- [Cityscapes](https://docs.voxel51.com/dataset_zoo/datasets/cityscapes.md)
- [COCO-2014](https://docs.voxel51.com/dataset_zoo/datasets/coco_2014.md)
- [COCO-2017](https://docs.voxel51.com/dataset_zoo/datasets/coco_2017.md)
- [Fashion MNIST](https://docs.voxel51.com/dataset_zoo/datasets/fashion_mnist.md)
- [Families in the Wild](https://docs.voxel51.com/dataset_zoo/datasets/fiw.md)
- [HMDB51](https://docs.voxel51.com/dataset_zoo/datasets/hmdb51.md)
- [ImageNet 2012](https://docs.voxel51.com/dataset_zoo/datasets/imagenet_2012.md)
- [ImageNet Sample](https://docs.voxel51.com/dataset_zoo/datasets/imagenet_sample.md)
- [Kinetics 400](https://docs.voxel51.com/dataset_zoo/datasets/kinetics_400.md)
- [Kinetics 600](https://docs.voxel51.com/dataset_zoo/datasets/kinetics_600.md)
- [Kinetics 700](https://docs.voxel51.com/dataset_zoo/datasets/kinetics_700.md)
- [Kinetics 700-2020](https://docs.voxel51.com/dataset_zoo/datasets/kinetics_700_2020.md)
- [KITTI](https://docs.voxel51.com/dataset_zoo/datasets/kitti.md)
- [KITTI Multiview](https://docs.voxel51.com/dataset_zoo/datasets/kitti_multiview.md)
- [Labeled Faces in the Wild](https://docs.voxel51.com/dataset_zoo/datasets/lfw.md)
- [MNIST](https://docs.voxel51.com/dataset_zoo/datasets/mnist.md)
- [Open Images V6](https://docs.voxel51.com/dataset_zoo/datasets/open_images_v6.md)
- [Open Images V7](https://docs.voxel51.com/dataset_zoo/datasets/open_images_v7.md)
- [Places](https://docs.voxel51.com/dataset_zoo/datasets/places.md)
- [Quickstart](https://docs.voxel51.com/dataset_zoo/datasets/quickstart.md)
- [Quickstart 3D](https://docs.voxel51.com/dataset_zoo/datasets/quickstart_3d.md)
- [Quickstart Geo](https://docs.voxel51.com/dataset_zoo/datasets/quickstart_geo.md)
- [Quickstart Groups](https://docs.voxel51.com/dataset_zoo/datasets/quickstart_groups.md)
- [Quickstart Video](https://docs.voxel51.com/dataset_zoo/datasets/quickstart_video.md)
- [Sama-COCO](https://docs.voxel51.com/dataset_zoo/datasets/sama_coco.md)
- [UCF101](https://docs.voxel51.com/dataset_zoo/datasets/ucf101.md)
- [VOC-2007](https://docs.voxel51.com/dataset_zoo/datasets/voc_2007.md)
- [VOC-2012](https://docs.voxel51.com/dataset_zoo/datasets/voc_2012.md)
- [Dataset Card for action100m](https://docs.voxel51.com/dataset_zoo/datasets_hf/action100m_tiny_subset.md)
- [Dataset Card for AFO - Aerial Floating Objects](https://docs.voxel51.com/dataset_zoo/datasets_hf/afo_aerial_floating_objects.md)
- [Dataset Card for aloha_pen_uncap](https://docs.voxel51.com/dataset_zoo/datasets_hf/aloha_pen_uncap.md)
- [Dataset Card for ASL-MNIST](https://docs.voxel51.com/dataset_zoo/datasets_hf/american_sign_language_mnist.md)
- [Dataset Card for arcade_combined_export](https://docs.voxel51.com/dataset_zoo/datasets_hf/arcade_fo.md)
- [Dataset Card for AVM (Around View Monitoring) Semantic Segmentation Dataset](https://docs.voxel51.com/dataset_zoo/datasets_hf/avm_segmentation_train.md)
- [Dataset Card for BarkVN-50: Tree
Species Identification from Bark Texture](https://docs.voxel51.com/dataset_zoo/datasets_hf/barkvn_50.md)
- [Dataset Card for BIOSCAN-30k](https://docs.voxel51.com/dataset_zoo/datasets_hf/bioscan_30k.md)
- [Dataset Card for bo-dataset](https://docs.voxel51.com/dataset_zoo/datasets_hf/bo_or_not.md)
- [Dataset Card for btcv](https://docs.voxel51.com/dataset_zoo/datasets_hf/btcv_ct_as_video_medsam2_dataset.md)
- [Dataset Card for cats-vs-dogs-imbalanced](https://docs.voxel51.com/dataset_zoo/datasets_hf/cats_vs_dogs_imbalanced.md)
- [Dataset Card for Dataset Name](https://docs.voxel51.com/dataset_zoo/datasets_hf/cats_vs_dogs_sample.md)
- [CholecT50 Dataset (FiftyOne Format)](https://docs.voxel51.com/dataset_zoo/datasets_hf/cholect50.md)
- [Dataset Card for COIL-100](https://docs.voxel51.com/dataset_zoo/datasets_hf/coil_100.md)
- [Dataset Card for colorswap](https://docs.voxel51.com/dataset_zoo/datasets_hf/colorswap.md)
- [Dataset Card for CommonForms_val](https://docs.voxel51.com/dataset_zoo/datasets_hf/commonforms_val_subset.md)
- [Dataset Card for Consolidated Receipt Dataset](https://docs.voxel51.com/dataset_zoo/datasets_hf/consolidated_receipt_dataset.md)
- [Dataset Card for Homework Test Set for Coursera MOOC - Hands-On Data Centric Visual AI](https://docs.voxel51.com/dataset_zoo/datasets_hf/coursera_homework_dataset_test.md)
- [Dataset Card for Homework Training Set for Coursera MOOC - Hands-On Data Centric Visual AI](https://docs.voxel51.com/dataset_zoo/datasets_hf/coursera_homework_dataset_train.md)
- [Dataset Card for Lecture Test Set for Coursera MOOC - Hands-On Data Centric Visual AI](https://docs.voxel51.com/dataset_zoo/datasets_hf/coursera_lecture_dataset_test.md)
- [Dataset Card for Lecture Training Set for Coursera MOOC - Hands-On Data Centric Visual AI](https://docs.voxel51.com/dataset_zoo/datasets_hf/coursera_lecture_dataset_train.md)
- [Dataset Card for crops3d](https://docs.voxel51.com/dataset_zoo/datasets_hf/crops3d.md)
- [Dataset Card for CuratedMNIST](https://docs.voxel51.com/dataset_zoo/datasets_hf/curated_mnist.md)
- [Dataset Card for cvpr2024_papers](https://docs.voxel51.com/dataset_zoo/datasets_hf/cvpr_2024_papers.md)
- [Dataset Card for dacl10k](https://docs.voxel51.com/dataset_zoo/datasets_hf/dacl10k.md)
- [Dataset Card for DanceTrack](https://docs.voxel51.com/dataset_zoo/datasets_hf/dancetrack.md)
- [Dataset Card for Data-Centric-Visual-AI-Train-Set](https://docs.voxel51.com/dataset_zoo/datasets_hf/data_centric_visual_ai_challenge_train_set.md)
- [Dataset Cards](https://docs.voxel51.com/dataset_zoo/datasets_hf/dataset_cards.md)
- [DCVAI Challenge Public Eval Set](https://docs.voxel51.com/dataset_zoo/datasets_hf/dcvai_challenge_public_eval_set.md)
- [DeepLesion Benchmark Subset (Balanced 2K)](https://docs.voxel51.com/dataset_zoo/datasets_hf/deeplesion_balanced_2k.md)
- [Dataset Card for DeepPatent](https://docs.voxel51.com/dataset_zoo/datasets_hf/deeppatent.md)
- [Dataset Card for DensePose-COCO](https://docs.voxel51.com/dataset_zoo/datasets_hf/densepose_coco.md)
- [Dataset Card for Describable Textures Dataset](https://docs.voxel51.com/dataset_zoo/datasets_hf/describable_textures_dataset.md)
- [Dataset Card for Diverse-SDXL-Dogs](https://docs.voxel51.com/dataset_zoo/datasets_hf/diverse_sdxl_dogs.md)
- [Dataset Card for document-haystack-10pages](https://docs.voxel51.com/dataset_zoo/datasets_hf/document_haystack_10pages.md)
- [Dataset Card for DroneScapes2 (annotated train set)](https://docs.voxel51.com/dataset_zoo/datasets_hf/dronescapes2_annotated_train_set.md)
- [Dataset Card for DUTS](https://docs.voxel51.com/dataset_zoo/datasets_hf/duts.md)
- [Dataset Card for Egocentric_10K_Evaluation](https://docs.voxel51.com/dataset_zoo/datasets_hf/egocentric_10k_evaluation.md)
- [Dataset Card for Egocentric 10K (subset - Factory 51, first 51 videos)](https://docs.voxel51.com/dataset_zoo/datasets_hf/egocentric_10k_subset.md)
- [Dataset Card for EMNIST-Letters-10k](https://docs.voxel51.com/dataset_zoo/datasets_hf/emnist_letters_tiny.md)
- [Dataset Card for FGVC-Aircraft](https://docs.voxel51.com/dataset_zoo/datasets_hf/fgvc_aircraft.md)
- [FiftyOne Embeddings Dataset](https://docs.voxel51.com/dataset_zoo/datasets_hf/fiftyone_embeddings_combined.md)
- [FiftyOne Function Calling 14k Dataset](https://docs.voxel51.com/dataset_zoo/datasets_hf/fiftyone_function_calling_14k.md)
- [Dataset Card for FiftyOne GUI Grounding Training Set](https://docs.voxel51.com/dataset_zoo/datasets_hf/fiftyone_gui_grounding_train.md)
- [Dataset Card for FiftyOne GUI Grounding Training Set with Synthetic Augmentation](https://docs.voxel51.com/dataset_zoo/datasets_hf/fiftyone_gui_grounding_train_with_synthetic.md)
- [FiftyOne QA 14k Dataset](https://docs.voxel51.com/dataset_zoo/datasets_hf/fiftyone_qa_pairs_14k.md)
- [Dataset Card for FinnWoodlands](https://docs.voxel51.com/dataset_zoo/datasets_hf/finnwoodlands.md)
- [Dataset Card for FishEye8K: A Benchmark and Dataset for Fisheye Camera Object Detection](https://docs.voxel51.com/dataset_zoo/datasets_hf/fisheye8k.md)
- [Dataset Card for FloorPlanCAD (test split)](https://docs.voxel51.com/dataset_zoo/datasets_hf/floorplancad.md)
- [Dataset Card for Food-101](https://docs.voxel51.com/dataset_zoo/datasets_hf/food101.md)
- [Dataset Card for Food Waste Dataset](https://docs.voxel51.com/dataset_zoo/datasets_hf/food_waste_dataset.md)
- [Dataset Card for football-player-segmentation](https://docs.voxel51.com/dataset_zoo/datasets_hf/football_player_segmentation.md)
- [Dataset Card for Form Understanding in Noisy Scanned Documents Plus](https://docs.voxel51.com/dataset_zoo/datasets_hf/form_understanding_in_noisy_scanned_documents_plus.md)
- [Gaussian Splats Dataset](https://docs.voxel51.com/dataset_zoo/datasets_hf/gaussian_splatting.md)
- [Dataset Card for predicted_labels](https://docs.voxel51.com/dataset_zoo/datasets_hf/getting_started_labeled_photos.md)
- [Dataset Card for validation_photos](https://docs.voxel51.com/dataset_zoo/datasets_hf/getting_started_labeled_validation.md)
- [Dataset Card for labeled_validation_predicted_clip](https://docs.voxel51.com/dataset_zoo/datasets_hf/getting_started_validation_clip_pred.md)
- [Dataset Card for Elderly Action Recognition Challenge](https://docs.voxel51.com/dataset_zoo/datasets_hf/gmncsa24_fo.md)
- [Dataset Card for GQA-35k](https://docs.voxel51.com/dataset_zoo/datasets_hf/gqa_scene_graph.md)
- [Dataset Card for GroundUI-18k Dataset](https://docs.voxel51.com/dataset_zoo/datasets_hf/groundui_18k.md)
- [Dataset Card for GUI Odyssey (Test Split)](https://docs.voxel51.com/dataset_zoo/datasets_hf/gui_odyssey_test.md)
- [Dataset Card for GUI Odyssey (Train Split)](https://docs.voxel51.com/dataset_zoo/datasets_hf/gui_odyssey_train.md)
- [GUIAct Smartphone Dataset - Test Split](https://docs.voxel51.com/dataset_zoo/datasets_hf/guiact_smartphone_test.md)
- [Dataset Card for GUIAct Web-Single Dataset - Test Set](https://docs.voxel51.com/dataset_zoo/datasets_hf/guiact_websingle_test.md)
- [Dataset Card for Image Hand Keypoint Detection](https://docs.voxel51.com/dataset_zoo/datasets_hf/hand_keypoints.md)
- [Dataset Card for hard-hat-detection](https://docs.voxel51.com/dataset_zoo/datasets_hf/hard_hat_detection.md)
- [Dataset Card for high_quality_invoice_images_ocr](https://docs.voxel51.com/dataset_zoo/datasets_hf/high_quality_invoice_images_for_ocr.md)
- [Dataset Card for finevision_iam](https://docs.voxel51.com/dataset_zoo/datasets_hf/iam_handwriting_finevision.md)
- [Dataset Card for illusion_animals](https://docs.voxel51.com/dataset_zoo/datasets_hf/illusionanimals.md)
- [Dataset Card for ImageNet-A](https://docs.voxel51.com/dataset_zoo/datasets_hf/imagenet_a.md)
- [Dataset Card for ImageNet-D](https://docs.voxel51.com/dataset_zoo/datasets_hf/imagenet_d.md)
- [Dataset Card for ImageNet-O](https://docs.voxel51.com/dataset_zoo/datasets_hf/imagenet_o.md)
- [Dataset Card for
IndoorSceneRecognition](https://docs.voxel51.com/dataset_zoo/datasets_hf/indoorscenerecognition.md)
- [Dataset Card for INQUIRE-ReRank](https://docs.voxel51.com/dataset_zoo/datasets_hf/inquire_rerank.md)
- [Dataset Card for Forest Damages - Larch Casebearer](https://docs.voxel51.com/dataset_zoo/datasets_hf/larch_tree_damage.md)
- [Dataset Card for LIDAR Warehouse Dataset](https://docs.voxel51.com/dataset_zoo/datasets_hf/lidar_warehouse_dataset.md)
- [Dataset Card for LVIS-35k](https://docs.voxel51.com/dataset_zoo/datasets_hf/lvis.md)
- [Dataset Card for MapTrace-20k](https://docs.voxel51.com/dataset_zoo/datasets_hf/maptrace_20k.md)
- [Dataset Card for MashUpVQA](https://docs.voxel51.com/dataset_zoo/datasets_hf/mashupvqa.md)
- [Dataset Card for MedXpertQA](https://docs.voxel51.com/dataset_zoo/datasets_hf/medxpertqa.md)
- [Dataset Card for “Cross-Domain” Test Split in Multimodal Mind2Web](https://docs.voxel51.com/dataset_zoo/datasets_hf/mind2web_multimodal_test_domain.md)
- [Dataset Card for Multimodal Mind2Web “Cross-Task” Test Split](https://docs.voxel51.com/dataset_zoo/datasets_hf/mind2web_multimodal_test_task.md)
- [Dataset Card for Multimodal Mind2Web “Cross-Website” Test Split](https://docs.voxel51.com/dataset_zoo/datasets_hf/mind2web_multimodal_test_website.md)
- [Dataset Card for MPII Human Pose](https://docs.voxel51.com/dataset_zoo/datasets_hf/mpii_human_pose_dataset.md)
- [Dataset Card for MVTec AD](https://docs.voxel51.com/dataset_zoo/datasets_hf/mvtec_ad.md)
- [Dataset Card for nutrigreen_dataset](https://docs.voxel51.com/dataset_zoo/datasets_hf/nutrigreen.md)
- [Dataset Card for New York Smells](https://docs.voxel51.com/dataset_zoo/datasets_hf/nyc_smells.md)
- [Dataset Card for OD_MetalDAM](https://docs.voxel51.com/dataset_zoo/datasets_hf/od_metaldam.md)
- [Dataset Card for Office-Home](https://docs.voxel51.com/dataset_zoo/datasets_hf/office_home.md)
- [Dataset Card for olmocr-bench](https://docs.voxel51.com/dataset_zoo/datasets_hf/olmocr_bench.md)
- [Dataset Card for OpenSARWake](https://docs.voxel51.com/dataset_zoo/datasets_hf/opensarwake.md)
- [Dataset Card for Oxford Flowers 102](https://docs.voxel51.com/dataset_zoo/datasets_hf/oxfordflowers102.md)
- [Dataset Card for ParkSeg12k: Parking Lot Segmentation Dataset](https://docs.voxel51.com/dataset_zoo/datasets_hf/parkseg12k_train.md)
- [Dataset Card for pidray](https://docs.voxel51.com/dataset_zoo/datasets_hf/pidray.md)
- [Dataset Card for PKLot](https://docs.voxel51.com/dataset_zoo/datasets_hf/pklot.md)
- [Dataset Card for PlantSeg_Test](https://docs.voxel51.com/dataset_zoo/datasets_hf/plantseg_test.md)
- [Dataset Card for Qualcomm Exercise Video Dataset (Benchmark)](https://docs.voxel51.com/dataset_zoo/datasets_hf/qualcomm_exercise_video_dataset_benchmark.md)
- [Dataset Card for Qualcomm Interactive Video Dataset](https://docs.voxel51.com/dataset_zoo/datasets_hf/qualcomm_interactive_video_dataset.md)
- [Dataset Card for quickstart-3d](https://docs.voxel51.com/dataset_zoo/datasets_hf/quickstart_3d.md)
- [Dataset Card for RefCOCO-M](https://docs.voxel51.com/dataset_zoo/datasets_hf/refcoco_m.md)
- [Dataset Card for RefSegRS](https://docs.voxel51.com/dataset_zoo/datasets_hf/regsegrs.md)
- [Reid People Tracking](https://docs.voxel51.com/dataset_zoo/datasets_hf/reid_people_tracking.md)
- [Dataset Card for Rico Semantic Dataset](https://docs.voxel51.com/dataset_zoo/datasets_hf/rico.md)
- [Dataset Card for RIS-LAD](https://docs.voxel51.com/dataset_zoo/datasets_hf/ris_lad.md)
- [Dataset Card for S5Mars Dataset](https://docs.voxel51.com/dataset_zoo/datasets_hf/s5mars.md)
- [Dataset Card for safe_unsafe_behaviours](https://docs.voxel51.com/dataset_zoo/datasets_hf/safe_and_unsafe_behaviours.md)
- [Dataset Card for scanned_images_dataset](https://docs.voxel51.com/dataset_zoo/datasets_hf/scanned_images_dataset_for_ocr_and_vlm_finetuning.md)
- [Dataset Card for Scanned Receipts OCR and Information Extraction](https://docs.voxel51.com/dataset_zoo/datasets_hf/scanned_receipts.md)
- [Dataset Card for ScreenSpot](https://docs.voxel51.com/dataset_zoo/datasets_hf/screenspot.md)
- [Dataset Card for ScreenSpot-Pro](https://docs.voxel51.com/dataset_zoo/datasets_hf/screenspot_pro.md)
- [Dataset Card for ScreenSpot-V2](https://docs.voxel51.com/dataset_zoo/datasets_hf/screenspot_v2.md)
- [Dataset Card for SDXL Dogs](https://docs.voxel51.com/dataset_zoo/datasets_hf/sdxl_dogs.md)
- [Dataset Card for Generated Dogs](https://docs.voxel51.com/dataset_zoo/datasets_hf/sdxl_generated_stanford_dogs.md)
- [Dataset Card for Set14](https://docs.voxel51.com/dataset_zoo/datasets_hf/set14.md)
- [Dataset Card for Set5](https://docs.voxel51.com/dataset_zoo/datasets_hf/set5.md)
- [Desktop Dataset from ShowUI](https://docs.voxel51.com/dataset_zoo/datasets_hf/showui_desktop.md)
- [Dataset Card for ShowUI_Web](https://docs.voxel51.com/dataset_zoo/datasets_hf/showui_web.md)
- [Dataset Card for harpreetsahota/sku110k_test](https://docs.voxel51.com/dataset_zoo/datasets_hf/sku110k_test.md)
- [Dataset Card for SkyScenes](https://docs.voxel51.com/dataset_zoo/datasets_hf/skyscenes.md)
- [Dataset Card for SLAKE](https://docs.voxel51.com/dataset_zoo/datasets_hf/slake.md)
- [Dataset Card for SoccerNet-V3](https://docs.voxel51.com/dataset_zoo/datasets_hf/soccernet_v3.md)
- [Dataset Card for StanfordDogsImbalanced](https://docs.voxel51.com/dataset_zoo/datasets_hf/stanford_dogs_imbalanced.md)
- [Dataset Card for StanfordDogs](https://docs.voxel51.com/dataset_zoo/datasets_hf/stanforddogs.md)
- [Dataset Card for Street View House Numbers](https://docs.voxel51.com/dataset_zoo/datasets_hf/streetviewhousenumbers.md)
- [Dataset Card for synthetic_us_passports](https://docs.voxel51.com/dataset_zoo/datasets_hf/synthetic_us_passports_easy.md)
- [Dataset Card for SynthHuman](https://docs.voxel51.com/dataset_zoo/datasets_hf/synthhuman.md)
- [Dataset Card for
TAMPAR](https://docs.voxel51.com/dataset_zoo/datasets_hf/tampar.md)
- [Dataset Card for ThermalPersonDetector](https://docs.voxel51.com/dataset_zoo/datasets_hf/thermal_person_detector.md)
- [Dataset Card for Total-Text-Dataset](https://docs.voxel51.com/dataset_zoo/datasets_hf/total_text_dataset.md)
- [Dataset Card for UnCommon Objects in 3D](https://docs.voxel51.com/dataset_zoo/datasets_hf/uco3d.md)
- [Dataset Card for Urban100](https://docs.voxel51.com/dataset_zoo/datasets_hf/urban100.md)
- [Dataset Card for usps](https://docs.voxel51.com/dataset_zoo/datasets_hf/usps.md)
- [Dataset Card for VisDrone2019-DET](https://docs.voxel51.com/dataset_zoo/datasets_hf/visdrone2019_det.md)
- [Dataset Card for VisDrone2019-DET](https://docs.voxel51.com/dataset_zoo/datasets_hf/visdrone_mot.md)
- [Dataset Card for neurips-2025-vision-papers](https://docs.voxel51.com/dataset_zoo/datasets_hf/visual_ai_at_neurips2025.md)
- [Dataset Card for WaveUI-25k](https://docs.voxel51.com/dataset_zoo/datasets_hf/waveui_25k.md)
- [Dataset Card for WebUOT-238-Test](https://docs.voxel51.com/dataset_zoo/datasets_hf/webuot_238_test.md)
- [Dataset Card for WLASL](https://docs.voxel51.com/dataset_zoo/datasets_hf/wlasl.md)

## Model Zoo

- [Model Zoo](https://docs.voxel51.com/model_zoo/index.md)
- [Overview](https://docs.voxel51.com/model_zoo/overview.md)
- [Remote models](https://docs.voxel51.com/model_zoo/remote.md)
- [Model interface](https://docs.voxel51.com/model_zoo/design.md)
- [API reference](https://docs.voxel51.com/model_zoo/api.md)
- [Apple/SHARP](https://docs.voxel51.com/model_zoo/models/Apple_SHARP.md)
- [ByteDance-Seed/UI-TARS-1.5-7B](https://docs.voxel51.com/model_zoo/models/ByteDance_Seed_UI_TARS_1_5_7B.md)
- [ModernVBERT/bimodernvbert](https://docs.voxel51.com/model_zoo/models/ModernVBERT_bimodernvbert.md)
- [ModernVBERT/colmodernvbert](https://docs.voxel51.com/model_zoo/models/ModernVBERT_colmodernvbert.md)
- [OS-Copilot/OS-Atlas-Base-7B](https://docs.voxel51.com/model_zoo/models/OS_Copilot_OS_Atlas_Base_7B.md)
- [PE-Core-B16-224-Vision-Encoder](https://docs.voxel51.com/model_zoo/models/PE_Core_B16_224_Vision_Encoder.md)
- [PE-Core-L14-336-Vision-Encoder](https://docs.voxel51.com/model_zoo/models/PE_Core_L14_336_Vision_Encoder.md)
- [PerceptronAI/Isaac-0.1](https://docs.voxel51.com/model_zoo/models/PerceptronAI_Isaac_0_1.md)
- [PerceptronAI/Isaac-0.2-1B](https://docs.voxel51.com/model_zoo/models/PerceptronAI_Isaac_0_2_1B.md)
- [PerceptronAI/Isaac-0.2-2B-Preview](https://docs.voxel51.com/model_zoo/models/PerceptronAI_Isaac_0_2_2B_Preview.md)
- [Qwen/Qwen2.5-VL-32B-Instruct](https://docs.voxel51.com/model_zoo/models/Qwen_Qwen2_5_VL_32B_Instruct.md)
- [Qwen/Qwen2.5-VL-32B-Instruct-AWQ](https://docs.voxel51.com/model_zoo/models/Qwen_Qwen2_5_VL_32B_Instruct_AWQ.md)
- [Qwen/Qwen2.5-VL-3B-Instruct](https://docs.voxel51.com/model_zoo/models/Qwen_Qwen2_5_VL_3B_Instruct.md)
- [Qwen/Qwen2.5-VL-3B-Instruct-AWQ](https://docs.voxel51.com/model_zoo/models/Qwen_Qwen2_5_VL_3B_Instruct_AWQ.md)
- [Qwen/Qwen2.5-VL-72B-Instruct](https://docs.voxel51.com/model_zoo/models/Qwen_Qwen2_5_VL_72B_Instruct.md)
- [Qwen/Qwen2.5-VL-72B-Instruct-AWQ](https://docs.voxel51.com/model_zoo/models/Qwen_Qwen2_5_VL_72B_Instruct_AWQ.md)
- [Qwen/Qwen2.5-VL-7B-Instruct](https://docs.voxel51.com/model_zoo/models/Qwen_Qwen2_5_VL_7B_Instruct.md)
- [Qwen/Qwen2.5-VL-7B-Instruct-AWQ](https://docs.voxel51.com/model_zoo/models/Qwen_Qwen2_5_VL_7B_Instruct_AWQ.md)
- [Qwen/Qwen3-VL-2B-Instruct](https://docs.voxel51.com/model_zoo/models/Qwen_Qwen3_VL_2B_Instruct.md)
- [Qwen/Qwen3-VL-4B-Instruct](https://docs.voxel51.com/model_zoo/models/Qwen_Qwen3_VL_4B_Instruct.md)
- [Qwen/Qwen3-VL-8B-Instruct](https://docs.voxel51.com/model_zoo/models/Qwen_Qwen3_VL_8B_Instruct.md)
- [Qwen/Qwen3-VL-Embedding-2B](https://docs.voxel51.com/model_zoo/models/Qwen_Qwen3_VL_Embedding_2B.md)
- [Qwen/Qwen3-VL-Embedding-8B](https://docs.voxel51.com/model_zoo/models/Qwen_Qwen3_VL_Embedding_8B.md)
- [XiaomiMiMo/MiMo-VL-7B-RL](https://docs.voxel51.com/model_zoo/models/XiaomiMiMo_MiMo_VL_7B_RL.md)
- [XiaomiMiMo/MiMo-VL-7B-SFT](https://docs.voxel51.com/model_zoo/models/XiaomiMiMo_MiMo_VL_7B_SFT.md)
- [XiaomiMiMo/MiMo-VL-7B-SFT-GGUF](https://docs.voxel51.com/model_zoo/models/XiaomiMiMo_MiMo_VL_7B_SFT_GGUF.md)
- [alexnet-imagenet-torch](https://docs.voxel51.com/model_zoo/models/alexnet_imagenet_torch.md)
- [allenai/Molmo2-4B](https://docs.voxel51.com/model_zoo/models/allenai_Molmo2_4B.md)
- [allenai/Molmo2-8B](https://docs.voxel51.com/model_zoo/models/allenai_Molmo2_8B.md)
- [allenai/Molmo2-O-7B](https://docs.voxel51.com/model_zoo/models/allenai_Molmo2_O_7B.md)
- [allenai/Molmo2-VideoPoint-4B](https://docs.voxel51.com/model_zoo/models/allenai_Molmo2_VideoPoint_4B.md)
- [allenai/olmOCR-2-7B-1025](https://docs.voxel51.com/model_zoo/models/allenai_olmOCR_2_7B_1025.md)
- [apple/FastVLM-0.5B](https://docs.voxel51.com/model_zoo/models/apple_FastVLM_0_5B.md)
- [apple/FastVLM-1.5B](https://docs.voxel51.com/model_zoo/models/apple_FastVLM_1_5B.md)
- [apple/FastVLM-7B](https://docs.voxel51.com/model_zoo/models/apple_FastVLM_7B.md)
- [centernet-hg104-1024-coco-tf2](https://docs.voxel51.com/model_zoo/models/centernet_hg104_1024_coco_tf2.md)
- [centernet-hg104-512-coco-tf2](https://docs.voxel51.com/model_zoo/models/centernet_hg104_512_coco_tf2.md)
- [centernet-mobilenet-v2-fpn-512-coco-tf2](https://docs.voxel51.com/model_zoo/models/centernet_mobilenet_v2_fpn_512_coco_tf2.md)
- [centernet-resnet101-v1-fpn-512-coco-tf2](https://docs.voxel51.com/model_zoo/models/centernet_resnet101_v1_fpn_512_coco_tf2.md)
- [centernet-resnet50-v1-fpn-512-coco-tf2](https://docs.voxel51.com/model_zoo/models/centernet_resnet50_v1_fpn_512_coco_tf2.md)
- [centernet-resnet50-v2-512-coco-tf2](https://docs.voxel51.com/model_zoo/models/centernet_resnet50_v2_512_coco_tf2.md)
-
[classification-transformer-torch](https://docs.voxel51.com/model_zoo/models/classification_transformer_torch.md) - [clip-vit-base32-torch](https://docs.voxel51.com/model_zoo/models/clip_vit_base32_torch.md) - [convnext-base-224-torch](https://docs.voxel51.com/model_zoo/models/convnext_base_224_torch.md) - [convnext-large-224-torch](https://docs.voxel51.com/model_zoo/models/convnext_large_224_torch.md) - [convnext-small-224-torch](https://docs.voxel51.com/model_zoo/models/convnext_small_224_torch.md) - [convnext-tiny-224-torch](https://docs.voxel51.com/model_zoo/models/convnext_tiny_224_torch.md) - [convnext-xlarge-224-torch](https://docs.voxel51.com/model_zoo/models/convnext_xlarge_224_torch.md) - [deeplabv3-cityscapes-tf](https://docs.voxel51.com/model_zoo/models/deeplabv3_cityscapes_tf.md) - [deeplabv3-mnv2-cityscapes-tf](https://docs.voxel51.com/model_zoo/models/deeplabv3_mnv2_cityscapes_tf.md) - [deeplabv3-resnet101-coco-torch](https://docs.voxel51.com/model_zoo/models/deeplabv3_resnet101_coco_torch.md) - [deeplabv3-resnet50-coco-torch](https://docs.voxel51.com/model_zoo/models/deeplabv3_resnet50_coco_torch.md) - [deepseek-ai/DeepSeek-OCR](https://docs.voxel51.com/model_zoo/models/deepseek_ai_DeepSeek_OCR.md) - [densenet121-imagenet-torch](https://docs.voxel51.com/model_zoo/models/densenet121_imagenet_torch.md) - [densenet161-imagenet-torch](https://docs.voxel51.com/model_zoo/models/densenet161_imagenet_torch.md) - [densenet169-imagenet-torch](https://docs.voxel51.com/model_zoo/models/densenet169_imagenet_torch.md) - [densenet201-imagenet-torch](https://docs.voxel51.com/model_zoo/models/densenet201_imagenet_torch.md) - [depth-anything-v2-base-torch](https://docs.voxel51.com/model_zoo/models/depth_anything_v2_base_torch.md) - [depth-anything-v2-large-torch](https://docs.voxel51.com/model_zoo/models/depth_anything_v2_large_torch.md) - [depth-anything-v2-small-torch](https://docs.voxel51.com/model_zoo/models/depth_anything_v2_small_torch.md) - 
[depth-estimation-transformer-torch](https://docs.voxel51.com/model_zoo/models/depth_estimation_transformer_torch.md) - [detection-transformer-torch](https://docs.voxel51.com/model_zoo/models/detection_transformer_torch.md) - [dfine-large-coco-torch](https://docs.voxel51.com/model_zoo/models/dfine_large_coco_torch.md) - [dfine-medium-coco-torch](https://docs.voxel51.com/model_zoo/models/dfine_medium_coco_torch.md) - [dfine-nano-coco-torch](https://docs.voxel51.com/model_zoo/models/dfine_nano_coco_torch.md) - [dfine-small-coco-torch](https://docs.voxel51.com/model_zoo/models/dfine_small_coco_torch.md) - [dfine-xlarge-coco-torch](https://docs.voxel51.com/model_zoo/models/dfine_xlarge_coco_torch.md) - [dinov2-vitb14-reg-torch](https://docs.voxel51.com/model_zoo/models/dinov2_vitb14_reg_torch.md) - [dinov2-vitb14-torch](https://docs.voxel51.com/model_zoo/models/dinov2_vitb14_torch.md) - [dinov2-vitg14-reg-torch](https://docs.voxel51.com/model_zoo/models/dinov2_vitg14_reg_torch.md) - [dinov2-vitg14-torch](https://docs.voxel51.com/model_zoo/models/dinov2_vitg14_torch.md) - [dinov2-vitl14-reg-torch](https://docs.voxel51.com/model_zoo/models/dinov2_vitl14_reg_torch.md) - [dinov2-vitl14-torch](https://docs.voxel51.com/model_zoo/models/dinov2_vitl14_torch.md) - [dinov2-vits14-reg-torch](https://docs.voxel51.com/model_zoo/models/dinov2_vits14_reg_torch.md) - [dinov2-vits14-torch](https://docs.voxel51.com/model_zoo/models/dinov2_vits14_torch.md) - [efficientdet-d0-512-coco-tf2](https://docs.voxel51.com/model_zoo/models/efficientdet_d0_512_coco_tf2.md) - [efficientdet-d0-coco-tf1](https://docs.voxel51.com/model_zoo/models/efficientdet_d0_coco_tf1.md) - [efficientdet-d1-640-coco-tf2](https://docs.voxel51.com/model_zoo/models/efficientdet_d1_640_coco_tf2.md) - [efficientdet-d1-coco-tf1](https://docs.voxel51.com/model_zoo/models/efficientdet_d1_coco_tf1.md) - [efficientdet-d2-768-coco-tf2](https://docs.voxel51.com/model_zoo/models/efficientdet_d2_768_coco_tf2.md) - 
[efficientdet-d2-coco-tf1](https://docs.voxel51.com/model_zoo/models/efficientdet_d2_coco_tf1.md) - [efficientdet-d3-896-coco-tf2](https://docs.voxel51.com/model_zoo/models/efficientdet_d3_896_coco_tf2.md) - [efficientdet-d3-coco-tf1](https://docs.voxel51.com/model_zoo/models/efficientdet_d3_coco_tf1.md) - [efficientdet-d4-1024-coco-tf2](https://docs.voxel51.com/model_zoo/models/efficientdet_d4_1024_coco_tf2.md) - [efficientdet-d4-coco-tf1](https://docs.voxel51.com/model_zoo/models/efficientdet_d4_coco_tf1.md) - [efficientdet-d5-1280-coco-tf2](https://docs.voxel51.com/model_zoo/models/efficientdet_d5_1280_coco_tf2.md) - [efficientdet-d5-coco-tf1](https://docs.voxel51.com/model_zoo/models/efficientdet_d5_coco_tf1.md) - [efficientdet-d6-1280-coco-tf2](https://docs.voxel51.com/model_zoo/models/efficientdet_d6_1280_coco_tf2.md) - [efficientdet-d6-coco-tf1](https://docs.voxel51.com/model_zoo/models/efficientdet_d6_coco_tf1.md) - [efficientdet-d7-1536-coco-tf2](https://docs.voxel51.com/model_zoo/models/efficientdet_d7_1536_coco_tf2.md) - [efficientnet-b0-imagenet-torch](https://docs.voxel51.com/model_zoo/models/efficientnet_b0_imagenet_torch.md) - [efficientnet-b1-imagenet-torch](https://docs.voxel51.com/model_zoo/models/efficientnet_b1_imagenet_torch.md) - [efficientnet-b2-imagenet-torch](https://docs.voxel51.com/model_zoo/models/efficientnet_b2_imagenet_torch.md) - [efficientnet-b3-imagenet-torch](https://docs.voxel51.com/model_zoo/models/efficientnet_b3_imagenet_torch.md) - [efficientnet-b4-imagenet-torch](https://docs.voxel51.com/model_zoo/models/efficientnet_b4_imagenet_torch.md) - [efficientnet-b5-imagenet-torch](https://docs.voxel51.com/model_zoo/models/efficientnet_b5_imagenet_torch.md) - [efficientnet-b6-imagenet-torch](https://docs.voxel51.com/model_zoo/models/efficientnet_b6_imagenet_torch.md) - [efficientnet-b7-imagenet-torch](https://docs.voxel51.com/model_zoo/models/efficientnet_b7_imagenet_torch.md) - 
[facebook/VGGT-1B](https://docs.voxel51.com/model_zoo/models/facebook_VGGT_1B.md) - [facebook/sam3](https://docs.voxel51.com/model_zoo/models/facebook_sam3.md) - [faster-rcnn-inception-resnet-atrous-v2-coco-tf](https://docs.voxel51.com/model_zoo/models/faster_rcnn_inception_resnet_atrous_v2_coco_tf.md) - [faster-rcnn-inception-resnet-atrous-v2-lowproposals-coco-tf](https://docs.voxel51.com/model_zoo/models/faster_rcnn_inception_resnet_atrous_v2_lowproposals_coco_tf.md) - [faster-rcnn-inception-v2-coco-tf](https://docs.voxel51.com/model_zoo/models/faster_rcnn_inception_v2_coco_tf.md) - [faster-rcnn-nas-coco-tf](https://docs.voxel51.com/model_zoo/models/faster_rcnn_nas_coco_tf.md) - [faster-rcnn-nas-lowproposals-coco-tf](https://docs.voxel51.com/model_zoo/models/faster_rcnn_nas_lowproposals_coco_tf.md) - [faster-rcnn-resnet101-coco-tf](https://docs.voxel51.com/model_zoo/models/faster_rcnn_resnet101_coco_tf.md) - [faster-rcnn-resnet101-lowproposals-coco-tf](https://docs.voxel51.com/model_zoo/models/faster_rcnn_resnet101_lowproposals_coco_tf.md) - [faster-rcnn-resnet50-coco-tf](https://docs.voxel51.com/model_zoo/models/faster_rcnn_resnet50_coco_tf.md) - [faster-rcnn-resnet50-fpn-coco-torch](https://docs.voxel51.com/model_zoo/models/faster_rcnn_resnet50_fpn_coco_torch.md) - [faster-rcnn-resnet50-lowproposals-coco-tf](https://docs.voxel51.com/model_zoo/models/faster_rcnn_resnet50_lowproposals_coco_tf.md) - [fcn-resnet101-coco-torch](https://docs.voxel51.com/model_zoo/models/fcn_resnet101_coco_torch.md) - [fcn-resnet50-coco-torch](https://docs.voxel51.com/model_zoo/models/fcn_resnet50_coco_torch.md) - [google/Gemini-Vision](https://docs.voxel51.com/model_zoo/models/google_Gemini_Vision.md) - [google/medgemma-1.5-4b-it](https://docs.voxel51.com/model_zoo/models/google_medgemma_1_5_4b_it.md) - [google/medgemma-4b-it](https://docs.voxel51.com/model_zoo/models/google_medgemma_4b_it.md) - [google/medsiglip-448](https://docs.voxel51.com/model_zoo/models/google_medsiglip_448.md) 
- [google/paligemma2-10b-mix-224](https://docs.voxel51.com/model_zoo/models/google_paligemma2_10b_mix_224.md) - [google/paligemma2-10b-mix-448](https://docs.voxel51.com/model_zoo/models/google_paligemma2_10b_mix_448.md) - [google/paligemma2-28b-mix-224](https://docs.voxel51.com/model_zoo/models/google_paligemma2_28b_mix_224.md) - [google/paligemma2-28b-mix-448](https://docs.voxel51.com/model_zoo/models/google_paligemma2_28b_mix_448.md) - [google/paligemma2-3b-mix-224](https://docs.voxel51.com/model_zoo/models/google_paligemma2_3b_mix_224.md) - [google/paligemma2-3b-mix-448](https://docs.voxel51.com/model_zoo/models/google_paligemma2_3b_mix_448.md) - [google/siglip2-base-patch16-224](https://docs.voxel51.com/model_zoo/models/google_siglip2_base_patch16_224.md) - [google/siglip2-base-patch16-256](https://docs.voxel51.com/model_zoo/models/google_siglip2_base_patch16_256.md) - [google/siglip2-base-patch16-384](https://docs.voxel51.com/model_zoo/models/google_siglip2_base_patch16_384.md) - [google/siglip2-base-patch16-512](https://docs.voxel51.com/model_zoo/models/google_siglip2_base_patch16_512.md) - [google/siglip2-base-patch16-naflex](https://docs.voxel51.com/model_zoo/models/google_siglip2_base_patch16_naflex.md) - [google/siglip2-base-patch32-256](https://docs.voxel51.com/model_zoo/models/google_siglip2_base_patch32_256.md) - [google/siglip2-giant-opt-patch16-256](https://docs.voxel51.com/model_zoo/models/google_siglip2_giant_opt_patch16_256.md) - [google/siglip2-giant-opt-patch16-384](https://docs.voxel51.com/model_zoo/models/google_siglip2_giant_opt_patch16_384.md) - [google/siglip2-large-patch16-256](https://docs.voxel51.com/model_zoo/models/google_siglip2_large_patch16_256.md) - [google/siglip2-large-patch16-384](https://docs.voxel51.com/model_zoo/models/google_siglip2_large_patch16_384.md) - [google/siglip2-large-patch16-512](https://docs.voxel51.com/model_zoo/models/google_siglip2_large_patch16_512.md) - 
[google/siglip2-so400m-patch14-224](https://docs.voxel51.com/model_zoo/models/google_siglip2_so400m_patch14_224.md) - [google/siglip2-so400m-patch14-384](https://docs.voxel51.com/model_zoo/models/google_siglip2_so400m_patch14_384.md) - [google/siglip2-so400m-patch16-256](https://docs.voxel51.com/model_zoo/models/google_siglip2_so400m_patch16_256.md) - [google/siglip2-so400m-patch16-384](https://docs.voxel51.com/model_zoo/models/google_siglip2_so400m_patch16_384.md) - [google/siglip2-so400m-patch16-512](https://docs.voxel51.com/model_zoo/models/google_siglip2_so400m_patch16_512.md) - [google/siglip2-so400m-patch16-naflex](https://docs.voxel51.com/model_zoo/models/google_siglip2_so400m_patch16_naflex.md) - [googlenet-imagenet-torch](https://docs.voxel51.com/model_zoo/models/googlenet_imagenet_torch.md) - [group-vit-segmentation-transformer-torch](https://docs.voxel51.com/model_zoo/models/group_vit_segmentation_transformer_torch.md) - [inception-resnet-v2-imagenet-tf1](https://docs.voxel51.com/model_zoo/models/inception_resnet_v2_imagenet_tf1.md) - [inception-v3-imagenet-torch](https://docs.voxel51.com/model_zoo/models/inception_v3_imagenet_torch.md) - [inception-v4-imagenet-tf1](https://docs.voxel51.com/model_zoo/models/inception_v4_imagenet_tf1.md) - [jinaai/jina-embeddings-v4](https://docs.voxel51.com/model_zoo/models/jinaai_jina_embeddings_v4.md) - [keypoint-rcnn-resnet50-fpn-coco-torch](https://docs.voxel51.com/model_zoo/models/keypoint_rcnn_resnet50_fpn_coco_torch.md) - [lightonai/LightOnOCR-2-1B](https://docs.voxel51.com/model_zoo/models/lightonai_LightOnOCR_2_1B.md) - [llamaindex/vdr-2b-multi-v1](https://docs.voxel51.com/model_zoo/models/llamaindex_vdr_2b_multi_v1.md) - [llamaindex/vdr-2b-v1](https://docs.voxel51.com/model_zoo/models/llamaindex_vdr_2b_v1.md) - [llmdet-base-torch](https://docs.voxel51.com/model_zoo/models/llmdet_base_torch.md) - [llmdet-large-torch](https://docs.voxel51.com/model_zoo/models/llmdet_large_torch.md) - 
[llmdet-tiny-torch](https://docs.voxel51.com/model_zoo/models/llmdet_tiny_torch.md) - [mask-rcnn-inception-resnet-v2-atrous-coco-tf](https://docs.voxel51.com/model_zoo/models/mask_rcnn_inception_resnet_v2_atrous_coco_tf.md) - [mask-rcnn-inception-v2-coco-tf](https://docs.voxel51.com/model_zoo/models/mask_rcnn_inception_v2_coco_tf.md) - [mask-rcnn-resnet101-atrous-coco-tf](https://docs.voxel51.com/model_zoo/models/mask_rcnn_resnet101_atrous_coco_tf.md) - [mask-rcnn-resnet50-atrous-coco-tf](https://docs.voxel51.com/model_zoo/models/mask_rcnn_resnet50_atrous_coco_tf.md) - [mask-rcnn-resnet50-fpn-coco-torch](https://docs.voxel51.com/model_zoo/models/mask_rcnn_resnet50_fpn_coco_torch.md) - [med-sam-2-video-torch](https://docs.voxel51.com/model_zoo/models/med_sam_2_video_torch.md) - [medsiglip-448-zero-torch](https://docs.voxel51.com/model_zoo/models/medsiglip_448_zero_torch.md) - [microsoft/Florence-2-base](https://docs.voxel51.com/model_zoo/models/microsoft_Florence_2_base.md) - [microsoft/Florence-2-base-ft](https://docs.voxel51.com/model_zoo/models/microsoft_Florence_2_base_ft.md) - [microsoft/Florence-2-large](https://docs.voxel51.com/model_zoo/models/microsoft_Florence_2_large.md) - [microsoft/Florence-2-large-ft](https://docs.voxel51.com/model_zoo/models/microsoft_Florence_2_large_ft.md) - [microsoft/GUI-Actor-3B-Qwen2.5-VL](https://docs.voxel51.com/model_zoo/models/microsoft_GUI_Actor_3B_Qwen2_5_VL.md) - [microsoft/GUI-Actor-7B-Qwen2.5-VL](https://docs.voxel51.com/model_zoo/models/microsoft_GUI_Actor_7B_Qwen2_5_VL.md) - [microsoft/kosmos-2.5](https://docs.voxel51.com/model_zoo/models/microsoft_kosmos_2_5.md) - [mnasnet0.5-imagenet-torch](https://docs.voxel51.com/model_zoo/models/mnasnet0_5_imagenet_torch.md) - [mnasnet1.0-imagenet-torch](https://docs.voxel51.com/model_zoo/models/mnasnet1_0_imagenet_torch.md) - [mobilenet-v2-imagenet-tf1](https://docs.voxel51.com/model_zoo/models/mobilenet_v2_imagenet_tf1.md) - 
[mobilenet-v2-imagenet-torch](https://docs.voxel51.com/model_zoo/models/mobilenet_v2_imagenet_torch.md) - [monet-zero-torch](https://docs.voxel51.com/model_zoo/models/monet_zero_torch.md) - [moondream/moondream3-preview](https://docs.voxel51.com/model_zoo/models/moondream_moondream3_preview.md) - [moonshotai/Kimi-VL-A3B-Instruct](https://docs.voxel51.com/model_zoo/models/moonshotai_Kimi_VL_A3B_Instruct.md) - [moonshotai/Kimi-VL-A3B-Thinking](https://docs.voxel51.com/model_zoo/models/moonshotai_Kimi_VL_A3B_Thinking.md) - [moonshotai/Kimi-VL-A3B-Thinking-2506](https://docs.voxel51.com/model_zoo/models/moonshotai_Kimi_VL_A3B_Thinking_2506.md) - [nanonets/Nanonets-OCR2-3B](https://docs.voxel51.com/model_zoo/models/nanonets_Nanonets_OCR2_3B.md) - [nomic-ai/nomic-embed-multimodal-3b](https://docs.voxel51.com/model_zoo/models/nomic_ai_nomic_embed_multimodal_3b.md) - [nomic-ai/nomic-embed-multimodal-7b](https://docs.voxel51.com/model_zoo/models/nomic_ai_nomic_embed_multimodal_7b.md) - [nv_labs/c-radio_v3-b](https://docs.voxel51.com/model_zoo/models/nv_labs_c_radio_v3_b.md) - [nv_labs/c-radio_v3-g](https://docs.voxel51.com/model_zoo/models/nv_labs_c_radio_v3_g.md) - [nv_labs/c-radio_v3-h](https://docs.voxel51.com/model_zoo/models/nv_labs_c_radio_v3_h.md) - [nv_labs/c-radio_v3-l](https://docs.voxel51.com/model_zoo/models/nv_labs_c_radio_v3_l.md) - [nv_labs/c-radio_v4-h](https://docs.voxel51.com/model_zoo/models/nv_labs_c_radio_v4_h.md) - [nv_labs/c-radio_v4-so400m](https://docs.voxel51.com/model_zoo/models/nv_labs_c_radio_v4_so400m.md) - [nvidia/Llama-3.1-Nemotron-Nano-VL-8B-V1](https://docs.voxel51.com/model_zoo/models/nvidia_Llama_3_1_Nemotron_Nano_VL_8B_V1.md) - [omdet-turbo-swin-tiny-torch](https://docs.voxel51.com/model_zoo/models/omdet_turbo_swin_tiny_torch.md) - [open-clip-torch](https://docs.voxel51.com/model_zoo/models/open_clip_torch.md) - [openbmb/MiniCPM-V-4_5](https://docs.voxel51.com/model_zoo/models/openbmb_MiniCPM_V_4_5.md) - 
[opendatalab/MinerU2.5-2509-1.2B](https://docs.voxel51.com/model_zoo/models/opendatalab_MinerU2_5_2509_1_2B.md) - [owlvit-base-patch16-torch](https://docs.voxel51.com/model_zoo/models/owlvit_base_patch16_torch.md) - [owlvit-base-patch32-torch](https://docs.voxel51.com/model_zoo/models/owlvit_base_patch32_torch.md) - [owlvit-large-patch14-torch](https://docs.voxel51.com/model_zoo/models/owlvit_large_patch14_torch.md) - [Plugin model cards](https://docs.voxel51.com/model_zoo/models/plugin_model_cards.md) - [pose-estimation-transformer-torch](https://docs.voxel51.com/model_zoo/models/pose_estimation_transformer_torch.md) - [pubmed-clip-vit-base-patch32](https://docs.voxel51.com/model_zoo/models/pubmed_clip_vit_base_patch32.md) - [qwen3-vl-2b-instruct-torch](https://docs.voxel51.com/model_zoo/models/qwen3_vl_2b_instruct_torch.md) - [qwen3-vl-4b-instruct-torch](https://docs.voxel51.com/model_zoo/models/qwen3_vl_4b_instruct_torch.md) - [qwen3-vl-8b-instruct-torch](https://docs.voxel51.com/model_zoo/models/qwen3_vl_8b_instruct_torch.md) - [qwen3-vl-embedding-2b-torch](https://docs.voxel51.com/model_zoo/models/qwen3_vl_embedding_2b_torch.md) - [qwen3-vl-embedding-8b-torch](https://docs.voxel51.com/model_zoo/models/qwen3_vl_embedding_8b_torch.md) - [resnet101-imagenet-torch](https://docs.voxel51.com/model_zoo/models/resnet101_imagenet_torch.md) - [resnet152-imagenet-torch](https://docs.voxel51.com/model_zoo/models/resnet152_imagenet_torch.md) - [resnet18-imagenet-torch](https://docs.voxel51.com/model_zoo/models/resnet18_imagenet_torch.md) - [resnet34-imagenet-torch](https://docs.voxel51.com/model_zoo/models/resnet34_imagenet_torch.md) - [resnet50-imagenet-torch](https://docs.voxel51.com/model_zoo/models/resnet50_imagenet_torch.md) - [resnet-v1-50-imagenet-tf1](https://docs.voxel51.com/model_zoo/models/resnet_v1_50_imagenet_tf1.md) - [resnet-v2-50-imagenet-tf1](https://docs.voxel51.com/model_zoo/models/resnet_v2_50_imagenet_tf1.md) - 
[resnext101-32x8d-imagenet-torch](https://docs.voxel51.com/model_zoo/models/resnext101_32x8d_imagenet_torch.md) - [resnext50-32x4d-imagenet-torch](https://docs.voxel51.com/model_zoo/models/resnext50_32x4d_imagenet_torch.md) - [retinanet-resnet50-fpn-coco-torch](https://docs.voxel51.com/model_zoo/models/retinanet_resnet50_fpn_coco_torch.md) - [rfcn-resnet101-coco-tf](https://docs.voxel51.com/model_zoo/models/rfcn_resnet101_coco_tf.md) - [rtdetr-l-coco-torch](https://docs.voxel51.com/model_zoo/models/rtdetr_l_coco_torch.md) - [rtdetr-v2-m-coco-torch](https://docs.voxel51.com/model_zoo/models/rtdetr_v2_m_coco_torch.md) - [rtdetr-v2-s-coco-torch](https://docs.voxel51.com/model_zoo/models/rtdetr_v2_s_coco_torch.md) - [rtdetr-x-coco-torch](https://docs.voxel51.com/model_zoo/models/rtdetr_x_coco_torch.md) - [segformer-b0-ade20k-torch](https://docs.voxel51.com/model_zoo/models/segformer_b0_ade20k_torch.md) - [segformer-b1-ade20k-torch](https://docs.voxel51.com/model_zoo/models/segformer_b1_ade20k_torch.md) - [segformer-b2-ade20k-torch](https://docs.voxel51.com/model_zoo/models/segformer_b2_ade20k_torch.md) - [segformer-b3-ade20k-torch](https://docs.voxel51.com/model_zoo/models/segformer_b3_ade20k_torch.md) - [segformer-b4-ade20k-torch](https://docs.voxel51.com/model_zoo/models/segformer_b4_ade20k_torch.md) - [segformer-b5-ade20k-torch](https://docs.voxel51.com/model_zoo/models/segformer_b5_ade20k_torch.md) - [segment-anything-2.1-hiera-base-plus-image-torch](https://docs.voxel51.com/model_zoo/models/segment_anything_2_1_hiera_base_plus_image_torch.md) - [segment-anything-2.1-hiera-base-plus-video-torch](https://docs.voxel51.com/model_zoo/models/segment_anything_2_1_hiera_base_plus_video_torch.md) - [segment-anything-2.1-hiera-large-image-torch](https://docs.voxel51.com/model_zoo/models/segment_anything_2_1_hiera_large_image_torch.md) - [segment-anything-2.1-hiera-large-video-torch](https://docs.voxel51.com/model_zoo/models/segment_anything_2_1_hiera_large_video_torch.md) - 
[segment-anything-2.1-hiera-small-image-torch](https://docs.voxel51.com/model_zoo/models/segment_anything_2_1_hiera_small_image_torch.md) - [segment-anything-2.1-hiera-small-video-torch](https://docs.voxel51.com/model_zoo/models/segment_anything_2_1_hiera_small_video_torch.md) - [segment-anything-2.1-hiera-tiny-image-torch](https://docs.voxel51.com/model_zoo/models/segment_anything_2_1_hiera_tiny_image_torch.md) - [segment-anything-2.1-hiera-tiny-video-torch](https://docs.voxel51.com/model_zoo/models/segment_anything_2_1_hiera_tiny_video_torch.md) - [segment-anything-2-hiera-base-plus-image-torch](https://docs.voxel51.com/model_zoo/models/segment_anything_2_hiera_base_plus_image_torch.md) - [segment-anything-2-hiera-base-plus-video-torch](https://docs.voxel51.com/model_zoo/models/segment_anything_2_hiera_base_plus_video_torch.md) - [segment-anything-2-hiera-large-image-torch](https://docs.voxel51.com/model_zoo/models/segment_anything_2_hiera_large_image_torch.md) - [segment-anything-2-hiera-large-video-torch](https://docs.voxel51.com/model_zoo/models/segment_anything_2_hiera_large_video_torch.md) - [segment-anything-2-hiera-small-image-torch](https://docs.voxel51.com/model_zoo/models/segment_anything_2_hiera_small_image_torch.md) - [segment-anything-2-hiera-small-video-torch](https://docs.voxel51.com/model_zoo/models/segment_anything_2_hiera_small_video_torch.md) - [segment-anything-2-hiera-tiny-image-torch](https://docs.voxel51.com/model_zoo/models/segment_anything_2_hiera_tiny_image_torch.md) - [segment-anything-2-hiera-tiny-video-torch](https://docs.voxel51.com/model_zoo/models/segment_anything_2_hiera_tiny_video_torch.md) - [segment-anything-vitb-torch](https://docs.voxel51.com/model_zoo/models/segment_anything_vitb_torch.md) - [segment-anything-vith-torch](https://docs.voxel51.com/model_zoo/models/segment_anything_vith_torch.md) - [segment-anything-vitl-torch](https://docs.voxel51.com/model_zoo/models/segment_anything_vitl_torch.md) - 
[segmentation-transformer-torch](https://docs.voxel51.com/model_zoo/models/segmentation_transformer_torch.md) - [showlab/ShowUI-2B](https://docs.voxel51.com/model_zoo/models/showlab_ShowUI_2B.md) - [shufflenetv2-0.5x-imagenet-torch](https://docs.voxel51.com/model_zoo/models/shufflenetv2_0_5x_imagenet_torch.md) - [shufflenetv2-1.0x-imagenet-torch](https://docs.voxel51.com/model_zoo/models/shufflenetv2_1_0x_imagenet_torch.md) - [siglip-base-patch16-224-torch](https://docs.voxel51.com/model_zoo/models/siglip_base_patch16_224_torch.md) - [squeezenet-1@1.1.1-imagenet-torch](https://docs.voxel51.com/model_zoo/models/squeezenet_1@1_1_1_imagenet_torch.md) - [squeezenet-imagenet-torch@1.0](https://docs.voxel51.com/model_zoo/models/squeezenet_imagenet_torch@1_0.md) - [ssd-inception-v2-coco-tf](https://docs.voxel51.com/model_zoo/models/ssd_inception_v2_coco_tf.md) - [ssd-mobilenet-v1-coco-tf](https://docs.voxel51.com/model_zoo/models/ssd_mobilenet_v1_coco_tf.md) - [ssd-mobilenet-v1-fpn-640-coco17](https://docs.voxel51.com/model_zoo/models/ssd_mobilenet_v1_fpn_640_coco17.md) - [ssd-mobilenet-v1-fpn-coco-tf](https://docs.voxel51.com/model_zoo/models/ssd_mobilenet_v1_fpn_coco_tf.md) - [ssd-mobilenet-v2-320-coco17](https://docs.voxel51.com/model_zoo/models/ssd_mobilenet_v2_320_coco17.md) - [ssd-resnet50-fpn-coco-tf](https://docs.voxel51.com/model_zoo/models/ssd_resnet50_fpn_coco_tf.md) - [swin-v2-base-torch](https://docs.voxel51.com/model_zoo/models/swin_v2_base_torch.md) - [swin-v2-large-torch](https://docs.voxel51.com/model_zoo/models/swin_v2_large_torch.md) - [swin-v2-small-torch](https://docs.voxel51.com/model_zoo/models/swin_v2_small_torch.md) - [swin-v2-tiny-torch](https://docs.voxel51.com/model_zoo/models/swin_v2_tiny_torch.md) - [vgg11-bn-imagenet-torch](https://docs.voxel51.com/model_zoo/models/vgg11_bn_imagenet_torch.md) - [vgg11-imagenet-torch](https://docs.voxel51.com/model_zoo/models/vgg11_imagenet_torch.md) - 
[vgg13-bn-imagenet-torch](https://docs.voxel51.com/model_zoo/models/vgg13_bn_imagenet_torch.md) - [vgg13-imagenet-torch](https://docs.voxel51.com/model_zoo/models/vgg13_imagenet_torch.md) - [vgg16-bn-imagenet-torch](https://docs.voxel51.com/model_zoo/models/vgg16_bn_imagenet_torch.md) - [vgg16-imagenet-tf1](https://docs.voxel51.com/model_zoo/models/vgg16_imagenet_tf1.md) - [vgg16-imagenet-torch](https://docs.voxel51.com/model_zoo/models/vgg16_imagenet_torch.md) - [vgg19-bn-imagenet-torch](https://docs.voxel51.com/model_zoo/models/vgg19_bn_imagenet_torch.md) - [vgg19-imagenet-torch](https://docs.voxel51.com/model_zoo/models/vgg19_imagenet_torch.md) - [vidore/colpali-v1.3-merged](https://docs.voxel51.com/model_zoo/models/vidore_colpali_v1_3_merged.md) - [vidore/colqwen2.5-v0.2](https://docs.voxel51.com/model_zoo/models/vidore_colqwen2_5_v0_2.md) - [vikhyatk/moondream2](https://docs.voxel51.com/model_zoo/models/vikhyatk_moondream2.md) - [vit-base-patch16-224-imagenet-torch](https://docs.voxel51.com/model_zoo/models/vit_base_patch16_224_imagenet_torch.md) - [vitpose-base-simple-torch](https://docs.voxel51.com/model_zoo/models/vitpose_base_simple_torch.md) - [vitpose-base-torch](https://docs.voxel51.com/model_zoo/models/vitpose_base_torch.md) - [vitpose-plus-base-torch](https://docs.voxel51.com/model_zoo/models/vitpose_plus_base_torch.md) - [vitpose-plus-huge-torch](https://docs.voxel51.com/model_zoo/models/vitpose_plus_huge_torch.md) - [vitpose-plus-large-torch](https://docs.voxel51.com/model_zoo/models/vitpose_plus_large_torch.md) - [vitpose-plus-small-torch](https://docs.voxel51.com/model_zoo/models/vitpose_plus_small_torch.md) - [wide-resnet101-2-imagenet-torch](https://docs.voxel51.com/model_zoo/models/wide_resnet101_2_imagenet_torch.md) - [wide-resnet50-2-imagenet-torch](https://docs.voxel51.com/model_zoo/models/wide_resnet50_2_imagenet_torch.md) - [yolo11l-coco-torch](https://docs.voxel51.com/model_zoo/models/yolo11l_coco_torch.md) - 
[yolo11l-seg-coco-torch](https://docs.voxel51.com/model_zoo/models/yolo11l_seg_coco_torch.md) - [yolo11m-coco-torch](https://docs.voxel51.com/model_zoo/models/yolo11m_coco_torch.md) - [yolo11m-seg-coco-torch](https://docs.voxel51.com/model_zoo/models/yolo11m_seg_coco_torch.md) - [yolo11n-coco-torch](https://docs.voxel51.com/model_zoo/models/yolo11n_coco_torch.md) - [yolo11n-seg-coco-torch](https://docs.voxel51.com/model_zoo/models/yolo11n_seg_coco_torch.md) - [yolo11s-coco-torch](https://docs.voxel51.com/model_zoo/models/yolo11s_coco_torch.md) - [yolo11s-seg-coco-torch](https://docs.voxel51.com/model_zoo/models/yolo11s_seg_coco_torch.md) - [yolo11x-coco-torch](https://docs.voxel51.com/model_zoo/models/yolo11x_coco_torch.md) - [yolo11x-seg-coco-torch](https://docs.voxel51.com/model_zoo/models/yolo11x_seg_coco_torch.md) - [yolo26l-cls-imagenet-torch](https://docs.voxel51.com/model_zoo/models/yolo26l_cls_imagenet_torch.md) - [yolo26l-coco-torch](https://docs.voxel51.com/model_zoo/models/yolo26l_coco_torch.md) - [yolo26l-pose-coco-torch](https://docs.voxel51.com/model_zoo/models/yolo26l_pose_coco_torch.md) - [yolo26l-seg-coco-torch](https://docs.voxel51.com/model_zoo/models/yolo26l_seg_coco_torch.md) - [yolo26m-cls-imagenet-torch](https://docs.voxel51.com/model_zoo/models/yolo26m_cls_imagenet_torch.md) - [yolo26m-coco-torch](https://docs.voxel51.com/model_zoo/models/yolo26m_coco_torch.md) - [yolo26m-pose-coco-torch](https://docs.voxel51.com/model_zoo/models/yolo26m_pose_coco_torch.md) - [yolo26m-seg-coco-torch](https://docs.voxel51.com/model_zoo/models/yolo26m_seg_coco_torch.md) - [yolo26n-cls-imagenet-torch](https://docs.voxel51.com/model_zoo/models/yolo26n_cls_imagenet_torch.md) - [yolo26n-coco-torch](https://docs.voxel51.com/model_zoo/models/yolo26n_coco_torch.md) - [yolo26n-pose-coco-torch](https://docs.voxel51.com/model_zoo/models/yolo26n_pose_coco_torch.md) - [yolo26n-seg-coco-torch](https://docs.voxel51.com/model_zoo/models/yolo26n_seg_coco_torch.md) - 
[yolo26s-cls-imagenet-torch](https://docs.voxel51.com/model_zoo/models/yolo26s_cls_imagenet_torch.md) - [yolo26s-coco-torch](https://docs.voxel51.com/model_zoo/models/yolo26s_coco_torch.md) - [yolo26s-pose-coco-torch](https://docs.voxel51.com/model_zoo/models/yolo26s_pose_coco_torch.md) - [yolo26s-seg-coco-torch](https://docs.voxel51.com/model_zoo/models/yolo26s_seg_coco_torch.md) - [yolo26x-cls-imagenet-torch](https://docs.voxel51.com/model_zoo/models/yolo26x_cls_imagenet_torch.md) - [yolo26x-coco-torch](https://docs.voxel51.com/model_zoo/models/yolo26x_coco_torch.md) - [yolo26x-pose-coco-torch](https://docs.voxel51.com/model_zoo/models/yolo26x_pose_coco_torch.md) - [yolo26x-seg-coco-torch](https://docs.voxel51.com/model_zoo/models/yolo26x_seg_coco_torch.md) - [yolo-nas-torch](https://docs.voxel51.com/model_zoo/models/yolo_nas_torch.md) - [yolo-v2-coco-tf1](https://docs.voxel51.com/model_zoo/models/yolo_v2_coco_tf1.md) - [yoloe11l-seg-torch](https://docs.voxel51.com/model_zoo/models/yoloe11l_seg_torch.md) - [yoloe11m-seg-torch](https://docs.voxel51.com/model_zoo/models/yoloe11m_seg_torch.md) - [yoloe11s-seg-torch](https://docs.voxel51.com/model_zoo/models/yoloe11s_seg_torch.md) - [yoloev8l-seg-torch](https://docs.voxel51.com/model_zoo/models/yoloev8l_seg_torch.md) - [yoloev8m-seg-torch](https://docs.voxel51.com/model_zoo/models/yoloev8m_seg_torch.md) - [yoloev8s-seg-torch](https://docs.voxel51.com/model_zoo/models/yoloev8s_seg_torch.md) - [yolov10l-coco-torch](https://docs.voxel51.com/model_zoo/models/yolov10l_coco_torch.md) - [yolov10m-coco-torch](https://docs.voxel51.com/model_zoo/models/yolov10m_coco_torch.md) - [yolov10n-coco-torch](https://docs.voxel51.com/model_zoo/models/yolov10n_coco_torch.md) - [yolov10s-coco-torch](https://docs.voxel51.com/model_zoo/models/yolov10s_coco_torch.md) - [yolov10x-coco-torch](https://docs.voxel51.com/model_zoo/models/yolov10x_coco_torch.md) - [yolov5l-coco-torch](https://docs.voxel51.com/model_zoo/models/yolov5l_coco_torch.md) 
- [yolov5m-coco-torch](https://docs.voxel51.com/model_zoo/models/yolov5m_coco_torch.md) - [yolov5n-coco-torch](https://docs.voxel51.com/model_zoo/models/yolov5n_coco_torch.md) - [yolov5s-coco-torch](https://docs.voxel51.com/model_zoo/models/yolov5s_coco_torch.md) - [yolov5x-coco-torch](https://docs.voxel51.com/model_zoo/models/yolov5x_coco_torch.md) - [yolov8l-coco-torch](https://docs.voxel51.com/model_zoo/models/yolov8l_coco_torch.md) - [yolov8l-obb-dotav1-torch](https://docs.voxel51.com/model_zoo/models/yolov8l_obb_dotav1_torch.md) - [yolov8l-oiv7-torch](https://docs.voxel51.com/model_zoo/models/yolov8l_oiv7_torch.md) - [yolov8l-seg-coco-torch](https://docs.voxel51.com/model_zoo/models/yolov8l_seg_coco_torch.md) - [yolov8l-world-torch](https://docs.voxel51.com/model_zoo/models/yolov8l_world_torch.md) - [yolov8m-coco-torch](https://docs.voxel51.com/model_zoo/models/yolov8m_coco_torch.md) - [yolov8m-obb-dotav1-torch](https://docs.voxel51.com/model_zoo/models/yolov8m_obb_dotav1_torch.md) - [yolov8m-oiv7-torch](https://docs.voxel51.com/model_zoo/models/yolov8m_oiv7_torch.md) - [yolov8m-seg-coco-torch](https://docs.voxel51.com/model_zoo/models/yolov8m_seg_coco_torch.md) - [yolov8m-world-torch](https://docs.voxel51.com/model_zoo/models/yolov8m_world_torch.md) - [yolov8n-coco-torch](https://docs.voxel51.com/model_zoo/models/yolov8n_coco_torch.md) - [yolov8n-obb-dotav1-torch](https://docs.voxel51.com/model_zoo/models/yolov8n_obb_dotav1_torch.md) - [yolov8n-oiv7-torch](https://docs.voxel51.com/model_zoo/models/yolov8n_oiv7_torch.md) - [yolov8n-seg-coco-torch](https://docs.voxel51.com/model_zoo/models/yolov8n_seg_coco_torch.md) - [yolov8s-coco-torch](https://docs.voxel51.com/model_zoo/models/yolov8s_coco_torch.md) - [yolov8s-obb-dotav1-torch](https://docs.voxel51.com/model_zoo/models/yolov8s_obb_dotav1_torch.md) - [yolov8s-oiv7-torch](https://docs.voxel51.com/model_zoo/models/yolov8s_oiv7_torch.md) - 
[yolov8s-seg-coco-torch](https://docs.voxel51.com/model_zoo/models/yolov8s_seg_coco_torch.md) - [yolov8s-world-torch](https://docs.voxel51.com/model_zoo/models/yolov8s_world_torch.md) - [yolov8x-coco-torch](https://docs.voxel51.com/model_zoo/models/yolov8x_coco_torch.md) - [yolov8x-obb-dotav1-torch](https://docs.voxel51.com/model_zoo/models/yolov8x_obb_dotav1_torch.md) - [yolov8x-oiv7-torch](https://docs.voxel51.com/model_zoo/models/yolov8x_oiv7_torch.md) - [yolov8x-seg-coco-torch](https://docs.voxel51.com/model_zoo/models/yolov8x_seg_coco_torch.md) - [yolov8x-world-torch](https://docs.voxel51.com/model_zoo/models/yolov8x_world_torch.md) - [yolov9c-coco-torch](https://docs.voxel51.com/model_zoo/models/yolov9c_coco_torch.md) - [yolov9c-seg-coco-torch](https://docs.voxel51.com/model_zoo/models/yolov9c_seg_coco_torch.md) - [yolov9e-coco-torch](https://docs.voxel51.com/model_zoo/models/yolov9e_coco_torch.md) - [yolov9e-seg-coco-torch](https://docs.voxel51.com/model_zoo/models/yolov9e_seg_coco_torch.md) - [zai-org/GLM-OCR](https://docs.voxel51.com/model_zoo/models/zai_org_GLM_OCR.md) - [zero-shot-classification-transformer-torch](https://docs.voxel51.com/model_zoo/models/zero_shot_classification_transformer_torch.md) - [zero-shot-detection-transformer-torch](https://docs.voxel51.com/model_zoo/models/zero_shot_detection_transformer_torch.md) ## FiftyOne Brain - [FiftyOne Brain](https://docs.voxel51.com/brain.md) ## Plugins Ecosystem - [Plugins](https://docs.voxel51.com/plugins/index.md) - [Overview](https://docs.voxel51.com/plugins/overview.md) - [Using plugins](https://docs.voxel51.com/plugins/using_plugins.md) - [Developing plugins](https://docs.voxel51.com/plugins/developing_plugins.md) - [Contributing plugins](https://docs.voxel51.com/plugins/contributing_plugins.md) - [API reference](https://docs.voxel51.com/plugins/api/plugins.md) - [plugins.operators](https://docs.voxel51.com/plugins/api/plugins.operators.md) - 
[plugins.operators.model_evaluation](https://docs.voxel51.com/plugins/api/plugins.operators.model_evaluation.md) - [plugins.operators.model_evaluation.utils](https://docs.voxel51.com/plugins/api/plugins.operators.model_evaluation.utils.md) - [plugins.operators.annotation](https://docs.voxel51.com/plugins/api/plugins.operators.annotation.md) - [plugins.operators.group_by](https://docs.voxel51.com/plugins/api/plugins.operators.group_by.md) - [plugins.panels](https://docs.voxel51.com/plugins/api/plugins.panels.md) - [plugins.panels.model_evaluation](https://docs.voxel51.com/plugins/api/plugins.panels.model_evaluation.md) - [plugins.panels.model_evaluation.utils](https://docs.voxel51.com/plugins/api/plugins.panels.model_evaluation.utils.md) - [plugins.utils](https://docs.voxel51.com/plugins/api/plugins.utils.md) - [plugins.utils.model_evaluation](https://docs.voxel51.com/plugins/api/plugins.utils.model_evaluation.md) - [TypeScript API reference](https://docs.voxel51.com/plugins/ts-api.md) - [@fiftyone/state](https://docs.voxel51.com/plugins/api/plugins.fiftyone.state.md) - [@fiftyone/plugins](https://docs.voxel51.com/plugins/api/plugins.fiftyone.plugins.md) - [@fiftyone/operators](https://docs.voxel51.com/plugins/api/plugins.fiftyone.operators.md) - [@fiftyone/spaces](https://docs.voxel51.com/plugins/api/plugins.fiftyone.spaces.md) - [@fiftyone/aggregations](https://docs.voxel51.com/plugins/api/plugins.fiftyone.aggregations.md) - [@fiftyone/relay](https://docs.voxel51.com/plugins/api/plugins.fiftyone.relay.md) - [@fiftyone/utilities](https://docs.voxel51.com/plugins/api/plugins.fiftyone.utilities.md) - [Active Learning](https://docs.voxel51.com/plugins/plugins_ecosystem/active_learning.md): Accelerate your data labeling with Active Learning! When it comes to machine learning, one of the most time-consuming and costly parts of the process...
- [Albumentations Data Augmentation Plugin for FiftyOne](https://docs.voxel51.com/plugins/plugins_ecosystem/albumentations_augmentation.md): Test out any Albumentations data augmentation transform with FiftyOne! Traditionally, data augmentation is performed on-the-fly during training. - [Annotation Plugin](https://docs.voxel51.com/plugins/plugins_ecosystem/annotation.md): Utilities for integrating FiftyOne with annotation tools. A plugin that contains utilities for integrating FiftyOne with annotation tools... - [Media Anonymization Plugin](https://docs.voxel51.com/plugins/plugins_ecosystem/anonymize.md): Anonymize/blur images based on a FiftyOne Detections field. This plugin is a Python plugin that allows you to anonymize media in your... - [Apple SHARP - FiftyOne Model Zoo Integration](https://docs.voxel51.com/plugins/plugins_ecosystem/apple_sharp.md): SHARP is Apple's state-of-the-art model for predicting 3D Gaussian Splats from a single RGB image. This integration brings SHARP to FiftyOne, enabling... - [Audio Loader](https://docs.voxel51.com/plugins/plugins_ecosystem/audio_loader.md): Import your audio datasets as spectrograms into FiftyOne! This FiftyOne plugin is a Python plugin that allows you to load your audio datasets as... - [Audio-to-Image Search Plugin](https://docs.voxel51.com/plugins/plugins_ecosystem/audio_retrieval.md): Find the images in your dataset most similar to an audio file!... - [BDDOIA Safe/Unsafe Action Dataset for FiftyOne](https://docs.voxel51.com/plugins/plugins_ecosystem/bddoia_fiftyone.md): Load and explore the BDDOIA Safe/Unsafe Action dataset via the FiftyOne Zoo. This dataset is designed for benchmarking safety-aware vision-language... - [BiModernVBert for FiftyOne](https://docs.voxel51.com/plugins/plugins_ecosystem/bimodernvbert.md): BiModernVBert is a vision-language model built on the ModernVBert architecture that generates embeddings for both images and text in a shared...
- [Brain Plugin](https://docs.voxel51.com/plugins/plugins_ecosystem/brain.md): Utilities for working with the FiftyOne Brain. A plugin that contains utilities for working with the FiftyOne Brain... - [Caption Viewer - Intelligent VLM Output Viewer for FiftyOne](https://docs.voxel51.com/plugins/plugins_ecosystem/caption_viewer.md): A plugin that intelligently displays and formats VLM (Vision Language Model) outputs and text fields. Perfect for viewing OCR results, receipt... - [Clustering Plugin for FiftyOne](https://docs.voxel51.com/plugins/plugins_ecosystem/clustering.md): Cluster your images using embeddings with FiftyOne and scikit-learn! This plugin provides a FiftyOne App that allows you to cluster your dataset using... - [Automated Clustering](https://docs.voxel51.com/plugins/plugins_ecosystem/clustering_algorithms.md): Find the clusters in your data using some of the best algorithms available! This plugin is a Python plugin that allows you to find the clusters in... - [COCO4GUI Dataset Importer](https://docs.voxel51.com/plugins/plugins_ecosystem/coco4gui_fiftyone.md): Implementing the COCO4GUI dataset type in FiftyOne with importers and exporters. A specialized FiftyOne dataset importer for GUI interaction datasets... - [ColModernVBert for FiftyOne](https://docs.voxel51.com/plugins/plugins_ecosystem/colmodernvbert.md): ColModernVBert is a multi-vector vision-language model built on the ModernVBert architecture that generates ColBERT-style embeddings for both images... - [ColPali v1.3 for FiftyOne](https://docs.voxel51.com/plugins/plugins_ecosystem/colpali_v1_3.md): ColPali is a Vision Language Model based on PaliGemma-3B that generates ColBERT-style multi-vector representations for efficient document retrieval... - [ColQwen2.5-v0.2 for FiftyOne](https://docs.voxel51.com/plugins/plugins_ecosystem/colqwen2_5_v0_2.md): ColQwen2.5 is a Vision Language Model based on Qwen2.5-VL-3B-Instruct that generates ColBERT-style multi-vector representations for efficient document...
- [Interpolation](https://docs.voxel51.com/plugins/plugins_ecosystem/concept_interpolation.md): Find images that best interpolate between two text-based extremes! This plugin allows you to "interpolate" between two text prompts and see the - [Concept Space Traversal Plugin](https://docs.voxel51.com/plugins/plugins_ecosystem/concept_space_traversal.md): Navigate concept space with CLIP, vector search, and FiftyOne! This plugin allows you to "traverse" the concept space of a similarity index - [NVLabs C-RADIOv4 Models for FiftyOne](https://docs.voxel51.com/plugins/plugins_ecosystem/cradiov4.md): C-RADIOv4 performs visual feature extraction whose image embeddings can be used by a downstream model for various tasks. This implementation also... - [Dashboard Plugin](https://docs.voxel51.com/plugins/plugins_ecosystem/dashboard.md): Create your own custom dashboards from within the App. A plugin that enables users to construct custom dashboards that display - [DeepSeek-OCR FiftyOne Zoo Model](https://docs.voxel51.com/plugins/plugins_ecosystem/deepseek_ocr.md): DeepSeek-OCR is a vision-language model designed for optical character recognition with a focus on "contextual optical compression". DeepSeek-OCR is a... - [Delegated Operations Plugin](https://docs.voxel51.com/plugins/plugins_ecosystem/delegated.md): Utilities for managing your delegated operations. A plugin that contains utilities for managing your - [Depth Pro Plugin](https://docs.voxel51.com/plugins/plugins_ecosystem/depth_pro_plugin.md): Perform zero-shot metric monocular depth estimation using the Apple Depth Pro model. A FiftyOne plugin for applying the Apple Depth Pro model to your... - [Double Band Filter Plugin](https://docs.voxel51.com/plugins/plugins_ecosystem/double_band_filter.md): Filter a float-valued field on two numeric ranges simultaneously!
This plugin provides an operator to filter a float-valued field on two ranges - [Label attribute editor](https://docs.voxel51.com/plugins/plugins_ecosystem/edit_label_attributes.md): Edit attributes of your labels directly in the FiftyOne App! This plugin allows you to edit the attributes of selected labels in the - [EgoExOR: An Ego-Exo-Centric Operating Room Dataset for Surgical Activity Understanding](https://docs.voxel51.com/plugins/plugins_ecosystem/egoexor.md): EgoExOR is an Operating Room dataset fusing egocentric and exocentric perspectives for surgical procedures. See here to load it with FiftyOne... - [Emoji Search Plugin](https://docs.voxel51.com/plugins/plugins_ecosystem/emoji_search.md): Semantically search emojis and copy to clipboard! This plugin allows you to search for emojis based on the text you input. - [FastVLM Remote Zoo Models for FiftyOne](https://docs.voxel51.com/plugins/plugins_ecosystem/fast_vlm.md): Integrating FastVLM as a Remote Source Zoo Model for FiftyOne. Apple's FastVLM vision-language models integrated as FiftyOne remote zoo models,... - [FiftyOne VLM Testing Suite](https://docs.voxel51.com/plugins/plugins_ecosystem/fiftyone_agents.md): A comprehensive FiftyOne plugin for testing and evaluating multiple Vision-Language Models (VLMs) with dynamic prompts and built-in evaluation... - [LeRobot v3.0 Dataset Importer for FiftyOne](https://docs.voxel51.com/plugins/plugins_ecosystem/fiftyone_lerobot_importer.md): Import your LeRobot format dataset into FiftyOne format. A FiftyOne importer for [LeRobot v3.0](https://huggingface.co/docs/lerobot/lerobot-dataset-v3)... - [FiftyOne Tile](https://docs.voxel51.com/plugins/plugins_ecosystem/fiftyone_tile.md): Tile your high resolution images to squares for training small object detection models. Tile your images to squares (e.g.
- [FiftyOne Timestamps](https://docs.voxel51.com/plugins/plugins_ecosystem/fiftyone_timestamps.md): Compute datetime-related fields (sunrise, dawn, evening, weekday, ...) from your samples' filenames or creation dates. This plugin provides operators... - [FiftyOne VLM Efficient](https://docs.voxel51.com/plugins/plugins_ecosystem/fiftyone_vlm_efficient.md): Improve VLM training data quality with state-of-the-art dataset pruning and quality techniques. A comprehensive toolkit for improving Vision-Language... - [FiftyOne + Weights & Biases Plugin](https://docs.voxel51.com/plugins/plugins_ecosystem/fiftyone_wandb_plugin.md): This plugin connects FiftyOne datasets with Weights & Biases to enable reproducible, data-centric ML workflows. **Track your computer vision... - [Filter Values Plugin](https://docs.voxel51.com/plugins/plugins_ecosystem/filter_values.md): Filter a field of your FiftyOne dataset by one or more values. A [FiftyOne plugin](https://docs.voxel51.com/plugins/index.html) for filtering - [Florence2 FiftyOne Remote Model Zoo Implementation](https://docs.voxel51.com/plugins/plugins_ecosystem/florence2.md): Implementing Florence2 as a Remote Zoo Model for FiftyOne. This repository provides a FiftyOne Model Zoo implementation for Florence-2, Microsoft's... - [Gemini Vision Plugin](https://docs.voxel51.com/plugins/plugins_ecosystem/gemini_vision_plugin.md): This plugin integrates Google Gemini's multimodal Vision models (e.g., gemini-2.5-flash) into your FiftyOne workflows. Prompt with text and one or more... - [GLM-OCR FiftyOne Implementation](https://docs.voxel51.com/plugins/plugins_ecosystem/glm_ocr.md): GLM-OCR is a lightweight 0.9B vision-language model achieving state-of-the-art document understanding, including formula recognition, table... - [GPT-4 Vision Plugin](https://docs.voxel51.com/plugins/plugins_ecosystem/gpt4_vision.md): Chat with your images using GPT-4 Vision!
On November 6, 2023, OpenAI made - [GUI-Actor FiftyOne Integration](https://docs.voxel51.com/plugins/plugins_ecosystem/gui_actor.md): Implementing Microsoft's GUI Actor as a Remote Zoo Model for FiftyOne. A FiftyOne integration for Microsoft's GUI-Actor vision-language models,... - [Facebook Hiera Video Embeddings Plugins](https://docs.voxel51.com/plugins/plugins_ecosystem/hiera_video_embeddings.md): Compute embeddings for video using Facebook Hiera Models. This plugin allows you to compute embeddings on videos using [Facebook's... - [Hugging Face Hub Plugin](https://docs.voxel51.com/plugins/plugins_ecosystem/huggingface_hub.md): Push FiftyOne datasets to the Hugging Face Hub, and load datasets from the Hub into FiftyOne! A plugin that allows you to push FiftyOne datasets to... - [Image Captioning Plugin](https://docs.voxel51.com/plugins/plugins_ecosystem/image_captioning.md): Caption all your images with state-of-the-art vision-language models! This plugin lets you generate and store captions for your samples using - [Image Deduplication Plugin](https://docs.voxel51.com/plugins/plugins_ecosystem/image_deduplication.md): Find exact and approximate duplicates in your dataset! This plugin is a Python plugin that streamlines image deduplication workflows! - [Image Quality Issues Plugin](https://docs.voxel51.com/plugins/plugins_ecosystem/image_issues.md): Find common image quality issues in your datasets. This plugin is a Python plugin that allows you to find common issues in your - [Image to Video Plugin](https://docs.voxel51.com/plugins/plugins_ecosystem/img_to_video.md): Bring images to life with image to video! This plugin is a Python plugin that allows you to generate videos from images! - [Indexes Plugin](https://docs.voxel51.com/plugins/plugins_ecosystem/indexes.md): Utilities for working with FiftyOne database indexes. A plugin that contains utilities for working with FiftyOne database indexes.
- [I/O Plugin](https://docs.voxel51.com/plugins/plugins_ecosystem/io.md): A collection of import/export utilities. A plugin that contains a collection of helpful import/export utilities. - [Isaac-0.1 FiftyOne Model Zoo Integration](https://docs.voxel51.com/plugins/plugins_ecosystem/isaac0_1.md): Isaac-0.1 is the first in Perceptron AI's family of models built to be the intelligence layer for the physical world. This integration supports various... - [Isaac-0.2 FiftyOne Model Zoo Integration](https://docs.voxel51.com/plugins/plugins_ecosystem/isaac_0_2.md): Isaac-0.2 is Perceptron AI's hybrid-reasoning vision-language model supporting object detection, keypoint detection, OCR, instance segmentation, visual... - [Janus Pro VQA Plugin](https://docs.voxel51.com/plugins/plugins_ecosystem/janus_vqa.md): Run the Janus Pro Models from Deepseek on your FiftyOne dataset. Janus-Pro is an advanced multimodal model designed for both **multimodal... - [Jina Embeddings v4 for FiftyOne](https://docs.voxel51.com/plugins/plugins_ecosystem/jina_embeddings_v4.md): Jina Embeddings v4 is a state-of-the-art Vision Language Model that generates embeddings for both images and text in a shared vector space. Integration... - [Keyword Search Plugin](https://docs.voxel51.com/plugins/plugins_ecosystem/keyword_search.md): Perform keyword search on a specified field! This plugin is a Python plugin that allows you to search through your dataset - [Kimi-VL-A3B FiftyOne Zoo Model](https://docs.voxel51.com/plugins/plugins_ecosystem/kimi_vl_a3b.md): FiftyOne Remotely Sourced Zoo Model integration for Moonshot AI's Kimi-VL-A3B models enabling object detection, keypoint localization, and image... - [Kosmos-2.5 for FiftyOne](https://docs.voxel51.com/plugins/plugins_ecosystem/kosmos2_5.md): Kosmos-2.5 excels at two core tasks: generating spatially-aware text blocks (OCR) and producing structured markdown output from images. A FiftyOne...
- [LightOnOCR-2 FiftyOne Integration](https://docs.voxel51.com/plugins/plugins_ecosystem/lightonocr_2.md): LightOnOCR-2-1B is a compact multilingual VLM that converts document images into clean, naturally ordered text without brittle multi-stage OCR... - [Line 2D](https://docs.voxel51.com/plugins/plugins_ecosystem/line2d.md): Visualize x,y-Points as a line chart. - [Implementing MedGemma as a Remote Zoo Model for FiftyOne](https://docs.voxel51.com/plugins/plugins_ecosystem/medgemma.md): Implementing MedGemma as a Remote Zoo Model for FiftyOne. This repository integrates Google's MedGemma models with FiftyOne, allowing you to easily use... - [Implementing MedGemma 1.5 as a Remote Zoo Model for FiftyOne](https://docs.voxel51.com/plugins/plugins_ecosystem/medgemma_1_5.md): Implementing MedGemma 1.5 as a Remote Zoo Model for FiftyOne. This repository integrates Google's MedGemma models with FiftyOne, allowing you to easily... - [MedSigLIP in FiftyOne](https://docs.voxel51.com/plugins/plugins_ecosystem/medsiglip.md): Implementing MedSigLIP as a Remote Zoo Model for FiftyOne. **MedSigLIP** is a large-scale medical vision-language model developed by Google Health. - [MiMo-VL + FiftyOne Integration](https://docs.voxel51.com/plugins/plugins_ecosystem/mimo_vl.md): Implementing MiMo-VL as a Remote Zoo Model for FiftyOne. A FiftyOne Zoo integration for Xiaomi's MiMo-VL Vision-Language Models, providing... - [MinerU 2.5 for FiftyOne](https://docs.voxel51.com/plugins/plugins_ecosystem/mineru_2_5.md): MinerU2.5 is a 1.2B-parameter vision-language model for efficient high-resolution document parsing. This model can support grounding OCR as well as... - [MiniCPM-V Integration for FiftyOne](https://docs.voxel51.com/plugins/plugins_ecosystem/minicpm_v.md): Integrating MiniCPM-V 4.5 as a Remote Source Zoo Model in FiftyOne. Integrate [MiniCPM-V 4.5](https://github.com/OpenBMB/MiniCPM-V), a powerful 8B...
- [MLflow Experiment Tracking Plugin for FiftyOne](https://docs.voxel51.com/plugins/plugins_ecosystem/mlflow.md): Track model training experiments on your FiftyOne datasets with MLflow! Training models is hard, and bridging the divide between data and models is... - [Model Comparison Plugin](https://docs.voxel51.com/plugins/plugins_ecosystem/model_comparison.md): Compare two object detection models! A [FiftyOne plugin](https://docs.voxel51.com/plugins/index.html) for comparing two object - [Molmo2 - FiftyOne Model Zoo Integration](https://docs.voxel51.com/plugins/plugins_ecosystem/molmo2.md): Molmo2 is a family of open vision-language models developed by the Allen Institute for AI (Ai2) that support image, video, and multi-image... - [Moondream2 FiftyOne Remote Zoo Model Implementation](https://docs.voxel51.com/plugins/plugins_ecosystem/moondream2.md): Moondream2 implementation as a remotely sourced zoo model for FiftyOne. Moondream2 is a powerful vision-language model that can be used with FiftyOne... - [Moondream3 FiftyOne Zoo Model](https://docs.voxel51.com/plugins/plugins_ecosystem/moondream3.md): Moondream 3 (Preview) is a vision-language model with a mixture-of-experts architecture (9B total parameters, 2B active). This model makes no... - [Multi Annotator Toolkit](https://docs.voxel51.com/plugins/plugins_ecosystem/multi_annotator_toolkit.md): Tackle noisy annotations! Find and analyze annotation issues in datasets with multiple annotators per image. This is a plugin for the [FiftyOne... - [Multimodal RAG with FiftyOne, LlamaIndex, and Milvus](https://docs.voxel51.com/plugins/plugins_ecosystem/multimodal_rag.md): Create and test multimodal RAG pipelines with LlamaIndex, Milvus, and FiftyOne! Retrieval augmented generation (RAG) has grown increasingly popular...
- [Nanonets-OCR2 for FiftyOne](https://docs.voxel51.com/plugins/plugins_ecosystem/nanonets_ocr2.md): Nanonets-OCR2 transforms documents into structured markdown with intelligent content recognition and semantic tagging, making it ideal for downstream... - [FiftyOne NeMo Retriever Parse Plugin](https://docs.voxel51.com/plugins/plugins_ecosystem/nemo_retriever_parse_plugin.md): Implementing NVIDIA NeMo Retriever Parse as a FiftyOne Plugin. A FiftyOne plugin that integrates NVIDIA's NeMo Retriever Parse model to detect and... - [Nemotron Nano VL - FiftyOne Remote Source Zoo Model](https://docs.voxel51.com/plugins/plugins_ecosystem/nemotron_nano_vl.md): Implementing Llama-3.1-Nemotron-Nano-VL-8B-V1 as a Remote Zoo Model for FiftyOne. NVIDIA's Llama-3.1-Nemotron-Nano-VL-8B-V1 integrated as a Remote... - [Nomic Embed Multimodal for FiftyOne](https://docs.voxel51.com/plugins/plugins_ecosystem/nomic_embed_multimodal.md): Nomic Embed Multimodal is a family of vision-language models built on Qwen2.5-VL that generates high-dimensional embeddings for both images and text in... - [NVLabs C-RADIO Models for FiftyOne](https://docs.voxel51.com/plugins/plugins_ecosystem/nvlabs_cradiov3.md): Implementing NVLabs C-RADIOv3 Embeddings Model as Remotely Sourced Zoo Model for FiftyOne. This repository provides FiftyOne integration for C-RADIO... - [olmOCR-2 FiftyOne Integration](https://docs.voxel51.com/plugins/plugins_ecosystem/olmocr_2.md): olmOCR-2 is a state-of-the-art OCR model built on Qwen2.5-VL architecture that extracts text from document images with high accuracy. A FiftyOne plugin... - [Optimal Confidence Threshold Finder](https://docs.voxel51.com/plugins/plugins_ecosystem/optimal_confidence_threshold.md): Find the optimal confidence threshold for your detection models automatically! This plugin is a Python plugin that allows you to find the optimal...
- [OS-Atlas FiftyOne Integration](https://docs.voxel51.com/plugins/plugins_ecosystem/os_atlas.md): Integrating OS-Atlas Base into FiftyOne as a Remote Source Zoo Model. A robust FiftyOne model integration for OS-Atlas vision-language models, designed... - [Outlier Detection](https://docs.voxel51.com/plugins/plugins_ecosystem/outlier_detection.md): Find those troublesome outliers in your dataset automatically! This plugin is a Python plugin that allows you to find the outliers in your dataset! - [PaliGemma2 Mix for FiftyOne](https://docs.voxel51.com/plugins/plugins_ecosystem/paligemma2.md): Implementing PaliGemma-2-Mix as a Remote Zoo Model for FiftyOne. This repository integrates Google DeepMind's PaliGemma2 Mix models into the FiftyOne... - [PDF Loader Plugin](https://docs.voxel51.com/plugins/plugins_ecosystem/pdf_loader.md): Load your PDF documents into FiftyOne as per-page images. A [FiftyOne plugin](https://docs.voxel51.com/plugins/index.html) for loading - [Plotly Map Panel](https://docs.voxel51.com/plugins/plugins_ecosystem/plotly_map_panel.md): Plotly-based Map Panel with adjustable marker cosmetics! An example map panel using PlotlyView. - [Plugin Management & Development](https://docs.voxel51.com/plugins/plugins_ecosystem/plugins.md): Utilities for managing and building FiftyOne plugins. A plugin that contains utilities for managing your FiftyOne plugins and - [PyTesseract Optical Character Recognition Plugin](https://docs.voxel51.com/plugins/plugins_ecosystem/pytesseract_ocr.md): Run optical character recognition with PyTesseract! - **2023-10-19**: Added support for customizing prediction fields, and embedded field for OCR text. - [Qwen2.5-VL FiftyOne Remote Model Zoo Implementation](https://docs.voxel51.com/plugins/plugins_ecosystem/qwen2_5_vl.md): Implementing Qwen2.5-VL as a Remote Zoo Model for FiftyOne. Open source FiftyOne plugin for computer vision workflows.
- [Qwen3-VL-Embedding for FiftyOne](https://docs.voxel51.com/plugins/plugins_ecosystem/qwen3vl_embeddings.md): Qwen3-VL-Embedding maps text, images, and video into a unified representation space, enabling powerful cross-modal retrieval and understanding. A... - [Qwen3-VL Video Model for FiftyOne](https://docs.voxel51.com/plugins/plugins_ecosystem/qwen3vl_video.md): A FiftyOne zoo model integration for Qwen3-VL that enables comprehensive video understanding with multiple label types in a single forward pass and for... - [FiftyOne Rerun Plugin](https://docs.voxel51.com/plugins/plugins_ecosystem/rerun_plugin.md): Visualize Rerun data files (.rrd) inside the FiftyOne App. A plugin that enables users to visualize [Rerun](https://rerun.io/) data files - [Reverse Image Search Plugin ⏪](https://docs.voxel51.com/plugins/plugins_ecosystem/reverse_image_search.md): Find the images in your dataset most similar to an image from your filesystem or the internet! This plugin allows you to search your dataset for images... - [ROI Patches Plugin for FiftyOne](https://docs.voxel51.com/plugins/plugins_ecosystem/roi_patches.md): Tile images into a configurable grid of ROI patches with adjustable overlap for region-based analysis, using FiftyOne's native patches view. A... - [Runs Plugin](https://docs.voxel51.com/plugins/plugins_ecosystem/runs.md): Utilities for managing your custom runs. A plugin that contains utilities for working with - [SAM3 Segment Plugin for FiftyOne](https://docs.voxel51.com/plugins/plugins_ecosystem/sam3_images.md): Integration of Meta's SAM3 (Segment Anything Model 3) into FiftyOne, with full support of text prompts, keypoint prompts, bounding box prompts, auto... - [Sample Inspector Plugin](https://docs.voxel51.com/plugins/plugins_ecosystem/sample_inspector.md): Adjust image brightness and contrast and filter semantic masks by class in a sample detail view! A [FiftyOne](https://github.com/voxel51/fiftyone)...
- [Segments.ai Voxel51 Plugin](https://docs.voxel51.com/plugins/plugins_ecosystem/segments_voxel51_plugin.md): Integrate FiftyOne with the Segments.ai annotation tool! 🚀 New: experimental support for multisensor sequences 🚀 - [Semantic Document Search Plugin](https://docs.voxel51.com/plugins/plugins_ecosystem/semantic_document_search.md): Perform semantic search on text in your documents! This plugin is a Python plugin that allows you to semantically search through your text blocks... - [FiftyOne + Twelve Labs Plugin](https://docs.voxel51.com/plugins/plugins_ecosystem/semantic_video_search.md): Search through your video datasets using FiftyOne Brain and Twelve Labs! Bring multimodal video intelligence into your computer vision workflows with... - [ShowUI Model Integration for FiftyOne](https://docs.voxel51.com/plugins/plugins_ecosystem/showui.md): Integrating ShowUI into FiftyOne as a Remote Source Zoo Model. This repository provides a FiftyOne model integration for ShowUI, a vision-language... - [SigLIP2 for FiftyOne](https://docs.voxel51.com/plugins/plugins_ecosystem/siglip2.md): A FiftyOne Remotely Sourced Zoo Model integration for Google's SigLIP2 model enabling natural language search across images in your FiftyOne dataset... - [Synthetic GUI Samples Plugin for FiftyOne](https://docs.voxel51.com/plugins/plugins_ecosystem/synthetic_gui_samples_plugins.md): A FiftyOne plugin for generating synthetic samples for datasets in COCO4GUI format. A comprehensive FiftyOne plugin for augmenting GUI screenshot... - [Text Evaluation Metrics for FiftyOne](https://docs.voxel51.com/plugins/plugins_ecosystem/text_evaluation_metrics.md): This plugin provides five text evaluation metrics for comparing predictions against ground truth: ANLS, Exact Match, Normalized Similarity, Character... - [Text-to-Image Plugin](https://docs.voxel51.com/plugins/plugins_ecosystem/text_to_image.md): Add synthetic data from prompts with text-to-image models and FiftyOne!
- **2024-04-23**: Added support for Stable Diffusion 3 (Thanks [Dan... - [Hugging Face Transformers Plugin](https://docs.voxel51.com/plugins/plugins_ecosystem/transformers.md): Run inference on your datasets using Hugging Face Transformers models! A plugin that allows you to apply transformer models from the Hugging Face Hub... - [Twilio Automation Plugin](https://docs.voxel51.com/plugins/plugins_ecosystem/twilio_automation.md): Automate data ingestion with Twilio! This plugin is a Python plugin that allows you to automate data ingestion with... - [UI-TARS FiftyOne Integration](https://docs.voxel51.com/plugins/plugins_ecosystem/ui_tars.md): Implementing UI-TARS-1.5 as a Remote Zoo Model for FiftyOne. A comprehensive integration of the UI-TARS vision-language model with FiftyOne for GUI... - [Utilities Plugin](https://docs.voxel51.com/plugins/plugins_ecosystem/utils.md): Call your favorite SDK utilities from the App. A plugin that contains common utilities for working with FiftyOne. - [VGGT: Visual Geometry Grounded Transformer FiftyOne Remote Source Zoo Model Integration](https://docs.voxel51.com/plugins/plugins_ecosystem/vggt.md): Implementing Meta AI's VGGT as a FiftyOne Remote Zoo Model. This repository provides a FiftyOne Zoo Model for **VGGT (Visual Geometry Grounded... - [Vision-Document Retrieval (VDR) Model for FiftyOne](https://docs.voxel51.com/plugins/plugins_ecosystem/visual_document_retrieval.md): A FiftyOne Remotely Sourced Zoo Model integration for LlamaIndex's VDR model enabling natural language search across document images, screenshots, and... - [ViTPose Plugin](https://docs.voxel51.com/plugins/plugins_ecosystem/vitpose.md): Run ViTPose Models from Hugging Face on your FiftyOne dataset. This plugin essentially makes it easy to add human pose estimation capabilities to any...
- [VLM Run Plugin](https://docs.voxel51.com/plugins/plugins_ecosystem/vlmrun_voxel51_plugin.md): Extract structured data from visual and audio sources including documents, images, and videos. A plugin that provides operators for extracting... - [VoxelGPT](https://docs.voxel51.com/plugins/plugins_ecosystem/voxelgpt.md): An AI assistant that can query visual datasets, search the FiftyOne docs, and answer general computer vision questions. Wish you could search your... - [Visual Question Answering Plugin](https://docs.voxel51.com/plugins/plugins_ecosystem/vqa_plugin.md): Ask (and answer) open-ended visual questions about your images! - [YouTube Panel Player Plugin](https://docs.voxel51.com/plugins/plugins_ecosystem/youtube_panel_plugin.md): Play YouTube videos in the FiftyOne App! This plugin allows you to play YouTube videos in a panel in the FiftyOne App! - [Zero Shot Prediction Plugin](https://docs.voxel51.com/plugins/plugins_ecosystem/zero_shot_prediction.md): Run zero-shot (open vocabulary) prediction on your data! This plugin allows you to perform zero-shot prediction on your dataset for the following tasks... - [Zoo Plugin](https://docs.voxel51.com/plugins/plugins_ecosystem/zoo.md): Download datasets and run inference with models from the FiftyOne Zoo, all without leaving the App. 
A plugin that contains utilities for working with the...

## FiftyOne Integrations

- [Integrations](https://docs.voxel51.com/integrations/index.md) - [COCO](https://docs.voxel51.com/integrations/coco.md) - [Open Images](https://docs.voxel51.com/integrations/open_images.md) - [ActivityNet](https://docs.voxel51.com/integrations/activitynet.md) - [Integrating with Annotation Backends](https://docs.voxel51.com/integrations/annotation.md) - [CVAT](https://docs.voxel51.com/integrations/cvat.md) - [Label Studio](https://docs.voxel51.com/integrations/labelstudio.md) - [V7](https://docs.voxel51.com/integrations/v7.md) - [Labelbox](https://docs.voxel51.com/integrations/labelbox.md) - [Qdrant](https://docs.voxel51.com/integrations/qdrant.md) - [Redis](https://docs.voxel51.com/integrations/redis.md) - [Pinecone](https://docs.voxel51.com/integrations/pinecone.md) - [MongoDB](https://docs.voxel51.com/integrations/mongodb.md) - [Elasticsearch](https://docs.voxel51.com/integrations/elasticsearch.md) - [PostgreSQL Pgvector](https://docs.voxel51.com/integrations/pgvector.md) - [Databricks Mosaic AI](https://docs.voxel51.com/integrations/mosaic.md) - [Milvus](https://docs.voxel51.com/integrations/milvus.md) - [LanceDB](https://docs.voxel51.com/integrations/lancedb.md) - [Hugging Face](https://docs.voxel51.com/integrations/huggingface.md) - [Ultralytics](https://docs.voxel51.com/integrations/ultralytics.md) - [Albumentations](https://docs.voxel51.com/integrations/albumentations.md) - [SuperGradients](https://docs.voxel51.com/integrations/super_gradients.md) - [OpenCLIP](https://docs.voxel51.com/integrations/openclip.md) - [PyTorch Hub](https://docs.voxel51.com/integrations/pytorch_hub.md) - [Lightning Flash](https://docs.voxel51.com/integrations/lightning_flash.md)

## FiftyOne Command-Line Interface (CLI)

- [CLI](https://docs.voxel51.com/cli/index.md)

## fiftyone

- [API Reference](https://docs.voxel51.com/api/fiftyone.md) - [fiftyone.brain](https://docs.voxel51.com/api/fiftyone.brain.md) 
- [fiftyone.brain.internal](https://docs.voxel51.com/api/fiftyone.brain.internal.md) - [fiftyone.brain.internal.core](https://docs.voxel51.com/api/fiftyone.brain.internal.core.md) - [fiftyone.brain.internal.core.duplicates](https://docs.voxel51.com/api/fiftyone.brain.internal.core.duplicates.md) - [fiftyone.brain.internal.core.elasticsearch](https://docs.voxel51.com/api/fiftyone.brain.internal.core.elasticsearch.md) - [fiftyone.brain.internal.core.hardness](https://docs.voxel51.com/api/fiftyone.brain.internal.core.hardness.md) - [fiftyone.brain.internal.core.lancedb](https://docs.voxel51.com/api/fiftyone.brain.internal.core.lancedb.md) - [fiftyone.brain.internal.core.leaky_splits](https://docs.voxel51.com/api/fiftyone.brain.internal.core.leaky_splits.md) - [fiftyone.brain.internal.core.milvus](https://docs.voxel51.com/api/fiftyone.brain.internal.core.milvus.md) - [fiftyone.brain.internal.core.mistakenness](https://docs.voxel51.com/api/fiftyone.brain.internal.core.mistakenness.md) - [fiftyone.brain.internal.core.mongodb](https://docs.voxel51.com/api/fiftyone.brain.internal.core.mongodb.md) - [fiftyone.brain.internal.core.mosaic](https://docs.voxel51.com/api/fiftyone.brain.internal.core.mosaic.md) - [fiftyone.brain.internal.core.pgvector](https://docs.voxel51.com/api/fiftyone.brain.internal.core.pgvector.md) - [fiftyone.brain.internal.core.pinecone](https://docs.voxel51.com/api/fiftyone.brain.internal.core.pinecone.md) - [fiftyone.brain.internal.core.qdrant](https://docs.voxel51.com/api/fiftyone.brain.internal.core.qdrant.md) - [fiftyone.brain.internal.core.redis](https://docs.voxel51.com/api/fiftyone.brain.internal.core.redis.md) - [fiftyone.brain.internal.core.representativeness](https://docs.voxel51.com/api/fiftyone.brain.internal.core.representativeness.md) - [fiftyone.brain.internal.core.sklearn](https://docs.voxel51.com/api/fiftyone.brain.internal.core.sklearn.md) - 
[fiftyone.brain.internal.core.uniqueness](https://docs.voxel51.com/api/fiftyone.brain.internal.core.uniqueness.md) - [fiftyone.brain.internal.core.utils](https://docs.voxel51.com/api/fiftyone.brain.internal.core.utils.md) - [fiftyone.brain.internal.core.visualization](https://docs.voxel51.com/api/fiftyone.brain.internal.core.visualization.md) - [fiftyone.brain.internal.models](https://docs.voxel51.com/api/fiftyone.brain.internal.models.md) - [fiftyone.brain.internal.models.simple_resnet](https://docs.voxel51.com/api/fiftyone.brain.internal.models.simple_resnet.md) - [fiftyone.brain.internal.models.torch](https://docs.voxel51.com/api/fiftyone.brain.internal.models.torch.md) - [fiftyone.brain.config](https://docs.voxel51.com/api/fiftyone.brain.config.md) - [fiftyone.brain.similarity](https://docs.voxel51.com/api/fiftyone.brain.similarity.md) - [fiftyone.brain.visualization](https://docs.voxel51.com/api/fiftyone.brain.visualization.md) - [fiftyone.core](https://docs.voxel51.com/api/fiftyone.core.md) - [fiftyone.core.annotation](https://docs.voxel51.com/api/fiftyone.core.annotation.md) - [fiftyone.core.annotation.constants](https://docs.voxel51.com/api/fiftyone.core.annotation.constants.md) - [fiftyone.core.annotation.generate_label_schemas](https://docs.voxel51.com/api/fiftyone.core.annotation.generate_label_schemas.md) - [fiftyone.core.annotation.utils](https://docs.voxel51.com/api/fiftyone.core.annotation.utils.md) - [fiftyone.core.annotation.validate_label_schemas](https://docs.voxel51.com/api/fiftyone.core.annotation.validate_label_schemas.md) - [fiftyone.core.map](https://docs.voxel51.com/api/fiftyone.core.map.md) - [fiftyone.core.map.batcher](https://docs.voxel51.com/api/fiftyone.core.map.batcher.md) - [fiftyone.core.map.batcher.batch](https://docs.voxel51.com/api/fiftyone.core.map.batcher.batch.md) - [fiftyone.core.map.batcher.id_batch](https://docs.voxel51.com/api/fiftyone.core.map.batcher.id_batch.md) - 
[fiftyone.core.map.batcher.slice_batch](https://docs.voxel51.com/api/fiftyone.core.map.batcher.slice_batch.md) - [fiftyone.core.map.factory](https://docs.voxel51.com/api/fiftyone.core.map.factory.md) - [fiftyone.core.map.mapper](https://docs.voxel51.com/api/fiftyone.core.map.mapper.md) - [fiftyone.core.map.process](https://docs.voxel51.com/api/fiftyone.core.map.process.md) - [fiftyone.core.map.threading](https://docs.voxel51.com/api/fiftyone.core.map.threading.md) - [fiftyone.core.map.typing](https://docs.voxel51.com/api/fiftyone.core.map.typing.md) - [fiftyone.core.odm](https://docs.voxel51.com/api/fiftyone.core.odm.md) - [fiftyone.core.odm.database](https://docs.voxel51.com/api/fiftyone.core.odm.database.md) - [fiftyone.core.odm.dataset](https://docs.voxel51.com/api/fiftyone.core.odm.dataset.md) - [fiftyone.core.odm.document](https://docs.voxel51.com/api/fiftyone.core.odm.document.md) - [fiftyone.core.odm.embedded_document](https://docs.voxel51.com/api/fiftyone.core.odm.embedded_document.md) - [fiftyone.core.odm.frame](https://docs.voxel51.com/api/fiftyone.core.odm.frame.md) - [fiftyone.core.odm.mixins](https://docs.voxel51.com/api/fiftyone.core.odm.mixins.md) - [fiftyone.core.odm.runs](https://docs.voxel51.com/api/fiftyone.core.odm.runs.md) - [fiftyone.core.odm.sample](https://docs.voxel51.com/api/fiftyone.core.odm.sample.md) - [fiftyone.core.odm.utils](https://docs.voxel51.com/api/fiftyone.core.odm.utils.md) - [fiftyone.core.odm.views](https://docs.voxel51.com/api/fiftyone.core.odm.views.md) - [fiftyone.core.odm.workspace](https://docs.voxel51.com/api/fiftyone.core.odm.workspace.md) - [fiftyone.core.plots](https://docs.voxel51.com/api/fiftyone.core.plots.md) - [fiftyone.core.plots.base](https://docs.voxel51.com/api/fiftyone.core.plots.base.md) - [fiftyone.core.plots.manager](https://docs.voxel51.com/api/fiftyone.core.plots.manager.md) - [fiftyone.core.plots.matplotlib](https://docs.voxel51.com/api/fiftyone.core.plots.matplotlib.md) - 
[fiftyone.core.plots.plotly](https://docs.voxel51.com/api/fiftyone.core.plots.plotly.md) - [fiftyone.core.plots.utils](https://docs.voxel51.com/api/fiftyone.core.plots.utils.md) - [fiftyone.core.plots.views](https://docs.voxel51.com/api/fiftyone.core.plots.views.md) - [fiftyone.core.session](https://docs.voxel51.com/api/fiftyone.core.session.md) - [fiftyone.core.session.client](https://docs.voxel51.com/api/fiftyone.core.session.client.md) - [fiftyone.core.session.events](https://docs.voxel51.com/api/fiftyone.core.session.events.md) - [fiftyone.core.session.notebooks](https://docs.voxel51.com/api/fiftyone.core.session.notebooks.md) - [fiftyone.core.session.session](https://docs.voxel51.com/api/fiftyone.core.session.session.md) - [fiftyone.core.session.templates](https://docs.voxel51.com/api/fiftyone.core.session.templates.md) - [fiftyone.core.threed](https://docs.voxel51.com/api/fiftyone.core.threed.md) - [fiftyone.core.threed.camera](https://docs.voxel51.com/api/fiftyone.core.threed.camera.md) - [fiftyone.core.threed.lights](https://docs.voxel51.com/api/fiftyone.core.threed.lights.md) - [fiftyone.core.threed.material_3d](https://docs.voxel51.com/api/fiftyone.core.threed.material_3d.md) - [fiftyone.core.threed.mesh](https://docs.voxel51.com/api/fiftyone.core.threed.mesh.md) - [fiftyone.core.threed.object_3d](https://docs.voxel51.com/api/fiftyone.core.threed.object_3d.md) - [fiftyone.core.threed.pointcloud](https://docs.voxel51.com/api/fiftyone.core.threed.pointcloud.md) - [fiftyone.core.threed.scene_3d](https://docs.voxel51.com/api/fiftyone.core.threed.scene_3d.md) - [fiftyone.core.threed.shape_3d](https://docs.voxel51.com/api/fiftyone.core.threed.shape_3d.md) - [fiftyone.core.threed.transformation](https://docs.voxel51.com/api/fiftyone.core.threed.transformation.md) - [fiftyone.core.threed.utils](https://docs.voxel51.com/api/fiftyone.core.threed.utils.md) - [fiftyone.core.threed.validators](https://docs.voxel51.com/api/fiftyone.core.threed.validators.md) - 
[fiftyone.core.aggregations](https://docs.voxel51.com/api/fiftyone.core.aggregations.md) - [fiftyone.core.brain](https://docs.voxel51.com/api/fiftyone.core.brain.md) - [fiftyone.core.camera](https://docs.voxel51.com/api/fiftyone.core.camera.md) - [fiftyone.core.cli](https://docs.voxel51.com/api/fiftyone.core.cli.md) - [fiftyone.core.clips](https://docs.voxel51.com/api/fiftyone.core.clips.md) - [fiftyone.core.collections](https://docs.voxel51.com/api/fiftyone.core.collections.md) - [fiftyone.core.config](https://docs.voxel51.com/api/fiftyone.core.config.md) - [fiftyone.core.context](https://docs.voxel51.com/api/fiftyone.core.context.md) - [fiftyone.core.dataset](https://docs.voxel51.com/api/fiftyone.core.dataset.md) - [fiftyone.core.document](https://docs.voxel51.com/api/fiftyone.core.document.md) - [fiftyone.core.evaluation](https://docs.voxel51.com/api/fiftyone.core.evaluation.md) - [fiftyone.core.expressions](https://docs.voxel51.com/api/fiftyone.core.expressions.md) - [fiftyone.core.fields](https://docs.voxel51.com/api/fiftyone.core.fields.md) - [fiftyone.core.frame](https://docs.voxel51.com/api/fiftyone.core.frame.md) - [fiftyone.core.frame_utils](https://docs.voxel51.com/api/fiftyone.core.frame_utils.md) - [fiftyone.core.groups](https://docs.voxel51.com/api/fiftyone.core.groups.md) - [fiftyone.core.json](https://docs.voxel51.com/api/fiftyone.core.json.md) - [fiftyone.core.labels](https://docs.voxel51.com/api/fiftyone.core.labels.md) - [fiftyone.core.logging](https://docs.voxel51.com/api/fiftyone.core.logging.md) - [fiftyone.core.media](https://docs.voxel51.com/api/fiftyone.core.media.md) - [fiftyone.core.metadata](https://docs.voxel51.com/api/fiftyone.core.metadata.md) - [fiftyone.core.models](https://docs.voxel51.com/api/fiftyone.core.models.md) - [fiftyone.core.patches](https://docs.voxel51.com/api/fiftyone.core.patches.md) - [fiftyone.core.runs](https://docs.voxel51.com/api/fiftyone.core.runs.md) - 
[fiftyone.core.sample](https://docs.voxel51.com/api/fiftyone.core.sample.md) - [fiftyone.core.service](https://docs.voxel51.com/api/fiftyone.core.service.md) - [fiftyone.core.singletons](https://docs.voxel51.com/api/fiftyone.core.singletons.md) - [fiftyone.core.stages](https://docs.voxel51.com/api/fiftyone.core.stages.md) - [fiftyone.core.state](https://docs.voxel51.com/api/fiftyone.core.state.md) - [fiftyone.core.storage](https://docs.voxel51.com/api/fiftyone.core.storage.md) - [fiftyone.core.uid](https://docs.voxel51.com/api/fiftyone.core.uid.md) - [fiftyone.core.utils](https://docs.voxel51.com/api/fiftyone.core.utils.md) - [fiftyone.core.validation](https://docs.voxel51.com/api/fiftyone.core.validation.md) - [fiftyone.core.video](https://docs.voxel51.com/api/fiftyone.core.video.md) - [fiftyone.core.view](https://docs.voxel51.com/api/fiftyone.core.view.md) - [fiftyone.factory](https://docs.voxel51.com/api/fiftyone.factory.md) - [fiftyone.factory.repos](https://docs.voxel51.com/api/fiftyone.factory.repos.md) - [fiftyone.factory.repos.delegated_operation](https://docs.voxel51.com/api/fiftyone.factory.repos.delegated_operation.md) - [fiftyone.factory.repos.delegated_operation_doc](https://docs.voxel51.com/api/fiftyone.factory.repos.delegated_operation_doc.md) - [fiftyone.factory.repos.execution_store](https://docs.voxel51.com/api/fiftyone.factory.repos.execution_store.md) - [fiftyone.factory.repo_factory](https://docs.voxel51.com/api/fiftyone.factory.repo_factory.md) - [fiftyone.migrations](https://docs.voxel51.com/api/fiftyone.migrations.md) - [fiftyone.migrations.runner](https://docs.voxel51.com/api/fiftyone.migrations.runner.md) - [fiftyone.operators](https://docs.voxel51.com/api/fiftyone.operators.md) - [fiftyone.operators.cache](https://docs.voxel51.com/api/fiftyone.operators.cache.md) - [fiftyone.operators.cache.decorator](https://docs.voxel51.com/api/fiftyone.operators.cache.decorator.md) - 
[fiftyone.operators.cache.ephemeral](https://docs.voxel51.com/api/fiftyone.operators.cache.ephemeral.md) - [fiftyone.operators.cache.serialization](https://docs.voxel51.com/api/fiftyone.operators.cache.serialization.md) - [fiftyone.operators.cache.utils](https://docs.voxel51.com/api/fiftyone.operators.cache.utils.md) - [fiftyone.operators.store](https://docs.voxel51.com/api/fiftyone.operators.store.md) - [fiftyone.operators.store.models](https://docs.voxel51.com/api/fiftyone.operators.store.models.md) - [fiftyone.operators.store.notification_service](https://docs.voxel51.com/api/fiftyone.operators.store.notification_service.md) - [fiftyone.operators.store.service](https://docs.voxel51.com/api/fiftyone.operators.store.service.md) - [fiftyone.operators.store.store](https://docs.voxel51.com/api/fiftyone.operators.store.store.md) - [fiftyone.operators.store.subscription_registry](https://docs.voxel51.com/api/fiftyone.operators.store.subscription_registry.md) - [fiftyone.operators.categories](https://docs.voxel51.com/api/fiftyone.operators.categories.md) - [fiftyone.operators.constants](https://docs.voxel51.com/api/fiftyone.operators.constants.md) - [fiftyone.operators.decorators](https://docs.voxel51.com/api/fiftyone.operators.decorators.md) - [fiftyone.operators.delegated](https://docs.voxel51.com/api/fiftyone.operators.delegated.md) - [fiftyone.operators.evaluation_metric](https://docs.voxel51.com/api/fiftyone.operators.evaluation_metric.md) - [fiftyone.operators.events](https://docs.voxel51.com/api/fiftyone.operators.events.md) - [fiftyone.operators.executor](https://docs.voxel51.com/api/fiftyone.operators.executor.md) - [fiftyone.operators.message](https://docs.voxel51.com/api/fiftyone.operators.message.md) - [fiftyone.operators.operations](https://docs.voxel51.com/api/fiftyone.operators.operations.md) - [fiftyone.operators.operator](https://docs.voxel51.com/api/fiftyone.operators.operator.md) - 
[fiftyone.operators.panel](https://docs.voxel51.com/api/fiftyone.operators.panel.md) - [fiftyone.operators.permissions](https://docs.voxel51.com/api/fiftyone.operators.permissions.md) - [fiftyone.operators.registry](https://docs.voxel51.com/api/fiftyone.operators.registry.md) - [fiftyone.operators.remote_notifier](https://docs.voxel51.com/api/fiftyone.operators.remote_notifier.md) - [fiftyone.operators.server](https://docs.voxel51.com/api/fiftyone.operators.server.md) - [fiftyone.operators.sse](https://docs.voxel51.com/api/fiftyone.operators.sse.md) - [fiftyone.operators.types](https://docs.voxel51.com/api/fiftyone.operators.types.md) - [fiftyone.operators.utils](https://docs.voxel51.com/api/fiftyone.operators.utils.md) - [fiftyone.plugins](https://docs.voxel51.com/api/fiftyone.plugins.md) - [fiftyone.plugins.constants](https://docs.voxel51.com/api/fiftyone.plugins.constants.md) - [fiftyone.plugins.context](https://docs.voxel51.com/api/fiftyone.plugins.context.md) - [fiftyone.plugins.core](https://docs.voxel51.com/api/fiftyone.plugins.core.md) - [fiftyone.plugins.definitions](https://docs.voxel51.com/api/fiftyone.plugins.definitions.md) - [fiftyone.plugins.secrets](https://docs.voxel51.com/api/fiftyone.plugins.secrets.md) - [fiftyone.plugins.utils](https://docs.voxel51.com/api/fiftyone.plugins.utils.md) - [fiftyone.types](https://docs.voxel51.com/api/fiftyone.types.md) - [fiftyone.types.dataset_types](https://docs.voxel51.com/api/fiftyone.types.dataset_types.md) - [fiftyone.utils](https://docs.voxel51.com/api/fiftyone.utils.md) - [fiftyone.utils.clip](https://docs.voxel51.com/api/fiftyone.utils.clip.md) - [fiftyone.utils.clip.model](https://docs.voxel51.com/api/fiftyone.utils.clip.model.md) - [fiftyone.utils.clip.tokenizer](https://docs.voxel51.com/api/fiftyone.utils.clip.tokenizer.md) - [fiftyone.utils.clip.zoo](https://docs.voxel51.com/api/fiftyone.utils.clip.zoo.md) - [fiftyone.utils.data](https://docs.voxel51.com/api/fiftyone.utils.data.md) - 
[fiftyone.utils.data.base](https://docs.voxel51.com/api/fiftyone.utils.data.base.md) - [fiftyone.utils.data.converters](https://docs.voxel51.com/api/fiftyone.utils.data.converters.md) - [fiftyone.utils.data.exporters](https://docs.voxel51.com/api/fiftyone.utils.data.exporters.md) - [fiftyone.utils.data.importers](https://docs.voxel51.com/api/fiftyone.utils.data.importers.md) - [fiftyone.utils.data.ingestors](https://docs.voxel51.com/api/fiftyone.utils.data.ingestors.md) - [fiftyone.utils.data.parsers](https://docs.voxel51.com/api/fiftyone.utils.data.parsers.md) - [fiftyone.utils.eval](https://docs.voxel51.com/api/fiftyone.utils.eval.md) - [fiftyone.utils.eval.activitynet](https://docs.voxel51.com/api/fiftyone.utils.eval.activitynet.md) - [fiftyone.utils.eval.base](https://docs.voxel51.com/api/fiftyone.utils.eval.base.md) - [fiftyone.utils.eval.classification](https://docs.voxel51.com/api/fiftyone.utils.eval.classification.md) - [fiftyone.utils.eval.coco](https://docs.voxel51.com/api/fiftyone.utils.eval.coco.md) - [fiftyone.utils.eval.detection](https://docs.voxel51.com/api/fiftyone.utils.eval.detection.md) - [fiftyone.utils.eval.openimages](https://docs.voxel51.com/api/fiftyone.utils.eval.openimages.md) - [fiftyone.utils.eval.regression](https://docs.voxel51.com/api/fiftyone.utils.eval.regression.md) - [fiftyone.utils.eval.segmentation](https://docs.voxel51.com/api/fiftyone.utils.eval.segmentation.md) - [fiftyone.utils.tracking](https://docs.voxel51.com/api/fiftyone.utils.tracking.md) - [fiftyone.utils.tracking.deepsort](https://docs.voxel51.com/api/fiftyone.utils.tracking.deepsort.md) - [fiftyone.utils.activitynet](https://docs.voxel51.com/api/fiftyone.utils.activitynet.md) - [fiftyone.utils.annotations](https://docs.voxel51.com/api/fiftyone.utils.annotations.md) - [fiftyone.utils.aws](https://docs.voxel51.com/api/fiftyone.utils.aws.md) - [fiftyone.utils.bdd](https://docs.voxel51.com/api/fiftyone.utils.bdd.md) - 
[fiftyone.utils.beam](https://docs.voxel51.com/api/fiftyone.utils.beam.md) - [fiftyone.utils.caltech](https://docs.voxel51.com/api/fiftyone.utils.caltech.md) - [fiftyone.utils.cityscapes](https://docs.voxel51.com/api/fiftyone.utils.cityscapes.md) - [fiftyone.utils.coco](https://docs.voxel51.com/api/fiftyone.utils.coco.md) - [fiftyone.utils.csv](https://docs.voxel51.com/api/fiftyone.utils.csv.md) - [fiftyone.utils.cvat](https://docs.voxel51.com/api/fiftyone.utils.cvat.md) - [fiftyone.utils.dicom](https://docs.voxel51.com/api/fiftyone.utils.dicom.md) - [fiftyone.utils.eta](https://docs.voxel51.com/api/fiftyone.utils.eta.md) - [fiftyone.utils.fiw](https://docs.voxel51.com/api/fiftyone.utils.fiw.md) - [fiftyone.utils.flash](https://docs.voxel51.com/api/fiftyone.utils.flash.md) - [fiftyone.utils.geojson](https://docs.voxel51.com/api/fiftyone.utils.geojson.md) - [fiftyone.utils.geotiff](https://docs.voxel51.com/api/fiftyone.utils.geotiff.md) - [fiftyone.utils.github](https://docs.voxel51.com/api/fiftyone.utils.github.md) - [fiftyone.utils.groups](https://docs.voxel51.com/api/fiftyone.utils.groups.md) - [fiftyone.utils.hmdb51](https://docs.voxel51.com/api/fiftyone.utils.hmdb51.md) - [fiftyone.utils.huggingface](https://docs.voxel51.com/api/fiftyone.utils.huggingface.md) - [fiftyone.utils.image](https://docs.voxel51.com/api/fiftyone.utils.image.md) - [fiftyone.utils.imagenet](https://docs.voxel51.com/api/fiftyone.utils.imagenet.md) - [fiftyone.utils.iou](https://docs.voxel51.com/api/fiftyone.utils.iou.md) - [fiftyone.utils.kinetics](https://docs.voxel51.com/api/fiftyone.utils.kinetics.md) - [fiftyone.utils.kitti](https://docs.voxel51.com/api/fiftyone.utils.kitti.md) - [fiftyone.utils.labelbox](https://docs.voxel51.com/api/fiftyone.utils.labelbox.md) - [fiftyone.utils.labels](https://docs.voxel51.com/api/fiftyone.utils.labels.md) - [fiftyone.utils.labelstudio](https://docs.voxel51.com/api/fiftyone.utils.labelstudio.md) - 
[fiftyone.utils.lfw](https://docs.voxel51.com/api/fiftyone.utils.lfw.md) - [fiftyone.utils.open_clip](https://docs.voxel51.com/api/fiftyone.utils.open_clip.md) - [fiftyone.utils.openimages](https://docs.voxel51.com/api/fiftyone.utils.openimages.md) - [fiftyone.utils.openlabel](https://docs.voxel51.com/api/fiftyone.utils.openlabel.md) - [fiftyone.utils.patches](https://docs.voxel51.com/api/fiftyone.utils.patches.md) - [fiftyone.utils.pe](https://docs.voxel51.com/api/fiftyone.utils.pe.md) - [fiftyone.utils.places](https://docs.voxel51.com/api/fiftyone.utils.places.md) - [fiftyone.utils.quickstart](https://docs.voxel51.com/api/fiftyone.utils.quickstart.md) - [fiftyone.utils.qwen3_vl](https://docs.voxel51.com/api/fiftyone.utils.qwen3_vl.md) - [fiftyone.utils.random](https://docs.voxel51.com/api/fiftyone.utils.random.md) - [fiftyone.utils.rerun](https://docs.voxel51.com/api/fiftyone.utils.rerun.md) - [fiftyone.utils.sam](https://docs.voxel51.com/api/fiftyone.utils.sam.md) - [fiftyone.utils.sam2](https://docs.voxel51.com/api/fiftyone.utils.sam2.md) - [fiftyone.utils.sama](https://docs.voxel51.com/api/fiftyone.utils.sama.md) - [fiftyone.utils.scale](https://docs.voxel51.com/api/fiftyone.utils.scale.md) - [fiftyone.utils.super_gradients](https://docs.voxel51.com/api/fiftyone.utils.super_gradients.md) - [fiftyone.utils.tf](https://docs.voxel51.com/api/fiftyone.utils.tf.md) - [fiftyone.utils.torch](https://docs.voxel51.com/api/fiftyone.utils.torch.md) - [fiftyone.utils.transformers](https://docs.voxel51.com/api/fiftyone.utils.transformers.md) - [fiftyone.utils.transforms](https://docs.voxel51.com/api/fiftyone.utils.transforms.md) - [fiftyone.utils.ucf101](https://docs.voxel51.com/api/fiftyone.utils.ucf101.md) - [fiftyone.utils.ultralytics](https://docs.voxel51.com/api/fiftyone.utils.ultralytics.md) - [fiftyone.utils.useragent](https://docs.voxel51.com/api/fiftyone.utils.useragent.md) - [fiftyone.utils.utils3d](https://docs.voxel51.com/api/fiftyone.utils.utils3d.md) - 
[fiftyone.utils.video](https://docs.voxel51.com/api/fiftyone.utils.video.md) - [fiftyone.utils.voc](https://docs.voxel51.com/api/fiftyone.utils.voc.md) - [fiftyone.utils.yolo](https://docs.voxel51.com/api/fiftyone.utils.yolo.md) - [fiftyone.utils.youtube](https://docs.voxel51.com/api/fiftyone.utils.youtube.md) - [fiftyone.zoo](https://docs.voxel51.com/api/fiftyone.zoo.md) - [fiftyone.zoo.datasets](https://docs.voxel51.com/api/fiftyone.zoo.datasets.md) - [fiftyone.zoo.datasets.base](https://docs.voxel51.com/api/fiftyone.zoo.datasets.base.md) - [fiftyone.zoo.datasets.tf](https://docs.voxel51.com/api/fiftyone.zoo.datasets.tf.md) - [fiftyone.zoo.datasets.torch](https://docs.voxel51.com/api/fiftyone.zoo.datasets.torch.md) - [fiftyone.zoo.models](https://docs.voxel51.com/api/fiftyone.zoo.models.md) - [fiftyone.zoo.models.torch](https://docs.voxel51.com/api/fiftyone.zoo.models.torch.md) - [fiftyone.constants](https://docs.voxel51.com/api/fiftyone.constants.md)

## Contributing to FiftyOne

- [Contribute](https://docs.voxel51.com/contribute/index.md)

## FiftyOne Release Notes

- [Release Notes](https://docs.voxel51.com/release-notes.md)

## FiftyOne Deprecation Notices

- [Deprecation Notices](https://docs.voxel51.com/deprecation.md)

## Frequently Asked Questions

- [FAQ](https://docs.voxel51.com/faq/index.md)

## Optional

- [All docs](https://docs.voxel51.com/)