FiftyOne Enterprise Plugins¶
FiftyOne Enterprise provides native support for installing and running FiftyOne plugins, which offers powerful opportunities to extend and customize the functionality of your Enterprise deployment to suit your needs.
Note
What can you do with plugins? Check out delegated operations to see some quick examples, then check out the FiftyOne plugins repository for a growing collection of prebuilt plugins that you can add to your Enterprise deployment!
Plugins page¶
Admins can use the plugins page to upload, manage, and configure permissions for plugins that are made available to users of your Enterprise deployment.
Admins can access the plugins page under Settings > Plugins. It displays a list of all installed plugins and their operators, as well as the enablement and permissions of each.

Installing a plugin¶
Admins can install plugins via the Enterprise UI or Management SDK.
Note
A plugin is a directory (or a ZIP of one) that contains a top-level fiftyone.yml file.
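For reference, here is a minimal sketch of what a plugin's fiftyone.yml might look like. The fields shown are illustrative rather than exhaustive, and the plugin name and operator below are hypothetical:

```yaml
# fiftyone.yml (illustrative sketch; the name and operator are hypothetical)
name: "@acme/hello-world"  # keep this consistent across upgrades (see below)
description: An example plugin
version: 1.0.0
operators:
  - hello_world
```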
Enterprise UI¶
To install a plugin, click the "Install plugin" button on the plugins page.

Then upload or drag and drop the plugin contents as a ZIP file and click install.

You should then see a success message and the newly installed plugin listed on the plugins page.

SDK¶
Admins can also use the upload_plugin() method from the Management SDK:

```python
import fiftyone.management as fom

# You can pass the directory or an already zipped version of it
fom.upload_plugin("/path/to/plugin_dir")
```
Upgrading a plugin¶
Admins can upgrade plugins at any time through the Enterprise UI or Management SDK.
Enterprise UI¶
To upgrade a plugin, click the plugin's dropdown and select "Upgrade plugin".

Then upload or drag and drop the upgraded plugin as a ZIP file and click upgrade.

Note
If the name attribute within the uploaded plugin's fiftyone.yml file doesn't match the existing plugin, a new plugin will be created. Simply delete the old one.
You should then see a success message and the updated information about the plugin on the plugins page.

SDK¶
Admins can also use the upload_plugin() method from the Management SDK with the overwrite=True option:

```python
import fiftyone.management as fom

# You can pass the directory or an already zipped version of it
fom.upload_plugin("/path/to/plugin_dir", overwrite=True)
```
Uninstalling a plugin¶
Admins can uninstall plugins at any time through the Enterprise UI or Management SDK.
Note
Did you know? You can enable/disable plugins rather than permanently uninstalling them.
Enterprise UI¶
To uninstall a plugin, click the plugin's dropdown and select "Uninstall plugin".

SDK¶
Admins can also use the delete_plugin() method from the Management SDK:

```python
import fiftyone.management as fom

fom.delete_plugin(plugin_name)
```
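If you don't know the exact plugin name, you can look it up first. A minimal sketch, assuming the Management SDK's list_plugins() method whose results expose a name attribute; the plugin name being deleted is hypothetical:

```python
import fiftyone.management as fom

# Print the names of all installed plugins
# (assumes list_plugins() results expose a `name` attribute)
for plugin in fom.list_plugins():
    print(plugin.name)

fom.delete_plugin("@acme/hello-world")  # hypothetical plugin name
```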
Enabling/disabling plugins¶
Enterprise UI¶
When plugins are first installed into Enterprise, they are enabled by default, along with any operators they contain.
Admins can enable/disable a plugin and all of its operators by toggling the enabled/disabled switch.

Admins can also disable/enable specific operators within an (enabled) plugin by clicking on the plugin's operators link.

and then toggling the enabled/disabled switch for each operator as necessary.

SDK¶
Admins can also use the set_plugin_enabled() and set_plugin_operator_enabled() methods from the Management SDK:

```python
import fiftyone.management as fom

# Disable a plugin
fom.set_plugin_enabled(plugin_name, False)

# Disable a particular operator
fom.set_plugin_operator_enabled(plugin_name, operator_name, False)
```
Plugin permissions¶
Admins can optionally configure access to plugins and individual operators within them via any combination of the permissions described below:
| Permission | Description |
|---|---|
| Minimum Role | The minimum role a user must have to execute the operation. |
| Minimum Dataset Permission | The minimum dataset permission a user must have to perform the operation on a particular dataset. |
Enterprise UI¶
To configure the permissions for an operator, first click on the plugin's operators link.

Then change the dropdown for the operator to reflect the desired permission level.


SDK¶
Admins can also use the set_plugin_operator_permissions() method from the Management SDK:

```python
import fiftyone.management as fom

# Set minimum role permission only
fom.set_plugin_operator_permissions(
    plugin_name,
    operator_name,
    minimum_role=fom.MEMBER,
)

# Set minimum dataset permission only
fom.set_plugin_operator_permissions(
    plugin_name,
    operator_name,
    minimum_dataset_permission=fom.EDIT,
)

# Set both minimum role and minimum dataset permissions
fom.set_plugin_operator_permissions(
    plugin_name,
    operator_name,
    minimum_role=fom.MEMBER,
    minimum_dataset_permission=fom.EDIT,
)
```
Default permissions¶
When new plugins are installed, any operators they contain are initialized with the default permissions for your deployment.
By default, the initial permissions are:
| Permission | Default |
|---|---|
| Minimum Role | Member |
| Minimum Dataset Permission | Edit |
Enterprise UI¶
Default operator permissions can be configured by navigating to the page at Settings > Security and looking under the Plugins header. Click the dropdown for the permission you want to change and select the new value.

SDK¶
Admins can also use the set_organization_settings() method from the Management SDK:

```python
import fiftyone.management as fom

fom.set_organization_settings(
    default_operator_minimum_role=fom.MEMBER,
    default_operator_minimum_dataset_permission=fom.EDIT,
)
```
Delegated operations¶
Delegated operations are a powerful feature of FiftyOne's plugin framework that allows users to schedule tasks from within the App that are executed in the background on a connected compute cluster.
With FiftyOne Enterprise, your team can upload and permission custom operations that your users can execute from the Enterprise App, all of which run against a central orchestrator configured by your admins.
Why is this awesome? Your AI stack needs a flexible data-centric component that enables you to organize and compute on your data. With delegated operations, FiftyOne Enterprise becomes both a dataset management/visualization tool and a workflow automation tool that defines how your data-centric workflows like ingestion, curation, and evaluation are performed. In short, think of FiftyOne Enterprise as the single source of truth on which you co-develop your data and models together.
What can delegated operations do for you? Get started by installing any of these plugins available in the FiftyOne Plugins repository:
✏️ Utilities for integrating FiftyOne with annotation tools
🧠 Utilities for working with the FiftyOne Brain
✅ Utilities for evaluating models with FiftyOne
📁 A collection of import/export utilities
📈 Utilities for working with FiftyOne database indexes
⚒️ Call your favorite SDK utilities from the App
🤖 An AI assistant that can query visual datasets, search the FiftyOne docs, and answer general computer vision questions
🌎 Download datasets and run inference with models from the FiftyOne Zoo, all without leaving the App
For example, wish you could import data from within the App? With the @voxel51/io plugin, you can!

Want to send data for annotation from within the App? Sure thing, just install the @voxel51/annotation plugin:

Have model predictions on your dataset that you want to evaluate? The @voxel51/evaluation plugin makes it easy:

Need to compute embeddings for your dataset so you can visualize them in the Embeddings panel? Kick off the task with the @voxel51/brain plugin and proceed with other work while the execution happens in the background:

When you choose delegated execution in the App, these tasks are automatically scheduled for execution on your connected orchestrator and you can continue with other work. Meanwhile, all datasets have a Runs tab in the App where you can browse a history of all delegated operations that have been run on the dataset and their status.
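Delegated runs can also be scheduled programmatically. Below is a minimal sketch that assumes the open-source fiftyone.operators.execute_operator() entrypoint and its request_delegation option; the dataset name, operator URI, and parameters are hypothetical:

```python
import fiftyone as fo
import fiftyone.operators as foo

dataset = fo.load_dataset("quickstart")  # hypothetical dataset name

# Request delegated (background) execution rather than running immediately
foo.execute_operator(
    "@voxel51/io/export_samples",  # hypothetical operator URI
    ctx=dict(
        dataset=dataset,
        params=dict(export_dir="/tmp/export"),  # hypothetical parameters
    ),
    request_delegation=True,
)
```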
Configuring your orchestrator(s)¶
FiftyOne Enterprise offers a builtin orchestrator that is configured as part of your team's deployment with a default level of compute capacity.
It is also possible to connect your FiftyOne Enterprise deployment to an externally managed workflow orchestration tool (e.g., Airflow, Flyte, or Spark).
Note
Contact your Voxel51 support team to scale your deployment's compute capacity or if you'd like to use an external orchestrator.
Managing delegated operations¶
Every dataset in FiftyOne Enterprise has a Runs page that allows users with access to monitor and explore delegated operations scheduled against that dataset.
All scheduled operations are maintained in a queue and will be automatically executed as resources are available on the targeted orchestrator.
Note
The Runs page only tracks operations that are scheduled for delegated execution, not operations that are executed immediately in the App.
Runs page¶
All users with at least Can View access to a dataset can visit the Runs page by clicking on the "Runs" tab.
On the Runs page, you will see a table listing delegated operations. Admins can choose whether to view operations for all datasets or only the current dataset, while non-admins can only view operations associated with the current dataset.
The table provides sort, search, and filter options to refine the list of runs as you like:

Statuses¶
Delegated operations can have one of five statuses:
Scheduled: the run has been scheduled for execution and is awaiting execution quota. All delegated operations begin life in this state
Queued: the run has been allocated execution quota and it will start running as soon as orchestrator resources become available
Running: the run is currently being executed
Completed: the run has completed successfully
Failed: the run failed to complete
Note
FiftyOne Enterprise offers a builtin orchestrator that is configured as part of your team's deployment with a default level of execution quota.
Contact your Voxel51 support team to discuss running more jobs in parallel, or if you'd like to use an external orchestrator.

You can hover over the status badge of a run in Scheduled or Queued state to see additional information about its execution, including its position in the Scheduled queue:


Sorting¶
By default, the runs table is sorted by recency, with the most recently scheduled run at the top. You can use the dropdown menu in the upper right of the table to sort by other criteria, including last updated, oldest, or operator name:

Filtering¶
You can also filter the runs table to see a subset of runs.
Users with sufficient privileges can toggle between "My Runs" and "All Runs" to see runs you have scheduled versus runs that others in your organization have scheduled on the current dataset:

All users can further refine the list of runs using the Status dropdown to select one or more statuses you would like to filter by:

Admins can also toggle between "All Datasets" and "This Dataset" to control whether to show all runs for your organization versus only runs for the dataset you are currently viewing:

Searching¶
You can also use the search functionality to filter the list of runs by keyword. As you type your query in the search box, the list of runs will be updated to show only the runs matching your query:

Note
Search is case-sensitive, and you can currently only search by operator name, not label. For example, the query "bright" matches the run in the image above via its operator URI "@voxel51/panels/compute_brightness", not its label "compute_brightness".
Re-running¶
From the Runs page, you can trigger a re-run of any run by clicking the kebab menu and selecting "Re-run":

Pinning¶
Pinned runs are displayed to the right of the runs table. By default, up to five pinned runs are displayed; if there are more, you will see a button to expand the list.
To pin a run, hover over its row in the runs table and click the pin icon that appears beside the operator label:
Note
Pinned runs are stored at the dataset-level and will be visible to all users with access to that dataset.


Renaming¶
When delegating an operation multiple times on the same dataset, you may wish to give the runs custom labels so that you can easily identify each run later.
To edit the label of an operator run, move your mouse cursor over the label of interest and click the pencil button as indicated by "1" below. This will present an input field indicated by "2" where you can update the label to the text of your choice. Once you are ready to apply changes, click the save button indicated by "3":

Mark as failed¶
If a delegated operation run terminates unexpectedly without reporting failure, you can manually mark it as failed from the Runs page.
To mark a run as failed, click the three dots indicated by "1". Then, in the menu, click "Mark as failed" as indicated by "2". The run status will be updated and will now display as failed:

Warning
If the delegated operation is, in fact, still in progress in your orchestrator, marking the run as failed will not terminate its execution. The operation will continue executing until completion, but it will be marked as failed regardless of its outcome.
Monitoring progress¶
Delegated operations can optionally report their progress during execution.
If progress is available for a run, it will be displayed in the Runs table as indicated by "2".
By default, the general status of a run and the progress of running operations are automatically refreshed. You can disable the auto-refresh of running operations by toggling the auto-refresh setting indicated by "1".
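For plugin authors, progress is reported from within the operator's execute() method. A minimal sketch, assuming the open-source fiftyone.operators API including ExecutionContext.set_progress(); the operator itself is hypothetical:

```python
import fiftyone.operators as foo

class ComputeStats(foo.Operator):  # hypothetical operator
    @property
    def config(self):
        return foo.OperatorConfig(
            name="compute_stats",
            label="Compute stats",
            allow_delegated_execution=True,
        )

    def execute(self, ctx):
        # Operate on the current view if one was provided, else the dataset
        samples = ctx.view if ctx.view is not None else ctx.dataset
        total = len(samples)
        for i, sample in enumerate(samples, 1):
            # ... per-sample work goes here ...

            # Report progress so it can be displayed in the Runs table
            ctx.set_progress(progress=i / total, label=f"Processed {i}/{total}")
```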


Run page¶
The Run page allows you to see detailed information about a specific delegated operation, including its inputs, outputs, logs, and errors.
You can visit the Run page for a run by clicking on the run in the runs table, the Pinned runs section, or the Recent runs widget.
Input¶
The Input tab on the Run page lets you see the input parameters that were provided when the delegated operation was scheduled:

By default, a rendered version of the input parameters is displayed, similar to what is displayed when invoking an operator via a prompt modal. However, you can switch to raw format by clicking the "Show raw" toggle button:

Output¶
The Output tab on the Run page lets you see the rendered output of a delegated operation that has completed, if there is any:

Errors¶
The Errors tab on the Run page will appear if the run failed, and it will display the error message and stack trace that occurred:

Logs¶
The Logs tab on the Run page allows you to view available logs associated with a delegated operation:

Viewing logs
Once log storage is configured, logs automatically appear in the Logs tab of a run as they become available:
Note
Logs are currently only available after the run completes.


Logs structure
Logs are displayed in a tabular format as pictured below, including the timestamp, severity, and message associated with each log entry:

For logs that exceed 1 MB, no content will be shown; instead, a "Download logs" button will appear:

Downloading logs
You can directly download the logs for a delegated operation from both the Runs table and the operation's Run page:


Logs setup
Viewing run logs for delegated operations requires some one-time deployment-level configuration.
A deployment admin on your team will need to explicitly define log generation behavior for your orchestrator(s). We provide simple setup instructions for the two deployment configurations we support for the builtin orchestrator:

Note
If you are using a third-party orchestrator like Airflow, simply configure your orchestrator to store logs in a persistent location and then report this path for each run via the log_path argument.
View¶
The View tab on the Run page lets you see the specific view (which could be the full dataset) on which the operation was performed:
