Managing and reviewing labeling jobs should be easy, straightforward, and intuitive. Yet most labeling providers offer their customers no way to visualize the output of a job, and their quality metrics are often inaccurate and subjective. The Human-in-the-Loop module is the ultimate solution to manage labeled datasets. You can view, search, and modify labels, access advanced labeling quality reports, select and mark records for re-annotation, and even manage your labeling budget, all in one place.
Fully Searchable Labels Database
Filter and retrieve your data based on its content in just a few clicks. The Alectio Human-in-the-Loop Module is powered by ElasticSearch to help you find interesting corner cases or manually audit the output of a labeling job. So next time you realize your labeling instructions failed to mention that pickup trucks should be labeled as cars and not trucks, there is no need to relabel everything from scratch: simply search for trucks and mark the matched records for relabeling.
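In spirit, the pickup-truck correction above is a filtered search followed by a bulk status update. The sketch below simulates that flow with an in-memory list rather than Alectio's actual API or an ElasticSearch index; the field names and statuses are hypothetical:

```python
# Hypothetical label records; the schema is illustrative, not Alectio's.
records = [
    {"id": 1, "label": "truck", "status": "labeled"},
    {"id": 2, "label": "car", "status": "labeled"},
    {"id": 3, "label": "truck", "status": "labeled"},
]

def mark_for_relabeling(records, query_label):
    """Find every record whose label matches the query and flag it
    for re-annotation, leaving all other records untouched."""
    matched = [r for r in records if r["label"] == query_label]
    for r in matched:
        r["status"] = "relabel"
    return matched

hits = mark_for_relabeling(records, "truck")
print(len(hits))  # 2 pickup-truck candidates queued for relabeling
```

In production, the filter step would be an ElasticSearch query over label content and metadata rather than a Python list comprehension, but the update pattern is the same.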
Self-Serve Annotation Tools
Want to annotate data yourself, but don’t have the right tool to do so? Use one of our ML-assisted labeling tools to label your own data (dynamically or in batch), or to retouch the faulty annotations in your labelset.
Label Auditing Tool
Auditing the quality of your labels used to be a tedious but necessary task. The Label Auditing Tool leverages Anomaly Detection algorithms and Meta Learning to narrow down potentially misannotated data, so that you can review only the high-risk records and decide which ones to re-annotate.
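To give a flavor of the anomaly-detection idea (this is a toy stand-in, not Alectio's actual algorithm), one simple signal is inter-annotator agreement: records whose agreement score is a low-side statistical outlier are worth a human look. The z-score cutoff and scores below are invented for illustration:

```python
from statistics import mean, stdev

# Hypothetical per-record agreement scores
# (fraction of annotators who agreed on the final label).
agreement = {"rec_1": 0.95, "rec_2": 0.90, "rec_3": 0.35,
             "rec_4": 0.92, "rec_5": 0.88}

def high_risk_records(scores, z_cutoff=-1.5):
    """Flag records whose agreement score is an outlier on the low side,
    i.e. likely candidates for misannotation."""
    mu = mean(scores.values())
    sigma = stdev(scores.values())
    return [rid for rid, s in scores.items() if (s - mu) / sigma < z_cutoff]

print(high_risk_records(agreement))  # ['rec_3']
```

Instead of eyeballing every record, the reviewer only inspects the flagged ones, which is exactly the cost saving the auditing tool is after.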
Relabeling Workflow
Nobody is perfect, and innocent errors can creep into your labels. With Alectio's Relabeling Workflow feature, you can manually review your labels or leverage the Label Auditing Tool to find problematic records, then either fix the annotations yourself directly on the platform with one of the available Self-Serve Annotation tools, or mark them for relabeling so that they get re-annotated by the provider of your choice. Thanks to this feature, you can dramatically reduce your labeling costs by identifying and re-annotating only the records with faulty labels instead of requesting multiple judgments from the get-go.
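The two paths through the workflow (fix inline, or send out for relabeling) can be sketched as a pair of status transitions on a record. The class and status names below are made up for the example:

```python
from dataclasses import dataclass

@dataclass
class LabeledRecord:
    # Field names are illustrative, not Alectio's actual schema.
    record_id: int
    label: str
    status: str = "labeled"

def fix_inline(record, corrected_label):
    """Path 1: correct the annotation yourself, as with a
    self-serve annotation tool on the platform."""
    record.label = corrected_label
    record.status = "fixed"

def send_to_provider(record):
    """Path 2: queue the record so the labeling provider of
    your choice re-annotates it."""
    record.status = "queued_for_relabeling"

bad = LabeledRecord(record_id=7, label="truck")
fix_inline(bad, "car")
print(bad.label, bad.status)  # car fixed
```

Either path touches only the problematic records, which is what keeps the cost of correction low.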
Label Version Control
Human-in-the-Loop and Machine-in-the-Loop annotation workflows are the best ways to get high-quality labels efficiently, but they require re-annotating some records multiple times. That's why the Human-in-the-Loop module provides a Version Control feature for labels, which allows you to review specific versions, delete versions you are not happy with, and download your optimal labels.
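Conceptually, label version control is a small append-only history per record with review and delete operations. A minimal sketch (the class and its methods are hypothetical, not Alectio's API):

```python
class LabelHistory:
    """Minimal version store for one record's labels (illustrative only)."""

    def __init__(self):
        self.versions = []           # oldest first

    def add(self, label):
        """Record a new annotation pass; returns the 1-based version number."""
        self.versions.append(label)
        return len(self.versions)

    def get(self, version):
        """Review a specific version."""
        return self.versions[version - 1]

    def delete(self, version):
        """Drop a version you are not happy with."""
        self.versions.pop(version - 1)

    def latest(self):
        """The label you would download as your optimal version."""
        return self.versions[-1]

history = LabelHistory()
history.add("truck")     # first annotation pass
history.add("car")       # after relabeling
history.delete(1)        # discard the faulty first version
print(history.latest())  # car
```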
Label Aggregation Functionality
Some use cases, in particular those based on subjective labels, tend to require multiple judgments, and it would normally be your job to consolidate those judgments into one usable label. The Label Aggregation functionality offers multiple consolidation strategies and helps you choose one based on your own criteria.
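Two common consolidation strategies are plain majority vote and a weighted vote (e.g. weighting by annotator track record). The sketch below illustrates both; Alectio's actual aggregation options may differ:

```python
from collections import Counter

def majority_vote(judgments):
    """Consolidate multiple judgments into one label by plurality."""
    return Counter(judgments).most_common(1)[0][0]

def weighted_vote(judgments, weights):
    """Alternative strategy: weight each judgment, e.g. by the
    annotator's historical accuracy, and take the heaviest label."""
    totals = Counter()
    for label, weight in zip(judgments, weights):
        totals[label] += weight
    return totals.most_common(1)[0][0]

print(majority_vote(["car", "truck", "car"]))        # car
print(weighted_vote(["car", "truck"], [0.4, 0.9]))   # truck
```

The choice of strategy is exactly the kind of criterion the functionality lets you control: a plurality rule treats all annotators equally, while a weighted rule lets a trusted annotator override several unreliable ones.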