Qualifying Human Workers
Labeler qualification ensures that only the most suitable labelers join the team responsible for processing your data. By qualifying labelers for participation in a project, you can improve project performance and get the results you need faster.
Qualification is available to the following user roles:
- Owner
- Admin
- Instructor
Once qualification is configured, you can set up your project to use the qualified labelers as a labeling source.
Qualification comprises two components:
- Tutorials (optional)
  - Explanations of how to label data
  - Examples of inputs along with correct and incorrect outputs
  - Unscored labeling challenges
- Tests
  - Used to qualify labelers
  - A series of data points for the labeler to label
  - You set the number of data points and the requirements for a passing grade (see the sketch after this list)
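To make the passing-grade idea concrete, here is a minimal sketch of how a test result could be graded. It assumes a per-question metric threshold and an overall passing fraction; the names and values are illustrative assumptions, and super.AI computes the real result internally.

```python
# Illustrative sketch only: super.AI computes qualification results
# internally. This models a test as a series of scored questions with
# a per-question threshold and an overall passing requirement; the
# names and values here are assumptions, not super.AI settings.

def grade_test(question_scores: list[float],
               required_score: float,
               min_correct_fraction: float = 0.8) -> bool:
    """Return True if enough questions meet the required metric score."""
    correct = sum(1 for score in question_scores if score >= required_score)
    return correct / len(question_scores) >= min_correct_fraction

# Example: 8 of 10 questions score at or above the 0.9 threshold.
scores = [0.95, 0.91, 0.88, 0.99, 0.93, 0.97, 0.92, 0.90, 0.85, 0.96]
print(grade_test(scores, required_score=0.9))  # True
```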
Qualification is available on a project under two conditions:
- You are using super.AI’s in-house labelers
- You have created at least one ground truth data point
How to create tutorials
- Head to your super.AI dashboard
- Open the relevant project
- Click Workers in the left-hand menu
- Click Labeler qualification in the sub-menu
- Click Add tutorial on the right
- Choose your settings
  - The settings allow you to configure what counts as a correct answer to a tutorial question. Choose a metric and set the score required for an answer to count as correct (a metric-scoring sketch follows these steps). You can also choose the conditions under which the user is shown the correct answer after submitting their own.
- Click Save
- Click Edit content on the right
- Expand the Add content dropdown
  - Click Explanation if you want to describe the labeling process or explain an edge case. This is text only, with fields for a short and a long description of the scenario.
  - Click Example if you want to use a processed (i.e., completed) data point (input and output) to illustrate a labeling scenario.
    - Choose a data point from the table; you can then add an optional hint and explanation
    - Select Check this to make the example interactive if you want the user to label the data point themselves as an unscored question
Once you’ve added an explanation or example, it will appear as a step on the Tutorial content page. You can add additional steps by expanding the Add content dropdown again and following the same process as before. You can reorder steps by clicking and dragging on the Reorder icon. Edit or delete a step using the Edit and Delete icons on the right.
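As one example of metric-and-threshold scoring, the sketch below checks a submitted bounding box against a ground truth box using intersection over union (IoU). The metric choice, the box format, and the 0.7 required score are all assumptions for illustration; the metrics actually offered depend on your project's data type.

```python
# Illustrative sketch only: IoU is shown as one plausible metric for
# bounding-box projects. The box format and the 0.7 required score
# are assumptions; the metrics actually offered depend on your
# project's data type.

Box = tuple[float, float, float, float]  # (x_min, y_min, x_max, y_max)

def iou(a: Box, b: Box) -> float:
    """Intersection over union of two axis-aligned boxes."""
    ix = max(0.0, min(a[2], b[2]) - max(a[0], b[0]))
    iy = max(0.0, min(a[3], b[3]) - max(a[1], b[1]))
    inter = ix * iy
    union = ((a[2] - a[0]) * (a[3] - a[1])
             + (b[2] - b[0]) * (b[3] - b[1])
             - inter)
    return inter / union if union else 0.0

REQUIRED_SCORE = 0.7  # hypothetical value chosen in the settings

submitted = (10.0, 10.0, 50.0, 50.0)
ground_truth = (12.0, 8.0, 52.0, 48.0)
print(iou(submitted, ground_truth) >= REQUIRED_SCORE)  # True -> counted as correct
```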
How to create tests
- Head to your super.AI dashboard
- Open the relevant project
- Click Workers in the left-hand menu
- Click Labeler qualification in the sub-menu
- Click Add test on the right
- Choose your settings and click Save
  - The settings allow you to configure what counts as a correct answer to a test question. Choose a metric and set the score a user needs in order to pass. You can also choose how long a user must wait before reattempting the test after a failed attempt.
Content is automatically generated from your ground truth dataset. The Question pool size indicates how many ground truth data points are available for the test. The number of test questions cannot exceed the question pool size.
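The sketch below models the two constraints just described: test questions are drawn from the ground truth pool, so the question count cannot exceed the pool size, and a failed labeler must wait out a cooldown before retrying. The function names and the 24-hour wait are illustrative assumptions, not super.AI API calls.

```python
# Illustrative sketch only: super.AI assembles tests from your ground
# truth automatically. This models the two constraints above; the
# function names and the 24-hour wait are assumptions.

import random
from datetime import datetime, timedelta, timezone

def draw_questions(question_pool: list[str], num_questions: int) -> list[str]:
    """Sample test questions from the ground truth pool."""
    if num_questions > len(question_pool):
        raise ValueError("number of test questions cannot exceed the question pool size")
    return random.sample(question_pool, num_questions)

def can_retry(last_failed_at: datetime, wait: timedelta) -> bool:
    """Check whether the retry cooldown has elapsed."""
    return datetime.now(timezone.utc) >= last_failed_at + wait

pool = [f"ground_truth_{i}" for i in range(25)]        # 25 ground truth data points
test = draw_questions(pool, 10)                        # 10 random questions
failed_at = datetime.now(timezone.utc) - timedelta(hours=2)
print(can_retry(failed_at, wait=timedelta(hours=24)))  # False: still in cooldown
```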
How to see and manage qualified labelers
The Who’s qualified section of the Labeler qualification page shows which labelers have attempted the test. You can see how many times each labeler has attempted the test, their most recent score, their exemption status, and whether or not they are qualified (i.e., have passed the test).
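Read as data, each row of the table pairs a labeler's attempt history with two flags. A minimal sketch follows; the field names are assumptions, and treating an exempt labeler as able to work without passing is an inference from the exemption feature, not a documented rule.

```python
# Illustrative sketch only: field names are assumptions. "Qualified"
# follows the definition above (the labeler has passed the test);
# letting an exempt labeler work without passing is an inference from
# the exemption feature, not a documented rule.

from dataclasses import dataclass
from typing import Optional

@dataclass
class LabelerRow:
    attempts: int
    latest_score: Optional[float]
    exempt: bool
    qualified: bool  # True once the labeler has passed the test

def may_label(row: LabelerRow) -> bool:
    """A labeler may work on the project if qualified or exempt."""
    return row.qualified or row.exempt

print(may_label(LabelerRow(attempts=2, latest_score=0.95, exempt=False, qualified=True)))  # True
print(may_label(LabelerRow(attempts=1, latest_score=0.60, exempt=True, qualified=False)))  # True
```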
How to exempt users from qualification
You might want to exempt a user from having to complete the qualification test if, for example, you are confident in their labeling ability. The user must first attempt the test in order to appear in the Who’s qualified section. You can then select the user using the checkbox in the table and click Exempt selected at the top right. To exempt all users, click Exempt all while no users are selected in the table.