In this tutorial, you will learn how to run audio transcription in Toloka. We will use a project preset designed specifically for this type of data labeling.
Audio transcription is a type of data labeling task with an audio file and a text input area. Tolokers listen to a short recording and type the text they hear. After you collect the results, you can use the dataset to train your speech recognition models.
You may need additional projects for your task, such as dataset pre-check or checking Tolokers' responses. Learn more about this in Decomposition of the task.
Before you begin:
Make sure you are registered in Toloka as a requester.
Top up your Toloka account. If you are unsure about the budget, you can do this later in this tutorial: Toloka will display a budget estimate for your project.
We recommend starting with a project preset for easier configuration and better results.
Follow this link, or create a project manually:
Click Create a project.
Click Do it myself.
Select the Transcribing audio recordings preset.
Click Choose this preset in the pop-up tab.
In the General information section, add the project name and description:
Name to show Tolokers: In 2–5 words, state the general idea of the project.
Description for Tolokers: In a couple of sentences, explain what you expect Tolokers to do. This is just an overview. You will write instructions later.
In the Task interface section, set up what your tasks will look like. This preset has a task template with layout and validation pre-configured. The Toloker won't be able to submit the response without listening to the audio recording and adding the text.
In the Config section, you can change the texts Tolokers will see in your task. All tasks in a project use the same texts.
To learn about other properties of the Config section, their possible values and the impact on the task interface, see the Template Builder Help.
In the Input data example section, add a link to a sample audio. This audio is only used to display the task interface preview on the right.
Raw task data is stored in the XLSX, TSV, or JSON format. The labeling results are provided in a TSV file. The Data specification section determines which parameters these files can contain.
Click Show specifications and check the values:
Input data: Parameters in the file with raw task data.
Output data: Parameters in the file with labeling results.
Input data and Output data match the task interface you set up in Template Builder. Check that there are fields for all data types you use for your tasks, and for the ones you want to see in the results file.
In the Instructions for Tolokers editor, enter the instructions Tolokers will see when they start doing your tasks. You can add text, tables, and images to your instructions.
Check the sample text of the instructions, and update it to fit your project.
When writing instructions, remember that most Tolokers don’t know anything about your tasks beforehand. Make sure your instructions are as clear as possible, but not too wordy. For successful data labeling, try to strike a balance between covering all the essentials and keeping it short. Learn more in our knowledge base.
In the upper-right corner, click Save.
Learn more about working with the project in the Project section.
A pool is a set of tasks sent out to Tolokers at the same time. One project can have many pools. When creating a pool, you set up pricing, audience filters for Tolokers, and quality control.
Click Create new pool on the project page.
Select the value in the Pool type drop-down list.
If the price per task suite is zero, you must select the pool type.
Set the Pool name (visible only to you) field. Only you will see this pool name on the project page.
Specify the pool description. By default, Tolokers see the description from the project settings in the task list. To show a different one, uncheck the Use project description box and fill in the Public description field. If necessary, click + Private comment to add a private pool description that only you will see.
At the Select the audience for your task step, set up filters to select Tolokers for your pool.
Clear the My tasks may contain shocking or pornographic content checkbox if your project contains no such content.
To select Tolokers based on their language, location, age, gender, and other parameters, click the Add filter button.
For example, add the Languages filter:
It is best to launch transcription tasks in the Toloka web version so that Tolokers can use the keyboard for typing. Add the Device type filter, and set its value to Personal computer.
Use the Speed/quality balance slider to change the number of Tolokers who can see your tasks. Move the slider to the right to exclude Tolokers with lower ratings from participating in your project.
At the Setup quality control step, set quality control rules for more accurate results:
Click the Review task responses manually toggle, and specify the number of days for checking the task in the Review period in days field.
Delete the pre-configured Majority vote rule.
Edit the pre-configured Fast responses rule to catch bots. This rule filters out Tolokers who complete tasks too fast.
Set the Minimum time per task suite. A task suite is a page with a number of tasks. It can contain one or several tasks. If the tasks are simple, you can add 6–10 tasks per suite.
The minimum time per suite value depends on two characteristics: the number of tasks on the page, and the length of audio recordings.
Make allowances for technical errors: for example, if a recording fails to load or play, the Toloker will submit a response quickly, and that shouldn't count as a violation.
To catch bots, set 10–15 seconds per response. Ban Tolokers after two fast responses.
This means that a Toloker who completes two or more task suites in less than 10 seconds will be banned for 10 days and won't be able to access your tasks.
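The logic of the Fast responses rule can be sketched in a few lines. This is an illustrative model, not Toloka's implementation; the function name and threshold values are assumptions based on the settings described above.

```python
# Sketch of the Fast responses rule: a Toloker who submits a given number of
# task suites faster than the minimum time gets banned. Values are illustrative.

MIN_SECONDS_PER_SUITE = 10   # Minimum time per task suite setting
FAST_COUNT_TO_BAN = 2        # ban after this many fast responses

def should_ban(submit_times_sec):
    """Return True if the Toloker's fast submissions reach the ban threshold."""
    fast = sum(1 for t in submit_times_sec if t < MIN_SECONDS_PER_SUITE)
    return fast >= FAST_COUNT_TO_BAN

print(should_ban([8, 120, 9]))    # two suites done in under 10 s -> True
print(should_ban([8, 120, 45]))   # only one fast suite -> False
```

With these settings, genuinely fast-but-honest Tolokers are unlikely to be caught, while a bot submitting empty responses trips the threshold almost immediately.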
For a trial pool, the settings you've just made are enough. You can get better results if you set additional quality control rules.
At the Set the task price and overlap step, set up how much a single task will cost for you.
In Price per task suite, set the amount of money to pay per task suite done by one Toloker.
The price depends on the length and complexity of the audio recordings.
In the Overlap field, define how many Tolokers must do each task.
For speech transcription, the overlap is usually 1. This means that each task will get one response.
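The price and overlap settings directly determine your pool budget. The following rough estimate is a sketch under simple assumptions: it multiplies the number of task suites by the price and overlap, and it ignores the platform fee, so the actual total Toloka shows you will be somewhat higher.

```python
# Rough budget estimate for a pool:
#   (number of task suites) x (price per suite) x overlap.
# Illustrative only: the platform fee is not included.

def estimate_budget(n_tasks, tasks_per_suite, price_per_suite, overlap):
    n_suites = -(-n_tasks // tasks_per_suite)  # ceiling division
    return n_suites * price_per_suite * overlap

# 100 recordings, 5 per suite, $0.05 per suite, overlap 1:
print(estimate_budget(100, 5, 0.05, 1))  # -> 1.0 (dollars)
```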
At the Add optional pool settings step, specify the Time per task suite, sec.
This time should be enough to read the instructions, load the task, listen to audio recordings, and type text (for example, 1,200 seconds).
At the Prepare and upload data step, upload your task data.
Attach a prepared dataset or media files.
To download a template, click one of the buttons:
For this type of project, the file with tasks must have one parameter. Its name is INPUT:audio, and the values are links to the audio files.
Open the downloaded file, and replace the sample links with links to your audio files.
Click Select prepared dataset, and upload the file you’ve just made.
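If you prefer to generate the tasks file in code rather than edit the downloaded template by hand, a minimal sketch looks like this. The only required column for this preset is INPUT:audio, as described above; the URLs are placeholders that you would replace with links to your own recordings.

```python
# Build the tasks file in TSV format with a single INPUT:audio column.
# The audio URLs below are placeholders.
import csv

audio_links = [
    "https://example.com/audio/rec1.mp3",
    "https://example.com/audio/rec2.mp3",
]

with open("tasks.tsv", "w", newline="", encoding="utf-8") as f:
    writer = csv.writer(f, delimiter="\t")
    writer.writerow(["INPUT:audio"])      # header: the input parameter name
    for link in audio_links:
        writer.writerow([link])
```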
Tasks are shown to Tolokers in suites. A suite is a single page with multiple tasks. Define how many tasks to include per suite:
General tasks: These are tasks for Tolokers to label.
Control tasks: These are tasks with predefined answers used to control the quality of responses. For this project, you don’t need control tasks because of the enabled Non-automatic acceptance option.
Training tasks: These are tasks with predefined answers and explanations for Tolokers. Normally you use training tasks in separate training pools. You don’t have to include them.
For example, you can add 5 general tasks per suite:
This means that each suite will contain 5 audio recordings, each with a text field for transcription.
Click Combine tasks into suites.
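Conceptually, combining tasks into suites is just chunking the task list into fixed-size groups. Toloka does this for you when you click the button; this sketch only illustrates the grouping, with made-up file names.

```python
# Illustrative sketch of grouping tasks into suites of a fixed size.

def combine_into_suites(tasks, tasks_per_suite):
    return [tasks[i:i + tasks_per_suite]
            for i in range(0, len(tasks), tasks_per_suite)]

links = [f"audio_{n}.mp3" for n in range(12)]
suites = combine_into_suites(links, 5)
print([len(s) for s in suites])  # -> [5, 5, 2]
```

Note that the last suite may be smaller than the others if the number of tasks isn't a multiple of the suite size.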
At the Double-check your project and try out tasks step, check how the task will look from the Toloker's point of view.
This step will be enabled after you complete the previous steps. You can skip this step by clicking Do it later.
After all the steps, you'll see the Set up is finished and your pool is ready for labeling tip on the pool page.
Make sure you have topped up your account.
To send the tasks to Tolokers and begin the labeling process, click Start labeling.
In the pop-up panel, review the budget and click Launch.
Track the labeling progress on the pool page. You can start the review when the first results are received.
After the specified time period, all responses are automatically accepted, regardless of their quality.
Go to the pool, and click Review assignments.
Choose an assignment.
Check the responses, and click Accept or Decline. For rejected responses, enter a comment to specify the reason.
To learn about other ways to review responses, see the Reviewing Tolokers' responses section.
After checking all the assignments, click Download results.
You will get the TSV file with the labeling results.
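Once you have the results file, you typically want to keep only the accepted transcriptions. The column names in this sketch follow Toloka's INPUT:/OUTPUT: naming convention but are assumptions; check the header of your actual file before relying on them.

```python
# Sketch of filtering accepted transcriptions from the results TSV.
# Column names (OUTPUT:transcription, ASSIGNMENT:status) are assumptions;
# the sample data below stands in for a real downloaded file.
import csv
import io

sample = (
    "INPUT:audio\tOUTPUT:transcription\tASSIGNMENT:status\n"
    "https://example.com/rec1.mp3\thello world\tAPPROVED\n"
    "https://example.com/rec2.mp3\tgood morning\tREJECTED\n"
)

accepted = []
for row in csv.DictReader(io.StringIO(sample), delimiter="\t"):
    if row["ASSIGNMENT:status"] == "APPROVED":
        accepted.append((row["INPUT:audio"], row["OUTPUT:transcription"]))

print(accepted)  # -> [('https://example.com/rec1.mp3', 'hello world')]
```

To process a real download, replace the `io.StringIO(sample)` wrapper with an open file handle.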
Last updated: March 10, 2023