Click on Annotate in the action bar at the bottom of the dashboard.
The system will automatically select and display each sample to annotate. From the start, suggested annotations will be provided by the Rosette Entity Extractor. As soon as you annotate a sample, the system will start training a new model based on your annotations.
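For illustration only, here is a minimal Python sketch of one plausible shape for a suggested annotation. The SuggestedAnnotation class and its field names are assumptions made for this example, not the product's actual schema.

    # Hypothetical sketch: a suggested annotation as a character span plus a
    # label and its source. Field names are illustrative only.
    from dataclasses import dataclass

    @dataclass
    class SuggestedAnnotation:
        start: int   # character offset where the entity mention begins
        end: int     # character offset where the entity mention ends (exclusive)
        label: str   # entity type, e.g. "PERSON" or "LOCATION"
        source: str  # "rex" for the Rosette Entity Extractor, "model" once training starts

    sample = "Ada Lovelace visited London in 1842."
    suggestions = [
        SuggestedAnnotation(start=0, end=12, label="PERSON", source="rex"),
        SuggestedAnnotation(start=21, end=27, label="LOCATION", source="rex"),
    ]

    for s in suggestions:
        print(f"{sample[s.start:s.end]!r} -> {s.label} (from {s.source})")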
You can ask the engine to suggest annotations only for samples in the training set, not for those in the evaluation set. To disable suggestions for evaluation samples, enable the project configuration option Hide Eval Predictions.
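As a rough illustration of what this toggle does, the following Python sketch models the behavior; the project_config keys and the should_suggest function are hypothetical, invented for this example.

    # Hypothetical sketch of a project configuration with the toggle enabled.
    # The real option is set in the UI; this key name is an assumption.
    project_config = {
        "name": "my-entity-project",
        "hide_eval_predictions": True,  # suppress suggestions on evaluation samples
    }

    def should_suggest(sample_set: str, config: dict) -> bool:
        """Return False for evaluation samples when the toggle is on."""
        if sample_set == "evaluation" and config.get("hide_eval_predictions"):
            return False
        return True

    print(should_suggest("training", project_config))    # True
    print(should_suggest("evaluation", project_config))  # False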
For each annotation, you will be shown the sample to annotate. Fragments of the sentences before and after the sample may be displayed in light gray, to provide context for the sample being annotated. You cannot annotate the context text.
Select words or phrases in the sentence, then choose the correct label for the selection. To change a label, select clear tag to remove it, then select a new label.
When you are satisfied that the sentence is correctly annotated, select Annotate to save it.
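To make the result concrete, here is a hypothetical Python sketch of what a saved annotation might look like, assuming character-offset spans; the record shape is illustrative, not an actual export format.

    # Hypothetical sketch of a saved annotation: the sample text plus the
    # labeled spans chosen by the annotator. Structure is illustrative only.
    annotated_sample = {
        "text": "Ada Lovelace visited London in 1842.",
        "spans": [
            {"start": 0, "end": 12, "label": "PERSON"},
            {"start": 21, "end": 27, "label": "LOCATION"},
        ],
    }

    # Clearing a tag and relabeling amounts to replacing a span's label:
    annotated_sample["spans"][1]["label"] = "GPE"  # after "clear tag" + new label
    print(annotated_sample["spans"][1])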
If you are registered as an adjudicator for the project, and at least two annotators have already annotated this sample, you will be offered the option to Adjudicate, that is, to mark an annotation as the best one for this sample. Currently, an adjudication cannot be changed once it has been submitted.
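The following Python sketch models adjudication under the conditions described above: at least two annotators have submitted versions, and the adjudicator marks one as best. All names and fields are assumptions for illustration.

    # Hypothetical sketch of adjudication over competing annotator versions.
    versions = {
        "annotator_a": [{"start": 0, "end": 12, "label": "PERSON"}],
        "annotator_b": [{"start": 0, "end": 3, "label": "PERSON"}],
    }

    # Adjudication marks one version as canonical; per the text above, this
    # choice cannot currently be changed once submitted.
    adjudication = {"chosen": "annotator_a", "final": True}
    print("Best annotation:", versions[adjudication["chosen"]])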
Other options are:
Clear Annotations: Erase all tags and start afresh.
Skip for now: Skip this sample and annotate a different sample.
Previous: Go back to a previously annotated sample from this session. To edit earlier work, use View Annotations.
To move a sample between the evaluation and training sets, select the appropriate radio button.
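Conceptually, the radio buttons flip a sample's set assignment, as in this hypothetical Python sketch; the field names are assumptions for illustration.

    # Hypothetical sketch: moving a sample between sets is just changing
    # its set assignment. Field names are illustrative only.
    sample = {"id": 42, "text": "Ada Lovelace visited London.", "set": "training"}

    def move_to(sample: dict, target_set: str) -> None:
        assert target_set in ("training", "evaluation")
        sample["set"] = target_set

    move_to(sample, "evaluation")
    print(sample["set"])  # evaluation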
Once you've created your data corpus and loaded the documents, you can review the corpus and its annotations at any point in the annotation process. Select View Annotations to see a detailed view of your work.
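If you review work outside the UI, a small script along these lines could tally labels across the corpus. This Python sketch assumes annotations in the illustrative record shape used above; it is not an actual export format.

    # Hypothetical sketch: count how often each label appears across
    # annotated samples, assuming the illustrative record shape above.
    from collections import Counter

    corpus = [
        {"text": "Ada Lovelace visited London.",
         "spans": [{"start": 0, "end": 12, "label": "PERSON"},
                   {"start": 21, "end": 27, "label": "LOCATION"}]},
    ]

    label_counts = Counter(span["label"] for doc in corpus for span in doc["spans"])
    print(label_counts)  # e.g., Counter({'PERSON': 1, 'LOCATION': 1})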