The token (or API key) must be passed as a request header. You can find your user token on the User Account page in Label Studio. Example: <br><pre><code class="language-bash">curl https://label-studio-host/api/projects -H "Authorization: Token [your-token]"</code></pre>
Request
This endpoint expects an object.
`annotator_evaluation_enabled` (boolean, optional)
Enable annotator evaluation for the project.
`color` (string or null, optional, <=16 characters)
`control_weights` (map from strings to objects or null, optional)
Dict of weights for each control tag in metric calculation. Keys are control tag names from the labeling config. At least one tag must have a non-zero overall weight.
`created_by` (object, optional)
Project owner.
`custom_interface_code` (string or null, optional)
`custom_interface_compiled` (string or null, optional)
`custom_interface_params` (any or null, optional)
`description` (string or null, optional)
Project description.
`enable_empty_annotation` (boolean, optional)
Allow annotators to submit empty annotations.
`evaluate_predictions_automatically` (boolean, optional)
Retrieve and display predictions when loading a task.
`expert_instruction` (string or null, optional)
Labeling instructions in HTML format.
`input_schema` (any or null, optional)
`is_draft` (boolean, optional)
Whether or not the project is in the middle of being created.
`is_published` (boolean, optional)
Whether or not the project is published to annotators.
`label_config` (string or null, optional)
Label config in XML format. See the documentation for more details.
`maximum_annotations` (integer, optional, >=-2147483648, <=2147483647)
Maximum number of annotations for one task. If the number of annotations per task is equal to or greater than this value, the task is completed (is_labeled=True).
`start_training_on_annotation_update` (boolean, optional)
Start model training after any annotations are submitted or updated.
`state` (string, read-only)
`task_number` (integer, read-only)
Total task number in project.
`total_annotations_number` (integer, read-only)
Total annotations number in project, including skipped_annotations_number and ground_truth_number.
`total_predictions_number` (integer, read-only)
Total predictions number in project, including skipped_annotations_number, ground_truth_number, and useful_annotation_number.
`useful_annotation_number` (integer, read-only)
Useful annotation number in project, not including skipped_annotations_number and ground_truth_number. Total annotations = useful_annotation_number + skipped_annotations_number + ground_truth_number.
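Putting the writable fields above together, the following Python sketch builds an update payload and request. The host, token, project id, and the use of PATCH follow the pattern of the curl example at the top of this page but are assumptions, as is the per-tag `{"overall": ...}` shape of `control_weights` (inferred from its description); substitute values from your own instance.

```python
import json
import urllib.request

# Placeholder values: substitute your own host, token, and project id.
HOST = "https://label-studio-host"
TOKEN = "your-token"
PROJECT_ID = 1

# A few writable fields from the request schema above.
payload = {
    "description": "Sentiment labeling for support tickets",
    "enable_empty_annotation": False,
    "is_published": True,
    # Assumed shape: keys are control tag names from the labeling config;
    # at least one tag must carry a non-zero overall weight.
    "control_weights": {
        "sentiment": {"overall": 1.0},
    },
}

# Sanity check mirroring the documented constraint on control_weights.
assert any(w.get("overall", 0) for w in payload["control_weights"].values())

req = urllib.request.Request(
    f"{HOST}/api/projects/{PROJECT_ID}",
    data=json.dumps(payload).encode(),
    headers={
        "Authorization": f"Token {TOKEN}",
        "Content-Type": "application/json",
    },
    method="PATCH",
)
# urllib.request.urlopen(req) would send the request; omitted here.
```

The token goes in the `Authorization: Token ...` header exactly as in the curl example; only the payload fields you include are updated.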
Response

`annotator_evaluation_enabled` (boolean)
Enable annotator evaluation for the project.
`color` (string or null, <=16 characters)
`control_weights` (map from strings to objects or null)
Dict of weights for each control tag in metric calculation. Keys are control tag names from the labeling config. At least one tag must have a non-zero overall weight.
`created_by` (object)
Project owner.
`custom_interface_code` (string or null)
`custom_interface_compiled` (string or null)
`custom_interface_params` (any or null)
`description` (string or null)
Project description.
`enable_empty_annotation` (boolean)
Allow annotators to submit empty annotations.
`evaluate_predictions_automatically` (boolean)
Retrieve and display predictions when loading a task.
`expert_instruction` (string or null)
Labeling instructions in HTML format.
`input_schema` (any or null)
Whether or not the project is in the middle of being created.
`is_draft` (boolean)
Whether or not the project is in the middle of being created.
`is_published` (boolean)
Whether or not the project is published to annotators.
`label_config` (string or null)
Label config in XML format. See the documentation for more details.
`maximum_annotations` (integer, >=-2147483648, <=2147483647)
Maximum number of annotations for one task. If the number of annotations per task is equal to or greater than this value, the task is completed (is_labeled=True).
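The read-only counters and `maximum_annotations` follow two simple rules stated in the field descriptions above: the annotation totals identity, and the completion condition for a task. A small sketch of both (the helper functions are illustrative, not part of the Label Studio API):

```python
# Illustrative helpers; not part of the Label Studio API.

def total_annotations(useful, skipped, ground_truth):
    """Total annotations = useful_annotation_number
    + skipped_annotations_number + ground_truth_number."""
    return useful + skipped + ground_truth

def is_labeled(annotations_on_task, maximum_annotations):
    """A task is completed (is_labeled=True) once its annotation
    count is equal to or greater than maximum_annotations."""
    return annotations_on_task >= maximum_annotations

# Example: 40 useful + 5 skipped + 5 ground-truth = 50 total annotations.
assert total_annotations(40, 5, 5) == 50
# With maximum_annotations=2, a task with 2 annotations is complete,
# while a task with 1 annotation is not.
assert is_labeled(2, 2) and not is_labeled(1, 2)
```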