
Core Concepts

Understanding these concepts will help you get the most out of Avala.

Datasets

A dataset is a collection of data items to be annotated. Datasets are the foundation of all annotation work in Avala.

Dataset Properties

Property     Description
name         Human-readable name
slug         URL-friendly identifier
data_type    Type of data: image, video, lidar, image_3d
visibility   public or private
owner        User or organization that owns the dataset
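
As a minimal sketch, creating a dataset with these properties might look like the following; the POST /datasets/ route and payload shape are assumptions, not confirmed by this page, and BASE_URL and headers are the values configured in Getting Started:

import requests

# Create a private image dataset
# (hypothetical endpoint and payload; adjust to the actual API).
# BASE_URL and headers are assumed from the Getting Started setup.
dataset = requests.post(
    f"{BASE_URL}/datasets/",
    headers=headers,
    json={
        "name": "Street Scenes",
        "slug": "street-scenes",
        "data_type": "image",
        "visibility": "private",
    },
).json()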

Dataset Items

Each dataset contains items, the individual data samples:

  • For image datasets: each item is one image
  • For video datasets: items are grouped into sequences (frames)
  • For LiDAR datasets: items are point cloud scans
# List items in a dataset
items = requests.get(
    f"{BASE_URL}/datasets/{owner}/{slug}/items/",
    headers=headers,
).json()

Sequences

Sequences group related items for temporal data:

  • Video frames from the same recording
  • LiDAR scans from a driving session
  • Multi-camera captures at the same timestamp

Sequences enable:

  • Frame-by-frame navigation
  • Object tracking across frames
  • Temporal consistency in annotations
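
Once items are listed, they can be grouped client-side for frame-by-frame work. This sketch assumes each item dict carries sequence and frame_index fields; those field names are assumptions about the item schema:

from collections import defaultdict

# Group items into sequences and sort frames for playback
# ("sequence" and "frame_index" are assumed field names;
# check the actual item schema).
sequences = defaultdict(list)
for item in items:
    sequences[item["sequence"]].append(item)

for seq_id, frames in sequences.items():
    frames.sort(key=lambda f: f["frame_index"])
    print(seq_id, "has", len(frames), "frames")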

Projects

A project defines an annotation workflow and connects datasets to annotation tasks.

Project Components

Project
├── Datasets (data sources)
├── Task Type (annotation method)
├── Label Taxonomy (object classes)
├── Quality Control (review stages)
└── Tasks (work units)

Project Status

Projects flow through states:

Status      Description
DRAFT       Being configured, not yet active
ACTIVE      Accepting annotation work
PAUSED      Temporarily stopped
CANCELLED   Permanently stopped
ARCHIVED    Completed and archived
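
For example, you might list only the projects that are currently accepting work; the /projects/ route and the status query parameter below are assumptions:

# List active projects (hypothetical route and filter)
active_projects = requests.get(
    f"{BASE_URL}/projects/",
    headers=headers,
    params={"status": "ACTIVE"},
).json()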

Task Types

Projects support different annotation methods:

Task Type        Description
BOX              2D bounding boxes
POLYGON          Arbitrary polygon shapes
CUBOID           3D bounding boxes
SEGMENTATION     Pixel-level masks
POLYLINE         Line annotations
CLASSIFICATION   Image-level labels
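
As an illustration, creating a 2D box project might look like this; the POST /projects/ route and the field names in the payload are assumptions, not a documented schema:

# Create a BOX project over an existing dataset
# (hypothetical endpoint and payload shape)
project = requests.post(
    f"{BASE_URL}/projects/",
    headers=headers,
    json={
        "name": "Vehicle Detection",
        "task_type": "BOX",
        "datasets": [f"{owner}/{slug}"],
    },
).json()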

Tasks

Tasks are individual work units within a project. Each task represents annotation work to be done on specific data items.

Task Lifecycle

PENDING → ACTIVE → COMPLETED
                       ↓
                   (Review)
                       ↓
            ACCEPTED / REJECTED
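
A sketch of how an annotation client might pick up work, assuming a /tasks/ route that supports a status filter (both the route and the parameter are assumptions):

# Fetch tasks that are waiting to be worked on
# (hypothetical route and query parameter)
tasks = requests.get(
    f"{BASE_URL}/tasks/",
    headers=headers,
    params={"status": "PENDING"},
).json()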

Results

When an annotator completes a task, they submit a result:

  • Contains the annotation data (boxes, polygons, etc.)
  • Includes metadata (time spent, tool used)
  • Goes through quality control review
# Submit annotation result
result = requests.post(
    f"{BASE_URL}/tasks/{task_uid}/results/",
    headers=headers,
    json={
        "data": {
            "objects": [
                {"label": "car", "bbox": [100, 150, 200, 300]}
            ]
        }
    },
)

Organizations

Organizations group users and resources for team collaboration.

Organization Structure

Organization
├── Members (users with roles)
├── Teams (workgroups)
├── Datasets (shared data)
├── Projects (shared workflows)
└── Settings (billing, permissions)

Member Roles

Role     Capabilities
owner    Full control, billing, can delete organization
admin    Manage members, create resources
member   Access shared resources
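
As a sketch, adding a member with a role might look like the following; the members route, the org_slug variable, and the payload fields are all assumptions:

# Add a user to the organization as an admin
# (hypothetical route and payload; adjust to the actual API)
requests.post(
    f"{BASE_URL}/organizations/{org_slug}/members/",
    headers=headers,
    json={"user": "alice", "role": "admin"},
)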

Teams

Teams organize members within an organization:

  • Focus on specific projects or data types
  • Have their own member lists
  • Can be assigned to projects

Labels and Taxonomy

Predefined Labels

Projects use a label taxonomy - a set of predefined labels (classes):

{
  "labels": [
    {"name": "car", "color": "#FF0000"},
    {"name": "pedestrian", "color": "#00FF00"},
    {"name": "cyclist", "color": "#0000FF"}
  ]
}

Object Classification

For complex taxonomies, use classification configs:

  • Hierarchical categories
  • Attributes (color, size, occlusion)
  • Conditional attributes based on class
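
To make this concrete, here is one possible shape for a classification config with a conditional attribute, written as a Python dict in the same spirit as the taxonomy above; the structure is an illustration, not a documented schema:

# Hypothetical classification config: a class hierarchy, shared
# attributes, and an attribute that only applies to one subclass.
classification_config = {
    "classes": [
        {
            "name": "vehicle",
            "subclasses": ["car", "truck", "bus"],
            "attributes": [
                {"name": "occlusion", "type": "select",
                 "options": ["none", "partial", "heavy"]},
                {"name": "has_trailer", "type": "boolean",
                 "condition": {"subclass": "truck"}},  # conditional attribute
            ],
        }
    ]
}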

Quality Control

Avala provides built-in quality assurance:

Reviews

Annotations go through review before acceptance:

  1. Annotator submits result
  2. Reviewer examines annotation
  3. Reviewer accepts or rejects
  4. Rejected work returns for correction
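
A reviewer action might be submitted like this; the reviews route and the verdict field are assumptions, not confirmed by this page:

# Accept a submitted result during review
# (hypothetical route and payload)
requests.post(
    f"{BASE_URL}/tasks/{task_uid}/reviews/",
    headers=headers,
    json={"verdict": "ACCEPTED", "comment": "Tight boxes, correct labels"},
)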

Annotation Issues

Track problems with annotation issues:

  • Flag specific problems
  • Assign to team members
  • Track resolution status
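
For example, flagging a problem on a task might look like the following sketch; the issues route and its fields are assumptions:

# Open an annotation issue and assign it to a teammate
# (hypothetical route and payload)
requests.post(
    f"{BASE_URL}/tasks/{task_uid}/issues/",
    headers=headers,
    json={
        "description": "Bounding box clips the rear bumper",
        "assignee": "alice",
    },
)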

Metrics

Monitor quality with built-in metrics:

  • Acceptance rate
  • Annotation time
  • Inter-annotator agreement
  • Issue frequency

Quality metrics help identify training needs and ensure consistent annotation quality across your team.
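
For example, an acceptance rate can be computed client-side from task records, assuming each task dict exposes a status field with the values shown in the lifecycle above:

# Acceptance rate over reviewed tasks
# ("status" is an assumed field on each task record)
reviewed = [t for t in tasks if t["status"] in ("ACCEPTED", "REJECTED")]
if reviewed:
    rate = sum(t["status"] == "ACCEPTED" for t in reviewed) / len(reviewed)
    print(f"Acceptance rate: {rate:.1%}")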
