Prowler API

Master Prowler's multi-tenant API with RLS, RBAC, and Celery patterns

Prowler API patterns: JSON:API, RLS, RBAC, providers, Celery tasks. Trigger: When working in api/ on models/serializers/viewsets/filters/tasks involving tenant isolation (RLS), RBAC, JSON:API, or provider lifecycle.

row-level-security multi-tenant rbac celery-tasks django-rest-framework prowler-cloud tenant-isolation provider-lifecycle


Quick Start (3 Steps)

Get up and running in minutes

1. Install: claude-code skill install prowler-api
2. Config
3. First Trigger: @prowler-api help

Commands

| Command | Description | Required Args |
|---|---|---|
| @prowler-api implementing-tenant-isolated-models | Create new models with proper Row-Level Security constraints and tenant isolation for multi-tenant architecture | None |
| @prowler-api building-async-provider-operations | Implement Celery tasks for provider connection testing, data syncing, and lifecycle management with proper tenant context | None |
| @prowler-api implementing-rbac-permissions | Add role-based access control with provider group visibility and permission checks | None |

Typical Use Cases

Implementing Tenant-Isolated Models

Create new models with proper Row-Level Security constraints and tenant isolation for multi-tenant architecture

Building Async Provider Operations

Implement Celery tasks for provider connection testing, data syncing, and lifecycle management with proper tenant context

Implementing RBAC Permissions

Add role-based access control with provider group visibility and permission checks

Overview

When to Use

Use this skill for Prowler-specific patterns:

  • Row-Level Security (RLS) / tenant isolation
  • RBAC permissions and role checks
  • Provider lifecycle and validation
  • Celery tasks with tenant context
  • Multi-database architecture (4-database setup)

For generic DRF patterns (ViewSets, Serializers, Filters, JSON:API), use django-drf skill.


Critical Rules

  • ALWAYS use rls_transaction(tenant_id) when querying outside ViewSet context
  • ALWAYS use get_role() before checking permissions (returns FIRST role only)
  • ALWAYS use @set_tenant then @handle_provider_deletion decorator order
  • ALWAYS use explicit through models for M2M relationships (required for RLS)
  • NEVER access Provider.objects without RLS context in Celery tasks
  • NEVER bypass RLS by using raw SQL or connection.cursor()
  • NEVER use Django’s default M2M - RLS requires through models with tenant_id

Note: rls_transaction() accepts both UUID objects and strings - it converts internally via str(value).
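As a pure-Python illustration of that conversion contract, here is a hypothetical sketch (not Prowler's actual implementation — the real context manager also opens a database transaction and issues the SET against PostgreSQL):

```python
from contextlib import contextmanager
from uuid import UUID, uuid4

@contextmanager
def rls_transaction_sketch(tenant_id, execute):
    """Hypothetical sketch of rls_transaction(): normalize a UUID or str,
    then SET the session variable that RLS policies read."""
    tenant = str(tenant_id)  # str() accepts both UUID objects and strings
    UUID(tenant)             # raises ValueError on a malformed tenant id
    execute("SET api.tenant_id = %s", [tenant])
    yield

# Capture the emitted SQL instead of talking to a real database
statements = []
tid = uuid4()
with rls_transaction_sketch(tid, lambda sql, params: statements.append((sql, params))):
    pass
```

The same call works unchanged with `str(tid)` as input, which is the point of the normalization.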


Architecture Overview

4-Database Architecture

| Database | Alias | Purpose | RLS |
|---|---|---|---|
| default | prowler_user | Standard API queries | Yes |
| admin | admin | Migrations, auth bypass | No |
| replica | prowler_user | Read-only queries | Yes |
| admin_replica | admin | Admin read replica | No |
# When to use admin (bypasses RLS)
from api.db_router import MainRouter
User.objects.using(MainRouter.admin_db).get(id=user_id)  # Auth lookups

# Standard queries use default (RLS enforced)
Provider.objects.filter(connected=True)  # Requires rls_transaction context

RLS Transaction Flow

Request → Authentication → BaseRLSViewSet.initial()
                                    │
                                    ├─ Extract tenant_id from JWT
                                    ├─ SET api.tenant_id = 'uuid' (PostgreSQL)
                                    └─ All queries now tenant-scoped

Implementation Checklist

When implementing Prowler-specific API features:

| # | Pattern | Reference | Key Points |
|---|---|---|---|
| 1 | RLS Models | api/rls.py | Inherit RowLevelSecurityProtectedModel, add constraint |
| 2 | RLS Transactions | api/db_utils.py | Use rls_transaction(tenant_id) context manager |
| 3 | RBAC Permissions | api/rbac/permissions.py | get_role(), get_providers(), Permissions enum |
| 4 | Provider Validation | api/models.py | validate_<provider>_uid() methods on Provider model |
| 5 | Celery Tasks | tasks/tasks.py, api/decorators.py, config/celery.py | Task definitions, decorators (@set_tenant, @handle_provider_deletion), RLSTask base |
| 6 | RLS Serializers | api/v1/serializers.py | Inherit RLSSerializer to auto-inject tenant_id |
| 7 | Through Models | api/models.py | ALL M2M must use explicit through with tenant_id |

Full file paths: See references/file-locations.md


Decision Trees

Which Base Model?

Tenant-scoped data       → RowLevelSecurityProtectedModel
Global/shared data       → models.Model + BaseSecurityConstraint (rare)
Partitioned time-series  → PostgresPartitionedModel + RowLevelSecurityProtectedModel
Soft-deletable           → Add is_deleted + ActiveProviderManager

Which Manager?

Normal queries           → Model.objects (excludes deleted)
Include deleted records  → Model.all_objects
Celery task context      → Must use rls_transaction() first

Which Database?

Standard API queries     → default (automatic via ViewSet)
Read-only operations     → replica (automatic for GET in BaseRLSViewSet)
Auth/admin operations    → MainRouter.admin_db
Cross-tenant lookups     → MainRouter.admin_db (use sparingly!)

Celery Task Decorator Order?

@shared_task(base=RLSTask, name="...", queue="...")
@set_tenant                    # First: sets tenant context
@handle_provider_deletion      # Second: handles deleted providers
def my_task(tenant_id, provider_id):
    pass

RLS Model Pattern

from uuid import uuid4

from django.db import models

from api.rls import RowLevelSecurityProtectedModel, RowLevelSecurityConstraint


class MyModel(RowLevelSecurityProtectedModel):
    # tenant FK inherited from parent
    id = models.UUIDField(primary_key=True, default=uuid4, editable=False)
    name = models.CharField(max_length=255)
    inserted_at = models.DateTimeField(auto_now_add=True, editable=False)
    updated_at = models.DateTimeField(auto_now=True, editable=False)

    class Meta(RowLevelSecurityProtectedModel.Meta):
        db_table = "my_models"
        constraints = [
            RowLevelSecurityConstraint(
                field="tenant_id",
                name="rls_on_%(class)s",
                statements=["SELECT", "INSERT", "UPDATE", "DELETE"],
            ),
        ]

    class JSONAPIMeta:
        resource_name = "my-models"

M2M Relationships (MUST use through models)

class Resource(RowLevelSecurityProtectedModel):
    tags = models.ManyToManyField(
        ResourceTag,
        through="ResourceTagMapping",  # REQUIRED for RLS
    )

class ResourceTagMapping(RowLevelSecurityProtectedModel):
    # Through model MUST have tenant_id for RLS
    resource = models.ForeignKey(Resource, on_delete=models.CASCADE)
    tag = models.ForeignKey(ResourceTag, on_delete=models.CASCADE)

    class Meta:
        constraints = [
            RowLevelSecurityConstraint(
                field="tenant_id",
                name="rls_on_%(class)s",
                statements=["SELECT", "INSERT", "UPDATE", "DELETE"],
            ),
        ]

Async Task Response Pattern (202 Accepted)

For long-running operations, return 202 with task reference:

@action(detail=True, methods=["post"], url_name="connection")
def connection(self, request, pk=None):
    with transaction.atomic():
        task = check_provider_connection_task.delay(
            provider_id=pk, tenant_id=self.request.tenant_id
        )
    prowler_task = Task.objects.get(id=task.id)
    serializer = TaskSerializer(prowler_task)
    return Response(
        data=serializer.data,
        status=status.HTTP_202_ACCEPTED,
        headers={"Content-Location": reverse("task-detail", kwargs={"pk": prowler_task.id})},
    )

Providers (11 Supported)

| Provider | UID Format | Example |
|---|---|---|
| AWS | 12 digits | 123456789012 |
| Azure | UUID v4 | a1b2c3d4-e5f6-... |
| GCP | 6-30 chars, lowercase, letter start | my-gcp-project |
| M365 | Valid domain | contoso.onmicrosoft.com |
| Kubernetes | 2-251 chars | arn:aws:eks:... |
| GitHub | 1-39 chars | my-org |
| IaC | Git URL | https://github.com/user/repo.git |
| Oracle Cloud | OCID format | ocid1.tenancy.oc1.. |
| MongoDB Atlas | 24-char hex | 507f1f77bcf86cd799439011 |
| Alibaba Cloud | 16 digits | 1234567890123456 |

Adding new provider: Add to ProviderChoices enum + create validate_<provider>_uid() staticmethod.
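A minimal sketch of two such validators — the regexes below are illustrative guesses derived from the documented formats, not Prowler's actual rules:

```python
import re

def validate_aws_uid(uid: str) -> bool:
    # AWS account IDs are exactly 12 digits
    return re.fullmatch(r"\d{12}", uid) is not None

def validate_gcp_uid(uid: str) -> bool:
    # 6-30 chars, lowercase, must start with a letter (illustrative regex)
    return re.fullmatch(r"[a-z][a-z0-9-]{5,29}", uid) is not None
```

In the real codebase these live as staticmethods on the Provider model and raise validation errors rather than returning booleans.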


RBAC Permissions

| Permission | Controls |
|---|---|
| MANAGE_USERS | User CRUD, role assignments |
| MANAGE_ACCOUNT | Tenant settings |
| MANAGE_BILLING | Billing/subscription |
| MANAGE_PROVIDERS | Provider CRUD |
| MANAGE_INTEGRATIONS | Integration config |
| MANAGE_SCANS | Scan execution |
| UNLIMITED_VISIBILITY | See all providers (bypasses provider_groups) |

RBAC Visibility Pattern

def get_queryset(self):
    user_role = get_role(self.request.user)
    if user_role.unlimited_visibility:
        return Model.objects.filter(tenant_id=self.request.tenant_id)
    else:
        # Filter by provider_groups assigned to role
        return Model.objects.filter(provider__in=get_providers(user_role))

Celery Queues

| Queue | Purpose |
|---|---|
| scans | Prowler scan execution |
| overview | Dashboard aggregations (severity, attack surface) |
| compliance | Compliance report generation |
| integrations | External integrations (Jira, S3, Security Hub) |
| deletion | Provider/tenant deletion (async) |
| backfill | Historical data backfill operations |
| scan-reports | Output generation (CSV, JSON, HTML, PDF) |

Task Composition (Canvas)

Use Celery’s Canvas primitives for complex workflows:

| Primitive | Use For |
|---|---|
| chain() | Sequential execution: A → B → C |
| group() | Parallel execution: A, B, C simultaneously |
| Combined | Chain with nested groups for complex workflows |

Note: Use .si() (signature immutable) to prevent result passing. Use .s() if you need to pass results.

Examples: See assets/celery_patterns.py for chain, group, and combined patterns.


Beat Scheduling (Periodic Tasks)

| Operation | Key Points |
|---|---|
| Create schedule | IntervalSchedule.objects.get_or_create(every=24, period=HOURS) |
| Create periodic task | Use task name (not function), kwargs=json.dumps(...) |
| Delete scheduled task | PeriodicTask.objects.filter(name=...).delete() |
| Avoid race conditions | Use countdown=5 to wait for DB commit |

Examples: See assets/celery_patterns.py for schedule_provider_scan pattern.


Advanced Task Patterns

@set_tenant Behavior

| Mode | tenant_id in kwargs | tenant_id passed to function |
|---|---|---|
| @set_tenant (default) | Popped (removed) | NO - function doesn’t receive it |
| @set_tenant(keep_tenant=True) | Read but kept | YES - function receives it |
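The pop-vs-keep behavior can be modeled with a small stand-in decorator — a hypothetical reimplementation of the contract, not the real @set_tenant, which also establishes the RLS context:

```python
import functools

def set_tenant_sketch(func=None, *, keep_tenant=False):
    # Hypothetical sketch: read tenant_id from kwargs and, unless
    # keep_tenant=True, remove it before calling the task body.
    def decorator(f):
        @functools.wraps(f)
        def wrapper(*args, **kwargs):
            tenant_id = kwargs["tenant_id"] if keep_tenant else kwargs.pop("tenant_id")
            # ... the real decorator would SET api.tenant_id here ...
            return f(*args, **kwargs)
        return wrapper
    return decorator(func) if func is not None else decorator

@set_tenant_sketch
def task_default(**kwargs):
    return kwargs  # does NOT receive tenant_id

@set_tenant_sketch(keep_tenant=True)
def task_keep(**kwargs):
    return kwargs  # DOES receive tenant_id
```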

Key Patterns

| Pattern | Description |
|---|---|
| bind=True | Access self.request.id, self.request.retries |
| get_task_logger(__name__) | Proper logging in Celery tasks |
| SoftTimeLimitExceeded | Catch to save progress before hard kill |
| countdown=30 | Defer execution by N seconds |
| eta=datetime(...) | Execute at specific time |

Examples: See assets/celery_patterns.py for all advanced patterns.


Celery Configuration

| Setting | Value | Purpose |
|---|---|---|
| BROKER_VISIBILITY_TIMEOUT | 86400 (24h) | Prevent re-queue for long tasks |
| CELERY_RESULT_BACKEND | django-db | Store results in PostgreSQL |
| CELERY_TASK_TRACK_STARTED | True | Track when tasks start |
| soft_time_limit | Task-specific | Raises SoftTimeLimitExceeded |
| time_limit | Task-specific | Hard kill (SIGKILL) |

Full config: See assets/celery_patterns.py and actual files at config/celery.py, config/settings/celery.py.


UUIDv7 for Partitioned Tables

Finding and ResourceFindingMapping use UUIDv7 for time-based partitioning:

from uuid6 import uuid7
from api.uuid_utils import uuid7_start, uuid7_end, datetime_to_uuid7

# Partition-aware filtering
start = uuid7_start(datetime_to_uuid7(date_from))
end = uuid7_end(datetime_to_uuid7(date_to), settings.FINDINGS_TABLE_PARTITION_MONTHS)
queryset.filter(id__gte=start, id__lt=end)

Why UUIDv7? Time-ordered UUIDs enable PostgreSQL to prune partitions during range queries.
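To see why time-ordering helps, here is a toy UUIDv7 builder (random bits zeroed for clarity; the real uuid6.uuid7 fills them) showing that later timestamps produce greater ids, which is what lets range filters on id prune partitions:

```python
import uuid

def uuid7_sketch(ts_ms: int) -> uuid.UUID:
    # Toy UUIDv7: 48-bit Unix-ms timestamp in the high bits, version 7,
    # RFC variant; random bits zeroed so ordering depends only on time.
    value = (ts_ms & 0xFFFFFFFFFFFF) << 80
    value |= 0x7 << 76   # version field = 7
    value |= 0x2 << 62   # RFC 4122 variant
    return uuid.UUID(int=value)

earlier = uuid7_sketch(1_700_000_000_000)
later = uuid7_sketch(1_700_000_001_000)
```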


Batch Operations with RLS

from api.db_utils import batch_delete, create_objects_in_batches, update_objects_in_batches

# Delete in batches (RLS-aware)
batch_delete(tenant_id, queryset, batch_size=1000)

# Bulk create with RLS
create_objects_in_batches(tenant_id, Finding, objects, batch_size=500)

# Bulk update with RLS
update_objects_in_batches(tenant_id, Finding, objects, fields=["status"], batch_size=500)
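Batch size matters because each batch is processed as its own unit of work; a generic chunking helper in the spirit of these utilities (the RLS wrapping itself is omitted) looks like:

```python
from itertools import islice

def chunked(iterable, batch_size):
    # Yield successive lists of at most batch_size items; Prowler's batch
    # helpers additionally wrap each batch in rls_transaction().
    it = iter(iterable)
    while batch := list(islice(it, batch_size)):
        yield batch
```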

Security Patterns

Full examples: See assets/security_patterns.py

Tenant Isolation Summary

| Pattern | Rule |
|---|---|
| RLS in ViewSets | Automatic via BaseRLSViewSet - tenant_id from JWT |
| RLS in Celery | MUST use @set_tenant + rls_transaction(tenant_id) |
| Cross-tenant validation | Defense-in-depth: verify obj.tenant_id == request.tenant_id |
| Never trust user input | Use request.tenant_id from JWT, never request.data.get("tenant_id") |
| Admin DB bypass | Only for cross-tenant admin ops - exposes ALL tenants’ data |

Celery Task Security Summary

| Pattern | Rule |
|---|---|
| Named tasks only | NEVER use dynamic task names from user input |
| Validate arguments | Check UUID format before database queries |
| Safe queuing | Use transaction.on_commit() to enqueue AFTER commit |
| Modern retries | Use autoretry_for, retry_backoff, retry_jitter |
| Time limits | Set soft_time_limit and time_limit to prevent hung tasks |
| Idempotency | Use update_or_create or idempotency keys |
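The idempotency rule can be illustrated without a database: an update_or_create-style upsert makes a retried task converge on the same final state instead of duplicating work (the dict below is a stand-in for the table):

```python
def upsert(store: dict, uid: str, defaults: dict) -> bool:
    """update_or_create sketch: returns True only when a new row is created."""
    created = uid not in store
    store.setdefault(uid, {}).update(defaults)
    return created

findings = {}
first = upsert(findings, "finding-1", {"status": "PASS"})   # creates
retry = upsert(findings, "finding-1", {"status": "PASS"})   # safe re-delivery
```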

Quick Reference

# Safe task queuing - task only enqueued after transaction commits
with transaction.atomic():
    provider = Provider.objects.create(**data)
    transaction.on_commit(
        lambda: verify_provider_connection.delay(
            tenant_id=str(request.tenant_id),
            provider_id=str(provider.id)
        )
    )

# Modern retry pattern
@shared_task(
    base=RLSTask,
    bind=True,
    autoretry_for=(ConnectionError, TimeoutError, OperationalError),
    retry_backoff=True,
    retry_backoff_max=600,
    retry_jitter=True,
    max_retries=5,
    soft_time_limit=300,
    time_limit=360,
)
@set_tenant
def sync_provider_data(self, tenant_id, provider_id):
    with rls_transaction(tenant_id):
        # ... task logic
        pass

# Idempotent task - safe to retry
@shared_task(base=RLSTask, acks_late=True)
@set_tenant
def process_finding(tenant_id, finding_uid, data):
    with rls_transaction(tenant_id):
        Finding.objects.update_or_create(uid=finding_uid, defaults=data)

Production Deployment Checklist

Full settings: See references/production-settings.md

Run before every production deployment:

cd api && poetry run python src/backend/manage.py check --deploy

Critical Settings

| Setting | Production Value | Risk if Wrong |
|---|---|---|
| DEBUG | False | Exposes stack traces, settings, SQL queries |
| SECRET_KEY | Env var, rotated | Session hijacking, CSRF bypass |
| ALLOWED_HOSTS | Explicit list | Host header attacks |
| SECURE_SSL_REDIRECT | True | Credentials sent over HTTP |
| SESSION_COOKIE_SECURE | True | Session cookies over HTTP |
| CSRF_COOKIE_SECURE | True | CSRF tokens over HTTP |
| SECURE_HSTS_SECONDS | 31536000 (1 year) | Downgrade attacks |
| CONN_MAX_AGE | 60 or higher | Connection pool exhaustion |
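A hedged sketch of how these settings look in a Django settings module — the host name is illustrative, and real secrets belong in environment or secret management, never in source:

```python
import os

# Production hardening sketch; values mirror the table above
DEBUG = False
SECRET_KEY = os.environ["DJANGO_SECRET_KEY"]   # from env, rotated
ALLOWED_HOSTS = ["api.example.com"]            # explicit, illustrative host
SECURE_SSL_REDIRECT = True
SESSION_COOKIE_SECURE = True
CSRF_COOKIE_SECURE = True
SECURE_HSTS_SECONDS = 31_536_000               # 1 year

DATABASES = {
    "default": {
        # ... engine, name, credentials ...
        "CONN_MAX_AGE": 60,                    # persistent DB connections
    },
}
```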

Commands

# Development
cd api && poetry run python src/backend/manage.py runserver
cd api && poetry run python src/backend/manage.py shell

# Celery
cd api && poetry run celery -A config.celery worker -l info -Q scans,overview
cd api && poetry run celery -A config.celery beat -l info

# Testing
cd api && poetry run pytest -x --tb=short

# Production checks
cd api && poetry run python src/backend/manage.py check --deploy

Resources

Local References

  • Generic DRF Patterns: Use django-drf skill
  • API Testing: Use prowler-test-api skill

Prerequisite: Install Context7 MCP server for up-to-date documentation lookup.

When implementing or debugging Prowler-specific patterns, query these libraries via mcp_context7_query-docs:

| Library | Context7 ID | Use For |
|---|---|---|
| Celery | /websites/celeryq_dev_en_stable | Task patterns, queues, error handling |
| django-celery-beat | /celery/django-celery-beat | Periodic task scheduling |
| Django | /websites/djangoproject_en_5_2 | Models, ORM, constraints, indexes |

Example queries:

mcp_context7_query-docs(libraryId="/websites/celeryq_dev_en_stable", query="shared_task decorator retry patterns")
mcp_context7_query-docs(libraryId="/celery/django-celery-beat", query="periodic task database scheduler")
mcp_context7_query-docs(libraryId="/websites/djangoproject_en_5_2", query="model constraints CheckConstraint UniqueConstraint")

Note: Use mcp_context7_resolve-library-id first if you need to find the correct library ID.


Environment Matrix

Dependencies

Django 4.2+
Django REST Framework
Celery 5.3+
PostgreSQL 14+
Redis (Celery broker)
uuid6 (UUIDv7 support)

Framework Support

  • Django REST Framework ✓ (recommended)
  • Celery ✓ (recommended)
  • django-celery-beat ✓
  • JSON:API ✓

Context Window

Token Usage: ~5K-15K tokens for complex multi-model implementations

Information

Author: prowler-cloud
Updated: 2026-01-30
Category: architecture-patterns