Prowler API
Master Prowler's multi-tenant API with RLS, RBAC, and Celery patterns
Prowler API patterns: JSON:API, RLS, RBAC, providers, Celery tasks. Trigger: When working in api/ on models/serializers/viewsets/filters/tasks involving tenant isolation (RLS), RBAC, JSON:API, or provider lifecycle.
Example
User prompt: I need to create a new model for storing scan results that properly isolates data between tenants using RLS
Agent response: Complete model implementation with RowLevelSecurityProtectedModel inheritance, proper constraints, and tenant-aware querysets
Quick Start (3 Steps)
Get up and running in minutes
Install
claude-code skill install prowler-api
Config
First Trigger
@prowler-api help
Commands
| Command | Description | Required Args |
|---|---|---|
| @prowler-api implementing-tenant-isolated-models | Create new models with proper Row-Level Security constraints and tenant isolation for multi-tenant architecture | None |
| @prowler-api building-async-provider-operations | Implement Celery tasks for provider connection testing, data syncing, and lifecycle management with proper tenant context | None |
| @prowler-api implementing-rbac-permissions | Add role-based access control with provider group visibility and permission checks | None |
Typical Use Cases
Implementing Tenant-Isolated Models
Create new models with proper Row-Level Security constraints and tenant isolation for multi-tenant architecture
Building Async Provider Operations
Implement Celery tasks for provider connection testing, data syncing, and lifecycle management with proper tenant context
Implementing RBAC Permissions
Add role-based access control with provider group visibility and permission checks
Overview
When to Use
Use this skill for Prowler-specific patterns:
- Row-Level Security (RLS) / tenant isolation
- RBAC permissions and role checks
- Provider lifecycle and validation
- Celery tasks with tenant context
- Multi-database architecture (4-database setup)
For generic DRF patterns (ViewSets, Serializers, Filters, JSON:API), use the django-drf skill.
Critical Rules
- ALWAYS use `rls_transaction(tenant_id)` when querying outside ViewSet context
- ALWAYS use `get_role()` before checking permissions (returns the FIRST role only)
- ALWAYS use the `@set_tenant` then `@handle_provider_deletion` decorator order
- ALWAYS use explicit through models for M2M relationships (required for RLS)
- NEVER access `Provider.objects` without RLS context in Celery tasks
- NEVER bypass RLS by using raw SQL or `connection.cursor()`
- NEVER use Django's default M2M; RLS requires through models with `tenant_id`
Note: `rls_transaction()` accepts both UUID objects and strings; it converts internally via `str(value)`.
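That conversion can be sketched in isolation. The helper below is hypothetical (the name `normalize_tenant_id` is not Prowler's); it just illustrates why a `uuid.UUID` and its string form end up equivalent:

```python
import uuid

def normalize_tenant_id(value) -> str:
    """Coerce a tenant identifier (uuid.UUID or str) to its canonical
    string form, mirroring the str(value) conversion described above."""
    return str(uuid.UUID(str(value)))  # raises ValueError on malformed input

tid = uuid.uuid4()
assert normalize_tenant_id(tid) == normalize_tenant_id(str(tid))
```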
Architecture Overview
4-Database Architecture
| Database | Alias | Purpose | RLS |
|---|---|---|---|
| `default` | prowler_user | Standard API queries | Yes |
| `admin` | admin | Migrations, auth bypass | No |
| `replica` | prowler_user | Read-only queries | Yes |
| `admin_replica` | admin | Admin read replica | No |
```python
# When to use admin (bypasses RLS)
from api.db_router import MainRouter
User.objects.using(MainRouter.admin_db).get(id=user_id)  # Auth lookups

# Standard queries use default (RLS enforced)
Provider.objects.filter(connected=True)  # Requires rls_transaction context
```
RLS Transaction Flow
Request → Authentication → BaseRLSViewSet.initial()
│
├─ Extract tenant_id from JWT
├─ SET api.tenant_id = 'uuid' (PostgreSQL)
└─ All queries now tenant-scoped
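The flow above can be mimicked with a minimal context-manager sketch. This is illustrative only: the real implementation lives in api/db_utils.py and runs against PostgreSQL, while the `FakeCursor` here merely records statements so the sketch runs anywhere:

```python
from contextlib import contextmanager

@contextmanager
def rls_transaction_sketch(cursor, tenant_id):
    # Scope all queries in the block to one tenant, then clear the setting.
    cursor.execute("SET api.tenant_id = %s", [str(tenant_id)])
    try:
        yield cursor
    finally:
        cursor.execute("RESET api.tenant_id")

class FakeCursor:
    """Stand-in for a DB cursor so the sketch is runnable without PostgreSQL."""
    def __init__(self):
        self.statements = []
    def execute(self, sql, params=None):
        self.statements.append(sql)

cur = FakeCursor()
with rls_transaction_sketch(cur, "tenant-uuid"):
    pass  # tenant-scoped queries would run here
```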
Implementation Checklist
When implementing Prowler-specific API features:
| # | Pattern | Reference | Key Points |
|---|---|---|---|
| 1 | RLS Models | api/rls.py | Inherit RowLevelSecurityProtectedModel, add constraint |
| 2 | RLS Transactions | api/db_utils.py | Use rls_transaction(tenant_id) context manager |
| 3 | RBAC Permissions | api/rbac/permissions.py | get_role(), get_providers(), Permissions enum |
| 4 | Provider Validation | api/models.py | validate_<provider>_uid() methods on Provider model |
| 5 | Celery Tasks | tasks/tasks.py, api/decorators.py, config/celery.py | Task definitions, decorators (@set_tenant, @handle_provider_deletion), RLSTask base |
| 6 | RLS Serializers | api/v1/serializers.py | Inherit RLSSerializer to auto-inject tenant_id |
| 7 | Through Models | api/models.py | ALL M2M must use explicit through with tenant_id |
Full file paths: See references/file-locations.md
Decision Trees
Which Base Model?
Tenant-scoped data → RowLevelSecurityProtectedModel
Global/shared data → models.Model + BaseSecurityConstraint (rare)
Partitioned time-series → PostgresPartitionedModel + RowLevelSecurityProtectedModel
Soft-deletable → Add is_deleted + ActiveProviderManager
Which Manager?
Normal queries → Model.objects (excludes deleted)
Include deleted records → Model.all_objects
Celery task context → Must use rls_transaction() first
Which Database?
Standard API queries → default (automatic via ViewSet)
Read-only operations → replica (automatic for GET in BaseRLSViewSet)
Auth/admin operations → MainRouter.admin_db
Cross-tenant lookups → MainRouter.admin_db (use sparingly!)
Celery Task Decorator Order?
```python
@shared_task(base=RLSTask, name="...", queue="...")
@set_tenant                # First: sets tenant context
@handle_provider_deletion  # Second: handles deleted providers
def my_task(tenant_id, provider_id):
    pass
```
RLS Model Pattern
```python
from uuid import uuid4

from django.db import models

from api.rls import RowLevelSecurityProtectedModel, RowLevelSecurityConstraint

class MyModel(RowLevelSecurityProtectedModel):
    # tenant FK inherited from parent
    id = models.UUIDField(primary_key=True, default=uuid4, editable=False)
    name = models.CharField(max_length=255)
    inserted_at = models.DateTimeField(auto_now_add=True, editable=False)
    updated_at = models.DateTimeField(auto_now=True, editable=False)

    class Meta(RowLevelSecurityProtectedModel.Meta):
        db_table = "my_models"
        constraints = [
            RowLevelSecurityConstraint(
                field="tenant_id",
                name="rls_on_%(class)s",
                statements=["SELECT", "INSERT", "UPDATE", "DELETE"],
            ),
        ]

    class JSONAPIMeta:
        resource_name = "my-models"
```
M2M Relationships (MUST use through models)
```python
class Resource(RowLevelSecurityProtectedModel):
    tags = models.ManyToManyField(
        ResourceTag,
        through="ResourceTagMapping",  # REQUIRED for RLS
    )

class ResourceTagMapping(RowLevelSecurityProtectedModel):
    # Through model MUST have tenant_id for RLS
    resource = models.ForeignKey(Resource, on_delete=models.CASCADE)
    tag = models.ForeignKey(ResourceTag, on_delete=models.CASCADE)

    class Meta:
        constraints = [
            RowLevelSecurityConstraint(
                field="tenant_id",
                name="rls_on_%(class)s",
                statements=["SELECT", "INSERT", "UPDATE", "DELETE"],
            ),
        ]
```
Async Task Response Pattern (202 Accepted)
For long-running operations, return 202 with task reference:
```python
@action(detail=True, methods=["post"], url_name="connection")
def connection(self, request, pk=None):
    with transaction.atomic():
        task = check_provider_connection_task.delay(
            provider_id=pk, tenant_id=self.request.tenant_id
        )
        prowler_task = Task.objects.get(id=task.id)
        serializer = TaskSerializer(prowler_task)
        return Response(
            data=serializer.data,
            status=status.HTTP_202_ACCEPTED,
            headers={"Content-Location": reverse("task-detail", kwargs={"pk": prowler_task.id})},
        )
```
Supported Providers
| Provider | UID Format | Example |
|---|---|---|
| AWS | 12 digits | 123456789012 |
| Azure | UUID v4 | a1b2c3d4-e5f6-... |
| GCP | 6-30 chars, lowercase, letter start | my-gcp-project |
| M365 | Valid domain | contoso.onmicrosoft.com |
| Kubernetes | 2-251 chars | arn:aws:eks:... |
| GitHub | 1-39 chars | my-org |
| IaC | Git URL | https://github.com/user/repo.git |
| Oracle Cloud | OCID format | ocid1.tenancy.oc1.. |
| MongoDB Atlas | 24-char hex | 507f1f77bcf86cd799439011 |
| Alibaba Cloud | 16 digits | 1234567890123456 |
Adding a new provider: add it to the ProviderChoices enum and create a `validate_<provider>_uid()` staticmethod.
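As a sketch of what such validators can look like, the functions below infer regexes from the UID-format table above; the character set allowed for GCP is an assumption, and the real staticmethods on the Provider model may differ:

```python
import re

def validate_aws_uid(uid: str) -> bool:
    # AWS account IDs are exactly 12 digits
    return re.fullmatch(r"\d{12}", uid) is not None

def validate_gcp_uid(uid: str) -> bool:
    # 6-30 chars, lowercase, must start with a letter
    # (allowed characters beyond that are an assumption here)
    return re.fullmatch(r"[a-z][a-z0-9-]{5,29}", uid) is not None

assert validate_aws_uid("123456789012")
assert validate_gcp_uid("my-gcp-project")
assert not validate_aws_uid("12345")
```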
RBAC Permissions
| Permission | Controls |
|---|---|
| `MANAGE_USERS` | User CRUD, role assignments |
| `MANAGE_ACCOUNT` | Tenant settings |
| `MANAGE_BILLING` | Billing/subscription |
| `MANAGE_PROVIDERS` | Provider CRUD |
| `MANAGE_INTEGRATIONS` | Integration config |
| `MANAGE_SCANS` | Scan execution |
| `UNLIMITED_VISIBILITY` | See all providers (bypasses provider_groups) |
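A minimal sketch of how such permission flags can be modeled and checked. Names mirror the table, but the real Permissions enum and role objects live in api/rbac/permissions.py and may be shaped differently:

```python
from enum import Enum
from types import SimpleNamespace

class Permissions(Enum):
    MANAGE_USERS = "manage_users"
    MANAGE_PROVIDERS = "manage_providers"
    UNLIMITED_VISIBILITY = "unlimited_visibility"

def has_permission(role, permission: Permissions) -> bool:
    # Each permission is assumed to be a boolean flag on the role object
    return bool(getattr(role, permission.value, False))

# Illustrative role object (real roles come from get_role())
role = SimpleNamespace(manage_providers=True, unlimited_visibility=False)
assert has_permission(role, Permissions.MANAGE_PROVIDERS)
assert not has_permission(role, Permissions.MANAGE_USERS)
```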
RBAC Visibility Pattern
```python
def get_queryset(self):
    user_role = get_role(self.request.user)
    if user_role.unlimited_visibility:
        return Model.objects.filter(tenant_id=self.request.tenant_id)
    else:
        # Filter by provider_groups assigned to the role
        return Model.objects.filter(provider__in=get_providers(user_role))
```
Celery Queues
| Queue | Purpose |
|---|---|
| `scans` | Prowler scan execution |
| `overview` | Dashboard aggregations (severity, attack surface) |
| `compliance` | Compliance report generation |
| `integrations` | External integrations (Jira, S3, Security Hub) |
| `deletion` | Provider/tenant deletion (async) |
| `backfill` | Historical data backfill operations |
| `scan-reports` | Output generation (CSV, JSON, HTML, PDF) |
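Routing tasks to these queues is plain Celery configuration. A hedged fragment (the task names are illustrative; see config/celery.py for the actual wiring):

```python
# Map task names to the queues in the table above
CELERY_TASK_ROUTES = {
    "scan-perform": {"queue": "scans"},
    "compliance-report": {"queue": "compliance"},
    "tenant-deletion": {"queue": "deletion"},
}
```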
Task Composition (Canvas)
Use Celery’s Canvas primitives for complex workflows:
| Primitive | Use For |
|---|---|
| `chain()` | Sequential execution: A → B → C |
| `group()` | Parallel execution: A, B, C simultaneously |
| Combined | Chain with nested groups for complex workflows |
Note: Use `.si()` (immutable signature) to prevent result passing. Use `.s()` if you need to pass results.
Examples: See assets/celery_patterns.py for chain, group, and combined patterns.
Beat Scheduling (Periodic Tasks)
| Operation | Key Points |
|---|---|
| Create schedule | IntervalSchedule.objects.get_or_create(every=24, period=HOURS) |
| Create periodic task | Use task name (not function), kwargs=json.dumps(...) |
| Delete scheduled task | PeriodicTask.objects.filter(name=...).delete() |
| Avoid race conditions | Use countdown=5 to wait for DB commit |
Examples: See assets/celery_patterns.py for schedule_provider_scan pattern.
Advanced Task Patterns
@set_tenant Behavior
| Mode | tenant_id in kwargs | tenant_id passed to function |
|---|---|---|
| `@set_tenant` (default) | Popped (removed) | NO - function doesn't receive it |
| `@set_tenant(keep_tenant=True)` | Read but kept | YES - function receives it |
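The pop-vs-keep behavior can be reproduced with a small stand-in decorator. This is a behavioral sketch only: the real `@set_tenant` in api/decorators.py also establishes the RLS context, which is elided here:

```python
import functools

def set_tenant_sketch(func=None, *, keep_tenant=False):
    """Illustrative re-implementation of the pop-vs-keep rule above."""
    def decorator(f):
        @functools.wraps(f)
        def wrapper(*args, **kwargs):
            if keep_tenant:
                tenant_id = kwargs["tenant_id"]      # read but kept in kwargs
            else:
                tenant_id = kwargs.pop("tenant_id")  # popped: f never sees it
            # ... the real decorator would enter rls_transaction(tenant_id) here
            return f(*args, **kwargs)
        return wrapper
    return decorator(func) if func is not None else decorator

@set_tenant_sketch
def task_a(provider_id, **kwargs):
    return ("a", provider_id, "tenant_id" in kwargs)

@set_tenant_sketch(keep_tenant=True)
def task_b(provider_id, tenant_id=None, **kwargs):
    return ("b", provider_id, tenant_id)

assert task_a(provider_id="p1", tenant_id="t1") == ("a", "p1", False)
assert task_b(provider_id="p1", tenant_id="t1") == ("b", "p1", "t1")
```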
Key Patterns
| Pattern | Description |
|---|---|
| `bind=True` | Access `self.request.id`, `self.request.retries` |
| `get_task_logger(__name__)` | Proper logging in Celery tasks |
| `SoftTimeLimitExceeded` | Catch to save progress before hard kill |
| `countdown=30` | Defer execution by N seconds |
| `eta=datetime(...)` | Execute at a specific time |
Examples: See assets/celery_patterns.py for all advanced patterns.
Celery Configuration
| Setting | Value | Purpose |
|---|---|---|
| `BROKER_VISIBILITY_TIMEOUT` | 86400 (24h) | Prevent re-queue for long tasks |
| `CELERY_RESULT_BACKEND` | django-db | Store results in PostgreSQL |
| `CELERY_TASK_TRACK_STARTED` | True | Track when tasks start |
| `soft_time_limit` | Task-specific | Raises SoftTimeLimitExceeded |
| `time_limit` | Task-specific | Hard kill (SIGKILL) |
Full config: See assets/celery_patterns.py and the actual files at config/celery.py and config/settings/celery.py.
UUIDv7 for Partitioned Tables
Finding and ResourceFindingMapping use UUIDv7 for time-based partitioning:
```python
from uuid6 import uuid7
from api.uuid_utils import uuid7_start, uuid7_end, datetime_to_uuid7

# Partition-aware filtering
start = uuid7_start(datetime_to_uuid7(date_from))
end = uuid7_end(datetime_to_uuid7(date_to), settings.FINDINGS_TABLE_PARTITION_MONTHS)
queryset.filter(id__gte=start, id__lt=end)
```
Why UUIDv7? Time-ordered UUIDs enable PostgreSQL to prune partitions during range queries.
Batch Operations with RLS
```python
from api.db_utils import batch_delete, create_objects_in_batches, update_objects_in_batches

# Delete in batches (RLS-aware)
batch_delete(tenant_id, queryset, batch_size=1000)

# Bulk create with RLS
create_objects_in_batches(tenant_id, Finding, objects, batch_size=500)

# Bulk update with RLS
update_objects_in_batches(tenant_id, Finding, objects, fields=["status"], batch_size=500)
```
Security Patterns
Full examples: See assets/security_patterns.py
Tenant Isolation Summary
| Pattern | Rule |
|---|---|
| RLS in ViewSets | Automatic via BaseRLSViewSet - tenant_id from JWT |
| RLS in Celery | MUST use @set_tenant + rls_transaction(tenant_id) |
| Cross-tenant validation | Defense-in-depth: verify obj.tenant_id == request.tenant_id |
| Never trust user input | Use request.tenant_id from JWT, never request.data.get("tenant_id") |
| Admin DB bypass | Only for cross-tenant admin ops - exposes ALL tenants’ data |
Celery Task Security Summary
| Pattern | Rule |
|---|---|
| Named tasks only | NEVER use dynamic task names from user input |
| Validate arguments | Check UUID format before database queries |
| Safe queuing | Use transaction.on_commit() to enqueue AFTER commit |
| Modern retries | Use autoretry_for, retry_backoff, retry_jitter |
| Time limits | Set soft_time_limit and time_limit to prevent hung tasks |
| Idempotency | Use update_or_create or idempotency keys |
Quick Reference
```python
# Safe task queuing - task only enqueued after the transaction commits
with transaction.atomic():
    provider = Provider.objects.create(**data)
    transaction.on_commit(
        lambda: verify_provider_connection.delay(
            tenant_id=str(request.tenant_id),
            provider_id=str(provider.id),
        )
    )

# Modern retry pattern
@shared_task(
    base=RLSTask,
    bind=True,
    autoretry_for=(ConnectionError, TimeoutError, OperationalError),
    retry_backoff=True,
    retry_backoff_max=600,
    retry_jitter=True,
    max_retries=5,
    soft_time_limit=300,
    time_limit=360,
)
@set_tenant(keep_tenant=True)  # keep_tenant: the body needs tenant_id for rls_transaction
def sync_provider_data(self, tenant_id, provider_id):
    with rls_transaction(tenant_id):
        # ... task logic
        pass

# Idempotent task - safe to retry
@shared_task(base=RLSTask, acks_late=True)
@set_tenant(keep_tenant=True)
def process_finding(tenant_id, finding_uid, data):
    with rls_transaction(tenant_id):
        Finding.objects.update_or_create(uid=finding_uid, defaults=data)
```
Production Deployment Checklist
Full settings: See references/production-settings.md
Run before every production deployment:
```shell
cd api && poetry run python src/backend/manage.py check --deploy
```
Critical Settings
| Setting | Production Value | Risk if Wrong |
|---|---|---|
| `DEBUG` | False | Exposes stack traces, settings, SQL queries |
| `SECRET_KEY` | Env var, rotated | Session hijacking, CSRF bypass |
| `ALLOWED_HOSTS` | Explicit list | Host header attacks |
| `SECURE_SSL_REDIRECT` | True | Credentials sent over HTTP |
| `SESSION_COOKIE_SECURE` | True | Session cookies over HTTP |
| `CSRF_COOKIE_SECURE` | True | CSRF tokens over HTTP |
| `SECURE_HSTS_SECONDS` | 31536000 (1 year) | Downgrade attacks |
| `CONN_MAX_AGE` | 60 or higher | Connection pool exhaustion |
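The table translates into a Django settings fragment roughly like this - a sketch, assuming an env-var name (`DJANGO_SECRET_KEY`) and example host that are not necessarily Prowler's actual settings module:

```python
import os

DEBUG = False
SECRET_KEY = os.environ.get("DJANGO_SECRET_KEY", "")  # set in the environment, rotated; never hard-coded
ALLOWED_HOSTS = ["api.example.com"]                   # explicit list, never ["*"]
SECURE_SSL_REDIRECT = True
SESSION_COOKIE_SECURE = True
CSRF_COOKIE_SECURE = True
SECURE_HSTS_SECONDS = 31536000                        # 1 year
CONN_MAX_AGE = 60                                     # persistent DB connections
```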
Commands
```shell
# Development
cd api && poetry run python src/backend/manage.py runserver
cd api && poetry run python src/backend/manage.py shell

# Celery
cd api && poetry run celery -A config.celery worker -l info -Q scans,overview
cd api && poetry run celery -A config.celery beat -l info

# Testing
cd api && poetry run pytest -x --tb=short

# Production checks
cd api && poetry run python src/backend/manage.py check --deploy
```
Resources
Local References
- File Locations: See references/file-locations.md
- Modeling Decisions: See references/modeling-decisions.md
- Configuration: See references/configuration.md
- Production Settings: See references/production-settings.md
- Security Patterns: See assets/security_patterns.py
Related Skills
- Generic DRF Patterns: Use the django-drf skill
- API Testing: Use the prowler-test-api skill
Context7 MCP (Recommended)
Prerequisite: Install Context7 MCP server for up-to-date documentation lookup.
When implementing or debugging Prowler-specific patterns, query these libraries via mcp_context7_query-docs:
| Library | Context7 ID | Use For |
|---|---|---|
| Celery | /websites/celeryq_dev_en_stable | Task patterns, queues, error handling |
| django-celery-beat | /celery/django-celery-beat | Periodic task scheduling |
| Django | /websites/djangoproject_en_5_2 | Models, ORM, constraints, indexes |
Example queries:
```
mcp_context7_query-docs(libraryId="/websites/celeryq_dev_en_stable", query="shared_task decorator retry patterns")
mcp_context7_query-docs(libraryId="/celery/django-celery-beat", query="periodic task database scheduler")
mcp_context7_query-docs(libraryId="/websites/djangoproject_en_5_2", query="model constraints CheckConstraint UniqueConstraint")
```
Note: Use `mcp_context7_resolve-library-id` first if you need to find the correct library ID.
Information
- Author: prowler-cloud
- Updated: 2026-01-30
- Category: architecture-patterns