Powered by Nvidia Jetson Orin Nano

Detect anything.
Answer anything.
At the Edge.

Plug in your camera. Define what to watch for. SightBricks runs vision AI directly on the edge device — no cloud round-trip, no network latency, no data leaving your site.

sightbricks — live pipeline · LIVE

Policies
  • DET — Fire Detection (Safety): "fire or smoke"
  • DET — Bin Overflow (Waste Management): "overflowing garbage bin"
  • DET — No Hard Hat (Construction): "worker without hard hat"
  • DET — Forklift Zone (Warehouse): "forklift near pedestrian"
  • VQA — Exit Blocked? (Compliance): "Is the emergency exit blocked?"
Policy Types

Two ways to understand
what your camera sees.

Every policy runs continuously against the live video stream. Each type is purpose-built for a different class of question.

DETECT

Object Detection

Describe any critical object or hazard in plain English — fire, spill, overflowing bin, blocked exit — and the VLM scans every incoming frame, drawing a precise bounding box around each instance. You get pixel coordinates, a confidence score, and an annotated frame snapshot saved for every alert.

Example output
Policy: Object Detection
Query: "fire or smoke"
Result: bbox [120, 60, 380, 420] · conf 0.97 🔴
  • Detect fire, smoke, spills, overflows — any visual hazard
  • Bounding boxes drawn on every saved alert frame
  • Describe the hazard in plain English — no pre-trained classes
  • Instant event saved with timestamp for audit trail
Annotated frame: fire · 97% · smoke · 88%
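For illustration, a saved detection alert might be assembled like the sketch below. The field names (`policy`, `bbox`, `confidence`, `frame`) are assumptions for this example, not the actual SightBricks schema:

```python
import json
from datetime import datetime, timezone

def detection_event(policy, query, bbox, confidence, frame_path):
    """Build an alert record roughly as described above (field names illustrative)."""
    return {
        "type": "detect",
        "policy": policy,
        "query": query,             # the plain-English hazard description
        "bbox": bbox,               # [x1, y1, x2, y2] pixel coordinates
        "confidence": confidence,   # VLM confidence score, 0..1
        "frame": frame_path,        # annotated JPEG snapshot saved with the alert
        "timestamp": datetime.now(timezone.utc).isoformat(),  # for the audit trail
    }

event = detection_event("Fire Detection", "fire or smoke",
                        [120, 60, 380, 420], 0.97, "frames/alert_0001.jpg")
print(json.dumps(event, indent=2))
```

The timestamp and frame path are what make each alert auditable after the fact.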
VQA

Visual Question Answering

Ask any free-text question about the scene — the VLM reads the frame and returns a natural language answer in real time. Ideal for situational awareness, safety compliance, status monitoring, or anything a human would answer by looking at a camera.

Example output
Policy: Visual Q&A
Question: "Is anyone wearing a hard hat?"
Answer: "No — one worker is not wearing PPE."
  • Ask anything in plain English — no training required
  • Answers stored with timestamp and frame snapshot
  • Safety checks, compliance, status monitoring
  • Trigger alerts based on specific answer patterns
Q: Is anyone wearing a hard hat?
A: No — one worker is not wearing PPE.
14:32:07 · frame #1482
Q: Is the emergency exit clear?
A: Yes, the exit path is unobstructed.
14:31:55 · frame #1470
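Triggering alerts on answer patterns can be pictured as a simple text match over the returned answer. The `should_alert` helper and its patterns are illustrative, not the actual trigger API:

```python
import re

def should_alert(answer: str, patterns: list[str]) -> bool:
    """Return True if the VQA answer matches any alert pattern
    (illustrative helper; real trigger rules would live in the policy config)."""
    return any(re.search(p, answer, re.IGNORECASE) for p in patterns)

# An answer that indicates a PPE violation trips the alert:
should_alert("No — one worker is not wearing PPE.", [r"\bnot wearing\b"])   # True
# A compliant answer does not:
should_alert("Yes, all workers are wearing hard hats.", [r"\bnot wearing\b"])  # False
```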
Hardware

Your camera. Our edge.
Zero compromise.

Bring any camera you already own. SightBricks ships pre-configured on Nvidia's most powerful compact AI platform.

Bring your own camera

Any camera you already own. No proprietary hardware lock-in.

  • USB — any USB webcam or industrial camera
  • RTSP — IP cameras, NVRs, security systems
  • CSI — ribbon cameras on Jetson carrier boards
  • Multi — up to N concurrent streams
connects to
SightBricks Edge Device

Nvidia Jetson Orin Nano

Ships pre-loaded with the full SightBricks stack. Ready to run the moment it arrives — no setup required.

40 TOPS AI performance
8 GB LPDDR5 RAM
1,024 CUDA cores
  • Pre-loaded with SightBricks software
  • Ships with power adapter and M.2 SSD
  • Designed to run 24/7 in industrial environments
  • Remote management via cloud dashboard
Architecture

Five layers. One
unified pipeline.

Each layer is independently scalable. Inference never waits on storage, network, or sync.

Layer 1 — Camera: USB · RTSP · CSI (asyncio ring buffer)
Layer 2 — Policies: Detect · VQA (hot-reload rules)
Layer 3 — VLM Inference: GPU workers per policy
Layer 4 — Local Storage: SQLite · JPEG frames on-device
Layer 5 — Cloud Sync: S3 · Kafka · TimescaleDB
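The layer hand-off can be sketched with plain asyncio queues — a toy model, not the actual SightBricks code, showing how queues between stages keep inference from ever waiting on storage or sync:

```python
import asyncio

async def pipeline_demo():
    """Toy five-layer hand-off: each stage reads from one queue and writes to
    the next, so a slow downstream stage never blocks inference directly."""
    frames = asyncio.Queue(maxsize=8)   # Layer 1: bounded, ring-buffer-style
    events = asyncio.Queue()            # Layer 3 -> Layer 4: inference results
    uploads = asyncio.Queue()           # Layer 4 -> Layer 5: async cloud sync

    async def camera():                 # Layer 1: ingest frames
        for i in range(3):
            await frames.put(f"frame-{i}")
        await frames.put(None)          # sentinel: stream ended

    async def inference():              # Layers 2-3: run policies per frame
        while (frame := await frames.get()) is not None:
            await events.put({"frame": frame, "result": "ok"})
        await events.put(None)

    async def storage():                # Layer 4: save locally, then queue for sync
        while (event := await events.get()) is not None:
            await uploads.put(event)
        await uploads.put(None)

    stored = []
    async def sync():                   # Layer 5: drain the upload queue
        while (event := await uploads.get()) is not None:
            stored.append(event)

    await asyncio.gather(camera(), inference(), storage(), sync())
    return stored

results = asyncio.run(pipeline_demo())
print(len(results))  # → 3
```

The bounded frame queue is the key design choice: if inference falls behind, the camera stage backs off instead of the whole pipeline stalling.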
01

Connect your camera

Plug any USB, RTSP, or CSI camera into the Jetson device. SightBricks auto-detects it and starts ingesting frames immediately.

02

Define policies

Add detection or VQA policies from the web UI — no code. Each runs as a dedicated VLM worker with its own inference queue.
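Conceptually, a policy is just a named rule handed to a worker. A minimal Python sketch — the `Policy` shape and `kind` values are assumptions for illustration, not the actual data model — might look like:

```python
from dataclasses import dataclass
from typing import Literal

@dataclass
class Policy:
    """One watch rule; `kind` selects which worker type runs it (names illustrative)."""
    name: str
    category: str
    kind: Literal["detect", "vqa"]
    query: str  # plain-English object description or question

POLICIES = [
    Policy("Fire Detection", "Safety", "detect", "fire or smoke"),
    Policy("Exit Blocked?", "Compliance", "vqa", "Is the emergency exit blocked?"),
]
```

Each entry would map to its own VLM worker and inference queue, so adding or removing a policy never interrupts the others.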

03

Monitor & sync

Events appear live in the dashboard with annotated frames and VQA answers. Everything syncs to your cloud asynchronously.

Cloud Sync

Your data flows
where you need it.

Detection bboxes, VQA answers, and annotated JPEG frames are published to an async upload queue after the local save. The inference pipeline is never blocked by the network.

  • Amazon S3, Google Cloud Storage, Azure Blob
  • Kafka topic streaming for real-time consumers
  • TimescaleDB for time-series analytics
  • Retry queue with exponential backoff
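The retry behaviour can be sketched as a plain exponential-backoff schedule; the base delay and cap below are illustrative values, not the shipped configuration:

```python
def backoff_delays(base: float = 1.0, cap: float = 60.0, attempts: int = 5) -> list[float]:
    """Exponential backoff schedule: delay doubles each retry, capped at `cap`
    seconds (parameters are illustrative, not the actual defaults)."""
    return [min(cap, base * (2 ** attempt)) for attempt in range(attempts)]

print(backoff_delays())  # → [1.0, 2.0, 4.0, 8.0, 16.0]
```

In practice a jitter term is usually added to each delay so that many devices recovering from the same outage do not retry in lockstep.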
Jetson Orin Nano → detection events JSON · VQA answers + timestamps · annotated JPEG frames → Cloud Storage
1. Capture labelled detection events
2. Run LoRA fine-tuning on your dataset
3. Evaluate on held-out validation set
4. Deploy updated model to edge device
Fine-tuning

Models that learn
from your environment.

Every detection event is potential training data. Auto-label captures using existing policies, then fine-tune with LoRA to specialise the model for your exact scenario.

  • Auto-labelling from policy detections
  • LoRA fine-tuning on-device or in cloud
  • One-click deploy to Jetson pipeline
  • Accuracy tracking across model versions
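Auto-labelling can be pictured as a simple transform from a saved detection event into a training sample: the policy's plain-English query becomes the prompt and the detected bbox becomes the label. The field names here are illustrative, not the actual schema:

```python
def to_training_record(event: dict) -> dict:
    """Turn one saved detection event into a fine-tuning sample
    (auto-labelling sketch; field names are illustrative)."""
    return {
        "image": event["frame"],          # the saved alert snapshot
        "prompt": event["query"],         # the policy text doubles as the label prompt
        "label": {"bbox": event["bbox"], "class": event["policy"]},
    }

record = to_training_record({
    "policy": "Fire Detection",
    "query": "fire or smoke",
    "bbox": [120, 60, 380, 420],
    "frame": "frames/alert_0001.jpg",
})
```

Because every alert already carries a query, a bbox, and a frame, the dataset accumulates as a side effect of normal operation.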
Private Beta

Apply for
early access.

We're working with a small number of teams to deploy SightBricks on Nvidia Jetson hardware. Tell us about your use case and we'll reach out.

  • 🎯 Reviewed personally — every application is read by the founding team.
  • 📦 Ships with hardware — we ship the Jetson device pre-configured and ready.
  • 🔒 Your data stays private — all inference runs on-device. Nothing leaves your network.

We'll respond within 2 business days.