Plug in your camera. Define what to watch for. SightBricks runs vision AI directly on the edge device — no cloud round-trip, no latency, no data leaving your site.
| Time | Policy | Details |
|---|---|---|
Every policy runs continuously against the live video stream. Each type is purpose-built for a different class of question.
Describe any critical object or hazard in plain English — fire, spill, overflowing bin, blocked exit — and the vision-language model (VLM) scans every incoming frame, drawing a precise bounding box around each instance. You get pixel coordinates, a confidence score, and an annotated frame snapshot saved for every alert.
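A detection alert carrying those three pieces of information might look like the sketch below. The field names and paths are illustrative, not the actual SightBricks schema:

```python
from dataclasses import dataclass, asdict

@dataclass
class DetectionEvent:
    """Illustrative shape of a detection alert (hypothetical field names)."""
    policy: str        # the plain-English prompt the VLM matched
    bbox: tuple        # pixel coordinates: (x_min, y_min, x_max, y_max)
    confidence: float  # model confidence in [0, 1]
    snapshot: str      # annotated JPEG saved locally for this alert

event = DetectionEvent(
    policy="spill on floor",
    bbox=(412, 280, 655, 390),
    confidence=0.91,
    snapshot="/var/sightbricks/events/0001.jpg",
)
print(asdict(event))
```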
Ask any free-text question about the scene — the VLM reads the frame and returns a natural language answer in real time. Ideal for situational awareness, safety compliance, status monitoring, or anything a human would answer by looking at a camera.
Bring any camera you already own. SightBricks ships pre-configured on Nvidia Jetson, Nvidia's most powerful compact AI platform.
Any camera you already own. No proprietary hardware lock-in.
Ships pre-loaded with the full SightBricks stack. Ready to run the moment it arrives — no setup required.
Each layer is independently scalable. Inference never waits on storage, network, or sync.
Plug any USB, RTSP, or CSI camera into the Jetson device. SightBricks auto-detects it and starts ingesting frames immediately.
Add detection or VQA policies from the web UI — no code. Each runs as a dedicated VLM worker with its own inference queue.
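One-worker-per-policy with a private queue can be sketched with the standard library; a slow policy then never blocks another. The inference step here is a stub standing in for the on-device VLM:

```python
import queue
import threading

def run_policy_worker(name, frames, results):
    """One worker per policy, each draining its own inference queue."""
    while True:
        frame = frames.get()
        if frame is None:  # sentinel: shut the worker down
            break
        # Stand-in for VLM inference on this frame.
        results.put((name, f"processed frame {frame}"))

frames, results = queue.Queue(), queue.Queue()
worker = threading.Thread(
    target=run_policy_worker, args=("fire-detect", frames, results)
)
worker.start()
frames.put(1)
frames.put(None)
worker.join()
out = results.get()
print(out)
```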
Events appear live in the dashboard with annotated frames and VQA answers. Everything syncs to your cloud asynchronously.
Detection bboxes, VQA answers, and annotated JPEG frames are published to an async upload queue after local save. The inference pipeline is never blocked by network.
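The save-first, upload-later pattern is easy to see in miniature. This is a minimal sketch, assuming a background uploader drains the queue; names and paths are illustrative:

```python
import json
import queue
import tempfile
from pathlib import Path

upload_queue = queue.Queue()  # drained by a background uploader (not shown)

def record_event(event: dict, out_dir: Path) -> Path:
    """Persist the event locally first, then enqueue it for async upload,
    so a dead network can never stall the inference pipeline."""
    out_dir.mkdir(parents=True, exist_ok=True)
    path = out_dir / f"{event['id']}.json"
    path.write_text(json.dumps(event))
    upload_queue.put(path)  # non-blocking; uploader retries independently
    return path

saved = record_event(
    {"id": "0001", "policy": "fire", "confidence": 0.88},
    Path(tempfile.mkdtemp()),
)
print(saved.name)
```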
Every detection event is potential training data. Auto-label captures using existing policies, then fine-tune with LoRA to specialise the model for your exact scenario.
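The auto-label step can be pictured as converting stored detection events into supervised examples. The row format below is illustrative only; the real fine-tuning format depends on the base VLM:

```python
def events_to_training_rows(events):
    """Turn logged detection events into prompt/target pairs for LoRA
    fine-tuning. Field names are hypothetical, mirroring a simple event log."""
    rows = []
    for ev in events:
        rows.append({
            "image": ev["snapshot"],                 # annotated frame on disk
            "prompt": f"Detect: {ev['policy']}",     # the original policy text
            "target": {"bbox": ev["bbox"], "label": ev["policy"]},
        })
    return rows

rows = events_to_training_rows([
    {"snapshot": "/var/sightbricks/events/0001.jpg",
     "policy": "blocked exit",
     "bbox": [10, 20, 110, 220]},
])
print(rows[0]["prompt"])
```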
We're working with a small number of teams to deploy SightBricks on Nvidia Jetson hardware. Tell us about your use case and we'll reach out.