The example consists of two primary applications:
Camera App (chip-camera-app): This application functions as the Matter accessory. It captures and encodes a video feed, making it available for streaming over a WebRTC connection. It is typically run on a device with a camera, such as a Raspberry Pi.
Camera Controller (chip-camera-controller): This application acts as the client. It commissions the Camera App onto the Matter network, discovers it, and receives and displays the video stream.
Before building, you must install the necessary GStreamer libraries and development packages, which are used for video processing and streaming.
sudo apt update
sudo apt install \
    gstreamer1.0-plugins-base \
    gstreamer1.0-plugins-good \
    gstreamer1.0-plugins-bad \
    gstreamer1.0-libav \
    libgstreamer1.0-dev \
    libgstreamer-plugins-base1.0-dev
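To verify the installation, you can query the installed plugins. Note that gst-inspect-1.0 ships in the gstreamer1.0-tools package, which may need to be installed separately:

# Confirm GStreamer tooling is present and the v4l2 capture plugin is visible
gst-inspect-1.0 --version
gst-inspect-1.0 v4l2src | head -n 5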
Activate the Matter build environment:

source scripts/activate.sh
To build the Camera App, follow the build instructions in examples/camera-app/linux/README.md.
You can either build the applications for a local Linux machine or cross-compile them for a Raspberry Pi.
Option A: Build for a Local Linux Machine (x86_64)
This is the simplest method for testing the camera pipeline on a single Linux computer.
# From the repository root, compile the Linux x86-64 camera-controller target
./scripts/build/build_examples.py \
    --target linux-x64-camera-controller \
    build
The resulting executable is placed in:
out/linux-x64-camera-controller/chip-camera-controller
Option B: Cross-Compile for Raspberry Pi (arm64)
To run an application on a Raspberry Pi, you must cross-compile it from an x86_64 host machine. The recommended method is to use the provided Docker build environment to ensure all dependencies are correct.
docker pull ghcr.io/project-chip/chip-build-crosscompile:167
docker run -it -v ~/connectedhomeip:/var/connectedhomeip \
    ghcr.io/project-chip/chip-build-crosscompile:167 \
    /bin/bash
cd /var/connectedhomeip

# Required to fix git repository ownership issues inside Docker
git config --global --add safe.directory /var/connectedhomeip
git config --global --add safe.directory /var/connectedhomeip/third_party/pigweed/repo

# Run the build script for the Raspberry Pi (arm64) target
./scripts/run_in_build_env.sh \
    "./scripts/build/build_examples.py \
    --target linux-arm64-camera-controller-clang \
    build"
# The binary path on your host machine:
# out/linux-arm64-camera-controller-clang/chip-camera-controller

# SCP command to transfer the file to the Raspberry Pi
scp ./out/linux-arm64-camera-controller-clang/chip-camera-controller \
    ubuntu@<RASPBERRY_PI_IP_ADDRESS>:/home/ubuntu
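Before copying, it can be worth confirming that the build really produced an arm64 binary; a quick check with the standard file utility:

# Expect output mentioning "ELF 64-bit LSB ... ARM aarch64"
file ./out/linux-arm64-camera-controller-clang/chip-camera-controller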
After building the applications using Option A, you can run the full end-to-end example on your Linux machine. You will need two separate terminal windows.
Terminal 1: Start the Camera App (Device)
Clean up any existing configurations (first-time pairing only):
sudo rm -rf /tmp/chip_*
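The glob above wipes the persisted state left by a previous run. If you want to inspect that state first, you can list the matching files; names such as chip_kvs and chip_config.ini are typical for the Linux examples but vary by version, so treat them as an assumption:

# Show any persisted Matter state from earlier runs (may be empty)
ls -l /tmp/chip_* 2>/dev/null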
Note (YUYV Only): This demo supports only the uncompressed YUYV pixel format for capture. Identify the correct video device first (examples: video0, video1). You can list devices and verify format support with:
v4l2-ctl --list-devices
v4l2-ctl -d /dev/video0 --list-formats-ext | grep -A4 YUYV || true
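For an optional end-to-end sanity check of YUYV capture before launching the app, a minimal GStreamer pipeline can render the feed directly. This assumes /dev/video0 and the gstreamer1.0-tools package; YUY2 is GStreamer's name for the YUYV format:

# Render the raw YUYV feed in a preview window; press Ctrl+C to stop
gst-launch-1.0 v4l2src device=/dev/video0 ! video/x-raw,format=YUY2 ! videoconvert ! autovideosink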
Launch the camera app, explicitly selecting the device (replace video0 if needed):
./out/linux-x64-camera/chip-camera-app --camera-video-device video0
Terminal 2: Launch and Use the Camera Controller (Client)
./out/linux-x64-camera-controller/chip-camera-controller
At the controller's interactive prompt, commission the device. The onnetwork keyword specifies pairing over the local IP network, 1 is the node ID to assign, and 20202021 is the well-known setup passcode used by the Matter example apps:
pairing onnetwork 1 20202021
Wait for the command to complete and confirm that commissioning was successful.
To start streaming, use the liveview start command followed by the node ID you set during pairing. For example, if your node ID is 1, you can request a stream with a minimum resolution of 640x480 pixels and a minimum frame rate of 30 frames per second:

liveview start 1 --min-res-width 640 --min-res-height 480 --min-framerate 30
To see what video formats and resolutions your camera supports, first list the available video devices, then check the formats for your specific device:
# List all available video devices
v4l2-ctl --list-devices

# Check formats for a specific device (replace /dev/video0 with your device)
v4l2-ctl -d /dev/video0 --list-formats-ext
Wave your hand in front of the camera to trigger live view; a video window will appear, confirming that the stream is active.
The Push AV Server acts as the recording destination (like a cloud service) for video clips. The demo shows how a Matter Camera Controller tells a Matter Camera to record a clip and upload it to that server.
The server itself doesn't control the camera; it's a passive service that securely receives and stores media. The relationship is established by the controller.
The Push AV Server runs in the background, waiting for authenticated devices to push video content to it. The key piece of information it provides is its ingest URL, which, for example, looks like this: https://localhost:1234/streams/1
Think of this server as a secure, private YouTube or Dropbox, but just for your camera's video clips.
Here are the commands to set up a local Push AV Server inside a Python virtual environment (recommended):
# Create a virtual environment
python3 -m venv pavs_env

# Activate it
source pavs_env/bin/activate

# Install dependencies
pip install uvicorn fastapi jinja2 cryptography zeroconf pydantic

# Run the server
python server.py --working-directory ~/.pavstest
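Once the server is up, you can confirm it is listening on the ingest port. The port 1234 here is taken from the example ingest URL above, so adjust it if your server reports a different one:

# Show listening TCP sockets and filter for the ingest port
ss -tlnp | grep 1234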
The chip-camera-controller acts as the “brain” of the operation. It's what you use to configure the system: it tells the camera where to send its recordings.
First, build and launch the Camera App:

./scripts/examples/gn_build_example.sh examples/camera-app/linux out/debug
./out/debug/chip-camera-app
Next, build and launch the chip-camera-controller to drive the clip recording and upload steps:

./scripts/build/build_examples.py --target linux-x64-camera-controller build
./out/linux-x64-camera-controller/chip-camera-controller
At the controller prompt, commission the camera with its pairing code:

pairing code 1 34970112332
Allocate a video stream on the camera:

cameraavstreammanagement video-stream-allocate 3 0 30 30 '{ "width":640, "height":480}' '{ "width":640, "height":480}' 10000 10000 1 10 1 1 --WatermarkEnabled 1 --OSDEnabled 1

Allocate an audio stream:

cameraavstreammanagement audio-stream-allocate 3 0 2 48000 96000 16 1 1

Then allocate the push transport, handing the camera the server's ingest URL:

pushavstreamtransport allocate-push-transport '{"streamUsage":0, "videoStreamID":1, "audioStreamID":2, "endpointID":1, "url":"https://localhost:1234/streams/1", "triggerOptions":{"triggerType":0, "maxPreRollLen":1, "motionTimeControl":{"initialDuration":20, "augmentationDuration":5,"maxDuration":40, "blindDuration":5}}, "ingestMethod":0, "containerOptions":{"containerType":0, "CMAFContainerOptions": {"chunkDuration": 4, "CMAFInterface": 0, "segmentDuration": 6, "sessionGroup": 1, "trackName": "main"}}}' 1 1
Here, the controller is sending the server's URL to the camera. It's essentially saying, “When I tell you to record a clip, this is the address you need to send it to.”
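The allocate-push-transport payload is dense. As a purely local readability aid, you can pretty-print the same JSON; this snippet only reformats the payload shown above and does not talk to the camera:

# Note how "url" matches the Push AV Server ingest URL, and how videoStreamID
# and audioStreamID refer to the streams allocated in the previous commands.
python3 -m json.tool <<'EOF'
{"streamUsage":0, "videoStreamID":1, "audioStreamID":2, "endpointID":1, "url":"https://localhost:1234/streams/1", "triggerOptions":{"triggerType":0, "maxPreRollLen":1, "motionTimeControl":{"initialDuration":20, "augmentationDuration":5,"maxDuration":40, "blindDuration":5}}, "ingestMethod":0, "containerOptions":{"containerType":0, "CMAFContainerOptions": {"chunkDuration": 4, "CMAFInterface": 0, "segmentDuration": 6, "sessionGroup": 1, "trackName": "main"}}}
EOF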
The chip-camera-app is the device that does the actual work of recording and sending the video. After it receives the configuration from the controller, it waits for a trigger.
This command from the controller is the trigger:
pushavstreamtransport manually-trigger-transport 1 0 1 1
This command means, “Start recording now and upload the clip to the URL I gave you earlier.” The camera then connects to the push_av_server at https://localhost:1234/streams/1 and pushes the video clip.
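To confirm the upload landed, check the working directory you passed when starting the Push AV Server. The exact layout the server uses for stored media is an assumption here, so adjust the path if you chose a different --working-directory:

# Recursively list stored media under the server's working directory
ls -R ~/.pavstest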