Overview
Every household has a snack cupboard: that press full of crisps, biscuits, and other goodies that family members raid when they think nobody is watching. Cupboard Culprit is an IoT system that detects and photographs these raiders in the act, uploading the evidence to the cloud for later review via a mobile app.
The project demonstrates full-stack IoT thinking: from physical sensors and camera hardware through to cloud storage and a mobile app for viewing captured images. It was built for an IoT module during the NCI Higher Diploma programme.
System Architecture
The system operates across three tiers: hardware (Raspberry Pi with sensors), cloud backend (Firebase and Google Cloud Storage), and mobile frontend (iOS app). Each tier handles specific responsibilities whilst communicating through well-defined interfaces.
Hardware Components
The hardware platform centred on a Raspberry Pi 3 Model B with a GrovePi shield that simplified connecting multiple sensors. Eight distinct components were integrated (see the sketch after this list):
- Raspberry Pi Camera Module V2: Captures images when a door opening is detected, with adjustable mounting for optimal positioning
- Grove Ultrasonic Ranger: Detects door state by measuring distance changes when the door opens
- Grove RGB Backlit LCD: Displays status messages and provides visual feedback with colour changes
- Grove Temperature & Humidity Sensor: Monitors environmental conditions inside the cupboard
- Grove Red/Green LEDs: Visual diagnostics for system status and error indication
- Grove Buzzer: Audio deterrent that sounds when raid count exceeds threshold
- Grove Button: Safe shutdown control for proper system termination
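Each of these modules is driven through the GrovePi Python library. The sketch below shows roughly what that looks like; the port and pin assignments, and the specific helper functions, are illustrative assumptions rather than the project's actual wiring or code.

```python
import time

import grovepi
from grove_rgb_lcd import setRGB, setText  # RGB LCD helpers shipped with GrovePi

# Illustrative port assignments -- the real wiring may differ.
ULTRASONIC_PORT = 4   # digital port for the ultrasonic ranger
DHT_PORT = 7          # digital port for the temperature & humidity sensor
RED_LED, GREEN_LED = 2, 3
BUZZER_PIN = 8
BUTTON_PIN = 5        # safe-shutdown button

for pin in (RED_LED, GREEN_LED, BUZZER_PIN):
    grovepi.pinMode(pin, "OUTPUT")
grovepi.pinMode(BUTTON_PIN, "INPUT")


def read_door_distance_cm():
    """Distance from the sensor to the door, in centimetres."""
    return grovepi.ultrasonicRead(ULTRASONIC_PORT)


def read_conditions():
    """Temperature (°C) and relative humidity (%) from the DHT sensor."""
    temperature, humidity = grovepi.dht(DHT_PORT, 0)  # 0 = blue (non-pro) module
    return temperature, humidity


def show_status(message, rgb=(0, 128, 64)):
    """Write a status message to the RGB LCD with a backlight colour."""
    setRGB(*rgb)
    setText(message)


def sound_alert(duration_s=0.5):
    """Pulse the buzzer and red LED when the raid count exceeds its threshold."""
    grovepi.digitalWrite(BUZZER_PIN, 1)
    grovepi.digitalWrite(RED_LED, 1)
    time.sleep(duration_s)
    grovepi.digitalWrite(BUZZER_PIN, 0)
    grovepi.digitalWrite(RED_LED, 0)
```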
The hardware was mounted on a custom MDF board screwed to the cupboard rear, with thumbtack attachments allowing field adjustment of the camera angle. Practical engineering for a practical problem.
The annotated assembly below shows how each component was positioned on the mounting board, with the camera at the top for optimal viewing angle and the ultrasonic sensor positioned to detect door movement.
Software Architecture
The main Python application on the Raspberry Pi operated in three phases: initialisation (setting up Firebase authentication, creating log files, starting the image processor), main loop (periodic sensor readings, configuration sync, door state detection), and background image processing (asynchronous resizing, uploading, and archiving).
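A stripped-down skeleton of that structure is shown below. It is a sketch rather than the actual implementation: the helper functions are stand-in placeholders, and the intervals and threshold are invented for illustration.

```python
import queue
import random
import threading
import time

# Placeholder stand-ins for the real sensor, camera, and Firebase helpers,
# so the skeleton runs on its own. The names and values are illustrative.
def sync_config():
    return {"loop_interval_s": 1.0, "open_threshold_cm": 25}

def record_conditions():
    print("logging temperature and humidity")

def read_door_distance_cm():
    return random.choice([12, 12, 12, 45])   # occasional "open" reading

def capture_image():
    return f"/tmp/culprit_{int(time.time())}.jpg"

def process_image(path):
    print(f"resizing, uploading, and archiving {path}")

image_queue = queue.Queue()

def image_worker():
    # Phase 3: background image processing -- runs on its own thread so a
    # slow upload never blocks the sensor loop.
    while True:
        process_image(image_queue.get())
        image_queue.task_done()

def main():
    # Phase 1: initialisation -- in the real system this authenticates with
    # Firebase, creates the log files, and starts the image processor.
    threading.Thread(target=image_worker, daemon=True).start()

    # Phase 2: main loop -- periodic sensor readings, configuration sync,
    # and door-state detection (debounced in practice, as described below).
    while True:
        config = sync_config()
        record_conditions()
        if read_door_distance_cm() > config["open_threshold_cm"]:
            image_queue.put(capture_image())
        time.sleep(config["loop_interval_s"])

if __name__ == "__main__":
    main()
```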
A key challenge was reliable door detection. Ultrasonic sensors can produce false readings, so I implemented debouncing logic that required multiple consecutive "open" detections before registering a genuine door opening event. This eliminated false positives whilst maintaining responsiveness.
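A minimal version of that debouncing is sketched below. The threshold, sample count, and the assumption that an open door reads as a larger distance are illustrative, not the project's actual values.

```python
import time

import grovepi

ULTRASONIC_PORT = 4        # assumed digital port for the ultrasonic ranger
OPEN_THRESHOLD_CM = 25     # readings beyond this are treated as "door open"
CONSECUTIVE_REQUIRED = 3   # open readings needed to register a genuine event
SAMPLE_DELAY_S = 0.1


def door_opened():
    """Return True only after several consecutive 'open' readings.

    A single stray reading from the ultrasonic ranger is ignored, which
    filters out the false positives the raw sensor produces.
    """
    consecutive_open = 0
    while consecutive_open < CONSECUTIVE_REQUIRED:
        distance_cm = grovepi.ultrasonicRead(ULTRASONIC_PORT)
        if distance_cm <= OPEN_THRESHOLD_CM:
            return False                  # any closed reading resets the check
        consecutive_open += 1
        time.sleep(SAMPLE_DELAY_S)
    return True
```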
Cloud Infrastructure
The cloud backend used Firebase Realtime Database for configuration and metadata, with Google Cloud Storage for image files. The database had three root nodes, illustrated in the sketch after the list:
- conditions: Temperature and humidity readings keyed by epoch timestamp
- config: Adjustable settings synced bidirectionally between app and device
- culprits: Image metadata including timestamps and storage references
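To make the layout concrete, the sketch below mirrors those three root nodes as a Python dictionary. The field names and example values are assumptions inferred from the descriptions above, not the exact schema.

```python
# Illustrative shape of the Realtime Database -- keys and fields are assumed.
database_layout = {
    "conditions": {                       # keyed by epoch timestamp
        "1554123600": {"temperature": 21.4, "humidity": 48.0},
    },
    "config": {                           # synced bidirectionally with the app
        "sensor_interval_s": 60,
        "capture_interval_s": 5,
        "raid_alert_threshold": 3,
    },
    "culprits": {                         # metadata for each captured image
        "1554123960": {
            "timestamp": 1554123960,
            "storage_path": "culprits/1554123960.jpg",
        },
    },
}
```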
Images were resized before upload, achieving 98% file size reduction whilst maintaining adequate quality for identification purposes. This dramatically reduced storage costs and improved app loading times. A 7-day rolling archive prevented unbounded storage growth.
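A hedged sketch of that pipeline is below, using Pillow for the resize and the google-cloud-storage client for the upload. The target size, JPEG quality, bucket name, and archive directory are assumptions, and the 7-day pruning shown here works on a local folder, though the project may have pruned cloud storage instead.

```python
import os
import time

from PIL import Image
from google.cloud import storage

BUCKET_NAME = "cupboard-culprit.appspot.com"   # assumed bucket name
ARCHIVE_DAYS = 7


def resize_image(source_path, max_edge=640, quality=70):
    """Shrink a full-resolution capture before upload.

    Downscaling plus JPEG re-encoding is what produces the large reduction
    in file size while keeping faces recognisable.
    """
    resized_path = source_path.replace(".jpg", "_small.jpg")
    with Image.open(source_path) as img:
        img.thumbnail((max_edge, max_edge))    # preserves aspect ratio
        img.save(resized_path, "JPEG", quality=quality, optimize=True)
    return resized_path


def upload_image(local_path):
    """Upload the resized image to Cloud Storage and return its blob path."""
    client = storage.Client()
    blob_path = f"culprits/{os.path.basename(local_path)}"
    client.bucket(BUCKET_NAME).blob(blob_path).upload_from_filename(local_path)
    return blob_path


def prune_archive(archive_dir="/home/pi/culprit_archive"):
    """Delete local copies older than the rolling archive window."""
    cutoff = time.time() - ARCHIVE_DAYS * 24 * 3600
    for name in os.listdir(archive_dir):
        path = os.path.join(archive_dir, name)
        if os.path.getmtime(path) < cutoff:
            os.remove(path)
```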
iOS Application
The iOS app provided configuration controls and a gallery view of captured images. Users could adjust sensor intervals, image capture frequency, and alert thresholds. The culprit gallery displayed captured images with timestamps, making it easy to identify who raided the cupboard and when.
Real-time database synchronisation meant configuration changes appeared on the device almost immediately, and new culprit captures appeared in the app within seconds.
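The project's main loop re-synced configuration periodically; an alternative way to get the near-instant propagation described here is a Realtime Database listener on the Pi. The sketch below uses the firebase_admin SDK purely for illustration, with a placeholder credential file and database URL, and may differ from the client library the project actually used.

```python
import firebase_admin
from firebase_admin import credentials, db

# Assumed credential file and database URL -- placeholders, not the project's.
cred = credentials.Certificate("service-account.json")
firebase_admin.initialize_app(
    cred, {"databaseURL": "https://cupboard-culprit.firebaseio.com/"}
)

config = {}

def on_config_change(event):
    """Apply configuration edits from the app as soon as Firebase pushes them."""
    if event.path == "/":
        config.update(event.data or {})        # initial snapshot of the whole node
    else:
        config[event.path.lstrip("/")] = event.data
    print(f"config updated: {config}")

# Streams changes over a long-lived connection; the returned registration can
# be close()d later as part of the safe-shutdown sequence.
registration = db.reference("config").listen(on_config_change)
```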
Reflections
The playful framing made this project fun to work on, and it was a good excuse to go well beyond the assignment brief. Integrating hardware, cloud services, and a mobile app into one system meant touching a lot of different technologies in a short time.
The sensor work was more fiddly than expected. Ultrasonic sensors don't behave as cleanly as the documentation suggests, so I ended up writing debouncing logic to filter out false readings. The image compression was worth the effort too; resizing before upload cut file sizes by 98%, which made a real difference to storage costs and app loading times.
Running image processing in the background while the main loop handled sensor readings was a useful pattern to learn. It kept the system responsive even when uploads were slow.