
Pi-Eye

Raspberry Pi camera-based instrumentation


What's going on in the 100 feet around you?

Wouldn't it be nice if you could check on your local environment from your phone, the way you use Google Maps to navigate? We think so.

We are going to use the new camera module on the Raspberry Pi to do things like determine whether a shared room is in use, but that's just a start. Because a camera gives us a much richer data feed than a motion detector, we can do a lot more. We'll do things like detect if someone is leaving dishes in the sink... and then set off the fire alarm.

The system has three layers:

1. A capture process on the Raspberry Pi that pushes changes out to Amazon S3 (a rough sketch follows this list)
2. An image classification process running on Amazon EC2
3. A simple page for viewing the current status (and history)
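
As a concrete illustration of layer 1, here is a minimal sketch of what the capture-and-push loop could look like. This is not the project's actual code: the picamera and boto3 libraries, the bucket name, the key scheme, and the capture interval are all assumptions made for the example.

```python
"""Layer 1 sketch: capture frames on the Pi and push them to S3.

Illustrative only; bucket name, key scheme, and interval are assumptions.
"""
import io
import time

import boto3       # AWS SDK for Python
import picamera    # Raspberry Pi camera module bindings

BUCKET = "pi-eye-frames"   # hypothetical bucket name
INTERVAL_SECONDS = 10      # hypothetical capture interval

s3 = boto3.client("s3")

with picamera.PiCamera() as camera:
    camera.resolution = (640, 480)
    time.sleep(2)  # let the sensor's auto-exposure settle
    while True:
        stream = io.BytesIO()
        camera.capture(stream, format="jpeg")
        key = "frames/%d.jpg" % int(time.time())
        # Push the JPEG bytes to S3; the EC2 classifier reads from the same
        # bucket, so nothing ever needs to connect inbound to the Pi.
        s3.put_object(Bucket=BUCKET, Key=key, Body=stream.getvalue())
        time.sleep(INTERVAL_SECONDS)
```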

Design Considerations

We could have served this application entirely from the Pi, but using the cloud simplifies the design and lets us run workloads that are too compute-intensive for the ARM CPU. And because the Pi only pushes data out to the cloud, we don't have to open inbound ports or worry about firewall issues.

The complication is that we need to minimize bandwidth consumption. Thus, we use a differential image data feed rather than transmitting full images.
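
One way to build that differential feed (a sketch only; the thresholds, grayscale comparison, and helper name are assumptions, not the project's actual approach) is to compare each new frame against the previous one and skip the upload when too few pixels have changed:

```python
"""Sketch of the differential feed: upload only frames that differ
meaningfully from the previous one. Threshold values are assumptions."""
import numpy as np
from PIL import Image

CHANGE_THRESHOLD = 25    # per-pixel intensity delta that counts as "changed"
CHANGED_FRACTION = 0.01  # upload only if more than 1% of pixels changed

def frame_changed(prev_jpeg, curr_jpeg):
    """Return True when the current frame differs enough to be worth uploading."""
    prev = np.asarray(Image.open(prev_jpeg).convert("L"), dtype=np.int16)
    curr = np.asarray(Image.open(curr_jpeg).convert("L"), dtype=np.int16)
    changed = np.abs(curr - prev) > CHANGE_THRESHOLD
    return changed.mean() > CHANGED_FRACTION
```

A further refinement along the same lines would be to send only the changed region rather than the whole frame, but a simple changed/unchanged gate already cuts out most of the traffic for a mostly static scene.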

Once we have some real-world data, we'll sort out how to minimize state transitions and resource consumption.

Social Considerations

Will people let us point cameras at them all the time? Will we creep people out? We don't know.

Part of the decision to provide an open source / transparent implementation is to show that the system isn't really suitable for spying on people.

Posts

Throughout development, we will periodically post about our ideas, goals, problems, and progress.

Watch this section for upcoming blog posts!

Authors and Contributors

This project is put together and sponsored by @joshuaellinger from Exemplar Technologies.

Chris Brown and Connor Smith are doing most/all of the coding.