This semester, the students again came up with a great selection of final projects, applying their new knowledge of real-time sensor data stream processing with MySQL, Python stream queries, and Cloudera’s Spark, along with some Arduino programming, to a project of their interest.

There are several general thematic groups in the class:

Group 1: Using Arduinos and temperature and humidity data to monitor environmental conditions in real-time

Jon Cole built a Unity-based 3D visualization of live and historical buoy data in the Gulf of Maine.

Check out the interactive demo here. Code here.

Devon Stetson built a sensor data stream analysis system using temperature data collected in his dorm room.

Welles Tisdale extended his undergraduate capstone project, an Arduino-based automated greenhouse, with live sensor data analysis, alerts, and long-term monitoring.
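The projects in this group share a common pattern: read a stream of temperature readings and raise an alert when a rolling average crosses a threshold. A minimal sketch of that pattern in Python (the sensor values, window size, and threshold here are illustrative assumptions, not taken from any student’s code):

```python
from collections import deque

def rolling_alerts(readings, threshold=30.0, window=3):
    """Yield (index, avg) whenever the rolling average of the last
    `window` temperature readings exceeds `threshold`."""
    recent = deque(maxlen=window)
    for i, temp in enumerate(readings):
        recent.append(temp)
        if len(recent) == window:
            avg = sum(recent) / window
            if avg > threshold:
                yield i, avg

# Example: a stream with a warm spell in the middle
stream = [22.0, 24.0, 29.0, 33.0, 35.0, 31.0, 25.0]
for i, avg in rolling_alerts(stream):
    print(f"alert at reading {i}: avg {avg:.1f} °C")
```

Using a rolling average rather than single readings keeps a one-off sensor glitch from triggering a false alert.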

Group 2: Real-time Smartphone Sensor Data Analysis to Detect Parking and Departure Events at UMaine

Avery Dunn and Anthony Stetson presented their current capstone project: analyzing smartphone sensor data in real time to detect parking and departure events at UMaine.
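One simple way to frame the parking/departure problem is a small state machine over a speed estimate derived from the phone’s sensors: a sustained drop from driving speed to near zero suggests a parking event, and the reverse suggests a departure. This is only a sketch of the idea under assumed thresholds; the students’ actual features and detection logic are not described here.

```python
def detect_events(speeds, drive_thresh=5.0, stop_thresh=1.0):
    """Scan per-second speed estimates (m/s) and emit
    ('departed', i) / ('parked', i) state transitions."""
    driving = False
    events = []
    for i, v in enumerate(speeds):
        if not driving and v > drive_thresh:
            driving = True
            events.append(("departed", i))
        elif driving and v < stop_thresh:
            driving = False
            events.append(("parked", i))
    return events

# Example trace: drive, park, then depart again
trace = [0.2, 0.5, 8.0, 12.0, 10.0, 0.4, 0.1, 0.3, 9.0]
print(detect_events(trace))
```

The two separate thresholds act as hysteresis, so speeds hovering around a single cutoff do not produce a flood of spurious events.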

Group 3: Real-time Speeding Alerts using Arduinos and a GPS unit.

Kaitelyn Haase only started learning to program at the beginning of this semester as a graduate student in the MSIS program, which makes it all the more impressive that she managed to solder an Arduino “sandwich” with a shield and a GPS unit, program the Arduino to compute the current speed in real time, and sound an audio alert when the speed exceeded the 70 mph speed limit. She also analyzed and visualized the collected data.
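The core computation here, estimating current speed from successive GPS fixes and comparing it against a limit, can be sketched in Python (her project runs this on the Arduino itself; the coordinates and the haversine-based speed estimate below are illustrative assumptions):

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    r = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def speed_mph(fix_a, fix_b):
    """Speed between two (lat, lon, t_seconds) GPS fixes, in mph."""
    (la1, lo1, t1), (la2, lo2, t2) = fix_a, fix_b
    meters = haversine_m(la1, lo1, la2, lo2)
    return meters / (t2 - t1) * 2.23694  # m/s -> mph

LIMIT_MPH = 70.0
# Two fixes one second apart (hypothetical positions near Orono, ME)
fixes = [(44.9000, -68.6700, 0.0), (44.9003, -68.6700, 1.0)]
v = speed_mph(fixes[0], fixes[1])
if v > LIMIT_MPH:
    print(f"ALERT: {v:.0f} mph, over the {LIMIT_MPH:.0f} mph limit")
```

On the Arduino the same logic would run in the main loop, with the alert sent to a buzzer instead of printed.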

Group 4: Interactive Visualization of Argo Drifter Data.

Brad Sheperd built a Python-based graphical user interface to stream-process netCDF-encoded data from the Argo drifters that cover the world’s oceans. The visualization allows the user to interactively subset the data spatially and select different parameters.
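The interactive spatial subsetting step reduces to a bounding-box filter over drifter positions. A minimal sketch of that filter (the record layout and sample values are hypothetical; in the real pipeline the records would be read from Argo netCDF files, e.g. with the netCDF4 or xarray libraries):

```python
def spatial_subset(profiles, lat_min, lat_max, lon_min, lon_max):
    """Keep only drifter profiles whose position falls inside the
    given lat/lon bounding box."""
    return [p for p in profiles
            if lat_min <= p["lat"] <= lat_max
            and lon_min <= p["lon"] <= lon_max]

# Hypothetical drifter profiles
profiles = [
    {"id": "A", "lat": 43.5, "lon": -68.2, "temp": 8.1},
    {"id": "B", "lat": 10.0, "lon": -30.0, "temp": 26.4},
    {"id": "C", "lat": 41.0, "lon": -66.0, "temp": 9.0},
]
subset = spatial_subset(profiles, 40.0, 45.0, -71.0, -65.0)
print([p["id"] for p in subset])  # only A and C fall inside the box
```

In a GUI, the four bounds would come from the user’s selection rectangle, and the filtered list would feed the plot.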

Group 5: Transforming Sensor Data into Sounds using SuperCollider

Rod O’Connor combined a set of light sensors on an Arduino with Python-based stream processing of the incoming data, sending it on to SuperCollider to ‘animate’ the sensor readings as sound (very cool!).
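At the heart of such a sonification pipeline is a mapping from sensor values to synthesis parameters. The sketch below shows one hypothetical choice, a linear map from a 10-bit ADC light reading to a pitch range; the actual mapping, and the transport to SuperCollider (commonly OSC messages), are not specified in the project description:

```python
def reading_to_freq(reading, lo_hz=220.0, hi_hz=880.0, adc_max=1023):
    """Linearly map a 10-bit ADC light reading to a frequency
    between lo_hz and hi_hz (clamping out-of-range readings)."""
    reading = max(0, min(adc_max, reading))
    return lo_hz + (hi_hz - lo_hz) * reading / adc_max

# Dark, medium, and bright readings span two octaves
for r in (0, 512, 1023):
    print(f"reading {r:4d} -> {reading_to_freq(r):.1f} Hz")
```

The Python side would compute this per reading and send the resulting frequency to a running SuperCollider synth, so brighter light is heard as higher pitch.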