Aura
December 2025
Tools used: Arduino, Sensors (Photoresistor, Ultrasonic, Piezo), p5.js

Context
In today's hustle culture, most parts of people's days are quantified. Consumer products reflect this, focusing so heavily on productivity that the humans using them are pushed to the sidelines.
I think a lot about how, on the modern internet, 'attention' is such a significant metric.
Experiences that begin as observation quickly turn into comparison and optimization. For many users, this creates fatigue.
This realization made me want to explore an alternative approach.
What if a system sensed the environment but refused to interpret it too early? What if data became atmosphere instead of instruction?
Aura is a small ambient object that listens to a room and turns what it senses into a living visual field. No measurements involved, only reflection.
Problem
Environmental data is usually framed as something to act on. Brightness, noise, and presence become triggers for automation. I find myself dimming the lights when I want to relax, or turning on white noise before I get to work, mostly instinctively now. In that model, the system decides what matters, and in doing so it removes space for human interpretation.
I noticed that most interfaces rush to conclusions:
dashboards summarizing activity
alerts signaling thresholds
scores ranking performance
The issue wasn't a lack of information; if anything, adults already consume a massive overflow of it daily. I asked myself: how would a system that defies this norm perform? How would people perceive an object whose function is inherently implicit?
Decision
I chose intentional ambiguity over interpretive authority. Instead of telling users what the room "means," Aura quietly translates environmental signals into evolving visuals. The system observes. It does not judge or push you to get more work done.
Three inputs drive the entire experience:
light in the room
proximity of a person
ambient sound
Rather than mapping these to states or alerts, each value influences a different layer of a generative visual field. No numbers appear, and no instructions are given. The object remains legible through feeling rather than explanation.

Trade-offs
Ambiguity sacrifices immediate usefulness. Some users wanted clearer feedback. Others asked what the visuals were “supposed” to represent.
I resisted adding legends, labels, or modes. Clarity would have turned the system into a tool, while my goal was for Aura to be a companion.
It also deliberately avoids productivity contexts. It doesn’t scale for optimization-driven users. It doesn’t compete with smart home dashboards. Those exclusions were necessary for the concept to remain intact.
Execution
Aura exists as both a physical object and a browser-based visual system.
An Arduino reads environmental signals:
a photoresistor measures room brightness
an ultrasonic sensor detects proximity
the laptop microphone captures sound
These values stream into a p5.js sketch in real time.
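The stream from the Arduino can be sketched on the receiving side. The parser below is a minimal illustration, assuming a hypothetical wire format where the Arduino prints one comma-separated "light,distance" pair per line (sound comes from the browser microphone, so it never crosses the serial link); Aura's actual protocol may differ.

```javascript
// Hypothetical parser for the serial stream. Assumes each line looks
// like "512,34" (raw photoresistor reading, ultrasonic distance in cm).
// Both the format and the field names are illustrative assumptions.
function parseSensorLine(line) {
  const parts = line.trim().split(",").map(Number);
  // Serial lines can arrive truncated or garbled; skip anything malformed.
  if (parts.length !== 2 || parts.some(Number.isNaN)) return null;
  const [light, distance] = parts;
  return { light, distance };
}
```

In a p5.js sketch this would run inside the serial library's line-received callback, feeding the latest values into the visual layers each frame.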
To avoid jitter and flicker, all inputs are smoothed gradually. Instead of reacting to spikes, the system eases toward change, giving the visuals a slow, breathing quality.
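The easing described above is the standard p5.js idiom of lerping toward a target each frame rather than jumping to it. A minimal sketch, with the easing factor chosen for illustration (Aura's actual constant isn't stated):

```javascript
// Linear interpolation: move fraction t of the way from a to b.
function lerp(a, b, t) {
  return a + (b - a) * t;
}

// One smoothed input channel. Each frame, the stored value eases a
// small fraction toward the raw reading, so spikes become slow drifts.
class SmoothedSignal {
  constructor(easing = 0.05) { // 0.05 is an assumed easing factor
    this.value = 0;
    this.easing = easing;
  }
  update(raw) {
    this.value = lerp(this.value, raw, this.easing);
    return this.value;
  }
}
```

Lowering the easing factor makes the visuals breathe more slowly; raising it makes them twitchier, which is exactly the trade Aura makes against responsiveness.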
Each signal controls a different behavioral layer:
Light tints the background field, shifting from cool blues in dim rooms to warmer hues in bright spaces.
Proximity energizes the system. As someone approaches, particles move faster, rings thicken, and motion becomes more pronounced.
Sound drives a pulsing halo that behaves like a heartbeat. Quiet spaces breathe slowly. Louder moments quicken and distort the form.
Small spark-like particles appear periodically, drifting and fading over time. These act as subtle memory traces rather than instant feedback.
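The three mappings above can be sketched as simple rescalings. The input ranges, output units, and function names below are assumptions for illustration, not Aura's exact parameters:

```javascript
// p5-style map(): linearly rescale v from [inMin, inMax] to [outMin, outMax].
function map(v, inMin, inMax, outMin, outMax) {
  return outMin + (outMax - outMin) * ((v - inMin) / (inMax - inMin));
}

// Light tints the background: dim rooms sit near cool blue (hue ~220),
// bright rooms drift toward warm amber (hue ~40). Range 0-1023 assumes
// a raw Arduino analog reading.
function backgroundHue(light) {
  return map(light, 0, 1023, 220, 40);
}

// Proximity energizes motion: far (200 cm) is slow, close (10 cm) is fast.
function particleSpeed(distanceCm) {
  return map(distanceCm, 200, 10, 0.5, 3.0);
}

// Sound pulses the halo outward from a base radius; level is 0-1.
function haloRadius(soundLevel, base = 100) {
  return base + map(soundLevel, 0, 1, 0, 60);
}
```

Each function feeds a separate drawing layer, so no single sensor can take over the whole image.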
Alongside the real-time animation, a second system accumulates environmental behavior over longer periods.
Instead of responding to single moments, the program averages light, presence, and sound over time. After each interval, it categorizes the overall ambience and displays a short poem that reflects the emotional “weather” of the room. And thus, instant reaction is paired with slow reflection.
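The slow-reflection layer can be sketched as an accumulator that averages readings over an interval and then buckets the ambience into a category used to pick a poem. The thresholds and category names below are invented for illustration, and the poems themselves are omitted:

```javascript
// Accumulates normalized (0-1) readings over one interval, then
// summarizes them into a coarse ambience category. Thresholds and
// labels ("still", "restless", "gentle") are illustrative assumptions.
class AmbienceLog {
  constructor() {
    this.samples = [];
  }
  record(light, presence, sound) {
    this.samples.push({ light, presence, sound });
  }
  summarize() {
    const n = this.samples.length;
    if (n === 0) return null;
    const avg = key => this.samples.reduce((s, x) => s + x[key], 0) / n;
    const sound = avg("sound");
    const presence = avg("presence");
    this.samples = []; // start the next interval fresh
    if (sound < 0.2 && presence < 0.3) return "still";
    if (sound > 0.6) return "restless";
    return "gentle";
  }
}
```

Because the category reflects an average rather than a moment, a single loud noise cannot flip the poem; only a sustained shift in the room's mood can.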

Findings
User reactions split into two clear patterns.
Some found the experience calming. They described Aura as something that made the room feel alive without demanding attention.
Others felt unsettled by the lack of explicit meaning. They wanted to know what the visuals were measuring and whether they were “good” or “bad.”
Both responses, in my opinion, supported the core idea.
By removing interpretation, I saw people either leaning into reflection or searching for control.
The poem system, in particular, changed how users perceived time. Because it summarized longer stretches of ambience rather than moments, people began noticing broader shifts in mood rather than isolated events.
The space felt remembered.
Why It Matters
Aura explores restraint as a design strategy. By refusing metrics, alerts, and optimization, the system creates room for interpretation.
Small technical choices shaped the experience:
smoothing data instead of reacting instantly
mapping values to atmosphere rather than states
accumulating behavior over time instead of rewarding spikes
These micro-decisions turned sensors into something closer to presence.
Aura suggests that not all data needs to instruct. Sometimes reflecting quietly can deepen awareness more than telling users what to do.