NightCue

November 2025


Tools used: Arduino Uno, photoresistor (light sensor)

Context

Night driving depends heavily on high-beam headlights. They extend visibility, but when used carelessly or too late, they blind oncoming drivers and create brief moments of danger.

Modern vehicles often solve this through full automation. Cameras detect light sources and dip beams automatically. While effective in many cases, these systems misfire in fog, reflections, uneven terrain, or unusual lighting conditions.

When automation fails, trust drops. Manual control sits on the other extreme. It relies entirely on driver attention, which is often delayed under fatigue or stress.

I wanted to explore a middle ground. What if a system didn’t take control away from the driver, but quietly surfaced risky conditions at the right moment?

NightCue is an assistive lighting prototype that detects glare-like light spikes and prompts the driver to respond, keeping the human in the loop.

Problem

Headlight glare is a technical issue, but it is often a behavioral one too. Drivers keep high beams on longer than they should, either because they don’t notice oncoming cars immediately or because switching feels momentarily inconvenient.

Automation tries to remove the decision entirely. Manual systems rely on perfect attention. Both approaches struggle with edge cases.

I was interested in a smaller intervention: not replacing judgment, but supporting it. Instead of asking, “How can the car decide?” I asked, “How can the system help the driver notice sooner?”

Decision

I designed NightCue as a cue-based assist system rather than an autonomous one. The core logic is simple:

• establish a baseline of normal nighttime light

• detect sudden increases consistent with glare

• signal the driver clearly

• allow the driver to confirm dimming

• return to full brightness once conditions normalize

The system observes continuously but never acts without human confirmation. This keeps responsibility visible and avoids the risk of a false positive taking control.

The prototype uses a photoresistor to sense incoming light and a button to represent the driver’s existing headlight control. Two LEDs simulate the system state: one for normal high-beam output and one for glare alert. The button isn’t a new interface. It resembles the real-world beam control already present in vehicles.

Trade-offs

Light intensity alone cannot distinguish between all sources of brightness. Streetlights, reflections, and oncoming vehicles can appear similar to a simple sensor. Rather than solving this with complex hardware, I framed the system as a low-cost proof of concept for assistive logic.

The goal was not perfect detection. It was to explore how well-timed cues can improve both the speed and the quality of a driver’s decisions.

Keeping the driver in control also means glare isn’t eliminated instantly. It depends on response time.

This was intentional. Immediate automation reduces risk in some cases but also removes awareness. NightCue prioritizes attention and responsibility over convenience.

Execution

When the system starts, it spends a short window reading ambient light levels to establish a baseline. This allows it to adapt to different environments, from dark rural roads to brighter urban streets.

Incoming light is continuously filtered using an exponential moving average to prevent jitter and false triggers. Instead of reacting to single spikes, the system responds to sustained changes. When light rises significantly above baseline, the alert LED begins blinking, signaling potential glare conditions.

When the driver presses the button, the “headlight” LED dims to a predefined low-beam level. Releasing the button restores full brightness once the environment returns to normal.

The entire loop is simple, fast, and transparent. Small technical choices like smoothing and baseline calibration made the system feel stable rather than reactive, mirroring how real safety systems avoid panic behavior.

Findings

Even in prototype form, the cue-based approach changed behavior.

In simulation, users reacted faster to glare conditions when prompted than when relying on their own perception alone. The blinking alert created a moment of awareness without feeling intrusive. Instead of guessing whether light ahead was a problem, drivers were nudged to act.

Some users initially expected the system to dim automatically. After using it, several preferred having confirmation before the change happened, especially when light conditions were ambiguous.

As intended, the system didn’t remove responsibility from the driver. The hybrid of automated sensing and manual control felt right in this case.

Why It Matters

NightCue explores how small interventions can improve safety without full automation. Rather than optimizing everything away, it uses sensing to support human judgment. Key micro-decisions shaped the experience:

• reacting to relative change instead of raw brightness

• smoothing input to prevent false urgency

• prompting rather than acting

• calibrating to each environment


Together, these turned a simple sensor into an assistive system instead of a noisy trigger. NightCue suggests that in high-stakes contexts, clarity and shared control can be more effective than autonomy.

Sometimes the best systems don’t decide for people; they help people notice.