What we built
HackHive 2026 turned out to be one of those weekends where the idea only started working once the interaction became simple enough. Our team built AssistMe around a clear accessibility problem: helping users communicate and make selections without depending on precise mouse movement or complicated setup. The more we worked on it, the clearer it became that the product could not feel like a tech demo. It had to feel calm, direct, and understandable within a few seconds.
My part of that work was helping shape the interaction so the project felt usable instead of
just impressive on paper. That meant working on webcam-based eye tracking, reducing jitter,
thinking through region-based selection, and making sure returning users would not need to
repeat calibration from scratch. We also kept the demo focused on a clean loop: look, select,
generate the next set of options, and speak the result back. That made the system easier to
explain and easier to trust.
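The jitter reduction mentioned above came down to smoothing the raw gaze estimates before using them. As a rough illustration (this is not the actual AssistMe code; the class name, the alpha value, and the assumption of normalized screen coordinates are all mine), an exponential moving average is one common way to do it:

```python
# Minimal sketch of jitter reduction: an exponential moving average
# (EMA) over raw (x, y) gaze estimates. Illustrative only.

class GazeSmoother:
    """Smooths noisy gaze estimates with an EMA.

    alpha near 0 -> heavy smoothing (less jitter, more lag);
    alpha near 1 -> light smoothing (more jitter, less lag).
    """

    def __init__(self, alpha: float = 0.3):
        self.alpha = alpha
        self._x = None
        self._y = None

    def update(self, x: float, y: float) -> tuple[float, float]:
        # First sample initializes the state directly.
        if self._x is None:
            self._x, self._y = x, y
        else:
            self._x = self.alpha * x + (1 - self.alpha) * self._x
            self._y = self.alpha * y + (1 - self.alpha) * self._y
        return self._x, self._y


if __name__ == "__main__":
    smoother = GazeSmoother(alpha=0.3)
    # A noisy stream jittering around (0.5, 0.5).
    for raw in [(0.48, 0.52), (0.55, 0.47), (0.50, 0.53), (0.46, 0.49)]:
        smooth = smoother.update(*raw)
    print(smooth)
```

The single alpha parameter is the whole tuning story here: we effectively traded a little pointer lag for a lot of stability, which matters more when targets are large regions rather than pixels.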
What I took from the weekend
The biggest lesson for me was that accessibility work gets better when it reduces effort
instead of demanding more precision. Large focus regions made more sense than tiny targets.
Guided choices made more sense than forcing full open-ended input. Even the way we talked
about the project mattered, because a good demo is not just about what the code can do. It
is about whether the flow makes sense to the person seeing it for the first time.
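The "large focus regions" lesson can be made concrete with a small sketch: divide the screen into a coarse grid and map the smoothed gaze point to whichever big cell it lands in. Everything below (the function name, the 2x2 grid size, normalized 0..1 coordinates) is an illustrative assumption, not our demo's code:

```python
# Illustrative region-based selection: a coarse grid means the gaze
# only has to land somewhere inside a large cell, not on a tiny target.

def gaze_to_region(x: float, y: float, cols: int = 2, rows: int = 2) -> int:
    """Map a normalized gaze point (0..1, 0..1) to a region index.

    Regions are numbered row-major: with a 2x2 grid,
    0 = top-left, 1 = top-right, 2 = bottom-left, 3 = bottom-right.
    """
    # Clamp so points on the far edge still map to the last cell.
    col = min(int(x * cols), cols - 1)
    row = min(int(y * rows), rows - 1)
    return row * cols + col


if __name__ == "__main__":
    # Gaze near the top-right of the screen selects region 1.
    print(gaze_to_region(0.9, 0.1))  # -> 1
```

With only a handful of regions, even a noisy gaze estimate selects reliably, and each selection can regenerate the options shown in those regions for the next step of the loop.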
I also liked that the weekend reinforced the kind of engineering I want to keep doing.
There was product thinking, real-time behavior, AI integration, persistence, and a strong reason for the system to exist at all. If I keep pushing this project forward, the next steps I would focus on are usability testing, smoother onboarding, and more personalized interaction behavior for different users.