Hi, I’m CK Zhang

I build gaze and cognition tools that run on webcams and everyday devices.

I’m a student working at the edge of biomedical engineering, cognition, and HCI. Most of what I build starts as a small experiment and sticks around if other people find it useful.

Came from EyeTrax or Reddix? Jump to Tools I use ↓

Work — Gaze & cognition

Gaze & cognition projects.

Webcam-based workflows for screening, experiments, and clinics.

Journey — Gaze pipeline

Gaze pipeline

How the eye-tracking stack evolved.

  1. EyePy — First research sprint on affordable gaze tracking

    Built EyePy during a research program: a webcam-based eye-tracking prototype using dlib and low-cost webcams.

  2. EyeTrax — Open-source webcam gaze toolkit

    Turned the EyePy ideas into EyeTrax, a reusable library with calibration flows, smoothing filters, and an overlay tool built on MediaPipe landmarks so others could use the gaze pipeline without rewriting it.

  3. EyeCI — Webcam cognitive screening research

    Built EyeCI on top of EyeTrax: designed tasks, ran early webcam sessions, cleaned the data, and trained a model that turns gaze traces into screening probabilities with reliability checks.

  4. Project Argus — Clinic collaboration built on EyeCI

    Worked with a healthcare partner to reshape EyeCI into Argus, a five-minute browser-based screening tool for outpatient clinics and community screenings, with bilingual onboarding and clinic-friendly reports, validated on 2,095 webcam sessions.
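The smoothing step mentioned in the EyeTrax stage above can be illustrated with a minimal exponential moving average over raw gaze estimates. This is a sketch only, under the assumption of per-frame (x, y) screen coordinates; the library's actual filters may differ.

```python
# Minimal sketch: exponential moving average over noisy (x, y) gaze points.
# A real gaze pipeline's filters are more involved; this only shows the idea.

class EmaSmoother:
    def __init__(self, alpha=0.3):
        # alpha near 1 -> more responsive; alpha near 0 -> smoother but laggier
        self.alpha = alpha
        self.state = None  # last smoothed (x, y), None until the first sample

    def update(self, point):
        """Blend a new raw gaze estimate into the running smoothed position."""
        if self.state is None:
            self.state = point
        else:
            x, y = point
            sx, sy = self.state
            self.state = (
                self.alpha * x + (1 - self.alpha) * sx,
                self.alpha * y + (1 - self.alpha) * sy,
            )
        return self.state
```

Feeding each raw estimate through `update` damps frame-to-frame jitter at the cost of a little lag, which is the usual trade-off when overlaying a gaze cursor on screen.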

Other things I’ve built.

Side projects and experiments in chronological order.

STEM club afternoons

Stayed late after school for our LEGO engineering club

Wiring up sensors, tweaking small robots, and showing quick demos to classmates. That was the first time I saw how a simple prototype gets other people to try things.

Role: Student
Reddix

Reddix — A lightweight Reddit client that lives in the terminal

Built Reddix so Reddit could live in the same terminal pane as my tools. It picked up users quickly and became my main playground for Rust async pipelines, caching, and release discipline.

Role: Solo builder

Work — Tools I use myself

Tools I use every day.

Small open-source projects that started as workflow hacks.

Reddix · Reddit, tuned for the terminal

Reddix is a keyboard-first Reddit client that lives in the terminal. It supports inline image previews using Kitty graphics, video playback through mpv, multi-account login, smart caching, and Vim-style navigation.

I originally built it so I could keep Reddit threads beside my tools without opening a browser.

Mistake.nvim · Autocorrect for my Neovim notes

Mistake.nvim is a spelling autocorrect plugin for Neovim built from GitHub “fixed typo” commits and common misspelling datasets. It lazy-loads a 20k+ entry correction dictionary in chunks, lets me add my own corrections, and keeps everything fast enough to use while journaling.
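The chunked lazy-loading described above can be sketched like this (Python for illustration; the plugin itself is Lua, and the names here are hypothetical, not the plugin's API):

```python
# Sketch: merge a large typo -> correction dictionary into a live table in
# fixed-size chunks, so no single load step blocks the editor for long.

def iter_chunks(pairs, chunk_size=500):
    """Yield successive slices of (typo, correction) pairs."""
    for i in range(0, len(pairs), chunk_size):
        yield pairs[i:i + chunk_size]

class LazyCorrections:
    def __init__(self, pairs, chunk_size=500):
        self._pending = iter_chunks(pairs, chunk_size)
        self.table = {}  # corrections loaded so far

    def load_next_chunk(self):
        """Merge one chunk into the table; return False when exhausted."""
        chunk = next(self._pending, None)
        if chunk is None:
            return False
        self.table.update(chunk)
        return True
```

In an editor, `load_next_chunk` would be scheduled from idle callbacks, so a 20k-entry table fills in over a few ticks instead of one long pause at startup.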

Other small tools

  • EyePy — first webcam gaze prototype from my Pioneer Academics project; leaned on dlib and cheap webcams, and raised the questions that led to EyeTrax and EyeCI. Pioneer paper
  • Gnav — GNOME workspace navigator with a fuzzy-search launcher using Wofi. Go Packages
  • thumbgrid — Go CLI for generating thumbnail grids from videos and images for quick visual scans. GitHub
  • Obfuscate.nvim — dims and obfuscates code when I’m coding in public, so passers-by see less of it at a glance. GitHub

About

About.

I’m CK (Kevin) Zhang, a student who likes building around gaze, cognition, and interfaces. Most of my work sits where biomedical engineering, HCI, and everyday devices overlap.

I care about tools that can survive crowded clinics and shared classrooms without asking people to change how they behave. That’s why I’ve stayed with the same thread from EyePy to EyeTrax, EyeCI, and now Argus: keep the hardware cheap, use the signals well, and make the interface feel as close to “just look” as possible.

Long-term, I want to keep working on gaze and cognition tools within biomedical engineering and HCI.

Contact.

If you’re working on gaze, cognition tools, terminal UIs, or related projects, I’m happy to chat or share what I’ve tried so far.

Email: ck.zhang26@gmail.com

GitHub: github.com/ck-zhang