Evan Kuvshinoff

Career Objective

To engineer intelligent embedded systems that enhance user experience, performance, and accessibility, drawing on a strong interest in low-level programming, edge computing, and responsive UX to solve real-world challenges.

University of Maryland, College Park

B.S. in Computer Engineering with Cybersecurity Specialization

May 2025

Google Career Certificate

Cybersecurity Professional

2024

Technical Skills

Programming Languages

  • C/C++/C#
  • Python
  • Go
  • Java
  • SQL
  • HTML/CSS
  • Verilog
  • x86 Assembly

Software & Tools

  • Linux/Unix Systems
  • Git/SVN
  • Azure
  • Agile/Scrum/Waterfall
  • Jira
  • STM32CubeIDE
  • Unity Game Engine
  • MATLAB/Simulink

Hardware & Embedded

  • STM32 Microcontrollers
  • ESP32
  • Arduino UNO
  • Raspberry Pi (5/Zero/Pico)
  • FPGA Development
  • Circuit Design & Analysis
  • PCB Design
  • Digital Signal Processing

Specialized Areas

  • AI/ML Integration
  • Cloud Computing
  • IoT System Design
  • Network Programming
  • Edge Computing
  • Computer Vision
  • Real-time Systems
  • Cybersecurity

Get In Touch

Interested in learning more about my work? I'd love to hear from you.

Clarity v2: Next Generation AI Assistant Platform

  • Fully functional robot assistant with real-time speech interaction
  • Powerful onboard AI processing
  • Custom designed chassis

Clarity is a desktop AI companion robot that combines artificial intelligence, computer vision, and precise servo movement to create natural human-robot interactions. The system features real-time face detection and tracking, voice command processing, and non-verbal communication elements running on Raspberry Pi hardware. Clarity integrates advanced AI technologies including GPT-5 for conversation, Vosk for speech recognition, and hardware-accelerated computer vision to deliver a seamless interactive experience.
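The face-tracking loop described above can be sketched as a simple proportional controller that nudges a pan servo toward the center of the detected face. The frame width, gain, and servo limits below are illustrative assumptions, not Clarity's actual values.

```python
# Sketch of face tracking via proportional control. All constants are
# placeholder assumptions; the sign of the correction also depends on how
# the servo and camera are mounted.

FRAME_WIDTH = 640               # assumed camera frame width in pixels
PAN_MIN, PAN_MAX = 0.0, 180.0   # assumed servo travel in degrees
GAIN = 0.05                     # degrees of correction per pixel of error


def track_face(current_pan: float, face_box: tuple[int, int, int, int]) -> float:
    """Return an updated pan angle that moves the face toward frame center.

    face_box is (x, y, w, h) as returned by a typical detector.
    """
    x, _, w, _ = face_box
    face_center = x + w / 2
    error = face_center - FRAME_WIDTH / 2       # pixels off-center
    new_pan = current_pan + GAIN * error        # proportional step
    return max(PAN_MIN, min(PAN_MAX, new_pan))  # clamp to servo limits
```

Each camera frame, the detector's best face box is fed to `track_face` and the result is written to the servo; a face left of center produces a negative error and the pan angle decreases accordingly.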

The v2 model provides a dramatic upgrade in computer vision capabilities. Clarity v2 is equipped with a Hailo-8 AI accelerator and a custom-built pipeline, increasing its pixel-processing throughput roughly 400x compared to v1. The extra processing headroom widens the robot's field of view by roughly 3x and allows it to detect both humans and pets!

Clarity v2 also adds a Seeed Studio ReSpeaker Lite with an ESP32-S3, a 2-microphone array with integrated post-processing hardware that improves speech recognition. In addition to controlling the RGB LED on the ReSpeaker board, the ESP32 is connected to the Raspberry Pi 5 via I2C and will be used to communicate with IoT devices, offloading blocking network operations so they don't consume resources needed by AI tasks.
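A Pi-to-ESP32 link like this needs some framing over I2C. The opcode values and payload layout below are hypothetical, not Clarity's real protocol; on the Pi 5 the packed bytes would typically be sent with a library such as smbus2, while the checksum logic itself is pure and hardware-independent.

```python
import struct

# Hypothetical command framing for Pi 5 -> ESP32-S3 messages over I2C.
# Opcodes and layout are illustrative assumptions.

OP_SET_LED = 0x01   # set the ReSpeaker RGB LED
OP_IOT_SEND = 0x02  # forward a message to an IoT device


def pack_led_command(r: int, g: int, b: int) -> bytes:
    """Opcode + RGB bytes, terminated by a simple XOR checksum."""
    body = struct.pack("BBBB", OP_SET_LED, r, g, b)
    checksum = 0
    for byte in body:
        checksum ^= byte
    return body + bytes([checksum])


def unpack_led_command(frame: bytes) -> tuple[int, int, int]:
    """Validate the checksum and return (r, g, b); raises on corruption."""
    body, checksum = frame[:-1], frame[-1]
    calc = 0
    for byte in body:
        calc ^= byte
    if calc != checksum or body[0] != OP_SET_LED:
        raise ValueError("corrupt or unexpected frame")
    return tuple(body[1:4])
```

The ESP32 side would run the mirror-image of `unpack_led_command` in its I2C receive handler, rejecting any frame whose checksum fails rather than acting on corrupted data.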

Clarity AI Assistant

Clarity: AI Assistant Platform with Voice and Vision

  • Fully functional robot assistant with real-time speech interaction
  • Edge-based facial recognition with dynamic servo control
  • Natural language understanding via OpenAI API

Clarity Robot - GitHub Repo

Clarity is a sophisticated robot that combines artificial intelligence, computer vision, and servo-controlled movement to create an interactive companion with personality. Built on a distributed architecture using Raspberry Pi and custom hardware, Clarity can engage in natural conversations through voice commands while displaying emotions on an integrated OLED screen. The robot features real-time face detection and tracking using both cloud and edge computing, allowing it to follow and interact with users naturally.

The system demonstrates advanced engineering through its multi-process architecture, where separate components handle voice processing, visual recognition, and motor control, all communicating through custom socket protocols. Clarity integrates OpenAI's GPT-4 for conversational AI, Vosk for speech recognition, and OpenCV for computer vision, creating a seamless human-robot interaction experience. The project showcases expertise in distributed systems, real-time audio and video processing, hardware integration, and AI implementation, representing a complete robotics solution from low-level servo control to high-level artificial intelligence.
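One common shape for the inter-process socket protocol described above is length-prefixed JSON: each message is a 4-byte big-endian length followed by a UTF-8 JSON payload. Clarity's real framing may differ; this sketch just illustrates the technique, including reassembly of messages that arrive split across reads.

```python
import json
import struct

# Length-prefixed JSON framing: 4-byte big-endian length, then UTF-8 JSON.
# A receiving process appends each socket read to a buffer and calls
# decode_messages to pull out every complete message.


def encode_message(msg: dict) -> bytes:
    payload = json.dumps(msg).encode("utf-8")
    return struct.pack(">I", len(payload)) + payload


def decode_messages(buffer: bytes) -> tuple[list[dict], bytes]:
    """Extract every complete message from buffer; return (messages, leftover)."""
    messages = []
    while len(buffer) >= 4:
        (length,) = struct.unpack(">I", buffer[:4])
        if len(buffer) < 4 + length:
            break  # partial message: wait for more bytes
        payload = buffer[4:4 + length]
        messages.append(json.loads(payload.decode("utf-8")))
        buffer = buffer[4 + length:]
    return messages, buffer
```

Because the leftover bytes are returned rather than discarded, the vision process can stream servo targets to the motor process over TCP without losing messages that straddle two `recv` calls.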


Friend Signals IoT Communication System

  • Custom IoT devices using Raspberry Pi Pico W
  • Cloud-hosted Azure server communication
  • Discord bot integration for remote control

IoT Communication Network - GitHub Repo

Friend Signals is an innovative IoT system that allows users to remotely trigger alarm devices owned by friends through Discord commands. The system consists of MicroPython-based IoT devices featuring synchronized LED patterns and buzzer alarms, managed by a centralized Python server that handles device registration, connection health monitoring, and command routing. Each device connects to WiFi and maintains persistent TCP connections with the server, implementing custom handshake protocols and heartbeat mechanisms for robust network communication.

The server utilizes asynchronous programming with asyncio to handle multiple concurrent clients while providing seamless integration with Discord's API for user commands. The embedded devices demonstrate sophisticated concurrent programming on resource-constrained microcontrollers, managing real-time hardware control, network communication, and system reliability through watchdog timers. This project highlights expertise in distributed systems architecture, embedded programming, network protocols, and creative integration between modern social platforms and physical hardware, creating an entirely new category of social IoT interaction.


Smart Plant Monitor v1.1

  • Smart plant pot with soil moisture monitoring
  • Real-time readings on an OLED display or wirelessly over Bluetooth
  • Integrated ambient temperature/humidity sensing

Smart Plant Monitoring System

The Smart Plant Monitor represents a complete IoT product development cycle, from custom PCB design to production-ready firmware implementation. Built around an Arduino Nano ESP32, the device provides real-time soil moisture monitoring through calibrated analog sensors, displaying rich visual feedback on an integrated OLED screen with custom graphics including progress bars and animated water drop indicators. The system implements Bluetooth connectivity for wireless data transmission, enabling remote monitoring capabilities.

The project showcases professional hardware development practices through custom PCB design using KiCad, proper component selection, and signal conditioning for accurate sensor readings. The firmware demonstrates embedded systems expertise with real-time sensor processing, mathematical calibration algorithms, and efficient graphics rendering using the U8g2 library. This project exemplifies the complete IoT development process, combining electrical engineering, embedded programming, wireless protocols, and user experience design to create a market-ready smart home device that bridges the gap between hobbyist projects and commercial IoT products.
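Moisture calibration of this kind is usually a two-point linear map: record the ADC reading with the probe in dry air and in water, then interpolate between them. The constants below are placeholders (the firmware is C++ on the ESP32; Python is used here just to show the math), and real values come from calibrating the actual probe.

```python
# Two-point soil-moisture calibration. RAW_DRY/RAW_WET are placeholder
# assumptions; a real probe is calibrated in air and in water.

RAW_DRY = 3200  # assumed ADC reading with the probe in dry air
RAW_WET = 1400  # assumed ADC reading with the probe in water


def moisture_percent(raw: int) -> float:
    """Linear two-point calibration, clamped to [0, 100]."""
    span = RAW_DRY - RAW_WET
    pct = (RAW_DRY - raw) / span * 100.0
    return max(0.0, min(100.0, pct))


def progress_bar(pct: float, width: int = 10) -> str:
    """Text stand-in for the OLED progress bar rendered with U8g2."""
    filled = round(pct / 100.0 * width)
    return "[" + "#" * filled + "-" * (width - filled) + "]"
```

Clamping matters because capacitive probes can read outside the calibration range (e.g. a disconnected probe), and the display should show 0% or 100% rather than a nonsense value.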

The v1.1 model of the Plant Monitor includes an HTS221 temperature/humidity sensor, providing the same functionality as the Telemetry Station project on top of the existing Plant Monitor features. A third generation, intended as a commercial product, is in development. It will have a custom enclosure and a redesigned PCB, retaining feature parity with the first generation while dramatically reducing cost and maximizing ease of use.


Clarity Telemetry Station

  • Wireless temperature and humidity monitoring
  • Real-time environmental data collection
  • Bluetooth integration with Clarity system

Environmental Monitoring Station

I developed a multi-sensor telemetry system using the Zephyr Real-Time Operating System that aggregates environmental data from multiple sensors and provides wireless access to the Clarity robot through Bluetooth Low Energy. This embedded system demonstrates advanced RTOS programming, sensor integration, and wireless communication protocols in a production-ready IoT device architecture.

The system integrates a VL53L0X time-of-flight laser distance sensor for precise ranging measurements, and an HTS221 environmental sensor providing both temperature and humidity readings. The firmware implements robust sensor initialization with device tree integration, ensuring proper hardware abstraction and driver compatibility within the Zephyr ecosystem. Each sensor is polled independently with error handling.
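On the wire, readings like these are typically exposed as a compact packed BLE characteristic. The layout below (little-endian int16 temperature in 0.01 °C, uint16 humidity in 0.01 %, uint16 distance in mm) is a hypothetical example, not the station's actual GATT layout; the Zephyr firmware is C, and Python is used here to show the pack/unpack technique a client would mirror.

```python
import struct

# Hypothetical 6-byte telemetry characteristic: little-endian
# int16 temperature (0.01 degC), uint16 humidity (0.01 %RH),
# uint16 distance (mm, VL53L0X range). Layout is an assumption.


def pack_telemetry(temp_c: float, humidity: float, distance_mm: int) -> bytes:
    return struct.pack("<hHH",
                       round(temp_c * 100),
                       round(humidity * 100),
                       distance_mm)


def unpack_telemetry(payload: bytes) -> dict:
    temp, hum, dist = struct.unpack("<hHH", payload)
    return {"temp_c": temp / 100.0,
            "humidity": hum / 100.0,
            "distance_mm": dist}
```

Fixed-point scaling keeps the characteristic at 6 bytes, well under the default BLE ATT payload, while a signed temperature field still covers sub-zero readings.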

Clarity's toolset includes a tool call that connects to the station and polls its data, allowing the robot to determine the ambient temperature or humidity when asked. This project showcases expertise in RTOS development, embedded systems architecture, sensor interfacing protocols, Bluetooth Low Energy specification implementation, and real-time data processing.
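A tool exposed to the model this way is declared as a JSON schema. The sketch below shows a plausible OpenAI-style function-tool definition for the telemetry poll; the tool name and fields are hypothetical, and Clarity's actual schema may differ.

```python
# Hypothetical OpenAI-style tool definition for polling the telemetry
# station. Name and parameters are illustrative assumptions.

TELEMETRY_TOOL = {
    "type": "function",
    "function": {
        "name": "read_telemetry_station",
        "description": "Poll the BLE telemetry station for ambient "
                       "temperature, humidity, and distance readings.",
        "parameters": {
            "type": "object",
            "properties": {
                "fields": {
                    "type": "array",
                    "items": {
                        "type": "string",
                        "enum": ["temp_c", "humidity", "distance_mm"],
                    },
                    "description": "Which readings to return "
                                   "(all if omitted).",
                },
            },
            "required": [],
        },
    },
}
```

When the model emits a call to this tool, the robot's runtime performs the BLE poll and feeds the readings back as the tool result, which the model then phrases as a spoken answer.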


Explainable AI

  • Modular XAI framework for cognitive-motor research
  • Agile development in 6-person research team
  • Configurable interface for experimental parameterization

XAI Research Project - More Information

I contributed to a capstone research project at the University of Maryland's Neuromotor Control and Learning Lab, developing explainable AI integration tools for human-robot teaming studies. This software forms a critical component of experimental infrastructure being used in forthcoming NMCL research intended for peer-reviewed publication. The project focused on creating flexible, research-grade tools that enable cognitive-motor studies exploring how humans interact with and understand AI decision-making in collaborative robotic environments.

Working in a 6-person team using Agile methodologies, I designed and implemented a modular explanation generation framework that allows researchers to configure XAI parameters for use in experimental research. The system features a configurable interface supporting variable timing and delivery modes for explanations, enabling precise control over when and how AI reasoning is communicated to human subjects. I also refactored the experiment loading logic, significantly expanding the platform's capacity to accommodate new experimental designs in cognitive-motor studies. The architecture emphasizes modularity and extensibility, allowing the lab to easily expand the scope of their research.
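Configurable timing and delivery modes of the kind described above can be captured in a small config object. This is an illustrative sketch only; the mode names and fields are assumptions, not the lab's actual API.

```python
from dataclasses import dataclass
from enum import Enum

# Illustrative explanation-delivery configuration for an XAI experiment.
# Mode names, fields, and defaults are assumptions.


class DeliveryMode(Enum):
    IMMEDIATE = "immediate"    # explain as soon as the AI acts
    DELAYED = "delayed"        # explain after a configured delay
    ON_REQUEST = "on_request"  # explain only when the subject asks


@dataclass
class ExplanationConfig:
    mode: DeliveryMode = DeliveryMode.IMMEDIATE
    delay_s: float = 0.0  # used only in DELAYED mode

    def due(self, elapsed_s: float, requested: bool) -> bool:
        """Should an explanation be shown at this point in the trial?"""
        if self.mode is DeliveryMode.IMMEDIATE:
            return True
        if self.mode is DeliveryMode.DELAYED:
            return elapsed_s >= self.delay_s
        return requested  # ON_REQUEST
```

Keeping the scheduling decision in one pure method like `due` lets researchers swap experimental conditions by editing a config file rather than the experiment code, which is the point of the modular design.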

This project demonstrates expertise in research software development, human-computer interaction design, and collaborative development practices. The modular framework design showcases my ability to create flexible, user-friendly tools for non-technical researchers while maintaining robust software architecture. The work highlights proficiency in Agile development methodologies, experimental software design, and contributing to academic research infrastructure that will impact the broader scientific community's understanding of human-AI collaboration.
