Projects
Introducing High Dynamic Range with Zynq UltraScale+ MPSoC EV
Xilinx Technical Marketing Engineer Internship
This video demonstrates how Zynq UltraScale+ EV devices handle High Dynamic Range (HDR) media, a crucial capability for applications such as broadcasting, live events, collaboration, gaming, and live streaming. I built the demonstration to help prospective customers and Field Application Engineers understand how the integrated H.264/H.265 4:2:2 10-bit Video Codec Unit (VCU) in these devices supports HDR transport. It showcases how Xilinx's 4K video IP and multimedia stack let applications maintain HDR fidelity through the encode and decode process using industry-standard methods, a capability essential for powering the next generation of multimedia experiences.
Smart Pot: An Automated Plant Watering System
Feb 2022 - May 2022
EE459: Embedded Systems
This collaborative capstone project involved the design and construction of a smart pot – an embedded system for automated plant care. The system monitors key environmental factors that influence plant health:
Soil Moisture: Continuously tracks moisture levels in the soil to ensure optimal hydration for the specific plant.
Light: Measures ambient light conditions to determine if supplemental lighting is necessary.
Temperature: Monitors the surrounding temperature to identify potential stress factors for the plant.

Based on the sensor data, the embedded system dispenses water as needed, automating the irrigation process and promoting healthy plant growth. This project demonstrates the application of embedded systems principles in creating a practical solution for domestic plant care.
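The decision logic described above can be sketched as a simple threshold check on each round of sensor readings. This is an illustrative Python sketch only; the actual project ran on a microcontroller, and the sensor names, units, and threshold values here are hypothetical, not the project's real calibration.

```python
# Hypothetical thresholds; the real system's values were tuned per plant.
MOISTURE_THRESHOLD = 30    # percent; water when soil is drier than this
LIGHT_THRESHOLD = 200      # lux; below this, supplemental lighting may help
TEMP_RANGE = (10.0, 35.0)  # degrees C considered safe for the plant

def evaluate(moisture_pct, light_lux, temp_c):
    """Map one round of sensor readings to actions/alerts."""
    return {
        "water": moisture_pct < MOISTURE_THRESHOLD,
        "needs_light": light_lux < LIGHT_THRESHOLD,
        "temp_alert": not (TEMP_RANGE[0] <= temp_c <= TEMP_RANGE[1]),
    }
```

On the real hardware the same check would run in a periodic loop, driving a pump when `water` is true.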
Skills: Microcontrollers · Solderable Breadboard Prototyping · Embedded Systems · Electrical Engineering · Electronics Hardware Design · Prototyping
Video Search Algorithm
This project develops a content-based video search system, enabling users to find specific video segments using short clips as queries. Unlike traditional text-based methods, this system analyzes video and audio content to create unique "digital signatures" for each video in a database.
1. Digital Signature Generation: Each video is processed to create a signature using:
Shot Boundary Detection: Identifying scene transitions.
Color Analysis: Extracting dominant colors.
Motion Estimation: Quantifying movement.
Audio Analysis: Examining sound patterns.
2. Sub-Signature Matching: A query clip's "sub-signature" is generated and compared to database video signatures to find the best match, identifying both the containing video and the clip's starting frame.
3. Output and Display: A custom video player displays the matched video, starting at the identified frame, with basic playback controls and audio/video synchronization.
This system provides efficient and precise content-based video search, allowing users to quickly locate specific segments within large video archives.
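The sub-signature matching step above can be sketched as a sliding-window comparison. This is a minimal sketch that assumes each frame has already been reduced to a single feature value (e.g. a dominant-color or motion score); the actual system combines shot boundaries, color, motion, and audio into richer signatures.

```python
def best_match(database, query):
    """Find the database video and start frame best matching a query clip.

    database: {video_name: [per-frame feature values]}
    query:    [per-frame feature values] for the query clip
    Returns (video_name, start_frame) minimizing sum-of-squared-differences.
    """
    best_name, best_start, best_dist = None, -1, float("inf")
    for name, signature in database.items():
        # Slide the query sub-signature across the full signature.
        for start in range(len(signature) - len(query) + 1):
            dist = sum((signature[start + i] - q) ** 2
                       for i, q in enumerate(query))
            if dist < best_dist:
                best_name, best_start, best_dist = name, start, dist
    return best_name, best_start
```

The returned start frame is what the custom video player seeks to before beginning playback.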
