Tyler Spitzer-Wu


// University of Michigan
// B.S. Urban Technology
// Taubman College of Architecture and Urban Planning
// Minors in Computer Science and Real Estate Development
// Class of 2027

I am fascinated by how technology can be woven into the processes and systems of the built environment to improve both the efficiency of development and the delight of the user experience. Cities are centers of innovation, culture, and economy with unparalleled vitality; I am interested in how scalable products and efficient services can be deployed within them to respond to the needs of a city’s users. I aspire to apply my programming skills, design intuition, and entrepreneurial approach to positively impact cities through product, design, and technology roles.


Product Intern
    Cedar (AI + architecture startup)
    Summer 2025

VP External Affairs, Co-Founder
    URB Consulting
    Oct 2024 - Present

Design Intern
    Fletcher Studio Landscape Arch. + Urban Design
    Oct 2023 - Dec 2023





tylersw@umich.edu
Résumé
LinkedIn

Traffic Detection (Computer Vision)

UT 402: Urban AI
MAR 2025

// Python, OpenCV, YOLO, DeepSORT
// computer vision, machine learning, AI
For this project, I partnered with a few classmates to create a computer vision tutorial tailored to the urban context. City planners worldwide want to improve the efficiency of their transportation systems, which means they need data on which roads carry heavy traffic and whether residents are walking and biking.

We turned to computer vision to answer this question, aiming to create a script that could take video feed of intersections and roads and count cars, pedestrians, and bicyclists. Check out the output produced by our script below!





Doing this project inspired a lot of thought about how computer vision could seriously expedite data collection across city systems. This type of technology enables planners to gather data on the amount and type of traffic in different parts of the city at different times of day with little or no human intervention. These data points are actionable: they can provide insight into the success of transit efforts, the economic and social vitality of certain communities, and virtually all policy initiatives related to transit. And live tracking of traffic levels and speeds paves the way for forward-thinking planning methods, such as dynamic speed limits and other tech-driven traffic engineering strategies.

We used OpenCV and YOLO for object detection and DeepSORT for tracking objects between frames. See this Substack post for more details on our implementation, to see our actual code, and even to try it yourself if you would like!
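To give a flavor of the counting step, here is a minimal sketch in plain Python. It assumes the detection and tracking stages (YOLO + DeepSORT) have already run, so each frame arrives as a list of (track ID, class name) pairs; the `count_unique_objects` helper and the mocked frames below are illustrative, not our actual code.

```python
from collections import defaultdict

def count_unique_objects(frames):
    """Count objects of each class across a clip.

    frames: iterable of per-frame detection lists, where each detection
    is a (track_id, class_name) tuple. Because the tracker keeps the
    same track_id for an object across frames, counting unique IDs per
    class avoids double-counting the same car in every frame it appears.
    """
    seen = defaultdict(set)  # class_name -> set of track IDs observed
    for detections in frames:
        for track_id, class_name in detections:
            seen[class_name].add(track_id)
    return {cls: len(ids) for cls, ids in seen.items()}

# Three mocked frames of an intersection feed: car #1 and person #2
# persist across frames, bicycle #3 appears twice, car #4 appears once.
frames = [
    [(1, "car"), (2, "person")],
    [(1, "car"), (2, "person"), (3, "bicycle")],
    [(3, "bicycle"), (4, "car")],
]
print(count_unique_objects(frames))  # {'car': 2, 'person': 1, 'bicycle': 1}
```

The key design point is that raw per-frame detections alone cannot produce traffic counts; the tracker's persistent IDs are what let the script tell a new vehicle from one it has already seen.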