Tyler Spitzer-Wu


// University of Michigan
// B.S. Urban Technology
// Taubman College of Architecture and Urban Planning
// Minors in Computer Science and Entrepreneurship
// Class of 2027

I am interested in using technology in the built environment to improve the quality of life of those who interact with it. I am fascinated by cities as unparalleled centers of innovation, culture, and economic vitality. I aspire to use my programming skills, design intuition, and entrepreneurial attitude as a practitioner creating positive, meaningful change within and between cities and communities.


tylersw@umich.edu
Résumé
LinkedIn


For Cedar


MAR 2025

Paver Patterns

Fletcher Studio Landscape Architecture
DEC 2023

// Rhino, Grasshopper (Python), Photoshop (generative fill)
During my time as a Design Intern at a landscape architecture studio, one assignment had me prototyping paver patterns and color schemes. Shown here are drafts of my designs, both on paper and sketched digitally. While I enjoyed sitting down to sketch out these patterns, I quickly realized that the final output would never justify the amount of time it took to ideate such simple designs by hand.

I adjusted this process where it made sense, such as using Generative Fill in Photoshop to complete repeated patterns, as seen in the chevron pattern in the top-left example. The biggest time-saver, however, came when we were working in Rhino with Grasshopper, where we could write scripts to quickly test random paver color patterns on a pre-built grid of pavers. Unfortunately, I do not have any images of the Grasshopper algorithm or its outputs, but essentially my coworker and I used seeded random number generation in Python to map colors onto the grid of pavers. This let us quickly test random designs digitally rather than sketching out every possible scenario.
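The original Grasshopper definition is gone, but the core idea is simple enough to sketch in plain Python. The palette, grid size, and printed output below are hypothetical stand-ins; in the real script the colored cells were mapped onto Rhino paver surfaces rather than printed to a console.

import random

# Hypothetical palette and grid size; the real script ran inside
# Grasshopper and applied colors to Rhino paver surfaces.
PALETTE = ["buff", "charcoal", "terracotta"]
ROWS, COLS = 8, 12

def random_paver_grid(seed):
    """Assign a random palette color to every paver in a ROWS x COLS grid."""
    rng = random.Random(seed)  # seeded so any layout we like can be regenerated
    return [[rng.choice(PALETTE) for _ in range(COLS)] for _ in range(ROWS)]

if __name__ == "__main__":
    # Try a handful of seeds and eyeball the results, the digital
    # equivalent of sketching several pattern options by hand.
    for seed in range(3):
        print("--- seed", seed, "---")
        for row in random_paver_grid(seed):
            print(" ".join(color[0].upper() for color in row))

Even this toy version hints at the issue described below: uniform random choice happily repeats the same color several cells in a row.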

We discovered that the seemingly “random” arrangements of colors looked a lot less random than we expected. Unnatural-looking strips of the same color kept appearing, which conflicted with the organic look we were going for. This showed me that using computation to emulate human tendencies or naturalness won’t always produce “finished” work; sometimes it needs a finishing touch from a human designer to feel right. At the end of the day, though, the computational process was much faster than the hand-drawn one and produced equally satisfactory results.

What I learned: 
  • computational design + human finishing touch = fast, good work

What I would do next time: 
  • experiment with more complex Python scripts that generate designs beyond just randomly patterned colors
  • think more about what types of designs a computational approach enables that would be impossible or very difficult to do by hand
  • think more about what this actually means for the person laying the pavers


Supervised Learning Comparison



UT 402: Urban AI
FEB 2025


// Python, Scikit-Learn
For this assignment, we were given a dataset from an experiment in Georgia that examined how well a machine learning model could analyze street-view imagery and predict whether a property was vacant. We were shown how accurate a decision tree model was at the task, then had to test a different model on the same data and compare the two. I chose k-nearest neighbors, which produced “safer” predictions in that it rarely labeled an occupied house as vacant, but less actionable ones in that it also missed many houses that actually were vacant.
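A minimal sketch of the comparison, assuming the assignment data is a CSV of numeric image-derived features plus a binary vacant label (the file name and column names are placeholders, not the actual dataset):

import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score, confusion_matrix

# Placeholder file/column names; the real assignment data differed.
df = pd.read_csv("vacancy_features.csv")
X = df.drop(columns=["vacant"])
y = df["vacant"]

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42, stratify=y
)

models = {
    "decision tree": DecisionTreeClassifier(random_state=42),
    "k-nearest neighbors": KNeighborsClassifier(n_neighbors=5),
}

for name, model in models.items():
    model.fit(X_train, y_train)
    pred = model.predict(X_test)
    print(name, accuracy_score(y_test, pred))
    # Rows are true occupied/vacant, columns are predicted, so false
    # "vacant" calls and missed vacancies show up in separate cells.
    print(confusion_matrix(y_test, pred))

Putting the two confusion matrices side by side makes the false-positive versus false-negative trade-off easy to see at a glance.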

My main takeaway was that in scenarios where machine learning is used to make predictions a human would otherwise make, it can greatly accelerate the process but cannot completely replace the human element. Here, perhaps the best outcome is that the model gives a human a short list of properties to check for vacancy in person, rather than someone having to drive to every single house in an area. It makes me wonder how we could build models that take over the process entirely, and, if we get to that point, how we would get policymakers to actually trust the technology.

What I learned:
  • how to use Scikit-learn packages
  • how different ML models are better suited to different tasks
What I would do next time:
  • think more about how this would work if unsupervised


AI Document Processor



UT 402: Urban AI
FEB 2025

// Python, ChatGPT API, prompt engineering
I used ChatGPT API calls to build a resume reader (more generally, a document reader) that lets the user investigate an uploaded document in natural language. The example above is the response it generated with my resume as input. I was impressed with how accurately it parsed the experiences on my resume and gave recommendations based on them. It also gave me confidence that my resume is formatted well, since I assume recruiters these days are using similar tools to scan applicant resumes.
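A minimal sketch of the kind of call behind this, using the OpenAI Python client; the system message, chunking, and model name are simplified stand-ins, not the exact prompts I tuned for the project:

from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

SYSTEM_MESSAGE = (
    "You are a resume reviewer. Answer questions about the resume text "
    "the user provides, citing specific experiences where possible."
)

def chunk(text, size=3000):
    """Naive fixed-size chunking; a smarter version would split on section headers."""
    return [text[i:i + size] for i in range(0, len(text), size)]

def ask_about_document(document, question):
    # A short resume usually fits in one request; longer documents would
    # need per-chunk calls plus a summarizing pass.
    context = "\n\n".join(chunk(document))
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model choice
        messages=[
            {"role": "system", "content": SYSTEM_MESSAGE},
            {"role": "user", "content": context + "\n\nQuestion: " + question},
        ],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    with open("resume.txt") as f:
        print(ask_about_document(f.read(), "What roles am I best suited for?"))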

I experimented with the system message and the document chunking to find an effective prompting setup for the API call. While doing so, I wondered how useful prompt engineering is when the inputs are generalized; for example, if I tell ChatGPT that a “work experience” section should start a certain way, will it still detect a work experience section with an unorthodox header? As AI tools become more widely adopted, is it the resume writer’s responsibility to reformat their resume to be more favorable to AI scanners, or should developers and recruiters build better NLP models, or better ways of using them, so that unorthodox resumes can still be read?

What I learned:
  • how to use ChatGPT API
What I would do next time:
  • experiment with “bad” resumes to see if the positive tone is maintained
  • think more about how this could be applied to other types of documents



Content-Aware Image Processing



EECS 280: Programming and Data Structures
FEB 2025


// C++, object-oriented programming
I created a content-aware image resizing program in C++ that resizes images to a desired width and height while preserving the most important content in the image. The algorithm uses the seam-carving method, finding chains of pixels within the image with the least importance and removing them.
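The course implementation was in C++, but the seam-finding step is compact enough to sketch here in Python with NumPy (the energy function is a simplified gradient magnitude, and none of this is the course's starter code):

import numpy as np

def energy(gray):
    """Simple energy map: gradient magnitude of a grayscale image."""
    g = gray.astype(float)
    dy = np.abs(np.diff(g, axis=0, append=g[-1:, :]))
    dx = np.abs(np.diff(g, axis=1, append=g[:, -1:]))
    return dx + dy

def find_vertical_seam(e):
    """Dynamic programming: cheapest top-to-bottom chain of connected pixels."""
    h, w = e.shape
    cost = e.astype(float)
    for r in range(1, h):
        up = cost[r - 1]
        left = np.roll(up, 1)
        right = np.roll(up, -1)
        left[0] = np.inf    # no neighbor past the left edge
        right[-1] = np.inf  # no neighbor past the right edge
        cost[r] += np.minimum(np.minimum(left, up), right)
    # Trace the seam back up from the cheapest bottom pixel.
    seam = np.zeros(h, dtype=int)
    seam[-1] = int(np.argmin(cost[-1]))
    for r in range(h - 2, -1, -1):
        c = seam[r + 1]
        lo, hi = max(0, c - 1), min(w, c + 2)
        seam[r] = lo + int(np.argmin(cost[r, lo:hi]))
    return seam

def remove_vertical_seam(gray):
    """Drop one least-important pixel from each row, narrowing the image by one."""
    seam = find_vertical_seam(energy(gray))
    h, w = gray.shape
    mask = np.ones((h, w), dtype=bool)
    mask[np.arange(h), seam] = False
    return gray[mask].reshape(h, w - 1)

Reaching a target width is then just a loop of energy, seam, remove; height works the same way on the transposed image.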

What I learned:
  • ChatGPT is great at suggesting edge cases to test for


Miscellaneous




MAR 2025

    Other things I have done for my Urban AI class:
    • ChatGPT API call that receives restaurant names and predicts the cuisine type

    Currently working on:
    • computer vision demonstration project on car traffic footage
    • digital euchre
    • stakeholder mapping for Ann Arbor’s bus systems