TriMet: Building a Robust Data Pipeline
I worked with TriMet bus location data, published daily through an API, and combined it with route data scraped from the TriMet website. After cleaning, validating, and transforming the records, I loaded them into a database, using an Apache Kafka pipeline to manage the daily data flow. Once enough data had accumulated, I built visualizations that revealed patterns in TriMet bus activity.
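The cleaning and validation step might look something like the sketch below. This is illustrative only: the field names (`vehicleID`, `routeNumber`, `latitude`, `longitude`), the geographic bounds, and the dedup key are assumptions, not the actual schema TriMet publishes.

```python
from typing import Optional

def clean_vehicle_record(raw: dict) -> Optional[dict]:
    """Validate and normalize one raw vehicle-position record.

    Field names here are hypothetical; returns a cleaned dict,
    or None if the record is unusable.
    """
    try:
        lat = float(raw["latitude"])
        lon = float(raw["longitude"])
    except (KeyError, TypeError, ValueError):
        return None  # drop records with missing or non-numeric coordinates
    # Rough Portland-area bounds: discard obviously bad GPS fixes
    if not (44.0 <= lat <= 46.5 and -124.0 <= lon <= -121.5):
        return None
    vehicle_id = str(raw.get("vehicleID", "")).strip()
    route = str(raw.get("routeNumber", "")).strip()
    if not vehicle_id or not route:
        return None  # can't join to route data without these keys
    return {"vehicle_id": vehicle_id, "route": route, "lat": lat, "lon": lon}

def clean_batch(records: list) -> list:
    """Clean a batch of raw records, dropping invalid rows and duplicates."""
    seen = set()
    out = []
    for raw in records:
        rec = clean_vehicle_record(raw)
        if rec is None:
            continue
        key = (rec["vehicle_id"], rec["lat"], rec["lon"])
        if key in seen:
            continue  # same vehicle reported at the same position twice
        seen.add(key)
        out.append(rec)
    return out
```

In a pipeline like the one described, a function along these lines would sit between the daily API pull and the Kafka producer, so only validated records ever reach the topic and, downstream, the database.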