Three NEIA innovators, John K., Ethan J., and Aaron Z., got the opportunity of a lifetime this summer: traveling to Taipei, Taiwan, to work in MIT's Media Lab. Lily Fu, Head of Collaborations & Strategic Partnerships, used her real-world connections to set up this partnership with Taipei Tech. If this opportunity weren't cool enough already, our innovators were the only high-school-aged students in a lab filled with graduate students.
The Media Lab can be described as a sort of “think tank.” The work was largely unstructured and allowed the participants to explore and create projects they found interesting.
Aaron wrote this about his summer experience: “I began my exploration in the lab by working on TurtleBots, furthering my Python knowledge while learning new skills like mapping and navigation. TurtleBots piqued my interest in computer vision and augmented reality, which prompted me to study OpenCV, a computer vision and machine learning library. I learned and implemented edge detection, face recognition, and AprilTags, an object-tracking marker system. I decided to pursue the path of AprilTags, so I spent a week solidifying basic Python fundamentals. I am now working with a lab member on a technology that will allow anyone to control any robot with an AprilTag marker with just their phone camera.”
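For readers curious about the edge detection Aaron mentions, the core idea is simple: an edge is a place where brightness changes sharply, so you estimate the brightness gradient at each pixel and flag the pixels where it is large. This is a minimal pure-Python sketch of that idea, not Aaron's actual lab code (real projects would use OpenCV, which does this far faster):

```python
# Edge-detection sketch: estimate the brightness gradient at each interior
# pixel and mark pixels where the gradient magnitude crosses a threshold.
# Purely illustrative -- OpenCV implements this far more efficiently.

def detect_edges(image, threshold=2):
    """image: 2D list of grayscale values; returns a 2D list of 0/1 edge flags."""
    h, w = len(image), len(image[0])
    edges = [[0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = image[y][x + 1] - image[y][x - 1]   # horizontal change
            gy = image[y + 1][x] - image[y - 1][x]   # vertical change
            if (gx * gx + gy * gy) ** 0.5 >= threshold:
                edges[y][x] = 1
    return edges

# A dark square on a light background: edges show up along the boundary.
img = [
    [9, 9, 9, 9, 9],
    [9, 1, 1, 1, 9],
    [9, 1, 1, 1, 9],
    [9, 1, 1, 1, 9],
    [9, 9, 9, 9, 9],
]
edges = detect_edges(img)
```

The AprilTag markers Aaron moved on to build on the same machinery: the detector finds the tag's high-contrast edges, then decodes the black-and-white grid inside them.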
John K. couldn’t wait to share all the things he’s been working on at MIT. Talking to John K. can be enlightening and dizzying; it’s a peek into how his brain works. Four projects colliding at once. A world where he writes academic proposals just for fun. Everything written in programming languages.
One project he worked on was making data more accessible for the average citizen. Observing his surroundings in Taiwan, he noticed a really nice public transport system called UBike: publicly available bikes, paid for by the hour, similar to the United States’ BlueBikes. The system was awesome, but John realized there was one problem:
“If you pull up to somewhere, you don’t know if they have bikes. Because the only way to figure out how many bikes there are is to go to their website, which is frankly really bad. It’s lagging, it’s glitchy,” John said. The goal: to do things ChatGPT can’t. To make an AI that thinks more like a human. In the demo John showed off, his team created an AI program that could automatically search the web and tell the user how many bikes were available at different UBike docking stations across the city. They pitched the idea to Foxconn, a leading chip producer headquartered in Taiwan, which approved the idea and provided funding to take it to market. “But that’s a very long-term project,” John said. He wants to see how big the idea can scale and how many cities it can improve.
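At its core, the lookup John's demo performs is: fetch station-status data, then answer "which stations near me actually have bikes?" The sketch below shows that final step on made-up data; the station names and field names are invented for illustration and are not the real UBike data schema:

```python
import json

# Illustrative only: station names and JSON field names are invented,
# not the actual UBike API schema John's program queries.
sample = json.loads("""
[
  {"station": "Taipei Tech Gate", "bikes_available": 7,  "docks_total": 20},
  {"station": "Daan Park",        "bikes_available": 0,  "docks_total": 30},
  {"station": "Songshan Station", "bikes_available": 12, "docks_total": 24}
]
""")

def stations_with_bikes(stations, minimum=1):
    """Return {station name: available bikes} for stations meeting the minimum."""
    return {s["station"]: s["bikes_available"]
            for s in stations if s["bikes_available"] >= minimum}

available = stations_with_bikes(sample)
```

The hard part of the real project, as John describes it, isn't this filtering step; it's getting an AI to find and read a laggy, glitchy website on its own, the way a human would.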
His second project was finding a more empathetic way to design urban-planning software, which today is purely quantitative and can only consider data points. In order to work toward simulating a city of ten thousand people, he attempted, under the supervision of MIT’s City Science director, to simulate ten people on a college campus. That means simulating their entire lives: What are their transactions? How much CO2 do they create? How do they talk to each other? His first attempts? Too expensive. “It was too detailed. Every time I made this schedule, it cost MIT two dollars. That’s not feasible.”
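To give a flavor of what "simulating a person's day" means in this kind of agent-based model, here is a toy sketch: each simulated person follows a daily schedule, and the model tallies simple quantities like CO2 from travel. Every name and emission figure below is invented for illustration; this is not MIT's software:

```python
from dataclasses import dataclass

# Toy agent-based sketch in the spirit of John's project. The emission
# figures (kg CO2 per trip) are invented placeholder numbers.
CO2_PER_TRIP = {"walk": 0.0, "bike": 0.0, "bus": 0.6, "car": 2.3}

@dataclass
class Person:
    name: str
    schedule: list  # list of (activity, travel_mode) pairs making up one day

    def daily_co2(self):
        """Sum the emissions from each trip in this person's schedule."""
        return sum(CO2_PER_TRIP[mode] for _, mode in self.schedule)

# Two simulated campus residents with different daily routines.
campus = [
    Person("A", [("class", "walk"), ("lab", "bike"), ("home", "bus")]),
    Person("B", [("class", "bus"), ("gym", "walk"), ("home", "bus")]),
]
total_co2 = sum(p.daily_co2() for p in campus)
```

The cost problem John ran into comes from detail: generating each agent's schedule with a large language model costs money per agent per day, which is why two dollars per schedule doesn't scale to ten thousand people.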
Four weeks into the program, John found out from his supervisor at the Media Lab that MIT wanted to keep him on the team once he returned to Boston. This semester, he’ll be continuing his project, pursuing a more feasible, human-centered process and software for urban planning.