Kang Ming Tay has a strong background in software engineering, with experience across a range of companies and organizations. In 2021, Kang Ming Tay joined Supabase as a Software Engineer, contributing to the development and scaling of the open-source Firebase alternative. Prior to this, Kang Ming Tay participated in the Cloud Sprint Student Programme at Google in 2020, gaining practical experience in cloud computing. In the same year, they served as a Software Engineer Intern in site reliability at Apple. In 2019, Kang Ming Tay worked as a Software Engineer Intern at d1g1t Inc. Before that, they held Software Engineer internships at Pitchspot, Wunder Travel, and Savant Degrees, working on a variety of projects. Additionally, Kang Ming Tay served as a First Lieutenant and MTI NSmen Officer Trainer in the Singapore Armed Forces from 2015 to 2017.
Kang Ming Tay completed a Bachelor's degree in Computer Engineering at the National University of Singapore from 2017 to 2021. In 2019, they pursued entrepreneurship studies at the University of Toronto. From 2018 to 2020, they participated in the NUS Overseas Colleges program, specifically the NOC Toronto (NCTO) Batch 2. Prior to university, they attended National Junior College from 2009 to 2014, studying physics, chemistry, math, and economics.
In addition to their formal education, Kang Ming Tay has obtained several certifications: "Large-Scale Computing" from the National University of Singapore in June 2021, "Architecting with Google Compute Engine" from Coursera in July 2020, and "Associate Cloud Engineer" from Google in August 2020. Kang Ming has also completed a series of deep learning courses on Coursera, including "Deep Learning Specialization," "Sequence Models," "Convolutional Neural Networks," "Structuring Machine Learning Projects," "Improving Deep Neural Networks: Hyperparameter tuning, Regularization and Optimization," and "Neural Networks and Deep Learning," all obtained between August and October 2019.