Ken Nguyen
About Ken Nguyen
Ken Nguyen is a Software Developer at Semios in Vancouver, British Columbia, where he maintains an OCR platform and develops data pipelines. He has worked at Semios since 2019, following a diploma in Computer Studies at Langara College, and is proficient in multiple programming languages.
Work at Semios
Ken Nguyen has been a Software Developer at Semios since 2019. In this role, he maintains an OCR platform that processes large volumes of PDF reports, digitizing and standardizing their contents for ingestion into a data warehouse. He previously worked at Semios as a Web Developer for five months in 2018, and before that as an Intern Web Developer for seven months the same year. His current responsibilities include developing a new data pipeline that performs heavy image processing on scalable Kubernetes infrastructure.
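As a sketch of the kind of standardization step such a platform performs before warehouse ingestion (the field names, date formats, and schema here are hypothetical illustrations, not Semios internals), a pipeline stage might normalize raw OCR-extracted records like this:

```python
from datetime import datetime

def normalize_record(raw: dict) -> dict:
    """Normalize one raw OCR-extracted record into a warehouse-ready row.

    Hypothetical schema: OCR output arrives with inconsistent field names
    and date formats; the warehouse expects snake_case keys and ISO dates.
    """
    key_map = {"Report Date": "report_date", "Block ID": "block_id",
               "Reading": "reading"}
    row = {key_map.get(k, k.lower().replace(" ", "_")): v.strip()
           for k, v in raw.items()}
    # Coerce common date formats to ISO 8601 for consistent ingestion.
    for fmt in ("%m/%d/%Y", "%d-%b-%Y", "%Y-%m-%d"):
        try:
            row["report_date"] = datetime.strptime(
                row["report_date"], fmt).date().isoformat()
            break
        except ValueError:
            continue
    row["reading"] = float(row["reading"])
    return row

print(normalize_record(
    {"Report Date": "03/15/2021", "Block ID": "A-12", "Reading": " 4.7 "}))
# → {'report_date': '2021-03-15', 'block_id': 'A-12', 'reading': 4.7}
```

Normalizing at the edge of the pipeline, rather than in the warehouse, keeps downstream SQL simple because every report lands in one consistent shape.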
Education and Expertise
Ken Nguyen studied at Langara College, completing a Diploma in Computer Studies - Coop from 2016 to 2019, which underpins his software development work. He is proficient in several languages and runtimes, including Python, JavaScript (Node.js), and SQL, and has experience designing and implementing data pipelines with technologies such as Fivetran, AWS, and BigQuery.
Technical Skills and Responsibilities
Ken Nguyen's technical skills span a range of programming languages and technologies. He identifies and patches bugs across multiple platforms and helps teams troubleshoot UNIX issues. He automates processes through Node and Bash scripting and containerization, and performs software patching to keep Semios data pipelines running at their best.
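His day-to-day automation is written in Node and Bash; as a minimal language-neutral sketch of the same pattern in Python (the log format is a hypothetical "LEVEL: message" convention, not an actual Semios format), a small script might scan pipeline output and summarize error lines for triage:

```python
import re

def summarize_errors(log_text: str) -> dict:
    """Count occurrences of each distinct ERROR message in a log.

    Assumes hypothetical 'LEVEL: message' lines; real pipeline logs
    and tooling are not described in the source document.
    """
    counts: dict = {}
    for line in log_text.splitlines():
        m = re.match(r"ERROR:\s*(.+)", line)
        if m:
            msg = m.group(1)
            counts[msg] = counts.get(msg, 0) + 1
    return counts

sample = "INFO: start\nERROR: timeout\nERROR: timeout\nERROR: bad page\n"
print(summarize_errors(sample))
# → {'timeout': 2, 'bad page': 1}
```

The same scan is a one-liner in Bash (`grep '^ERROR:' pipeline.log | sort | uniq -c`); wrapping it in a function makes it easy to schedule or containerize.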
Data Pipeline Development
Ken Nguyen is currently developing a new data pipeline that performs heavy image processing. The pipeline runs on scalable Kubernetes infrastructure, which handles the scheduling and scaling of its containerized workloads. This work ensures that data scientists have access to the latest data for training and prediction.
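A pipeline like this is typically deployed as a set of worker replicas that Kubernetes can scale with load. The manifest below is a hypothetical sketch of such a Deployment; the names, image reference, and resource figures are illustrative assumptions, not Semios configuration:

```yaml
# Hypothetical Deployment for an image-processing worker (illustrative only).
apiVersion: apps/v1
kind: Deployment
metadata:
  name: image-processing-worker
spec:
  replicas: 3                  # scale horizontally as processing load grows
  selector:
    matchLabels:
      app: image-processing-worker
  template:
    metadata:
      labels:
        app: image-processing-worker
    spec:
      containers:
        - name: worker
          image: registry.example.com/pipeline/image-worker:latest
          resources:
            requests:          # guaranteed baseline for heavy image work
              cpu: "500m"
              memory: 1Gi
            limits:            # cap so one worker cannot starve the node
              cpu: "2"
              memory: 4Gi
```

Declaring CPU and memory requests and limits lets the scheduler pack image-processing workers efficiently while an autoscaler adjusts `replicas` to demand.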