Dan Bomer
About Dan Bomer
Dan Bomer is a Software Engineer specializing in Machine Learning at BigBear.ai, where he has worked since 2020. He holds a B.S. in Computer Science and Geoscience from Trinity University and has contributed to improved team efficiency and to tools for data collection and model training.
Work at BigBear.ai
Dan Bomer has worked at BigBear.ai since 2020 as a Software Engineer specializing in Machine Learning, based in the Greater San Diego Area. He has contributed to projects that enhance the organization's machine learning capabilities. His role includes onboarding and mentoring new team members, which has improved team workflow efficiency. He also developed an internal tool that helps analysts collect and curate high-quality datasets for training and testing machine learning models.
Education and Expertise
Dan Bomer earned a Bachelor of Science degree in Computer Science and Geoscience from Trinity University. This background gives him a foundation in both computational techniques and scientific analysis, which he applies in his current role at BigBear.ai. His expertise lies in machine learning, software engineering, and data processing methodologies.
Background
Before joining BigBear.ai, Dan Bomer worked for two months in 2017 as a Research Assistant at Trinity University. There he worked on post-processing techniques for experimental data, including optimizing image scoring through scene inference and implementing model cascading. This experience deepened his understanding of machine learning processes and data analysis.
Achievements
Dan Bomer has made significant contributions to machine learning processes, including transitioning the model creation process to a semi-supervised learning methodology, which reduced model deployment time by over 30%. He also built a tool for deploying new staging pipelines, streamlining the testing of new model iterations. These achievements reflect his focus on improving the efficiency and effectiveness of machine learning workflows.