Rick Tian
About Rick Tian
Rick Tian is a Senior Database Engineer at Thomson Reuters in Toronto, Ontario, Canada, where he has worked since 2008. He has a background in Telecommunications Engineering from China University of Geosciences and previously held a position as a Senior Software Engineer at IBM.
Work at Thomson Reuters
Rick Tian has been a Senior Database Engineer at Thomson Reuters since 2008, based in Toronto, Ontario, Canada throughout his 16 years in the role. His responsibilities include performance monitoring and data migration projects, and he has developed tools and frameworks that improve the efficiency of data processing and reporting within the organization.
Education and Expertise
Rick Tian studied Telecommunications Engineering at China University of Geosciences, earning his Bachelor's degree between 1999 and 2003. This background underpins his technical work in database engineering and software development.
Previous Experience at IBM
Before joining Thomson Reuters, Rick Tian worked at IBM as a Senior Software Engineer from 2003 to 2008. His five years there gave him software development and engineering experience that fed into his later expertise in database management and performance optimization.
Technical Contributions
Rick Tian created RRloader, a tool for generating Reuters Reports, and built a JMS framework, ntransread/writer, to process news and market data feeds. He manages performance monitoring with the Linux shell, Nagios, and Cacti, and uses performance tuning tools such as maatkit, HackMySQL, innotop, JMeter, and JProfiler.
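As a rough illustration of the kind of feed processing a JMS framework like ntransread/writer performs, the sketch below shows a minimal JMS consumer that reads messages from a market data queue. The broker (ActiveMQ), connection URL, and queue name are assumptions made for the example; this is a generic JMS pattern, not Tian's actual implementation.

import javax.jms.Connection;
import javax.jms.ConnectionFactory;
import javax.jms.JMSException;
import javax.jms.Message;
import javax.jms.MessageConsumer;
import javax.jms.Session;
import javax.jms.TextMessage;

import org.apache.activemq.ActiveMQConnectionFactory;

// Hypothetical feed consumer: a generic JMS pattern, not the actual ntransread/writer code.
public class FeedConsumer {
    public static void main(String[] args) throws JMSException {
        // Broker, URL, and queue name are placeholders for the example.
        ConnectionFactory factory = new ActiveMQConnectionFactory("tcp://localhost:61616");
        Connection connection = factory.createConnection();
        Session session = connection.createSession(false, Session.AUTO_ACKNOWLEDGE);
        MessageConsumer consumer = session.createConsumer(session.createQueue("MARKET.DATA.FEED"));
        connection.start();

        // Block on each incoming message and hand the payload to downstream processing.
        while (true) {
            Message message = consumer.receive();
            if (message instanceof TextMessage) {
                // A real feed handler would parse the record and load it into the database.
                System.out.println("Feed record: " + ((TextMessage) message).getText());
            }
        }
    }
}

A consumer along these lines would typically hand each parsed record to a loader or reporting pipeline; the internals of ntransread/writer are not described here.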
Data Migration Projects
Rick Tian has played a significant role in data migration projects at Thomson Reuters, including the migration from BDN to MediaConnect and the Epsilon email data migration. These projects reflect his ability to handle complex data transitions while preserving data integrity throughout the process.