Nader Hashemi
About Nader Hashemi
Nader Hashemi is a Data Engineer currently working at the Centers for Medicare & Medicaid Services and Fannie Mae, with over 25 years of programming experience, focused most recently on Python and PySpark.
Current Positions
Nader Hashemi currently works remotely as a Data Engineer/Analyst at the Centers for Medicare & Medicaid Services. He also holds a position as a Data Engineer at Fannie Mae in the United States, where he applies his expertise in cloud data warehousing and ETL processes.
Professional Experience
Nader Hashemi has accumulated significant experience in a variety of roles across multiple industries. He worked as a Developer at Industry Solutions Quality Analytics in Atlanta, Georgia, from 2020 to 2021. Before that, he was a Sr. Data Engineer at DHG Healthcare from 2019 to 2020 and an Analyst at PNC Bank for eight months in 2018-2019. His four-year tenure as an Analytics Consultant at InterContinental Hotels & Resorts spanned 2014 to 2018. He also supported Citibank, National Association in Database Marketing & SAS Development for three years. Earlier in his career, he served as a Senior Decision Solutions Analyst at Equifax and as a Sr. Business Analyst at WellCare Health Plans.
Educational Background
Nader Hashemi studied Electrical and Electronics Engineering at the Georgia Institute of Technology, where he earned a Bachelor of Engineering (BE) degree. His formal education laid a strong foundation for his subsequent career in data engineering and analytics.
Technical Skills and Expertise
Nader Hashemi has over 25 years of programming and development experience, currently centered on Python and PySpark, particularly in Databricks on AWS. He specializes in using the Databricks and PySpark APIs to build ETL pipelines and in scheduling end-to-end processes in cloud environments. His expertise includes ingesting data from diverse database sources and developing new data sources for cloud data warehouses. He is proficient in developing with Databricks tools against AWS S3 and Snowflake, converting SAS code to Python/Apache Spark, and working with AWS Redshift, Snowflake, Teradata, Postgres, and Oracle.
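To illustrate the extract-transform-load pattern described above, here is a minimal, self-contained sketch. It is not Nader Hashemi's actual code: the table name, column names, and filtering logic are all hypothetical, and plain Python with an in-memory SQLite source stands in for the Databricks/PySpark stack (where the extract would typically be a `spark.read.jdbc(...)` call and the load a Parquet/Delta write to S3).

```python
# Hypothetical ETL sketch: sqlite3 stands in for a JDBC source,
# and a plain-Python transform stands in for a PySpark DataFrame job.
import sqlite3


def extract(conn):
    """Pull raw rows from the source database (hypothetical schema)."""
    return conn.execute(
        "SELECT claim_id, amount, status FROM claims"
    ).fetchall()


def transform(rows):
    """Keep approved claims and derive a rounded amount column."""
    return [
        {"claim_id": cid, "amount": round(amt, 2)}
        for cid, amt, status in rows
        if status == "APPROVED"
    ]


def load(records):
    """Stand-in sink: a real pipeline would write to a cloud warehouse."""
    return sorted(records, key=lambda r: r["claim_id"])


def run_pipeline():
    # Set up a throwaway in-memory source so the sketch is runnable as-is.
    conn = sqlite3.connect(":memory:")
    conn.execute(
        "CREATE TABLE claims (claim_id INTEGER, amount REAL, status TEXT)"
    )
    conn.executemany(
        "INSERT INTO claims VALUES (?, ?, ?)",
        [(1, 120.456, "APPROVED"), (2, 75.0, "DENIED"), (3, 10.111, "APPROVED")],
    )
    return load(transform(extract(conn)))


if __name__ == "__main__":
    print(run_pipeline())
```

The three-stage split (extract, transform, load as separate functions) mirrors how such pipelines are commonly organized so that each stage can be scheduled and tested independently.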