Marius Parfenti
About Marius Parfenti
Marius Parfenti is a Senior Big Data Engineer at Endava in Bucharest, Romania, with extensive experience in big data platforms and cloud technologies.
Company
Marius Parfenti is currently employed at Endava in Bucharest, Romania, where he holds the position of Senior Big Data Engineer. Endava provides a wide range of services, including digital transformation consulting, agile software development, and related technology services.
Title
Marius Parfenti's title at Endava is Senior Big Data Engineer. In this role, he designs, develops, and maintains reliable, scalable ETL/ELT applications on big data platforms, ensuring that solutions meet customer requirements for functionality, performance, availability, scalability, and reliability.
Previous Experience
Before joining Endava, Marius Parfenti worked as a Big Data Engineer at Deloitte in Bucharest, Romania, from 2016 to 2018, and as a BI Solution Architect at Hewlett Packard Enterprise from 2015 to 2016. He also has experience as a Database Specialist at Softvision in 2015 and as a Database Administrator at ING Bank from 2011 to 2014. Earlier in his career, he worked as a Helpdesk Technician at ING Bank from 2010 to 2011.
Education and Expertise
Marius Parfenti studied at Academia de Studii Economice din București, where he focused on Data Modeling/Warehousing and Database Administration, earning a Master's degree in 2009. He also holds a Marketing degree from Universitatea „Spiru Haret" din București, completed in 2008. His technical expertise includes building CI/CD pipelines with Azure DevOps, designing and maintaining ETL/ELT applications, and working with Azure cloud distribution and storage services.
Technical Skills and Tools
Marius Parfenti has extensive experience with a variety of technical tools and platforms. He uses Kubernetes to run standalone Scala applications and Azure Data Factory for scheduling and pipeline orchestration. He develops batch and streaming jobs in Spark and Scala with Maven, and uses Hive and Databricks notebooks for reporting. His database skills include Azure SQL Server, Azure Cosmos DB, and Redis Cache. Additionally, he writes unit and integration tests with ScalaTest and Docker test containers, performs performance testing with JMeter, and tracks tasks in Jira.