Job Summary:
PensionPro Software is seeking a Sr. Data Engineer with extensive experience in Azure cloud services, data integration, and data transformation to lead the development and maintenance of our advanced data solutions. The ideal candidate will have a proven track record of implementing Azure-based data processing, storage, and analytics solutions that meet complex business requirements.
This position is located in Harrisburg, PA, and operates on a hybrid work model.
Responsibilities:
Design, develop, and implement robust and scalable data solutions using Azure technologies.
Manage data ingestion from multiple sources, ensuring data integrity and availability.
Develop and maintain data feeds with Azure Data Factory.
Implement data streaming solutions to support real-time analytics.
Develop and optimize ETL processes for data warehousing to support OLTP, OLAP, and BI reporting needs.
Conduct data modeling, normalization, and denormalization to design efficient data structures for Operational Data Store (ODS) and Data Warehouse solutions.
Craft complex T-SQL scripts, including triggers, stored procedures, and functions, to support data processing and reporting requirements.
Monitor and tune database and query performance to ensure optimal data processing speeds and system reliability.
Collaborate closely with the DevOps team to implement CI/CD pipelines for data infrastructure projects.
Produce and maintain detailed documentation for data models, ETL processes, and data pipeline architectures to ensure clarity and sustainability.
Skills & Qualifications:
Bachelor's degree in Electrical or Computer Engineering, or a related field.
Relevant Azure certifications highly preferred.
Minimum of 3 years of experience as a Data Engineer or Developer, with a strong focus on Azure Cloud Services.
Minimum of 2 years of experience with Azure Data Factory, Azure Databricks, Azure SQL Database, and other Azure data services.
Proficient in data integration, data ingestion, and real-time data processing techniques.
Advanced skills in Python, PySpark, Spark SQL, and T-SQL scripting.
Experience with data modeling, ETL development, and data warehousing principles.
Familiarity with CI/CD practices, Azure DevOps, and Git.
Proven ability to work with cross-functional teams to translate business requirements into technical solutions.