Parexel is looking for a detail-oriented Software Engineer to build and maintain reliable, high-performance data pipelines using Azure, Databricks, Snowflake, Denodo, and Power BI. This role supports mission-critical reporting and analytics by ensuring that clean, well-structured data is readily available to business stakeholders.
Key Responsibilities:
- Develop and maintain ETL/ELT pipelines using Azure Data Factory, Databricks, and Snowflake to ingest and transform data from multiple sources.
- Assist in configuring and maintaining Denodo virtualization layers, ensuring consistent and secure data access.
- Support the BI team in creating efficient data models for Power BI dashboards and reports.
- Conduct regular data quality checks and validation to ensure high data integrity.
- Troubleshoot and resolve data pipeline issues in collaboration with cross-functional teams.
- Implement data lineage, monitoring, and alerting to proactively detect data issues.
- Adhere to data security and compliance policies across all data processes.
Required Qualifications:
- Experience: 3–5 years in data engineering, with solid hands-on exposure to Azure, Databricks, and Snowflake; familiarity with Denodo and Power BI is a plus.
- Education: Bachelor’s degree in Computer Science, Engineering, Information Systems, or a related field.
Skills:
- Proficiency in Azure Data Factory, Databricks, and Snowflake.
- Good understanding of data modeling and data virtualization (Denodo experience a plus).
- Strong SQL and basic Python scripting skills.
- Familiarity with building or supporting Power BI data models.
- Strong analytical skills and attention to detail.
#LI-REMOTE