Join PwC as a Data Engineer to design and maintain Azure-based data pipelines and architectures, supporting internal corporate services.
Your role
Key responsibilities include:
- Design, develop, and maintain robust ETL/ELT pipelines using Azure Data Factory, Azure Databricks, and Microsoft Fabric.
- Build and optimize data architectures such as data lakes and data warehouses using Azure Data Services, Azure SQL (PaaS), and Power BI.
- Collaborate with data analysts and business stakeholders to understand data requirements.
- Ensure data quality, integrity, and security across all data systems, including authentication and authorization for databases.
- Monitor and troubleshoot data pipeline performance and reliability.
- Implement best practices for data governance, metadata management, and documentation.
- Manage memory and resource allocation for database systems.
- Develop database schemas, tables, and data dictionaries.
- Ensure data quality and integrity in databases, resolve performance issues, and provide corrective measures.
- Work with structured and unstructured data from various sources such as APIs, databases, and flat files.
About you
The company is looking for:
- Bachelor’s degree in Computer Science, Information Technology, or related field.
- Graduates from 2024 to 2025 from Universities, Polytechnics, Institutes of Technical Education (ITE), or other educational institutions, including those who completed National Service in 2024 or 2025.
- Candidates completing studies in 2025 but receiving qualifications in 2026 are welcome to apply.
- An Azure Data Engineer certification is a plus.
- Familiarity with Microsoft SQL Server (T-SQL) and programming languages such as Python or C#/.NET.
- Experience with data pipeline orchestration tools such as Azure Data Factory or Azure Databricks.
- Familiarity with cloud data platforms such as Microsoft Fabric.
- Strong understanding of data modeling and warehousing concepts.
This job may close before the stated closing date; you are encouraged to apply as soon as possible.