Job Description
Job Brief
We are seeking a Data Engineer (Azure and Databricks) to design, build, and maintain the data infrastructure and pipelines that support data-driven decision-making. In this role, you will collaborate with cross-functional teams to ensure data availability, reliability, and accessibility, enabling the organization to extract insights and deliver impactful solutions. This is a full-time remote position based in Portugal.
Responsibilities
- Design and build data models to support Data Science, Business Intelligence, and downstream datasets.
- Develop APIs and data products that integrate data across systems and processes.
- Monitor data pipelines and proactively communicate issues to leadership and stakeholders.
- Collaborate with business partners and developers to document and validate requirements for new initiatives.
- Coordinate cross-functional and cross-team efforts in partnership with developers and leadership.
- Deliver high-quality data engineering solutions aligned with business needs and technical best practices.
- Build and maintain Azure Data Factory (ADF) pipelines, Data Flows, Azure Functions, and Logic Apps.
- Work with Azure SQL Database, including stored procedures and views.
- Explore and document Power BI reports and datasets.
Requirements and Skills
Required
- Proven experience as a Data Engineer or Data Quality Engineer in fast-paced environments.
- Expert proficiency in SQL and SQL-like query languages.
- Strong experience with Python and structuring Python-based projects.
- Experience with Azure Data Factory and Databricks.
- Solid background in ETL and ELT development.
- Strong knowledge of data modeling methodologies (Kimball, Inmon, Data Vault).
- Hands-on experience with Azure SQL Database, stored procedures, and views.
- Experience with Azure Functions and Logic Apps.
Nice to Have
- Experience building automated unit and integration testing frameworks for data projects.
- Previous experience in Business Intelligence or as a Business Analyst.
- Familiarity with Infrastructure as Code tools (Terraform or CloudFormation).
- Experience with Tableau, Power BI, or Spotfire.
- CI/CD automation experience (GitLab preferred).
- Experience with batch and streaming data pipelines.
- Experience ingesting third-party data using APIs.
- Familiarity with integrating data platforms into observability tools.
- Expertise in predictive modelling.
- Strong communication and collaboration skills across technical and non-technical audiences.
Next Steps
Do you consider yourself the ideal candidate for this role? If so, take the next step and apply now. Our team will take care of the rest.