Data Engineer

Job description

Key Responsibilities

  • Design, develop, and maintain robust data pipelines for ingestion, transformation, and storage.
  • Implement and optimize ETL processes using tools such as Azure Data Factory and Alteryx.
  • Manage and maintain data warehouses and ensure data integrity and quality.
  • Collaborate with data analysts, data scientists, and business stakeholders to deliver reliable data solutions.
  • Develop scripts and automation using Python, PowerShell, and other relevant technologies.
  • Utilize Databricks for big data processing and advanced analytics workflows.
  • Monitor and troubleshoot data workflows, ensuring high availability and performance.
  • Implement scheduling and orchestration frameworks (e.g., Apache Airflow) for data processes.
  • Ensure compliance with data governance, security, and privacy standards.

Qualifications

  • Experience: 2-4 years of experience as a Data Engineer or in a similar role.
  • Technical Skills:
    • Proficiency in Python and SQL Server.
    • Hands-on experience with Databricks for data processing.
    • Strong knowledge of ETL tools such as Azure Data Factory and Alteryx.
    • Familiarity with PowerShell scripting and Apache Airflow for workflow orchestration (preferred).
  • Data Expertise: Practical experience in data ingestion, data transformation, and data warehousing.
  • Soft Skills: Strong problem-solving abilities, attention to detail, and excellent communication skills.
  • Education: Bachelor's degree in Computer Science, Information Systems, or a related field (or equivalent experience).