
Enterprise Data Warehouse Analyst - Perm

Job description

Responsibilities

  • Coordinate and collaborate with business SMEs and CoEs to understand the different forms of data available from various systems.
  • Build and manage data warehouse and data lake strategies across the group.
  • Set the scope for EDW projects, choose the right technology tools to support the requirements, and ensure all data needs are met.
  • Collect, analyze, and mine data, and help the business leverage the information stored in the data warehouse and data lakes.
  • Provision data connectivity to the business whenever required.
  • Train and support the business in making use of data from the EDW.
  • Devise plans for sunsetting legacy databases and provide users with access to historical information and data through reports or analysis dashboards.
  • Develop and execute database queries and conduct analyses.
  • Create visualizations and reports for requested projects.
  • Develop and update technical documentation.
  • Ensure project deliverables are delivered on time and meet quality standards.
  • Introduce tools for effective monitoring, alerting, and insights to ensure no service interruptions.
  • Participate in vendor selection and manage vendors throughout the project.

Requirements

  • Bachelor's degree in Computer Science or a related discipline
  • Strong knowledge of data warehouse design (e.g. dimensional modelling) and data mining, as well as other EDW techniques (e.g. data mart models, data lake catalogues, data ingestion and job processing)
  • In-depth understanding of database management systems, online analytical processing (OLAP), online transactional processing (OLTP) and ETL (extract, transform and load) frameworks
  • Well-versed in SQL Server BI Suite (SSIS, SSAS, SSRS)
  • Experience working with
    • Various market-standard relational databases (Oracle PL/SQL, MS Access, PostgreSQL, MS SQL Server, etc.)
    • Various API interfaces (Web Services, RESTful APIs) and a wide range of cloud-based technologies (e.g. Azure SQL, AWS and Google Cloud services)
    • Data visualization tools (e.g. Microsoft Power BI, QlikView, Tableau, etc.)
  • Good to have
    • Knowledge of NoSQL schema-free databases (MongoDB, HBase, Cassandra, etc.)
    • Knowledge of Hadoop cluster storage and other execution frameworks (e.g. Spark, Hadoop MapReduce, Hive, etc.)
    • Experience running Python scripts and machine learning modelling in Anaconda or Google Colab
  • Excellent written and verbal communication skills, interpersonal and collaborative skills
  • Ability to articulate and translate technological concepts into business terms