
Data Architect - financial institution ($75-95k + bonus)

Job description

Responsibilities

  • Lead the end‑to‑end design and architecture of enterprise data platforms, including data models, integration frameworks, data pipelines, and storage layers.
  • Define data architecture standards, patterns, governance principles, and best practices supporting analytics, reporting, trading systems, and operational platforms.
  • Architect scalable, high‑performance data solutions across structured, semi‑structured, and unstructured datasets using modern technologies (cloud and on‑prem).
  • Partner with engineering teams to implement robust data ingestion, transformation, and distribution pipelines with high reliability, quality, and lineage tracking.
  • Design logical and physical data models, conceptual architectures, and technical specifications for OLTP, OLAP, and real‑time/streaming use cases.
  • Drive adoption of cloud‑native components and modern data stack technologies, ensuring security, compliance, performance, and cost efficiency.
  • Conduct technical deep dives, architectural reviews, performance tuning, and optimization for data systems and workflows.
  • Collaborate closely with business leads, technology teams, and senior stakeholders to translate requirements into scalable data architecture solutions.
  • Mentor data engineers and analysts to uplift technical capabilities and architectural discipline.

Requirements

  • Bachelor's or Master's degree in Computer Science, Information Systems, Engineering, Data/AI disciplines, or related field.
  • 10+ years of experience in data architecture, data engineering, or similar technical roles within financial institutions, capital markets, consulting firms, or large enterprises.
  • Deep expertise in data architecture principles, data modeling (conceptual/logical/physical), and designing enterprise data ecosystems.
  • Strong technical proficiency in:
    • Python for data engineering and automation
    • SQL and advanced database design
    • Data warehousing, ETL/ELT frameworks, distributed processing
    • Cloud platforms and modern data stack components
  • Hands‑on experience with big data ecosystems (Spark, Databricks, Kafka, Flink, or similar) is preferred.
  • Strong knowledge of data governance, metadata management, lineage, security controls, and regulatory-grade data standards.
  • Proven ability to architect solutions involving streaming, near‑real‑time data, and high‑volume financial datasets.
  • Excellent problem‑solving skills with the ability to perform detailed technical analysis and provide architectural recommendations.
  • Strong communication skills with the ability to work across technology and business teams.
  • Strong command of both Chinese and English.