Job Title: Data Architect – Big Data
Location: USA – Remote (PST Time Zone)

Experience: 10+ Years Total | 3+ Years in Big Data (100TB+ environments)

About the Role
We are seeking an experienced Data Architect (Big Data) to join our team and lead the design, implementation, and optimization of enterprise-grade data solutions. This fully remote role is open to candidates based in the United States and requires alignment with Pacific Standard Time (PST) working hours. It is ideal for professionals with deep expertise in data modeling, Google Cloud Platform (GCP), and modern data platforms who can architect scalable, secure, and efficient data ecosystems to support business growth.

Key Responsibilities

  • Design, implement, and optimize conceptual, logical, and physical data models for enterprise-wide data initiatives.
  • Architect, build, and maintain scalable, high-performance data pipelines across structured and unstructured data sources.
  • Analyze business and system requirements to define data strategies, governance frameworks, and standards.
  • Evaluate and adopt modern data technologies (released within the last 2–3 years) to drive innovation and performance improvements.
  • Conduct performance tuning, testing, and troubleshooting to ensure system reliability and scalability.
  • Maintain data quality, security, and compliance with industry regulations.
  • Partner with cross-functional teams (Engineering, Product, Analytics) to deliver business-aligned data solutions.

Required Skills & Experience

  • 10+ years of experience in data architecture and data modeling (conceptual, logical, and physical).
  • 3+ years of hands-on experience working with Big Data systems (100TB+ datasets).
  • Strong expertise in SQL, Python, and entity-relationship diagram (ERD) design, with in-depth knowledge of database structure principles.
  • Advanced experience with GCP services, including BigQuery, Cloud Storage (GCS), Cloud Composer, and Cloud Functions.
  • Hands-on experience with dbt (data build tool) and data transformation workflows.
  • Proven ability to implement and maintain robust ETL/ELT pipelines.
  • Deep understanding of data mining, segmentation techniques, and data security best practices.
  • Familiarity with data visualization tools (Looker, Tableau, or Power BI preferred).