100% Remote // GCP Data Architect (BigQuery, Python, ETL, DWH)

Posted 2025-05-22
Remote, USA | Full-time | Immediate Start

Job Description:

    Primary Skills:
  • GCP data engineering services: BigQuery, Airflow, Dataproc, Dataflow
  • Data warehousing knowledge and strong SQL skills; working experience with data migration from on-premises systems to the cloud
    Roles and Responsibilities:
  • Collaborate with cross-functional teams to design and implement scalable, reliable systems on Google Cloud Platform, balancing performance, security, and cost-effectiveness.
  • Build data ingestion pipelines to extract data from various sources (Azure Blob Storage, Azure SQL, flat files, semi-structured sources, AWS S3) into the data warehouse in GCP.
  • Utilize GCP services to build robust and scalable data solutions.
  • Design, develop, and maintain data pipelines and implement data architecture on GCP using services such as BigQuery, Cloud Storage, and Cloud Composer.
  • Apply expertise in tools and technologies for collecting, cleaning, transforming, and modeling data to produce useful information.
  • Leverage GCP capabilities and technologies to migrate existing databases to the cloud.
  • Implement and optimize BigQuery tables and complex SQL queries for efficient data retrieval and performance.
  • Experience with data migration from on-premises databases to BigQuery, including BigQuery conversion.
  • Experience building and scheduling data pipelines with Cloud Composer (Airflow) and transforming data and files with Python.
    EDW (Enterprise Data Warehouse) and Data Model Design:
  • Experience with data modeling, data warehousing, and ETL processes.
  • Work closely with business stakeholders and analysts to design and implement data models for effective data representation and analysis.
  • Ensure data models meet industry standards and compliance requirements in the healthcare domain.
  • Contribute to the design and development of the enterprise data warehouse architecture.
  • Implement best practices for data storage, retrieval, and security within the EDW.
  • Apply domain-specific knowledge to ensure that data solutions comply with healthcare industry regulations and standards.
  • Stay current on industry trends and advancements in healthcare data management.
  • Work collaboratively with cross-functional teams, including business teams, analysts, and software engineers, to deliver integrated and effective data solutions.
  • Participate in code reviews and provide constructive feedback to team members.
    Qualifications (Required Technical Skills/Experience):
  • Bachelor’s degree in Computer Science, Information Technology, or a related field.
  • Proficiency in GCP and in-depth knowledge of GCP services, including BigQuery, Cloud Functions, and Cloud Composer.
  • Strong Python programming skills for data engineering.
  • Experience with data modeling, SQL, and EDW design.
  • Excellent problem-solving and analytical skills.
  • Strong communication and collaboration skills.
  • Proficiency in version control systems, particularly Git.
  • Strong understanding of data warehouse and data lake concepts.

Seniority level: Mid-Senior level
Employment type: Full-time
Job function: Information Technology
Industries: IT Services and IT Consulting
