Data Engineer

Remo Health
Full-time
Remote
$175,000 - $200,000 USD yearly

About us

Remo is building the new standard of dementia care. As a virtual dementia care provider, our expert clinical team designs personalized, comprehensive care around patient and family needs (instead of a one-size-fits-all approach). We empower family caregivers by connecting them with a vibrant community of other caregivers, expert content, and tools to manage the entire dementia journey – from anywhere, at any time.

Our mission is simple — to provide accessible, comprehensive, quality dementia care for every person who needs it.

About the role

We are looking for our first highly skilled Data Engineer to play a key role in developing and maintaining our Data and Analytics Platform. As a key member of our team at Remo, you'll be supporting our mission to provide accessible, comprehensive, and quality dementia care by engineering robust data pipelines, developing analytical models, and empowering our stellar team to leverage our data across the organization.

What you’ll be doing

  • Manage the full lifecycle of data at Remo Health, from ingestion to transformation and downstream consumption

  • Design, develop, and maintain data pipelines and ETL processes using ingestion platforms such as Fivetran and tools like Google Cloud Composer (Apache Airflow)

  • Manage the Customer Data Platform (CDP) and its lifecycle

  • Manage GCP services such as BigQuery, Dataflow (Apache Beam), Kubeflow, or Vertex AI Pipelines for data processing and model deployment

  • Implement and manage tools such as dbt and a semantic layer to support data transformation and modeling

  • Optimize SQL queries for data warehousing in BigQuery, including database partitioning strategies

  • Develop Python scripts and/or packages for data processing and automation tasks

  • Act as a lead analyst to empower and support analytical roles across various departments

  • Enable data discovery and provide expert guidance on augmenting and leveraging available data and obtaining missing data

  • Ensure compliance with data governance policies and data loss prevention (DLP) standards

  • Handle Protected Health Information (PHI), Personally Identifiable Information (PII), and de-identified data securely

  • Collaborate with data scientists to codify and deploy machine learning models to staging and production environments

What we’re looking for

  • 3+ years of experience as a Data Engineer and 3+ years of combined experience in an Analytics Engineering role in the healthcare industry

  • Experience building, maintaining, and monitoring robust ETL pipelines

  • Experience in a lead analytical role covering forecasting, cohort analysis, and time-series wrangling with healthcare and product data (required)

  • Experience developing a strategy for a data semantic layer in a healthcare setting (required)

  • Mastery of administering and augmenting BI and other analytical tooling such as Tableau or Power BI (required)

  • Hands-on Google Cloud experience (2+ years): BigQuery, Composer, Dataflow (required)

  • Expert-level SQL: Postgres and BigQuery (required)

  • Experience developing and maintaining Python packages used as part of data transformation pipelines (required)

  • Strong experience with transformation and modeling tools such as dbt

  • Proficiency with platforms such as Segment, Fivetran, and dbt

  • Experience with version control for data models (Git)

  • Strong understanding of CDC (change data capture)

  • Above all, excellent problem solving, analytical, and communication skills

  • Knowledge of HIPAA/HITRUST compliance and related security best practices

It's an added bonus if you have these skills:

  • Hands on experience with Terraform

  • Experience implementing a CDC solution

  • Experience collaborating with Data Science teams to codify and deploy ML models into production

  • Experience enabling the responsible use of AI within an organization

  • Experience developing BigQuery datasets, views, and tagging for DLP

  • Experience with additional GCP technologies such as Cloud Run / Functions, Scheduler, PubSub

Work Location

Remote within the U.S.

Compensation

The anticipated base salary range for this position is $175,000–$200,000 annually. Salary is determined by a combination of factors including level, relevant experience, and skills. The range displayed on each job posting reflects the minimum and maximum target salary for the position across all US locations.

Benefits 

  • 401(k) with employer matching up to 4% 

  • 100% employer-paid health insurance benefits for employee + family

  • Dental / vision benefits

  • Monthly wifi/cell reimbursement

  • Talkspace

  • Fertility benefits

  • 20 days PTO + 11 company holidays

  • 16 weeks of parental leave for birthing parents and 8 weeks for non-birthing parents

  • Pregnancy loss leave and bereavement leave

At Remo Health, we value diversity in the workplace because it allows us to better understand and meet the needs of our customers and the communities we serve. We want to ensure every job applicant is treated fairly and with respect regardless of race, national or ethnic origin, religion, age, gender, sexual orientation, or disability. If you require any support in the application process, including disability accommodation, please contact hr@remo.health.
We use E-Verify to confirm the identity and employment eligibility of all new hires: Participation Poster (PDF), Right to Work Poster (PDF). Background checks are required for all new hires.