Senior DevOps Engineer (US – Remote)

We are a small but rapidly growing company, and we are looking for a Senior DevOps Engineer to join our Cloud Engineering team. You will use your expertise to help us implement our DevOps vision across the whole company. You will be a key member of a multi-talented team that uses cutting-edge web and data technologies to implement the FHIR API standard for accessing patient electronic health record data in the digital healthcare industry.

In this role, you’ll get to:

  • Work with development teams on all parts of the DevOps culture: build, test, release, deploy, operate, monitor, plan and code
  • Create highly available, disaster-recoverable DevOps services, used internally by engineering teams to build the products we offer and externally by customers to understand the health of their purchased products
  • Focus initially on AWS, but help us build and scale toward cloud-agnostic approaches so we can also support GCP and Azure
  • Enhance Infrastructure as Code, Configuration as Code, and secrets management for hands-off CI/CD
  • Partner with our SDET team to deliver Test as Code
  • Interact with our Security team to integrate compliance into our products and pipelines
  • Use your DevOps expertise initially to help our Data Engineering team with their pipelines; long-term, you will help us implement our DevOps vision across all of Engineering
  • Ensure consistency and automation in our processes across UIs, APIs, serverless functions and analytics products
  • Help us customize DevOps services to create future MLOps services
  • Set standards and implement automation that allows Engineering to easily follow best practices
  • Mentor junior developers/engineers in technologies and DevOps culture
  • Sprint as part of an Agile team that implements Kanban for unplanned work and Scrum for planned work
  • Work with a team to bring patient healthcare data together to improve future health outcomes

We are looking for people who have:

  • 7+ years of DevOps/Data engineering experience
  • Experience with cloud deployment tools and methodologies: AWS CloudFormation, Docker, Kubernetes, Terraform
  • Proficiency with a CI/CD pipeline technology such as Jenkins, Travis CI, AWS CodePipeline, or GitHub Actions
  • Prior experience building/maintaining infrastructure for large data pipelines
  • An understanding of distributed AWS web application architecture principles
  • Competency in one or more programming languages; Python, Java, Kotlin, or JavaScript preferred
  • Proficiency with Linux and shell scripting
  • An understanding of analytics tools: SQL, R, Apache Spark, Apache Storm, Pig, Hive, Python, Scala, Jupyter Notebooks, and BI tools

You may also have:

  • Healthcare and/or bioinformatics experience
  • Experience with large-scale data processing platforms such as Spark, EMR, Apache NiFi, or Apache Airflow
  • BS/MS in Computer Science or Software Engineering, or commensurate prior experience

About 1upHealth
At 1upHealth, we are united by a shared goal: unlocking healthcare data to improve patient and financial outcomes.

As a leader in FHIR® interoperability, our core belief is that unleashing the power that lies within health data can improve the way our industry operates. We are on a mission to make it easier to access, aggregate, analyze, and share healthcare data. Through our secure FHIR platform, we offer a solution that breaks the barriers hindering innovation and interoperability, meets compliance regulations, and leverages the cloud for ubiquitous technology enablement.

Come join our mission-driven team and be a part of the future of healthcare.

Benefits
100% Paid Medical and Dental Insurance for Employees
Unlimited PTO
Equity
401(k)
Home Office Stipend
Internet/Phone Stipend
Parental Leave (16 weeks for birthing parents, 3 weeks for non-birthing parents)
Company Meetings with Free Lunch

To apply for this job, please visit jobs.lever.co.