We need great people to help lead the design and creation of software that transforms these complex scientific datasets. The cloud solutions you will work on can scale massively, to tens of thousands of concurrent pods, operating on data at the core.
At least 7 years' experience in software engineering
Strong skills in at least one general-purpose programming language (Go, Java, C#, Python, backend TypeScript, etc.)
A proven ability to build and maintain cloud-based infrastructure on a major cloud provider such as AWS, Azure, or Google Cloud Platform
Familiarity with creating and maintaining containerized application deployments with a platform like Kubernetes
Strong understanding of relational data models (RDBMS/SQL, query patterns and optimization, etc.). NoSQL experience is also a plus.
Experience developing for or architecting distributed software systems (use of message queues, scalable compute & storage, etc.).
A high level of comfort with unit testing and its associated methodologies
Proven experience building and maintaining data-intensive APIs using gRPC
Bonus points for:
Demonstrated proficiency with Go
Experience with Protocol Buffers and RESTful API design
Experience with Google Cloud Platform, Google Kubernetes Engine, or Kubernetes
Experience working with scientific datasets, or a background in the application of quantitative science to business problems
Exposure to scientific programming languages (R, Python, Julia, etc.)