Job Openings

6 open positions found

Director, Data and Analytics

FH Hutch Data Commonwealth
Category: Information Technology
Seattle, WA, US
Job ID: 13483

Overview

Cures Start Here. At Fred Hutchinson Cancer Research Center, home to three Nobel laureates, interdisciplinary teams of world-renowned scientists seek new and innovative ways to prevent, diagnose and treat cancer, HIV/AIDS and other life-threatening diseases. Fred Hutch’s pioneering work in bone marrow transplantation led to the development of immunotherapy, which harnesses the power of the immune system to treat cancer. An independent, nonprofit research institute based in Seattle, Fred Hutch houses the nation’s first cancer prevention research program, as well as the clinical coordinating center of the Women’s Health Initiative and the international headquarters of the HIV Vaccine Trials Network. Careers Start Here.

 

The Hutch Data Commonwealth (HDC) is a transformative initiative at Fred Hutch that is working to bring innovative big data capabilities to the fingertips of all Hutch investigators. Specifically, HDC develops infrastructure for the management and analysis of research data at large scale. In partnership with the Translational Data Science Integrated Research Center (TDS IRC), HDC supports investigators seeking to conduct data-driven projects and works collaboratively with center scientists to identify and source appropriate datasets, develop tools for accessing and visualizing data, and implement analytic approaches that are reproducible and responsive to scientific objectives.

Responsibilities

The Director, Data and Analytics directs the ongoing development of research data platforms and associated data and analytics services within the HDC and leads a team of data engineers and data analysts. In partnership with Product Engineering, the Data and Analytics team will enhance the analytic capability of the Center’s investigators through the introduction of innovative techniques and technologies such as machine learning, natural language processing and advanced data visualization. As part of the HDC leadership team, the Director will contribute to strategic planning, organizational development, and effort prioritization.


The Director, Data and Analytics should have strong data infrastructure and data architecture skills, a proven track record of leading and scaling data engineering teams, strong operational skills to drive efficiency and speed, strong project management leadership, and a strong vision for how data and analytics can proactively accelerate the work of Fred Hutch. Most importantly, the Director, Data and Analytics will have a passion for applying their skills in support of mission-driven research.


CORE RESPONSIBILITIES


Leadership:

  • Lead a team of Data Engineers, Data Analysts, and Machine Learning Engineers in the development and support of new infrastructure and data service initiatives
  • Oversee recruiting, hiring, all aspects of performance management, coaching and mentoring of team members, and employee development
  • Build a collaborative, transparent culture of trust where team members are empowered and inspired to do their best work
  • Develop team members to operate effectively in a fast-paced, lean product development environment
  • Be a consistent example of the center’s commitment to workplace respect, diversity and inclusion, and research integrity


Data and Analytics:

  • Continually assess needs of the center’s computational community and guide the development of appropriate data resources and services
  • In partnership with the TDS IRC, identify new technology and analytic tools to advance research activities
  • Own the technical roadmap for HDC data infrastructure
  • Manage and lead the production support aspects of data flow into HDC platforms
  • Protect data integrity and accuracy; work with data source owners to increase the quality and accuracy of source data
  • Make decisions, often difficult and/or unpopular, that support the goal of efficient data normalization and ETL processes; influence others to support those decisions
  • Identify, evaluate and implement cutting edge big data pipelines and frameworks required to provide requested capabilities to integrate external data sources and APIs


General:

  • Foster strong relationships with key teams outside of HDC, including Information Security and Information Technology
  • Stay current with emerging technologies and industry trends
  • Collaborate with researchers and clinicians to identify high-impact opportunities for data science applications
  • Help researchers understand and utilize HDC data resources
  • Ensure adoption of established best practices


SUPERVISION EXERCISED

10-12 data engineers and data analysts

Qualifications

Minimum qualifications:

  • Master's or PhD degree in Bioinformatics, Statistics, Biostatistics, Mathematics, Computer Science, Physics, or equivalent required, with a minimum of five years of related experience, including a minimum of two years in a management position
  • Demonstrated experience working with biomedical (clinical or research) data
  • Experience with messy, “real life” data sets


Technical skills:

  • Proficiency in R or Python
  • Knowledge of statistical analysis, machine learning, and predictive modeling
  • Familiarity with a variety of data formats and markup languages (e.g., XML, JSON, Markdown)
  • Experience with common data storage mediums (e.g., SQL, Excel, Access) as well as NoSQL models
  • Experience with Unix/Linux and distributed computing
  • Experience with source control tools such as GitHub and related DevOps processes
  • Experience with workflow scheduling/orchestration tools
  • Hands-on experience with big data platforms (e.g., Hadoop, Spark) and containers (e.g., Docker)


Qualities necessary for success:

  • A strong desire to explore, investigate, dig, and generally uncover patterns and puzzles in data while maintaining a strong sense of thoughtful and pragmatic solutions.
  • Ability to advise investigators and management in clear language about results and new directions; strong oral and written communication and critical thinking skills are necessary for this position.
  • Ability to lead multidisciplinary teams including statisticians, computational biologists, data engineers, epidemiologists, clinicians, and administrators

HDC Scientific Liaison

FH Hutch Data Commonwealth
Category: Information Technology
Seattle, WA, US
Job ID: 13985

Overview


The Hutch Data Commonwealth (HDC) at the Fred Hutchinson Cancer Research Center (the Hutch) drives the development of data-intensive research capabilities and infrastructure across Fred Hutch through software and data engineering, training, and strategic partnering.


We are seeking a charismatic individual with a background in bioinformatics or computational biology research to facilitate data-intensive partnerships and identify collaboration opportunities within the Hutch and with the broader scientific community. A strong candidate will have sufficient knowledge of high-throughput experimental methodologies, large genomic datasets, and computational analysis to successfully describe key concepts to both technical and non-technical collaborators. The Scientific Liaison will enable high-quality science through their ability to match research needs with the skill sets and personalities of potential collaborators, which may include but are not limited to: Hutch faculty, external researchers, undergraduate student interns, graduate student and postdoctoral researchers, capstone project students, and industry fellows.


The ideal candidate will be comfortable communicating across a wide range of backgrounds and levels of expertise, and will maintain strong relationships with the Hutch researcher community to better support their needs. They will be in continuous contact with researchers to fully understand their challenges and help identify partnership opportunities. This position requires excellent oral and written communication skills and a deep understanding of the research process and culture.


This role will report to the Director of Alliances and Data Strategy. The Scientific Liaison will provide input into the types and nature of institutional partnerships that would be most useful, and identify opportunities for wider impact through more intensive engagement.

Responsibilities

  • Understand key aims, concepts, and methods of Fred Hutch’s research portfolio. 
  • Be able to converse with researchers about their work, methods, and approaches.
  • Have familiarity with commonly used approaches and methods related to data-intensive research in the biomedical space.
  • Develop in-depth familiarity with researchers. Make connections between researcher needs and research outcomes and opportunities.
  • Facilitate translation of collaborator requests into practical steps, and determine whether and how they are achievable.
  • Make recommendations regarding the allocation of people, time, financial, and intellectual resources; track resource use and needs across projects.
  • Manage, maintain, and grow an engaged network of internal and external research collaborators.
  • Help with creation of job descriptions and recruiting materials related to research partnerships. 
  • Monitor collaborator satisfaction and assess their needs. 
  • Contribute to the strategic planning and management activities of the Alliances and Data Strategy team.
  • Facilitate sharing of knowledge and best practices in data science among researchers and others at the Hutch.
  • Other duties as required.

Qualifications

  • PhD in biomedical sciences, computational biology, bioinformatics, or other discipline related to data-intensive research; or equivalent experience 
  • Competency in data science skills, approaches, and languages like R and/or Python, SQL, machine learning, and version control
  • An empathetic mindset with a deep understanding of research processes, challenges, and opportunities, to facilitate relationship building and trust.
  • Strong oral communication skills to convey and explain information effectively to varying audiences
  • Strong organization and project management skills. Self-motivated and driven to make a difference.
  • Strong interest in open science and data sharing.
  • Experience in or willingness to learn product development concepts, practices, and processes.
  • Intellectually curious and adept at rapidly comprehending the general goals of cutting-edge biological research and the nuances of how data science intersects with these objectives.
  • Committed to working with diverse teams.
  • Desirable: Experience working in communication and/or outreach to researchers. 
  • Desirable: Familiarity with data formats and markup languages (e.g., XML, JSON, Markdown); common data storage mediums (e.g., SQL, Excel, Access); distributed computing; big data platforms (e.g., Hadoop, Spark); and containers (e.g., Docker)

Senior DevOps Engineer

FH Hutch Data Commonwealth
Category: Information Technology
Seattle, WA, US
Job ID: 12498

Overview


The Hutch Data Commonwealth (HDC) represents a new organization within the Fred Hutchinson Cancer Research Center with a mission to develop new capabilities and resources to facilitate the center’s interaction with large and complex data sets. HDC data and software engineers develop robust resources for the management and analysis of data in support of both local and extramural research activities. The HDC also serves as a nexus within Fred Hutch for data-centered partnerships with technology companies and academic organizations.

 

The DevOps Engineer works collaboratively with the HDC engineering and Product teams to identify and optimize the interdependencies between applications development and infrastructure operations to support continuous delivery of solutions.

Responsibilities

  • Monitor and tune the performance, reliability, and security of the infrastructure. Identify and correct bottlenecks in the system, while working with development teams on optimization and best practices.
  • Participate in a team responsible for configuration, packaging, and deployment of software
  • Provide subject matter expertise on application server technologies
  • Make key system design and integration decisions around tools, processes, and practices, enabling teams to apply DevOps practices
  • Be actively involved in the daily operational activities that impact important components and processes of the HDC engineering function
  • Execute deployment operations to monitor and improve pre-release, upgrade, and current versions of software
  • Evaluate compatibility of programs with existing communications hardware and software features
  • Develop operating procedures to support established standards
  • Improve operation and monitoring of advanced or complex features
  • Develop processes and interface requirements
  • Take responsibility for data development and definition of acceptance criteria

Qualifications

  • Bachelor’s in computer science or related field
  • Minimum of 4 years’ experience in a DevOps role.
  • Strong experience with provisioning and configuration management tools (e.g., Terraform, Puppet, Chef)
  • Strong experience with cloud services (AWS and Azure)
  • Solid scripting skills (Bash, Python)
  • Experience with software delivery automation, including continuous integration and continuous delivery paradigms
  • Solid network configuration skills (TCP/IP, load balancing, DNS)
  • Experience with Git version control
  • Linux administration proficiency
  • Experience with agile security tools (e.g., Vault, InSpec, ZAP)
  • Docker containers and container infrastructure (e.g., Kubernetes)
  • Database management, both SQL and NoSQL
  • Build automation and continuous integration tools (e.g., VSTS, Jenkins, Gradle)
  • Continuous delivery pipeline implementations (e.g., Jenkins)
  • Big data systems experience (e.g., Hadoop, Kafka, Cassandra, Spark)
  • Monitoring and logging systems (e.g., Splunk, CloudWatch, ELK)
  • High-availability (HA) and fault-tolerant (FT) systems provisioning, deployment, and production support
  • Good teaching/mentoring abilities
  • Solid documentation and presentation skills
  • Passion for technology (operating systems, clouds, clusters)
  • Automation maniac
  • Team player
  • Good verbal and written communication skills
  • Experience managing Solr/Elasticsearch environments

Software Development Engineer III

FH Hutch Data Commonwealth
Category: Information Technology
Seattle, WA, US
Job ID: 14104

Overview





With guidance, the Software Development Engineer III provides software solutions where no optimal solutions exist to support the mission of Fred Hutch. In this role, you will work with a collaborative team of engineers, product managers, and others. Excellent interpersonal and communication skills are necessary, along with a passion for building high-quality applications following software development best practices. You will primarily extend an existing web application, used by hundreds of Center scientists, designing and implementing new functionality in a PostgreSQL/Java/Angular stack. Currently on-prem, our roadmap includes moving some components to the cloud within the next year. We develop iteratively and work closely with our customers.


As part of the HDC software group, you may also become involved in other, smaller projects, including productionizing data science workflows or implementing data discovery tools. These projects utilize a range of technologies including Python, Docker, React, NodeJS, OAuth, and multiple cloud providers.


The Software Development Engineer III reports to the Software Development Engineer Manager or Director.

 

This position offers flexible scheduling options and a commitment to work-life balance.

Responsibilities

  • Provide development leadership within a multi-functional team to design, create and support a full-stack software solution, using Java, PostgreSQL and JavaScript (Angular).
  • Work with product managers and customers to understand and translate business needs to technical requirements.
  • Participate in an iterative product delivery model, applying Agile principles to continuously improve team processes.
  • Collaborate with Scientific Computing staff to streamline deployment processes, improve application monitoring and containerize services to support cloud deployment.
  • Contribute to, follow and maintain knowledge of current software development trends, best practices, industry tools and standards.
  • Coach/mentor junior developers.
  • Create and maintain documentation.
  • Participate in code and design reviews.
  • Troubleshoot and fix bugs.
  • Provide Tier 2 Support of software (business hours only).

Qualifications

Minimum: 

  • 5+ years professional experience in software development 
  • Expertise with modern object-oriented programming paradigms and languages, especially Java 
  • Database design and development, preferably PostgreSQL
  • UI Development experience using JavaScript frameworks, preferably Angular
  • Solid understanding of software testing methodologies. Ability to write test plans that include both manual and automated testing
  • Proven track record of delivering on multiple complex projects through all stages of the development lifecycle; design through deployment
  • Demonstrated leadership abilities in an engineering environment in driving operational excellence and best practices

 

Preferred: 

  • BS in Computer Science, Engineering or related field
  • Experience writing or consuming REST based APIs
  • Unix experience
  • Experience with gradle, git and IntelliJ IDEA
  • Experience building or extending LIMS (Laboratory Information Management System)
  • Interest in biomedical research
  • Ability to design and deliver formal and informal presentations on a variety of technical topics

Solutions Engineer

FH Hutch Data Commonwealth
Category: Information Technology
Seattle, WA, US
Job ID: 13903

Overview



The Hutch Data Commonwealth (HDC) at the Fred Hutchinson Cancer Research Center was created to establish the next-generation data infrastructure and data science capabilities in support of scientific research efforts at Fred Hutch. The HDC brings data engineering, data integration, database development, natural language processing, machine learning, user experience design, bioinformatics and medical informatics skills together. The Cascadia Data Alliance at HDC is a new program to establish a health research data ecosystem with organizations across the Pacific Northwest region. The Alliance will enable improved governance and collaboration to drive towards a common data platform to accelerate research and innovation across the community.


The Solutions Engineer will lead the technical design and strategy efforts for the HDC’s Cascadia data platform and associated products and services. They will develop plans, track progress, and align technical plans for the platform with key HDC goals. They will work closely with HDC engineers, product managers, and designers to ensure the technology created will meet the needs of Fred Hutch researchers and their collaborators.

Responsibilities

  • Lead the technical design and strategy efforts for HDC’s Cascadia data platform and associated products and services
  • Work collaboratively with HDC teams and with partners both internal and external to Fred Hutch to ensure a culture of inclusivity, feedback, and iteration
  • Act as a liaison between researchers and technical staff, translating feature requests into practical, incrementally-deployable, and useful software designs
  • Ensure the platform meets the needs of biomedical researchers across the Cascadia Data Alliance
  • Clearly communicate priorities, requirements and rationales to other technical staff, prioritize agile backlog(s), and collaborate with designers and developers so that Cascadia software products are delivered and deployed smoothly
  • Foster a culture of rapid iteration and informed risk-taking, sharing software roadmaps and prototypes widely, so that software development efforts align with broader Cascadia goals
  • Work with commercial partners to leverage industry best practices in CI/CD, scalability, security, etc.
  • Help write, test, deploy and maintain Cascadia software, participating in design and code reviews as needed
  • Research other data-sharing efforts and their technical underpinnings, learning from the successes and failures of other attempts as well as our own
  • Take on other duties as assigned given the dynamic nature of HDC’s work
  • Most importantly, have a passion for applying technical skills in support of mission-driven research, enthusiastically advocating for data sharing, research data management best practices, and open source/science community building

Qualifications

  • BS in biomedical sciences, computer science, library and information science, or another discipline related to data-intensive research and/or data management
  • 5+ years in development and/or professional product management roles supporting SaaS/cloud offerings, preferably in the biomedical research space
  • Proven track record researching, implementing and socializing research data solutions
  • Demonstrated experience working with biomedical (clinical or research) data; experience with messy, “real life” data sets
  • Experience applying agile software development practices, supporting software releases, and developing metrics to drive product success
  • Strong verbal, written, and organizational skills with the ability to communicate with a variety of technical and non-technical collaborators and drive projects to successful conclusions
  • Self-starter with the ability to multitask and balance many parallel tasks

Technical skills

  • Proficiency in R or Python
  • Knowledge of a variety of data formats and markup languages (e.g., XML, JSON, Markdown)
  • Experience with cloud environments, with a specific focus on Azure
  • Experience with source control tools such as GitHub and related DevOps processes
  • Experience with workflow scheduling/orchestration tools
  • Hands-on experience with big data platforms (e.g., Hadoop, Spark)
  • Demonstrated experience with containerization technologies in a production environment (Kubernetes experience preferred)
  • Desirable: Minimum of five years of experience with Linux system administration
  • Desirable: Minimum of five years of experience with data management technologies, Postgres and NoSQL databases preferred

System Engineering Manager

FH Hutch Data Commonwealth
Category: Information Technology
Seattle, WA, US
Job ID: 12885

Overview


The Scientific Computing team within the Hutch Data Commonwealth (HDC) is looking for an engineering manager who can be a generalist in managing people and high-performance computing infrastructure projects. Come join us and use your experience in customer service, systems engineering, and management to advance our mission to cure cancer. Half of this ten-person team will report to you; it is made up of talented generalists, developers, cloud engineers, and computer scientists who pride themselves on working collaboratively with limited ego. Over the last few years, the team has built and deployed the first seamless cloud-bursting high-performance computing (HPC) environment and the first open source object storage solution in the life sciences. The team is currently working on creating the lowest-cost high-performance file system with enterprise support in the country. This position is a unique opportunity to support a diverse range of customers, from beginners to AI experts, working both on-premise and in multiple clouds (AWS, GCP, Azure).

The Hutch Data Commonwealth (HDC) is a division with a vision to enable investigators to leverage all possible data in the effort to eliminate disease, by driving the development of data infrastructure and data science capabilities through strategic partnering and robust engineering, thereby establishing Fred Hutch as a recognized center of excellence in biomedical data science.  Your work will assist hundreds of investigators and their research staff to meet our goal of curing cancer by 2025.

You should be here. Come join us now!

Responsibilities

The Manager of HPC Engineering directs and oversees the HPC engineering functions in the design, development, installation, and maintenance of hardware and software for the Center's high-performance computing (HPC) systems. HPC systems are configured based on stakeholder needs and the strategic vision of Fred Hutch. Responsibilities include:

 

  • Ensure resource availability and allocation to meet operational and project commitments
  • Manage infrastructure projects and balance operational and project commitments
  • Define project scope and objectives, involving all relevant stakeholders and ensuring technical feasibility
  • Measure project performance using appropriate tools and techniques
  • Drive innovation and continuous improvement of the infrastructure to reduce the team’s operational burden
  • Ensure that customer support and consultative experience is commensurate with the high caliber and data-intensive science undertaken at Fred Hutch
  • Manage the day-to-day operations of the team, ensuring that engineers have what they need to run a world-class HPC and Storage environment
  • Drive documentation efforts to improve the experience of computational researchers of Fred Hutch and partners in our wiki (https://sciwiki.fredhutch.org/)
  • Partner with other Fred Hutch IT teams and build strong relationships to ensure the success of Scientific Computing initiatives
  • Other duties as assigned, given the dynamic nature within the Hutch Data Commonwealth

Qualifications

  • 1+ years in a team lead or manager position
  • Demonstrated project management skills
  • Data mining and analytical skills as well as proven process management skills
  • Strong understanding of support: tickets, monitoring, and metrics
  • 1+ years system engineering experience managing Linux systems or storage infrastructure
  • Previous experience in a research or educational setting is a plus
  • Familiarity with GitHub and GitOps/DevOps principles is a plus

 

Additional reasons to work here

 

  • A better work-life balance than you will find in many other places in Seattle (believe us!)
  • 403(b) Retirement savings plan with 7% matching   
  • On-site day care through Hutch Kids, based on availability  
  • Competitive salary, depending on experience.  
  • 12 days of annual leave, plus 1 additional day after every year of service
  • Twelve paid holidays annually  
  • Excellent medical, dental and vision plans   
  • Subsidized health insurance for family members and domestic partners  
  • On-site employee health clinic including PT   
