Job Details

Company: UnitedHealth Group
Requisition: Principal Data Engineer (107884008)
Category: Technology
Job Family: Applications Engineer
Location: Dublin, Leinster, Ireland

Principal Data Engineer - Dublin (Remote or Hybrid)

Optum, the fast-growing part of UnitedHealth Group, is a leading information and technology-enabled health services business. Our teams are dedicated to modernizing the health care system and improving the lives of people and communities.

Serving virtually every dimension of the health system, we work with a diverse set of clients across 150 countries – from those who diagnose and treat patients to those who pay for care, deliver health services, and supply the cures. Optum maintains operations across North America, South America, Europe, Asia Pacific and the Middle East. Our innovative partnerships provide technology and tools that enable unprecedented collaboration and efficiency. As a result, we can tap into valuable health care data to uncover insights and develop strategies for better care at lower costs.

Careers with Optum offer flexible work arrangements, and individuals who live and work in the Republic of Ireland will have the opportunity to split their monthly work hours between our Dublin or Letterkenny office and telecommuting from a home-based office in a hybrid or remote work model.

Primary Responsibilities

  • Develop platform capabilities to meet business requirements and produce data engineering deliverables to serve machine learning and business intelligence processes
  • Write code, and automate and leverage tools, to ingest and transform data following clean-coding principles, incorporating business logic defined in conjunction with stakeholders
  • Define best practices and standards in the Data area for processing and analyzing data
  • Own and manage the deliverables allocated to you, ensuring the code base carries accurate, concise documentation that stays aligned with the code, its testing, and its deployment
  • Migrate data from legacy systems to new solutions, such as from on-premises clusters to the cloud
  • Mentor and train junior members of the team
  • Design conceptual and logical data models, architecture diagrams, and flowcharts
  • Design and facilitate data-monitoring and observability models for deployed pipelines

Required Qualifications:

  • Experience working with data in a hands-on developer role, applying software engineering principles
  • Proven hands-on coding experience with Apache Spark and the related Big Data stack and technologies, preferably in Scala, otherwise PySpark
  • Experience working with both real-time and batch data, knowing the strengths and weaknesses of each and when to apply one over the other
  • Experience building data pipelines on AWS, Azure, or GCP, following best practices in cloud deployments
  • Ability to debug complex data issues while working on very large data sets with billions of records
  • Fluent in SQL (any dialect), with experience using window functions and other advanced features, having served in various capacities across the design, development, and deployment of data warehouses or relational/analytical/multi-dimensional data marts
  • Familiarity with virtual data warehousing or with Snowflake's data warehouse as a service (DWaaS)
  • Experience with Hadoop ecosystem tools (Hive, Pig, Sqoop, HBase, and/or MapReduce)
  • Understanding of DevOps tools, Git workflows, and building CI/CD pipelines
  • Experience working with Kubernetes and Docker, and knowledgeable about cloud infrastructure automation and management (e.g., Terraform)
  • Ability to work with business and technical audiences in business requirements meetings, technical whiteboarding exercises, and SQL coding/debugging sessions
  • Familiarity with Airflow or a similar orchestration tool
  • Ability to work a portion of monthly work hours from our Dublin or Letterkenny office is required

Preferred Qualifications:

  • Experience with shell scripting languages
  • Well versed in Python across multiple general-purpose use cases, not limited to developing data APIs and pipelines
  • Experience using Airflow to orchestrate data pipelines, including Airflow operators and hooks
  • Experience working with Apache Kafka, building appropriate producer/consumer applications
  • Familiarity with production-quality ML and/or AI model development and deployment
  • Knowledge of modern DevOps tools and practices around code-based release and deployment

Careers with Optum. Here's the idea. We built an entire organization around one giant objective: make the health system work better for everyone. So when it comes to how we use the world's largest accumulation of health-related information, or guide health and lifestyle choices, or manage pharmacy benefits for millions, our first goal is to leap beyond the status quo and uncover new ways to serve. Optum, part of the UnitedHealth Group family of businesses, brings together some of the greatest minds and most advanced ideas on where health care has to go in order to reach its fullest potential. For you, that means working on high-performance teams against sophisticated challenges that matter. Optum, incredible ideas in one incredible company and a singular opportunity to do your life's best work.SM

All telecommuters will be required to adhere to UnitedHealth Group's Telecommuter Policy.

Diversity creates a healthier atmosphere: UnitedHealth Group is an Equal Employment Opportunity/Affirmative Action employer, and all qualified applicants will receive consideration for employment without regard to race, color, religion, sex, age, national origin, protected veteran status, disability status, sexual orientation, gender identity or expression, marital status, genetic information, or any other characteristic protected by law.