
Job Details


Data Engineer - Project Delivery Specialist (64165)




Los Angeles, California, United States

Are you an experienced, passionate pioneer in technology: a solutions builder and roll-up-your-sleeves technologist who wants a collaborative, think-tank environment where you can share new ideas with your colleagues, without the extensive demands of travel? If so, consider an opportunity with Deloitte under our Project Delivery Talent Model. The Project Delivery Model (PDM) is a talent model tailored specifically for long-term, onsite client service delivery. PDM practitioners are local to project locations, which minimizes extensive travel and provides a full career path within the firm.

Work you'll do/Responsibilities

  • Design and build data transformations efficiently and reliably for different purposes (e.g. reporting, growth analysis, multi-dimensional analysis)
  • Design and implement reliable, scalable, robust and extensible big data systems that support core products and business
  • Establish solid design and best engineering practice for engineers as well as non-technical people

The Team

The US Core Technology Operations practice delivers large-scale software applications and integrated systems, and assists clients with architecture design, assessment, optimization, and definition. The practice develops service-oriented architecture (SOA) and other integration solutions that enable information sharing and management between business partners and across disparate processes and systems. It focuses on key client issues that impact the core business by delivering operational value, driving down the cost of quality, and enhancing technology innovation.

Qualifications


  • Bachelor's degree, preferably in Computer Science, Information Technology, Computer Engineering, or related IT discipline; or equivalent experience
  • Experience with Big Data technologies (Hadoop, MapReduce, Hive, Spark, Metastore, Presto, Flume, Kafka, ClickHouse, Flink, etc.)
  • Experience in performing data analysis, data ingestion and data integration
  • Experience with ETL (Extraction, Transformation & Loading) and architecting data systems
  • Experience with schema design, data modeling and SQL queries
  • Passionate and self-motivated about technologies in the Big Data area
  • Strong data & logical analysis skills
  • Limited immigration sponsorship may be available
  • Travel up to 10% annually

Preferred

  • Presto/Dremio
  • Hive
  • Data Analytics