
Job Details


AWS Data Engineering (62131)

Engineering and Architecture

Architectural Engineering



New York, New York, United States

Are you an experienced, passionate pioneer in technology? A systems professional who wants to work in a collaborative environment? As an experienced AWS Data Engineer, you will have the ability to share new ideas and collaborate on projects as a consultant without the extensive demands of travel. If so, consider an opportunity with Deloitte under our Project Delivery Talent Model. The Project Delivery Model (PDM) is a talent model tailored specifically for long-term, onsite client service delivery. PDM practitioners are local to project locations, minimizing extensive travel while providing you with a full career path within the firm.

Work you'll do/Responsibilities
As an AWS Data Engineer, you will be expected to perform the following:

  • Create and manage AWS services
  • Work with distributed systems as they pertain to data storage and computing
  • Build and support real-time data pipelines
  • Design and build data extraction, transformation, and loading (ETL) processes by writing Step Functions or custom data pipelines
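For illustration, an ETL pipeline orchestrated with Step Functions is typically expressed as an Amazon States Language (ASL) document chaining Lambda tasks. The sketch below is a minimal, hypothetical example; the function ARNs are placeholders, not resources mentioned in this posting.

```python
import json

# Minimal Amazon States Language definition sketching an
# extract -> transform -> load pipeline. The Lambda ARNs are
# hypothetical placeholders for illustration only.
etl_definition = {
    "Comment": "Sketch of an ETL pipeline as a Step Functions state machine",
    "StartAt": "Extract",
    "States": {
        "Extract": {
            "Type": "Task",
            "Resource": "arn:aws:lambda:us-east-1:123456789012:function:extract",
            "Next": "Transform",
        },
        "Transform": {
            "Type": "Task",
            "Resource": "arn:aws:lambda:us-east-1:123456789012:function:transform",
            "Next": "Load",
        },
        "Load": {
            "Type": "Task",
            "Resource": "arn:aws:lambda:us-east-1:123456789012:function:load",
            "End": True,
        },
    },
}

# In practice this JSON would be passed to Step Functions (e.g. via
# boto3's create_state_machine); here we only serialize it to confirm
# the document is well-formed.
print(json.dumps(etl_definition, indent=2))
```

Each `Task` state invokes one Lambda function and hands its output to the next state, which is the basic shape of the custom data pipelines described above.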

The Team

The US Core Technology Operations practice delivers large-scale software applications and integrated systems and assists its clients with architecture design, assessment, optimization, and definition. The practice aims to develop service-oriented architecture (SOA) and other integration solutions that enable information sharing and management between business partners and across disparate processes and systems. It focuses on key client issues that impact the core business by delivering operational value, driving down the cost of quality, and enhancing technology innovation.

Qualifications

  • Bachelor's degree, preferably in Computer Science, Information Technology, Computer Engineering, or related IT discipline; or equivalent experience
  • Must have 3-5 years of experience with AWS data services
  • Strong database experience across relational, columnar, NoSQL, and time-series databases
  • Working experience building and supporting real-time data pipelines using AWS Glue, Redshift/Spectrum, Kinesis, Firehose, PySpark, EMR, and Athena
  • Knowledge of and hands-on experience with AWS solutions including S3, SNS, SQS, DynamoDB, Redshift, and Amazon RDS
  • Experience designing and building data extraction, transformation, and loading processes by writing Step Functions or custom data pipelines
  • Nice to have: experience working with Hadoop and Databricks
  • Familiarity with log formats from various AWS services, such as S3 server access, CloudFront distribution, Lambda execution, ELB, and container execution logs
  • Experience creating AWS Lambda functions using Python or R scripts
  • Familiarity with AWS infrastructure-related services such as VPC, EC2 instances, network policies, and CloudWatch
  • Limited immigration sponsorship may be available