Cloud Data Engineer - Solution Specialist - Location Open (86644)
- Work with the team to evaluate business needs and priorities, liaise with key business partners and address team needs related to data systems and management.
- Translate business requirements into technical specifications; establish and define details, definitions, and requirements of applications, components and enhancements.
- Participate in project planning, identifying milestones, deliverables, and resource requirements; track activities and task execution.
- Generate design, development, and test plans; detailed functional specification documents; user interface designs; and process flow charts to guide programming execution.
- Develop data pipelines and APIs using Python, SQL, and potentially Spark on AWS, Azure, or GCP.
- Use an analytical, data-driven approach to build a deep understanding of a fast-changing business.
- Build large-scale batch and real-time data pipelines with data processing frameworks on the AWS, Azure, or GCP cloud platforms.
From our centers, we work with Deloitte consultants to design, develop and build solutions to help clients reimagine, reshape and rewire the competitive fabric of entire industries. Our centers house a multitude of specialists, ranging from systems designers, architects and integrators, to creative digital experts, to cyber risk and human capital professionals. All work together on diverse projects from advanced preconfigured solutions and methodologies, to brand-building and campaign management.
We are a unique blend of skills and experiences, yet we underline the value of each individual, providing customized career paths, fostering innovation and knowledge development with a focus on quality. The US Delivery Center supports a collaborative team culture where we work and live close to home with limited travel.
- 3+ years of experience with at least one of the following cloud platforms: Microsoft Azure, Amazon Web Services (AWS), Google Cloud Platform (GCP), or others.
- 3+ years of experience designing and building solutions utilizing cloud services such as EC2, S3, EMR, Kinesis, RDS, Redshift/Spectrum, Lambda, Glue, Athena, API Gateway, etc.
- 3+ years of experience in SQL, data transformations, statistical analysis, and troubleshooting across more than one database platform (Cassandra, MySQL, Snowflake, PostgreSQL, Redshift, Azure SQL Data Warehouse, etc.).
- 3+ years of hands-on experience with building ETL mappings.
- 3+ years of experience designing and implementing data warehouse and data mart environments for enterprise-level applications, including executing and debugging ETL mappings, interface code, configuration settings, etc.
- 3+ years of experience performing comparative analysis of databases (e.g., Oracle, PostgreSQL, Hadoop) and data warehouses to build a modernization roadmap.
- 3+ years of experience in the design and build of data extraction, transformation, and loading processes by writing custom data pipelines.
- 3+ years of experience with one or more of the following languages or technologies: Python, SQL, Kafka, and/or others.
- Bachelor's degree or equivalent work experience.
- Must be legally authorized to work in the United States without the need for employer sponsorship, now or at any time in the future.
- Must have and be able to maintain the required clearance for this role.
- Must have an active Secret clearance.
- Travel up to 10% annually.
- AWS, Azure and/or Google Cloud Platform Certification.
- Bachelor's degree or higher.
- Expertise in one or more programming languages, preferably Scala, PySpark, and/or Python.
- Experience working with either a MapReduce or an MPP system at any size or scale.
- Experience working with agile development methodologies such as Scrum, including sprint-based delivery.