Cloud Data Engineer (AWS, GCP and Azure) (59870)
- Manage teams in the identification of business requirements, functional design, process design (including scenario design and flow mapping), prototyping, testing, training, and the definition of support procedures.
- Formulate planning, budgeting, forecasting and reporting strategies.
- Manage full life cycle implementations.
- Develop statements of work and/or client proposals.
- Identify business opportunities to increase usability and profitability of information architecture.
- Experience with program leadership, governance and change enablement.
- Develop and manage vendor relationships.
- Lead workshops for client education.
- Manage resources and budget on client projects.
- Guide and drive the team by providing oversight.
Analytics & Cognitive
In this age of disruption, organizations need to navigate the future with confidence, embracing decision making with clear, data-driven choices that deliver enterprise value in a dynamic business environment.
The Analytics & Cognitive team leverages the power of data, analytics, robotics, science and cognitive technologies to uncover hidden relationships from vast troves of data, generate insights, and inform decision-making. Together with the Strategy practice, our Strategy & Analytics portfolio helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets.
Analytics & Cognitive will work with our clients to:
- Implement large-scale data ecosystems including data management, governance and the integration of structured and unstructured data to generate insights leveraging cloud-based platforms
- Leverage automation, cognitive and science-based techniques to manage data, predict scenarios and prescribe actions
- Drive operational efficiency by maintaining their data ecosystems, sourcing analytics expertise and providing As-a-Service offerings for continuous insights and improvements
- 6+ years of experience in data engineering with an emphasis on data analytics and reporting.
- 6+ years of experience with at least one of the following cloud platforms: Microsoft Azure, Amazon Web Services (AWS), Google Cloud Platform (GCP), others.
- 6+ years of experience in SQL, data transformations, statistical analysis, and troubleshooting across more than one Database Platform (Cassandra, MySQL, Snowflake, PostgreSQL, Redshift, Azure SQL Warehouse, etc.).
- 6+ years of experience in the design and build of data extraction, transformation, and loading processes by writing custom data pipelines.
- 6+ years of experience with one or more of the following scripting languages: Python, SQL, Kafka, and/or others.
- 6+ years of experience designing and building solutions utilizing various cloud services such as EC2, S3, EMR, Kinesis, RDS, Redshift/Spectrum, Lambda, Glue, Athena, API Gateway, etc.
- Bachelor's degree or equivalent work experience.
- Travel up to 20% (While 20% travel is a requirement of the role, due to COVID-19, non-essential travel has been suspended until further notice).
- Must be legally authorized to work in the United States without the need for employer sponsorship, now or at any time in the future.
- AWS, Azure and/or Google Cloud Platform Certification.
- Master's degree or higher.
- Expertise in one or more programming languages, preferably Scala, PySpark and/or Python.
- Experience working with either a MapReduce or an MPP system at any size/scale.
- Experience working with agile development methodologies such as Scrum, including sprint-based delivery.
- Must be willing to live and work in one of our Center locations:
1. Lake Mary, FL (Orlando area)
2. Mechanicsburg, PA (Harrisburg area)
3. Gilbert, AZ (Phoenix area)
- Limited sponsorship may be available.