IBM Application Architect - Big Data - 2 year Fixed Term Contract in Sydney, Australia

Job Description

Application Architect - Big Data - 2 year Fixed Term Contract - Sydney.

IBM is the largest technology and consulting employer in the world, serving clients in 170 countries. In this new era of Cognitive Business, IBM is helping to reshape industries by bringing together our expertise in Cloud, Analytics, Security, Mobile, and the Internet of Things. We are changing how we create. How we collaborate. How we analyse. How we engage. IBM is a leader in this global transformation so there is no better place to launch your career.

IBM Global Business Services (GBS) is a team of business, strategy and technology consultants enabling enterprises to make smarter decisions and providing unparalleled client and consumer experiences in cognitive, data analytics, cloud technology and mobile app development. With global reach, outcome-focused methodologies and deep industry expertise, IBM GBS empowers clients to digitally reinvent their business and get the competitive edge. We outthink ordinary. Discover what you can do at IBM. We are hiring.

We are currently recruiting an Application Architect - Big Data for a 24-month fixed-term contract, based in Sydney.

The Big Data Architect team is responsible for building and supporting the big data platform. As an Application Architect - Big Data, your responsibilities include –

  • Own the technical vision, architecture, design and development of big data platform components.

  • Oversee complete Hadoop administration using the Ambari management console.

  • Spearhead installation and initial configuration of Hadoop, including minor and major upgrades of the Hadoop ecosystem.

  • Manage, run and schedule Hadoop jobs, and configure the scheduler.

  • Rebalance, monitor and troubleshoot the cluster.

  • Monitor Hadoop cluster job performance and capacity, and plan for future growth.

  • Monitor Hadoop cluster connectivity, security and alerts.

  • Set up, configure and manage security for Hadoop clusters (Ranger plugins, Ranger ACLs, Hadoop ACLs).

  • Perform scheduled and ad hoc snapshots, backup and recovery for the Hadoop cluster.

  • Analyse logs for service check failures and job failures.

To ensure success in the role you will possess the following skills –

  • At least 3 years of Hadoop platform deployment and administration experience (Hortonworks Distribution)

  • At least 3 years of big data platform performance tuning and testing of Hadoop clusters and Hadoop MapReduce routines

  • Familiarity with the installation and configuration of Hadoop components (Hortonworks Distribution)

  • Solid skills in Unix/Linux Administration

  • Good knowledge of infrastructure, networking, hardware and software

The compensation package starts from $83,790 (full-time equivalent) and will be determined based on the successful candidate's relevant skills and experience.

If you tick these boxes and are ready to start your next challenge with a career at IBM, click “Apply” today. To find out more, head to

Required Technical and Professional Expertise

See above

Preferred Tech and Prof Experience

See above