Job Details

Sr. Developer - Kafka

Location: Bangalore, Karnataka

Date Opened: 05/15/2020

Job Type:

Job Number: 200001WT

Job Description

What This Position Is All About:

The Integration Team Big Data Developer is responsible for data-streaming program design, coding, unit testing, integration testing, and implementation of Confluent Kafka programs, primarily using Apache and Confluent Kafka clusters and Spark programming (Scala/Python/Java).
 


Who You Are:

  • You lead by example and maintain your composure under pressure
  • Your oral and written communication skills are clear and effective
  • You utilize your strong communication skills to inspire, direct, and lead your team to meet all proposed deadlines
You Also Have:
  • 4-6 years of development experience, with at least two years in Big Data development, including hands-on experience with Kafka, Apache Spark, Java/Scala, CI/CD, and Kubernetes/container technologies.
  • A strong understanding of both Apache and Confluent Kafka, including the pub/sub model, writing producers and consumers, transforming data within a streaming platform, and writing programs using the Kafka APIs and the Kafka Streams API.
  • Experience with continuous integration and delivery tools, including DevOps tools such as Kubernetes, Docker, Jenkins, Git, and Helm charts.
  • An understanding of common data formats such as JSON, Avro, and XML.
  • Proficiency in SQL scripting and familiarity with data lake concepts.
  • A Bachelor's degree in Information Technology or related field (preferred) or equivalent experience.
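The pub/sub model mentioned in the requirements above can be sketched with a minimal in-memory stand-in (plain Python, no Kafka broker or client library involved; the `Broker` class, topic names, and record fields here are all invented for illustration, not the Kafka client API):

```python
from collections import defaultdict

class Broker:
    """Toy in-memory stand-in for a Kafka cluster: each topic is an ordered log."""
    def __init__(self):
        self.topics = defaultdict(list)

    def produce(self, topic, value):
        # Producers append records to a topic.
        self.topics[topic].append(value)

    def consume(self, topic, offset=0):
        # Consumers read from an offset, mirroring Kafka's log-based model.
        return self.topics[topic][offset:]

broker = Broker()
broker.produce("orders", {"id": 1, "amount": 25.0})
broker.produce("orders", {"id": 2, "amount": 40.0})

# A "stream processor": consume, transform, and publish to a downstream topic.
for record in broker.consume("orders"):
    enriched = {**record, "amount_with_tax": round(record["amount"] * 1.18, 2)}
    broker.produce("orders.enriched", enriched)

print(broker.consume("orders.enriched"))
```

In a real deployment the same consume-transform-produce loop would be expressed with the Kafka Streams API or a Spark job against an actual cluster; this sketch only shows the shape of the model.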

As the Integration Big Data Developer, You Will:

 
  • Develop Spark jobs (in Scala, Python, or Java) to stream, publish, or consume data across various sources and data formats.
  • Apply Spark performance-tuning techniques.
  • Develop producers and consumers that work with Kafka topics, helping integrate systems and stream data upstream and downstream. Perform mapping translations and configure the Kafka stack for high data throughput.
  • Perform unit testing, QA, and work with business partners to resolve any issues discovered during UAT.
  • Conduct peer-review of mappings and workflows when required.
  • Maintain development and test data environments by populating the data based on project requirements.
  • Maintain all applicable documentation pertaining to each SDLC phase.
  • You have a proven and quantifiable track-record of success in delivering results within a large complex organization. 
  • You have the ability to quickly assess a new environment and develop solutions that support the business strategy, critical objectives, and cultural norms.
  • You are creative with a strategic mindset along with the ability to turn concepts into action. 
  • You are a change agent who is flexible, resilient, and able to thrive in a dynamic, rapid paced environment.
  • You embody a culture of taking smart risks and innovating to win.
  • You have strong analytical and problem solving skills.
  • You have the proven ability to work independently in a dynamic environment with multiple assigned projects and tasks.
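The duties above involve streaming data in various formats; parsing the same record from two of the formats named in the requirements is straightforward with the Python standard library (a hedged sketch; the payloads and field names are invented for illustration, and Avro is omitted because it requires an external package such as `fastavro`):

```python
import json
import xml.etree.ElementTree as ET

# The same illustrative record serialized in two of the formats listed above.
json_payload = '{"id": 7, "status": "shipped"}'
xml_payload = '<order id="7"><status>shipped</status></order>'

# JSON deserializes directly into native dicts and ints.
record_from_json = json.loads(json_payload)

# XML needs explicit traversal and type conversion.
root = ET.fromstring(xml_payload)
record_from_xml = {"id": int(root.get("id")), "status": root.find("status").text}

print(record_from_json, record_from_xml)
```

In a streaming pipeline this normalization step typically runs inside the consumer or a Spark transformation, so that downstream topics carry a single canonical representation regardless of the source format.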
 

Your Life and Career at HBC:

  • Be part of a world-class team; work with an adventurous spirit; think and act like an owner-operator.
  • Exposure to rewarding career advancement opportunities, from retail to supply chain, to digital or corporate.
  • A culture that promotes a healthy, fulfilling work/life balance.
  • Benefits package for all eligible full-time employees (including medical, vision and dental).
  • An amazing employee discount.


Thank you for your interest in HBC. We look forward to reviewing your application.

 

HBC provides equal employment opportunities (EEO) to all employees and applicants for employment.