Job Description
About the job
Position: Tech Lead (Spark and Python expertise)
Location: McLean, VA (Day-1 onsite; 3 days onsite, 2 days remote)
Duration: Long term
Pay rate: $67-$70/hr on C2C and $125,000-$130,000/annum
Client: Hexaware

Mandatory:
• 10+ years of experience in solution design and development of applications using Java 8+/J2EE, Spring, Spring Boot, microservices, and RESTful services, with strong Big Data experience and a background working on data-heavy applications
• Develop, program, and maintain applications using the Apache Spark open-source framework
• Work with different aspects of the Spark ecosystem, including Spark SQL, DataFrames, Datasets, and streaming
• Proven experience as a Spark Developer or a related role
• Strong programming skills in Java, Scala, or Python
• Familiarity with big data processing tools and techniques
• Experience with the Hadoop ecosystem
• Good understanding of distributed systems
• Experience with streaming data platforms
• Must have strong Big Data experience and a background working on data-heavy applications
• Must be strong in AWS cloud event-based architecture, Kubernetes, and ELK (Elasticsearch, Logstash & Kibana)
• Must have excellent experience designing and implementing cloud-based solutions across various AWS services (S3, Lambda, Step Functions, AMQ, SNS, SQS, CloudWatch Events, etc.)
• Must be well experienced in the design and development of microservices using Spring Boot, REST APIs, and GraphQL
• Must have solid knowledge and experience in NoSQL (MongoDB)
• Good knowledge of and experience with queue-based implementations
• Strong knowledge of and experience with ORM frameworks (JPA/Hibernate)
• Good knowledge of technical concepts such as security, transactions, monitoring, and performance
• Should be well versed in TDD/ATDD
• Should have experience with Java, Python, and Spark
• 2+ years of experience designing and implementing cloud-based solutions across various AWS services
• Strong experience with the DevOps toolchain (Jenkins, Artifactory, Ansible/Chef/Puppet/Spinnaker, Maven/Gradle, Atlassian tool suite)
• Very good knowledge of and experience with non-functional (technical) requirements such as security, transactions, and performance
• Excellent analytical and problem-solving skills
Nice to have:
• Experience with OAuth implementation using Ping Identity
• Experience in API Management (Apigee) or Service Mesh (Istio)
• Good knowledge and experience in queue/topic (ActiveMQ) based implementations
• Good knowledge and experience in Scheduler and Batch Jobs
• Experience with Unix shell scripting
• Preferably certified in AWS