
Senior Software Engineer (Spark and Python)

Government Employees Insurance Company • Maryland, US • On-site

Posted on: November 22, 2024
Employment Type: Full-time

Job Description

GEICO's Data Movement team, part of the Data, Security & Infrastructure (DSI) department, is seeking a highly motivated Senior Software Engineer to start or continue an IT career at GEICO on the GEICO Data Hub Telematics Project. We are looking for a Senior Software Engineer who will build and scale ELT/ETL pipelines for high-volume telematics data. The candidate will be responsible for delivering the vision and strategy for modern data, analytics, and artificial intelligence/machine learning (AI/ML) to provide an end-to-end ecosystem for data storage, ingestion, transformation, analytics, and AI/ML. We are an agile team that builds complex solutions, and you will work on the next generation of our platform.

Position Responsibilities

As a Senior Software Engineer, you will:
• Team up with architects, scrum masters, leads, managers, and directors in an Agile environment to make the telematics data on our Enterprise Data Platform accessible for the organization's needs
• Work with Data Architects and Analysts to build our next-generation telematics data platform in Azure
• Trailblaze the application of software development techniques such as automated testing and CI/CD to building data products
• Collaborate with and across Agile teams to design, develop, test, implement, and support technical solutions using full-stack development tools and technologies
• Use programming languages such as Scala and Python, open-source RDBMS and NoSQL databases, and cloud-based data warehousing services such as ADLS and Cosmos
• Share your passion for staying on top of tech trends, experimenting with and learning new technologies, participating in internal and external technology communities, and mentoring other members of the engineering community
• Write unit tests and conduct reviews with other team members to make sure your code is rigorously designed, elegantly coded, and effectively tuned for performance
• Bring intellectual curiosity, a solutions-oriented attitude, and enjoyment of learning new tools and techniques

Experience & Skills:
• 3+ years of experience designing, developing, implementing, and maintaining solutions for Big Data or data warehouse systems
• 2+ years of experience working in a cloud environment such as Azure, AWS, or another private or public cloud
• Solid experience bringing data into a centralized data repository, or manipulating the available data to build additional data sets for analytics and reporting purposes
• Experience maintaining data quality throughout the data lifecycle
• Experience with data modeling, source-to-target mapping, automated testing frameworks, CI/CD pipelines, and task automation using scripting
• Experience developing new and enhancing existing data processing components (data ingest, data transformation, data store, data management, data quality)
• Strong working knowledge of SQL and the ability to write, debug, and optimize SQL queries and ETL jobs to reduce the execution window or resource utilization
• Data engineering experience focused on batch and real-time data pipeline development and data processing/transformation using ETL tools such as Databricks
• Experience with cloud data warehouse solutions (Snowflake, Azure DW, Redshift, or similar technology in other private or public clouds)
• Complete software development lifecycle experience, including design, documentation, implementation, testing, and deployment
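
As a rough, framework-free sketch of what the source-to-target mapping in such a pipeline involves (all field names here are invented for illustration; a real telematics pipeline would run this kind of transform at scale on Spark or Databricks):

```python
# Hypothetical source-to-target field mapping for one ETL transform step.
SOURCE_TO_TARGET = {
    "veh_id": "vehicle_id",
    "ts": "event_timestamp",
    "spd_mph": "speed_mph",
}

def transform(record: dict) -> dict:
    """Rename mapped source fields to target fields; drop unmapped ones."""
    return {
        target: record[source]
        for source, target in SOURCE_TO_TARGET.items()
        if source in record
    }

raw = {"veh_id": "V123", "ts": "2024-11-22T10:00:00Z",
       "spd_mph": 61.0, "junk": None}
clean = transform(raw)  # unmapped "junk" field is dropped
```

The same mapping table can also drive automated tests, which is one way the "automated testing frameworks" requirement applies to data pipelines: each mapped field becomes an assertable contract between source and target schemas.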

Basic Qualifications:
• Bachelor's Degree in a computer-related field or equivalent professional experience required
• At least 2 years of experience in data engineering using an open-source technology stack along with cloud computing (AWS, Microsoft Azure, Google Cloud)
• At least 1 year of experience designing, developing, implementing, and maintaining solutions for data transformation projects
• At least 1 year of advanced working SQL knowledge and experience with relational databases, including query authoring (SQL) and working familiarity with a variety of databases
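
For a concrete sense of the query-authoring requirement, here is a minimal, self-contained SQL example using Python's built-in sqlite3 (the schema and data are invented; production work would target a warehouse such as Snowflake or Azure DW):

```python
import sqlite3

# In-memory database with a tiny, hypothetical trips table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trips (vehicle_id TEXT, miles REAL)")
conn.executemany("INSERT INTO trips VALUES (?, ?)",
                 [("V1", 12.5), ("V1", 3.0), ("V2", 40.0)])

# An index on the grouping column is a typical optimization step
# for reducing the execution window on large tables.
conn.execute("CREATE INDEX idx_trips_vehicle ON trips (vehicle_id)")

# Aggregate miles per vehicle.
rows = conn.execute(
    "SELECT vehicle_id, SUM(miles) FROM trips "
    "GROUP BY vehicle_id ORDER BY vehicle_id"
).fetchall()
# rows == [("V1", 15.5), ("V2", 40.0)]
```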

Preferred Qualifications:
• At least 2 years of experience working on real-time data and streaming applications (Spark Streaming or Kafka)
• Proficiency in Spark (Scala) and Python
• At least 1 year of experience with Databricks, ADF, and Event Hub
• At least 2 years of experience with ELT/ETL and building data pipelines
• Experience building self-healing capabilities within the pipeline
• Experience building pipelines that fail gracefully and recover with no manual intervention
• Working knowledge of telematics data sets is a plus
• Willingness to provide 24/7 on-call support on a rotational basis
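
One common way to approximate the "self-healing, no manual intervention" requirement is automatic retry with exponential backoff around each pipeline step. A stdlib-only sketch (the step and its failure mode are hypothetical; frameworks like Databricks jobs or ADF offer built-in retry policies that serve the same purpose):

```python
import time

def run_with_retries(step, max_attempts=3, base_delay=0.0):
    """Re-run a pipeline step with exponential backoff so transient
    failures recover without manual intervention."""
    for attempt in range(1, max_attempts + 1):
        try:
            return step()
        except Exception:
            if attempt == max_attempts:
                raise  # retries exhausted: surface failure to alerting
            time.sleep(base_delay * 2 ** (attempt - 1))

# Hypothetical flaky load step: fails twice, then succeeds.
calls = {"n": 0}
def flaky_load():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient connection error")
    return "loaded"

result = run_with_retries(flaky_load, max_attempts=5)
```

Re-raising after the final attempt is the "fail gracefully" part: the pipeline stops cleanly and alerts rather than silently swallowing a persistent error.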

Annual Salary
$76,000.00 - $236,500.00

The above annual salary range is a general guideline. Multiple factors are taken into consideration to arrive at the final hourly rate/annual salary offered to the selected candidate. Factors include, but are not limited to, the scope and responsibilities of the role, the selected candidate's work experience, education and training, the work location, and market and business considerations.

GEICO will consider sponsoring a new qualified applicant for employment authorization for this position.

Benefits:

As an Associate, you'll enjoy our Total Rewards Program* to help secure your financial future and preserve your health and well-being, including:
• Premier Medical, Dental and Vision Insurance with no waiting period**
• Paid Vacation, Sick and Parental Leave
• 401(k) Plan
• Tuition Reimbursement
• Paid Training and Licensures
• Benefits may be different by location. Benefit eligibility requirements vary and may include length of service.
• *Coverage begins on the date of hire. Must enroll in New Hire Benefits within 30 days of the date of hire for coverage to take effect.

The equal employment opportunity policy of the GEICO Companies provides for a fair and equal employment opportunity for all associates and job applicants regardless of race, color, religious creed, national origin, ancestry, age, gender, pregnancy, sexual orientation, gender identity, marital status, familial status, disability or genetic information, in compliance with applicable federal, state and local law. GEICO hires and promotes individuals solely on the basis of their qualifications for the job to be filled.

GEICO reasonably accommodates qualified individuals with disabilities to enable them to receive equal employment opportunity and/or perform the essential functions of the job, unless the accommodation would impose an undue hardship to the Company. This applies to all applicants and associates. GEICO also provides a work environment in which each associate is able to be productive and work to the best of their ability. We do not condone or tolerate an atmosphere of intimidation or harassment. We expect and require the cooperation of all associates in maintaining an atmosphere free from discrimination and harassment with mutual respect by and for all associates and applicants.

Industry: Technology


