Job Description
About the Role
ALTA IT Services is a leading provider of innovative solutions to government agencies. We are currently seeking a highly skilled Data Engineer to join our team.
The ideal candidate will have a strong background in data engineering and experience designing, implementing, and maintaining robust data pipelines. This role requires a deep understanding of data architecture, data warehousing, ETL processes, and big data technologies.
Key Responsibilities
• Build, maintain, and optimize data pipelines for extracting, transforming, and loading (ETL) data from various sources into our data warehouse.
• Integrate large, complex data sets that meet functional and non-functional business requirements.
• Design and implement scalable and reliable data warehousing solutions.
• Extract data from various sources, transform it into a usable format, and load it into Azure data storage solutions such as Azure Data Lake Storage, Azure SQL Database, or Azure Synapse Analytics.
• Develop and maintain data pipelines using Azure Data Factory and other relevant Azure technologies.
• Manage and optimize databases, ensuring data quality, integrity, and security.
• Automate manual processes, optimize data delivery, and redesign infrastructure for greater scalability.
• Collaborate with data scientists, analysts, and other stakeholders to understand data needs and deliver high-quality data solutions.
• Document data flows, processes, and system architecture to ensure clarity and knowledge sharing within the team.
Requirements
• An active DoD Secret clearance.
• BA/BS in Computer Science, Information Technology, Data Analytics, or a related field.
• 5+ years of related experience.
• Strong experience with extract, transform, load (ETL) processes using Azure services such as Azure Data Factory, Azure Databricks, Azure Data Lake Storage, Azure SQL Database, or Azure Synapse Analytics.
• 4+ years of experience writing code in languages such as SQL, Python, R, SAS, and JavaScript.
• 2+ years of experience on projects involving machine learning, natural language processing, robotic process automation, artificial intelligence, text and/or data mining, and statistical and mathematical methods.
• Experience using SQL to write complex database queries.
• Experience with AI/ML.