We are seeking a talented Senior Data Engineer to join our team and contribute to the development and implementation of robust data solutions. In this role, you will modernize data infrastructure by migrating customer systems from legacy on-premises solutions to cloud-based architectures on GCP, AWS, and Azure. You will collaborate closely with cross-functional teams to design, build, and maintain scalable data pipelines and storage solutions.
As a Senior Data Engineer, you will:
Implement data lake, data warehousing, ETL, streaming, and data analytics solutions across GCP, AWS, and Azure platforms.
Migrate data and processes from legacy on-premises systems (e.g., SQL Server and other relational databases) to cloud-based solutions.
Design and develop efficient, scalable data pipelines to ingest, process, and transform data from various sources.
Optimize data storage and retrieval systems for performance and cost-efficiency.
Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver appropriate solutions.
Implement data quality checks and monitoring systems to ensure data integrity and reliability.
Contribute to the development of best practices and standards for data engineering within the organization.
Participate in code reviews and knowledge sharing sessions with the team.
Implement and maintain CI/CD pipelines for data engineering projects.
Use Infrastructure as Code (IaC) practices to manage and version cloud resources.
Qualifications
Minimum Requirements
Bachelor’s degree in a related discipline and 4 years’ experience in a related field OR a Master’s degree and 2 years’ experience OR a Ph.D. and 1 year of experience OR 16 years’ experience in a related field.
3 years of hands-on experience designing and implementing cloud data solutions on at least one of the GCP, AWS, or Azure platforms.
Experience in two programming languages (e.g., Node.js, Go, or Python), with working knowledge of additional languages. Experience transforming legacy code (e.g., Java, .NET) into cloud-native microservices.
Experience building and maintaining CI/CD pipelines and utilizing cloud automation tools for efficient software deployment.
Experience with Python or SQL.
Experience with Terraform for Infrastructure as Code (IaC).
Preferred Qualifications
Experience with machine learning and AI services (e.g., Google Cloud AI Platform, Amazon SageMaker, Azure Machine Learning).
Knowledge of data modeling, data warehousing concepts, and dimensional modeling.
Professional certifications from GCP, AWS, and/or Azure.
Knowledge of containerization technologies (e.g., Docker, Kubernetes).
Compensation:
Compensation includes a base salary of $119,600.00 - $199,400.00. The base salary may vary within the anticipated base pay range based on factors such as the ultimate location of the position and the selected candidate’s knowledge, skills, and abilities. Position may be eligible for additional compensation that may include an incentive program.
Benefits:
The Company offers eligible employees the flexibility to take as much paid vacation as they deem consistent with their duties, the company’s needs, and its obligations; seven paid holidays throughout the calendar year; and up to 160 hours of paid wellness time annually for their own wellness or that of family members. Employees are also eligible for additional paid time off in the form of bereavement leave, time off to vote, jury duty leave, volunteer time off, military leave, and parental leave.