Data Engineer Resources to Support LTC Migration to EDP
Data Engineer resource for an 8-month assignment
AWS Data Engineer: the CDO Data & Modelling team is looking for a Data Engineer to join a diverse team dedicated to providing best-in-class data services to our customers, stakeholders, and partners. As part of our CDO organization, you will work with our ML Engineers, Data Scientists, and various business units to define solutions for operationalizing data-driven decision making in a cost-effective and scalable manner.
Bachelor's degree in Computer Science, Software Engineering, MIS, or an equivalent combination of education and experience
Experience implementing and supporting data lakes, data warehouses, and data applications on AWS for large enterprises
Programming experience with Python, Shell scripting and SQL
Solid experience with AWS services such as CloudFormation, S3, Athena, Glue, EMR/Spark, RDS, Redshift, DynamoDB, Lambda, Step Functions, IAM, KMS, SM, etc.
Solid experience implementing solutions on AWS-based data lakes.
Experience in AWS data lake/data warehouse/business analytics
Experience in system analysis, design, development, and implementation of data ingestion pipelines in AWS
Knowledge of ETL/ELT
Experience building end-to-end data solutions (ingestion, storage, integration, processing, access) on AWS
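Purely as an illustration of the ingestion work described above, here is a minimal, hypothetical Python sketch of one extract-transform-load step. The function names (`normalize_record`, `run_batch`) and record shape are illustrative assumptions, not part of any actual EDP codebase; a real pipeline would read from S3/Glue and write to the data lake or Redshift rather than plain dicts.

```python
from datetime import datetime, timezone

def normalize_record(raw: dict) -> dict:
    """Normalize one raw source record into the shape expected downstream.

    Illustrative only: real pipelines would pull from S3/Glue and load into
    Redshift or the lake; here the I/O is stubbed with plain dicts.
    """
    return {
        "id": str(raw["id"]).strip(),
        "amount": round(float(raw.get("amount", 0)), 2),
        "loaded_at": datetime.now(timezone.utc).isoformat(),
    }

def run_batch(raw_records: list[dict]) -> list[dict]:
    # Skip records missing the primary key rather than failing the whole batch.
    return [normalize_record(r) for r in raw_records if "id" in r]
```

Keeping the transform a pure function like this is what makes the "unit testing" expectation below practical: the logic can be verified without standing up any AWS infrastructure.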
Architect and implement CI/CD strategy for EDP
Implement high-velocity streaming solutions using Amazon Kinesis, SQS, and Kafka (preferred)
Migrate data from traditional relational database systems, file systems, and NAS shares to AWS relational databases such as Amazon RDS, Aurora, and Redshift
Migrate data from APIs to AWS data lake (S3) and relational databases such as Amazon RDS, Aurora, and Redshift
Implement POCs on any new technologies or tools to be adopted on EDP and onboard them for real use cases
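As a rough sketch of the streaming responsibility above: Kinesis `PutRecords` accepts at most 500 records per call, so high-velocity producers typically shape and batch events before sending. The helper names below are illustrative assumptions; only the 500-record limit and the boto3 `put_records(Records=..., StreamName=...)` call mentioned afterwards are real Kinesis details.

```python
import json

MAX_RECORDS_PER_CALL = 500  # Kinesis PutRecords accepts at most 500 records per call

def to_kinesis_entries(events: list[dict], key_field: str) -> list[dict]:
    """Shape raw events into PutRecords entries (Data bytes + PartitionKey)."""
    return [
        {"Data": json.dumps(e).encode("utf-8"), "PartitionKey": str(e[key_field])}
        for e in events
    ]

def batch(entries: list, size: int = MAX_RECORDS_PER_CALL):
    """Yield successive batches no larger than the PutRecords limit."""
    for i in range(0, len(entries), size):
        yield entries[i : i + size]
```

With boto3, each yielded batch would then go to `kinesis_client.put_records(Records=b, StreamName="...")`, with failed records (reported per-entry in the response) retried.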
AWS Solutions Architect or AWS Developer Certification preferred
5+ years of experience as a Data Engineer
Experience developing business applications using SQL databases.
Experience working in the cloud (AWS preferred)
Should have good experience with AWS services: S3, Athena, Glue, Lambda, Step Functions, SQS, and Redshift
Knowledge of Snowflake is a plus
Design, build, and maintain efficient, reusable, and reliable architecture and code
Build reliable and robust data ingestion pipelines (within AWS, on-prem to AWS, etc.)
Ensure the best possible performance and quality of high-scale data engineering projects
Participate in the architecture and system design discussions
Independently perform hands-on development and unit testing of applications
Collaborate with the development team to build individual components into complex enterprise web systems
Work in a team environment with product, production operations, QE/QA, and cross-functional teams to deliver projects throughout the whole software development cycle
Identify and resolve performance issues
Keep up to date with new technology developments and implementations
Participate in code reviews to ensure standards and best practices are met