Our client is looking for a Data Engineer to join a diverse team dedicated to providing best-in-class data services to our customers, stakeholders, and partners. As part of our CDO organization, you will develop software applications supporting data projects in a cost-effective and scalable manner. This is a 6-month contract with the option to convert to permanent if you desire.
Job Responsibilities:
Design, build, and maintain efficient, reusable, and reliable architecture and code.
Ensure the best possible performance and quality of high-scale web applications and services.
Participate in architecture and system design discussions.
Independently perform hands-on development and unit testing of applications.
Collaborate with the development team to build individual components into complex enterprise web systems.
Work in a team environment with product, frontend design, production operations, QE/QA, and cross-functional teams to deliver projects throughout the whole software development cycle.
Identify and resolve any performance issues.
Keep up to date with new technology developments and implementations.
Participate in code reviews to ensure standards and best practices are met.
Qualifications:
Bachelor's degree in Computer Science, Software Engineering, MIS or equivalent combination of education and experience
Experience implementing software applications supporting data lakes, data warehouses, and data applications on AWS for large enterprises
Programming experience with Python, Shell scripting and SQL
Solid experience with AWS services such as CloudFormation, S3, Athena, Glue, EMR/Spark, RDS, Redshift, DataSync, DMS, DynamoDB, Lambda, Step Functions, IAM, KMS, SM, etc.
Solid experience implementing solutions on AWS-based data lakes
Experience in AWS data lake, data warehouse, and business analytics solutions
Experience in system analysis, design, development, and implementation of data ingestion pipelines in AWS
Knowledge of ETL/ELT
Experience building end-to-end data solutions (ingestion, storage, integration, processing, access) on AWS
Architect and implement CI/CD strategy for EDP
Implement high-velocity streaming solutions using Amazon Kinesis, SQS, and Kafka (preferred)
Migrate data from traditional relational database systems, file systems, and NAS shares to AWS relational databases such as Amazon RDS, Aurora, and Redshift
Migrate data from APIs to the AWS data lake (S3) and relational databases such as Amazon RDS, Aurora, and Redshift
AWS Solutions Architect or AWS Developer Certification required
Requirements:
5+ years of experience as a data application developer
Experience developing business applications using NoSQL/SQL databases
Experience working with object stores (S3) and JSON is a must
Good experience with AWS services – Glue, Lambda, Step Functions, SQS, DynamoDB, S3, and Redshift