Responsibilities
- Design and implement ETL/ELT job workflows from different source systems into the enterprise data warehouse
- Develop the detailed design after understanding the requirements and high-level design
- Implement best practices and coding standards
- Identify technical risks and come up with mitigation action plans
- Deliver work products/deliverables per agreed milestones
- Display a high degree of adaptability to learn new technologies as needed and successfully manage the daily challenges of a technical environment
Qualifications and Requirements
- A Bachelor's/Master's degree in computer science, computer engineering, or a related discipline is required
- 6+ years of experience working as a Data Engineer
- Experience working with ETL, ELT, and data pipelines
- Experience with AWS is preferred
- Experience with Snowflake is preferred
- Experience with Airflow is preferred
- Experience with Kinesis/Kafka is preferred
- Experience with Python is preferred
- Experience with MySQL databases is preferred
- Strong experience in advanced SQL programming
- Data modelling experience is preferred
Preferred Qualities
- Good analytical and problem-solving skills
- Should be a team player with a winning attitude
- Ability to adopt the latest technologies and implement them successfully in projects
- Ability to collaborate with internal and external stakeholders for effective delivery of assigned projects
Minimum Skills Required
- Data Warehouse, AWS, Snowflake, Python, SQL programming
Additional Skills Preferred
- Any streaming service (Kinesis, Kafka), Talend, Airflow, Docker, Kubernetes, dbt
For any questions on job openings & application details, write to us at:
[email protected]