Due to our client's rapid expansion, we are looking for a Data Engineer to join their team.
What you will be doing:
Thoughtful problem-solving: For you, problem-solving starts with a clear and accurate understanding of the context. You can decompose tricky problems and work towards a clean solution, by yourself or with teammates. You're comfortable asking for help when you get stuck.
Responsible for building data and analytics engineering solutions using standard end-to-end design and ETL patterns, implementing data pipelines, data modeling, data consumption, and query optimization.
Responsible for enabling access to data in providers such as AWS storage layers, performing transformations in the AWS data warehouse, and transporting data onward to the relevant databases, consumers, data marts, etc.
Demonstrate proficient knowledge of a web framework such as Flask or Django.
Experience with high-volume production environments, including recovery and resilience infrastructure.
Identify downstream implications of data loads and migrations (e.g., data quality, regulatory compliance).
What we are looking for:
Bachelor's degree in Data Science, Computer Science, or a related field.
Minimum of 4 years of relevant experience working with data warehousing, ETL processes, and cloud platforms (AWS).
Strong experience deploying solutions in both on-premises and cloud environments, including Docker containerization.
Familiarity with workflow orchestration tools such as Airflow for efficient workflow management.
Proven track record of delivering high-quality results within shorter-term, outcome-focused service engagements.
Expertise in working with data warehouses and implementing dimensional and other data modeling techniques.
Proficiency in SQL and Python (Pandas, PySpark) for data querying and programming tasks.
Comfortable working with operating systems like Linux and writing shell scripts.
Extensive knowledge of AWS compute, storage, and processing services (S3, Glue, Lambda, Redshift, DynamoDB, Athena, SageMaker).
Familiarity with data streaming technologies such as Kafka.
Experience working with databases like PostgreSQL, MySQL, and MS SQL Server.
Nice to Have:
Experience with ELK (Elasticsearch, Logstash, Kibana) stack for log management and analysis.
What we offer:
Excellent remuneration and benefits package (including equity)
Day-to-day use of cutting-edge technologies
High-level medical insurance
Extended annual leave
Great, supportive team and environment
Being part of a global function and global knowledge-sharing