Medior Data Engineer at Data2Bots

Job Overview

Location: Lagos, Jigawa
Job Type: Full Time
Date Posted: 2 years ago

Additional Details

Job ID: 44322
Job Views: 82

Job Description

RESPONSIBILITIES

  • Implement DevOps practices, CI/CD pipelines, and Infrastructure as Code for efficient and automated data workflows; a short sketch follows this list of tools.

    • Preferred: Terraform (1.5 years of experience)

    • GitHub Actions workflows

    • Cloud-native CI/CD pipelines

    • AWS-native pipelines (CodePipeline, CodeBuild, and CodeCommit)
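
To illustrate the Infrastructure-as-Code item above, here is a minimal sketch in Python using the AWS CDK v2 bindings. The posting prefers Terraform (which uses its own HCL syntax); CDK stands in here only so the example can be in Python, and the stack and bucket names are hypothetical.

    # Minimal Infrastructure-as-Code sketch (AWS CDK v2, Python).
    # Deployable with `cdk deploy`, e.g. from a CI/CD pipeline.
    from aws_cdk import App, RemovalPolicy, Stack, aws_s3 as s3
    from constructs import Construct

    class DataPlatformStack(Stack):
        """Declares the storage layer of a hypothetical data platform."""

        def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
            super().__init__(scope, construct_id, **kwargs)
            # Versioned bucket for raw landing data; retained even if the
            # stack is destroyed, so data is never deleted by accident.
            s3.Bucket(
                self,
                "RawDataBucket",
                versioned=True,
                removal_policy=RemovalPolicy.RETAIN,
            )

    app = App()
    DataPlatformStack(app, "DataPlatformStack")
    app.synth()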

  • Work with or build production-grade, end-to-end data platforms.

    • Architect scalable and maintainable cloud infrastructure, with specific emphasis on the data domain.

    • Implement and deploy the architected cloud infrastructure.

    • Follow software development standards and best practices, including Test-Driven Development (TDD), Keep It Simple, Stupid (KISS), You Aren't Gonna Need It (YAGNI), and Don't Repeat Yourself (DRY).
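
As a tiny illustration of the Test-Driven Development item above: the test is written first, then the simplest implementation that makes it pass (which is also the spirit of KISS and YAGNI). The function and data are invented for the example, and the file is runnable with pytest.

    # The test comes first (the "red" step in TDD).
    def test_drop_duplicate_rows_keeps_first_occurrence():
        rows = [{"id": 1}, {"id": 1}, {"id": 2}]
        assert drop_duplicate_rows(rows, key="id") == [{"id": 1}, {"id": 2}]

    # The minimal implementation that makes the test pass (the "green" step);
    # no configuration options we don't need yet (YAGNI).
    def drop_duplicate_rows(rows, key):
        seen, result = set(), []
        for row in rows:
            if row[key] not in seen:
                seen.add(row[key])
                result.append(row)
        return result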

  • Apply data warehousing concepts and data warehouse modelling techniques to design and optimise data solutions.
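
For the modelling item above, a minimal star-schema sketch: one additive fact table surrounded by dimension tables, joined on surrogate keys. SQLite stands in for the actual warehouse engine, and all table and column names are hypothetical.

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE dim_date (
            date_key   INTEGER PRIMARY KEY,   -- surrogate key, e.g. 20240131
            full_date  TEXT NOT NULL,
            month      INTEGER NOT NULL,
            year       INTEGER NOT NULL
        );
        CREATE TABLE dim_customer (
            customer_key INTEGER PRIMARY KEY, -- surrogate key
            customer_id  TEXT NOT NULL,       -- natural/business key
            segment      TEXT
        );
        CREATE TABLE fact_sales (
            date_key     INTEGER NOT NULL REFERENCES dim_date (date_key),
            customer_key INTEGER NOT NULL REFERENCES dim_customer (customer_key),
            quantity     INTEGER NOT NULL,
            amount       REAL NOT NULL        -- additive measure
        );
    """)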

  • Work with cloud data warehouse solutions such as Redshift, BigQuery, or Snowflake to build scalable and performant data solutions.

  • Utilise data flow orchestration tools such as Apache Airflow to schedule and manage data workflows.
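
A minimal Apache Airflow sketch for the orchestration item above: a daily DAG with two dependent tasks. It assumes a recent Airflow 2.x release (2.4+ for the `schedule` argument), and the DAG id and task bodies are placeholders.

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract():
        print("extracting...")  # placeholder for the real extract step

    def load():
        print("loading...")  # placeholder for the real load step

    with DAG(
        dag_id="daily_sales_pipeline",  # hypothetical pipeline name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        extract_task = PythonOperator(task_id="extract", python_callable=extract)
        load_task = PythonOperator(task_id="load", python_callable=load)
        extract_task >> load_task  # load runs only after extract succeeds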

  • Leverage distributed computing and container orchestration technologies like Kubernetes for efficient and scalable data processing.

  • Explore and implement data streaming solutions like Kafka for real-time data processing.
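
For the streaming item above, a minimal producer sketch. It assumes the kafka-python client package (one of several Kafka clients for Python), a broker on localhost:9092, and an existing "orders" topic; all of these are illustrative choices, not details from the posting.

    import json

    from kafka import KafkaProducer  # pip install kafka-python

    producer = KafkaProducer(
        bootstrap_servers="localhost:9092",
        # Serialise dicts to JSON bytes so consumers can parse them back.
        value_serializer=lambda v: json.dumps(v).encode("utf-8"),
    )
    producer.send("orders", value={"order_id": 42, "amount": 9.99})
    producer.flush()  # block until the broker has acknowledged the message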

  • Design and develop microservices and event-driven architecture for efficient data integration and processing.

  • Core knowledge of data engineering frameworks such as Spark, Kafka, and Airflow is a plus.

  • Attention to detail

  • Leadership skills


REQUIREMENTS AND SKILLS

  • Bachelor's degree in computer science, an engineering discipline, or a related field.

  • Minimum of 4 years of professional IT industry experience.

  • Minimum of 4 years of software engineering experience.

  • Minimum of 2 years of cloud computing experience, preferably with AWS.

  • Proficiency in ETL development using Python and SQL.
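
As a small, self-contained illustration of the Python-and-SQL ETL requirement above; SQLite stands in for the target database, and the input rows are invented.

    import sqlite3

    # Extract: in practice this would read from an API, file, or source system.
    raw_rows = [("alice", "2024-01-01", "19.90"), ("bob", "2024-01-02", "5.00")]

    # Transform: cast types and normalise names in Python.
    clean_rows = [(name.title(), day, float(amount)) for name, day, amount in raw_rows]

    # Load: write into the target table with SQL.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (customer TEXT, order_date TEXT, amount REAL)")
    conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", clean_rows)
    conn.commit()
    print(conn.execute("SELECT SUM(amount) FROM orders").fetchone())  # (24.9,)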

  • Minimum of 1.5 years of experience with DevOps, CI/CD, and Infrastructure as Code; Terraform preferred.

  • Good understanding of software development standards and best practices.

  • Knowledge of data warehousing concepts and data warehouse modelling.

  • Experience working with at least one cloud data warehouse solution such as Redshift, BigQuery, or Snowflake.

  • Experience with data flow orchestration tools such as Apache Airflow is a plus.

  • Experience with distributed computing and container orchestration (Kubernetes) is a plus.

  • Experience with data streaming solutions such as Kafka is a plus.

  • Experience with microservices and event-driven architecture is a plus.


SOFT SKILLS

  • Excellent understanding of Agile Methodology and experience with Scrum rituals.

  • Ability to work independently, think proactively, and pay attention to detail while providing technical leadership to a team.

  • Demonstrated experience leading data engineering projects from conception to finished product, and the ability to drive technical decisions.

  • Adaptability to a fast-paced technical environment, with a strong sense of urgency and ability to meet tight deadlines.

  • Energetic, motivated, and a team player, with excellent communication skills in English (both written and spoken).

  • Ability to effectively communicate and collaborate with cross-functional teams and business stakeholders.

