Job Overview

Location
Lagos, Lagos
Job Type
Full Time
Date Posted
2 years ago

Additional Details

Job ID
40660

Job Description

Summary

  • The Data Engineer will be responsible for designing, building, and maintaining scalable and robust data pipelines, ensuring the efficient and reliable flow of data throughout the organization.


Responsibilities

  • Design, build, and maintain scalable data pipelines to collect, process, and store data from various sources, ensuring efficient and reliable data flow.

  • Develop and implement data integration solutions using ETL (Extract, Transform, Load) tools and techniques.

  • Collaborate with data analysts, data scientists, and other stakeholders to understand and translate their data needs into technical requirements.

  • Optimize and improve existing data pipelines and architectures to ensure high performance, reliability, and maintainability.

  • Implement data validation, cleansing, and error-handling processes to ensure data quality and consistency across the organization.

  • Design and implement data storage solutions, such as databases, data warehouses, and data lakes, ensuring scalability, security, and accessibility.

  • Monitor and maintain the performance and reliability of data pipelines and systems, troubleshooting and resolving any issues that arise.

  • Stay current with industry trends, advancements in big data technologies, and emerging data engineering best practices.

  • Develop and maintain thorough documentation of data pipelines, architectures, and processes, ensuring that knowledge is preserved and accessible to the team.

  • Collaborate with cross-functional teams, providing support and guidance on data engineering best practices, tools, and technologies.


Requirements

  • Bachelor's degree in Computer Science, Engineering, or a related field. A master's degree is a plus.

  • 3-5 years of experience in data engineering, big data, or a related field, with a proven track record of building and maintaining data pipelines and systems.

  • Strong proficiency in programming languages (e.g., Python, Java, Scala) and SQL.

  • In-depth knowledge of database management systems (e.g., SQL Server, MySQL, PostgreSQL) and data warehousing concepts.

  • Familiarity with ETL tools, such as SSIS, Azure Data Factory, or Apache Airflow.

  • Excellent problem-solving skills and the ability to debug and optimize complex data pipelines.

  • Strong understanding of data architecture principles and best practices.

  • Good communication and collaboration skills, with the ability to work effectively with cross-functional teams.

  • A proactive and curious mindset, with a passion for driving data-driven decisions and staying current with industry trends and technologies.

