Job Description
KEY RESPONSIBILITIES:
- Data Pipeline Development
- Data Integration and Management
- Data Quality and Governance
- Collaboration and Support
- Continuous Improvement
REQUIREMENTS/QUALIFICATIONS:
- Bachelor’s degree (First Class graduate) in Computer Science, Information Technology, Engineering, or a related field.
- Master’s degree or relevant certification in data engineering, data science, or big data technologies is a plus.
- 2+ years of experience in data engineering or related roles.
- Proven track record of designing and implementing data pipelines and ETL processes.
TECHNICAL SKILLS:
- Proficiency in programming languages such as Python, Java, or Scala.
- Strong knowledge of SQL and experience with relational databases (e.g., MySQL, PostgreSQL) and NoSQL databases (e.g., MongoDB, Cassandra).
- Experience with data processing frameworks and tools such as Apache Spark, Hadoop, or Kafka.
- Familiarity with cloud platforms and services (e.g., AWS, Azure, Google Cloud) for data storage and processing.
ANALYTICAL SKILLS:
- Excellent problem-solving skills and attention to detail.
- Ability to work with large datasets and perform data analysis to derive insights.
SOFT SKILLS:
- Strong communication and collaboration abilities.
- Ability to work in a fast-paced, dynamic environment and manage multiple tasks effectively.
Note: Only successful applicants will be contacted.