Company: Nityo Infotech
Location: London
Closing Date: 02/11/2024
Hours: Full Time
Type: Permanent
Job Requirements / Description
This role is fully onsite, based in London, UK.
The experience, skills, and qualifications expected of applicants are listed below.
Key Responsibilities:
Design, develop, and maintain ETL data pipelines using Scala and PySpark.
Work with big data processing frameworks such as Apache Spark to process large datasets efficiently.
Integrate various data sources and databases into the data processing ecosystem.
Collaborate in Agile environments, contributing to sprint planning, code reviews, and continuous integration practices.
Required Skills and Qualifications:
Proficiency in Scala and PySpark for data processing and ETL development.
Strong understanding of Apache Spark and distributed computing frameworks.
Strong understanding of data structures, algorithms, and software engineering best practices.
Excellent problem-solving and analytical skills.