ASAP: Kafka Engineer (f/m) - min. 6 Months
Basel
Temporary
- Hands-on Kafka Engineer role at an international monetary institution
- Interact and contribute within a dynamic environment
About Our Client
A financial institution based in Basel.
Job Description
- You will join the Data and Analytics team, which is responsible for developing, maintaining, and supporting the data and analytics platforms and systems.
- You will help tackle a range of complex software and data challenges, including big data, data warehousing, advanced analytics, business intelligence and data governance.
- You will help to implement, maintain and support shared data platforms and bespoke analytical systems, using cutting-edge technologies and modern software development practices.
- Designing, implementing and maintaining an enterprise installation of Kafka
- Participating in the creation and execution of data governance, including message design, schema validation, and versioning
- Working with developers from other business units' technical teams to help them implement functional solutions using Kafka
- Supporting the establishment of a platform SLA, including defining non-functional requirements
- Working with the network, data center, and infrastructure teams to optimize hardware solutions for the Kafka installation
- You will collaborate with a wide range of stakeholders: expert economists, technologists, data scientists and statisticians, as well as counterparts in other international organizations and central banks.
- Help to modernize the data management infrastructure by building a new, modern data warehouse and data lake architecture, offering an innovative data lab environment for self-service analytics, and enhancing capabilities to support machine learning.
The Successful Applicant
- Bachelor's degree or above (ideally, IT-related or technical)
- Deep experience with software development or data engineering
- Profound experience in designing, implementing and maintaining Apache Kafka as part of the Cloudera Data Platform distribution
- Data pipeline and workflow management tools: Airflow, RunDeck, NiFi, etc.
- Stream-processing systems: Kafka, Spark Streaming, etc.
- Experience in designing and developing high-volume, mission-critical transactional data integration solutions
- Knowledge of message queuing, stream processing and highly scalable 'big data' data stores
- Building processes supporting data transformation, data structures, metadata, dependency and workload management
- Proven experience of designing middleware message governance, topic subscription and management, including schema validation and version compatibility
- Agile methodologies like Scrum
- Knowledge of service-oriented architecture and experience with API creation and management technologies (REST, SOAP, etc.)
- Strong problem-solving skills, initiative and the ability to work in a team
- Strong written and oral English communication skills; other languages are an advantage
What's on Offer
If you want to be part of a global financial institution and contribute through your hands-on Kafka engineering skills and flexible mindset, then we look forward to receiving your CV.

PageGroup is an equal opportunity employer committed to workforce diversity, both as an employer and as a recruitment service provider. Each recruitment decision we make, for the people we hire and the people we place into new roles, is based solely on the candidates' knowledge, experience and skills.