Raiffeisen Bank International AG


DevOps Engineer Big Data Cloud (f/m/x)


International business requires an international corporate philosophy. Are you open to new ideas and do you value cultural diversity? At Raiffeisen Bank International, we are pleased to have more than 16 million customers in 13 CEE countries. And our journey continues – with exciting new issues for us to tackle such as digitalisation and changing customer needs. Join us on our journey.


As a Cloud DevOps Engineer you are the bridge between the latest scalable big data technologies and the data consumers who want to orchestrate their data with them. You will be in charge of building a next-level self-service data lake that serves 50,000 data consumers who need to handle their data autonomously. Experimenting with new technologies is a key aspect of further innovating the data handling process.

What you can expect:
  • Innovate and implement the cloud Data Lake platform infrastructure as the basis for data engineers to work with their data
  • Optimize the existing infrastructure and data transformation pipelines for greater scalability and self-service
  • Optimize the data provisioning workflow to allow real-time replication of data from on-premises systems into the Data Lake (AWS)
  • Implement new features such as provider or consumer connections as Infrastructure as Code (e.g. a Power BI connection to AWS); see the sketch after this list
  • Implement cost optimizations as part of the platform
  • Implement test cases as part of development process
  • Support solution architects to define the software architecture
  • Take end-to-end responsibility for changes throughout the delivery pipeline
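
The role names Terraform for Infrastructure as Code; purely as an illustration, and to keep the examples here in a single language, the sketch below declares one piece of data-lake infrastructure with the AWS CDK in Python instead. All resource and stack names are assumptions.

    from aws_cdk import App, Stack, RemovalPolicy
    from aws_cdk import aws_s3 as s3
    from constructs import Construct

    class DataLakeStack(Stack):
        """Hypothetical stack declaring a raw-zone bucket for the data lake."""

        def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
            super().__init__(scope, construct_id, **kwargs)

            # Versioned, encrypted bucket serving as the landing zone for raw data
            s3.Bucket(
                self,
                "RawZoneBucket",                      # assumed logical ID
                versioned=True,
                encryption=s3.BucketEncryption.S3_MANAGED,
                removal_policy=RemovalPolicy.RETAIN,  # keep data if the stack is deleted
            )

    app = App()
    DataLakeStack(app, "DataLakeStack")  # assumed stack name
    app.synth()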

What you bring to the table:
  • 5+ years professional experience in developing and operating software solutions
  • 2+ years professional experience in building scalable big data solutions in the area of cloud or Hadoop
  • 2+ years' experience with AWS and hands-on experience building Infrastructure as Code and services on top of it
  • Demonstrated ability to build processes that support data transformation, data structures, metadata, dependency and workload management
  • Professional experience in designing and developing data pipelines in Python/Spark (see the sketch after this list)
  • Practical experience with Terraform, Airflow, Databricks and Delta Lake is preferred
  • Proactivity, curiosity, responsibility, ideas and confidence
  • Structured working approach and problem-solving skills
  • Fluent English; German or another CEE language is appreciated, but not mandatory
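
For illustration only, the following minimal sketch shows the kind of Python/Spark pipeline step referred to above, assuming a simple raw-to-curated batch job; all paths and column names are hypothetical.

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    # Hypothetical batch step: deduplicate raw events and write a curated,
    # partitioned copy. Paths and column names are assumptions for this sketch.
    spark = SparkSession.builder.appName("raw-to-curated").getOrCreate()

    raw = spark.read.json("s3://example-raw-zone/events/")  # assumed landing path

    curated = (
        raw.dropDuplicates(["event_id"])                     # assumed business key
           .filter(F.col("event_type").isNotNull())
           .withColumn("ingested_at", F.current_timestamp())
    )

    curated.write.mode("overwrite").partitionBy("event_type").parquet(
        "s3://example-curated-zone/events/"                  # assumed curated zone path
    )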

What we offer:
  • You’ll work in an international team at a leading bank
  • You’ll benefit from flexible working arrangements and determine your own work-life balance
  • You’ll benefit from the very latest in tailored professional development
  • You’ll earn an appropriate salary starting at gross EUR 60,000 p.a. including overtime

RBI AG is committed to creating a diverse environment and is proud to be an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to age, ethnicity, race or color, national origin, religion, political or other opinion, gender, sexual orientation or disability.
 
We are looking forward to receiving your online application!
https://jobs.rbinternational.com