Bank of Georgia is looking to add a DataOps Engineer to join our team. As a member of this team, the DataOps Engineer's primary role will be to work directly with the data and engineering teams to build automated solutions for infrastructure, platform management, and the continuous delivery of large-scale systems, as well as the monitoring, administration, and maintenance of data platforms.


General Responsibilities:


  • Evaluate data importance and manage production data pipelines;

  • Configure application and server monitoring tools for system performance, uptime and daily operations;

  • Identify and implement opportunities for automating existing processes in order to streamline operations and support functions;

  • Design and develop data engineering assets and scalable engineering frameworks to support various business units' data demands and internal data analytics activities;

  • Expand and increase data platform capabilities to resolve new data problems and challenges by identifying, sourcing, and integrating new data;

  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, redesigning infrastructure for greater scalability, etc.;

  • Implement solutions that adhere to architecture best practices;

  • Define and build the data pipelines that will enable faster, better, data-informed decision-making within the business;

  • Migrate solutions from legacy systems to the cloud data platform;

  • Maintain the platform by performing regular tasks such as user management and auditing, resource utilisation tracking, and alert monitoring;

  • Modify existing ETL processes in order to achieve automation where possible and to accommodate changes in the data structure;

  • Have a good understanding of data platform best practices to achieve economies of scale, cost reduction, and efficiencies.




Qualifications:

Work Experience:

2 or more years of experience in a similar role;


Education:

Bachelor's degree in Information Science / Information Technology, Computer Science, or Physics


Knowledge, Skills and Abilities:

  • Knowledge of relational databases and data platforms;

  • Experience with the Kubernetes platform;

  • Experience with one or more IaaS providers is a plus (Azure, AWS, etc.);

  • Experience with the Airflow and MLflow platforms;

  • Experience with Hadoop, Hive, and the big data ecosystem;

  • Experience with Kafka, NiFi and ETL tools;

  • Experience with Python;

  • Experience with the Feast API;

  • Familiarity with Git and continuous integration systems (Jenkins, etc.);

  • An understanding of enterprise systems including Windows, UNIX/Linux systems, IP networking (TCP/IP, HTTP, DNS), and security fundamentals.


How to apply

Interested candidates, please fill in the information, attach your CV, and submit by clicking “apply for position now”. The deadline is 28/03/2025.

Submit your application

* Indicate your first and last name in full, exactly as they appear on your ID card

* Please fill in the application form using a Georgian font