Job description

Senior DevOps Engineer

Our client, an AI-based fintech company that specialises in smart data analytics for insurers and credit providers, is currently building a new IT & Data hub in Prague, and we are looking for experienced DevOps Engineers to join their team.

Role Description:

Our DevOps Engineers support application development for AWS/Azure Cloud products and
administer automated CI/CD systems and tools for development and test teams.

The DevOps Engineer creates infrastructure in an automated way (IaC) while working with
development teams to host the applications being developed. S/he creates and maintains
Kubernetes clusters through their life cycle on both the AWS and Azure platforms, builds test
automation and pipeline frameworks, and supports the compliance and security integrity of the environment.

The DevOps team helps manage our data science environment, collaborating with the data
science team to scale up experiments and manage the productionising and deployment of ML models.

Required Skills:
• 5+ years’ experience applying relevant technical skills within a cloud environment, or on
   a cloud-based product, including Unix/Windows administration.
• A proven record for delivering high-quality, large-scale solutions
• Working knowledge of and hands-on experience with:
   • Container technologies: Docker and Kubernetes; container registries (ECR/ACR)
   • Monitoring technologies: ELK, Splunk, etc.
   • Setting up CI/CD processes in tools such as Jenkins, TeamCity, and Octopus
   • ArgoCD and Helm charts
   • Infrastructure and provisioning as code: Terraform, Pulumi, Ansible
   • Git, JIRA, Confluence
• Solid hands-on experience with Unix/Linux command line including shell-scripting.
• Hands-on proficiency, as well as the capability to architect and design solutions
from scratch
• Strong problem-solving, analytical, and quantitative skills
• A professional attitude and service orientation with the ability to work with our
international teams based out of the EU, India, and the UK

Desired Skills:
• A university degree from a reputable university with a record of academic excellence
• Working knowledge of:
   • Large-scale/Big Data technologies such as Hadoop, Spark, Hive, Impala,
     PrestoDB, and Kafka
   • Workflow orchestration tools such as Apache Airflow
• Knowledge of one of the following programming languages is a plus: C#, Python