Software Engineer - Data Platform

Job description

For our client, a premier provider of SaaS-based application security solutions that is transforming the way companies secure applications in today's software-driven world, we are looking for a Software Engineer to join the client's Data Platform team. The ideal candidate has experience building and operating cloud-native data infrastructure and applications in an Agile environment. In this role you will collaborate with your team to validate technical requirements and deliver value to the engineering organization. You will be responsible for building and running real-time, resilient, and scalable event-streaming clusters and persistence stores, as well as establishing repeatable patterns for asynchronous data integration across the platform. You will work on an AWS stack with Kafka (Confluent) and Kubernetes, alongside persistence technologies such as MongoDB (Atlas), PostgreSQL (Aurora), a data lake, and Redshift.
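
To illustrate the kind of asynchronous data integration pattern described above, the sketch below shows a minimal event producer using the confluent-kafka Python client. It is only an illustration of the general pattern; the broker address, topic name ("scan-events"), and payload fields are hypothetical placeholders, not details taken from this posting.

    import json
    from confluent_kafka import Producer

    # Connect to a (hypothetical) Confluent-managed Kafka cluster.
    producer = Producer({"bootstrap.servers": "broker.example.com:9092"})

    def on_delivery(err, msg):
        # Called asynchronously once the broker acknowledges (or rejects) the event.
        if err is not None:
            print(f"delivery failed: {err}")
        else:
            print(f"delivered to {msg.topic()} [partition {msg.partition()}]")

    # Hypothetical event payload.
    event = {"application_id": "app-123", "status": "scan_completed"}

    # Publish the event asynchronously; downstream services consume it independently.
    producer.produce("scan-events",
                     key=event["application_id"],
                     value=json.dumps(event).encode("utf-8"),
                     callback=on_delivery)
    producer.flush()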

 

Responsibilities

  • Enable the client's engineering teams with cloud-native, scalable, and highly available shared data infrastructure and tooling, allowing them to focus on business logic and rapid, frequent delivery
  • Work collaboratively with the team to deliver high-quality, test-driven code and scalable, fault-tolerant shared data infrastructure in support of our engineering platform
  • Become a source of domain and technical expertise for the team in one or more areas and actively learn new ones
  • Follow engineering best practices around testing, CI/CD, GitOps, TDD, architectural alignment, and relentless automation
  • Proactively support other team members and help them to be successful
  • Work to make an impact on the whole team and its remit
  • Apply Cloud Native and 12-Factor principles, microservices, Lean principles, DevOps, Test-Driven Development (TDD), Extreme Programming (XP), and observability/monitoring practices in day-to-day work

 

Requirements and qualifications

  • BS/MS/PhD in Computer Science or a related field, or 3 years of relevant industry experience
  • Experience working with AWS cloud products and services
  • Experience with multiple types of database technologies (RDBMS, NoSQL, columnar)
  • Experience running messaging, persistence, and data-processing clusters at scale, leveraging managed services and public cloud infrastructure
  • Coding experience with modern object-oriented languages/frameworks and SQL
  • Knowledge of popular big data technologies such as MongoDB and Kafka
  • Knowledge of containers and container orchestration platforms, preferably Kubernetes

 

Desired skills

  • Experience with MongoDB, Kafka, and their associated ecosystems
  • Experience with producers/consumers, event-driven design patterns, and asynchronous data integration using Kafka or similar technologies (see the consumer sketch after this list)
  • Experience with IaC technologies such as Terraform
  • Experience delivering services using DevOps, GitOps, continuous delivery, and infrastructure-as-code
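
As a companion to the producer sketch above, here is a minimal consumer illustrating the producer/consumer and event-driven patterns listed under desired skills, again using the confluent-kafka Python client. The topic name, consumer group id, broker address, and persistence step are placeholder assumptions, not requirements from this posting.

    import json
    from confluent_kafka import Consumer

    consumer = Consumer({
        "bootstrap.servers": "broker.example.com:9092",
        "group.id": "scan-events-indexer",   # hypothetical consumer group
        "auto.offset.reset": "earliest",
    })
    consumer.subscribe(["scan-events"])

    try:
        while True:
            msg = consumer.poll(1.0)          # wait up to 1s for the next event
            if msg is None:
                continue
            if msg.error():
                print(f"consumer error: {msg.error()}")
                continue
            event = json.loads(msg.value())
            # Persist or index the event here (e.g. into MongoDB or Redshift).
            print(f"processed event for {event.get('application_id')}")
    finally:
        consumer.close()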

 
