Jobs at Turing – Remote Back-End Engineer Jobs

  • Full Time
  • Pakistan

Turing

A US-based company that is home to the world's largest selection of guitars and musical equipment is looking for a Kafka Engineer. The engineer will play a critical role in modernizing the platform's current infrastructure while building modern, robust, and scalable features. The company operates the world's largest multichannel musical instrument retail services and is on a mission to develop and nurture lifelong musicians and make a difference in the musical world. To date, it has raised more than $30mn in funding.


Job Responsibilities:

  • Facilitate the implementation of Confluent Kafka streaming and improve middleware administration
  • Set up Kafka brokers, Kafka MirrorMaker, and ZooKeeper on hosts in collaboration with the Infrastructure team
  • Design, build, and maintain Kafka topics (a minimal topic-creation sketch follows this list)
  • Contribute to cluster tuning and architecture, applying a strong understanding of Kafka Connect and Linux fundamentals
  • Monitor Kafka health metrics and alerts, and take timely action
  • Implement real-time and batch data ingestion pipelines, employing best practices in data modeling and ETL/ELT operations
  • Participate in technological decisions and work with smart colleagues
  • Review code and implementations, providing useful input to help others develop better solutions
  • Develop documentation on design, architecture, and solutions
  • Assist and coach peers and more junior engineers
  • Build good working relationships at all levels of the organization and across functional teams
  • Assume accountability for the project’s timetables and deliverables
  • Create dataflows and pipelines ranging from simple to complex
  • Support the investigation and resolution of production issues
  • Maintain a high level of system and data security, ensuring that the application’s confidentiality, integrity, and availability are not jeopardized
  • Translate stakeholders’ needs into language suitable for Behavior-Driven Development (BDD) or Test-Driven Development (TDD)
  • Build solutions that are stable, scalable, and easy to use while fitting into the broader data architecture
  • Assist in the formation of Communities of Practice
  • Continuously improve the performance of source code using industry-standard approaches
  • Steer technology direction and options by proffering suggestions based on experience and research
  • Encourage the creation of group norms and procedures
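
To illustrate the topic design and maintenance work referenced in this list, here is a minimal sketch in Java using Kafka's AdminClient. The broker address, topic name, partition count, replication factor, and retention settings are hypothetical placeholders, not details from this posting.

```java
import java.util.List;
import java.util.Map;
import java.util.Properties;
import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AdminClientConfig;
import org.apache.kafka.clients.admin.NewTopic;
import org.apache.kafka.common.config.TopicConfig;

public class CreateOrdersTopic {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        // Hypothetical bootstrap address; replace with the cluster's brokers.
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");

        try (AdminClient admin = AdminClient.create(props)) {
            // Partition and replication counts are illustrative only.
            NewTopic topic = new NewTopic("orders", 6, (short) 3)
                    .configs(Map.of(
                            TopicConfig.RETENTION_MS_CONFIG, "604800000", // 7-day retention
                            TopicConfig.CLEANUP_POLICY_CONFIG, TopicConfig.CLEANUP_POLICY_DELETE));
            // Block until the brokers confirm the topic exists.
            admin.createTopics(List.of(topic)).all().get();
            System.out.println("Topic 'orders' created");
        }
    }
}
```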


Job Requirements:

  • Bachelor’s/Master’s degree in Engineering or Computer Science (or equivalent experience)
  • 7+ years of direct experience with data pipelines and application integrations
  • Experience designing and developing Kafka clusters, producers, and consumers (see the consumer sketch after this list)
  • Proficiency in enabling cloud/hybrid-cloud data streaming through Confluent Kafka, SQS/SNS queuing, etc.
  • Strong container expertise, especially Docker
  • Strong skills with technologies such as Ansible, Puppet, Terraform, OpenShift, Kubernetes, AWS, AWS Lambda, and event streaming
  • Working experience in a public cloud environment as well as with on-premises infrastructure
  • DataDog, Splunk, KSQL, Spark, and PySpark experience is a plus
  • Excellent knowledge of distributed architectures, including Microservices, SOA, RESTful APIs, and data integration architectures
  • Familiarity with any of the following message/file formats: Parquet, Avro, ORC
  • Excellent understanding of AWS Cloud Data Lake technologies, including Kinesis/Kafka, S3, Glue, and Athena
  • Knowledge of RabbitMQ and TIBCO messaging technologies is advantageous
  • Previous experience designing and implementing data models for applications, operations, or analytics
  • A track record of working with information repositories, data modeling, and business analytics tools is a strong plus
  • Familiarity with databases, data lakes, and schemas, with advanced expertise and experience in online transaction processing (OLTP) and online analytical processing (OLAP)
  • Experience in Streaming Service, EMS, MQ, Java, XSD, File Adapter, and ESB-based application design and development
  • Capable of working in a fast-paced team to keep the data and reporting pipeline running smoothly
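
As a rough illustration of the producer/consumer experience listed above, the following minimal Java consumer polls a topic and prints each record. The topic name, consumer group, and broker address are assumptions for demonstration; a real pipeline would deserialize into proper schemas and hand records to an ETL/ELT stage.

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class OrdersConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Hypothetical broker address and group id; replace with real values.
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "orders-etl");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        // Start from the earliest offset when the group has no committed position.
        props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("orders"));
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> record : records) {
                    // A real pipeline would transform and load the record here.
                    System.out.printf("partition=%d offset=%d value=%s%n",
                            record.partition(), record.offset(), record.value());
                }
            }
        }
    }
}
```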
