We are looking for the most ambitious and curious engineers in the field. You have at least two years of work experience and a passion for building state-of-the-art computing infrastructure. You will join our Data Engineering Team. Because we believe in great teamwork, you should be eager to learn and bring an energetic, creative approach to your work. If you are convinced that a fast-paced, high-growth working environment fits your skills, we are looking for someone like you.
- Sentiance aims to provide a resilient platform that serves as an insights engine on top of sensor data
- Bring a DevOps mentality
- Pay attention to detail
- Know the Reactive Manifesto by heart
- Uphold best practices in engineering, security, and design
- Enjoy working with a diverse group of people with different technical backgrounds
Tasks & Responsibilities
- Integrate complex human behaviour models into a scalable production platform.
- Develop software components and data processing pipelines.
- Carry out performance and scalability engineering.
At Sentiance, people come to have an impact and to learn. You'll be part of an international team brought together by a culture of technical excellence, grit, and integrity. You'll find our compensation and rewards competitive, and of course we have all the start-up essentials. Better yet, expect an agile, flat structure, dynamic growth opportunities, flexibility, and an openness to the curious.
- You have an academic degree (BSc or MSc) in computer science or related field.
- Experience programming in Java and Python.
- Practical understanding of software engineering best practices.
- Know your way around the Linux operating system.
- Work experience with Docker containers.
- Work experience with distributed computation frameworks (Kafka, Spark, …) and NoSQL/SQL databases.
- You are fluent in English. Dutch is a plus.
- You can work independently and take matters into your own hands.
- The ability to quickly learn new technologies and successfully implement them is essential.
- Proven experience scaling to terabyte-size datasets and managing pipelines to process them.
- Notions of and an affinity for machine learning and data mining.
- Experience in statistical modelling.
- Experience programming in Go.
- Notions of functional programming.