TECH STACK
This is how we work every day.
Code quality
We know that we cannot succeed without high-quality code, so we spend a significant amount of time crafting components and services that are efficient, clean, readable and as reusable as possible.
Tool choice
We love exploring new technologies, and we are free to choose the best tool or language for a specific task, without artificial constraints or bias toward any particular technology.
Testing
No user-facing feature goes to production if it's not tested.
We think automated testing is the best way to ensure the quality and maintainability of our code, and to keep our developers, stakeholders and users happy :)
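To make that concrete, here is the kind of automated test we mean, as a minimal pytest sketch (the slugify helper is a hypothetical example, not part of our codebase):

    import re

    def slugify(text: str) -> str:
        # Hypothetical helper: lower-case the text and hyphenate everything else.
        return re.sub(r"[^a-z0-9]+", "-", text.lower()).strip("-")

    def test_slugify_normalizes_case_and_spaces():
        assert slugify("Hello World") == "hello-world"

    def test_slugify_strips_punctuation():
        assert slugify("Ciao, Trento!") == "ciao-trento"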
Security
Many clients, including banks, insurance companies and large corporations, trust us with their data, so we take security very seriously. We adopt industry best practices to protect our data, our infrastructure and our assets.
OUR TOOLS
Technically speaking, we mainly use Python, Go and Scala on the back-end, and TypeScript and React on the front-end.
Our infrastructure runs on AWS and is managed and tested as code with Ansible and Terraform. We use GitLab to store all our code and to run CI/CD pipelines. We rely on Prometheus, Grafana, Logstash, Kibana and Sentry to monitor our applications. We deploy our components mostly to our Kubernetes cluster. We store our data in PostgreSQL databases, mirror it in several Elasticsearch clusters and cache it in Redis instances.
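For example, a read path over this storage trio often follows the classic cache-aside pattern, sketched here in Python (the connection details and the companies table are placeholders, not our actual schema):

    import json
    import psycopg2
    import redis

    cache = redis.Redis(host="localhost", port=6379)  # placeholder Redis instance
    db = psycopg2.connect("dbname=app user=app")      # placeholder PostgreSQL DSN

    def get_company(company_id: int) -> dict:
        # Cache-aside: try Redis first, fall back to PostgreSQL, then warm the cache.
        key = f"company:{company_id}"
        cached = cache.get(key)
        if cached is not None:
            return json.loads(cached)
        with db.cursor() as cur:
            cur.execute("SELECT id, name FROM companies WHERE id = %s", (company_id,))
            row = cur.fetchone()
        if row is None:
            raise KeyError(company_id)
        company = {"id": row[0], "name": row[1]}
        cache.set(key, json.dumps(company), ex=300)   # expire after five minutes
        return company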
Data is processed through Kafka queues, with Celery as our asynchronous task queue. We use Spark and MapReduce jobs to clean, match and link data from several sources and providers (sketched below). Django is our main web framework, and for static pages we use Hugo.
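A matching job typically looks roughly like this PySpark sketch (the S3 paths and the vat_number join key are hypothetical):

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("link-company-sources").getOrCreate()

    # Load two provider feeds (paths are placeholders).
    registry = spark.read.json("s3://bucket/registry/")
    crawl = spark.read.json("s3://bucket/crawl/")

    def normalize(col_name):
        # Clean step: lower-case and trim the join key so the two feeds line up.
        return F.lower(F.trim(F.col(col_name)))

    registry = registry.withColumn("vat_number", normalize("vat_number")) \
                       .dropDuplicates(["vat_number"])
    crawl = crawl.withColumn("vat_number", normalize("vat_number"))

    # Match and link records from the two sources on the shared key.
    linked = registry.join(crawl, on="vat_number", how="left")
    linked.write.mode("overwrite").parquet("s3://bucket/linked/")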
The data science team relies heavily on Jupyter for experimenting with data and algorithms, uses tools like scikit-learn and TensorFlow to build machine learning models, and manages the model lifecycle with Kubeflow. We have our own engine, dandelion.eu, for semantic text extraction, but we also leverage other NLP tools like spaCy and BERT, as well as the latest LLM technology, for text analysis tasks.
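For instance, named-entity extraction with spaCy takes only a few lines (a sketch assuming the small English model has been installed with python -m spacy download en_core_web_sm):

    import spacy

    # Load the small English pipeline (assumed to be installed).
    nlp = spacy.load("en_core_web_sm")

    doc = nlp("The new office in Trento opened in 2015.")
    for ent in doc.ents:
        print(ent.text, ent.label_)  # e.g. "Trento" GPE, "2015" DATE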