Engineering · Full-time · Piedmont, Italy
Profitero unlocks brands' full potential with leading global e-commerce acceleration solutions, empowering 1,000+ product companies and retailers in 50 countries to optimize product availability, maximize conversions, and grow profitably.
Join our Data Acquisition Team, where we collect and process data from major online stores and marketplaces such as Amazon to provide insights that drive sales for large companies and brands. As a developer, your primary focus will be supporting and improving a scalable framework for crawling and parsing large amounts of data from various websites while overcoming bot-protection mechanisms. Your work will also directly impact other teams within the company, as they rely on our data-extraction framework to provide configuration and parsing rules to our clients.
Our company fosters a friendly atmosphere, encourages respect for diverse opinions, values knowledge sharing among strong professionals, and prioritizes teamwork over personal KPIs. Join us if you're passionate about optimizing market analytics and enjoy a collaborative environment.
Stack:
Python
Ruby (nice to have to support existing components)
Docker
MySQL, MongoDB
RabbitMQ
GitLab CI
C (nice to have)
Redis (nice to have)
Airflow and/or PySpark (nice to have)
What to do / responsibilities:
Design and develop a distributed data-crawling solution that processes large amounts of data (18 TB+ collected across our entire platform, 13M new products tracked each quarter).
Implement and support new API services (HTTP & RabbitMQ) in Python.
Support legacy Ruby components.
95% of our activities are back-end tasks.
Application containerisation (Docker).
Participation in architecture design.
Writing unit tests.
Analysis, refactoring, and re-design of existing solutions.
Maintain CI/CD pipelines (with support from Infra & DevOps teams).
We expect:
Strong knowledge of Python with a minimum of 4 years of practical experience.
Willingness to work with Ruby.
Good expertise in SQL and basic knowledge of NoSQL databases.
Understanding of software design patterns. Ability to write simple, idiomatic, and modular code.
Basic knowledge of Linux and shell scripting for working with CI pipelines.
Experience working in an Agile team is a big plus.
English level B1 (Intermediate) and higher.
We provide: