Our client provides a leading software platform for the consumer goods industry, offering a suite of pricing and trade promotion optimization software: constraint-based predictive optimization, optimization through iterative scenarios, prediction of the “best” price and promotion option, and cannibalization and price-ratio effects.
Its goal is to leverage artificial intelligence to help CPG companies predict the right price in just one click. It has created a precise, affordable, and easy-to-use tool that redefines the future of pricing for CPGs.
It has been ranked by CIO Review among the Top 100 Most Promising BigData Solution Providers and was selected by SAP for its exclusive SAP.io foundry in Berlin.
ABOUT THE ROLE:
We are looking for someone passionate about technology who can help us reach the next level in process automation.
We have an ongoing effort to implement automated data pipelines from our current collection of heterogeneous tools and scripts, and we would love the hands and eyes of someone who can tackle the transformation of our Python and Jupyter scripts into controlled workflows, parallelize and enhance workflow executions, and more. We currently use Prefect, but knowledge of a similar tool would be just as valuable.
In recent years we have deployed a devops layer focused on infrastructure and an mlops layer focused on automating our current processes. A third layer, focused on data quality and control, is yet to be deployed and will build on the two former layers.
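To illustrate the kind of transformation described above — turning sequential script steps into a controlled, parallelizable workflow — here is a minimal, tool-agnostic sketch using only the standard library (an orchestrator like Prefect adds scheduling, retries, and observability on top of the same structure). All function names and data here are illustrative assumptions, not part of the actual codebase:

```python
# Hypothetical sketch: independent "extract" steps run concurrently,
# then a downstream step joins their results -- the same dependency
# shape a Prefect flow would express with @task and @flow.
from concurrent.futures import ThreadPoolExecutor

def load_prices():
    # stand-in for a real data-loading step
    return [("sku-1", 2.49), ("sku-2", 3.99)]

def load_promotions():
    # stand-in for a second, independent data-loading step
    return {"sku-1": 0.10}

def apply_promotions(prices, promos):
    # apply the promotional discount where one exists
    return {sku: round(p * (1 - promos.get(sku, 0.0)), 2)
            for sku, p in prices}

def pipeline():
    # the two loaders have no mutual dependency, so they can run in
    # parallel; the join step waits on both results
    with ThreadPoolExecutor() as pool:
        prices_f = pool.submit(load_prices)
        promos_f = pool.submit(load_promotions)
        return apply_promotions(prices_f.result(), promos_f.result())
```

The point of the migration is exactly this shift: once each script step is an isolated function with explicit inputs and outputs, an orchestrator can parallelize, retry, and monitor it.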
SPECIFIC RESPONSIBILITIES
● We need to improve the modelops layer, and that is where you will find your tasks.
● The expected path for the first three months would be:
1. Start off by learning what we do and how we do it.
2. Learn what is already running.
3. Enhance it, just a bit.
4. Become a pro so you can own the Prefect realm within the company!
● You will not be alone: as a member of the core engineering team, you will have the support of your teammates.
● Take ownership of the workflow automation layer, implemented on Prefect and related services.
● You will become a member of the core engineering team, which handles the R&D and innovation tasks that enhance and improve the current delivery process, with a focus on devops, dataops, and mlops (you).
WORK EXPERIENCE AND KNOWLEDGE REQUIREMENTS:
● Software development.
● Git or equivalent.
● Bitbucket or GitHub.
● Python and the Python ecosystem, including development and testing tools.
● Docker.
● Data pipelines.
● Intermediate English level.
Desirable:
● Prefect, Airflow, or equivalent
● Specific Python libraries: pandas, matplotlib, sklearn
● Jupyter notebooks
● Gitflow and Jira
● Amazon Web Services
● CI/CD (we currently use bitbucket pipelines)
BENEFITS
● Flexible working hours, remote working 80-90% of the time with periodic meetups.
● Great teammates.
● Training and lots of learning.