Prometheus integration for Starlette.
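An integration like this exposes application metrics at a `/metrics` endpoint in Prometheus's text exposition format. A minimal, stdlib-only sketch of that format follows; the metric name and help text are illustrative placeholders, not the integration's real metrics:

```python
# Sketch of Prometheus's text exposition format, which a Starlette
# integration would serve at a /metrics endpoint. The metric name below
# is a hypothetical example, not one defined by the real integration.

def render_counter(name: str, help_text: str, value: float) -> str:
    """Render a single counter metric in Prometheus text format."""
    return (
        f"# HELP {name} {help_text}\n"
        f"# TYPE {name} counter\n"
        f"{name} {value}\n"
    )

metrics_page = render_counter(
    "http_requests_total", "Total HTTP requests received.", 42
)
print(metrics_page)
```

In the real integration a middleware collects these values per request and the endpoint renders all registered metrics at scrape time.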
Health Check is an application that provides an API for checking the health of system components, along with some utilities such as ping requests. It can work standalone or be included in a Django project.
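The core idea is an endpoint that runs a set of registered checks and aggregates their results. A minimal, stdlib-only sketch under that assumption; the check names and the `/health` path are hypothetical, not the application's real API:

```python
import json
from http.server import BaseHTTPRequestHandler

# Hypothetical component checks; real checks would ping a database,
# a cache, an external service, etc.
def check_database() -> bool:
    return True

def check_cache() -> bool:
    return True

CHECKS = {"database": check_database, "cache": check_cache}

def run_health_checks(checks=CHECKS) -> dict:
    """Run every registered check and aggregate the results."""
    results = {name: check() for name, check in checks.items()}
    return {"status": "ok" if all(results.values()) else "fail", "checks": results}

class HealthHandler(BaseHTTPRequestHandler):
    """Serve the aggregated health report as JSON at /health."""

    def do_GET(self):
        if self.path == "/health":
            body = json.dumps(run_health_checks()).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()
```

When mounted in a Django project, the same aggregation would typically be exposed through a Django view instead of a standalone server.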
As part of the product-oriented side of the company, I'm leading, designing and developing projects related to media metadata with a highly skilled team. We are end-to-end responsible for a group of services designed for gathering, enriching and exposing different kinds of media metadata. Those services are built using modern tech stacks and designed using Big Data and Artificial Intelligence techniques. They are backed by a data-driven, polyglot architecture built mainly on Python, Java and JavaScript, with multiple databases both relational and NoSQL, and highly scalable serverless services based on streaming engines such as Kafka and AWS Kinesis.
Responsible for the whole data ecosystem of the company. The first objective of that ecosystem is to provide a reliable and constantly updated source of influencer metadata, enriched with self-generated metadata such as fraud detectors, follower health evaluators and segmented potential exposure. Those enrichers are based on different Machine Learning techniques. Our second goal is to use it as a Business Intelligence platform, improving the sales process itself and integrating feedback from sales campaigns into our data system. The system follows a Kappa architecture using Kafka streams, modern APIs based on REST or GraphQL, and a Data Warehouse that provides high efficiency.
Working in a great team building an ecosystem of services based on Artificial Intelligence, using Python with diverse frameworks such as Django, Celery, Scrapy, TensorFlow and Gensim to solve problems from different areas: image and video processing, including feature and face recognition; and natural language processing for text pattern recognition and fuzzy text matching. Every service is built on top of the latest available AWS infrastructure, containerized using Docker and Mesos/Marathon, and delivered applying continuous integration and continuous deployment principles with GitLab and GoCD pipelines.
Working in the Business Intelligence area. Architecting, designing and developing Big Data solutions to improve the sales process, campaigns and feedback with different Weak Artificial Intelligence techniques, such as Fuzzy Logic Systems and Artificial Neural Networks trained through Machine Learning.
I'm part of a team in charge of monitoring and improving service performance. To improve our applications' performance we need, as a team, broad knowledge of the whole system and constant monitoring, so we research and develop multiple tools to keep the system healthy. This job includes refactoring large parts of the applications along with redesigning some of them, for example adding technologies such as cache layers or applying clustering to databases. As part of the role we develop the tools needed to effectively measure how the services behave; one of the most relevant tasks is to analyze, design and build a proper performance test suite that lets us benchmark our applications.
Research on Algorithmic Music Composition and Music Information Retrieval with Fuzzy Logic systems.
Speaker at Test Academy Madrid with the talk "Machine Learning services validation".
Speaker at PyCon ES 2019 with the talk "High-performant asynchronous APIs".
Speaker at PyCon ES 2018 with the talk "New generation of APIs".
Speaker at PyCon ES 2017 with the talk "Create, package and distribute your own Python application".
Winner of a Machine Learning contest at PyCon ES 2017.
Speaker at OpenSouthCode 2017 with the talk "Flex schemas with PostgreSQL and Django".
Regular speaker at the Málaga Python MeetUp. Most of the talks can be found at https://github.com/perdy/speech/
An interactive introduction, oriented to testers, on how to verify and validate services based on Machine Learning technologies.
A framework based on Python's asyncio module that allows creating APIs in a fast and easy way.
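The building blocks such a framework rests on can be sketched with the stdlib alone: coroutine handlers registered on a router and dispatched asynchronously. All names below (`route`, `dispatch`, the `/hello` path) are hypothetical illustrations, not the framework's actual API:

```python
import asyncio
import json

# Hypothetical minimal router in the spirit of an asyncio-based API
# framework: handlers are coroutines registered against a path.
ROUTES = {}

def route(path: str):
    """Decorator registering a coroutine handler for a path."""
    def decorator(handler):
        ROUTES[path] = handler
        return handler
    return decorator

@route("/hello")
async def hello(request: dict) -> dict:
    return {"message": "Hello, world!"}

async def dispatch(path: str):
    """Look up the handler for a path and await its JSON response."""
    handler = ROUTES.get(path)
    if handler is None:
        return 404, json.dumps({"error": "Not Found"})
    return 200, json.dumps(await handler({"path": path}))

status, body = asyncio.run(dispatch("/hello"))
print(status, body)
```

A real framework adds an HTTP server loop, request parsing and middleware around this same dispatch core.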
How to build a new generation of APIs and frameworks by taking advantage of new tools from the latest Python versions.
A guide for creating an application or library from scratch, configuring it and distributing it through PyPI.
A mixed vision between traditional schemas in relational databases and schemaless NoSQL databases.
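The pattern combines fixed relational columns with a JSON document column that can be queried. The talk targets PostgreSQL (whose JSONB operators Django exposes); this self-contained sketch illustrates the same idea with SQLite's JSON1 functions so it runs anywhere, and the `product` table is a made-up example:

```python
import sqlite3

# "Flexible schema": fixed relational columns (id, name) plus a JSON
# document column (attrs) holding the schemaless part of each row.
# Illustrated with SQLite's JSON1 functions; PostgreSQL's JSONB
# operators play the analogous role in the talk's actual setup.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE product (id INTEGER PRIMARY KEY, name TEXT, attrs TEXT)"
)
conn.execute(
    "INSERT INTO product (name, attrs) VALUES (?, json(?))",
    ("t-shirt", '{"color": "red", "size": "M"}'),
)

# Filter on a field inside the schemaless document.
row = conn.execute(
    "SELECT name FROM product WHERE json_extract(attrs, '$.color') = 'red'"
).fetchone()
print(row)
```

The relational columns keep the guarantees and indexes of a traditional schema, while the JSON column absorbs attributes that vary per row.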
A complete guide on how to use Python's packaging and distribution tools to ease the process of creating a new application or library, packaging it and distributing it through repositories.
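In today's toolchain that workflow centers on a `pyproject.toml` file declaring the build backend and project metadata. A minimal, hypothetical example (the package name and all metadata are placeholders):

```toml
# Hypothetical minimal pyproject.toml for a library destined for PyPI.
[build-system]
requires = ["setuptools>=61"]
build-backend = "setuptools.build_meta"

[project]
name = "my-library"              # placeholder package name
version = "0.1.0"
description = "Example library packaged for PyPI"
requires-python = ">=3.8"
```

With that file in place, `python -m build` produces the distributable artifacts under `dist/` and `twine upload dist/*` publishes them to a repository such as PyPI.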
Introduction to data analysis and representation using Python, Pandas and some tools for data plotting.