Roger
From Peru (UTC-5)
11 years of commercial experience
Lemon.io stats
Roger – AWS, Apache Airflow, Microsoft Azure
Roger is a highly experienced Data Engineer with a decade of hands-on involvement in data-centric domains. With prior managerial roles, he adeptly balances leadership duties while actively contributing to team efforts. Roger thrives on confronting challenges directly, consistently delivering optimal solutions that merge technical excellence with business acumen. His seamless integration of both soft and hard skills renders him an invaluable asset to any project.
Main technologies
Additional skills
Ready to start
To be verified
Direct hire
Potentially possible
Experience Highlights
Senior MLOps Engineer
Playing a critical role in bridging the gap between machine learning model development and deployment in a pharmaceutical setting. This role focuses on operationalizing AI/ML solutions to streamline drug discovery, optimize clinical trials, enhance supply chain efficiency, and improve patient outcomes. Roger collaborates with data scientists, data engineers, and IT teams to ensure scalable, secure, and reliable machine learning systems.
Key Accomplishments:
- delivered scalable machine learning pipelines that streamlined drug discovery, clinical trial optimization, and supply chain efficiency;
- ensured ML models met industry standards (FDA, HIPAA) for data privacy, security, and compliance;
- implemented CI/CD frameworks for seamless model deployment, versioning, and monitoring, reducing downtime and increasing reliability;
- integrated AI solutions across pharmaceutical workflows by aligning data science, engineering, and business teams;
- developed ETL pipelines to handle diverse pharmaceutical data (e.g., EHR, genomics), improving data quality and accessibility.
Key Responsibilities:
- automated deployment, monitored model performance, and built retraining pipelines to maintain accuracy and compliance;
- designed scalable ETL processes to ingest, clean, and transform complex healthcare and pharmaceutical data;
- established robust MLOps frameworks using tools like MLflow, Kubernetes, and cloud platforms for scalability and reliability;
- documented workflows and maintained audit-ready systems to meet regulatory requirements in the pharmaceutical industry;
- enhanced ML pipelines for high-throughput data processing and real-time analytics in drug discovery and supply chain operations.
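The monitoring-and-retraining responsibility above follows a common pattern: track a live quality metric and trigger retraining when it degrades. A minimal sketch in pure Python (thresholds and function names are illustrative, not from Roger's actual systems, which used MLflow and Kubernetes):

```python
# Minimal sketch of a model-monitoring check that flags retraining
# when live accuracy drops below a threshold. The threshold and the
# sample data are hypothetical stand-ins for a real deployment.

ACCURACY_THRESHOLD = 0.90  # hypothetical compliance threshold

def evaluate(predictions, labels):
    """Fraction of predictions that match the ground-truth labels."""
    correct = sum(p == y for p, y in zip(predictions, labels))
    return correct / len(labels)

def needs_retraining(predictions, labels, threshold=ACCURACY_THRESHOLD):
    """Return True when live accuracy falls below the threshold."""
    return evaluate(predictions, labels) < threshold

# Example: 8 of 10 recent predictions are correct -> 0.8 < 0.9 -> retrain.
preds  = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1]
labels = [1, 0, 1, 1, 1, 1, 0, 0, 0, 1]
print(needs_retraining(preds, labels))  # True
```

In a regulated setting, the retraining trigger would also write an audit record, which is what the documentation and audit-readiness bullets above refer to.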
AI Architect
Consultancy: Designed and developed Proofs of Concept (POCs) for MLOps and LLMOps leveraging Databricks API, Unity Catalog, Vector Search, and the OpenAI API. These POCs showcased scalable solutions for managing machine learning lifecycle operations and large language model workflows, integrating advanced data governance, efficient search capabilities, and cutting-edge AI tools to drive innovation.
- implemented robust data management solutions using Databricks Unity Catalog, ensuring secure and compliant handling of machine learning and large language model workflows;
- streamlined the end-to-end machine learning lifecycle with MLOps frameworks, reducing deployment time by [X%] and improving model monitoring and retraining efficiency;
- delivered a scalable vector search solution for efficient retrieval of embeddings, enabling faster and more accurate AI-driven insights;
- integrated OpenAI API to build and operationalize large language models, providing real-time AI solutions for complex business needs;
- developed POCs that bridged the gap between data engineering, machine learning, and natural language processing;
- delivered innovative prototypes that drove stakeholder buy-in for adopting cutting-edge AI tools and practices, setting the foundation for enterprise-wide adoption;
- demonstrated the ability to handle large-scale data and model operations with Databricks;
- enhanced cross-functional collaboration by establishing shared governance models and unified workflows, reducing silos and boosting productivity.
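The vector-search POC described above boils down to ranking stored embeddings by similarity to a query embedding. A pure-Python sketch of that core idea (toy data and names; the real POC used Databricks Vector Search rather than this hand-rolled loop):

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def top_k(query, index, k=2):
    """Return the k document ids whose embeddings are closest to the query."""
    ranked = sorted(index.items(), key=lambda kv: cosine(query, kv[1]), reverse=True)
    return [doc_id for doc_id, _ in ranked[:k]]

# Toy 3-dimensional "embeddings" keyed by document id.
index = {
    "doc_a": [1.0, 0.0, 0.0],
    "doc_b": [0.9, 0.1, 0.0],
    "doc_c": [0.0, 1.0, 0.0],
}
print(top_k([1.0, 0.0, 0.0], index))  # ['doc_a', 'doc_b']
```

Production systems replace the exhaustive scan with an approximate nearest-neighbor index, but the retrieval contract is the same.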
Senior Data Engineer
This global retailer specializes in high-performance athletic apparel and accessories designed to enhance athletic performance and support an active lifestyle. Their target audience includes fitness enthusiasts, athletes, and individuals who prioritize wellness and active living.
The main challenge was an internal project: extending retail operations from North America and Europe into the new Asia-Pacific markets, including inventory optimization, store management, and commercial planning based on data analytics and forecasting.
Roger successfully carried out the following:
- developed data pipelines for Lululemon projects in APAC markets;
- created the new data models;
- developed ETL pipelines;
- handled Airflow orchestration for the new DAGs;
- coordinated and collaborated with product owners and project managers for deployment;
- led a group of 3 data engineers located in Bangalore, India.
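The ETL work listed above follows the classic extract-transform-load shape. A minimal, self-contained sketch (the record layout and in-memory "warehouse" are hypothetical stand-ins; the real pipelines ran as Airflow DAGs):

```python
# Minimal extract-transform-load sketch with stubbed source and sink.

def extract():
    """Pull raw rows from a source system (stubbed as a literal list)."""
    return [
        {"sku": "hoodie-01", "units": "3", "region": "apac"},
        {"sku": "tights-02", "units": "5", "region": "apac"},
    ]

def transform(rows):
    """Normalize types and formats before loading."""
    return [
        {"sku": r["sku"], "units": int(r["units"]), "region": r["region"].upper()}
        for r in rows
    ]

def load(rows, warehouse):
    """Append cleaned rows to the target table (a list standing in for a DB)."""
    warehouse.extend(rows)
    return len(rows)

warehouse = []
loaded = load(transform(extract()), warehouse)
print(loaded, warehouse[0]["region"])  # 2 APAC
```

In an orchestrator such as Airflow, each of these three functions would become a task, with the DAG encoding the extract → transform → load dependency.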
Data Architect
This client was a freight broker company specializing in logistics solutions such as over-the-road transport, intermodal, drayage, and hazmat shipping. They used advanced software for efficient and secure transportation management. Their target audience includes businesses seeking customized logistics solutions across the United States.
The challenge was to migrate the data lake to the cloud and integrate all the data sources used for commercial and operational purposes.
Among others, Roger achieved the following:
- led the data lake team, composed of ETL and backend developers;
- designed and configured the data architecture in Azure Data Lake;
- developed ETL pipelines for the data lake in Azure Data Factory and Azure Databricks;
- guided the business data users and stakeholders of this project.
Data Engineer
The client was a streaming platform for visual entertainment content, offering a vast library of movies, TV shows, and original programming, catering to a diverse audience with various genres and formats. The project's main idea consisted of developing and maintaining the data pipelines for the marketing intelligence of the company.
Roger's successful contributions include:
- developed data pipelines and Apache Airflow DAGs;
- took care of the data modeling for new data sources using Databricks and Snowflake;
- monitored workloads using Python and SQL.
Data Engineer
This client was an investment management firm, specializing in providing a wide range of financial services and investment products. They leverage a performance-driven approach, grounded in extensive research and active management.
The project was for the Marketing Analytics Department and consisted of developing data marts that supported analytical decision-making around the company's marketing campaigns.
Roger's contributions included, but were not limited to:
- developed data pipelines for the marketing and sales data marts;
- managed unstructured-data vendors;
- performed data modeling for the new investments and insights data marts.
Data Engineer
The client was a visual discovery and bookmarking social platform, specializing in enabling users to find and save creative ideas. The platform leverages a user-friendly interface and personalized recommendations to enhance the discovery experience.
Roger completed the following tasks:
- developed workloads for migrating Hive tables to Spark;
- developed data pipelines for the platform data models;
- led the LatAm contractor team on this project, working under Agile.