Iván
From Spain (GMT+2)
25 years of commercial experience
Lemon.io stats
Iván – Rust, MongoDB, gRPC
Ivan is an experienced Senior Back-end Developer with a career spanning over 20 years: two decades of mastery in Core Java and, most recently, three years dedicated to Rust. He is known for his approachable, collaborative nature and strong English language skills. His professional background includes work in the AdTech domain, and he is notably adept at system design. Ivan's standout qualities are his deep mastery of Rust, his problem-solving prowess, and his outstanding interpersonal skills. He brings a wealth of skills to any project, making him a valuable asset to any team.
Main technologies
Additional skills
Ready to start
ASAP
Direct hire
Potentially possible
Experience Highlights
Tech Lead and Senior Back-end Developer
Algorithmic trading platform. Every prop trading firm knows the cost and time it takes to develop and maintain the software and infrastructure needed to integrate with exchanges. This platform provides the tools to run real-time analysis, order execution, and portfolio tracking securely and seamlessly.
- Designed the initial strategy as a standalone script, a 'bot';
- Over time added features and decomposed the bot into libraries and eventually into microservices;
- Added features required to integrate with a few exchanges;
- Stabilized the platform so it runs 24x7;
- Added monitoring and controls to protect capital/risk control rules;
- Added features to analyze historical data in Jupyter notebooks and to run backtesting;
- Built indicators and tools for parameter optimization;
- Created tools for portfolio optimization and analysis;
- Implemented a tiny rule engine to trigger orders given some real-time events from indicators and live order books, minimizing the time between opportunity detection and order execution.
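The rule engine in the last bullet can be sketched roughly as below. This is an illustrative sketch only: the event fields, the `momentum` indicator, and the thresholds are invented for the example and are not described in the source.

```rust
#[derive(Debug, Clone, Copy)]
struct MarketEvent {
    best_bid: f64,
    best_ask: f64,
    momentum: f64, // hypothetical indicator value, not from the source
}

#[derive(Debug, PartialEq)]
enum Action {
    Buy(f64),
    Sell(f64),
    Hold,
}

/// A rule pairs a predicate over the latest event with the action it triggers.
struct Rule {
    condition: fn(&MarketEvent) -> bool,
    action: fn(&MarketEvent) -> Action,
}

/// Evaluate rules in order and fire the first match, keeping the hot path
/// branch-light to minimize the time between detection and order execution.
fn evaluate(rules: &[Rule], event: &MarketEvent) -> Action {
    for rule in rules {
        if (rule.condition)(event) {
            return (rule.action)(event);
        }
    }
    Action::Hold
}

fn main() {
    let rules = vec![
        Rule {
            // Buy when momentum is strong and the spread is tight.
            condition: |e| e.momentum > 0.5 && e.best_ask - e.best_bid < 0.02,
            action: |e| Action::Buy(e.best_ask),
        },
        Rule {
            condition: |e| e.momentum < -0.5,
            action: |e| Action::Sell(e.best_bid),
        },
    ];
    let event = MarketEvent { best_bid: 100.00, best_ask: 100.01, momentum: 0.8 };
    println!("{:?}", evaluate(&rules, &event)); // prints Buy(100.01)
}
```

Using plain function pointers for conditions and actions keeps rule dispatch allocation-free on the hot path.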
Tech Lead, Back-end Developer, and Architect
Mercadolibre.com, operating at 200k requests per minute, needed a way to select debit/credit card payment providers for online transactions within milliseconds. The project evaluated payment, product, contract, and provider attributes to optimize the user experience, minimize fees, maximize availability, and stay within budget constraints.
- Designed and implemented the initial end-to-end backend architecture, including the process of gradually transferring traffic from the existing monolithic code into the new platform;
- Developed a dead-lettering mechanism with backoff for retrying mismatched requests;
- Created a weighted selection algorithm for resolving responses;
- Devised an anomaly detection algorithm to identify and predict provider failures;
- Established a throttling control system for allocating a portion of the budget to routers;
- Implemented observability dashboards and alerts using DataDog and NewRelic;
- Introduced a process for comparing the new system against the legacy monolith, uncovering previously undetected bugs that the monolith's complexity and scattered logic had masked;
- Orchestrated a gradual redirection of online traffic from the legacy monolith to the new platform;
- Assumed leadership of the router team, growing it to 7 senior Java developers through technical interviews, and ran daily code reviews;
- Established a process for collecting and applying best practices within the team during code reviews.
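The weighted selection step mentioned above can be sketched as a simple scoring pass over candidate providers. This is an assumption-laden illustration: the provider attributes (`availability`, `fee_pct`) and the weights are invented for the example; the production scoring inputs are not described in the source.

```rust
#[derive(Debug)]
struct Provider {
    name: &'static str,
    availability: f64, // observed success rate, 0.0..=1.0 (hypothetical attribute)
    fee_pct: f64,      // fee per transaction as a fraction (hypothetical attribute)
}

/// Score each provider by weighting availability against fees and
/// return the best candidate — one possible weighted-selection step.
fn select_provider<'a>(
    providers: &'a [Provider],
    w_availability: f64,
    w_fee: f64,
) -> Option<&'a Provider> {
    providers.iter().max_by(|a, b| {
        let score = |p: &Provider| w_availability * p.availability - w_fee * p.fee_pct;
        score(a)
            .partial_cmp(&score(b))
            .expect("scores are finite")
    })
}

fn main() {
    let providers = [
        Provider { name: "A", availability: 0.99, fee_pct: 0.030 },
        Provider { name: "B", availability: 0.97, fee_pct: 0.022 },
        Provider { name: "C", availability: 0.90, fee_pct: 0.015 },
    ];
    // With these weights, B's lower fee outweighs A's slightly higher availability.
    let best = select_provider(&providers, 1.0, 5.0).expect("non-empty list");
    println!("{}", best.name); // prints B
}
```

Keeping the score a pure function of provider attributes makes the selection deterministic and easy to replay when comparing against the legacy monolith's decisions.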
Associate Software Developer
JP Morgan's eXecute platform is an online commodities, FX, and bonds (treasury) trading platform used on trading floors. The project was to review the existing microservices, detect bottlenecks, and apply performance improvements in order to guarantee a given throughput in preparation for the UK's 'Brexit' referendum and the resulting financial/economic events.
- Detected and suggested improvements to over 17 microservices;
- Planned for failure scenarios;
- The services successfully sustained the target traffic without requiring any mitigation or tactical action.
Back-end Web Developer
This project centered on the development of a new technology designed to optimize DSL throughput while removing the DSL hub nodes. It involved creating services to provision and manage a network of DSL modems (hardware). Feedback from data samples generated by the hardware is collected by specialized services; these waveforms are then processed with Hadoop to detect noise, calculate noise-filtering coefficients, and apply them back to the hardware through the provisioning services. Once these coefficients are applied to the DSPs, the improved signal means less packet loss, fewer retransmissions, and higher data throughput.
- Served as the primary and exclusive developer for the provisioning and data collection services;
- Built a mechanism to discover the hardware in the network on boot, negotiate versions, apply firmware patches, switch collect (bypass)/hybrid/full modes, recovery modes, etc.;
- Developed services to process the wavelets in Hadoop, map/reduce processes, and keep track of the status of each analysis process in the pipeline.
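The shape of that map/reduce step can be mirrored in plain Rust for illustration. The production pipeline ran on Hadoop; the sample records, frequency-bin layout, and averaging rule below are invented for the sketch and are not from the source.

```rust
use std::collections::HashMap;

/// Map step: emit (frequency_bin, noise_power) pairs from raw modem samples.
/// (Identity here; real mappers would parse the collected waveform records.)
fn map_samples(samples: &[(u32, f64)]) -> Vec<(u32, f64)> {
    samples.iter().copied().collect()
}

/// Reduce step: average noise power per frequency bin. In the real pipeline,
/// per-bin results feed the filter-coefficient calculation that is pushed
/// back to the modem DSPs via the provisioning services.
fn reduce_noise(pairs: &[(u32, f64)]) -> HashMap<u32, f64> {
    let mut sums: HashMap<u32, (f64, u32)> = HashMap::new();
    for &(bin, power) in pairs {
        let entry = sums.entry(bin).or_insert((0.0, 0));
        entry.0 += power;
        entry.1 += 1;
    }
    sums.into_iter()
        .map(|(bin, (sum, n))| (bin, sum / n as f64))
        .collect()
}

fn main() {
    let samples = [(1, 0.2), (1, 0.4), (2, 0.1)];
    let avg = reduce_noise(&map_samples(&samples));
    println!("{:?}", avg.get(&1)); // average noise power for bin 1
}
```

Tracking each analysis through explicit map and reduce stages is what made it possible to report per-pipeline status, as the last bullet describes.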