Rahul
From Canada (GMT-4)
16 years of commercial experience
Lemon.io stats
Rahul – Apache Spark, Tableau, AWS
Rahul is a Senior Data Engineer. He is skilled in ETL design, development, and deployment and has a solid architectural mindset for data-driven solutions. He balances business needs with technical decision-making. His approach to code review stands out: he can spot a code smell and find a way to make the code more maintainable.
Main technologies
Additional skills
Ready to start
ASAP
Direct hire
Potentially possible
Experience Highlights
Principal Data Engineer
It's an international management consulting and technology company and a valued partner of many of the world’s largest financial services providers.
- Enhanced data quality and compliance through robust quality checks and strict adherence to IT security standards;
- Built data ingestion pipelines utilizing Python libraries (Pandas, SQLAlchemy) and Snowflake connectors, ensuring scalability and robustness for large-scale data processing;
- Streamlined deployment processes with automated CI/CD pipelines, reducing deployment time and errors;
- Implemented comprehensive data quality checks and logging mechanisms to actively detect, track, and resolve data issues, maintaining adherence to SLAs;
- Provided reliable production support, ensuring minimal downtime and quick resolution of data issues;
- Provided ongoing BAU support for production job monitoring, issue resolution, and bug fixes, with proactive monitoring and alerting for job failures and data quality issues.
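The ingestion-with-quality-checks pattern described above can be sketched in minimal form. This is an illustrative example, not the actual project code: the stdlib sqlite3 stands in for the Snowflake connector, and the table, column names, and check rules are assumptions.

```python
import sqlite3
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("ingest")

def quality_check(row):
    """Reject rows with a missing key or a non-positive amount (illustrative rules)."""
    return (
        row.get("id") is not None
        and isinstance(row.get("amount"), (int, float))
        and row["amount"] > 0
    )

def ingest(rows, conn):
    """Load rows that pass quality checks; log and count rejects for SLA tracking."""
    conn.execute("CREATE TABLE IF NOT EXISTS payments (id INTEGER PRIMARY KEY, amount REAL)")
    loaded, rejected = 0, 0
    for row in rows:
        if quality_check(row):
            conn.execute("INSERT INTO payments (id, amount) VALUES (?, ?)",
                         (row["id"], row["amount"]))
            loaded += 1
        else:
            log.warning("rejected row: %r", row)  # surfaced to alerting in a real pipeline
            rejected += 1
    conn.commit()
    return loaded, rejected

conn = sqlite3.connect(":memory:")
loaded, rejected = ingest([{"id": 1, "amount": 10.0}, {"id": None, "amount": 5.0}], conn)
# loaded == 1, rejected == 1
```

In a production Snowflake pipeline the same shape holds; only the connection and the check rules grow.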
Senior Technical Lead
It's a central data repository that helps address data silo issues: a data lake stores vast amounts of raw data in its native, original format.
- Introduced PySpark, Unix, and CI/CD pipelines to enhance data management processes within the ADA platform;
- Designed and implemented comprehensive ETL pipelines, managing data flow for migration initiatives;
- Transferred data from SAS and Teradata to the ADA platform, ensuring data integrity and quality;
- Created Unix shell scripts to facilitate the migration of data from SAS and Teradata servers to ADA;
- Integrated Apache Kafka for real-time data streaming and processing;
- Implemented Grafana for monitoring and alerting on data pipeline performance;
- Optimized data management processes by integrating PySpark and Unix;
- Successfully migrated large volumes of data from SAS and Teradata to ADA, improving data accessibility and processing efficiency;
- Streamlined ETL processes, enhancing overall data pipeline performance and reliability;
- Enhanced real-time data processing capabilities with Apache Kafka;
- Improved monitoring and alerting mechanisms with Grafana, ensuring better oversight of data pipelines.
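"Ensuring data integrity and quality" in a migration like this usually includes a source-versus-target reconciliation step. A minimal, hedged sketch of that idea in pure Python — lists stand in for the SAS/Teradata and ADA extracts, and the fingerprint scheme is an assumption, not the project's actual method:

```python
import hashlib

def table_fingerprint(rows):
    """Order-insensitive fingerprint: row count plus a checksum over sorted row reprs."""
    digest = hashlib.sha256()
    for row in sorted(repr(r) for r in rows):
        digest.update(row.encode("utf-8"))
    return len(rows), digest.hexdigest()

def reconcile(source_rows, target_rows):
    """True when counts and checksums match, i.e. the migration lost or altered nothing."""
    return table_fingerprint(source_rows) == table_fingerprint(target_rows)

source = [(1, "alice"), (2, "bob")]
target = [(2, "bob"), (1, "alice")]   # same data, different load order
assert reconcile(source, target)
assert not reconcile(source, [(1, "alice")])
```

At Teradata scale the same comparison would be pushed down to the databases (counts and hash aggregates per table) rather than pulling rows into Python.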
Senior Application Developer
The company offers financial solutions to the people and businesses within and connecting with ASEAN. Through data and relationship-led insights, it creates solutions tailored to unique needs. Its comprehensive regional network and one-bank approach connect the business to new opportunities in ASEAN.
- Led the development of ETL pipelines for the Tookitaki AML suite, ensuring compliance with regulatory requirements;
- Implemented end-to-end Big Data processing solutions within the Hadoop ecosystem, optimizing performance for efficient data processing;
- Guided the development team, providing technical direction and ensuring project milestones were met;
- Utilized AWS and Azure for scalable and efficient cloud-based data processing solutions;
- Successfully implemented ETL pipelines for the Tookitaki AML suite, enhancing compliance and data processing efficiency;
- Optimized Big Data processing techniques, significantly improving performance within the Hadoop ecosystem;
- Led a high-performing development team, ensuring timely delivery of project goals;
- Leveraged cloud platforms for scalable and efficient data processing.
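The MapReduce model underlying the Hadoop-ecosystem work above can be illustrated in miniature with pure Python. This is the canonical word-count example of the map → shuffle → reduce flow, not code from the project, and no Hadoop APIs are involved:

```python
from collections import defaultdict
from itertools import chain

def map_phase(line):
    """Emit (word, 1) pairs, like a Hadoop mapper."""
    return [(word.lower(), 1) for word in line.split()]

def shuffle(pairs):
    """Group values by key, like the shuffle/sort between map and reduce."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Sum each key's values, like a Hadoop reducer."""
    return {key: sum(values) for key, values in groups.items()}

lines = ["big data big pipelines", "data pipelines"]
counts = reduce_phase(shuffle(chain.from_iterable(map_phase(l) for l in lines)))
# counts == {"big": 2, "data": 2, "pipelines": 2}
```

Frameworks like Hadoop MapReduce or Spark distribute exactly these three phases across a cluster, which is where the performance tuning mentioned above happens.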
Senior Technical Consultant (ETL Developer)
A global bank: an institution connecting millions of people across hundreds of countries and cities. The bank provides financial services that enable growth and economic progress. The core activities are safeguarding assets, lending money, making payments, and accessing the capital markets on behalf of its clients.
- Supported and developed VNG applications on Actimize Employee Fraud Solution and RCM, leveraging Big Data processing and analytics;
- Led end-to-end code deployment processes in production environments, ensuring the reliability and scalability of data solutions;
- Provided technical expertise in Big Data processing and analytics, ensuring optimal performance and data integrity;
- Successfully supported and developed VNG applications, enhancing fraud detection and compliance;
- Led reliable and scalable code deployments, ensuring minimal downtime and high availability;
- Demonstrated strong expertise in Big Data processing and analytics, contributing to the overall success of the Actimize platform.
Software engineer
Project Overview: Developed and maintained ETL processes for various banking applications, ensuring data integrity and performance optimization.
Key Responsibilities:
• ETL Development: Designed and implemented ETL processes for banking applications, focusing on data integrity and performance.
• Data Management: Managed data extraction, transformation, and loading operations, ensuring seamless data flow and accuracy.
• Performance Optimization: Optimized ETL processes for improved performance and efficiency.
Technologies Used:
• Programming Languages: Python, SQL
• Data Tools: Informatica, Oracle
• Version Control: Git
Achievements:
• Successfully developed ETL processes that ensured high data integrity and performance.
• Optimized data management practices, enhancing overall data processing efficiency.
Software Engineer (ETL Developer)
Project Overview: Developed ETL processes for financial applications, focusing on data accuracy and performance.
Key Responsibilities:
• ETL Development: Designed and implemented ETL processes for financial applications, ensuring data accuracy and performance.
• Data Integration: Managed data integration operations, ensuring seamless data flow and consistency.
Technologies Used:
• Programming Languages: SQL
• Data Tools: Informatica, Oracle
• Version Control: Git
Software Engineer (ETL Developer)
Project Overview: Implemented ETL solutions for banking applications, focusing on data transformation and performance optimization.
Key Responsibilities:
• ETL Development: Developed ETL solutions for banking applications, ensuring efficient data transformation and loading.
• Data Transformation: Managed data transformation operations, ensuring accurate and timely data delivery.
Technologies Used:
• Programming Languages: SQL
• Data Tools: Informatica, Oracle
• Version Control: Git
Achievements:
• Successfully implemented ETL solutions, enhancing data transformation and loading efficiency.
• Improved data accuracy and performance through optimized ETL processes.