Aram – AWS, Python, ETL
Aram is a Senior Data Engineer experienced in building and running modern data platforms end-to-end, from ingestion to serving, with a focus on governance, quality, and reliability. He’s best suited for startups or scale-ups that value practical stability and self-service analytics over experimental stream processing.
10 years of commercial experience
Main technologies
Additional skills
Direct hire
Possible
Experience Highlights
Lead Data Engineer
A healthcare product company that provides a weight-loss system. Aram led the design and operation of the company's analytics platform, developed a Natural Language-to-Analytics service, and contributed to Coach Iris, an LLM-powered digital health coach built on RAG that increased weekly patient engagement.
- Architected cloud data platform on AWS + Snowflake; owned data modeling and SLAs.
- Built Python APIs and Glue (PySpark) jobs for integrations across Product, Marketing, and Analytics (see the sketch after this list).
- Delivered NL-to-Analytics service (agentic/LangChain); reduced ad-hoc BI requests by ~40%.
- Helped design/ship Coach Iris (RAG), improving patient engagement by ~25%.
- Implemented streaming (Kafka/Kinesis) and serverless patterns (Lambda + API Gateway).
- Integrated data from GCP (BigQuery/GA) and Azure (Power BI).
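For illustration, a minimal sketch of the kind of Glue (PySpark) integration job described above; the job arguments, S3 paths, and column names (event_id, event_date) are assumptions, not details from this engagement.

```python
# Minimal sketch of a Glue (PySpark) integration job; paths, job arguments,
# and column names (event_id, event_date) are assumptions.
import sys

from awsglue.context import GlueContext
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME", "source_path", "target_path"])

glue_context = GlueContext(SparkContext.getOrCreate())
spark = glue_context.spark_session

# Read raw product events landed in S3 (path passed as a job argument).
events = spark.read.json(args["source_path"])

# Light cleansing before handing off to the curated zone.
curated = events.dropDuplicates(["event_id"]).filter("event_id IS NOT NULL")

# Write partitioned Parquet for downstream Snowflake ingestion and analytics.
curated.write.mode("overwrite").partitionBy("event_date").parquet(args["target_path"])
```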
Data Architect
The company is one of the world's largest service providers, helping organizations thrive through its industry-focused solutions. During his tenure, Aram worked on a project for Tigo, a leading provider of fixed and mobile telecommunications services in Latin America, for which he re-architected the data ingestion pipeline and migrated workloads to AWS.
- Implemented an event-driven design with Kinesis streams feeding Glue ETL into curated DynamoDB/S3/Snowflake-ready zones, enabling faster availability of telecom usage data for downstream analytics and dashboards (see the sketch after this list).
- Modeled raw/processed zones and designed SQL/NoSQL stores for analytics consumption.
- Built messaging patterns with SNS/SQS; authored runbooks and docs for the data platform.
- Coordinated with client engineering squads to phase migration with minimal downtime.
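As a rough illustration of the event-driven pattern above, the sketch below shows a Lambda consumer reading Kinesis records and landing them in DynamoDB; the table name, key schema, and payload fields are assumptions.

```python
# Sketch of a Lambda consumer for the Kinesis-fed ingestion path; the DynamoDB
# table name, key schema, and payload fields are assumptions.
import base64
import json

import boto3

dynamodb = boto3.resource("dynamodb")
usage_table = dynamodb.Table("telecom_usage_curated")  # assumed table name


def handler(event, context):
    """Triggered by a Kinesis stream; writes decoded usage events to DynamoDB."""
    with usage_table.batch_writer() as batch:
        for record in event["Records"]:
            payload = json.loads(base64.b64decode(record["kinesis"]["data"]))
            batch.put_item(Item={
                "subscriber_id": payload["subscriber_id"],  # assumed partition key
                "event_ts": payload["event_ts"],            # assumed sort key
                "usage_mb": payload.get("usage_mb", 0),
            })
    return {"records_processed": len(event["Records"])}
```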
Data Engineer
The company is one of the largest publicly traded petroleum and petrochemical enterprises in the world. There, Aram built end-to-end analytics to track utilization of corporate IT assets (servers, storage, networking, software). His team owned ingestion, modeling, and dashboards, and detected underutilized or orphaned resources and unlicensed installs to optimize capacity and spend and strengthen compliance.
- Built and maintained Python/SQL ETLs; standardized schemas for asset/usage/licensing data (see the sketch after this list).
- Exposed curated datasets via internal APIs for self-service analytics.
- Delivered interactive dashboards (Power BI/Tableau) for governance & optimization.
- Worked in Agile; partnered with infra/security teams on data quality and access controls.
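A minimal sketch of the kind of Python/SQL ETL referenced above, standardizing asset-usage records and appending them to a curated table; the column names, DSN environment variable, and target table name are illustrative assumptions.

```python
# Sketch of a small Python/SQL ETL that standardizes asset-usage records and
# appends them to a curated table; column names, the DSN environment variable,
# and the target table name are assumptions.
import os

import pandas as pd
from sqlalchemy import create_engine

engine = create_engine(os.environ["WAREHOUSE_DSN"])  # assumed env var, e.g. a Postgres URL


def load_asset_usage(csv_path: str) -> int:
    raw = pd.read_csv(csv_path)

    # Standardize the schema: consistent column names, types, and null handling.
    standardized = (
        raw.rename(columns={"AssetID": "asset_id", "UsagePct": "usage_pct"})
           .assign(usage_pct=lambda df: df["usage_pct"].fillna(0).astype(float))
           .drop_duplicates(subset=["asset_id", "report_date"])
    )

    # Append into the curated table consumed by dashboards and internal APIs.
    standardized.to_sql("asset_usage_daily", engine, if_exists="append", index=False)
    return len(standardized)
```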
Data Integrator Specialist
The client is the City Government. There, Aram built and operated data integrations for the city's IT/network observability program.
- Designed and ran PDI jobs to collect telemetry from multiple monitoring systems.
- Parsed/validated JSON & XML; unified schemas and enforced data quality before loading to DBs (see the sketch after this list).
- Automated loads into Oracle/PostgreSQL/SQL Server to enable dashboarding and ops reporting.
- Partnered with visualization team to define tables/views optimized for their use cases.
- Documented pipelines and handover runbooks to ensure stable daily operations.
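The PDI transformations themselves are defined in Pentaho's GUI rather than code; as a hedged Python analogue, the sketch below shows the parse/validate/unify step those jobs performed, with field names and the quality rule chosen purely for illustration.

```python
# Hedged Python analogue of the parse/validate/unify step the PDI jobs perform;
# the telemetry field names and the quality rule are illustrative assumptions.
import json
import xml.etree.ElementTree as ET


def parse_json_telemetry(text: str) -> dict:
    data = json.loads(text)
    return {
        "host": data["host"],
        "metric": data["metric"],
        "value": float(data["value"]),
        "collected_at": data["timestamp"],
    }


def parse_xml_telemetry(text: str) -> dict:
    root = ET.fromstring(text)
    return {
        "host": root.findtext("host"),
        "metric": root.findtext("metric"),
        "value": float(root.findtext("value")),
        "collected_at": root.findtext("timestamp"),
    }


def validate(row: dict) -> dict:
    # Basic data-quality gate applied before rows are loaded to the databases.
    if not row["host"] or row["value"] < 0:
        raise ValueError(f"rejected telemetry row: {row}")
    return row
```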
Ssr. Database Administrator
The client is the City Government. There, Aram provided multi-engine database administration for mission-critical government systems: provisioning, performance, patching, backup/recovery, migrations, and DR readiness across Oracle, MySQL, PostgreSQL, and SQL Server estates.
- Administered Oracle/MySQL/PostgreSQL/SQL Server instances used by multiple city agencies.
- Implemented and tested RMAN backup/recovery strategies and DR procedures.
- Performed migrations, patching, and capacity/performance monitoring to maintain SLAs.
- Wrote operational scripts and documentation to standardize routine DBA tasks (see the sketch after this list).
- Collaborated with application teams to troubleshoot issues and plan schema changes.
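As an example of the operational scripting mentioned above, here is a hedged sketch of a wrapper that runs a nightly RMAN backup from a prepared command file and logs the outcome; the paths and command file are assumptions.

```python
# Hedged sketch of an operational wrapper that runs a nightly RMAN backup from
# a prepared command file and logs the outcome; paths and the command file are
# assumptions.
import logging
import subprocess
from datetime import datetime

logging.basicConfig(
    filename="/var/log/dba/rman_backup.log",  # assumed log location
    level=logging.INFO,
    format="%(asctime)s %(levelname)s %(message)s",
)


def run_rman_backup(cmdfile: str = "/opt/dba/scripts/full_backup.rman") -> bool:
    """Invoke RMAN against the local target database using OS authentication."""
    result = subprocess.run(
        ["rman", "target", "/", f"cmdfile={cmdfile}"],
        capture_output=True,
        text=True,
    )
    if result.returncode != 0:
        logging.error("RMAN backup failed: %s", result.stderr)
        return False
    logging.info("RMAN backup completed at %s", datetime.now().isoformat())
    return True


if __name__ == "__main__":
    run_rman_backup()
```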