ETL/ELT - Data Engineer
Posted: 4 days ago
Job Description
This is a full-time work-from-home opportunity for a star Data Engineer from LATAM.

IDT (www.idt.net) is an American telecommunications company founded in 1990 and headquartered in New Jersey. Today it is an industry leader in prepaid communication and payment services and one of the world's largest international voice carriers. We are listed on the NYSE, employ over 1,300 people across 20+ countries, and have revenues in excess of $1.5 billion.

IDT is looking for a skilled Data Engineer to join our BI team and take an active role in performing data analysis, ELT/ETL design, and support functions to deliver on strategic initiatives that meet organizational goals.

Responsibilities:
- Design, implement, and validate ETL/ELT data pipelines for batch processing, streaming integrations, and data warehousing, while maintaining comprehensive documentation and testing to ensure reliability and accuracy
- Maintain end-to-end Snowflake data warehouse deployments and develop Denodo data virtualization solutions
- Recommend process improvements to increase efficiency and reliability in ELT/ETL development
- Stay current on emerging data technologies and support pilot projects, ensuring the platform scales seamlessly with growing data volumes
- Architect, implement, and maintain scalable data pipelines that ingest, transform, and deliver data into real-time data warehouse platforms, ensuring data integrity and pipeline reliability
- Partner with data stakeholders to gather requirements for language-model initiatives and translate them into scalable solutions
- Create and maintain comprehensive documentation for all data processes, workflows, and model deployment routines
- Stay informed about emerging methodologies in data engineering and open-source technologies

Requirements:
- 5+ years of experience in ETL/ELT design and development, integrating data from heterogeneous OLTP systems and API solutions, and building scalable data warehouse solutions to support business intelligence and analytics
- Excellent English communication skills
- Effective oral and written communication with the BI team and user community
- Demonstrated experience using Python for data engineering tasks, including transformation, advanced data manipulation, and large-scale data processing
- Experience designing and implementing event-driven pipelines that leverage messaging and streaming events to trigger ETL workflows and enable scalable, decoupled data architectures
- Experience in data analysis and root cause analysis, with proven problem-solving and analytical-thinking capabilities
- Experience designing complex data pipelines that extract data from RDBMS, JSON, API, and flat-file sources
- Demonstrated expertise in SQL and PL/SQL programming, with advanced mastery of business intelligence and data warehouse methodologies, along with hands-on experience in one or more relational database systems and cloud-based database services such as Oracle, MySQL, Amazon RDS, Snowflake, Amazon Redshift, etc.
- Proven ability to analyze and optimize poorly performing queries and ETL/ELT mappings, providing actionable recommendations for performance tuning
- Understanding of software engineering principles, skills working on Unix/Linux/Windows operating systems, and experience with Agile methodologies
- Proficiency in version control systems, with experience managing code repositories, branching, merging, and collaborating in a distributed development environment
- Interest in business operations and a comprehensive understanding of how robust BI systems drive corporate profitability by enabling data-driven decision-making and strategic insights

Pluses:
- Experience developing ETL/ELT processes within Snowflake and implementing complex data transformations using built-in functions and SQL capabilities
- Experience using Pentaho Data Integration (Kettle) / Ab Initio ETL tools to design, develop, and optimize data integration workflows
- Experience designing and implementing cloud-based ETL solutions using Azure Data Factory, DBT, AWS Glue, Lambda, and open-source tools
- Experience with reporting/visualization tools (e.g., Looker) and job scheduler software
- Experience in Telecom, eCommerce, or International Mobile Top-up
- Preferred certifications: AWS Solutions Architect, AWS Cloud Data Engineer, Snowflake SnowPro Core

Please attach your CV in English. The interview process will be conducted in English. We are only accepting applicants from LATAM.

We may use artificial intelligence (AI) tools to support parts of the hiring process, such as reviewing applications, analyzing resumes, or assessing responses. These tools assist our recruitment team but do not replace human judgment. Final hiring decisions are ultimately made by humans. If you would like more information about how your data is processed, please contact us.
Job Application Tips
- Tailor your resume to highlight relevant experience for this position
- Write a compelling cover letter that addresses the specific requirements
- Research the company culture and values before applying
- Prepare examples of your work that demonstrate your skills
- Follow up on your application after a reasonable time period