Equifax

Data Engineer


Job Description

Equifax is seeking a skilled and motivated Data Engineer to support our Product Management data initiatives. This role focuses on building, maintaining, and extending robust data pipelines to ensure critical business data is captured efficiently from platforms such as Aha! and AODocs. You will work primarily within a Google Cloud Platform (GCP) environment, using Python and Airflow to automate data workflows and ensure high data quality. You will collaborate with cross-functional teams to understand data requirements, monitor cloud infrastructure health, and implement innovative solutions, potentially leveraging Machine Learning, to expand our data analysis capabilities.

What You'll Do

  • Design, develop, and maintain scalable Python-based ETL pipelines and workflows using Apache Airflow to ingest data from external APIs (e.g., Aha!, AODocs) into our Google Cloud Platform (GCP) environment.
  • Manage and optimize data storage and retrieval in cloud data warehouses (e.g., BigQuery), ensuring data quality, consistency, and accessibility for business analysis.
  • Monitor the health and performance of cloud infrastructure and data pipelines using cloud monitoring tools, providing support and troubleshooting alerts to minimize downtime and ensure operational excellence.
  • Collaborate with the Product Management team to resolve data capture cases and extend the capabilities of existing ETLs based on evolving business needs.
  • Research and implement new methods for data acquisition and management, promoting the use of advanced techniques such as Machine Learning or AI algorithms to offer new analytical alternatives.
  • Adhere to coding best practices and use Git for version control to ensure code quality and facilitate collaborative development.

What Experience Do You Need

  • A Bachelor's degree in Computer Science, Software Engineering, Information Technology, or a related technical field.
  • 3+ years of professional experience in Data Engineering, with strong proficiency in Python for ETL development and advanced SQL skills for querying and data modeling.
  • Proven experience with cloud platforms (GCP preferred; AWS acceptable) and workflow orchestration tools such as Apache Airflow.
  • A solid understanding of APIs and experience writing scripts to consume and process data from third-party applications.
  • English proficiency of B2 or higher.

What Could Set You Apart

  • Google Professional Data Engineer Certification or equivalent cloud certifications.
  • Conceptual understanding or practical experience with Machine Learning principles and their application in data processes.
  • Experience with cloud infrastructure monitoring tools (e.g., Google Cloud Monitoring, Datadog) and incident response workflows.
  • Familiarity with CI/CD pipelines and DevOps practices for data engineering.
  • Understanding of data quality frameworks and best practices for data governance.

