First Factory

Java Developer (Data Engineering Focus)

Posted: 1 day ago

Job Description

We are seeking a highly skilled Java Developer with strong experience in data engineering workflows, particularly ETL processes, SQL modeling, and cloud-based data systems. The ideal candidate will have hands-on experience with PostgreSQL, Snowflake, AWS, and containerized environments, as well as strong analytical and problem-solving skills. This role involves building and optimizing data pipelines, integrating with internal and external data systems, and contributing to the design of scalable, efficient data architectures.

About First Factory

We are a software development company with over two decades of experience and a dynamic team of 175+ professionals actively engaged in diverse projects across a range of industries. We invite you to join us on this journey as we continue to grow and take on fresh challenges.

Key Responsibilities

  • Design and implement ETL processes for ingesting, transforming, and delivering data across systems.
  • Optimize SQL for massive datasets.
  • Work with Snowflake and PostgreSQL databases to design schemas, write optimized queries, and support data operations.
  • Develop, maintain, and optimize Java-based applications that interact with data pipelines and services.
  • Build and optimize SQL models using DBT.
  • Manage data storage and workflows using AWS S3.
  • Deploy, configure, and maintain containerized applications in Kubernetes (AKS).
  • Collaborate with Data Engineers and Architects to ensure high-quality, scalable, and efficient data solutions.
  • Troubleshoot and optimize data systems for performance, scalability, and reliability.

Requirements

  • 5+ years of experience in Java development.
  • Experience with ETL pipelines and data transformation processes.
  • Proficiency in PostgreSQL and Snowflake.
  • Strong knowledge of AWS S3 and cloud-based data workflows.
  • Experience with Kubernetes (AKS) for deployment and orchestration.
  • Proficiency with DBT modeling (data modeling, transformations, testing).
  • Exceptional SQL skills (query optimization, indexing, analytical queries).
  • Strong understanding of data structures, algorithms, and software engineering best practices.

Nice to Have

  • Experience working with H3 indexes.
  • Knowledge of geospatial data modeling and geospatial queries in Snowflake.
  • Familiarity with streaming technologies such as Kafka or Kinesis.
  • Experience with CI/CD pipelines and Infrastructure-as-Code (IaC).

Job Application Tips

  • Tailor your resume to highlight relevant experience for this position
  • Write a compelling cover letter that addresses the specific requirements
  • Research the company culture and values before applying
  • Prepare examples of your work that demonstrate your skills
  • Follow up on your application after a reasonable time period
