BLeader

Data Engineer

Posted: 6 minutes ago

Job Description

Data Engineer - Building a Modern Data Platform from Scratch

About BLEADER

BLEADER is a boutique consulting firm specializing in transforming data into real business assets. Founded to bridge the gap between business vision and technical data expertise, we combine the market's best data experts with visionary business leaders.

We excel at building comprehensive data strategies for leading organizations and developing innovative, cutting-edge data solutions. Our focus is on delivering what customers truly need - turning their data challenges into competitive advantages.

About the Role

We're looking for a talented Data Engineer to join our team and take a key role in building a cutting-edge data platform from the ground up using Databricks. This is an exciting opportunity to work with modern data technologies and shape the foundation of our data infrastructure.

You'll be working on a greenfield project, implementing best practices, and contributing to the development of scalable data solutions that will serve as the backbone of our customer analytics capabilities.

What You'll Do

  • Build a modern data platform from scratch using Databricks
  • Develop robust ETL/ELT pipelines using Spark and Python
  • Implement modern data architectures (Lakehouse, Delta Lake)
  • Design and optimize data processing workflows for performance and reliability
  • Collaborate with data scientists and analysts to enable advanced analytics
  • Ensure data quality, governance, and security across the platform
  • Work with cloud infrastructure (AWS/Azure/GCP) for scalable solutions

What We're Looking For

Required Skills:

  • 2-4 years of experience in data engineering or ETL/ELT development
  • Strong proficiency in Python for data processing and automation
  • Solid experience with Apache Spark (PySpark)
  • Proficiency in SQL for complex data transformations
  • Experience with cloud platforms (AWS/Azure/GCP)
  • Understanding of data modeling and data warehouse concepts
  • Problem-solving mindset and attention to detail

Nice to Have:

  • Databricks experience - a strong advantage
  • Experience with Delta Lake
  • Knowledge of data streaming technologies (Kafka, Event Hubs)
  • Familiarity with CI/CD practices for data pipelines
  • Experience with data orchestration tools (Airflow, Azure Data Factory)
  • Understanding of data mesh or lakehouse architectures

What We Offer

  • Opportunity to work with cutting-edge data technologies
  • The chance to build something meaningful from the ground up
  • A collaborative and learning-focused environment
  • Professional development and growth opportunities
  • Competitive salary and benefits

Ready to build the future of data with us?

We're looking for someone who is passionate about data engineering, eager to learn new technologies, and excited about the challenge of building robust, scalable data solutions from scratch.
