Job Description

At LifeByte, we are a dynamic and innovative collective of tech visionaries driven by a relentless pursuit of excellence. Each of us brings a unique set of skills to the table, collaborating on projects that shape the future. Founded in 2017, we are dedicated to fostering an ecosystem of seamless resource exchange, where efficiency and precision are paramount. With cutting-edge solutions, we empower businesses to thrive and individuals to unlock their full potential. Committed to high-tech innovation, we are actively reshaping the future, one Byte at a Time.

We are looking for a highly skilled Data Engineer to help build and optimise the company's data infrastructure. This position works closely with data scientists, analysts and business teams to improve the efficiency of the company's data processing and decision making by building efficient data pipelines and storage solutions.

Job Responsibilities

  • Design, build, install, test and maintain scalable data management systems, ensuring that each system meets business requirements and industry standards
  • Integrate emerging data management and software engineering technologies into existing data structures, ensuring compliance with data management and security policies
  • Monitor the performance and data accuracy of data processing systems
  • Maintain CI/CD systems and the code base
  • Establish and uphold high quality standards, and maintain cloud architecture systems across accounts
  • Mentor and guide junior data engineers
  • Adopt new technologies while leveraging your accumulated expertise in modern big data tools and cloud services

Requirements

  • Bachelor's degree or above in Computer Science, Engineering or a related field; able to use English as a working language
  • At least 1 year of relevant experience in data engineering, with project experience in designing and implementing complex data solutions
  • Proficiency in Python and experience in developing robust, maintainable and scalable data processing pipelines
  • Experience with CI/CD systems, automated workflows, and integrating data quality checks into deployment processes
  • Strong experience with cloud services (preferably AWS), including the use of AWS data services and knowledge of Snowflake as a data warehouse solution
  • Expertise in working with all forms of data infrastructure, and adept at managing streaming and batch data systems
  • Solid understanding of data modelling, ETL (Extract, Transform, Load) processes and data warehousing principles, with a commitment to improving data quality and accuracy

Benefits

  • Hybrid working arrangement - 2 days of remote work per week
  • Opportunities for enriching career growth, including exposure to regional contexts
  • Complimentary snacks and beverages available in the office pantry
  • Healthcare coverage (medical, dental, optical) and gym benefits
  • Flexibility in smart casual dress code
  • Young, vibrant and open work culture

Job Application Tips

  • Tailor your resume to highlight relevant experience for this position
  • Write a compelling cover letter that addresses the specific requirements
  • Research the company culture and values before applying
  • Prepare examples of your work that demonstrate your skills
  • Follow up on your application after a reasonable time period