Kartaca

Data Engineer

Posted: 19 hours ago

Job Description

At Kartaca, our goal is to create the perfect solutions for our customers. With business standards we never compromise on and a preference for free software, we work to develop products that make us proud. We are looking for new teammates who share the same enthusiasm: people who are curious to learn, willing to add value to what they do, and have a strong work ethic.

The ideal candidate:

  • has a bachelor's degree in Computer Engineering/Science, Mathematics, or a related technical field
  • has a solid foundation in computer science, with competencies in data structures, algorithms, and software design
  • has 2+ years of experience designing, building, and operationalizing large-scale enterprise data solutions and applications using cloud vendor data and analytics services in combination with third parties, working with cross-functional teams in a dynamic environment
  • has experience with object-oriented/object-function scripting languages: Python, Java, etc.
  • has experience with data processing software (Hadoop, Spark, Pig, Hive) and algorithms (MapReduce, Flume)
  • has experience with relational SQL and NoSQL databases, including PostgreSQL and Cassandra
  • has experience with stream-processing systems: Storm, Spark Streaming, etc.
  • has experience architecting and implementing next-generation data and analytics platforms on cloud platforms
  • has a successful history of manipulating, processing, and extracting value from large unstructured datasets
  • is an agile, goal-oriented team player able to work under minimal supervision
  • has strong analytical, project management, and organizational skills
  • has experience with message queuing, stream processing, and highly scalable "big data" data stores
  • has experience working with Big Data, information retrieval, data mining, or ML, and with building multi-tier, high-availability applications using modern web technologies (such as NoSQL, MongoDB, SparkML, TensorFlow)
  • can demonstrate English proficiency at C1 level or above

Preferably, the candidate:

  • is a Professional Google Cloud Certified Data Engineer
  • has hands-on GCP experience, with a minimum of one solution designed and implemented at production scale
  • has experience performing detailed assessments of current-state data platforms and creating an appropriate transition path to GCP
  • has experience with Cloud Dataproc, Cloud Dataflow, Apache Beam, Bigtable, BigQuery, Cloud Pub/Sub, Cloud Functions, etc.

Responsibilities

  • Employ big data solutions and build ETL pipelines for scalable, high-performance systems
  • Conduct complex data analysis, report on results, and build algorithms
  • Build processes supporting data transformation, data structures, metadata, dependency, and workload management
  • Assemble large, complex data sets that meet functional and non-functional business requirements
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, and re-designing infrastructure for greater scalability
  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and Google Cloud "big data" technologies
  • Work with stakeholders, including executives and the product and data teams, to assist with data-related technical issues and support their data infrastructure needs
  • Keep client data separated and secure across multiple data centers and Google Cloud regions, and improve data reliability and quality
  • Work with data and analytics experts to strive for greater functionality in our data systems

Job Application Tips

  • Tailor your resume to highlight relevant experience for this position
  • Write a compelling cover letter that addresses the specific requirements
  • Research the company culture and values before applying
  • Prepare examples of your work that demonstrate your skills
  • Follow up on your application after a reasonable time period
