Job Description

At Outsourced, we connect top talent with exciting opportunities at innovative global companies. We partner with fast-growing businesses around the world to help them build high-performing teams.

We are seeking a highly skilled Data Engineer to design, build, and optimize data pipelines and solutions within our Microsoft-centric data ecosystem. The ideal candidate will have hands-on experience with modern data engineering tools and cloud platforms, and a strong understanding of data integration, transformation, and storage. You will play a critical role in ensuring our data is reliable, scalable, and easily consumable by business users, analysts, and data scientists.

Responsibilities

  • Design, develop, and maintain scalable ETL/ELT pipelines for ingesting and transforming structured and unstructured data.
  • Work with Microsoft Fabric to build dataflows, lakehouses, and end-to-end data solutions.
  • Develop and optimize data storage solutions using Azure Data Lake Storage, Azure Synapse, and Azure SQL Database.
  • Implement streaming and batch processing using tools like Apache Spark, Azure Stream Analytics, and Kafka.
  • Collaborate with data architects to implement data models, governance, and security standards.
  • Schedule and orchestrate pipelines using Azure Data Factory or Apache Airflow.
  • Ensure data quality, integrity, and availability through validation, monitoring, and error-handling frameworks.
  • Support advanced analytics and machine learning by delivering clean, curated, and well-documented datasets.
  • Work with stakeholders to understand business needs and deliver data solutions aligned with objectives.

Required Skills & Experience

  • Strong programming skills in Python and SQL; familiarity with PySpark/Spark.
  • Hands-on experience with Microsoft Fabric (dataflows, lakehouses, pipelines, Synapse integration).
  • Proficiency in Azure data services: Azure Data Factory, Azure Synapse Analytics, Azure SQL Database, Azure Data Lake Storage.
  • Knowledge of big data frameworks (Spark, Databricks, Hadoop ecosystem).
  • Experience with data streaming technologies such as Kafka or Azure Event Hubs.
  • Solid understanding of data modeling, warehousing, and performance optimization.
  • Familiarity with cloud architecture and security best practices in Azure.
  • Experience with orchestration and workflow automation tools (Airflow, Data Factory pipelines).
  • Strong problem-solving and collaboration skills with both technical and non-technical teams.

Nice to Have

  • Experience with dbt for transformations.
  • Familiarity with Power BI for enabling analytics.
  • Understanding of CI/CD for data pipelines (e.g., GitHub Actions, Azure DevOps).
  • Exposure to machine learning workflows and MLOps.

Education & Background

  • Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
  • 4-5 years of professional experience in data engineering.

Job Application Tips

  • Tailor your resume to highlight relevant experience for this position
  • Write a compelling cover letter that addresses the specific requirements
  • Research the company culture and values before applying
  • Prepare examples of your work that demonstrate your skills
  • Follow up on your application after a reasonable time period
