Job Description

Please send your resume to mounika.karnati@amyantek.com if you are interested in this 5-month contract with the Ontario Public Service (Ministry of Public and Business Service Delivery and Procurement), with a possibility of extension. If you are not interested, please feel free to pass it along in your network to anyone looking for work.

Job Title: RQ09863 - Intermediate Data Science Developer
Working Status: 4 days a week until January 4, 2026, then full-time onsite
Location: 222 Jarvis Street, Toronto, Ontario
Hours per day: 7.25

Must Have

  • 2–5 years of professional experience in data science, data analytics, or a related quantitative field (e.g., data engineering, machine learning, or business intelligence), or equivalent
  • Proven experience in data analysis, visualization, and statistical modeling for real-world business or research problems
  • Demonstrated ability to clean, transform, and manage large datasets using Python, R, or SQL
  • Programming and data handling: Python (pandas, NumPy, scikit-learn, statsmodels, matplotlib, seaborn); SQL (complex queries, joins, aggregations, optimization); data preprocessing (feature engineering, missing data handling, outlier detection)
  • Experience working with big data frameworks such as Apache Spark and Hadoop for large-scale data processing

Responsibilities

  • Participate in product teams to analyze systems requirements and to architect, design, code, and implement cloud-based data and analytics products that conform to standards
  • Design, create, and maintain cloud-based data lake and lakehouse structures, automated data pipelines, analytics models, and visualizations (dashboards and reports)
  • Liaise with cluster IT colleagues to implement products, conduct reviews, resolve operational problems, and support business partners in the effective use of cloud-based data and analytics products
  • Analyze complex technical issues, identify alternatives, and recommend solutions
  • Prepare and conduct knowledge transfer

General Skills

  • Experience with multiple cloud-based data and analytics platforms and coding/programming/scripting tools to create, maintain, support, and operate cloud-based data and analytics products
  • Experience designing, creating, and maintaining cloud-based data lake and lakehouse structures, automated data pipelines, analytics models, and visualizations (dashboards and reporting) in real-world implementations
  • Experience assessing client information technology needs and objectives
  • Experience in problem-solving to resolve complex, multi-component failures
  • Experience preparing knowledge transfer documentation and conducting knowledge transfer
  • A team player with a track record of meeting deadlines

Desirable Skills

  • Written and oral communication skills to participate in team meetings, write/edit systems documentation, prepare and present written reports on findings and alternative solutions, and develop guidelines and best practices
  • Interpersonal skills to explain and discuss the advantages and disadvantages of various approaches
  • Experience conducting knowledge transfer sessions and building documentation for technical staff related to architecting, designing, and implementing end-to-end data and analytics products

Technology Stack

  • Azure Storage, Azure Data Lake, Azure Databricks Lakehouse, and Azure Synapse
  • Python, SQL, Azure Databricks, and Azure Data Factory
  • Power BI

Rated Criteria

Experience - 40%

  • 2–5 years of professional experience in data science, data analytics, or a related quantitative field (e.g., data engineering, machine learning, or business intelligence), or equivalent
  • Proven experience in data analysis, visualization, and statistical modeling for real-world business or research problems
  • Demonstrated ability to clean, transform, and manage large datasets using Python, R, or SQL
  • Hands-on experience building and deploying predictive models or machine learning solutions in production or business environments
  • Experience with data storytelling and communicating analytical insights to non-technical stakeholders
  • Exposure to cloud environments (AWS, Azure, or GCP) and version control tools (e.g., Git)
  • Experience working in collaborative, cross-functional teams, ideally within Agile or iterative project structures
  • Knowledge of ETL pipelines, APIs, or automated data workflows is an asset
  • Previous work with dashboarding tools (Power BI, Tableau, or Looker) is preferred

Technical Skills - 35%

Programming & Data Handling
  • Python (pandas, NumPy, scikit-learn, statsmodels, matplotlib, seaborn)
  • SQL (complex queries, joins, aggregations, optimization)
  • Data preprocessing (feature engineering, missing data handling, outlier detection)

Machine Learning & Statistical Modeling
  • Proficiency in supervised and unsupervised learning techniques (regression, classification, clustering, dimensionality reduction)
  • Understanding of model evaluation metrics and validation techniques (cross-validation, A/B testing, ROC-AUC, confusion matrix)
  • Basic understanding of deep learning frameworks (TensorFlow, PyTorch, or Keras) is a plus

Data Visualization & Reporting
  • Expertise with visualization libraries (matplotlib, seaborn, plotly, or equivalent)
  • Experience building interactive dashboards (Tableau, Power BI, Dash, or Streamlit)
  • Ability to design clear, impactful data narratives and reports

Data Infrastructure & Tools
  • Experience with cloud-based data services (e.g., AWS S3, Redshift, Azure Data Lake, GCP BigQuery)
  • Experience working with big data frameworks such as Apache Spark and Hadoop for large-scale data processing
  • Familiarity with data pipeline and workflow tools
  • Experience with API integration and data automation scripts (Selenium, Python, etc.)
  • Solid grounding in probability, statistics, and linear algebra
  • Understanding of hypothesis testing, confidence intervals, and sampling methods

Soft Skills - 20%

  • Strong communication skills, both written and verbal
  • Ability to develop and present new ideas and conceptualize new approaches and solutions
  • Excellent interpersonal relations and demonstrated ability to work effectively with others in teams
  • Demonstrated ability to work with functional and technical teams
  • Demonstrated ability to participate in a large team and work closely with other individual team members
  • Proven analytical skills and systematic problem solving
  • Strong ability to work under pressure, meet aggressive timelines, and adapt to change
  • Displays problem-solving and analytical skills, using them to resolve technical problems

Public Sector Experience - 5%

  • OPS (or other government) standards and processes
