Monday, October 27, 2025
Master Works

Datamart Developer

Posted: 1 day ago

Job Description

We are seeking an experienced Datamart / Semantic Layer Developer to develop and implement business-oriented datamarts and semantic layers on Teradata EDW, CDP Hive, and Trino platforms. The candidate must possess strong SQL development skills, dimensional modeling knowledge, telecommunications domain expertise, and the ability to translate technical specifications into optimized analytics solutions.

Experience Required: Minimum 5 years in datamart development and semantic layer implementation

Core Responsibilities

Datamart Development

  • Develop and implement star schema and snowflake schema dimensional models on Teradata EDW
  • Build subject-area datamarts (Customer, Revenue, Network, Product, Finance) based on design specifications
  • Create and optimize fact tables, dimension tables, bridge tables, and aggregate tables
  • Implement slowly changing dimension (SCD Types 1, 2, 3) logic and dimensional hierarchies
  • Develop complex SQL queries, stored procedures, and views for datamart population
  • Implement data transformation and aggregation logic for business metrics and KPIs

Semantic Layer Development

  • Develop semantic layers using TIBCO Data Virtualization on Teradata and CDP platforms
  • Build semantic models using Trino for distributed query processing and data access
  • Create virtual views, materialized views, and business-friendly data abstractions
  • Implement business logic, calculated measures, KPIs, and derived metrics in the semantic layer
  • Develop data access policies, row-level security, and governance rules
  • Optimize semantic layer performance through caching, indexing, and query optimization

Multi-Platform Development

  • Work across Teradata, CDP Hive, and Trino platforms for datamart and semantic layer implementation
  • Develop HiveQL queries and tables in the CDP (Cloudera Data Platform) environment
  • Integrate data from Teradata EDW and CDP Hive through Trino for unified semantic access
  • Create cross-platform queries and federated views using Trino connectors
  • Implement partitioning, bucketing, and optimization strategies in Hive tables

Implementation & Optimization

  • Translate design documents (HLD, LLD) and mapping specifications into SQL code
  • Develop ETL/ELT processes to populate datamarts from EDW sources
  • Optimize query performance using indexing (PI, SI, NUSI), statistics, partitioning, and aggregations
  • Conduct unit testing, data validation, and reconciliation between source and target
  • Debug and troubleshoot performance issues in datamarts and semantic layers

Collaboration & Documentation

  • Work closely with datamart designers, EDW developers, BI teams, and business analysts
  • Implement business requirements and KPI calculations per specifications
  • Create technical documentation: SQL scripts, deployment guides, data lineage
  • Support UAT activities and assist business users in validating data accuracy
  • Provide production support and resolve data and performance issues

Requirements

Required Skills

SQL & Development (Required - Strong)

  • Teradata (Must Have): advanced SQL development, stored procedures, performance tuning, utilities (BTEQ, TPT)
  • Strong understanding of Teradata architecture, indexing (PI, SI, NUSI), partitioning, and statistics
  • CDP Hive: HiveQL development, table creation, partitioning, bucketing, and optimization in a Cloudera environment
  • Trino (formerly PrestoSQL): SQL development using Trino, federated queries, connector configuration
  • Expert-level SQL across multiple platforms for complex queries and transformations
  • Oracle SQL and PL/SQL development experience

Semantic Layer & Tools (Required)

  • TIBCO: hands-on development experience with TIBCO Data Virtualization for semantic layer implementation
  • Experience creating virtual views, business views, and semantic models in TIBCO
  • Understanding of data virtualization concepts and query federation
  • Knowledge of BI tool integration with semantic layers

Dimensional Modeling Knowledge

  • Strong understanding of star schema and snowflake schema dimensional models
  • Knowledge of fact table design, dimension design, and SCD implementations
  • Ability to translate dimensional models into physical database objects
  • Understanding of dimensional modeling best practices (Kimball methodology)

Telecommunications Domain (Required)

  • Understanding of telecom business processes, KPIs, and data flows
  • OSS: network performance, inventory, and fault management metrics
  • BSS: billing, customer analytics, revenue, churn, and product performance
  • Telecom KPIs: ARPU, churn rate, CLTV, network utilization, revenue metrics

Professional Skills

  • Full SDLC experience (Agile/Scrum, Waterfall)
  • Strong analytical and debugging skills for performance troubleshooting
  • Good communication skills for technical collaboration
  • Unix/Linux scripting for automation (a plus)
  • Version control: Git, SVN

Preferred Qualifications

  • Bachelor's degree in Computer Science, Information Technology, or a related field
  • Experience with data profiling and data quality tools
  • Knowledge of ETL tools (Ab Initio, Informatica)
  • Understanding of data governance and metadata management
  • Experience with BI tools: Tableau, Power BI, Qlik

Key Deliverables

  • Developed and deployed datamarts (star/snowflake schema) on Teradata
  • Semantic layer implementations using TIBCO and Trino with business views and virtual tables
  • Optimized SQL code, stored procedures, and views for datamarts
  • HiveQL scripts and tables in the CDP environment
  • Technical documentation: SQL scripts, deployment guides, data lineage
  • Unit-tested code with data validation and reconciliation reports
  • Performance tuning recommendations and optimization implementations
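The SCD Type 2 logic called for above is worth illustrating: on a change, the current dimension row is closed out (end-dated) and a new versioned row is inserted, preserving history. This is a minimal, platform-neutral sketch using Python's built-in sqlite3 rather than Teradata, with a hypothetical `dim_customer` table and column names invented for the example:

```python
import sqlite3

# SCD Type 2 sketch: each change to a customer closes the current row
# (is_current = 0, end_date set) and inserts a new current row.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE dim_customer (
        customer_sk  INTEGER PRIMARY KEY AUTOINCREMENT,  -- surrogate key
        customer_id  TEXT,                               -- natural key
        plan_name    TEXT,
        start_date   TEXT,
        end_date     TEXT,
        is_current   INTEGER
    )
""")

def apply_scd2(customer_id, plan_name, effective_date):
    """Apply an SCD Type 2 change for one natural key."""
    row = conn.execute(
        "SELECT customer_sk, plan_name FROM dim_customer "
        "WHERE customer_id = ? AND is_current = 1",
        (customer_id,)).fetchone()
    if row and row[1] == plan_name:
        return  # no attribute change: nothing to do
    if row:
        # Close out the current version of the row.
        conn.execute(
            "UPDATE dim_customer SET end_date = ?, is_current = 0 "
            "WHERE customer_sk = ?", (effective_date, row[0]))
    # Insert the new current version.
    conn.execute(
        "INSERT INTO dim_customer "
        "(customer_id, plan_name, start_date, end_date, is_current) "
        "VALUES (?, ?, ?, NULL, 1)", (customer_id, plan_name, effective_date))

apply_scd2("C001", "Prepaid 5GB", "2025-01-01")
apply_scd2("C001", "Postpaid 20GB", "2025-06-01")  # plan change -> new version

history = conn.execute(
    "SELECT plan_name, start_date, end_date, is_current "
    "FROM dim_customer WHERE customer_id = 'C001' ORDER BY customer_sk"
).fetchall()
print(history)
# -> [('Prepaid 5GB', '2025-01-01', '2025-06-01', 0),
#     ('Postpaid 20GB', '2025-06-01', None, 1)]
```

On Teradata the same pattern would typically be expressed as a MERGE or an UPDATE-then-INSERT inside a stored procedure; the close-out/insert sequence is the portable core of the technique.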

Job Application Tips

  • Tailor your resume to highlight relevant experience for this position
  • Write a compelling cover letter that addresses the specific requirements
  • Research the company culture and values before applying
  • Prepare examples of your work that demonstrate your skills
  • Follow up on your application after a reasonable time period
