Position Summary

AI & Data

In this age of disruption, organizations need to navigate the future with confidence, embracing decision making with clear, data-driven choices that deliver enterprise value in a dynamic business environment. The AI & Data team leverages the power of data, analytics, robotics, science and cognitive technologies to uncover hidden relationships from vast troves of data, generate insights, and inform decision-making.
The offering portfolio helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets. AI & Data will work with our clients to:
Implement large-scale data ecosystems, including data management, governance and the integration of structured and unstructured data, to generate insights leveraging cloud-based platforms
Leverage automation, cognitive and science-based techniques to manage data, predict scenarios and prescribe actions
Drive operational efficiency by maintaining their data ecosystems, sourcing analytics expertise and providing As-a-Service offerings for continuous insights and improvements

PySpark Consultant

The position is suited for individuals who have a demonstrated ability to work effectively in a fast-paced, high-volume, deadline-driven environment.

Education and Experience

Education: B.Tech/M.Tech/MCA/MS
3-6 years of experience in the design and implementation of migrating an enterprise legacy system to a Big Data ecosystem for a data warehousing project.

Required Skills

Must have excellent knowledge of Apache Spark and Python programming experience
Deep technical understanding of distributed computing and broader awareness of different Spark versions
Strong UNIX operating system concepts and shell scripting knowledge
Hands-on experience using Spark and Python
Deep experience in developing data processing tasks using PySpark, such as reading data from external sources, merging data, performing data enrichment and loading into target data destinations (a minimal illustrative sketch follows the skills lists below)
Experience in deployment and operationalizing the code; knowledge of scheduling tools like Airflow, Control-M, etc. is preferred
Working experience with the AWS ecosystem, Google Cloud, BigQuery, etc. is an added advantage
Hands-on experience with AWS S3 filesystem operations
Good knowledge of Hadoop, Hive and the Cloudera/Hortonworks Data Platform
Should have exposure to Jenkins or an equivalent CI/CD tool and a Git repository
Experience handling CDC operations for huge volumes of data
Should understand and have operating experience with the Agile delivery model
Should have experience in Spark-related performance tuning
Should be well versed with design documents such as HLD, TDD, etc.
Should be well versed with data historical loads and overall framework concepts
Should have participated in different kinds of testing, such as Unit Testing, System Testing, User Acceptance Testing, etc.

Preferred Skills

Exposure to PySpark, Cloudera/Hortonworks, Hadoop and Hive
Exposure to AWS S3/EC2 and Apache Airflow (a scheduling sketch also follows below)
Participation in client interactions/meetings is desirable
Participation in code-tuning is desirable
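As illustration of the PySpark data-processing expectations listed above, the following is a minimal sketch of a read, merge, enrich and load job, assuming Spark 3.x with the S3A connector configured; all bucket names, paths and column names (example-bucket, orders, customer_id, etc.) are hypothetical placeholders and not part of the role description.

```python
# Minimal PySpark sketch: read from external sources, merge/enrich, and load to a target.
# All paths, dataset names, and columns are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("sample-enrichment-job")
    .getOrCreate()
)

# Read source data (e.g. from S3, assuming the S3A connector is configured on the cluster).
orders = spark.read.parquet("s3a://example-bucket/raw/orders/")
customers = spark.read.option("header", True).csv("s3a://example-bucket/raw/customers/")

# Merge and enrich: join orders with customer attributes and derive a simple flag.
enriched = (
    orders.join(customers, on="customer_id", how="left")
          .withColumn("is_high_value", F.col("order_amount") > 1000)
)

# Load into the target destination, partitioned by order date.
(
    enriched.write
    .mode("overwrite")
    .partitionBy("order_date")
    .parquet("s3a://example-bucket/curated/orders_enriched/")
)

spark.stop()
```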
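Likewise, since the role mentions scheduling tools such as Airflow, here is a small, assumed sketch of how a PySpark job like the one above might be scheduled with Airflow's SparkSubmitOperator from the Apache Spark provider package; the DAG id, schedule, connection id and script path are placeholders, not details from the posting.

```python
# Minimal Airflow sketch: schedule a PySpark job as a daily task.
# DAG id, schedule, connection id, and script path are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.apache.spark.operators.spark_submit import SparkSubmitOperator

with DAG(
    dag_id="daily_orders_enrichment",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",  # named `schedule` in newer Airflow releases
    catchup=False,
) as dag:
    enrich_orders = SparkSubmitOperator(
        task_id="enrich_orders",
        application="/opt/jobs/sample_enrichment_job.py",  # path to the PySpark script
        conn_id="spark_default",
        conf={"spark.sql.shuffle.partitions": "200"},
    )
```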
Our purpose

Deloitte's purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities.
Our people and culture

Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work.

Professional development

At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills.
Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India.

Benefits to help you thrive

At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially, and to live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits.
Eligibility requirements may be based on role, tenure, type of employment and/or other criteria. Learn more about what working at Deloitte can mean for you.

Recruiting tips

From developing a stand-out resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters.

Requisition code: 305899
Customize your resume to highlight skills and experiences relevant to this specific position.
Learn about the company's mission, values, products, and recent news before your interview.
Ensure your LinkedIn profile is complete, professional, and matches your resume information.
Prepare thoughtful questions to ask about team dynamics, growth opportunities, and company culture.