
Job Description

QUALIFICATIONS

  • 12+ years of experience in data engineering or data architecture, with a focus on SQL-centric ETL processes and Python programming
  • Strong knowledge of SQL, including complex queries and performance tuning
  • Proficiency in Python, with experience using libraries such as Pandas, NumPy, and PySpark for data manipulation and processing
  • Experience with data modeling, data warehousing, and data integration concepts and best practices
  • Experience with at least one cloud environment (GCP, Azure, or AWS) is a must-have
  • Experience with version control systems, such as Git
  • Strong problem-solving skills and the ability to work independently and as part of a team
  • Excellent communication and collaboration skills, with the ability to effectively communicate complex technical concepts to both technical and non-technical stakeholders
  • Familiarity with the Azure cloud platform
  • Experience with Databricks, including the development and optimization of data pipelines using Databricks
  • Experience with data visualization tools, such as Power BI
  • Familiarity with GitHub Copilot


WHO YOU'LL WORK WITH

You will be based in our Bengaluru or Gurugram office as part of our Growth, Marketing & Sales solutions team.
You’ll be primarily aligned with Periscope’s technology team. Periscope® by McKinsey enables better commercial decisions by uncovering actionable insights. The Periscope platform combines world-leading intellectual property, prescriptive analytics, and cloud-based tools to provide more than 25 solutions focused on insights and marketing, with expert support and training. It is a unique combination that drives revenue growth both now and in the future. Customer experience, performance, pricing, category, and sales optimization are powered by the Periscope platform. Periscope has a presence in 26 locations across 16 countries, with a team of 1,000+ business and IT professionals and a network of 300+ experts.

WHAT YOU'LL DO

You will be responsible for designing scalable and efficient SQL-centric data pipelines using Databricks, SQL, and Python.
You will work closely with our data architects, data scientists, analysts, developers, and other stakeholders to ensure the delivery of high-quality data solutions that meet the needs of the business. 
You will lead the design, development, and maintenance of standardized B2B and B2C ETLs for all retail-related clients using Databricks, SQL, and Python, and you will collaborate with other stakeholders to understand data requirements and design solutions that meet business needs.

Job Details

Job Location
India
Company Industry
Other Business Support Services
Company Type
Unspecified
Employment Type
Unspecified
Monthly Salary Range
Unspecified
Number of Vacancies
Unspecified
