Job Description
Purpose of the role
To build and maintain the systems that collect, store, process, and analyse data (such as data pipelines, data warehouses, and data lakes), ensuring that all data is accurate, accessible, and secure.
Accountabilities
Build and maintenance of data architecture pipelines that enable the transfer and processing of durable, complete and consistent data.
Design and implementation of data warehouses and data lakes that manage the appropriate data volumes and velocity and adhere to the required security measures.
Development of processing and analysis algorithms fit for the intended data complexity and volumes.
Collaboration with data scientists to build and deploy machine learning models.
Vice President Expectations
To contribute to or set strategy, drive requirements and make recommendations for change. Plan resources, budgets, and policies; manage and maintain policies/processes; deliver continuous improvements and escalate breaches of policies/procedures.
If managing a team, they define jobs and responsibilities, planning for the department’s future needs and operations, counselling employees on performance and contributing to employee pay decisions/changes. They may also lead a number of specialists to influence the operations of a department, in alignment with strategic as well as tactical priorities, while balancing short and long term goals and ensuring that budgets and schedules meet corporate requirements.
If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviours to create an environment for colleagues to thrive and deliver to a consistently excellent standard. The four LEAD behaviours are: L – Listen and be authentic, E – Energise and inspire, A – Align across the enterprise, D – Develop others.
OR for an individual contributor, they will be a subject matter expert within their own discipline and will guide technical direction. They will lead collaborative, multi-year assignments and guide team members through structured assignments, identifying the need to include other areas of specialisation to complete assignments. They will train, guide and coach less experienced specialists and provide information affecting long term profits, organisational risks and strategic decisions.
Advise key stakeholders, including functional leadership teams and senior management on functional and cross functional areas of impact and alignment.
Manage and mitigate risks through assessment, in support of the control and governance agenda.
Demonstrate leadership and accountability for managing risk and strengthening controls in relation to the work your team does.
Demonstrate comprehensive understanding of the organisation’s functions to contribute to achieving the goals of the business.
Collaborate with other areas of work, for business-aligned support areas, to keep up to speed with business activity and the business strategies.
Create solutions based on sophisticated analytical thought comparing and selecting complex alternatives. In-depth analysis with interpretative thinking will be required to define problems and develop innovative solutions.
Adopt and include the outcomes of extensive research in problem solving processes.
Seek out, build and maintain trusting relationships and partnerships with internal and external stakeholders in order to accomplish key business objectives, using influencing and negotiating skills to achieve outcomes.
All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence and Stewardship – our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset – to Empower, Challenge and Drive – the operating manual for how we behave.
At Barclays, we don’t just adapt to the future – we create it. Embark on a transformative journey as a Senior Data Engineer. We are seeking a highly experienced Senior Data Engineer with strong hands‑on expertise in building enterprise‑scale data pipelines on AWS using Apache Spark (Scala / PySpark). The role requires deep experience with modern Lakehouse architecture, Apache Iceberg, and dbt, and strong working knowledge of Snowflake and Databricks within an AWS ecosystem.
To be successful in this role, you should have experience with:
Experience in Data Engineering with strong enterprise delivery exposure.
Strong hands‑on experience building data pipelines using Apache Spark (Scala & PySpark).
Proven experience building and operating enterprise‑scale data platforms, including IaC / Terraform.
Strong skills in designing scalable and maintainable data solutions.
Validation (data quality, reconciliation, correctness).
Delivery of production‑ready, operationally stable pipelines.
Experience with Airflow or similar orchestration tools.
Strong experience working in regulated / large enterprise environments.
Strong hands‑on experience with AWS services, including: S3, AWS Glue (Jobs & Catalog), Athena, Lakehouse architectures, Step Functions / Lambda.
Hands‑on experience with Apache Iceberg.
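As an illustration of the validation work listed above (data quality, reconciliation, correctness), the following is a minimal sketch in plain Python. The table contents, helper names, and checks are hypothetical examples, not part of this role description; in practice such checks would run against Spark DataFrames or warehouse tables.

```python
import hashlib

def fingerprint(rows):
    """Order-independent fingerprint of a dataset: row count plus an
    XOR-combined hash of each row, so row ordering does not matter."""
    combined = 0
    for row in rows:
        digest = hashlib.sha256(repr(row).encode()).digest()
        combined ^= int.from_bytes(digest[:8], "big")
    return len(rows), combined

def reconcile(source_rows, target_rows):
    """Return simple reconciliation checks between two data extracts."""
    src_count, src_hash = fingerprint(source_rows)
    tgt_count, tgt_hash = fingerprint(target_rows)
    return {
        "row_count_match": src_count == tgt_count,
        "content_match": src_hash == tgt_hash,
        "source_rows": src_count,
        "target_rows": tgt_count,
    }

# Hypothetical extracts: a source system vs. the loaded target table.
source = [("acct-1", 100.0), ("acct-2", 250.5)]
target = [("acct-2", 250.5), ("acct-1", 100.0)]  # same data, different order
result = reconcile(source, target)
```

Because the per-row hashes are combined with XOR, the comparison tolerates row reordering between source and target, which is common after distributed processing.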
Some other highly valued skills may include:
Experience integrating Snowflake and/or Databricks with AWS data lakes and metadata catalogs.
Strong SQL and data modelling skills.
Ab Initio ETL experience.
Additional orchestration experience beyond Airflow.
Exposure to complex enterprise data governance and compliance patterns.
The candidate must demonstrate Barclays’ Values (Respect, Integrity, Service, Excellence, Stewardship) and operate with the Barclays Mindset.
Empower – enable teams, collaborate effectively, and take ownership.
Challenge – question the status quo, use data and insight to improve outcomes.
Drive – focus on results, deliver with pace, and act with accountability.
You may be assessed on the key critical skills relevant for success in the role, such as risk and controls, change and transformation, business acumen, strategic thinking, and digital and technology, as well as job-specific technical skills.
The location of the role is Pune, India.