We have an exciting opportunity for an Enterprise Data Architect to join an existing team with one of our retail banking clients. The Data Resilience team is responsible for defining and embedding new strategies, operating models and control frameworks to protect critical data services, as well as for ensuring compliance with regulatory requirements for operational resilience.
This is built around three main objectives:
- Setting up a Data Pillar: identifying and defining the critical data assets that support critical business processes, and finding innovative, pioneering solutions to deliver the Data Pillar in ServiceNow.
- Assessing the level of resilience around those assets.
- Defining, designing, testing and implementing the tooling required to support this.
The Data Resilience team manages the end-to-end (E2E) delivery of Data Resilience and Data Pillar set-up: requirements gathering, definition, user stories, agreeing solutions on the ServiceNow platform, build, testing, and implementation through to go-live.
About the Role
In the Data workstream, the team is developing ServiceNow and tooling capability to stand up a new Data Pillar alongside the existing Technology, People, Property and Supply Chain Pillars. This will enable the client to understand and map its critical data assets and to assess data resilience across the organisation.
The successful candidate will be focused primarily on Data Resilience. As part of this, you will need to explore what data is required, how the data flows, and how to make the data and the processes surrounding it resilient.
This role covers business services, applications and assets. You'll need to understand asset classes and bring a DBA/infrastructure mindset. The role examines how messaging works, batches, data centres and business continuity, and involves speaking with technical experts and architects. This is not a Data Governance or data quality management chapter within CDAO; we need candidates with relevant experience in improving data and technology resilience.
- Pushing the data resiliency agenda across the organisation. You'll look at how data flows from source to destination across the technology landscape and what can be done to prevent data loss, data corruption, and ransomware/malware attacks, and to ensure data can be recovered within the impact tolerances of Important Business Services.
- You’ll support the establishment of controls and assessment frameworks that identify data vulnerabilities across a complex data and technical landscape (e.g. on-premises systems, third parties, middleware, databases, third-party applications, messaging queues, data feeds, data connections, APIs, batches, and cloud environments).
- You’ll support the embedding of data assessments, engaging a large and diverse stakeholder group. This includes target operating model design, data resilience MI design, data resilience RCSA design, and changes to operational resilience, data security, technology and data policies to embed the standards governing data resilience.
- You’ll analyse the outcomes of annual data resilience assessments and identify vulnerabilities from a data perspective across the availability, integrity, and security of data.
- You’ll support the creation of data lineage using Ins-Pi and ServiceNow, outlining the applications required for each step of the journey, upstream and downstream applications, and how data moves in transit or is held at rest across the technology landscape.
Applicants will have the following experience:
- Experience with batches, messaging queues, third-party data connections, encryption, data recovery and backup, data vaulting, data integrity, and cloud technologies is essential.
- Experience in a similar role focused on improving the resilience of data across the banking or insurance sector.
- An understanding of payments, cards, pensions, insurance, markets, trade and settlement, and logon customer journeys.
- Experience in managing risks and controls.
- A strong enterprise technical architecture background, with the ability to identify data resiliency issues in middleware components, e.g. how to make batches more resilient, how to improve security when data is transferred via messaging queues, how to improve the quality of data flowing in and out, and how to make data in the cloud resilient.