Will Erskine, General Manager, Australia, and Andy Roberts, Principal Consultant at PBT Group
Cyber security is making headlines after several high-profile Australian data breaches were publicly exposed.
For many organisations, increasing cyber security equates to building bigger walls around data, putting more locks on the metaphorical doors and giving only one person the key.
However, data governance and data management are also vital to keeping sensitive data safe.
Data governance maps out the policies and procedures that are applied to data to ensure it is accurate, available and secure. Data management encompasses the processes and technology that implement those policies and procedures, making the data available for use in ways that deliver real business benefits.
To understand the link between cyber security and data governance and management, think of an old-fashioned bank in a western movie. It has a vault that’s highly secure and buried underground on three sides. Its only access is a solid steel door with many security mechanisms including a combination lock.
When a data breach occurs, imagine that someone has drilled into the vault from below and removed all the gold bars before anyone notices anything is wrong. Attacks like this do happen and it's important to protect against them, but they are probably the least likely.
To continue the old-fashioned bank analogy, consider: who has the keys to the vault, and who knows the combination to the lock? How often does the combination change? When it changes, how is that communicated to the people who need to know it? (Once the combination is emailed or written down, it's already compromised.)
User access control is a vital element of cyber security. In the bank scenario, and in most modern corporate environments, it is far easier to access the system through the front door by compromising an employee who has access than by drilling through the wall of the safe.
Data is the most valuable asset an organisation has – although many organisations haven’t entirely realised that. Data is “the new oil” with organisations eager to maximise use of the data they store to improve financial success and customer experience.
Organisations need to strike a balance between keeping data locked up completely and providing access to it for appropriate reasons.
For example, if a telco’s customer data is stored on paper in a vault at an underground location in remote Northern Territory, accessible only by passing through several security doors and crossing a moat infested with crocodiles, it is very secure. But it’s not practical if a customer needs to take the treacherous journey to update their address or change their subscription.
Returning to the old-fashioned bank analogy, the bank needs to access the gold in its vault at times for varied reasons:
- to pay the bank manager’s bonus
- to count it
- to buy things
- to transfer gold to other banks in the area.
Each scenario opens potential vulnerabilities where the gold might be compromised. For example, an employee who needs to count the gold might move it from the hot, stuffy vault to a better-ventilated room, inadvertently leaving it outside the security of the vault for a time. (This equates to printing documents for a meeting then leaving them exposed in a meeting room, or leaving a laptop open on a desk with access to sensitive data.)
When transferring gold to another branch, you need a registered, known carrier who can deliver the valuable cargo. However, if the carrier needs to be booked weeks in advance and an employee forgets to do so, the bank may decide to use someone unknown to deliver the gold. (This equates to emailing documents outside a network.)
Even if a stagecoach transports gold between branches in a triple-locked, iron-bound chest with a hired gunman sitting on it, what happens when they stop to rest the horses? (This represents data in transit, which must be encrypted at all points to protect it.)
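As a minimal sketch of the encrypted-at-all-points principle, the Python standard library's `ssl` module can build a client context that refuses unencrypted or weakly encrypted connections. The function name and TLS version floor below are our own illustrative choices, not something prescribed by the article:

```python
import ssl

def make_transit_context() -> ssl.SSLContext:
    """Build a TLS context so data stays encrypted at every hop in transit.

    Illustrative sketch only; the name and version floor are assumptions.
    """
    ctx = ssl.create_default_context()  # certificate verification on by default
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2  # refuse older, weaker protocols
    return ctx

ctx = make_transit_context()
# verify_mode is CERT_REQUIRED: the peer must present a valid certificate,
# the in-transit equivalent of only handing gold to a registered, known carrier.
```

The same idea applies at rest stops in the analogy: every leg of the journey, not just the first, must use a context like this.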
Availability v security
The old-fashioned bank analogy demonstrates that data security cannot be viewed in isolation; it must be balanced with data availability.
User access control is essential to achieving that balance. Strict user roles need to be implemented, granting different levels of access to the different data domains within an organisation's data landscape.
Users are assigned one or more roles based on the data they need to fulfil their work. To ensure successful access governance, the roles must be continually updated as the data landscape changes, so that each role grants access only to the data it should.
Regular reviews of access are also required as users' responsibilities change and as they move between teams or leave the organisation.
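The role-based approach described above can be sketched in a few lines. The role names, data domains and helper functions here are illustrative assumptions, not a prescribed design:

```python
# Illustrative role-based access control: each role maps to the data
# domains it may access. Role and domain names are hypothetical.
ROLE_DOMAINS = {
    "billing_analyst": {"billing", "customer_contact"},
    "network_engineer": {"network_telemetry"},
    "data_steward": {"billing", "customer_contact", "network_telemetry"},
}

def can_access(user_roles: set[str], domain: str) -> bool:
    """A user may access a domain if any of their roles grants it."""
    return any(domain in ROLE_DOMAINS.get(role, set()) for role in user_roles)

def revoke_role(user_roles: set[str], role: str) -> set[str]:
    """Regular access reviews: remove roles a user no longer needs."""
    return user_roles - {role}

roles = {"billing_analyst"}
print(can_access(roles, "billing"))            # → True
print(can_access(roles, "network_telemetry"))  # → False
roles = revoke_role(roles, "billing_analyst")  # user moves teams
print(can_access(roles, "billing"))            # → False
```

Keeping `ROLE_DOMAINS` as a single mapping mirrors the point in the text: when the data landscape changes, there is one place to update what each role can reach.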
Building data governance frameworks
PBT Group assists organisations to build bespoke data governance frameworks by engaging with them through the lifecycle steps appropriate to their needs:
- Ascertain the organisation’s overriding data strategy or help to build one that fits the company’s data requirements.
- Define roles and responsibilities – Much of the success of a data governance project lies with the diverse and influential team embedding it.
- Perform a data maturity assessment to understand gaps and strengths within the organisation's data landscape, which will help to define the scope and direction of the data governance strategy as a whole.
- Define the data governance strategy.
- Mine metadata, data domains and master data settings to identify and prioritise all the data in the landscape.
- Document data flows to highlight how the data moves and where there are vulnerabilities.
- Create policies and standards that ensure the data is secure but still usable and available.
- Assign stewardship to communicate and enforce the policies and standards.
- Establish data controls – A mix of proactive (fixing at source) and reactive (data quality monitoring or auditing) controls, depending on the data's criticality.
- Communicate and educate all data users/owners to ensure the governance process is understood and adopted organisation-wide.
- Review and improve – To ensure lasting benefit, the governance framework needs regular reviewing and enhancing.
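The data-controls step above can be illustrated with a reactive control: rather than fixing records at source, a monitoring job flags quality problems for later remediation. The record fields and rules below are illustrative assumptions:

```python
from datetime import date

# Reactive data-quality control (a sketch): audit a batch of records and
# report findings rather than fixing at source. Field names are hypothetical.
def audit_records(records: list[dict]) -> list[str]:
    """Return a list of data-quality findings for later remediation."""
    findings = []
    for i, rec in enumerate(records):
        if not rec.get("email"):
            findings.append(f"record {i}: missing email")
        # ISO-format date strings compare correctly as plain strings
        if rec.get("dob", "") > date.today().isoformat():
            findings.append(f"record {i}: date of birth in the future")
    return findings

batch = [
    {"email": "a@example.com", "dob": "1990-04-12"},
    {"email": "", "dob": "2999-01-01"},
]
print(audit_records(batch))
# → ['record 1: missing email', 'record 1: date of birth in the future']
```

A proactive version of the same control would reject these records at the point of entry instead of logging them afterwards; which to use depends on the data's criticality, as the lifecycle step notes.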
An organisation’s data governance framework needs senior executive level support and a seamless overlap between the strategic and technical levels. Implementation is only stage one – it’s an iterative journey as an organisation evolves its data governance regime.
Most data breaches occur through internal errors or misuse of systems, which means the controls are defective. As data volumes increase, so do entry points, creating greater vulnerabilities. IoT-enabled devices – with sensors, processing ability, software and other technologies that connect and exchange data with other devices and systems – have expanded the attack surface further.
Organisations need to classify their data so they understand precisely what they have and where it is stored. When enabling access, it is safest to start with a zero trust security model: no person or device, inside or outside the organisation’s network, is granted access to IT systems or services until authenticated, and verification is continuous.
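In that spirit, a zero trust authorisation check denies by default and re-verifies on every request, combining the data classification with the caller's clearance. The session structure, classification levels and timeout below are illustrative assumptions:

```python
from datetime import datetime, timedelta, timezone

# Zero trust sketch: no request is trusted by default; every access is
# authenticated and re-verified. Session fields and levels are hypothetical.
MAX_SESSION_AGE = timedelta(minutes=15)
LEVELS = ["public", "internal", "confidential"]  # illustrative classifications

def authorise(session: dict, resource_classification: str, clearance: str) -> bool:
    """Deny unless the session is authenticated, fresh, and sufficiently cleared."""
    if not session.get("authenticated"):
        return False  # never trust by default
    age = datetime.now(timezone.utc) - session["issued_at"]
    if age > MAX_SESSION_AGE:
        return False  # continuous verification: stale sessions must re-authenticate
    return LEVELS.index(clearance) >= LEVELS.index(resource_classification)

fresh = {"authenticated": True, "issued_at": datetime.now(timezone.utc)}
print(authorise(fresh, "internal", "confidential"))  # → True
print(authorise(fresh, "confidential", "internal"))  # → False
```

The check runs on every request, not once at login – the software equivalent of asking for the vault combination each time, rather than leaving the door propped open.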
Until sufficient protocols and controls are in place, organisations remain in the modern equivalent of the wild west days of the bank with the underground vault.
*Andy Roberts has more than 20 years’ experience in data engineering, focusing on systems construction, data architecture, data governance, data integration and analysis in Southern Africa, covering finance, insurance and telcos. In early 2023, Andy is relocating to Melbourne to expand PBT Australia’s data capabilities. You can contact him on Andy.Roberts@pbtgroup.com.au.