Why Data Tokenization Could Be the Answer to Government Security Prayers
With a growing number of breaches, there is an urgent need for more advanced measures to protect public sector data.
- By Warren Poschman
- May 12, 2020
This year, we have witnessed a variety of government data breaches around the world: the Quebec government breach that exposed over 350,000 teacher records, the Israeli voter registry leak that affected 6.5 million citizens, and the Dutch government breach in which personal information pertaining to 7.9 million organ donors was exposed.
The public sector, and governmental institutions in particular, has clearly had a difficult time of late when it comes to high-profile cybersecurity incidents. Research has found that the number of leaked government and individual politician records increased by 278 percent from 2019 to 2020, and according to Verizon’s latest Data Breach Investigations Report, the public sector’s share of breaches rose from 14 to 16 percent in the past year. With state actors and cybercriminals looking for any avenue to exploit, security can no longer be an afterthought, especially as governmental departments hold vast troves of critical data on their citizens.
Given this drastic increase in data breaches, it is apparent that more advanced measures are needed to protect data and prevent these devastating, and evidently increasing, lapses in security.
It is not necessary to explain why national and local governments possess so much personal data belonging to their constituents, nor why keeping that data safe from bad actors matters. It is worth stating, however, that if this information falls into the wrong hands, not only can national security be jeopardized, but public trust in a government’s ability to govern safely can also be lost. The stakes are high.
Government databases are a goldmine in a hacker’s eyes. They store information on education, health, finance and employment for a range of different purposes. These libraries of data are often analysed and used to make decisions, but it is well known that even between governmental departments, data sharing can be slow due to outdated siloed practices, confidentiality agreements and bureaucracy.
Rather than sharing data slowly and inefficiently, while leaving it vulnerable to exposure or abuse, there is a solution that renders private data meaningless to unauthorized actors while still providing analytical insight to privileged users: data tokenization.
Tokenization can provide the security key
Data tokenization is an efficient method of protecting sensitive data. It substitutes a sensitive data element, such as a national ID number, with a non-sensitive equivalent, known as a token, that has no exploitable value on its own. By tokenizing critical data, the risk of exposing confidential information is minimized while the ability to analyze and extract insights is maintained.
One of the key benefits of tokenization is that data is protected throughout its entire lifecycle: if a bad actor were to come across it, it would be worthless and unidentifiable to them.
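To make the idea concrete, below is a minimal sketch of vault-based tokenization, one common approach. The `TokenVault` class and its methods are illustrative assumptions, not any specific product’s API; production systems rely on hardened, access-controlled vaults or vetted format-preserving encryption.

```python
import secrets

# A minimal sketch of vault-based tokenization. "TokenVault" is an
# illustrative name, not a real product API; production systems use
# hardened, access-controlled vaults or format-preserving encryption.

class TokenVault:
    """Maps sensitive values to random tokens, and back for authorized users."""

    def __init__(self):
        self._token_to_value = {}
        self._value_to_token = {}

    def tokenize(self, value: str) -> str:
        # Deterministic: the same value always yields the same token, so
        # tokenized datasets can still be joined, grouped, and counted.
        if value in self._value_to_token:
            return self._value_to_token[value]
        token = secrets.token_hex(8)  # random; reveals nothing about the value
        self._value_to_token[value] = token
        self._token_to_value[token] = value
        return token

    def detokenize(self, token: str) -> str:
        # Only privileged services with vault access can reverse a token.
        return self._token_to_value[token]

vault = TokenVault()
token = vault.tokenize("123-45-6789")
assert vault.tokenize("123-45-6789") == token  # deterministic, joins preserved
print(token)                    # e.g. 'e3b0c44298fc1c14' -- worthless if leaked
print(vault.detokenize(token))  # '123-45-6789' for authorized users only
```

The key design point is that the token itself carries no information about the original value: anyone who steals the tokenized dataset without access to the vault holds nothing of use.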
Benefits and potential use cases
Indeed, there is more to tokenization than simply allowing departments to share sensitive information without fear of it being exploited. For example:
- Tokenization can help with the development and testing of new software and applications. It is common for government departments to engage third-party vendors to build internal systems, as this can be more cost effective, but issues arise when the software or application requires real sensitive data as input. A prime example could be the US Department of Health and Human Services analyzing data for health programs like Medicare: by tokenizing personal health data, analytics can still be run without putting any personal information at risk, because tokenization de-identifies the data while preserving its analytical value (a brief sketch of how format-preserving tokens keep test and analytics systems working follows this list). This also helps ensure compliance with regulations such as the Health Insurance Portability and Accountability Act (HIPAA).
- New staff joining government, as in most organizations, are typically put through a probationary period while being trained on internal systems. To prevent the exposure of any critical data, it is worthwhile for the institution to tokenize that data so that new employees can practice without viewing real sensitive records or inadvertently disclosing them through inexperience.
- What about the highly sensitive data that already resides within the government perimeter? With digital transformation, governmental institutions have begun storing and archiving data offsite and in the cloud, which increases the risk of exposure. Implementing data tokenization protects this information throughout its lifecycle, including while at rest and in archives.
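As referenced in the use cases above, tokens can be made deterministic and format-preserving so that test systems and analytics pipelines accept them unchanged. The sketch below assumes an HMAC-based digit substitution purely for illustration; real deployments would use a vetted standard such as NIST FF1 format-preserving encryption with keys from a managed key store.

```python
import hashlib
import hmac

# A hypothetical sketch of deterministic, format-preserving tokenization.
# The HMAC-based digit mapping here is purely illustrative and one-way
# (fine for test data, not usable where detokenization is required).

SECRET_KEY = b"replace-with-key-from-kms"  # assumption: key managed externally

def tokenize_digits(value: str, key: bytes = SECRET_KEY) -> str:
    """Replace each digit with a pseudorandom digit while keeping length
    and separators, so downstream systems accept the token unchanged."""
    digest = hmac.new(key, value.encode(), hashlib.sha256).digest()
    out, i = [], 0
    for ch in value:
        if ch.isdigit():
            out.append(str(digest[i % len(digest)] % 10))
            i += 1
        else:
            out.append(ch)  # preserve formatting such as '-'
    return "".join(out)

# Equal inputs map to equal tokens, so counts, group-bys, and joins still
# work on the tokenized dataset, and the token's shape matches the original.
print(tokenize_digits("123-45-6789"))  # e.g. '481-90-2635' (format preserved)
```

Because the token keeps the shape of the original value, vendor-built software, training environments, and archived datasets can all operate on tokens without modification.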
With a data-centric, holistic approach to security that includes the added protection of tokenization, governmental departments and public officials can begin to share and analyze sensitive data securely. Combined with publicly available information, this allows departments to derive the insights they need to make better decisions while remaining assured that the privacy and security of personally identifiable information (PII) is maintained.