Addressing the issue of data leakage from the cloud

Paolo Passeri, cyber intelligence principal at Netskope, talks about the issue of data leakage within the cloud.

I cannot be alone in my frustration over the constant reports of data leakage from the cloud. It seems a day cannot go by without another incident being reported; yet none of them seems enough to serve as a rallying cry for change.

The number and scale of leaks are both increasing, as is the potential impact on those affected: both the organisations leaking the data and the individuals to whom the data pertains. Rather than judging whether a leak is worth media attention by the fame of the brand involved, headlines should pay more attention to the volume and type of data exposed.

Novaestrat, an Ecuadorian data analytics and marketing company, recently left an unsecured Elasticsearch server exposed, potentially compromising data on nearly 21 million individuals. That figure includes duplicate records and obsolete entries, but considering that the population of Ecuador is 16.6 million, it is a huge number!

Leaky AWS S3 buckets are probably the most common crime scene for unsecured data in the cloud, and LionAir is the latest S3 exposure example. The airline left exposed AWS buckets containing personally identifiable information, including passport numbers, belonging to millions of passengers. Big names often lurk behind lesser-known organisations in the supply chain in these incidents. In another recent example, data company Attunity left a trove of data belonging to Ford, Netflix and TD Bank in publicly accessible S3 buckets.

I could go on and mention many more examples, but this exercise won’t answer the fundamental question: why do these incidents continue to happen?

The fourth Cloud Security Alliance Top Threats report (2019) shows that many of us are not blind to the risks: 241 industry experts placed data breaches, misconfiguration of cloud infrastructure, and a lack of cloud security architecture and strategy as the top three risks of cloud usage. Of course, not all enterprise cloud consumers are experts, but the majority of those responsible for cloud infrastructure are, at least, specialists. So, assuming the risks are understood, I believe that too many organisations still do not completely understand the “shared responsibility” model. Shared responsibility establishes where the responsibility of the cloud service provider ends (security “of” the cloud) and where the responsibility of the customer begins (security “in” the cloud). A tiny preposition makes a huge difference!
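
To make that dividing line concrete, here is a minimal sketch of the customer’s side of the bargain, written in Python with boto3. It is an illustration under stated assumptions rather than an excerpt from any provider’s documentation: “example-bucket” is a hypothetical name, and it assumes boto3 is installed and AWS credentials are configured.

    # The provider secures the S3 service itself (security "of" the cloud);
    # blocking public access to a specific bucket is the customer's duty
    # (security "in" the cloud). "example-bucket" is a hypothetical name.
    import boto3

    s3 = boto3.client("s3")
    s3.put_public_access_block(
        Bucket="example-bucket",
        PublicAccessBlockConfiguration={
            "BlockPublicAcls": True,        # reject requests that attach public ACLs
            "IgnorePublicAcls": True,       # ignore any existing public ACLs
            "BlockPublicPolicy": True,      # reject public bucket policies
            "RestrictPublicBuckets": True,  # limit access when a policy is public
        },
    )

Four booleans are all it takes: the provider supplies the control, but only the customer can decide to turn it on.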

In an ideal world, this model would be quite straightforward, but unfortunately, our world is far from ideal (and not only in terms of information security). Too many organisations ignore their duties relating to the security of data “in” the cloud. Just this week, a new piece of research revealed that only 32% of organisations believe that protecting data in the cloud is their own responsibility. In many instances, this shared responsibility is further ‘clouded’ by the complexity of the supply chain, as the examples above show. It is often a third party, rather than the data owner, that should have been enforcing the measures necessary to secure the data “in” the cloud. Too often there is a chain of implicit trust, misplaced with devastating consequences. These breaches are further confirmation that supply chain security should be a core element of any security strategy, and a cloud security strategy is no exception.

User education plays an important role in surmounting cloud security issues, but there are also multiple tools that can automate the policing of cloud use and help educate users as they go. These tools are generally straightforward: our Continuous Security Assessment includes a predefined rule stating “Ensure S3 Bucket is not publicly accessible”, and the ransacking of records on public Elasticsearch databases can be prevented with a simple rule that raises an alert when TCP port 9200 is exposed, so that the appropriate remediation can be enforced quickly. A simple configuration like this could have prevented a massive breach like the one in Ecuador, and demonstrates that the solution is sometimes more straightforward than the problem first appears.
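
To show what such automated checks involve, here is a minimal sketch of the two rules just described, again in Python with boto3. It is an illustrative approximation, not Netskope’s actual rule engine; the bucket name and host are hypothetical, and the S3 check inspects only ACLs, whereas a real assessment would also evaluate bucket policies and account-level settings.

    import socket

    import boto3

    # ACL grantee URI that AWS uses to mean "everyone on the internet"
    PUBLIC_URI = "http://acs.amazonaws.com/groups/global/AllUsers"

    def bucket_is_public(bucket_name: str) -> bool:
        """Flag a bucket whose ACL grants access to all users (ACL check only)."""
        acl = boto3.client("s3").get_bucket_acl(Bucket=bucket_name)
        return any(
            grant.get("Grantee", {}).get("URI") == PUBLIC_URI
            for grant in acl["Grants"]
        )

    def port_is_exposed(host: str, port: int = 9200, timeout: float = 3.0) -> bool:
        """Flag a host that accepts TCP connections on the Elasticsearch port."""
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    if __name__ == "__main__":
        if bucket_is_public("example-bucket"):  # hypothetical bucket name
            print("ALERT: S3 bucket is publicly accessible")
        if port_is_exposed("db.example.com"):   # hypothetical host
            print("ALERT: TCP 9200 reachable; Elasticsearch may be exposed")

Checks like these run in seconds, which is exactly why there is so little excuse for the misconfigurations behind the breaches above.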

Despite the speed at which organisations are now moving to the cloud, security mindsets and strategies (in a nutshell, the concept of the perimeter) are still on-premise. The fault lies both with organisations as a whole and with the individuals who continue to ignore the fact that a misconfiguration in the cloud can potentially expose data to more than 3.2 billion people. In the past, we implored people not to leave post-it notes on their monitors where a password would be visible to all their office colleagues. My sixth sense tells me that issue hasn’t gone away, but it now pales in comparison to whole data sets left in publicly accessible cloud buckets, unprotected from any passer-by on the internet.

Organisations need to redefine the concept of data security in the cloud, enforcing security controls closer to the data and reimagining their perimeter in a way that is data-centric, cloud smart, and fast enough that it does not hinder productivity or innovation.

Written by Paolo Passeri, cyber intelligence principal at Netskope
