
CSIRO: The internet is ‘fundamentally broken’ when it comes to trust and malicious threats

May 24, 2019 · 2 MIN READ
The CSIRO’s Data61 says around half of the internet’s most popular websites are at risk of malicious activity because of their dependence on third party services for ads, tracking and analytics.

The problem is so profound, it’s now at the point where “the trust model of today’s World Wide Web is fundamentally broken”, experts from the data arm of Australia’s national science agency argue.

Professor Dali Kaafar, Information Security and Privacy research leader at Data61, who is also Scientific Director of Optus Macquarie University Cyber Security Hub, presented the findings of a research paper “The Chain of Implicit Trust: An Analysis of the Web Third-party Resources Loading” at a conference in the US last week.

The tracking services used by leading websites are so varied, and so rarely subjected to proper security and privacy scrutiny, that they ultimately undermine the implicit trust users place in the original website, he argues.

“Almost all websites today are heavily embedded with tracking components. For every website you visit, you could be unknowingly loading content from potentially malicious parties and leaving a trail of your internet activity,” Professor Kaafar said.

The research from Prof. Kaafar and his team found that 1.2% of third parties linked to the top 200,000 websites were suspicious.

JavaScript, the scripting language widely used to improve the user experience of the web, represents the greatest risk of malicious activity, as these scripts run silently in the browser without the user's knowledge.

“The potential threat should not be underestimated, as suspicious content loaded on browsers can open the way to further exploits including Distributed Denial of Service attacks which disrupt traffic to websites, and ransomware campaigns which cost the world more than US$8 billion in 2018,” Professor Kaafar said.

“Worryingly, the original or ‘first party’ websites have little to no visibility of where these resources originate. This points to a lack of ‘trustability’ of content on the web, and the need to better regulate the web by introducing standardised security measures and the notion of explicit trust.”

The research found that while a majority (84.91%) of websites have short chains (fewer than three levels of third-party dependencies), other sites had chains more than 30 levels deep.
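To make the idea of a dependency chain concrete, here is a minimal sketch, not the paper's methodology, that recursively follows script references through a hypothetical in-memory site map (`PAGES`, the URLs, and the helper `chain_depth` are all illustrative); a real measurement would fetch each resource over HTTP and parse it properly.

```python
import re

# Hypothetical in-memory "web": each URL maps to the content it serves.
# In a real crawl, these lookups would be live HTTP fetches.
PAGES = {
    "https://news.example": '<script src="https://cdn.example/lib.js"></script>',
    "https://cdn.example/lib.js": 'loadScript("https://tracker.example/t.js")',
    "https://tracker.example/t.js": 'loadScript("https://unknown.example/x.js")',
    "https://unknown.example/x.js": "",
}

# Naive pattern for script URLs embedded in a resource body.
SRC_RE = re.compile(r"https://[a-z.]+/[a-z.]*\.js")

def chain_depth(url, seen=None):
    """Return the depth of the longest implicit-trust chain rooted at `url`:
    0 if the resource loads nothing further, 1 + the deepest child otherwise."""
    seen = seen or set()
    if url in seen:  # guard against circular references
        return 0
    seen.add(url)
    deps = SRC_RE.findall(PAGES.get(url, ""))
    if not deps:
        return 0
    return 1 + max(chain_depth(d, seen) for d in deps)

print(chain_depth("https://news.example"))  # → 3
```

Each level of the chain is a party the site never vetted directly: the first-party page trusts the CDN, the CDN's script trusts the tracker, and the tracker trusts whatever it loads next.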

“Of course, the most commonly implicitly trusted third-parties are well known operators (e.g., doubleclick.net), but we also observed various less known implicit third-parties,” the paper concludes.

The researchers ran the suspicious JavaScript they found inside sandboxed environments, isolated machines where malicious code can execute without causing harm, and observed its behaviour.

“We witnessed extensive download activities, much of which consisted of dropper files and malware, which was being installed on the machine. It was particularly worrying to see that JavaScript resources loaded at level ≥ 2 in the dependency chain tended to have more aggressive properties, particularly as exhibited by their higher VTscore,” the paper said.

“This exposes the need to tighten the loose control over indirect resource loading and implicit trust: it creates exposure to risks such as malware distribution, SEO poisoning, malvertising and exploit kits redirection. We argue that ameliorating this can only be achieved through transparency mechanisms that allow web developers to better understand the resources on their webpages (and the related risks).”
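One standardised mechanism that already exists for expressing this kind of explicit trust is the Content-Security-Policy HTTP header, which lets a first-party site whitelist the origins its pages may load scripts from (the CDN domain below is a placeholder):

```http
Content-Security-Policy: script-src 'self' https://trusted-cdn.example
```

With such a policy in place, the browser refuses to execute scripts from any origin not on the list, which cuts off deeper links in the chain, though it cannot vet what a whitelisted origin itself chooses to serve.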

Resolving the security issue created by dependency chains will require additional research, Prof. Kaafar said, and the support of the World Wide Web Consortium, the predominant organisation focused on developing web standards, as well as web ‘hypergiants’.

To protect yourself in the meantime, he suggests installing simple web browser extensions such as ad- and JavaScript-blockers to limit exposure to malicious activity.
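As an illustration, content blockers that accept Adblock-style filter syntax can block third-party scripts from a specific domain with a one-line rule (the domain here is a placeholder, not one named in the research):

```
||tracker.example^$script,third-party
```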