File hosting services and their data leakage risks

As part of my day-to-day role, I am finding that the use of online file hosting services is increasing significantly on computer networks, and in most cases these networks host highly sensitive data. In one instance, a network manager had assumed that his web filtering system would block access to these sites, but this was not the case.

A file hosting service, or online file storage provider, is an Internet hosting service specifically designed to host static content, typically large files that are not web pages. End users typically access these services over HTTP or by installing client software. These hosting sites are used for many purposes, but the most common ones are:

  1. Users transferring large files externally. In most networks, users are blocked from sending large files via email. Sometimes an end user may want to work on a file at home, and so sends it out for retrieval later.
  2. Users downloading large files such as video. Some file hosting sites offer auto-resume if a download is cut off for whatever reason, making them particularly useful for video downloads.
  3. Users accessing personal data, such as photos, music and videos, hosted online.
  4. Some of these services are marketed as backup services in the cloud. Users subscribe to back up their most important, and usually their most sensitive, data.

For the purposes of this blog entry, I am going to look at two very popular types of services, which operate in slightly different ways.

  1. Websites, which allow direct uploading and downloading
  2. Client-based systems, which involve the installation of software on the end user's PC.

Most online file hosting sites provide a number of service tiers, starting with a free one that usually allows subscribers to send files of up to 100MB. The premium pay options usually allow for the sending of unlimited file sizes. They work by getting the sender to enter the recipients' e-mail addresses, attach the file, and then send it to the external hosting site. The recipients receive an e-mail notification with a URL that lets them download the file. The more popular services default to using encryption via HTTPS.

Client-based systems usually involve the installation of agent software. Users drag and drop files into specific folders on their systems, and these folders synchronize automatically with the online services. Data is transferred over secure HTTPS connections.
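At its core, the client agent's job is to notice which files in the synchronized folder have changed since the last pass and push those over HTTPS. A minimal sketch of that change-detection step, using content hashes (an assumption for illustration; real agents may use timestamps, block-level deltas, or filesystem notifications):

```python
import hashlib
import os

def changed_files(folder, known_hashes):
    """Return relative paths whose content differs from the hash recorded
    on the previous sync pass, updating known_hashes as we go.
    Anything returned would be queued for upload over HTTPS."""
    queue = []
    for root, _dirs, names in os.walk(folder):
        for name in names:
            path = os.path.join(root, name)
            rel = os.path.relpath(path, folder)
            with open(path, "rb") as fh:
                digest = hashlib.sha256(fh.read()).hexdigest()
            if known_hashes.get(rel) != digest:
                queue.append(rel)
                known_hashes[rel] = digest
    return queue
```

The important point for network managers is that each pass happens silently in the background: whatever lands in the folder goes out, with no further user action.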

The more popular services offer two types of folder, private and public. Private folders are only accessible by the user who created them. Any files or folders created within the public folders are accessible by anyone with the correct link. On Windows operating systems, the shared folder is usually located under the user's "My Documents" folder.

So what's the problem?

These services are very useful both for sending large files and for providing a location where your contacts can download your shared files. Some also use them for online backup of important data. However, the ease of use and availability of these services introduce serious security holes into your network, and can also result in massive bandwidth consumption on your Internet gateways.

From a security point of view, users may be sending sensitive data to online folders or directly to external contacts. In the worst case, they place sensitive data within publicly accessible folders. The traffic will just appear as regular web (HTTPS) traffic on the network, and most firewalls will be oblivious to the actual content. We have even heard reports of users using their own mobile broadband connections to get data off the network and onto file sharing sites, which circumvents any filtering that may have been put in place on the corporate firewall or proxy.

Users may also 'accidentally' move sensitive folders and files into locations that are automatically synchronized with external, publicly accessible locations, and it is not obvious that this has happened.

These applications can result in massive amounts of data being sent in and out through Internet gateways. A new subscriber to a service can suddenly sync 2GB of data, causing serious congestion for other users sharing the same network. This becomes even more of a problem if the end users are on the far side of an expensive WAN (wide area network) link.
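To put some numbers on that congestion claim, here is a back-of-the-envelope calculation (assuming the sync saturates the link, with no allowance for protocol overhead, and the illustrative link speeds are my own):

```python
def transfer_hours(size_gb, link_mbps):
    """Hours to move size_gb gigabytes over a link of link_mbps megabits
    per second, assuming the transfer saturates the link."""
    bits = size_gb * 8 * 1000**3           # decimal GB -> bits
    return bits / (link_mbps * 1_000_000) / 3600

# A 2GB initial sync ties up a 2Mbps WAN link for over two hours,
# but takes under three minutes on a 100Mbps line.
print(round(transfer_hours(2, 2), 2))      # ~2.22 hours
print(round(transfer_hours(2, 100) * 60, 1))  # ~2.7 minutes
```

On a shared branch-office link, two hours of saturation is very visible to everyone else on the network.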

Web filtering can be useful for blocking access to these services, but I would also recommend that you implement some sort of monitoring system on your network, as users may seek out other services if their favorite is blocked. A monitoring system may be as simple as regular log file reviews, or more automated, with dedicated systems to report on this activity. The tech savvy may even use external proxies or anonymizers to gain access to their file hosting service. In part 2 of this blog entry, I am going to discuss ways you can detect and monitor this activity on your network, so stay tuned ...
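As a small taste of the log file review approach, the sketch below flags requests to known file hosting services. It assumes a Squid-style proxy access log (field 3 is the client IP, field 7 is the URL, or host:port for HTTPS CONNECT tunnels) and the domain list is purely illustrative, not a recommendation:

```python
from urllib.parse import urlsplit

# Illustrative list only -- substitute the services seen on your own network.
HOSTING_DOMAINS = {"rapidshare.com", "megaupload.com", "dropbox.com"}

def flag_hosting_hits(log_lines, domains=HOSTING_DOMAINS):
    """Return (client_ip, host) pairs for requests to known file hosting
    domains, assuming Squid-style log lines."""
    hits = []
    for line in log_lines:
        fields = line.split()
        if len(fields) < 7:
            continue
        client, url = fields[2], fields[6]
        if "://" in url:
            host = urlsplit(url).hostname or ""
        else:
            host = url.split(":")[0]   # HTTPS CONNECT form: host:443
        if any(host == d or host.endswith("." + d) for d in domains):
            hits.append((client, host))
    return hits
```

Even this crude match on the destination host works for HTTPS traffic, because the proxy logs the tunnel endpoint even though it cannot see the encrypted payload.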


Darragh Delaney is head of technical services at NetFort Technologies.  As Director of Technical Services and Customer Support, he interacts on a daily basis with NetFort customers and is responsible for the delivery of a high quality technical and customer support service.

Copyright © 2011 IDG Communications, Inc.
