diff --git a/projects/torbrowser/design/NewCookieManager.png b/projects/torbrowser/design/NewCookieManager.png new file mode 100644 index 00000000..97a0b408 Binary files /dev/null and b/projects/torbrowser/design/NewCookieManager.png differ diff --git a/projects/torbrowser/design/index.html.en b/projects/torbrowser/design/index.html.en index 30eb5484..7448bfe9 100644 --- a/projects/torbrowser/design/index.html.en +++ b/projects/torbrowser/design/index.html.en @@ -1,12 +1,10 @@ - -The Design and Implementation of the Tor Browser [DRAFT]

The Design and Implementation of the Tor Browser [DRAFT]

Dec 28 2011


Table of Contents

1. Introduction
1.1. Adversary Model
2. Design Requirements and Philosophy
2.1. Security Requirements
2.2. Privacy Requirements
2.3. Philosophy
3. Implementation
3.1. Proxy Obedience
3.2. State Separation
3.3. Disk Avoidance
3.4. Application Data Isolation
3.5. Cross-Origin Identifier Unlinkability
3.6. Cross-Origin Fingerprinting Unlinkability
3.7. Long-Term Unlinkability via "New Identity" button
3.8. Click-to-play for plugins and invasive content
3.9. Description of Firefox Patches
4. Packaging
4.1. Build Process Security
4.2. External Addons
4.3. Pref Changes
4.4. Update Security
5. Testing
5.1. Single state testing

1. Introduction

+ +The Design and Implementation of the Tor Browser [DRAFT]

-This document describes the adversary model, -design requirements, -implementation, packaging and testing -procedures of the Tor Browser. It is -current as of Tor Browser 2.2.35-1 and Torbutton 1.4.5. +This document describes the adversary model, +design requirements, and implementation of the Tor Browser. It is current as of Tor Browser 2.3.25-4 +and Torbutton 1.5.0.

@@ -15,182 +13,27 @@ describe a reference implementation of a Private Browsing Mode that defends against active network adversaries, in addition to the passive forensic local adversary currently addressed by the major browsers. -

1.1. Adversary Model

+

-A Tor web browser adversary has a number of goals, capabilities, and attack -types that can be used to guide us towards a set of requirements for the -Tor Browser. Let's start with the goals. +The Tor Browser is based on Mozilla's Extended +Support Release (ESR) Firefox branch. We have a series of patches against this browser to +enhance privacy and security. Browser behavior is additionally augmented +through the Torbutton +extension, though we are in the process of moving this +functionality into direct Firefox patches. We also change +a number of Firefox preferences from their defaults. -

Adversary Goals

  1. Bypassing proxy settings

    The adversary's primary goal is direct compromise and bypass of -Tor, causing the user to directly connect to an IP of the adversary's -choosing.

  2. Correlation of Tor vs Non-Tor Activity

    If direct proxy bypass is not possible, the adversary will likely -happily settle for the ability to correlate something a user did via Tor with -their non-Tor activity. This can be done with cookies, cache identifiers, -javascript events, and even CSS. Sometimes the fact that a user uses Tor may -be enough for some authorities.

  3. History disclosure

    -The adversary may also be interested in history disclosure: the ability to -query a user's history to see if they have issued certain censored search -queries, or visited censored sites. -

  4. Location information

    +

    -Location information such as timezone and locality can be useful for the -adversary to determine if a user is in fact originating from one of the -regions they are attempting to control, or to zero-in on the geographical -location of a particular dissident or whistleblower. +To help protect against potential Tor Exit Node eavesdroppers, we include +HTTPS-Everywhere. To +provide users with optional defense-in-depth against Javascript and other +potential exploit vectors, we also include NoScript. To protect against +PDF-based Tor proxy bypass and to improve usability, we include the PDF.JS +extension. We also modify several +extension preferences from their defaults. -

  5. Miscellaneous anonymity set reduction

    - -Anonymity set reduction is also useful in attempting to zero in on a -particular individual. If the dissident or whistleblower is using a rare build -of Firefox for an obscure operating system, this can be very useful -information for tracking them down, or at least tracking their activities. - -

  6. History records and other on-disk -information

    -In some cases, the adversary may opt for a heavy-handed approach, such as -seizing the computers of all Tor users in an area (especially after narrowing -the field by the above two pieces of information). History records and cache -data are the primary goals here. -

Adversary Capabilities - Positioning

-The adversary can position themselves at a number of different locations in -order to execute their attacks. -

  1. Exit Node or Upstream Router

    -The adversary can run exit nodes, or alternatively, they may control routers -upstream of exit nodes. Both of these scenarios have been observed in the -wild. -

  2. Ad servers and/or Malicious Websites

    -The adversary can also run websites, or more likely, they can contract out -ad space from a number of different ad servers and inject content that way. For -some users, the adversary may be the ad servers themselves. It is not -inconceivable that ad servers may try to subvert or reduce a user's anonymity -through Tor for marketing purposes. -

  3. Local Network/ISP/Upstream Router

    -The adversary can also inject malicious content at the user's upstream router -when they have Tor disabled, in an attempt to correlate their Tor and Non-Tor -activity. -

  4. Physical Access

    -Some users face adversaries with intermittent or constant physical access. -Users in Internet cafes, for example, face such a threat. In addition, in -countries where simply using tools like Tor is illegal, users may face -confiscation of their computer equipment for excessive Tor usage or just -general suspicion. -

Adversary Capabilities - Attacks

- -The adversary can perform the following attacks from a number of different -positions to accomplish various aspects of their goals. It should be noted -that many of these attacks (especially those involving IP address leakage) are -often performed by accident by websites that simply have Javascript, dynamic -CSS elements, and plugins. Others are performed by ad servers seeking to -correlate users' activity across different IP addresses, and still others are -performed by malicious agents on the Tor network and at national firewalls. - -

  1. Read and insert identifiers

    - -The browser contains multiple facilities for storing identifiers that the -adversary creates for the purposes of tracking users. These identifiers are -most obviously cookies, but also include HTTP auth, DOM storage, cached -scripts and other elements with embedded identifiers, client certificates, and -even TLS Session IDs. - -

    - -An adversary in a position to perform MITM content alteration can inject -document content elements to both read and inject cookies for arbitrary -domains. In fact, even many "SSL secured" websites are vulnerable to this sort of -active -sidejacking. In addition, the ad networks of course perform tracking -with cookies as well. - -

  2. Fingerprint users based on browser -attributes

    - -There is an absurd amount of information available to websites via attributes -of the browser. This information can be used to reduce anonymity set, or even -uniquely fingerprint individual users. Fingerprinting is an intimidating -problem to attempt to tackle, especially without a metric to determine or at -least intuitively understand and estimate which features will most contribute -to linkability between visits. - -

    - -The Panopticlick study -done by the EFF uses the actual entropy - the number of identifying -bits of information encoded in browser properties - as this metric. Their -result data -is definitely useful, and the metric is probably the appropriate one for -determining how identifying a particular browser property is. However, some -quirks of their study means that they do not extract as much information as -they could from display information: they only use desktop resolution and do -not attempt to infer the size of toolbars. In the other direction, they may be -over-counting in some areas, as they did not compute joint entropy over -multiple attributes that may exhibit a high degree of correlation. Also, new -browser features are added regularly, so the data should not be taken as -final. - -

    - -Despite the uncertainty, all fingerprinting attacks leverage the following -attack vectors: - -

    1. Observing Request Behavior

      - -Properties of the user's request behavior comprise the bulk of low-hanging -fingerprinting targets. These include: User agent, Accept-* headers, pipeline -usage, and request ordering. Additionally, the use of custom filters such as -AdBlock and other privacy filters can be used to fingerprint request patterns -(as an extreme example). - -

    2. Inserting Javascript

      - -Javascript can reveal a lot of fingerprinting information. It provides DOM -objects such as window.screen and window.navigator to extract information -about the useragent. - -Also, Javascript can be used to query the user's timezone via the -Date() object, WebGL can -reveal information about the video card in use, and high precision timing -information can be used to fingerprint the CPU and -interpreter speed. In the future, new JavaScript features such as -Resource -Timing may leak an unknown amount of network timing related -information. - - - -

    3. Inserting Plugins

      - -The Panopticlick project found that the mere list of installed plugins (in -navigator.plugins) was sufficient to provide a large degree of -fingerprintability. Additionally, plugins are capable of extracting font lists, -interface addresses, and other machine information that is beyond what the -browser would normally provide to content. In addition, plugins can be used to -store unique identifiers that are more difficult to clear than standard -cookies. Flash-based -cookies fall into this category, but there are likely numerous other -examples. Beyond fingerprinting, plugins are also abysmal at obeying the proxy -settings of the browser. - - -

    4. Inserting CSS

      - -CSS media -queries can be inserted to gather information about the desktop size, -widget size, display type, DPI, user agent type, and other information that -was formerly available only to Javascript. - -

  3. Remotely or locally exploit browser and/or -OS

    - -Last, but definitely not least, the adversary can exploit either general -browser vulnerabilities, plugin vulnerabilities, or OS vulnerabilities to -install malware and surveillance software. An adversary with physical access -can perform similar actions. Regrettably, this last attack capability is -outside of our ability to defend against, but it is worth mentioning for -completeness. The Tails -system however can provide some limited defenses against this -adversary. - -

2. Design Requirements and Philosophy

+

The Tor Browser Design Requirements are meant to describe the properties of a Private Browsing Mode that defends against both network and local forensic @@ -214,9 +57,9 @@ browser distribution. The key words "MUST", "MUST NOT", "REQUIRED", "SHALL", "SHALL NOT", "SHOULD", "SHOULD NOT", "RECOMMENDED", "MAY", and "OPTIONAL" in this document are to be interpreted as described in - RFC 2119. + RFC 2119. -

2.1. Security Requirements

+

The security requirements are primarily concerned with ensuring the safe use of Tor. Violations in these properties typically result in serious risk for @@ -224,13 +67,13 @@ the user in terms of immediate deanonymization and/or observability. With respect to browser support, security requirements are the minimum properties in order for Tor to support the use of a particular browser. -

  1. Proxy +

    1. Proxy Obedience

      The browser -MUST NOT bypass Tor proxy settings for any content.

    2. State +MUST NOT bypass Tor proxy settings for any content.

    3. State Separation

      The browser MUST NOT provide any stored state to the content window from other browsers or other browsing modes, including shared state from plugins, machine identifiers, and TLS session state. -

    4. Disk +

    5. Disk Avoidance

      The browser MUST NOT write any information that is derived from or that @@ -238,7 +81,7 @@ reveals browsing activity to the disk, or store it in memory beyond the duration of one browsing session, unless the user has explicitly opted to store their browsing history information to disk. -

    6. Application Data +

    7. Application Data Isolation

      The components involved in providing private browsing MUST be self-contained, @@ -253,7 +96,7 @@ to permissions issues with access to swap, implementations MAY choose to leave it out of scope, and/or leave it to the Operating System/platform to implement ephemeral-keyed encrypted swap. -

2.2. Privacy Requirements

+

The privacy requirements are primarily concerned with reducing linkability: the ability for a user's activity on one site to be linked with their activity @@ -264,13 +107,13 @@ to prefer one browser over another.

For the purposes of the unlinkability requirements of this section as well as -the descriptions in the implementation +the descriptions in the implementation section, a url bar origin means at least the second-level DNS name. For example, for mail.google.com, the origin would be google.com. Implementations MAY, at their option, restrict the url bar origin to be the entire fully qualified domain name. -

  1. Cross-Origin +

    1. Cross-Origin Identifier Unlinkability

      User activity on one url bar origin MUST NOT be linkable to their activity in @@ -279,16 +122,17 @@ interaction or approval. This requirement specifically applies to linkability from stored browser identifiers, authentication tokens, and shared state. The requirement does not apply to linkable information the user manually submits to sites, or due to information submitted during manual link traversal. This -functionality SHOULD NOT interfere with federated login in a substantial way. +functionality SHOULD NOT interfere with interactive, click-driven federated +login in a substantial way. -

    2. Cross-Origin +

    3. Cross-Origin Fingerprinting Unlinkability

      User activity on one url bar origin MUST NOT be linkable to their activity in any other url bar origin by any third party. This property specifically applies to linkability from fingerprinting browser behavior. -

    4. Long-Term +

    5. Long-Term Unlinkability

      The browser SHOULD provide an obvious, easy way to remove all of its @@ -296,12 +140,12 @@ authentication tokens and browser state and obtain a fresh identity. Additionally, the browser SHOULD clear linkable state by default automatically upon browser restart, except at user option. -

2.3. Philosophy

+

In addition to the above design requirements, the technology decisions about Tor Browser are also guided by some philosophical positions about technology. -

  1. Preserve existing user model

    +

    1. Preserve existing user model

      The existing way that the user expects to use a browser must be preserved. If the user has to maintain a different mental model of how the sites they are @@ -312,7 +156,7 @@ result. Worse, they may just stop using the browser, assuming it is broken.

      -User model breakage was one of the failures +User model breakage was one of the failures of Torbutton: Even if users managed to install everything properly, the toggle model was too hard for the average user to understand, especially in the face of accumulating tabs from multiple states crossed with the current @@ -340,20 +184,20 @@ be restricted from running automatically on every page (via click-to-play placeholders), and/or be sandboxed to restrict the types of system calls they can execute. If the user decides to craft an exemption to allow a plugin to be used, it MUST only apply to the top level url bar domain, and not to all sites, -to reduce linkability. +to reduce cross-origin fingerprinting linkability.

    2. Minimize Global Privacy Options

      -Another -failure of Torbutton was (and still is) the options panel. Each option +Another +failure of Torbutton was the options panel. Each option that detectably alters browser behavior can be used as a fingerprinting tool. -Similarly, all extensions SHOULD be +Similarly, all extensions SHOULD be disabled in the mode except as an opt-in basis. We SHOULD NOT load -system-wide addons or plugins. +system-wide and/or Operating System provided addons or plugins.

      Instead of global browser privacy options, privacy decisions SHOULD be made -per +per url bar origin to eliminate the possibility of linkability between domains. For example, when a plugin object (or a Javascript access of window.plugins) is present in a page, the user should be given the choice of @@ -361,28 +205,29 @@ allowing that plugin object for that url bar origin only. The same goes for exemptions to third party cookie policy, geo-location, and any other privacy permissions.

      -If the user has indicated they do not care about local history storage, these -permissions can be written to disk. Otherwise, they should remain memory-only. +If the user has indicated they wish to record local history storage, these +permissions can be written to disk. Otherwise, they MUST remain memory-only.

    3. No filters

      -Filter-based addons such as AdBlock -Plus, Request Policy, -Ghostery, Priv3, and Sharemenot are to be +Site-specific or filter-based addons such as AdBlock +Plus, Request Policy, +Ghostery, Priv3, and Sharemenot are to be avoided. We believe that these addons do not add any real privacy to a proper -implementation of the above privacy requirements, as all third parties are -prevented from tracking users between sites by the implementation. +implementation of the above privacy requirements, and that development efforts +should be focused on general solutions that prevent tracking by all +third parties, rather than a list of specific URLs or hosts. +

      Filter-based addons can also introduce strange breakage and cause usability nightmares, and will also fail to do their job if an adversary simply registers a new domain or creates a new url path. Worse still, the unique filter sets that each user creates or installs will provide a wealth of fingerprinting targets. -

      As a general matter, we are also generally opposed to shipping an always-on Ad blocker with Tor Browser. We feel that this would damage our credibility in terms of demonstrating that we are providing privacy through a sound design -alone, as well as damage the acceptance of Tor users by sites who support +alone, as well as damage the acceptance of Tor users by sites that support themselves through advertising revenue.

      @@ -393,7 +238,202 @@ We believe that if we do not stay current with the support of new web technologies, we cannot hope to substantially influence or be involved in their proper deployment or privacy realization. However, we will likely disable high-risk features pending analysis, audit, and mitigation. -

3. Implementation

+

+ +A Tor web browser adversary has a number of goals, capabilities, and attack +types that can be used to illustrate the design requirements for the +Tor Browser. Let's start with the goals. + +

  1. Bypassing proxy settings

    The adversary's primary goal is direct compromise and bypass of +Tor, causing the user to directly connect to an IP of the adversary's +choosing.

  2. Correlation of Tor vs Non-Tor Activity

    If direct proxy bypass is not possible, the adversary will likely +happily settle for the ability to correlate something a user did via Tor with +their non-Tor activity. This can be done with cookies, cache identifiers, +javascript events, and even CSS. Sometimes the fact that a user uses Tor may +be enough for some authorities.

  3. History disclosure

    +The adversary may also be interested in history disclosure: the ability to +query a user's history to see if they have issued certain censored search +queries, or visited censored sites. +

  4. Location information

    + +Location information such as timezone and locality can be useful for the +adversary to determine if a user is in fact originating from one of the +regions they are attempting to control, or to zero-in on the geographical +location of a particular dissident or whistleblower. + +

  5. Correlate activity across multiple sites

    + +The primary goal of the advertising networks is to know that the user who +visited siteX.com is the same user that visited siteY.com to serve them +targeted ads. The advertising networks become our adversary insofar as they +attempt to perform this correlation without the user's explicit consent. + +

  6. Fingerprinting/anonymity set reduction

    + +Fingerprinting (more generally: "anonymity set reduction") is used to attempt +to zero in on a particular individual without the use of tracking identifiers. +If the dissident or whistleblower is using a rare build of Firefox for an +obscure operating system, this can be very useful information for tracking +them down, or at least tracking their +activities. + +

  7. History records and other on-disk +information

    +In some cases, the adversary may opt for a heavy-handed approach, such as +seizing the computers of all Tor users in an area (especially after narrowing +the field by the above two pieces of information). History records and cache +data are the primary goals here. +

+ +The adversary can perform the following attacks from a number of different +positions to accomplish various aspects of their goals. It should be noted +that many of these attacks (especially those involving IP address leakage) are +often performed by accident by websites that simply have Javascript, dynamic +CSS elements, and plugins. Others are performed by ad servers seeking to +correlate users' activity across different IP addresses, and still others are +performed by malicious agents on the Tor network and at national firewalls. + +

  1. Read and insert identifiers

    + +The browser contains multiple facilities for storing identifiers that the +adversary creates for the purposes of tracking users. These identifiers are +most obviously cookies, but also include HTTP auth, DOM storage, cached +scripts and other elements with embedded identifiers, client certificates, and +even TLS Session IDs. + +

    + +An adversary in a position to perform MITM content alteration can inject +document content elements to both read and inject cookies for arbitrary +domains. In fact, even many "SSL secured" websites are vulnerable to this sort of +active +sidejacking. In addition, the ad networks of course perform tracking +with cookies as well. + +

    + +These types of attacks are attempts at subverting our Cross-Origin Identifier Unlinkability and Long-Term Unlinkability design requirements. + +

  2. Fingerprint users based on browser +attributes

    + +There is an absurd amount of information available to websites via attributes +of the browser. This information can be used to reduce the anonymity set, or even +uniquely fingerprint individual users. Attacks of this nature are typically +aimed at tracking users across sites without their consent, in an attempt to +subvert our Cross-Origin +Fingerprinting Unlinkability and Long-Term Unlinkability design requirements. + +

    + +Fingerprinting is an intimidating +problem to attempt to tackle, especially without a metric to determine or at +least intuitively understand and estimate which features will most contribute +to linkability between visits. + +

    + +The Panopticlick study +done by the EFF uses the Shannon +entropy - the number of identifying bits of information encoded in +browser properties - as this metric. Their result data is +definitely useful, and the metric is probably the appropriate one for +determining how identifying a particular browser property is. However, some +quirks of their study means that they do not extract as much information as +they could from display information: they only use desktop resolution and do +not attempt to infer the size of toolbars. In the other direction, they may be +over-counting in some areas, as they did not compute joint entropy over +multiple attributes that may exhibit a high degree of correlation. Also, new +browser features are added regularly, so the data should not be taken as +final. + +
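For concreteness, the metric can be written down directly (these are the standard definitions of self-information and Shannon entropy, not notation taken from the Panopticlick paper): an attribute value v observed with probability Pr[X = v] across the user population carries

    I(v) = -log2 Pr[X = v]   bits of identifying information,

and the attribute as a whole has entropy

    H(X) = - sum_v Pr[X = v] * log2 Pr[X = v].

For example, a User-Agent string shared by one user in 1024 contributes log2(1024) = 10 bits, enough on its own to partition a million users into groups of roughly a thousand.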

    + +Despite the uncertainty, all fingerprinting attacks leverage the following +attack vectors: + +

    1. Observing Request Behavior

      + +Properties of the user's request behavior comprise the bulk of low-hanging +fingerprinting targets. These include: User agent, Accept-* headers, pipeline +usage, and request ordering. Additionally, the use of custom filters such as +AdBlock and other privacy filters can be used to fingerprint request patterns +(as an extreme example). + +

    2. Inserting Javascript

      + +Javascript can reveal a lot of fingerprinting information. It provides DOM +objects such as window.screen and window.navigator to extract information +about the useragent. + +Also, Javascript can be used to query the user's timezone via the +Date() object, WebGL can +reveal information about the video card in use, and high precision timing +information can be used to fingerprint the CPU and +interpreter speed. In the future, new JavaScript features such as +Resource +Timing may leak an unknown amount of network timing related +information. + + + +
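To illustrate how little effort this attack vector requires, the following content-script sketch gathers several of the properties mentioned above; the particular set of properties collected here is our own illustrative choice, not taken from any specific study:

    // Illustrative fingerprinting sketch: every value below is readable by
    // any page that is allowed to run Javascript.
    var fp = {
      userAgent: navigator.userAgent,
      platform:  navigator.platform,
      language:  navigator.language,
      screen:    screen.width + "x" + screen.height + "x" + screen.colorDepth,
      window:    window.innerWidth + "x" + window.innerHeight,
      tzOffset:  new Date().getTimezoneOffset(),   // timezone via Date()
      plugins:   Array.prototype.map.call(navigator.plugins,
                                          function (p) { return p.name; })
    };
    // A hash of JSON.stringify(fp) already makes a fairly stable identifier.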

    3. Inserting Plugins

      + +The Panopticlick project found that the mere list of installed plugins (in +navigator.plugins) was sufficient to provide a large degree of +fingerprintability. Additionally, plugins are capable of extracting font lists, +interface addresses, and other machine information that is beyond what the +browser would normally provide to content. In addition, plugins can be used to +store unique identifiers that are more difficult to clear than standard +cookies. Flash-based +cookies fall into this category, but there are likely numerous other +examples. Beyond fingerprinting, plugins are also abysmal at obeying the proxy +settings of the browser. + + +

    4. Inserting CSS

      + +CSS media +queries can be inserted to gather information about the desktop size, +widget size, display type, DPI, user agent type, and other information that +was formerly available only to Javascript. + +

  3. Remotely or locally exploit browser and/or +OS

    + +Last, but definitely not least, the adversary can exploit either general +browser vulnerabilities, plugin vulnerabilities, or OS vulnerabilities to +install malware and surveillance software. An adversary with physical access +can perform similar actions. Regrettably, this last attack capability is +outside of the browser's ability to defend against, but it is worth mentioning +for completeness. In fact, the Tails system can +provide some defense against this adversary, and it does include the Tor +Browser. + +

The Implementation section is divided into subsections, each of which corresponds to a Design Requirement. @@ -406,121 +446,153 @@ In some cases, the implementation meets the design requirements in a non-ideal way (for example, by disabling features). In rare cases, there may be no implementation at all. Both of these cases are denoted by differentiating between the Design Goal and the Implementation -Status for each property. Corresponding bugs in the Tor bug tracker +Status for each property. Corresponding bugs in the Tor bug tracker are typically linked for these cases. -

3.1. Proxy Obedience

+

Proxy obedience is assured through the following: -

  1. Firefox Proxy settings +

    1. Firefox proxy settings, patches, and build flags

      - The Torbutton xpi sets the Firefox proxy settings to use Tor directly as a +Our Firefox +preferences file sets the Firefox proxy settings to use Tor directly as a SOCKS proxy. It sets network.proxy.socks_remote_dns, -network.proxy.socks_version, and -network.proxy.socks_port. +network.proxy.socks_version, +network.proxy.socks_port, and +network.dns.disablePrefetch.
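For illustration, the relevant lines of a preferences file look roughly as follows. The values are examples only (in particular, the SOCKS port depends on how the bundled Tor is configured), and network.proxy.type is shown merely to make the sketch self-contained:

    // Illustrative prefs.js sketch; values are examples, not the shipped file.
    pref("network.proxy.type", 1);                 // manual proxy configuration
    pref("network.proxy.socks", "127.0.0.1");      // bundled Tor client
    pref("network.proxy.socks_port", 9150);        // example SOCKS port
    pref("network.proxy.socks_version", 5);
    pref("network.proxy.socks_remote_dns", true);  // resolve names through Tor
    pref("network.dns.disablePrefetch", true);     // no speculative DNS lookups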

      -We have verified that these settings properly proxy HTTPS, OCSP, HTTP, FTP, -gopher (now defunct), DNS, SafeBrowsing Queries, all javascript activity, -including HTML5 audio and video objects, addon updates, wifi geolocation -queries, searchbox queries, XPCOM addon HTTPS/HTTP activity, and live bookmark -updates. We have also verified that IPv6 connections are not attempted, -through the proxy or otherwise (Tor does not yet support IPv6). We have also -verified that external protocol helpers, such as smb urls and other custom -protocol handers are all blocked. +We also patch Firefox in order to prevent +a DNS leak due to a WebSocket rate-limiting check. As stated in the +patch, we believe the direct DNS resolution performed by this check is in +violation of the W3C standard, but this DNS proxy leak +remains present in stock Firefox releases.

      -Numerous other third parties have also reviewed and tested the proxy settings -and have provided test cases based on their work. See in particular decloak.net. +During the transition to Firefox 17-ESR, a code audit was undertaken to verify +that there were no system calls or XPCOM activity in the source tree that did +not use the browser proxy settings. The only violation we found was that +WebRTC was capable of creating UDP sockets and was compiled in by default. We +subsequently disabled it using the Firefox build option +--disable-webrtc. + +

      + +We have verified that these settings and patches properly proxy HTTPS, OCSP, +HTTP, FTP, gopher (now defunct), DNS, SafeBrowsing Queries, all javascript +activity, including HTML5 audio and video objects, addon updates, wifi +geolocation queries, searchbox queries, XPCOM addon HTTPS/HTTP activity, +WebSockets, and live bookmark updates. We have also verified that IPv6 +connections are not attempted, through the proxy or otherwise (Tor does not +yet support IPv6). We have also verified that external protocol helpers, such +as smb urls and other custom protocol handlers are all blocked. + +

      + +Numerous other third parties have also reviewed and tested the proxy settings +and have provided test cases based on their work. See in particular decloak.net.

    2. Disabling plugins -

      Plugins have the ability to make arbitrary OS system calls and bypass proxy settings. This includes +

      Plugins have the ability to make arbitrary OS system calls and bypass proxy settings. This includes the ability to make UDP sockets and send arbitrary data independent of the browser proxy settings.

      Torbutton disables plugins by using the @mozilla.org/plugin/host;1 service to mark the plugin tags -as disabled. Additionally, we set -plugin.disable_full_page_plugin_for_types to the list of -supported mime types for all currently installed plugins. +as disabled. This block can be undone through both the Torbutton Security UI, +and the Firefox Plugin Preferences.
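A minimal sketch of that mechanism is shown below. It is not a verbatim excerpt from Torbutton, and it assumes the nsIPluginTag.disabled attribute available in the Firefox versions targeted at the time:

    // Illustrative chrome-privileged sketch: mark every installed plugin
    // tag as disabled via the plugin host service.
    var Cc = Components.classes, Ci = Components.interfaces;
    var pluginHost = Cc["@mozilla.org/plugin/host;1"]
                       .getService(Ci.nsIPluginHost);
    var tags = pluginHost.getPluginTags({});
    for (var i = 0; i < tags.length; i++) {
      tags[i].disabled = true;   // reversed by the Torbutton Security UI
    }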

      -In addition, to prevent any unproxied activity by plugins at load time, we -also patch the Firefox source code to prevent the load of any plugins except +If the user does enable plugins in this way, plugin-handled objects are still +restricted from automatic load through Firefox's click-to-play preference +plugins.click_to_play. +

      +In addition, to reduce any unproxied activity by arbitrary plugins at load +time, and to reduce the fingerprintability of the installed plugin list, we +also patch the Firefox source code to prevent the load of any plugins except for Flash and Gnash. -

      - -Finally, even if the user alters their browser settings to re-enable the Flash -plugin, we have configured NoScript to provide click-to-play placeholders, so -that only desired objects will be loaded, and only after user confirmation. -

    3. External App Blocking

      External apps, if launched automatically, can be induced to load files that perform network activity. In order to prevent this, Torbutton installs a component to - + provide the user with a popup whenever the browser attempts to launch a helper app. -Additionally, due primarily to an issue with Ubuntu Unity, url-based drag and drop is +Additionally, due to an issue with Ubuntu Unity, url-based drag and drop is filtered by this component. Unity was pre-fetching URLs without using the browser's proxy settings during a drag action, even if the drop was ultimately -canceled by the user. -

3.2. State Separation

+canceled by the user. A similar issue was discovered on Mac OS. +

3.3. Disk Avoidance

Design Goal:

-Tor Browser MUST (at user option) prevent all disk records of browser activity. +

Implementation Status:

-For now, Tor Browser blocks write access to the disk through Torbutton -using several Firefox preferences. +features if they so desire. +

-In addition, three Firefox patches are needed to prevent disk writes, even if +We achieve this goal through several mechanisms. First, we set the Firefox +Private Browsing preference +browser.privatebrowsing.autostart. In addition, four Firefox patches are needed to prevent disk writes, even if Private Browsing Mode is enabled. We need to -prevent +prevent the permissions manager from recording HTTPS STS state, -prevent -intermediate SSL certificates from being recorded, and -prevent +prevent +intermediate SSL certificates from being recorded, +prevent +download history from being recorded, and +prevent the content preferences service from recording site zoom. -For more details on these patches, see the +For more details on these patches, see the Firefox Patches section. -

3.4. Application Data Isolation

+

+ +As an additional defense-in-depth measure, we set the following preferences: +browser.cache.disk.enable, +browser.cache.offline.enable, +dom.indexedDB.enabled, +signon.rememberSignons, +browser.formfill.enable, +browser.download.manager.retention, +browser.sessionstore.privacy_level, +and network.cookie.lifetimePolicy. Many of these +preferences are likely redundant with +browser.privatebrowsing.autostart, but we have not done the +auditing work to ensure that yet. + +
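A sketch of these defense-in-depth settings, with illustrative values chosen to match the goals above (not a verbatim copy of the shipped preferences file), looks like:

    // Illustrative prefs.js sketch; values are examples only.
    pref("browser.privatebrowsing.autostart", true);
    pref("browser.cache.disk.enable", false);       // no on-disk cache
    pref("browser.cache.offline.enable", false);    // no offline app cache
    pref("dom.indexedDB.enabled", false);
    pref("network.cookie.lifetimePolicy", 2);       // cookies expire with the session
    pref("signon.rememberSignons", false);          // no saved passwords
    pref("browser.formfill.enable", false);         // no form history
    pref("browser.download.manager.retention", 0);  // no download history
    pref("browser.sessionstore.privacy_level", 2);  // no session data for any site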
+ +Torbutton also contains +code to prevent the Firefox session store from writing to disk. +
+ +For more details on disk leak bugs and enhancements, see the tbb-disk-leak tag in our bugtracker

3.5. Cross-Origin Identifier Unlinkability

+

+ +To ensure TBB directory isolation, we set +browser.download.useDownloadDir, +browser.shell.checkDefaultBrowser, and +browser.download.manager.addToRecentDocs. We also set the +$HOME environment variable to be the TBB extraction directory. +

The Tor Browser MUST prevent a user's activity on one site from being linked to their activity on another site. When this goal cannot yet be met with an @@ -544,21 +616,19 @@ the url bar origin for which browser state exists, possibly with a context-menu option to drill down into specific types of state or permissions. An example of this simplification can be seen in Figure 1. -

Figure 1. Improving the Privacy UI


  1. Cookies

    Design Goal: All cookies MUST be double-keyed to the url bar origin and third-party -origin. There exists a Mozilla bug +origin. There exists a Mozilla bug that contains a prototype patch, but it lacks UI, and does not apply to modern Firefoxes. @@ -574,17 +644,17 @@ unlinkability trumps that desire.

    Cache is isolated to the url bar origin by using a technique pioneered by -Colin Jackson et al, via their work on SafeCache. The technique re-uses the -nsICachingChannel.cacheKey +Colin Jackson et al, via their work on SafeCache. The technique re-uses the +nsICachingChannel.cacheKey attribute that Firefox uses internally to prevent improper caching and reuse of HTTP POST data.

    -However, to increase the +security of the isolation and to solve conflicts with OCSP relying on the cacheKey property for reuse of POST requests, we -had to patch +had to patch Firefox to provide a cacheDomain cache attribute. We use the fully qualified url bar domain as input to this field. @@ -599,49 +669,49 @@ opposed to relying solely on the referer property.

    -Therefore, the original +Therefore, the original Stanford test cases are expected to fail. Functionality can still be -verified by navigating to about:cache and +verified by navigating to about:cache and viewing the key used for each cache entry. Each third party element should have an additional "domain=string" property prepended, which will list the FQDN that was used to source the third party element. +

    + +Additionally, because the image cache is a separate entity from the content +cache, we had to patch Firefox to also isolate +this cache per url bar domain. +

  2. HTTP Auth

    HTTP authentication tokens are removed for third party elements using the -http-on-modify-request -observer to remove the Authorization headers to prevent silent -linkability between domains. We also needed to patch -Firefox to cause the headers to get added early enough to allow the -observer to modify it. - +http-on-modify-request +observer to remove the Authorization headers to prevent silent +linkability between domains.
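A simplified sketch of such an observer is shown below; it is not Torbutton's actual code, and the third-party test is left as a hypothetical helper because the real decision depends on the url bar origin:

    // Illustrative chrome-privileged sketch of the Authorization-stripping
    // observer described above.
    Components.utils.import("resource://gre/modules/Services.jsm");

    var authStripper = {
      observe: function (subject, topic, data) {
        if (topic !== "http-on-modify-request") return;
        var channel = subject.QueryInterface(Components.interfaces.nsIHttpChannel);
        if (isThirdPartyOfUrlBar(channel)) {        // hypothetical helper
          channel.setRequestHeader("Authorization", "", false);
        }
      }
    };
    Services.obs.addObserver(authStripper, "http-on-modify-request", false);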

  3. DOM Storage -

    Design Goal: +

    DOM storage for third party domains MUST be isolated to the url bar origin, -to prevent linkability between sites. - -

    Implementation Status: - -Because it is isolated to third party domain as opposed to top level url bar -origin, we entirely disable DOM storage as a stopgap to ensure unlinkability. +to prevent linkability between sites. This functionality is provided through a +patch +to Firefox.

  4. Flash cookies

    Design Goal: Users should be able to click-to-play flash objects from trusted sites. To make this behavior unlinkable, we wish to include a settings file for all platforms that disables flash -cookies using the Flash +cookies using the Flash settings manager.

    Implementation Status: -We are currently having +We are currently having difficulties causing Flash player to use this settings file on Windows, so Flash remains difficult to enable. -

  5. SSL+TLS session resumption and HTTP Keep-Alive +

  6. SSL+TLS session resumption, HTTP Keep-Alive and SPDY

    Design Goal: TLS session resumption tickets and SSL Session IDs MUST be limited to the url @@ -650,24 +720,28 @@ origin MUST NOT be reused for that same third party in another url bar origin.

    Implementation Status: -We currently clear SSL Session IDs upon New +We currently clear SSL Session IDs upon New Identity, we disable TLS Session Tickets via the Firefox Pref security.enable_tls_session_tickets. We disable SSL Session -IDs via a patch +IDs via a patch to Firefox. To compensate for the increased round trip latency from disabling these performance optimizations, we also enable -TLS +TLS False Start via the Firefox Pref security.ssl.enable_false_start.

    -Becuase of the extreme performance benefits of HTTP Keep-Alive for interactive +Because of the extreme performance benefits of HTTP Keep-Alive for interactive web apps, and because of the difficulties of conveying urlbar origin information down into the Firefox HTTP layer, as a compromise we currently merely reduce the HTTP Keep-Alive timeout to 20 seconds (which is measured from the last packet read on the connection) using the Firefox preference network.http.keep-alive.timeout. +

    +However, because SPDY can store identifiers and has extremely long keepalive +duration, it is disabled through the Firefox preference +network.http.spdy.enabled.
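Taken together, the connection-level settings described in this item reduce to a few preference lines, roughly (an illustrative sketch matching the text above):

    // Illustrative prefs.js sketch for the settings discussed in this item.
    pref("security.enable_tls_session_tickets", false);  // no TLS session tickets
    pref("security.ssl.enable_false_start", true);       // recover some latency
    pref("network.http.keep-alive.timeout", 20);         // seconds
    pref("network.http.spdy.enabled", false);            // SPDY stores identifiers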

  7. Automated cross-origin redirects MUST NOT store identifiers

    Design Goal: @@ -687,33 +761,34 @@ federated login systems) SHOULD still allow identifiers to persist.

    Implementation status: There are numerous ways for the user to be redirected, and the Firefox API -support to detect each of them is poor. We have a trac bug +support to detect each of them is poor. We have a trac bug open to implement what we can.

  8. window.name

    -window.name is +window.name is a magical DOM property that for some reason is allowed to retain a persistent value for the lifespan of a browser tab. It is possible to utilize this property for -identifier +identifier storage.

    -In order to eliminate linkability but still allow for sites that utilize this -property to function, we reset the window.name property of tabs in Torbutton every -time we encounter a blank referer. This behavior allows window.name to persist -for the duration of a link-driven navigation session, but as soon as the user -enters a new URL or navigates between https/http schemes, the property is cleared. +In order to eliminate non-consensual linkability but still allow for sites +that utilize this property to function, we reset the window.name property of +tabs in Torbutton every time we encounter a blank referer. This behavior +allows window.name to persist for the duration of a click-driven navigation +session, but as soon as the user enters a new URL or navigates between +https/http schemes, the property is cleared.
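The clearing rule itself is simple; the sketch below only illustrates the condition, while the real implementation lives in a Torbutton web progress listener:

    // Illustrative sketch: reset window.name whenever a page is reached
    // without a referer, i.e. outside a click-driven navigation session.
    function maybeClearWindowName(contentWin) {
      if (contentWin.document.referrer === "") {
        contentWin.name = "";
      }
    }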

  9. Auto form-fill

    We disable the password saving functionality in the browser as part of our -Disk Avoidance requirement. However, +Disk Avoidance requirement. However, since users may decide to re-enable disk history records and password saving, -we also set the signon.autofillForms +we also set the signon.autofillForms preference to false to prevent saved values from immediately populating fields upon page load. Since Javascript can read these values as soon as they appear, setting this preference prevents automatic linkability from stored passwords. @@ -721,7 +796,7 @@ appear, setting this preference prevents automatic linkability from stored passw

  10. HSTS supercookies

    -An extreme (but not impossible) attack to mount is the creation of HSTS +An extreme (but not impossible) attack to mount is the creation of HSTS supercookies. Since HSTS effectively stores one bit of information per domain name, an adversary in possession of numerous domains can use them to construct cookies based on stored HSTS state. @@ -735,7 +810,7 @@ Restrict the number of HSTS-enabled third parties allowed per url bar origin. the best approach.

    Implementation Status: Currently, HSTS state is -cleared by New Identity, but we don't +cleared by New Identity, but we don't defend against the creation of these cookies between New Identity invocations.

  11. Exit node usage @@ -748,39 +823,46 @@ observers from linking concurrent browsing activity.

    Implementation Status: The Tor feature that supports this ability only exists in the 0.2.3.x-alpha -series. Ticket +series. Ticket #3455 is the Torbutton ticket to make use of the new Tor functionality. -

3.6. Cross-Origin Fingerprinting Unlinkability

+

+For more details on identifier linkability bugs and enhancements, see the tbb-linkability tag in our bugtracker +

In order to properly address the fingerprinting adversary on a technical level, we need a metric to measure linkability of the various browser -properties beyond any stored origin-related state. The Panopticlick Project -by the EFF provides us with exactly this metric. The researchers conducted a -survey of volunteers who were asked to visit an experiment page that harvested -many of the above components. They then computed the Shannon Entropy of the -resulting distribution of each of several key attributes to determine how many -bits of identifying information each attribute provided. +properties beyond any stored origin-related state. The Panopticlick Project +by the EFF provides us with a prototype of such a metric. The researchers +conducted a survey of volunteers who were asked to visit an experiment page +that harvested many of the above components. They then computed the Shannon +Entropy of the resulting distribution of each of several key attributes to +determine how many bits of identifying information each attribute provided.

-The study is not exhaustive, though. In particular, the test does not take in -all aspects of resolution information. It did not calculate the size of -widgets, window decoration, or toolbar size, which we believe may add high -amounts of entropy. It also did not measure clock offset and other time-based -fingerprints. Furthermore, as new browser features are added, this experiment -should be repeated to include them. - -

- -On the other hand, to avoid an infinite sinkhole, we reduce the efforts for -fingerprinting resistance by only concerning ourselves with reducing the +Many browser features have been added since the EFF first ran their experiment +and collected their data. To avoid an infinite sinkhole, we reduce the efforts +for fingerprinting resistance by only concerning ourselves with reducing the fingerprintable differences among Tor Browser users. We -do not believe it is productive to concern ourselves with cross-browser -fingerprinting issues, at least not at this stage. +do not believe it is possible to solve cross-browser fingerprinting issues. -

  1. Plugins +

    + +Unfortunately, the unsolvable nature of the cross-browser fingerprinting +problem means that the Panopticlick test website itself is not useful for +evaluating the actual effectiveness of our defenses, or the fingerprinting +defenses of any other web browser. Because the Panopticlick dataset is based +on browser data spanning a number of widely deployed browsers over a number of +years, any fingerprinting defenses attempted by browsers today are very likely +to cause Panopticlick to report an increase in +fingerprintability and entropy, because those defenses will stand out in sharp +contrast to historical data. We have been working to convince +the EFF that it is worthwhile to release the source code to +Panopticlick to allow us to run our own version for this reason. + +

    1. Plugins

      Plugins add to fingerprinting risk via two main vectors: their mere presence in @@ -792,17 +874,63 @@ All plugins that have not been specifically audited or sandboxed MUST be disabled. To reduce linkability potential, even sandboxed plugins should not be allowed to load objects until the user has clicked through a click-to-play barrier. Additionally, version information should be reduced or obfuscated -until the plugin object is loaded. +until the plugin object is loaded. For flash, we wish to provide a +settings.sol file to disable Flash cookies, and to restrict P2P +features that are likely to bypass proxy settings.

      Implementation Status: Currently, we entirely disable all plugins in Tor Browser. However, as a -compromise due to the popularity of Flash, we intend to work -towards a -click-to-play barrier using NoScript that is available only after the user has -specifically enabled plugins. Flash will be the only plugin available, and we -will ship a settings.sol file to disable Flash cookies, and to restrict P2P -features that likely bypass proxy settings. +compromise due to the popularity of Flash, we allow users to re-enable Flash, +and flash objects are blocked behind a click-to-play barrier that is available +only after the user has specifically enabled plugins. Flash is the only plugin +available, the rest are entirely +blocked from loading by a Firefox patch. We also set the Firefox +preference plugin.expose_full_path to false, to avoid +leaking plugin installation information. + +

    2. HTML5 Canvas Image Extraction +

      + +The HTML5 +Canvas is a feature that has been added to major browsers after the +EFF developed their Panopticlick study. After plugins and plugin-provided +information, we believe that the HTML5 Canvas is the single largest +fingerprinting threat browsers face today. Initial +studies show that the Canvas can provide an easy-access fingerprinting +target: The adversary simply renders WebGL, font, and named color data to a +Canvas element, extracts the image buffer, and computes a hash of that image +data. Subtle differences in the video card, font packs, and even font and +graphics library versions allow the adversary to produce a stable, simple, +high-entropy fingerprint of a computer. In fact, the hash of the rendered +image can be used almost identically to a tracking cookie by the web server. + +
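The attack itself takes only a few lines of content Javascript; the string drawn below is arbitrary and the final hashing step is elided:

    // Illustrative canvas fingerprinting sketch (content Javascript).
    var canvas = document.createElement("canvas");
    canvas.width = 300; canvas.height = 60;
    var ctx = canvas.getContext("2d");
    ctx.font = "18pt Arial";
    ctx.fillText("fingerprint me, 1234567890", 10, 40);  // arbitrary test string
    var data = canvas.toDataURL("image/png");  // pixel-exact rendering result
    // A hash of 'data' varies with the video card, font packs, and graphics
    // library versions, and is stable for a given machine.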

      + +To reduce the threat from this vector, we have patched Firefox to prompt +before returning valid image data to the Canvas APIs. If the user +hasn't previously allowed the site in the URL bar to access Canvas image data, +pure white image data is returned to the Javascript APIs. + +

    3. WebGL +

      + +WebGL is fingerprintable both through information that is exposed about the +underlying driver and optimizations, as well as through performance +fingerprinting. + +

      + +Because of the large amount of potential fingerprinting vectors and the previously unexposed +vulnerability surface, we deploy a similar strategy against WebGL as +for plugins. First, WebGL Canvases have click-to-play placeholders (provided +by NoScript), and do not run until authorized by the user. Second, we +obfuscate driver information by setting the Firefox preferences +webgl.disable-extensions and +webgl.min_capability_mode, which reduce the information +provided by the following WebGL API calls: getParameter(), +getSupportedExtensions(), and +getExtension().

    4. Fonts

      @@ -819,7 +947,7 @@ still be available. The sure-fire way to address font linkability is to ship the browser with a font for every language, typeface, and style in use in the world, and to only use those fonts at the exclusion of system fonts. However, this set may be -impractically large. It is possible that a smaller common +impractically large. It is possible that a smaller common subset may be found that provides total coverage. However, we believe that with strong url bar origin identifier isolation, a simpler approach can reduce the number of bits available to the adversary while avoiding the rendering and @@ -829,12 +957,49 @@ language issues of supporting a global font set. We disable plugins, which prevents font enumeration. Additionally, we limit both the number of font queries from CSS, as well as the total number of -fonts that can be used in a document by patching Firefox. We create two prefs, +fonts that can be used in a document with +a Firefox patch. We create two prefs, browser.display.max_font_attempts and browser.display.max_font_count for this purpose. Once these limits are reached, the browser behaves as if browser.display.use_document_fonts was reached. We are -still working to determine optimal values for these prefs. +still working to determine optimal values for these prefs. + +
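Because the optimal limits are still being determined, the values in the sketch below are placeholders only, shown simply to make the two prefs concrete:

    // Illustrative prefs.js sketch; the limit values are placeholders.
    pref("browser.display.max_font_attempts", 10);  // CSS font queries per document
    pref("browser.display.max_font_count", 10);     // distinct fonts per document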

      + +To improve rendering, we exempt remote @font-face +fonts from these counts, and if a font-family CSS rule lists a remote +font (in any order), we use that font instead of any of the named local fonts. + +

    5. Desktop resolution, CSS Media Queries, and System Colors +

      + +Both CSS and Javascript have access to a lot of information about the screen +resolution, usable desktop size, OS widget size, toolbar size, title bar size, +system theme colors, and other desktop features that are not at all relevant +to rendering and serve only to provide information for fingerprinting. + +

      Design Goal: + +Our design goal here is to reduce the resolution information down to the bare +minimum required for properly rendering inside a content window. We intend to +report all rendering information correctly with respect to the size and +properties of the content window, but report an effective size of 0 for all +border material, and also report that the desktop is only as big as the +inner content window. Additionally, new browser windows are sized such that +their content windows are one of a few fixed sizes based on the user's +desktop resolution. + +

      Implementation Status: + +We have implemented the above strategy using a window observer to resize +new windows based on desktop resolution. Additionally, we patch +Firefox to use the client content window size for +window.screen and for +CSS Media Queries. Similarly, we patch +DOM events to return content window relative points. We also patch +Firefox to report +a fixed set of system colors to content window CSS.

    6. User Agent and HTTP Headers

      Design Goal: @@ -849,41 +1014,9 @@ these headers should remain identical across the population even when updated. Firefox provides several options for controlling the browser user agent string which we leverage. We also set similar prefs for controlling the Accept-Language and Accept-Charset headers, which we spoof to English by default. Additionally, we -remove -content script access to Components.interfaces, which can be -used to fingerprint OS, platform, and Firefox minor version.

    7. Desktop resolution and CSS Media Queries -

      - -Both CSS and Javascript have a lot of irrelevant information about the screen -resolution, usable desktop size, OS widget size, toolbar size, title bar size, and -other desktop features that are not at all relevant to rendering and serve -only to provide information for fingerprinting. - -

      Design Goal: - -Our design goal here is to reduce the resolution information down to the bare -minimum required for properly rendering inside a content window. We intend to -report all rendering information correctly with respect to the size and -properties of the content window, but report an effective size of 0 for all -border material, and also report that the desktop is only as big as the -inner content window. Additionally, new browser windows are sized such that -their content windows are one of ~5 fixed sizes based on the user's -desktop resolution. - -

      Implementation Status: - -We have implemented the above strategy for Javascript using Torbutton's JavaScript -hooks as well as a window observer to resize -new windows based on desktop resolution. Additionally, we patch -Firefox to cause CSS Media Queries to use the client content window size -for all desktop size related media queries. - -

      - -As far as we know, this fully satisfies our design goals for desktop -resolution information. - -

    8. Timezone and clock offset +remove +content script access to Components.interfaces, which can be +used to fingerprint OS, platform, and Firefox minor version.

    9. Timezone and clock offset

      Design Goal: All Tor Browser users MUST report the same timezone to websites. Currently, we @@ -897,26 +1030,26 @@ values used in Tor Browser to something reasonably accurate.

      Implementation Status: We set the timezone using the TZ environment variable, which is supported on -all platforms. Additionally, we plan to obtain a clock +all platforms. Additionally, we plan to obtain a clock offset from Tor, but this won't be available until Tor 0.2.3.x is in use.

    10. Javascript performance fingerprinting

      -Javascript performance +Javascript performance fingerprinting is the act of profiling the performance of various Javascript functions for the purpose of fingerprinting the Javascript engine and the CPU.

      Design Goal: -We have several potential +We have several potential mitigation approaches to reduce the accuracy of performance fingerprinting without risking too much damage to functionality. Our current favorite is to reduce the resolution of the Event.timeStamp and the Javascript Date() object, while also introducing jitter. Our goal is to increase the -amount of time it takes to mount a successful attack. Mowery et al found that +amount of time it takes to mount a successful attack. Mowery et al found that even with the default precision in most browsers, they required up to 120 seconds of amortization and repeated trials to get stable results from their feature set. We intend to work with the research community to establish the @@ -925,7 +1058,20 @@ optimum trade-off between quantization+jitter and amortization time.

      Implementation Status: -We have no implementation as of yet. +Currently, the only mitigation against performance fingerprinting is to +disable Navigation +Timing through the Firefox preference +dom.enable_performance. + +

    11. Non-Uniform HTML5 API Implementations +

      + +At least two HTML5 features have different implementation status across the +major OS vendors: the Battery +API and the Network +Connection API. We disable these APIs +through the Firefox preferences dom.battery.enabled and +dom.network.enabled.
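The preference lines behind this item and the preceding one are simply (an illustrative sketch):

    // Illustrative prefs.js sketch for the API defenses above.
    pref("dom.enable_performance", false);  // disable Navigation Timing
    pref("dom.battery.enabled", false);     // disable the Battery API
    pref("dom.network.enabled", false);     // disable the Network Connection API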

    12. Keystroke fingerprinting

      @@ -940,90 +1086,71 @@ fingerprinting: timestamp quantization and jitter.

      Implementation Status: We have no implementation as of yet. -

    13. WebGL -

      +

    +For more details on fingerprinting bugs and enhancements, see the tbb-fingerprinting tag in our bugtracker. +

3.7. Long-Term Unlinkability via "New Identity" button

In order to avoid long-term linkability, we provide a "New Identity" context -menu option in Torbutton. -

Design Goal:

+menu option in Torbutton. This context menu option is active if Torbutton can +read the environment variables $TOR_CONTROL_PASSWD and $TOR_CONTROL_PORT. + +

Implementation Status:

+
-Additionally, the user is allowed to "protect" cookies of their choosing from -deletion during New Identity by using the Torbutton Cookie Protections UI to -protect the cookies they would like to keep across New Identity invocations. -

3.8. Click-to-play for plugins and invasive content

-Some content types are too invasive and/or too opaque for us to properly -eliminate their linkability properties. For these content types, we use -NoScript to provide click-to-play placeholders that do not activate the -content until the user clicks on it. This will eliminate the ability for an -adversary to use such content types to link users in a dragnet fashion across -arbitrary sites. -

-Currently, the content types isolated in this way include Flash, WebGL, and -audio and video objects. -

3.9. Description of Firefox Patches

-The set of patches we have against Firefox can be found in the current-patches directory of the torbrowser git repository. They are: -

  1. Block Components.interfaces and Components.lookupMethod -

    +

    -In order to reduce fingerprinting, we block access to these two interfaces -from content script. Components.lookupMethod can undo our Javascript -hooks, -and Components.interfaces can be used for fingerprinting the platform, OS, and -Firebox version, but not much else. +After closing all tabs, we then clear the following state: searchbox and +findbox text, HTTP auth, SSL state, OCSP state, site-specific content +preferences (including HSTS state), content and image cache, Cookies, DOM +storage, safe browsing key, and the Google wifi geolocation token (if it +exists). -

  2. Make Permissions Manager memory only -

    +

    + +After the state is cleared, we then close all remaining HTTP keep-alive +connections and then send the NEWNYM signal to the Tor control port to cause a +new circuit to be created. +
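In control-protocol terms, this step is a short exchange along the following lines (authentication and quoting simplified; the password is the value of $TOR_CONTROL_PASSWD):

    AUTHENTICATE "control-port-password"
    250 OK
    SIGNAL NEWNYM
    250 OK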

    +Finally, a fresh browser window is opened, and the current browser window is +closed. +

+If the user chose to "protect" any cookies by using the Torbutton Cookie +Protections UI, those cookies are not cleared as part of the above. +

+ +The set of patches we have against Firefox can be found in the current-patches directory of the torbrowser git repository. They are: + +

  1. Block +Components.interfaces

      + +In order to reduce fingerprinting, we block access to this interface from +content script. Components.interfaces can be used for fingerprinting the +platform, OS, and Firefox version, but not much else. + +
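To illustrate what is being blocked (a sketch of the kind of probe involved, not a complete fingerprinting script), pre-patch content script could simply test which interface names exist:

    // Which interfaces are present varies with the OS and the Firefox release,
    // so mere enumeration is a fingerprint.
    var onMac = ("nsILocalFileMac" in Components.interfaces); // OS X builds only
    var count = 0;
    for (var name in Components.interfaces) count++;          // varies per release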

  2. Make +Permissions Manager memory only

    This patch exposes a pref 'permissions.memory_only' that properly isolates the permissions manager to memory, which is responsible for all user specified -site permissions, as well as stored HSTS +site permissions, as well as stored HSTS policy from visited sites. The pref does successfully clear the permissions manager memory if toggled. It does not need to be set in prefs.js, and can be handled by Torbutton. -
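A minimal sketch of how an extension such as Torbutton can flip this pref at runtime from privileged code (the interesting part is when it is toggled, not the call itself):

    // Toggling the patch-provided pref isolates the permissions manager to
    // memory and clears any permissions it has already accumulated.
    var prefs = Components.classes["@mozilla.org/preferences-service;1"]
                          .getService(Components.interfaces.nsIPrefBranch);
    prefs.setBoolPref("permissions.memory_only", true);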

  3. Make Intermediate Cert Store memory-only -

    +

  4. Make +Intermediate Cert Store memory-only

      The intermediate certificate store records the intermediate SSL certificates the browser has seen to date. Because these intermediate certificates are used @@ -1037,153 +1164,257 @@ As an additional design goal, we would like to later alter this patch to allow this information to be cleared from memory. The implementation does not currently allow this. -

  5. Add HTTP auth headers before on-modify-request fires -

    +

  6. Add +a string-based cacheKey property for domain isolation

    -This patch provides a trivial modification to allow us to properly remove HTTP -auth for third parties. This patch allows us to defend against an adversary -attempting to use HTTP -auth to silently track users between domains. - -

  7. Add a string-based cacheKey property for domain isolation -

    - -To increase the -security of cache isolation and to solve strange and -unknown conflicts with OCSP, we had to patch -Firefox to provide a cacheDomain cache attribute. We use the url bar +To increase the +security of cache isolation and to solve strange and +unknown conflicts with OCSP, we had to patch +Firefox to provide a cacheDomain cache attribute. We use the url bar FQDN as input to this field. -

  8. Randomize HTTP pipeline order and depth -

    -As an -experimental -defense against Website Traffic Fingerprinting, we patch the standard -HTTP pipelining code to randomize the number of requests in a -pipeline, as well as their order. -

  9. Block all plugins except flash -

    -We cannot use the +

  10. Block +all plugins except flash

    +We cannot use the @mozilla.org/extensions/blocklist;1 service, because we actually want to stop plugins from ever entering the browser's process space and/or executing code (for example, AV plugins that collect statistics/analyze -URLs, magical toolbars that phone home or "help" the user, skype buttons that +URLs, magical toolbars that phone home or "help" the user, Skype buttons that ruin our day, and censorship filters). Hence we rolled our own. -

  11. Make content-prefs service memory only -

    -This patch prevents random URLs from being inserted into content-prefs.sqllite in +

  12. Make content-prefs service memory only

    +This patch prevents random URLs from being inserted into content-prefs.sqlite in the profile directory as content prefs change (includes site-zoom and perhaps other site prefs?). -

  13. Make Tor Browser exit when not launched from Vidalia -

    +

  14. Make Tor Browser exit when not launched from Vidalia

    It turns out that on Windows 7 and later systems, the Taskbar attempts to automatically learn the most frequent apps used by the user, and it recognizes -Tor Browser as a seperate app from Vidalia. This can cause users to try to -launch Tor Brower without Vidalia or a Tor instance running. Worse, the Tor +Tor Browser as a separate app from Vidalia. This can cause users to try to +launch Tor Browser without Vidalia or a Tor instance running. Worse, the Tor Browser will automatically find their default Firefox profile, and properly connect directly without using Tor. This patch is a simple hack to cause Tor Browser to immediately exit in this case. -

  15. Disable SSL Session ID tracking -

    +

  16. Disable SSL Session ID tracking

    This patch is a simple 1-line hack to prevent SSL connections from caching (and then later transmitting) their Session IDs. There was no preference to govern this behavior, so we had to hack it by altering the SSL new connection defaults. -

  17. Provide an observer event to close persistent connections -

    +

  18. Provide an observer event to close persistent connections

    This patch creates an observer event in the HTTP connection manager to close all keep-alive connections that still happen to be open. This event is emitted -by the New Identity button. +by the New Identity button. -
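From privileged code, emitting the event is a single notification; the topic string below is an assumption for illustration, as the patch defines the actual name:

    // Ask the HTTP connection manager to drop any open keep-alive connections.
    var obs = Components.classes["@mozilla.org/observer-service;1"]
                        .getService(Components.interfaces.nsIObserverService);
    obs.notifyObservers(null, "net:prune-all-connections", null);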

4. Packaging

4.1. Build Process Security

4.2. External Addons

Included Addons

Excluded Addons

Dangerous Addons

4.3. Pref Changes

4.4. Update Security

5. Testing

+

  • Limit Device and System Specific Media Queries

    -The purpose of this section is to cover all the known ways that Tor browser -security can be subverted from a penetration testing perspective. The hope -is that it will be useful both for creating a "Tor Safety Check" -page, and for developing novel tests and actively attacking Torbutton with the -goal of finding vulnerabilities in either it or the Mozilla components, -interfaces and settings upon which it relies. +CSS +Media Queries have a fingerprinting capability approaching that of +Javascript. This patch causes such Media Queries to evaluate as if the device +resolution was equal to the content window resolution. -
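For example, a script can walk a set of device-specific queries without ever touching window.screen; with this patch applied, a probe such as the one below reflects only the content window:

    // Illustrative probe: true on a stock browser with a wide physical display,
    // but keyed to the content window size once this patch is applied.
    var wide = window.matchMedia("(min-device-width: 1900px)").matches;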

    5.1. Single state testing

    +

  • Limit the number of fonts per document

    -Torbutton is a complicated piece of software. During development, changes to -one component can affect a whole slough of unrelated features. A number of -aggregated test suites exist that can be used to test for regressions in -Torbutton and to help aid in the development of Torbutton-like addons and -other privacy modifications of other browsers. Some of these test suites exist -as a single automated page, while others are a series of pages you must visit -individually. They are provided here for reference and future regression -testing, and also in the hope that some brave soul will one day decide to -combine them into a comprehensive automated test suite. +Font availability can be queried by +CSS and Javascript and is a fingerprinting vector. This patch limits +the number of times CSS and Javascript can cause font-family rules to +evaluate. Remote @font-face fonts are exempt from the limits imposed by this +patch, and remote fonts are given priority over local fonts whenever both +appear in the same font-family rule. -
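A rough sketch of the Javascript side of the probe being rate-limited (each call forces a font-family rule to evaluate, which is what the patch counts):

    // Compare the rendered width of a string in a candidate font against a
    // generic fallback; a difference means the candidate font is installed.
    function hasFont(name) {
      var probe = document.createElement("span");
      probe.textContent = "mmmmmmmmmmlli";
      probe.style.fontFamily = "'" + name + "', monospace";
      document.body.appendChild(probe);
      var withCandidate = probe.offsetWidth;
      probe.style.fontFamily = "monospace";
      var fallback = probe.offsetWidth;
      document.body.removeChild(probe);
      return withCandidate !== fallback;
    }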

    1. Decloak.net

      +

    2. Rebrand Firefox to Tor Browser

      -Decloak.net is the canonical source of plugin and external-application based -proxy-bypass exploits. It is a fully automated test suite maintained by HD Moore as a service for people to -use to test their anonymity systems. +This patch updates our branding in compliance with Mozilla's trademark policy. -

    3. Deanonymizer.com

      +

    4. Make Download Manager Memory Only

      -Deanonymizer.com is another automated test suite that tests for proxy bypass -and other information disclosure vulnerabilities. It is maintained by Kyle -Williams, the author of JanusVM -and JanusPA. +This patch prevents disk leaks from the download manager. The original +behavior is to write the download history to disk and then delete it, even if +you disable download history from your Firefox preferences. -

    5. JonDos -AnonTest

      +

    6. Add DDG and StartPage to Omnibox

      -The JonDos people also provide an -anonymity tester. It is more focused on HTTP headers and behaviors than plugin bypass, and -points out a couple of headers Torbutton could do a better job with -obfuscating. +This patch adds DuckDuckGo and StartPage to the Search Box, and sets our +default search engine to StartPage. We deployed this patch due to excessive +Captchas and complete 403 bans from Google. -

    7. Browserspy.dk

      +

    8. Make nsICacheService.EvictEntries() Synchronous

      -Browserspy.dk provides a tremendous collection of browser fingerprinting and -general privacy tests. Unfortunately they are only available one page at a -time, and there is not really solid feedback on good vs bad behavior in -the test results. +This patch eliminates a race condition with "New Identity". Without it, +cache-based Evercookies survive for up to a minute after clearing the cache +on some platforms. -

    9. Privacy -Analyzer

      +

    10. Prevent WebSockets DNS Leak

      -The Privacy Analyzer provides a dump of all sorts of browser attributes and -settings that it detects, including some information on your original IP -address. Its page layout and lack of good vs bad test result feedback makes it -not as useful as a user-facing testing tool, but it does provide some -interesting checks in a single page. +This patch prevents a DNS leak when using WebSockets. It also prevents other +similar types of DNS leaks. -

    11. Mr. T

      +

    12. Randomize HTTP pipeline order and depth

      +As an +experimental +defense against Website Traffic Fingerprinting, we patch the standard +HTTP pipelining code to randomize the number of requests in a +pipeline, as well as their order. +

    13. Adapt Steve Michaud's Mac crashfix patch

      -Mr. T is a collection of browser fingerprinting and deanonymization exploits -discovered by the ha.ckers.org crew -and others. It is also not as user friendly as some of the above tests, but it -is a useful collection. +This patch allows us to block Drag and Drop without causing crashes on Mac OS. +We need to block Drag and Drop because Mac OS and Ubuntu both immediately load +any URLs they find in your drag buffer before you even drop them (without +using your browser's proxy settings, of course). -

    14. Gregory Fleischer's Torbutton and -Defcon -17 Test Cases -

      +

    15. Add mozIThirdPartyUtil.getFirstPartyURI() API

      -Gregory Fleischer has been hacking and testing Firefox and Torbutton privacy -issues for the past 2 years. He has an excellent collection of all his test -cases that can be used for regression testing. In his Defcon work, he -demonstrates ways to infer Firefox version based on arcane browser properties. -We are still trying to determine the best way to address some of those test -cases. +This patch provides an API that allows us to more easily isolate identifiers +to the URL bar domain. -

    16. Xenobite's -TorCheck Page

      +

    17. Add canvas image extraction prompt

      -This page checks to ensure you are using a valid Tor exit node and checks for -some basic browser properties related to privacy. It is not very fine-grained -or complete, but it is automated and could be turned into something useful -with a bit of work. +This patch prompts the user before returning canvas image data. Canvas image +data can be used to create an extremely stable, high-entropy fingerprint based +on the unique rendering behavior of video cards, OpenGL behavior, +system fonts, and supporting library versions. -
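The extraction path that now triggers the prompt is the ordinary canvas readback, sketched below:

    // Render some text, then read the pixels back; the resulting string varies
    // with the video card, system fonts, and rendering libraries.
    var canvas = document.createElement("canvas");
    var ctx = canvas.getContext("2d");
    ctx.font = "18px Arial";
    ctx.fillText("canvas fingerprint", 4, 20);
    var fingerprint = canvas.toDataURL();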

    -

  • +

  • Return client window coordinates for mouse events

    + +This patch causes mouse events to return coordinates relative to the content +window instead of the desktop. + +

  • Do not expose physical screen info to window.screen

      + +This patch causes window.screen to report the dimensions of the content window +rather than those of the physical desktop. + +
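Concretely, with this patch the two pairs below agree, whereas on a stock browser the first pair exposes the physical desktop resolution:

    var reported = [window.screen.width, window.screen.height];
    var actual   = [window.innerWidth, window.innerHeight]; // content window size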

  • Do not expose system colors to CSS or canvas

    + +This patch prevents CSS and Javascript from discovering your desktop color +scheme and/or theme. + +
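One probe this closes off is reading back a CSS system color keyword, for example:

    // Without the patch, the computed value reveals the user's desktop theme.
    var probe = document.createElement("span");
    probe.style.color = "Highlight"; // CSS2 system color keyword
    document.body.appendChild(probe);
    var themeColor = getComputedStyle(probe, null).color;
    document.body.removeChild(probe);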

  • Isolate the Image Cache per url bar domain

    + +This patch prevents cached images from being used to store third party tracking +identifiers. + +

  • nsIHTTPChannel.redirectTo() API

    + +This patch provides HTTPS-Everywhere with an API to perform redirections more +securely and without addon conflicts. + +

  • Isolate DOM Storage to first party URI

    + +This patch prevents DOM Storage from being used to store third party tracking +identifiers. + +

  • A. Towards Transparency in Navigation Tracking

    + +The privacy properties of Tor Browser are based +upon the assumption that link-click navigation indicates user consent to +tracking between the linking site and the destination site. While this +definition is sufficient to allow us to eliminate cross-site third party +tracking with only minimal site breakage, it is our long-term goal to further +reduce cross-origin click navigation tracking to mechanisms that are +detectable by attentive users, so they can alert the general public if +cross-origin click navigation tracking is happening where it should not be. + +

      + +In an ideal world, the mechanisms of tracking that can be employed during a +link click would be limited to the contents of URL parameters and other +properties that are fully visible to the user before they click. However, the +entrenched nature of certain archaic web features makes it impossible for us to +achieve this transparency goal by ourselves without substantial site breakage. +So, instead we maintain a Deprecation +Wishlist of archaic web technologies that are currently being (ab)used +to facilitate federated login and other legitimate click-driven cross-domain +activity but that can one day be replaced with more privacy-friendly, +auditable alternatives. + +

    + +Because the total elimination of side channels during cross-origin navigation +will undoubtedly break federated login as well as destroy ad revenue, we +also describe auditable alternatives and promising web draft standards that would +preserve this functionality while still providing transparency when tracking is +occurring. + +

    1. The Referer Header +

      + +We haven't disabled or restricted the referer ourselves because of the +non-trivial number of sites that rely on the referer header to "authenticate" +image requests and deep-link navigation on their sites. Furthermore, there +seems to be no real privacy benefit to taking this action by itself in a +vacuum, because many sites have begun encoding referer URL information into +GET parameters when they need it to cross http-to-https scheme transitions. +Google's +1 buttons are the best example of this activity. + +

      + +Because of the availability of these other explicit vectors, we believe the +main risk of the referer header is through inadvertent and/or covert data +leakage. In fact, a great deal of +personal data is inadvertently leaked to third parties through the +source URL parameters. + +

      + +We believe the Referer header should be made explicit. If a site wishes to +transmit its URL to third party content elements during load or during +link-click, it should have to specify this as a property of the associated +HTML tag. With an explicit property, it would then be possible for the user +agent to inform the user if they are about to click on a link that will +transmit referer information (perhaps through something as subtle as a +different color for the destination URL). This same UI notification can also +be used for links with the "ping" +attribute. + +

    2. window.name +

      +window.name is +a DOM property that for some reason is allowed to retain a persistent value +for the lifespan of a browser tab. It is possible to utilize this property for +identifier +storage during click navigation. This is sometimes used for additional +XSRF protection and federated login. +
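A minimal illustration of the side channel (the site names are hypothetical): the linking page stores an identifier that survives the navigation to a different origin, where it can simply be read back.

    // On tracker.example, before the click-through:
    window.name = "visitor-8c1f9a2e";
    location.href = "https://destination.example/";

    // On destination.example, after the load:
    var carriedIdentifier = window.name; // still "visitor-8c1f9a2e"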

      + +It's our opinion that the contents of window.name should not be preserved for +cross-origin navigation, but doing so may break federated login for some sites. + +

    3. Javascript link rewriting +

      + +In general, it should not be possible for onclick handlers to alter the +navigation destination of 'a' tags, silently transform them into POST +requests, or otherwise create situations where a user believes they are +clicking on a link leading to one URL that ends up on another. This +functionality is deceptive and is frequently a vector for malware and phishing +attacks. Unfortunately, many legitimate sites also employ such transparent +link rewriting, and blanket disabling this functionality ourselves will simply +cause Tor Browser to fail to navigate properly on these sites. + +
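A sketch of the pattern in question (the tracker URL is hypothetical): the link the user sees and the navigation that actually occurs differ.

    // The status bar shows link.href, but the click is silently routed
    // through a tracking redirector first.
    var link = document.querySelector("a");
    link.addEventListener("click", function (e) {
      e.preventDefault();
      location.href = "https://tracker.example/redirect?to=" +
                      encodeURIComponent(link.href);
    }, false);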

      + +Automated cross-origin redirects are one form of this behavior that is +possible for us to address +ourselves, as they are comparatively rare and can be handled with site +permissions. + +

    1. Web-Send Introducer

      + +Web-Send is a browser-based link sharing and federated login widget that is +designed to operate without relying on third-party tracking or abusing other +cross-origin link-click side channels. It has a compelling list of privacy and security features, +especially if used as a "Like button" replacement. + +

    2. Mozilla Persona

      + +Mozilla's Persona is designed to provide decentralized, cryptographically +authenticated federated login in a way that does not expose the user to third +party tracking or require browser redirects or side channels. While it does +not directly provide the link sharing capabilities that Web-Send does, it is a +better solution to the privacy issues associated with federated login than +Web-Send is. + +