
The Design and Implementation of the Tor Browser [DRAFT]

Steven Murdoch

Sep 29 2011


Table of Contents

1. Introduction
1.1. Adversary Model
2. Design Requirements and Philosophy
2.1. Security Requirements
2.2. Privacy Requirements
2.3. Philosophy
3. Implementation
3.1. Proxy Obedience
3.2. State Separation
3.3. Disk Avoidance
3.4. Application Data Isolation
3.5. Cross-Domain Identifier Unlinkability
3.6. Cross-Domain Fingerprinting Unlinkability
3.7. Long-Term Unlinkability via "New Identity" button
3.8. Click-to-play for plugins and invasive content
3.9. Description of Firefox Patches
4. Packaging
4.1. Build Process Security
4.2. External Addons
4.3. Pref Changes
4.4. Update Security
5. Testing
5.1. Single state testing

1. Introduction

This document describes the adversary model, design requirements, implementation, packaging and testing procedures of the Tor Browser. It is current as of Tor Browser 2.2.32-4.

This document is also meant to serve as a set of design requirements and to describe a reference implementation of a Private Browsing Mode that defends against both local and network adversaries.

1.1. Adversary Model

A Tor web browser adversary has a number of goals, capabilities, and attack types that can be used to guide us towards a set of requirements for the Tor Browser. Let's start with the goals.

Adversary Goals

  1. Bypassing proxy settings

    The adversary's primary goal is direct compromise and bypass of Tor, causing the user to directly connect to an IP of the adversary's choosing.

  2. Correlation of Tor vs Non-Tor Activity

    If direct proxy bypass is not possible, the adversary will likely happily settle for the ability to correlate something a user did via Tor with their non-Tor activity. This can be done with cookies, cache identifiers, Javascript events, and even CSS. Sometimes the fact that a user uses Tor may be enough for some authorities.

  3. History disclosure

    The adversary may also be interested in history disclosure: the ability to query a user's history to see if they have issued certain censored search queries, or visited censored sites.

  4. Location information

    Location information such as timezone and locality can be useful for the adversary to determine if a user is in fact originating from one of the regions they are attempting to control, or to zero in on the geographical location of a particular dissident or whistleblower.

  5. Miscellaneous anonymity set reduction

    Anonymity set reduction is also useful in attempting to zero in on a particular individual. If the dissident or whistleblower is using a rare build of Firefox for an obscure operating system, this can be very useful information for tracking them down, or at least tracking their activities.

  6. History records and other on-disk information

    In some cases, the adversary may opt for a heavy-handed approach, such as seizing the computers of all Tor users in an area (especially after narrowing the field by the above two pieces of information). History records and cache data are the primary goals here.

Adversary Capabilities - Positioning

The adversary can position themselves at a number of different locations in order to execute their attacks.

  1. Exit Node or Upstream Router

    The adversary can run exit nodes, or alternatively, they may control routers upstream of exit nodes. Both of these scenarios have been observed in the wild.

  2. Adservers and/or Malicious Websites

    The adversary can also run websites, or more likely, they can contract out ad space from a number of different adservers and inject content that way. For some users, the adversary may be the adservers themselves. It is not inconceivable that adservers may try to subvert or reduce a user's anonymity through Tor for marketing purposes.

  3. Local Network/ISP/Upstream Router

    The adversary can also inject malicious content at the user's upstream router when they have Tor disabled, in an attempt to correlate their Tor and Non-Tor activity.

  4. Physical Access

    Some users face adversaries with intermittent or constant physical access. Users in Internet cafes, for example, face such a threat. In addition, in countries where simply using tools like Tor is illegal, users may face confiscation of their computer equipment for excessive Tor usage or just general suspicion.

Adversary Capabilities - Attacks

The adversary can perform the following attacks from a number of different positions to accomplish various aspects of their goals. It should be noted that many of these attacks (especially those involving IP address leakage) are often performed by accident by websites that simply have Javascript, dynamic CSS elements, and plugins. Others are performed by adservers seeking to correlate users' activity across different IP addresses, and still others are performed by malicious agents on the Tor network and at national firewalls.

  1. Inserting Javascript

    If not properly disabled, Javascript event handlers and timers can cause the browser to perform network activity after Tor has been disabled, thus allowing the adversary to correlate Tor and Non-Tor activity and reveal a user's non-Tor IP address. Javascript also allows the adversary to execute history disclosure attacks: to query the history via the different attributes of 'visited' links to search for particular Google queries, sites, or even to profile users based on gender and other classifications. Finally, Javascript can be used to query the user's timezone via the Date() object, and to reduce the anonymity set by querying the navigator object for operating system, CPU, locale, and user agent information.

  2. Inserting Plugins

    Plugins are abysmal at obeying the proxy settings of the browser. Every plugin capable of performing network activity that the author has investigated is also capable of performing network activity independent of browser proxy settings - and often independent of its own proxy settings. Sites that have plugin content don't even have to be malicious to obtain a user's Non-Tor IP (it usually leaks by itself), though plenty of active exploits are possible as well. In addition, plugins can be used to store unique identifiers that are more difficult to clear than standard cookies. Flash-based cookies fall into this category, but there are likely numerous other examples.

  3. Inserting CSS

    CSS can also be used to correlate Tor and Non-Tor activity and reveal a user's Non-Tor IP address, via the usage of CSS popups - essentially CSS-based event handlers that fetch content via CSS's :hover pseudo-class. If these popups are allowed to perform network activity in a different Tor state than they were loaded in, they can easily correlate Tor and Non-Tor activity and reveal a user's IP address. In addition, CSS can also be used without Javascript to perform CSS-only history disclosure attacks.

  4. Read and insert cookies

    An adversary in a position to perform MITM content alteration can inject document content elements to both read and inject cookies for arbitrary domains. In fact, many "SSL secured" websites are vulnerable to this sort of active sidejacking. In addition, the ad networks of course perform tracking with cookies as well.

  5. Create arbitrary cached content

    Likewise, the browser cache can also be used to store unique identifiers. Since by default the cache has no same-origin policy, these identifiers can be read by any domain, making them an ideal target for ad network-class adversaries.

  6. Fingerprint users based on browser attributes

    There is an absurd amount of information available to websites via attributes of the browser. This information can be used to reduce the anonymity set, or even uniquely fingerprint individual users.

    The Panopticlick study done by the EFF attempts to measure the actual entropy - the number of identifying bits of information encoded in browser properties. Their result data is definitely useful, and the metric is probably the appropriate one for determining how identifying a particular browser property is. However, some quirks of their study mean that they do not extract as much information as they could from display information: they only use desktop resolution (which Torbutton reports as the window resolution) and do not attempt to infer the size of toolbars.

  7. Remotely or locally exploit browser and/or OS

    Last, but definitely not least, the adversary can exploit either general browser vulnerabilities, plugin vulnerabilities, or OS vulnerabilities to install malware and surveillance software. An adversary with physical access can perform similar actions. Regrettably, this last attack capability is outside of our ability to defend against, but it is worth mentioning for completeness. The Tails system however can provide some limited defenses against this adversary.

2. Design Requirements and Philosophy

The Tor Browser Design Requirements are meant to describe the properties of a Private Browsing Mode that defends against both network and local adversaries.

There are two main categories of requirements: Security Requirements, and Privacy Requirements. Security Requirements are the minimum properties in order for a web client platform to be able to support Tor. Privacy requirements are the set of properties that cause us to prefer one platform over another.

We will maintain an alternate distribution of the web client in order to maintain and/or restore privacy properties to our users.

2.1. Security Requirements

The security requirements are primarily concerned with ensuring the safe use of Tor. Violations of these properties typically result in serious risk for the user in terms of immediate deanonymization and/or observability.

  1. Proxy Obedience

    The browser MUST NOT bypass Tor proxy settings for any content.

  2. State Separation

    The browser MUST NOT provide any stored state to the content window from other browsers or other browsing modes, including shared state from plugins, machine identifiers, and TLS session state.

  3. Disk Avoidance

    The browser SHOULD NOT write any browsing history information to disk, or store it in memory beyond the duration of one Tor session, unless the user has explicitly opted to store their browsing history information to disk.

  4. Application Data Isolation

    The browser MUST NOT write or cause the operating system to write any information to disk outside of the application directory. All exceptions and shortcomings due to operating system behavior MUST BE documented.

  5. Update Safety

    The browser SHOULD NOT perform unsafe updates or upgrades.

2.2. Privacy Requirements

The privacy requirements are primarily concerned with reducing linkability: the ability for a user's activity on one site to be linked with their activity on another site without their knowledge or explicit consent.

  1. Cross-Domain Identifier Unlinkability

    User activity on one url bar domain MUST NOT be linkable to their activity in any other domain by any third party. This property specifically applies to linkability from stored browser identifiers, authentication tokens, and shared state. This functionality SHOULD NOT interfere with federated login in a substantial way.

  2. Cross-Domain Fingerprinting Unlinkability

    User activity on one url bar domain MUST NOT be linkable to their activity in any other domain by any third party. This property specifically applies to linkability from fingerprinting browser behavior.

  3. Long-Term Unlinkability

    The browser SHOULD provide an obvious, easy way for the user to remove all of their authentication tokens and browser state and obtain a fresh identity. Additionally, this should happen by default automatically upon browser restart.

2.3. Philosophy

In addition to the above design requirements, the technology decisions about Tor Browser are also guided by some philosophical positions about technology.

  1. Preserve existing user model

    The existing way that the user expects to use a browser must be preserved. If the user has to maintain a different mental model of how the sites they are using behave depending on tab, browser state, or anything else that would not normally be what they experience in their default browser, the user will inevitably be confused. They will make mistakes and reduce their privacy as a result. Worse, they may just stop using the browser, assuming it is broken.

    User model breakage was one of the failures of Torbutton: even if users managed to install everything properly, the toggle model was too hard for the average user to understand, especially in the face of accumulating tabs from multiple states crossed with the current tor-state of the browser.

  2. Favor the implementation mechanism least likely to break sites

    In general, we try to find solutions to privacy issues that will not induce site breakage, though this is not always possible.

  3. Plugins must be restricted

    Even if plugins always properly used the browser proxy settings (which none of them do) and could not be induced to bypass them (which all of them can), the activities of closed-source plugins are very difficult to audit and control. They can obtain and transmit all manner of system information to websites, often have their own identifier storage for tracking users, and also contribute to fingerprinting.

    Therefore, if plugins are to be enabled in private browsing modes, they must be restricted from running automatically on every page (via click-to-play placeholders), and/or be sandboxed to restrict the types of system calls they can execute. If the user decides to craft an exemption to allow a plugin to be used, it MUST ONLY apply to the top level urlbar domain, and not to all sites, to reduce linkability.

  4. Minimize Global Privacy Options

    Another failure of Torbutton was (and still is) the options panel. Each option that detectably alters browser behavior can be used as a fingerprinting tool. Similarly, all extensions should be disabled in the mode except on an opt-in basis. We should not load system-wide addons or plugins.

    Instead of global browser privacy options, privacy decisions should be made per top-level url-bar domain to eliminate the possibility of linkability between domains. For example, when a plugin object (or a Javascript access of window.plugins) is present in a page, the user should be given the choice of allowing that plugin object for that top-level url-bar domain only. The same goes for exemptions to third party cookie policy, geo-location, and any other privacy permissions.

    If the user has indicated they do not care about local history storage, these permissions can be written to disk. Otherwise, they should remain memory-only.

  5. No filters

    Filter-based addons such as AdBlock Plus, Request Policy, Priv3, and Sharemenot are to be avoided. We believe that these addons do not add any real privacy to a proper implementation of the above privacy requirements, as all third parties are prevented from tracking users between sites by the implementation. Filter-based addons can also introduce strange breakage and cause usability nightmares, and will also fail to do their job if an adversary simply registers a new domain or creates a new url path. Worse still, the unique filter sets that each user is liable to create/install likely provide a wealth of fingerprinting targets.

    As a general matter, we are also opposed to shipping an always-on ad blocker with Tor Browser. We feel that this would damage our credibility in terms of demonstrating that we are providing privacy through a sound design alone, as well as damage the acceptance of Tor users by sites who support themselves through advertising revenue.

    Users are free to install these addons if they wish, but doing so is not recommended, as it will alter the browser request fingerprint.

  6. Stay Current

    We believe that if we do not stay current with the support of new web technologies, we cannot hope to substantially influence or be involved in their proper deployment or privacy realization. However, we will likely disable certain new features (where possible) pending analysis and audit.

3. Implementation


3.1. Proxy Obedience

Proxy obedience is assured through the following:

  1. Firefox Proxy settings

    The Torbutton xpi sets the Firefox proxy settings to use Tor directly as a SOCKS proxy. It sets network.proxy.socks_remote_dns, network.proxy.socks_version, and network.proxy.socks_port.
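As a sketch, the pref block might look like the following prefs.js fragment. Only the three prefs named above are confirmed by this document; network.proxy.type, the host, and the port value (9050 is the stock Tor SOCKS port) are illustrative assumptions:

```javascript
// Hypothetical prefs.js fragment -- values illustrative, not authoritative.
user_pref("network.proxy.type", 1);                // manual proxy config (assumed)
user_pref("network.proxy.socks", "127.0.0.1");     // local Tor client (assumed)
user_pref("network.proxy.socks_port", 9050);       // stock Tor SOCKS port (assumed)
user_pref("network.proxy.socks_version", 5);       // SOCKS5
user_pref("network.proxy.socks_remote_dns", true); // resolve DNS through Tor
```

The socks_remote_dns pref is the critical one for proxy obedience: without it, hostname resolution happens outside the SOCKS tunnel and leaks the destination to the local resolver.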

  2. Disabling plugins

    Plugins have the ability to make arbitrary OS system calls. This includes the ability to make UDP sockets and send arbitrary data independent of the browser proxy settings.

    Torbutton disables plugins by using the @mozilla.org/plugin/host;1 service to mark the plugin tags as disabled. Additionally, we set plugin.disable_full_page_plugin_for_types to the list of supported mime types for all currently installed plugins.

    In addition, to prevent any unproxied activity by plugins at load time, we also patch the Firefox source code to prevent the load of any plugins except for Flash and Gnash.

  3. External App Blocking

    External apps, if launched automatically, can be induced to load files that perform network activity. In order to prevent this, Torbutton installs a component to provide the user with a popup whenever the browser attempts to launch a helper app.

3.2. State Separation

Tor Browser state is separated from existing browser state through use of a custom Firefox profile. Furthermore, plugins are disabled, which prevents Flash cookies from leaking from a pre-existing Flash directory.

3.3. Disk Avoidance

Design Goal:

Tor Browser should optionally prevent all disk records of browser activity. The user should be able to optionally enable URL history and other history features if they so desire. Once we simplify the preferences interface, we will likely just enable Private Browsing mode by default to handle this goal.

Implementation Status:

For now, Tor Browser blocks write access to the disk through Torbutton using several Firefox preferences. The set of prefs is: dom.storage.enabled, browser.cache.memory.enable, network.http.use-cache, browser.cache.disk.enable, browser.cache.offline.enable, general.open_location.last_url, places.history.enabled, browser.formfill.enable, signon.rememberSignons, browser.download.manager.retention, and network.cookie.lifetimePolicy.
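A hedged sketch of what a disk-avoidance subset of these prefs could look like; the values shown are plausible illustrations, not the exact values Torbutton ships (the authoritative set lives in the Torbutton source):

```javascript
// Illustrative values only -- check the Torbutton source for the shipped set.
user_pref("browser.cache.disk.enable", false);    // no disk cache
user_pref("browser.cache.offline.enable", false); // no offline app cache
user_pref("browser.cache.memory.enable", true);   // memory-only cache is acceptable
user_pref("dom.storage.enabled", false);          // no DOM storage
user_pref("places.history.enabled", false);       // no URL history
user_pref("browser.formfill.enable", false);      // no form-fill history
user_pref("signon.rememberSignons", false);       // no saved passwords
user_pref("network.cookie.lifetimePolicy", 2);    // session-only cookies
```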

In addition, three Firefox patches are needed to prevent disk writes, even if Private Browsing Mode is enabled. We need to prevent the permissions manager from recording HTTPS STS state, prevent intermediate SSL certificates from being recorded, and prevent the content preferences service from recording site zoom. For more details on these patches, see the Firefox Patches section.

3.4. Application Data Isolation

Tor Browser Bundle MUST NOT cause any information to be written outside of the bundle directory. This is to ensure that the user is able to completely and safely remove the bundle without leaving other traces of Tor usage on their computer.

XXX: sjmurdoch, Erinn: explain what magic we do to satisfy this, +and/or what additional work or auditing needs to be done. +

3.5. Cross-Domain Identifier Unlinkability

The Tor Browser MUST prevent a user's activity on one site from being linked to their activity on another site. When this goal cannot yet be met with an existing web technology, that technology or functionality is disabled. Our design goal is to ultimately eliminate the need to disable arbitrary technologies, and instead simply alter them in ways that allow them to function in a backwards-compatible way while avoiding linkability. Users should be able to use federated login of various kinds to explicitly inform sites who they are, but that information should not transparently allow a third party to record their activity from site to site without their prior consent.

The benefit of this approach comes not only in the form of reduced linkability, but also in terms of simplified privacy UI. If all stored browser state and permissions become associated with the top-level url-bar domain, the six or seven different pieces of privacy UI governing these identifiers and permissions can become just one piece of UI. For instance, a window that lists the top-level url bar domains for which browser state exists, with the ability to clear and/or block them, possibly with a context-menu option to drill down into specific types of state. An example of this simplification can be seen in Figure 1.

Figure 1. Improving the Privacy UI

Improving the Privacy UI

On the left is the standard Firefox cookie manager. On the right is a mock-up of how isolating identifiers to the URL bar domain might simplify the privacy UI for all data - not just cookies. Both windows represent the set of cookies accumulated after visiting just five sites, but the window on the right has the option of also representing history, DOM Storage, HTTP Auth, search form history, login values, and so on within a context menu for each site.

  1. Cookies

    Design Goal: All cookies should be double-keyed to the top-level domain. There exists a Mozilla bug that contains a prototype patch, but it lacks UI, and does not apply to modern Firefoxes.

    Implementation Status: As a stopgap to satisfy our design requirement of unlinkability, we currently entirely disable 3rd party cookies by setting network.cookie.cookieBehavior to 1. We would prefer that third party content continue to function, but we believe the requirement for unlinkability trumps that desire.
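The double-keying design goal can be sketched as a toy model (a hypothetical class, not Firefox's actual cookie service): the store is keyed on the pair (top-level url bar domain, cookie domain), so a third-party tracker gets a different, unlinkable cookie jar under each first-party site.

```javascript
// Toy model of a double-keyed cookie store. Hypothetical names throughout.
class DoubleKeyedCookieJar {
  constructor() { this.jar = new Map(); }
  key(topLevel, cookieDomain) { return topLevel + "|" + cookieDomain; }
  set(topLevel, cookieDomain, name, value) {
    const k = this.key(topLevel, cookieDomain);
    if (!this.jar.has(k)) this.jar.set(k, new Map());
    this.jar.get(k).set(name, value);
  }
  get(topLevel, cookieDomain, name) {
    const k = this.key(topLevel, cookieDomain);
    return this.jar.has(k) ? this.jar.get(k).get(name) : undefined;
  }
}

// A tracker cookie set while embedded in site-a is invisible when the same
// tracker is embedded in site-b:
const jar = new DoubleKeyedCookieJar();
jar.set("site-a.example", "tracker.example", "uid", "12345");
console.log(jar.get("site-a.example", "tracker.example", "uid")); // "12345"
console.log(jar.get("site-b.example", "tracker.example", "uid")); // undefined
```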

  2. Cache

    Cache is isolated to the top-level url bar domain by using a technique pioneered by Colin Jackson et al, via their work on SafeCache. The technique re-uses the nsICachingChannel.cacheKey attribute that Firefox uses internally to prevent improper caching of HTTP POST data.

    However, to increase the security of the isolation and to solve strange and unknown conflicts with OCSP, we had to patch Firefox to provide a cacheDomain cache attribute. We use the full url bar domain as input to this field.

    Furthermore, we chose a different isolation scheme than the Stanford implementation. First, we decoupled the cache isolation from the third party cookie attribute. Second, we use several mechanisms to attempt to determine the actual location attribute of the top-level window (the url bar domain) used to load the page, as opposed to relying solely on the referer property.

    Therefore, the original Stanford test cases are expected to fail. Functionality can still be verified by navigating to about:cache and viewing the key used for each cache entry. Each third party element should have an additional "domain=string" property prepended, which will list the top-level urlbar domain that was used to source the third party element.
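The cacheDomain-style keying can be sketched as follows (a simplified model, not the actual Firefox cache code): the key for every entry is prefixed with the top-level url bar domain, so the same third-party resource is cached separately per first-party site and cannot be used for cross-site probing.

```javascript
// Simplified model of per-first-party cache keying. Hypothetical helper.
function cacheKey(urlbarDomain, resourceUrl) {
  return "domain=" + urlbarDomain + "&" + resourceUrl;
}

const cache = new Map();
cache.set(cacheKey("site-a.example", "https://cdn.example/tag.js"), "entry1");
cache.set(cacheKey("site-b.example", "https://cdn.example/tag.js"), "entry2");
// Same third-party URL, two distinct entries -> no shared cache identifier:
console.log(cache.size); // 2
```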

  3. HTTP Auth

    HTTP authentication tokens are removed for third party elements using the http-on-modify-request observer to remove the Authorization headers to prevent silent linkability between domains. We also needed to patch Firefox to cause the headers to get added early enough to allow the observer to modify them.
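The observer's filtering rule can be sketched outside the browser as a plain function (hypothetical helper names; the real code runs inside a Firefox http-on-modify-request observer and operates on nsIHttpChannel headers): strip Authorization from any request whose host is not part of the top-level url bar domain.

```javascript
// Sketch of the third-party Authorization stripping rule. Hypothetical helpers.
function isThirdParty(urlbarDomain, requestHost) {
  return requestHost !== urlbarDomain &&
         !requestHost.endsWith("." + urlbarDomain);
}

function filterHeaders(urlbarDomain, requestHost, headers) {
  const out = { ...headers };
  if (isThirdParty(urlbarDomain, requestHost)) delete out.Authorization;
  return out;
}

const h = { Authorization: "Basic dXNlcjpwdw==", Accept: "*/*" };
// First-party subdomain keeps its credentials:
console.log(filterHeaders("example.com", "www.example.com", h).Authorization);
// Third-party request has them removed:
console.log(filterHeaders("example.com", "tracker.example.net", h).Authorization); // undefined
```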

  4. DOM Storage

    Design Goal: DOM storage for third party domains MUST BE isolated to the url bar domain, to prevent linkability between sites.

    Implementation Status: Because DOM storage is isolated to the third party domain as opposed to the top-level url bar domain, we entirely disable it as a stopgap to ensure unlinkability.

  5. TLS session resumption and HTTP Keep-Alive

    TLS session resumption and HTTP Keep-Alive must not allow third party origins to track users via either TLS session IDs, or the fact that different requests arrive on the same TCP connection.

    Design Goal: TLS session resumption IDs must be limited to the top-level url bar domain. HTTP Keep-Alive connections from a third party in one top-level domain must not be reused for that same third party in another top-level domain.

    Implementation Status: We plan to disable TLS session resumption, and limit HTTP Keep-Alive duration.

  6. window.name

    window.name is a magical DOM property that for some reason is allowed to retain a persistent value for the lifespan of a browser tab. It is possible to utilize this property for identifier storage.

    In order to eliminate linkability but still allow sites that utilize this property to function, we reset the window.name property of tabs in Torbutton every time we encounter a blank referer. This behavior allows window.name to persist for the duration of a link-driven navigation session, but as soon as the user enters a new URL or navigates between https/http schemes, the property is cleared.
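The policy above can be simulated as a small state machine (a toy model with hypothetical names, not the Torbutton code): keep window.name across link-driven navigations (non-blank referer), clear it whenever the referer is blank, e.g. when the user types a new URL.

```javascript
// Toy simulation of the window.name reset policy. Hypothetical names.
function navigate(tab, url, referer) {
  if (!referer) tab.windowName = ""; // blank referer = fresh navigation session
  tab.url = url;
}

const tab = { url: "", windowName: "" };
navigate(tab, "https://site-a.example/", "");         // typed URL
tab.windowName = "tracking-id-123";                   // site stores an identifier
navigate(tab, "https://site-a.example/page2",
         "https://site-a.example/");                  // link click: value persists
navigate(tab, "https://site-b.example/", "");         // typed URL: value cleared
console.log(tab.windowName); // ""
```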

  7. Exit node usage

    Design Goal: Every distinct navigation session (as defined by a non-blank referer header) MUST exit through a fresh Tor circuit in Tor Browser to prevent exit node observers from linking concurrent browsing activity.

    Implementation Status: The Tor feature that supports this ability only exists in the 0.2.3.x-alpha series. Ticket #3455 is the Torbutton ticket to make use of the new Tor functionality.

3.6. Cross-Domain Fingerprinting Unlinkability

In order to properly address the fingerprinting adversary on a technical level, we need a metric to measure linkability of the various browser properties that extend beyond any stored origin-related state. The Panopticlick Project by the EFF provides us with exactly this metric. The researchers conducted a survey of volunteers who were asked to visit an experiment page that harvested many of the above components. They then computed the Shannon entropy of the resulting distribution of each of several key attributes to determine how many bits of identifying information each attribute provided.

The study is not exhaustive, though. In particular, the test does not take into account all aspects of resolution information. It did not calculate the size of widgets, window decoration, or toolbar size, which we believe may add high amounts of entropy. It also did not measure clock offset and other time-based fingerprints. Furthermore, as new browser features are added, this experiment should be repeated to include them.

On the other hand, to avoid an infinite sinkhole, we reduce the efforts for fingerprinting resistance by only concerning ourselves with reducing the fingerprintable differences among Tor Browser users. We do not believe it is productive to concern ourselves with cross-browser fingerprinting issues, at least not at this stage.

  1. Plugins

    Plugins add to fingerprinting risk via two main vectors: their mere presence in window.navigator.plugins, as well as their internal functionality.

    Design Goal: All plugins that have not been specifically audited or sandboxed must be disabled. To reduce linkability potential, even sandboxed plugins should not be allowed to load objects until the user has clicked through a click-to-play barrier. Additionally, version information should be reduced or obfuscated until the plugin object is loaded.

    Implementation Status: Currently, we entirely disable all plugins in Tor Browser. However, as a compromise due to the popularity of Flash, we intend to work towards a click-to-play barrier using NoScript that is available only after the user has specifically enabled plugins. Flash will be the only plugin available, and we will ship a settings.sol file to disable Flash cookies, and to restrict P2P features that likely bypass proxy settings.

  2. Fonts

    According to the Panopticlick study, fonts provide the most linkability when they are provided as an enumerable list in filesystem order, via either the Flash or Java plugins. However, it is still possible to use CSS and/or Javascript to query for the existence of specific fonts. With a large enough pre-built list to query, a large amount of fingerprintable information may still be available.

    Design Goal: To address the Javascript issue, we intend to limit the number of fonts an origin can load, gracefully degrading to built-in and/or remote fonts once the limit is reached.

    Implementation Status: Aside from disabling plugins to prevent enumeration, we have not yet implemented any defense against CSS or Javascript font probing.

  3. User Agent and HTTP Headers

    Design Goal: All Tor Browser users should provide websites with an identical user agent and HTTP header set for a given request type. We omit the Firefox minor revision, and report a popular Windows platform. If the software is kept up to date, these headers should remain identical across the population even when updated.

    Implementation Status: Firefox provides several options for controlling the browser user agent string, which we leverage. We also set similar prefs for controlling the Accept-Language and Accept-Charset headers, which we spoof to English by default. Additionally, we remove content script access to Components.interfaces, which can be used to fingerprint OS, platform, and Firefox minor version.
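For illustration, the user agent and language prefs could be set along these lines. The override string here is a made-up example of a Windows Firefox UA, not the string actually shipped, and the exact pref set used for header spoofing is not specified by this document:

```javascript
// Illustrative fragment only -- the shipped values differ.
user_pref("general.useragent.override",
          "Mozilla/5.0 (Windows NT 6.1; rv:5.0) Gecko/20100101 Firefox/5.0");
user_pref("intl.accept_languages", "en-us, en"); // spoof Accept-Language to English
```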

  4. Desktop resolution and CSS Media Queries

    Both CSS and Javascript have access to a lot of information about the screen resolution, usable desktop size, OS widget size, toolbar size, title bar size, and other desktop features that are not at all relevant to rendering and serve only to provide information for fingerprinting.

    Design Goal: Our design goal here is to reduce the resolution information down to the bare minimum required for properly rendering inside a content window. We intend to report all rendering information correctly with respect to the size and properties of the content window, but report an effective size of 0 for all border material, and also report that the desktop is only as big as the inner content window. Additionally, new browser windows are sized such that their content windows are one of ~5 fixed sizes based on the user's desktop resolution.

    Implementation Status: We have implemented the above strategy for Javascript using Torbutton's Javascript hooks as well as a window observer to resize new windows based on desktop resolution. However, CSS Media Queries still need to be dealt with.
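The bucketed-window-size idea can be sketched as a rounding function (the step and cap values below are illustrative, not Torbutton's actual buckets): the content area is rounded down to a coarse grid and capped, so reported window dimensions carry only a few bits of entropy.

```javascript
// Sketch: round a new window's content area to a coarse bucket.
// Step sizes and caps are illustrative assumptions.
function roundedWindowSize(desktopWidth, desktopHeight) {
  const width  = Math.min(Math.floor(desktopWidth  / 200) * 200, 1000);
  const height = Math.min(Math.floor(desktopHeight / 100) * 100, 800);
  return { width, height };
}

console.log(roundedWindowSize(1366, 768));  // { width: 1000, height: 700 }
console.log(roundedWindowSize(1280, 1024)); // { width: 1000, height: 800 }
```

Many distinct desktop resolutions collapse into the same few reported window sizes, which is exactly the anonymity-set effect the design goal describes.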

  5. Timezone and clock offset +

    Design Goal: + +All Tor Browser users should report the same timezone to websites. Currently, +we choose UTC for this purpose, although an equally valid argument could be +made for EDT/EST due to the large English-speaking population density. +Additionally, the Tor software should detect if the users clock is +significantly divergent from the clocks of the relays that it connects to, and +use this to reset the clock values used in Tor Browser to something reasonably +accurate. + +

    Implementation Status: + +We set the timezone using the TZ environment variable, which is supported on +all platforms. Additionally, we plan to obtain a clock +offset from Tor, but this won't be available until Tor 0.2.3.x is in +use. + +
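    The effect of the TZ variable is observable from plain Javascript; a minimal node sketch (this assumes a POSIX platform where the runtime honors TZ, set before any Date is created):

```javascript
// Force the process timezone to UTC before any Date object is created,
// the same effect Tor Browser achieves by exporting TZ at launch.
process.env.TZ = "UTC";

const d = new Date();
// With TZ=UTC the reported offset from UTC is zero minutes.
console.log(d.getTimezoneOffset()); // 0
```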

  6. Javascript performance fingerprinting +

    + +Javascript performance +fingerprinting is the act of profiling the performance +of various Javascript functions for the purpose of fingerprinting the +Javascript engine and the CPU. + +

    Design Goal: + +We have several potential mitigation approaches to reduce the accuracy of performance fingerprinting without risking too much damage to functionality. Our current favorite is to reduce the resolution of the Event.timeStamp and the Javascript Date() object, while also introducing jitter. Our goal is to increase the amount of time it takes to mount a successful attack. Mowery et al. found that even with the default precision in most browsers, they required up to 120 seconds of amortization and repeated trials to get stable results from their feature set. We intend to work with the research community to establish the optimum tradeoff between quantization+jitter and amortization time. + +

    Implementation Status: + +We have no implementation as of yet. + +
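    Nothing is implemented yet, but the quantization-plus-jitter idea can be sketched as follows (the 100 ms quantum and the jitter range are hypothetical placeholders, not decided values):

```javascript
// Hypothetical sketch of clamping timestamp resolution.
// quantum: resolution floor in milliseconds (value is illustrative).
function quantize(timestampMs, quantum = 100) {
  return Math.floor(timestampMs / quantum) * quantum;
}

// Jittered variant: add bounded random noise so repeated trials
// cannot simply average the rounding away.
function quantizeWithJitter(timestampMs, quantum = 100) {
  const jitter = Math.floor(Math.random() * quantum);
  return quantize(timestampMs, quantum) + jitter;
}

console.log(quantize(1234, 100)); // 1200
```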

  7. Keystroke fingerprinting +

    + +Keystroke fingerprinting is the act of measuring key strike time and key +flight time. It is seeing increasing use as a biometric. + +

    Design Goal: + +We intend to rely on the same mechanisms for defeating Javascript performance +fingerprinting: timestamp quantization and jitter. + +

    Implementation Status: +We have no implementation as of yet. +

  8. WebGL +

    + +WebGL is fingerprintable both through information that is exposed about the +underlying driver and optimizations, as well as through performance +fingerprinting. + +

    Design Goal: + +Because of the large amount of potential fingerprinting vectors, we intend to +deploy a similar strategy against WebGL as for plugins. First, WebGL canvases +will have click-to-play placeholders, and will not run until authorized by the +user. Second, we intend to obfuscate driver +information by hooking +getParameter(), +getSupportedExtensions(), +getExtension(), and +getContextAttributes() to provide standard minimal, +driver-neutral information. + +

    Implementation Status: + +Currently we simply disable WebGL. + +
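    Although WebGL is currently disabled outright, the planned getParameter() hooking might be sketched as follows (the mock context and the "neutral" values below are stand-ins, not the actual driver-neutral strings still to be chosen):

```javascript
// Hypothetical sketch: wrap a WebGL-like context so that driver-revealing
// parameters come back as fixed, driver-neutral values.
// NEUTRAL values are placeholders, not a vetted whitelist.
const NEUTRAL = {
  VENDOR: "Mozilla",
  RENDERER: "Mozilla",
  VERSION: "WebGL 1.0",
};

function hookContext(gl) {
  const origGetParameter = gl.getParameter.bind(gl);
  gl.getParameter = function (name) {
    // Return the neutral value when one is defined, otherwise pass through.
    return name in NEUTRAL ? NEUTRAL[name] : origGetParameter(name);
  };
  return gl;
}

// Mock object standing in for a real WebGLRenderingContext.
const mock = {
  getParameter(name) {
    return name === "RENDERER" ? "Acme GPU 9000 driver 1.2.3" : null;
  },
};
hookContext(mock);
console.log(mock.getParameter("RENDERER")); // "Mozilla"
```

    The same wrapping would be applied to getSupportedExtensions(), getExtension(), and getContextAttributes().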

3.7. Long-Term Unlinkability via "New Identity" button

+In order to avoid long-term linkability, we provide a "New Identity" context +menu option in Torbutton. +

Design Goal:

+ +All linkable identifiers and browser state should be cleared by this feature. + +

Implementation Status:

+ First, Torbutton disables all open tabs and windows via nsIContentPolicy blocking, and then closes each tab and window. The extra step of blocking tabs is a precaution to ensure that any asynchronous Javascript is in fact properly disabled. After closing all of the windows, we then clear the following state: OCSP (by toggling security.OCSP.enabled), the cache, site-specific zoom and content preferences, cookies, DOM storage, the safe browsing key, the Google wifi geolocation token (if it exists), HTTP auth, SSL Session IDs, and the last opened URL field (via the pref general.open_location.last_url). After clearing the browser state, we send the NEWNYM signal to the Tor control port to cause a new circuit to be created.

3.8. Click-to-play for plugins and invasive content

+Some content types are too invasive and/or too opaque for us to properly +eliminate their linkability properties. For these content types, we use +NoScript to provide click-to-play placeholders that do not activate the +content until the user clicks on it. This will eliminate the ability for an +adversary to use such content types to link users in a dragnet fashion across +arbitrary sites. +

+Currently, the content types isolated in this way include Flash, WebGL, and +audio and video objects. +

3.9. Description of Firefox Patches

+The set of patches we have against Firefox can be found in the current-patches +directory of the torbrowser git repository. They are: +

  1. Block Components.interfaces and Components.lookupMethod +

    + +In order to reduce fingerprinting, we block access to these two interfaces from content script. Components.lookupMethod can undo our Javascript hooks, and Components.interfaces can be used to fingerprint the platform, OS, and Firefox version, but not much else. + +

  2. Make Permissions Manager memory only +

    + +This patch exposes a pref 'permissions.memory_only' that properly isolates the permissions manager to memory. The permissions manager is responsible for all user-specified site permissions, as well as the HTTPS STS policy stored by visited sites. The pref does successfully clear the permissions manager memory if toggled. It does not need to be set in prefs.js, and can be handled by Torbutton. + +


  3. Make Intermediate Cert Store memory-only +

    + +The intermediate certificate store holds information about SSL certificates that may only be used by a limited number of domains, in some cases effectively recording on disk the fact that a website owned by a certain organization was viewed. + +

    Design Goal: + +As an additional design goal, we would like to later alter this patch to allow this +information to be cleared from memory. The implementation does not currently +allow this. + +

  4. Add HTTP auth headers before on-modify-request fires +

    + +This patch provides a trivial modification to allow us to properly remove HTTP +auth for third parties. This patch allows us to defend against an adversary +attempting to use HTTP +auth to silently track users between domains. + +
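    The policy the patch enables amounts to: keep the Authorization header only when the request's host matches the URL-bar domain. A hypothetical sketch of that check (exact-host matching shown for simplicity; function and parameter names are ours):

```javascript
// Hypothetical sketch: drop HTTP auth credentials on third-party requests.
// "firstPartyHost" is the URL-bar domain; header names are lowercased.
function filterAuthHeader(headers, requestUrl, firstPartyHost) {
  const reqHost = new URL(requestUrl).hostname;
  if (reqHost !== firstPartyHost) {
    const { authorization, ...rest } = headers;
    return rest; // third party: strip the silently-tracking credentials
  }
  return headers;
}

const hdrs = { accept: "*/*", authorization: "Basic Zm9vOmJhcg==" };
// Third-party fetch: authorization is stripped, leaving only accept.
console.log(filterAuthHeader(hdrs, "https://tracker.example/img.png", "site.example"));
```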

  5. Add a string-based cacheKey property for domain isolation +

    + +To increase the +security of cache isolation and to solve strange and +unknown conflicts with OCSP, we had to patch +Firefox to provide a cacheDomain cache attribute. We use the full +url bar domain as input to this field. + +
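    Deriving the cacheDomain value is straightforward; a sketch, assuming the URL-bar hostname is used directly (the helper name is ours):

```javascript
// Sketch: compute the per-tab cache isolation key from the URL bar.
// Every subresource fetched for the tab gets the same cacheDomain, so
// cache entries made under different first parties never collide.
function cacheDomainFor(urlBarUrl) {
  return new URL(urlBarUrl).hostname;
}

console.log(cacheDomainFor("https://www.example.org/some/page")); // "www.example.org"
```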

  6. Randomize HTTP pipeline order and depth +

    +As an +experimental +defense against Website Traffic Fingerprinting, we patch the standard +HTTP pipelining code to randomize the number of requests in a +pipeline, as well as their order. +
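    The randomization can be pictured as: given the queue of pending requests, pick a random order (Fisher-Yates shuffle shown) and a random pipeline depth. The depth bound below is illustrative, not the patch's actual constant:

```javascript
// Sketch of the pipeline randomization idea: random order, random depth.
function randomizePipeline(requests, maxDepth = 8) {
  const order = requests.slice();
  // Fisher-Yates shuffle of the request order.
  for (let i = order.length - 1; i > 0; i--) {
    const j = Math.floor(Math.random() * (i + 1));
    [order[i], order[j]] = [order[j], order[i]];
  }
  // Random number of requests to place on this pipeline (at least one).
  const depth = 1 + Math.floor(Math.random() * Math.min(maxDepth, order.length));
  return order.slice(0, depth);
}
```

    Varying both the order and the count perturbs the request-size sequence a traffic-fingerprinting observer would otherwise see.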

  7. Block all plugins except flash +

    +We cannot use the @mozilla.org/extensions/blocklist;1 service, because we actually want to stop plugins from ever entering the browser's process space and/or executing code (for example, AV plugins that collect statistics/analyze URLs, magical toolbars that phone home or "help" the user, Skype buttons that ruin our day, and censorship filters). Hence we rolled our own. +

  8. Make content-prefs service memory only +

    +This patch prevents random URLs from being inserted into content-prefs.sqlite in the profile directory as content prefs change (this includes site zoom and possibly other site prefs). +

4. Packaging

4.1. Build Process Security

4.2. External Addons

Included Addons

Excluded Addons

Dangerous Addons

4.3. Pref Changes

4.4. Update Security

5. Testing

+ +The purpose of this section is to cover all the known ways that Tor browser +security can be subverted from a penetration testing perspective. The hope +is that it will be useful both for creating a "Tor Safety Check" +page, and for developing novel tests and actively attacking Torbutton with the +goal of finding vulnerabilities in either it or the Mozilla components, +interfaces and settings upon which it relies. + +

5.1. Single state testing

+ +Torbutton is a complicated piece of software. During development, changes to one component can affect a whole slew of unrelated features. A number of aggregated test suites exist that can be used to test for regressions in Torbutton and to aid in the development of Torbutton-like addons and other privacy modifications of other browsers. Some of these test suites exist as a single automated page, while others are a series of pages you must visit individually. They are provided here for reference and future regression testing, and also in the hope that some brave soul will one day decide to combine them into a comprehensive automated test suite. + +

  1. Decloak.net

    + +Decloak.net is the canonical source of plugin and external-application based +proxy-bypass exploits. It is a fully automated test suite maintained by HD Moore as a service for people to +use to test their anonymity systems. + +

  2. Deanonymizer.com

    + +Deanonymizer.com is another automated test suite that tests for proxy bypass +and other information disclosure vulnerabilities. It is maintained by Kyle +Williams, the author of JanusVM +and JanusPA. + +

  3. JonDos +AnonTest

    + +The JonDos people also provide an +anonymity tester. It is more focused on HTTP headers than plugin bypass, and +points out a couple of headers Torbutton could do a better job with +obfuscating. + +

  4. Browserspy.dk

    + +Browserspy.dk provides a tremendous collection of browser fingerprinting and general privacy tests. Unfortunately they are only available one page at a time, and the results give no solid feedback on good vs. bad behavior. + +

  5. Privacy +Analyzer

    + +The Privacy Analyzer provides a dump of all sorts of browser attributes and +settings that it detects, including some information on your origin IP +address. Its page layout and lack of good vs bad test result feedback makes it +not as useful as a user-facing testing tool, but it does provide some +interesting checks in a single page. + +

  6. Mr. T

    + +Mr. T is a collection of browser fingerprinting and deanonymization exploits +discovered by the ha.ckers.org crew +and others. It is also not as user friendly as some of the above tests, but it +is a useful collection. + +

  7. Gregory Fleischer's Torbutton and +Defcon +17 Test Cases +

    + +Gregory Fleischer has been hacking and testing Firefox and Torbutton privacy +issues for the past 2 years. He has an excellent collection of all his test +cases that can be used for regression testing. In his Defcon work, he +demonstrates ways to infer Firefox version based on arcane browser properties. +We are still trying to determine the best way to address some of those test +cases. + +

  8. Xenobite's +TorCheck Page

    + +This page checks to ensure you are using a valid Tor exit node and checks for +some basic browser properties related to privacy. It is not very fine-grained +or complete, but it is automated and could be turned into something useful +with a bit of work. + +

+