diff --git a/projects/en/torbrowser/design/CookieManagers.png b/projects/en/torbrowser/design/CookieManagers.png
new file mode 100644
index 00000000..b979f30e
Binary files /dev/null and b/projects/en/torbrowser/design/CookieManagers.png differ
diff --git a/projects/en/torbrowser/design/index.html.en b/projects/en/torbrowser/design/index.html.en
new file mode 100644
index 00000000..90b496ab
--- /dev/null
+++ b/projects/en/torbrowser/design/index.html.en
@@ -0,0 +1,955 @@
Table of Contents
+ +This document describes the adversary model, +design requirements, +implementation, packaging and testing +procedures of the Tor Browser. It is +current as of Tor Browser 2.2.32-4. + +
+ +This document is also meant to serve as a set of design requirements and to +describe a reference implementation of a Private Browsing Mode that defends +against both local and network adversaries. + +
+ +A Tor web browser adversary has a number of goals, capabilities, and attack +types that can be used to guide us towards a set of requirements for the +Tor Browser. Let's start with the goals. + +
The adversary's primary goal is direct compromise and bypass of +Tor, causing the user to directly connect to an IP of the adversary's +choosing.
If direct proxy bypass is not possible, the adversary will likely happily settle for the ability to correlate something a user did via Tor with their non-Tor activity. This can be done with cookies, cache identifiers, Javascript events, and even CSS. For some authorities, the mere fact that a user uses Tor may be enough.
+The adversary may also be interested in history disclosure: the ability to +query a user's history to see if they have issued certain censored search +queries, or visited censored sites. +
+ +Location information such as timezone and locality can be useful for the +adversary to determine if a user is in fact originating from one of the +regions they are attempting to control, or to zero-in on the geographical +location of a particular dissident or whistleblower. + +
+ +Anonymity set reduction is also useful in attempting to zero in on a +particular individual. If the dissident or whistleblower is using a rare build +of Firefox for an obscure operating system, this can be very useful +information for tracking them down, or at least tracking their activities. + +
+In some cases, the adversary may opt for a heavy-handed approach, such as +seizing the computers of all Tor users in an area (especially after narrowing +the field by the above two pieces of information). History records and cache +data are the primary goals here. +
+The adversary can position themselves at a number of different locations in +order to execute their attacks. +
+The adversary can run exit nodes, or alternatively, they may control routers +upstream of exit nodes. Both of these scenarios have been observed in the +wild. +
+The adversary can also run websites, or more likely, they can contract out +ad space from a number of different adservers and inject content that way. For +some users, the adversary may be the adservers themselves. It is not +inconceivable that adservers may try to subvert or reduce a user's anonymity +through Tor for marketing purposes. +
+The adversary can also inject malicious content at the user's upstream router +when they have Tor disabled, in an attempt to correlate their Tor and Non-Tor +activity. +
+Some users face adversaries with intermittent or constant physical access. +Users in Internet cafes, for example, face such a threat. In addition, in +countries where simply using tools like Tor is illegal, users may face +confiscation of their computer equipment for excessive Tor usage or just +general suspicion. +
+ +The adversary can perform the following attacks from a number of different +positions to accomplish various aspects of their goals. It should be noted +that many of these attacks (especially those involving IP address leakage) are +often performed by accident by websites that simply have Javascript, dynamic +CSS elements, and plugins. Others are performed by adservers seeking to +correlate users' activity across different IP addresses, and still others are +performed by malicious agents on the Tor network and at national firewalls. + +
+If not properly disabled, Javascript event handlers and timers
+can cause the browser to perform network activity after Tor has been disabled,
+thus allowing the adversary to correlate Tor and Non-Tor activity and reveal
+a user's non-Tor IP address. Javascript
+also allows the adversary to execute history disclosure attacks:
+to query the history via the different attributes of 'visited' links to search
+for particular Google queries, sites, or even to profile
+users based on gender and other classifications. Finally,
+Javascript can be used to query the user's timezone via the
+Date()
object, and to reduce the anonymity set by querying
+the navigator
object for operating system, CPU, locale,
+and user agent information.
+
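For illustration, a page script can harvest several of these values with a few lines of standard DOM Javascript. This is a sketch of the classic techniques; the visited-link color probe shown has since been restricted by browser vendors, and the query URL is illustrative:

    // Timezone, via the Date object: offset from UTC in minutes.
    var tzOffset = new Date().getTimezoneOffset();

    // Anonymity set reduction, via the navigator object.
    var fp = [navigator.oscpu, navigator.platform,
              navigator.language, navigator.userAgent].join("|");

    // History disclosure: probe whether a link to a specific URL
    // renders with the 'visited' style instead of the default
    // unvisited link color.
    var a = document.createElement("a");
    a.href = "https://www.google.com/search?q=some+censored+query";
    document.body.appendChild(a);
    var visited =
        window.getComputedStyle(a, null).color != "rgb(0, 0, 238)";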
+ +Plugins are abysmal at obeying the proxy settings of the browser. Every plugin +capable of performing network activity that the author has +investigated is also capable of performing network activity independent of +browser proxy settings - and often independent of its own proxy settings. +Sites that have plugin content don't even have to be malicious to obtain a +user's +Non-Tor IP (it usually leaks by itself), though plenty of active +exploits are possible as well. In addition, plugins can be used to store unique identifiers that are more +difficult to clear than standard cookies. +Flash-based +cookies fall into this category, but there are likely numerous other +examples. + +
CSS can also be used to correlate Tor and Non-Tor activity and reveal a user's Non-Tor IP address, via the usage of CSS popups - essentially CSS-based event handlers that fetch content via the CSS :hover pseudo-class. If these popups are allowed to perform network activity in a different Tor state than they were loaded in, they can easily correlate Tor and Non-Tor activity and reveal a user's IP address. In addition, CSS can also be used without Javascript to perform CSS-only history disclosure attacks.
+ +An adversary in a position to perform MITM content alteration can inject +document content elements to both read and inject cookies for arbitrary +domains. In fact, many "SSL secured" websites are vulnerable to this sort of +active +sidejacking. In addition, the ad networks of course perform tracking +with cookies as well. + +
+ +Likewise, the browser cache can also be used to store unique +identifiers. Since by default the cache has no same-origin policy, +these identifiers can be read by any domain, making them an ideal target for +ad network-class adversaries. + +
There is an absurd amount of information available to websites via attributes of the browser. This information can be used to reduce the anonymity set, or even uniquely fingerprint individual users.
The Panopticlick study done by the EFF attempts to measure the actual entropy - the number of identifying bits of information encoded in browser properties. Their result data is definitely useful, and the metric is probably the appropriate one for determining how identifying a particular browser property is. However, some quirks of their study mean that they do not extract as much information as they could from display information: they only use desktop resolution (which Torbutton reports as the window resolution) and do not attempt to infer the size of toolbars.
+ +Last, but definitely not least, the adversary can exploit either general +browser vulnerabilities, plugin vulnerabilities, or OS vulnerabilities to +install malware and surveillance software. An adversary with physical access +can perform similar actions. Regrettably, this last attack capability is +outside of our ability to defend against, but it is worth mentioning for +completeness. The Tails +system however can provide some limited defenses against this +adversary. + +
+ +The Tor Browser Design Requirements are meant to describe the properties of a +Private Browsing Mode that defends against both network and local adversaries. + +
There are two main categories of requirements: Security Requirements, and Privacy Requirements. Security Requirements are the minimum properties a web client platform must provide in order to safely support Tor. Privacy Requirements are the set of properties that cause us to prefer one platform over another.
+ +We will maintain an alternate distribution of the web client in order to +maintain and/or restore privacy properties to our users. + +
+ +The security requirements are primarily concerned with ensuring the safe use +of Tor. Violations in these properties typically result in serious risk for +the user in terms of immediate deanonymization and/or observability. + +
The browser +MUST NOT bypass Tor proxy settings for any content.
The browser MUST NOT provide any stored state to the content window +from other browsers or other browsing modes, including shared state from +plugins, machine identifiers, and TLS session state. +
The +browser SHOULD NOT write any browsing history information to disk, or store it +in memory beyond the duration of one Tor session, unless the user has +explicitly opted to store their browsing history information to +disk.
The browser +MUST NOT write or cause the operating system to +write any information to disk outside of the application +directory. All exceptions and shortcomings due to operating system behavior +MUST BE documented. + +
The browser SHOULD NOT perform unsafe updates or upgrades.
+ +The privacy requirements are primarily concerned with reducing linkability: +the ability for a user's activity on one site to be linked with their +activity on another site without their knowledge or explicit consent. + +
+ +User activity on one url bar domain MUST NOT be linkable to their activity in +any other domain by any third party. This property specifically applies to +linkability from stored browser identifiers, authentication tokens, and shared +state. This functionality SHOULD NOT interfere with federated login in a +substantial way. + +
+ +User activity on one url bar domain MUST NOT be linkable to their activity in +any other domain by any third party. This property specifically applies to +linkability from fingerprinting browser behavior. + +
The browser SHOULD provide an obvious, easy way for the user to remove all authentication tokens and browser state and obtain a fresh identity. Additionally, this should happen automatically by default upon browser restart.
+ +In addition to the above design requirements, the technology decisions about +Tor Browser are also guided by some philosophical positions about technology. + +
+ +The existing way that the user expects to use a browser must be preserved. If +the user has to maintain a different mental model of how the sites they are +using behave depending on tab, browser state, or anything else that would not +normally be what they experience in their default browser, the user will +inevitably be confused. They will make mistakes and reduce their privacy as a +result. Worse, they may just stop using the browser, assuming it is broken. + +
+ +User model breakage was one of the failures +of Torbutton: Even if users managed to install everything properly, +the toggle model was too hard for the average user to understand, especially +in the face of accumulating tabs from multiple states crossed with the current +tor-state of the browser. + +
+ +In general, we try to find solutions to privacy issues that will not induce +site breakage, though this is not always possible. + +
+ +Even if plugins always properly used the browser proxy settings (which none of +them do) and could not be induced to bypass them (which all of them can), the +activities of closed-source plugins are very difficult to audit and control. +They can obtain and transmit all manner of system information to websites, +often have their own identifier storage for tracking users, and also +contribute to fingerprinting. + +
+ +Therefore, if plugins are to be enabled in private browsing modes, they must +be restricted from running automatically on every page (via click-to-play +placeholders), and/or be sandboxed to restrict the types of system calls they +can execute. If the user decides to craft an exemption to allow a plugin to be +used, it MUST ONLY apply to the top level urlbar domain, and not to all sites, +to reduce linkability. + +
Another failure of Torbutton was (and still is) the options panel. Each option that detectably alters browser behavior can be used as a fingerprinting tool. Similarly, all extensions should be disabled in the mode except on an opt-in basis. We should not load system-wide addons or plugins.
+Instead of global browser privacy options, privacy decisions should be made +per +top-level url-bar domain to eliminate the possibility of linkability +between domains. For example, when a plugin object (or a Javascript access of +window.plugins) is present in a page, the user should be given the choice of +allowing that plugin object for that top-level url-bar domain only. The same +goes for exemptions to third party cookie policy, geo-location, and any other +privacy permissions. +
+If the user has indicated they do not care about local history storage, these +permissions can be written to disk. Otherwise, they should remain memory-only. +
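For instance, here is a minimal sketch of how such a per-domain exemption might be recorded through Firefox's permissions manager (the "plugins" permission type and the example domain are illustrative, not the exact strings Torbutton uses):

    const Cc = Components.classes, Ci = Components.interfaces;

    // Allow plugin objects for this top-level url bar domain only.
    var pm = Cc["@mozilla.org/permissionmanager;1"]
               .getService(Ci.nsIPermissionManager);
    var ios = Cc["@mozilla.org/network/io-service;1"]
                .getService(Ci.nsIIOService);
    var uri = ios.newURI("https://example.com/", null, null);
    pm.add(uri, "plugins", Ci.nsIPermissionManager.ALLOW_ACTION);

With the memory_only pref provided by our permissions manager patch, such exemptions remain memory-only unless the user has opted into local history storage.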
Filter-based addons such as AdBlock Plus, Request Policy, Priv3, and ShareMeNot are to be avoided. We believe that these addons do not add any real privacy to a proper implementation of the above privacy requirements, as the implementation already prevents all third parties from tracking users between sites. Filter-based addons can also introduce strange breakage and cause usability nightmares, and will fail to do their job if an adversary simply registers a new domain or creates a new url path. Worse still, the unique filter sets that each user is liable to create/install likely provide a wealth of fingerprinting targets.
We are also generally opposed to shipping an always-on Ad blocker with Tor Browser. We feel that this would damage our credibility in terms of demonstrating that we are providing privacy through a sound design alone, as well as damage the acceptance of Tor users by sites who support themselves through advertising revenue.
+Users are free to install these addons if they wish, but doing +so is not recommended, as it will alter the browser request fingerprint. +
+We believe that if we do not stay current with the support of new web +technologies, we cannot hope to substantially influence or be involved in +their proper deployment or privacy realization. However, we will likely disable +certain new features (where possible) pending analysis and audit. +
+
+ +Proxy obedience is assured through the following: +
+ The Torbutton xpi sets the Firefox proxy settings to use Tor directly as a +SOCKS proxy. It sets network.proxy.socks_remote_dns, +network.proxy.socks_version, and +network.proxy.socks_port. +
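In prefs.js terms, the configuration looks roughly like the following sketch (the proxy type and SOCKS host/port values are illustrative, assuming a local Tor client listening on 127.0.0.1:9050):

    user_pref("network.proxy.type", 1);                 // manual proxy configuration
    user_pref("network.proxy.socks", "127.0.0.1");      // local Tor client
    user_pref("network.proxy.socks_port", 9050);        // illustrative port
    user_pref("network.proxy.socks_version", 5);        // SOCKS5
    user_pref("network.proxy.socks_remote_dns", true);  // resolve DNS through Tor

The socks_remote_dns setting is the critical one for proxy obedience: without it, hostname resolution leaks to the local DNS resolver even though the TCP content flows through Tor.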
+ Plugins have the ability to make arbitrary OS system calls. This includes +the ability to make UDP sockets and send arbitrary data independent of the +browser proxy settings. +
+Torbutton disables plugins by using the +@mozilla.org/plugin/host;1 service to mark the plugin tags +as disabled. Additionally, we set +plugin.disable_full_page_plugin_for_types to the list of +supported mime types for all currently installed plugins. +
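A minimal sketch of this logic, as an extension might implement it (Mozilla-era XPCOM; the mime type value is illustrative and error handling is omitted):

    const Cc = Components.classes, Ci = Components.interfaces;

    // Mark every installed plugin tag as disabled via the plugin host.
    var pluginHost = Cc["@mozilla.org/plugin/host;1"]
                       .getService(Ci.nsIPluginHost);
    var tags = pluginHost.getPluginTags({});
    for (var i = 0; i < tags.length; i++)
      tags[i].disabled = true;

    // Also prevent full-page plugin loads for their mime types.
    var prefs = Cc["@mozilla.org/preferences-service;1"]
                  .getService(Ci.nsIPrefBranch);
    prefs.setCharPref("plugin.disable_full_page_plugin_for_types",
                      "application/x-shockwave-flash");  // illustrative list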
+In addition, to prevent any unproxied activity by plugins at load time, we +also patch the Firefox source code to prevent the load of any plugins except +for Flash and Gnash. + +
+External apps, if launched automatically, can be induced to load files that +perform network activity. In order to prevent this, Torbutton installs a +component to + +provide the user with a popup whenever the browser attempts to +launch a helper app. +
+Tor Browser State is separated from existing browser state through use of a +custom Firefox profile. Furthermore, plugins are disabled, which prevents +Flash cookies from leaking from a pre-existing Flash directory. +
+Tor Browser should optionally prevent all disk records of browser activity. +The user should be able to optionally enable URL history and other history +features if they so desire. Once we simplify the +preferences interface, we will likely just enable Private Browsing +mode by default to handle this goal. +
+For now, Tor Browser blocks write access to the disk through Torbutton +using several Firefox preferences. + + + +The set of prefs is: +dom.storage.enabled, +browser.cache.memory.enable, +network.http.use-cache, +browser.cache.disk.enable, +browser.cache.offline.enable, +general.open_location.last_url, +places.history.enabled, +browser.formfill.enable, +signon.rememberSignons, +browser.download.manager.retention, +and network.cookie.lifetimePolicy. +
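A sketch of the disk-avoiding configuration in prefs.js form (the values are illustrative of the intent: the memory cache remains enabled while disk-backed state is disabled or made session-only):

    user_pref("browser.cache.memory.enable", true);      // RAM cache is fine
    user_pref("browser.cache.disk.enable", false);       // no disk cache
    user_pref("browser.cache.offline.enable", false);    // no offline cache
    user_pref("dom.storage.enabled", false);             // no DOM storage
    user_pref("places.history.enabled", false);          // no URL history
    user_pref("browser.formfill.enable", false);         // no form history
    user_pref("signon.rememberSignons", false);          // no saved passwords
    user_pref("browser.download.manager.retention", 0);  // forget downloads
    user_pref("network.cookie.lifetimePolicy", 2);       // session cookies only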
+In addition, three Firefox patches are needed to prevent disk writes, even if +Private Browsing Mode is enabled. We need to + +prevent +the permissions manager from recording HTTPS STS state, +prevent +intermediate SSL certificates from being recorded, and +prevent +the content preferences service from recording site zoom. + +For more details on these patches, see the +Firefox Patches section. + +
+ +Tor Browser Bundle MUST NOT cause any information to be written outside of the +bundle directory. This is to ensure that the user is able to completely and +safely remove the bundle without leaving other traces of Tor usage on their +computer. + +
XXX: sjmurdoch, Erinn: explain what magic we do to satisfy this, +and/or what additional work or auditing needs to be done. +
The Tor Browser MUST prevent a user's activity on one site from being linked to their activity on another site. When this goal cannot yet be met with an existing web technology, that technology or functionality is disabled. Our design goal is to ultimately eliminate the need to disable arbitrary technologies, and instead simply alter them in ways that allow them to function in a backwards-compatible way while avoiding linkability. Users should be able to use federated login of various kinds to explicitly inform sites who they are, but that information should not transparently allow a third party to record their activity from site to site without their prior consent.
The benefit of this approach comes not only in the form of reduced linkability, but also in terms of simplified privacy UI. If all stored browser state and permissions become associated with the top-level url-bar domain, the six or seven different pieces of privacy UI governing these identifiers and permissions can become just one piece of UI. For instance, a window that lists the top-level url bar domains for which browser state exists with the ability to clear and/or block them, possibly with a context-menu option to drill down into specific types of state. An example of this simplification can be seen in Figure 1.
Design Goal: + +All cookies should be double-keyed to the top-level domain. There exists a +Mozilla +bug that contains a prototype patch, but it lacks UI, and does not +apply to modern Firefoxes. + +
Implementation Status: As a stopgap to satisfy our design requirement of unlinkability, we currently entirely disable 3rd party cookies by setting network.cookie.cookieBehavior to 1. We would prefer that third party content continue to function, but we believe the requirement for unlinkability trumps that desire.
+Cache is isolated to the top-level url bar domain by using a technique +pioneered by Colin Jackson et al, via their work on SafeCache. The technique re-uses the +nsICachingChannel.cacheKey +attribute that Firefox uses internally to prevent improper caching of HTTP POST data. +
+However, to increase the +security of the isolation and to solve strange and +unknown conflicts with OCSP, we had to patch +Firefox to provide a cacheDomain cache attribute. We use the full +url bar domain as input to this field. +
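A sketch of how this isolation can be applied from an http-on-modify-request observer, assuming the patched cacheDomain attribute and a hypothetical getTopLevelDomain() helper that recovers the url bar domain for a channel:

    const Cc = Components.classes, Ci = Components.interfaces;

    var cacheObserver = {
      observe: function(subject, topic, data) {
        if (topic != "http-on-modify-request") return;
        var channel = subject.QueryInterface(Ci.nsICachingChannel);
        // Key every cache entry to the top-level url bar domain that
        // sourced the request, isolating third party cache entries.
        channel.cacheDomain = getTopLevelDomain(channel); // hypothetical helper
      }
    };

    Cc["@mozilla.org/observer-service;1"]
      .getService(Ci.nsIObserverService)
      .addObserver(cacheObserver, "http-on-modify-request", false);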
+ + +Furthermore, we chose a different isolation scheme than the Stanford +implementation. First, we decoupled the cache isolation from the third party +cookie attribute. Second, we use several mechanisms to attempt to determine +the actual location attribute of the top-level window (the url bar domain) +used to load the page, as opposed to relying solely on the referer property. +
+Therefore, the original +Stanford test +cases are expected to fail. Functionality can still be verified by +navigating to about:cache and viewing the key +used for each cache entry. Each third party element should have an additional +"domain=string" property prepended, which will list the top-level urlbar +domain that was used to source the third party element. +
+ +HTTP authentication tokens are removed for third party elements using the +http-on-modify-request +observer to remove the Authorization headers to prevent silent +linkability between domains. We also needed to patch +Firefox to cause the headers to get added early enough to allow the +observer to modify it. + +
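A sketch of that observer logic, assuming a hypothetical isThirdParty() check that compares the request origin against the top-level url bar domain:

    const Ci = Components.interfaces;

    var authObserver = {
      observe: function(subject, topic, data) {
        if (topic != "http-on-modify-request") return;
        var channel = subject.QueryInterface(Ci.nsIHttpChannel);
        if (isThirdParty(channel)) {        // hypothetical helper
          // Strip credentials so third parties cannot use HTTP auth
          // as a silent cross-domain identifier.
          channel.setRequestHeader("Authorization", "", false);
        }
      }
    };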
Design Goal: + +DOM storage for third party domains MUST BE isolated to the url bar domain, +to prevent linkability between sites. + +
Implementation Status: Because DOM storage is isolated to the third party domain as opposed to the top-level url bar domain, we entirely disable DOM storage as a stopgap to ensure unlinkability.
+TLS session resumption and HTTP Keep-Alive must not allow third party origins +to track users via either TLS session IDs, or the fact that different requests +arrive on the same TCP connection. +
Design Goal: + +TLS session resumption IDs must be limited to the top-level url bar domain. +HTTP Keep-Alive connections from a third party in one top-level domain must +not be reused for that same third party in another top-level domain. + +
Implementation Status: + +We plan to +disable TLS session resumption, and limit HTTP Keep-alive duration. + +
+ +window.name is +a magical DOM property that for some reason is allowed to retain a persistent value +for the lifespan of a browser tab. It is possible to utilize this property for +identifier +storage. + +
+ +In order to eliminate linkability but still allow for sites that utilize this +property to function, we reset the window.name property of tabs in Torbutton every +time we encounter a blank referer. This behavior allows window.name to persist +for the duration of a link-driven navigation session, but as soon as the user +enters a new URL or navigates between https/http schemes, the property is cleared. + +
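A simplified sketch of the clearing rule, as it might run from a progress listener when a top-level document load begins (the listener wiring is omitted):

    // Called for each top-level navigation with the content window
    // and the referrer of the document being loaded.
    function maybeClearWindowName(contentWin, referrer) {
      // A blank referer means the user typed a URL, used a bookmark,
      // or otherwise broke the link-driven navigation session.
      if (!referrer)
        contentWin.name = "";
    }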
Design Goal: + +Every distinct navigation session (as defined by a non-blank referer header) +MUST exit through a fresh Tor circuit in Tor Browser to prevent exit node +observers from linking concurrent browsing activity. + +
Implementation Status: + +The Tor feature that supports this ability only exists in the 0.2.3.x-alpha +series. Ticket +#3455 is the Torbutton ticket to make use of the new Tor +functionality. + +
+ +In order to properly address the fingerprinting adversary on a technical +level, we need a metric to measure linkability of the various browser +properties that extend beyond any stored origin-related state. The Panopticlick Project +by the EFF provides us with exactly this metric. The researchers conducted a +survey of volunteers who were asked to visit an experiment page that harvested +many of the above components. They then computed the Shannon Entropy of the +resulting distribution of each of several key attributes to determine how many +bits of identifying information each attribute provided. + +
The study is not exhaustive, though. In particular, the test does not take into account all aspects of resolution information. It did not calculate the size of widgets, window decoration, or toolbar size, which we believe may add high amounts of entropy. It also did not measure clock offset and other time-based fingerprints. Furthermore, as new browser features are added, this experiment should be repeated to include them.
On the other hand, to avoid a bottomless sinkhole, we limit our fingerprinting-resistance efforts to reducing the fingerprintable differences among Tor Browser users. We do not believe it is productive to concern ourselves with cross-browser fingerprinting issues, at least not at this stage.
+ +Plugins add to fingerprinting risk via two main vectors: their mere presence in +window.navigator.plugins, as well as their internal functionality. + +
Design Goal: + +All plugins that have not been specifically audited or sandboxed must be +disabled. To reduce linkability potential, even sandboxed plugins should not +be allowed to load objects until the user has clicked through a click-to-play +barrier. Additionally, version information should be reduced or obfuscated +until the plugin object is loaded. + +
Implementation Status: + +Currently, we entirely disable all plugins in Tor Browser. However, as a +compromise due to the popularity of Flash, we intend to work +towards a +click-to-play barrier using NoScript that is available only after the user has +specifically enabled plugins. Flash will be the only plugin available, and we +will ship a settings.sol file to disable Flash cookies, and to restrict P2P +features that likely bypass proxy settings. + +
+ +According to the Panopticlick study, fonts provide the most linkability when +they are provided as an enumerable list in filesystem order, via either the +Flash or Java plugins. However, it is still possible to use CSS and/or +Javascript to query for the existence of specific fonts. With a large enough +pre-built list to query, a large amount of fingerprintable information may +still be available. + +
Design Goal: + +To address the Javascript issue, we intend to limit the number of +fonts an origin can load, gracefully degrading to built-in and/or +remote fonts once the limit is reached. + +
Implementation Status: + +Aside from disabling plugins to prevent enumeration, we have not yet +implemented any defense against CSS or Javascript fonts. + +
Design Goal: + +All Tor Browser users should provide websites with an identical user agent and +HTTP header set for a given request type. We omit the Firefox minor revision, +and report a popular Windows platform. If the software is kept up to date, +these headers should remain identical across the population even when updated. + +
Implementation Status: + +Firefox provides several options for controlling the browser user agent string +which we leverage. We also set similar prefs for controlling the +Accept-Language and Accept-Charset headers, which we spoof to English by default. Additionally, we +remove +content script access to Components.interfaces, which can be +used to fingerprint OS, platform, and Firefox minor version.
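A sketch of the relevant overrides in prefs.js form (the string values are illustrative, not the exact spoofed values we ship):

    user_pref("general.useragent.override",
        "Mozilla/5.0 (Windows NT 6.1; rv:10.0) Gecko/20100101 Firefox/10.0");
    user_pref("general.appname.override", "Netscape");
    user_pref("general.appversion.override", "5.0 (Windows)");
    user_pref("general.platform.override", "Win32");
    user_pref("general.oscpu.override", "Windows NT 6.1");
    user_pref("intl.accept_languages", "en-us, en");  // Accept-Language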
Both CSS and Javascript expose a lot of information about the screen resolution, usable desktop size, OS widget size, toolbar size, title bar size, and other desktop features that are not at all relevant to rendering and serve only to provide information for fingerprinting.
Design Goal: + +Our design goal here is to reduce the resolution information down to the bare +minimum required for properly rendering inside a content window. We intend to +report all rendering information correctly with respect to the size and +properties of the content window, but report an effective size of 0 for all +border material, and also report that the desktop is only as big as the +inner content window. Additionally, new browser windows are sized such that +their content windows are one of ~5 fixed sizes based on the user's +desktop resolution. + +
Implementation Status: + +We have implemented the above strategy for Javascript using Torbutton's JavaScript +hooks as well as a window observer to resize +new windows based on desktop resolution. However, CSS Media Queries +still need +to be dealt with. + +
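A sketch of the rounding such a window observer might apply (the 200x100 pixel bucket sizes are illustrative):

    // Round a new window's content area down to a multiple of the bucket
    // size, so all users report one of a few fixed content window sizes.
    function quantizeDimensions(width, height) {
      var W = 200, H = 100;  // illustrative bucket sizes
      return { width:  Math.max(W, Math.floor(width  / W) * W),
               height: Math.max(H, Math.floor(height / H) * H) };
    }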
Design Goal: All Tor Browser users should report the same timezone to websites. Currently, we choose UTC for this purpose, although an equally valid argument could be made for EDT/EST due to the large English-speaking population density. Additionally, the Tor software should detect if the user's clock is significantly divergent from the clocks of the relays that it connects to, and use this to reset the clock values used in Tor Browser to something reasonably accurate.
Implementation Status: + +We set the timezone using the TZ environment variable, which is supported on +all platforms. Additionally, we plan to obtain a clock +offset from Tor, but this won't be available until Tor 0.2.3.x is in +use. + +
+ +Javascript performance +fingerprinting is the act of profiling the performance +of various Javascript functions for the purpose of fingerprinting the +Javascript engine and the CPU. + +
Design Goal: + +We have several potential +mitigation approaches to reduce the accuracy of performance +fingerprinting without risking too much damage to functionality. Our current +favorite is to reduce the resolution of the Event.timeStamp and the Javascript +Date() object, while also introducing jitter. Our goal is to increase the +amount of time it takes to mount a successful attack. Mowery et al found that +even with the default precision in most browsers, they required up to 120 +seconds of amortization and repeated trials to get stable results from their +feature set. We intend to work with the research community to establish the +optimum tradeoff between quantization+jitter and amortization time. + + +
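A sketch of the quantization-plus-jitter idea (the 100 ms quantum is illustrative; as noted, the optimum tradeoff is an open research question):

    // Quantize a millisecond timestamp to a fixed grid, then add uniform
    // jitter within one quantum to hide fine-grained timing differences.
    function fuzzTimestamp(ms) {
      var q = 100;                                 // illustrative quantum
      var quantized = Math.floor(ms / q) * q;
      var jitter = Math.floor(Math.random() * q);  // uniform in [0, q)
      return quantized + jitter;
    }

Hooks over the Date object and Event.timeStamp would then route every timestamp a content script can observe through such a function.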
Implementation Status: + +We have no implementation as of yet. + +
+ +Keystroke fingerprinting is the act of measuring key strike time and key +flight time. It is seeing increasing use as a biometric. + +
Design Goal: + +We intend to rely on the same mechanisms for defeating Javascript performance +fingerprinting: timestamp quantization and jitter. + +
Implementation Status: +We have no implementation as of yet. +
+ +WebGL is fingerprintable both through information that is exposed about the +underlying driver and optimizations, as well as through performance +fingerprinting. + +
Design Goal: + +Because of the large amount of potential fingerprinting vectors, we intend to +deploy a similar strategy against WebGL as for plugins. First, WebGL canvases +will have click-to-play placeholders, and will not run until authorized by the +user. Second, we intend to obfuscate driver +information by hooking +getParameter(), +getSupportedExtensions(), +getExtension(), and +getContextAttributes() to provide standard minimal, +driver-neutral information. + +
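A content-script sketch of the getParameter() hook (the returned strings are illustrative driver-neutral values):

    var proto = WebGLRenderingContext.prototype;
    var realGetParameter = proto.getParameter;
    proto.getParameter = function(pname) {
      // Return generic values for driver-identifying queries.
      if (pname === this.VENDOR)   return "Mozilla";
      if (pname === this.RENDERER) return "Mozilla";
      if (pname === this.VERSION)  return "WebGL 1.0";
      return realGetParameter.call(this, pname);
    };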
Implementation Status: + +Currently we simply disable WebGL. + +
+In order to avoid long-term linkability, we provide a "New Identity" context +menu option in Torbutton. +
First, Torbutton disables all open tabs and windows via nsIContentPolicy blocking, and then closes each tab and window. The extra step for blocking tabs is done as a precaution to ensure that any asynchronous Javascript is in fact properly disabled. After closing all of the windows, we then clear the following state: OCSP (by toggling security.OCSP.enabled), cache, site-specific zoom and content preferences, cookies, DOM storage, the safe browsing key, the Google wifi geolocation token (if it exists), HTTP auth, SSL Session IDs, and the last opened URL field (via the pref general.open_location.last_url). After clearing the browser state, we then send the NEWNYM signal to the Tor control port to cause a new circuit to be created.
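A condensed sketch of part of that sequence (the prefs and controlSocket handles are assumed to be set up elsewhere; the full implementation clears considerably more state):

    // Toggle OCSP to drop its in-memory state.
    prefs.setIntPref("security.OCSP.enabled", 0);
    prefs.setIntPref("security.OCSP.enabled", 1);

    // Evict the cache and remove all cookies.
    Cc["@mozilla.org/network/cache-service;1"]
      .getService(Ci.nsICacheService)
      .evictEntries(Ci.nsICache.STORE_ANYWHERE);
    Cc["@mozilla.org/cookiemanager;1"]
      .getService(Ci.nsICookieManager)
      .removeAll();

    // Clear the last opened URL field.
    prefs.setCharPref("general.open_location.last_url", "");

    // Finally, request a fresh circuit from the Tor control port.
    controlSocket.write("SIGNAL NEWNYM\r\n");  // assumes prior AUTHENTICATE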
+Some content types are too invasive and/or too opaque for us to properly +eliminate their linkability properties. For these content types, we use +NoScript to provide click-to-play placeholders that do not activate the +content until the user clicks on it. This will eliminate the ability for an +adversary to use such content types to link users in a dragnet fashion across +arbitrary sites. +
+Currently, the content types isolated in this way include Flash, WebGL, and +audio and video objects. +
+The set of patches we have against Firefox can be found in the current-patches +directory of the torbrowser git repository. They are: +
In order to reduce fingerprinting, we block access to these two interfaces from content script. Components.lookupMethod can undo our Javascript hooks, and Components.interfaces can be used for fingerprinting the platform, OS, and Firefox version, but not much else.
+ +This patch exposes a pref 'permissions.memory_only' that properly isolates the +permissions manager to memory, which is responsible for all user specified +site permissions, as well as stored HTTPS STS policy from visited sites. + +The pref does successfully clear the permissions manager memory if toggled. It +does not need to be set in prefs.js, and can be handled by Torbutton. + +
The intermediate certificate store holds information about SSL certificates that may only be used by a limited number of domains, in some cases effectively recording on disk the fact that a website owned by a certain organization was viewed.
Design Goal: + +As an additional design goal, we would like to later alter this patch to allow this +information to be cleared from memory. The implementation does not currently +allow this. + +
This patch provides a trivial modification that allows us to properly remove HTTP auth for third parties, defending against an adversary attempting to use HTTP auth to silently track users between domains.
+ +To increase the +security of cache isolation and to solve strange and +unknown conflicts with OCSP, we had to patch +Firefox to provide a cacheDomain cache attribute. We use the full +url bar domain as input to this field. + +
+As an +experimental +defense against Website Traffic Fingerprinting, we patch the standard +HTTP pipelining code to randomize the number of requests in a +pipeline, as well as their order. +
+We cannot use the +@mozilla.org/extensions/blocklist;1 service, because we +actually want to stop plugins from ever entering the browser's process space +and/or executing code (for example, AV plugins that collect statistics/analyze +URLs, magical toolbars that phone home or "help" the user, skype buttons that +ruin our day, and censorship filters). Hence we rolled our own. +
This patch prevents random URLs from being inserted into content-prefs.sqlite in the profile directory as content prefs change (includes site-zoom and perhaps other site prefs?).
+ +The purpose of this section is to cover all the known ways that Tor browser +security can be subverted from a penetration testing perspective. The hope +is that it will be useful both for creating a "Tor Safety Check" +page, and for developing novel tests and actively attacking Torbutton with the +goal of finding vulnerabilities in either it or the Mozilla components, +interfaces and settings upon which it relies. + +
Torbutton is a complicated piece of software. During development, changes to one component can affect a whole slew of unrelated features. A number of aggregated test suites exist that can be used to test for regressions in Torbutton and to help aid in the development of Torbutton-like addons and other privacy modifications of other browsers. Some of these test suites exist as a single automated page, while others are a series of pages you must visit individually. They are provided here for reference and future regression testing, and also in the hope that some brave soul will one day decide to combine them into a comprehensive automated test suite.
+ +Decloak.net is the canonical source of plugin and external-application based +proxy-bypass exploits. It is a fully automated test suite maintained by HD Moore as a service for people to +use to test their anonymity systems. + +
+ +Deanonymizer.com is another automated test suite that tests for proxy bypass +and other information disclosure vulnerabilities. It is maintained by Kyle +Williams, the author of JanusVM +and JanusPA. + +
+ +The JonDos people also provide an +anonymity tester. It is more focused on HTTP headers than plugin bypass, and +points out a couple of headers Torbutton could do a better job with +obfuscating. + +
Browserspy.dk provides a tremendous collection of browser fingerprinting and general privacy tests. Unfortunately they are only available one page at a time, and there is no solid feedback on good vs bad behavior in the test results.
+ +The Privacy Analyzer provides a dump of all sorts of browser attributes and +settings that it detects, including some information on your origin IP +address. Its page layout and lack of good vs bad test result feedback makes it +not as useful as a user-facing testing tool, but it does provide some +interesting checks in a single page. + +
+ +Mr. T is a collection of browser fingerprinting and deanonymization exploits +discovered by the ha.ckers.org crew +and others. It is also not as user friendly as some of the above tests, but it +is a useful collection. + +
+ +Gregory Fleischer has been hacking and testing Firefox and Torbutton privacy +issues for the past 2 years. He has an excellent collection of all his test +cases that can be used for regression testing. In his Defcon work, he +demonstrates ways to infer Firefox version based on arcane browser properties. +We are still trying to determine the best way to address some of those test +cases. + +
+ +This page checks to ensure you are using a valid Tor exit node and checks for +some basic browser properties related to privacy. It is not very fine-grained +or complete, but it is automated and could be turned into something useful +with a bit of work. + +
+