The scary thing about web history logging is that it makes you question your web habits, if not become actively paranoid.
For instance, the article quotes the head of MI5 regarding preventing the bombing of the London Stock Exchange in 2010.
I wanted to know more about this, so I Googled "London Stock Exchange bomb" and clicked on a few stories. Wanting to find out a bit more about the people involved, I then Googled their names and clicked on a few more links.
All this time, I had the thought at the back of my head: will these searches and clicks put me on a list somewhere?
(for anyone who wants to be saved searching for these terms, here's a quick overview: http://www.telegraph.co.uk/news/uknews/terrorism-in-the-uk/9...)
It's this feeling that I most dislike about it all; something, or someone, somewhere may be watching, and so now I'm questioning myself because some discussion on some site has potentially questionable keywords in its URL.
That's fine and dandy until you search for something and the results randomly include something that's considered a threat to national security, or of enough interest that you get watched. Then your browser starts downloading a page you never intended to visit, and you get logged as interested in terrorist activity. This is not good news.
I don't know if being that paranoid is healthy, unless the government or the mob is trying to get you. In which case, there are easier ways to do so besides tracking your browsing habits.
Having said that, maybe you can try browsing on a VM that you can reset via snapshots.
Many organizations attempt to build a profile of you based on your browsing history. Paranoid Browsing confuses that effort by browsing the internet randomly in the background.
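To make the mechanism concrete, here is a minimal TypeScript sketch of the general noise-browsing idea. This is not the extension's actual code; the seed list and the timing are made up for illustration.

    // Minimal sketch of the noise-browsing idea: periodically fetch pages from a
    // seed list so real browsing is mixed in with random cover traffic. The seed
    // list and timing below are made up for illustration.

    const seedUrls: string[] = [
      "https://en.wikipedia.org/wiki/Special:Random",
      "https://news.ycombinator.com/",
      "https://www.bbc.com/news",
    ];

    // Anywhere from 30 seconds to 5 minutes between visits, to look vaguely human.
    function randomDelayMs(): number {
      return 30_000 + Math.random() * 270_000;
    }

    async function visitRandomPage(): Promise<void> {
      const url = seedUrls[Math.floor(Math.random() * seedUrls.length)];
      try {
        const res = await fetch(url, { redirect: "follow" });
        console.log(`noise visit: ${url} -> ${res.status}`);
      } catch (err) {
        console.warn(`noise visit failed: ${url}`, err);
      }
    }

    // Loop forever, sleeping a random interval between cover visits.
    async function run(): Promise<void> {
      while (true) {
        await visitRandomPage();
        await new Promise((resolve) => setTimeout(resolve, randomDelayMs()));
      }
    }

    run();

The hard part for a real tool is making the cover traffic statistically hard to separate from genuine browsing; a fixed seed list and uniform random delays obviously don't get you that.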
Maybe, probably in combination with the feature that sometimes tells me I searched for some term a while ago, and how often I've visited a given link. What disturbs me is that these features are hardly explained anywhere, which makes this an incubator for FUD. (I can delete my browser history; how do I delete this?)
It only seems like paranoia because widespread, highly-publicized abuse of history sniffing hasn't started yet. I'm happy they've jumped onto this grenade before it exploded.
The most valuable thing Mozilla just did was to send a signal to the market that history sniffing is going to be unreliable, and that ad networks shouldn't invest time and money into exploiting it. The scariest thing about history sniffing was the idea that some large company would adopt it and create pressure to keep the vulnerability there forever, as part of how the Internet works.
Do you honestly believe that if the NSA decided they wanted your browser history, the encryption used by Chrome or Firefox would stymie them for too long?
I guess I'm just not really clear on what folks are worried about happening with these history data. Is the concern that the data could be leaked, revealing potentially privacy-compromising information? Or that Google might mine it in an attempt to make a buck off of your metadata? Or that the government might use it to identify you as a potential political dissident?
Personally, I find the Google search history to be useful, though less useful lately since DDG is my primary search engine now. I don't quite get the knee-jerk "this is creepy" outrage and fear I see from some folks about this feature.
There is a small difference between ruining one's search history and being brought to the attention of whatever authority is watching[1]. It does seem like a https://xkcd.com/576/ situation applied to search.
[1] I would bet our web filter would be sending me some reports.
Sometimes the line items in the history list aren't really informative, and if you can't recall what particular day you might have visited the page, you have some homework in front of you. I wish the entire page were cached in some way so I could search for a keyword or use a regex to make sense of my foggy memory. This is a computer, for crying out loud; I should be using regexes, not feeling like I'm digging through a filing cabinet of receipts by hand, which is exactly what the experience is when sites title themselves opaque things like "Add Comment | Hacker News" for the 2000th time in your history log.
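As a rough illustration of what that could look like, here is a hedged TypeScript sketch that stores the page text alongside each history entry and lets you regex-search it. The hard-coded capture() call stands in for whatever would actually read the browser's history.

    // Hedged sketch of the wished-for feature: keep the page text alongside each
    // history entry so you can regex-search content, not just opaque titles. The
    // hard-coded capture() calls stand in for reading the browser's history.

    interface HistoryEntry {
      url: string;
      title: string;
      text: string; // full page text captured at visit time
    }

    const index: HistoryEntry[] = [];

    async function capture(url: string): Promise<void> {
      const res = await fetch(url);
      const html = await res.text();
      // Crude tag stripping; good enough for keyword search.
      const text = html
        .replace(/<script[\s\S]*?<\/script>/gi, "")
        .replace(/<style[\s\S]*?<\/style>/gi, "")
        .replace(/<[^>]+>/g, " ");
      const title = /<title>([\s\S]*?)<\/title>/i.exec(html)?.[1] ?? url;
      index.push({ url, title, text });
    }

    function search(pattern: RegExp): HistoryEntry[] {
      return index.filter((e) => pattern.test(e.text) || pattern.test(e.title));
    }

    // Usage: capture a couple of visited pages, then regex-search their content.
    (async () => {
      await capture("https://news.ycombinator.com/");
      for (const hit of search(/history sniffing/i)) {
        console.log(`${hit.title} -> ${hit.url}`);
      }
    })();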
> Considering the work needed by the website to convince the user to give away the data, and even with approaches like the one described in the article, we may be overestimating what websites could learn about us by checking if we've visited some random 2, or 4, or 15 sites.
It's true, history hacks were a lot more "fun" when you could still check 10-100s of URLs per second without any user interaction necessary.
You could pull off some very cool and unexpected tricks by applying such knowledge in a clever manner. The important thing to realize is that what you can use history information for goes way beyond "I know where you surfed last summer". That's just a mild privacy problem.
Targeting specific end-points, profile pages, stuff like that, you can leak a lot more interesting information than just someone's browsing habits. Right up to the point where you could (for certain services, under certain circumstances) cook up someone's session key and it suddenly became an actual security exploit.
Still, a couple of those tricks (or perhaps new ones) might still be worth it even if you can only query 10-20 sites. I don't know. It's not a lot of information. Though 33 bits is enough to uniquely identify any human on Earth (2^33 is about 8.6 billion).
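For anyone who never saw the old trick in action, this is roughly what the classic getComputedStyle probe looked like. It's just a sketch; modern browsers deliberately lie about :visited computed styles, so it no longer works.

    // Sketch of the classic getComputedStyle probe the comment is talking about:
    // give :visited links a distinctive colour, then read the computed colour
    // back. Modern browsers deliberately lie about :visited computed styles, so
    // this no longer works; it is here only to show the mechanism.

    function checkVisited(urls: string[]): string[] {
      const style = document.createElement("style");
      style.textContent = "a.probe:visited { color: rgb(1, 2, 3); }";
      document.head.appendChild(style);

      const visited: string[] = [];
      for (const url of urls) {
        const a = document.createElement("a");
        a.className = "probe";
        a.href = url;
        document.body.appendChild(a);
        if (getComputedStyle(a).color === "rgb(1, 2, 3)") {
          visited.push(url);
        }
        a.remove();
      }
      style.remove();
      return visited;
    }

    // Nothing stopped you from feeding this hundreds of URLs per second.
    console.log(checkVisited(["https://example.com/", "https://news.ycombinator.com/"]));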
> you need to make the user somehow disclose what he sees on the screen, which may often look suspicious.
Well, that's the thing this article is about, right? How it works is: there is a 32x32 grid of "close" buttons, but you only see one of them; all the others are hidden with CSS. That's the one you click. By virtue of which button it was, the site now knows 10 history-visited states (32x32 = 1024 = 2^10 combinations, one button per combination).
As the article says, if that looked like a "close"-button on a pop-up ad, where's the suspicion?
I use an adblocker. I get a popup, it has nothing but a "close"-button in it. I figure, "guess it only blocked part of the ad or something" and click the "close"-button without even thinking. The pop-up closes, and I've already forgotten that it happened.
The only suspicion I see could be that the "close"-button might appear in a weird, off-centre place. But that matches perfectly with the wonky-adblocker-behaviour theory in my head, so I wouldn't think twice about it.
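If it helps, the arithmetic behind "one click leaks 10 bits" can be sketched like this. The probe URLs are placeholders, and the article's exact construction may differ.

    // Hedged sketch of the decoding arithmetic: 1024 buttons cover every
    // combination of visited/not-visited for 10 probe URLs, CSS leaves exactly
    // one visible, and the index of the button the user clicked encodes those
    // 10 bits. The probe URLs are placeholders, not from the article.

    const probeUrls = [
      "https://siteA.example/", "https://siteB.example/", "https://siteC.example/",
      "https://siteD.example/", "https://siteE.example/", "https://siteF.example/",
      "https://siteG.example/", "https://siteH.example/", "https://siteI.example/",
      "https://siteJ.example/",
    ];

    // The clicked button's index (0..1023) is what the page reports back.
    function decodeClick(buttonIndex: number): Map<string, boolean> {
      const result = new Map<string, boolean>();
      probeUrls.forEach((url, bit) => {
        // Bit i of the index says whether probe URL i is in the victim's history.
        result.set(url, ((buttonIndex >> bit) & 1) === 1);
      });
      return result;
    }

    // A click on button 517 (binary 1000000101) means probes 0, 2 and 9 were
    // visited and the other seven were not.
    console.log(decodeClick(517));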
I voluntarily removed it from the web store after realizing it was caching lots of sensitive data. I eventually started encrypting the stored info but I realized that if the extension ever became very successful, it would become a target and I wasn't comfortable with that.
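For what it's worth, encrypting the cached entries before they hit storage, roughly the mitigation described above, can be sketched with the Web Crypto API like this. Key management, which is the genuinely hard part, is skipped entirely.

    // Hedged sketch of encrypting cached history entries before they hit
    // storage. Uses the Web Crypto API; key management (where the key itself
    // lives) is the hard part and is skipped entirely here.

    async function makeKey(): Promise<CryptoKey> {
      return crypto.subtle.generateKey({ name: "AES-GCM", length: 256 }, true, [
        "encrypt",
        "decrypt",
      ]);
    }

    async function encryptEntry(
      key: CryptoKey,
      url: string
    ): Promise<{ iv: Uint8Array; data: ArrayBuffer }> {
      const iv = crypto.getRandomValues(new Uint8Array(12)); // fresh IV per entry
      const data = await crypto.subtle.encrypt(
        { name: "AES-GCM", iv },
        key,
        new TextEncoder().encode(url)
      );
      return { iv, data };
    }

    async function decryptEntry(
      key: CryptoKey,
      entry: { iv: Uint8Array; data: ArrayBuffer }
    ): Promise<string> {
      const plain = await crypto.subtle.decrypt(
        { name: "AES-GCM", iv: entry.iv },
        key,
        entry.data
      );
      return new TextDecoder().decode(plain);
    }

    // Usage: encrypt a visited URL before writing it to extension storage.
    (async () => {
      const key = await makeKey();
      const stored = await encryptEntry(key, "https://example.com/some/sensitive/page");
      console.log(await decryptEntry(key, stored)); // round-trips back to the URL
    })();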
I hope the developer of this extension will invest more effort in their users' security than a simple blacklist.
Well, it depends how paranoid you are. Sites could still store history even if they say they don't. Like how people realized Facebook wasn't truly deleting your account when you asked for it.
But you make a good point about making search results worse. That's a real risk. It's the classic convenience vs. security/privacy trade-off.