Looking at the past few weeks of Google news, you'd be forgiven for thinking Google doesn't do anything beyond making Android. While there's sexier stuff going on within Google, the company is also still trying to improve its core service: search. It launched encrypted search today, and the feature will be rolled out across the world in the coming days.
Basically, it works in much the same way as online banking and other login pages. Using a Secure Sockets Layer connection, the traffic between you and Google is encrypted, so that third parties cannot look at your search data.
“When you search on https://www.google.com, an encrypted connection is created between your browser and Google,” said Evan Roseman, Software Engineer at Google, “This secured channel helps protect your search terms and your search results pages from being intercepted by a third party on your network. The service includes a modified logo to help indicate that you’re searching using SSL and that you may encounter a somewhat different Google search experience.”
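For the curious, here is roughly what that secured channel amounts to at the socket level – a minimal sketch using Python's standard ssl module. This is not Google's implementation, of course; hostname aside, nothing here is specific to Google.

    import socket
    import ssl

    ctx = ssl.create_default_context()  # verifies the server's certificate chain
    with socket.create_connection(("www.google.com", 443)) as sock:
        with ctx.wrap_socket(sock, server_hostname="www.google.com") as ssock:
            # After the TLS handshake, everything sent over ssock is encrypted;
            # an eavesdropper on the network sees only which host was contacted.
            print("negotiated cipher:", ssock.cipher())
            print("server certificate subject:", ssock.getpeercert()["subject"])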
It is important to note that this is a beta service, so for now it only works with traditional search results – Google Maps and Google Image search are not yet supported, so switching to those results may take you out of your encrypted connection. It goes without saying that the encrypted connection may be slightly slower; the search results themselves are not affected.
To pre-empt any misunderstanding: encrypted search does not mean Google no longer stores your search data – it only means third parties cannot listen in to the connection between you and Google. “Searching over SSL doesn’t reduce the data sent to Google – it only hides that data from third parties who seek it,” Roseman explains.
Do any of you plan to use the encrypted version of Google?
Well, that’s all fair and good, but lately the worry has been about Google itself invading our privacy.
True enough, but there are any number of tools to block the various Google services, and https://ssl.scroogle.org/ for search.
Exactly. The people that I’m most interested in keeping my search history private from… are Google themselves.
If you lived in Iran you would have different concerns.
I live in Australia and I'm already concerned.
I love all the stuff coming from Google. I just can’t believe they haven’t released a box we can deploy within companies.
Would I love to use the Google Chart API for internal charting? Absolutely! Would I risk sending all our corporate data to Google over the web? Nope.
Just like they have a Google search appliance, they need a Google X appliance. They make great products; they just need to get them out to customers in a box.
I know Google is ‘not evil’, but I’m interested in keeping our corporate data away FROM GOOGLE. I’m not too concerned with my web search being tapped by someone.
I know it's a deployment headache and you have to deal with upgrades… but it's the only way into many corporate environments.
Google Office Appliance
Email
Calendar
Docs + Search
If such an appliance existed, and all info stayed within the organization (intranet), I could imagine it would pique a lot of interest.
This has already existed for years now:
http://www.google.com/enterprise/search/gsa.html
We’re talking about more than just “search”.
I've been thinking the same. Google apps running off my own hardware and limited to my own network? Fantastic. I'd love to point Analytics code at my own back-end service, even. Provided the apps are not built for a specific browser (say… IE), it should actually free up much of your upgrade grief. Upgrade the server and software, and the new version is waiting for users' browsers in the morning.
Storing my data in a hosting service from a third party remains a hell no.
I have no problem putting my data at a hosting facility; I just want to be the owner of the server hardware and know who has access to it.
That would be acceptable also. I consider a rented locker at a hosting facility as within the network given tunneling and such.
Corporations cannot by definition be evil, but people can, and there are any number of people within Google and even external to Google who have access to the data Google collects. This should be a major concern for everyone – it’s not about doing anything illegal, it’s about our privacy and our right to determine whether we want to be targeted by their advertising based on searches we do.
Just as web indexing should only be allowed on sites that specifically allow it rather than having to specify directives that disallow it (which they really don’t have to honour anyway), collection of data from searches should only be allowed if specifically permitted by the user. Give the user the choice the first time they perform a search in that session and set a session cookie so that there’s no server side tracking required. It would be a very simple thing to implement if Google really cared about privacy.
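As illustration only, here is a minimal sketch of that opt-in flow, assuming Flask and entirely hypothetical route names, cookie name, and stubbed-out search logic – not anything Google has described:

    from flask import Flask, make_response, request

    app = Flask(__name__)

    def run_search(query, log):
        # Stub standing in for the real engine; only record the query if log=True.
        return "results for %r (logged server-side: %s)" % (query, log)

    @app.route("/search")
    def search():
        consent = request.cookies.get("allow_tracking")  # hypothetical cookie name
        if consent is None:
            # First search of the session: present the choice instead of tracking.
            return 'Track my searches? <a href="/consent?ok=1">yes</a> / <a href="/consent">no</a>'
        return run_search(request.args.get("q", ""), log=(consent == "yes"))

    @app.route("/consent")
    def consent():
        resp = make_response("Choice saved for this session.")
        # No max_age/expires makes this a session cookie: it dies with the
        # browser session, so no server-side tracking state is required.
        resp.set_cookie("allow_tracking", "yes" if request.args.get("ok") else "no")
        return resp

    if __name__ == "__main__":
        app.run()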
Alas it will never happen because like any corporation, Google is about making money, and openness of this type would impact their business model.
I could see it actually helping Google. The reason Outlook is so popular is that so many people have to use it at work, so they already know how to use it and are comfortable with it. This would also get people used to the idea of working with documents on the web/network, so Google Docs and even Office Live would have to compete to get better and better.
The only difference will be one is hosted by your company and the other is from Google or even Microsoft if you’re using Office Live.
To the contrary, corporations are evil by their very nature, since they are deeply immoral.
Moral = If anyone can do it the same way without something very bad happening, it’s fine (Kant’s definition)
Corporation = Making money, no matter how as long as you don’t get caught.
As you said…
First, I feel I have to get this out of the way: you're an idiot. And that's not just spurious name-calling; you really don't know what you're talking about.
If you’re referring to Kant’s categorical imperative, it runs: “Act only according to that maxim whereby you can at the same time will that it should become a universal law.” Which is a far cry from what you’ve paraphrased. In fact, Kant was about as far from a consequentialist as can be imagined; he was one of the champions of deontological ethics.
A corporation is just a group of people united for a common purpose. Yes, that purpose is to make money, but the ethics of the corporation are the ethics of the group – a corporation won't try to make money “no matter how” if the ethics of the group run against it. Of course, people can be greedy and selfish and short-sighted, which is why some corporations will act that way as well.
It's a very popular, but largely unfounded, notion that a corporation will do anything to make money. There are hundreds of thousands of corporations that participate in their communities, in sponsorship, and in every benevolent activity you can think of.
But then of course, such facts get in the way of hating ‘the establishment’.
+1
Awesome, awesome reply. Truly.
This has already existed for years now:
http://www.google.com/enterprise/search/gsa.html
“Just like they have a Google search appliance, they need a Google X appliance”
We know they have a search appliance. They need appliances for everything else they produce.
DISCLAIMER: Google employee here, working in Apps “ops”
I used to work in corporate IT before joining Google, and the reality is that the pace, processes, and development models of IT organizations are radically different from (I'd go farther and even say incompatible with) “cloud/internet/saas/insert preferred moniker” software companies.
Consider today's typical IT org, with more process managers (Change Control Manager, Change Control committees, infrastructure requirement forms, etc… ad nauseam) and a protracted deployment for even the most basic application. In one of my previous companies, it now takes up to a MONTH for a simple DNS change, because it first has to go into a form, where it is reviewed by a CC process manager (no clue what he does, but he has to approve it), then submitted to the CC committee, where it is “reviewed” by several directors who have no clue what it is either. Then a report of approved changes comes down, and only then can it get executed. <Snore…>
Contrast this with the typical pace at web companies: http://timothyfitz.wordpress.com/2009/02/10/continuous-deployment-a… . This, maybe in not so extreme ways, is very typical.
All of the major Internet apps we all love get almost weekly, if not daily, improvements – some too small to notice, some quite significant. None of that is possible in a “standard” IT organization, even if it were possible to decouple the apps from their underlying special infrastructure (a topic I'm not even going to explore).
The solution to this different-gears problem between corp IT and internet apps would be to do a point-in-time feature freeze, “package” it for corp deployment, and then manage the patching, upgrades, etc…
But this is very, very costly: it would force the companies to maintain legacy versions and add so much overhead that it would no longer be cost-effective at all. It is a model that, I believe, cannot be sustained both profitably AND affordably.
In the end, the question becomes: does a company want to get bogged down by this huge anchor, slow down, and spend tons of resources for little return, OR make fast progress, take the 80-20 rule for the market, and (in the case of Google) hope that good ethics, transparency, and good internal security & reviews address the (valid) concerns of the community?
Maybe we need something in between for large corporations.
An external company doing frequent software upgrades (possibly without access to the data) on servers running on redundant hardware.
I'll replace any http address with https at the first given opportunity. Cleartext protocols like http, ftp, telnet, pop, smtp, and snmp must all die… die… die… die… There is no excuse for not using encrypted protocols these days.
(OSAlert, TechRepublic, I'm looking squarely at you with your http login forms. >:| )
I say, address the BS of overpriced third party validation certificates and get the cleartext protocols off the network for good.
Forgive my ignorance, but what’s wrong with self-signed certificates? They’re free to set up and still offer you SSL/TLS encryption.
I’ve got one running on my FTPES server after following the linked wiki:
http://wiki.archlinux.org/index.php/Sftp
(read the 2nd section)
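For reference, generating one takes only a few lines. A sketch assuming a recent version of the third-party Python cryptography package; the hostname and output filenames are made up:

    import datetime
    from cryptography import x509
    from cryptography.x509.oid import NameOID
    from cryptography.hazmat.primitives import hashes, serialization
    from cryptography.hazmat.primitives.asymmetric import rsa

    key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    name = x509.Name([x509.NameAttribute(NameOID.COMMON_NAME, "ftp.example.org")])
    cert = (
        x509.CertificateBuilder()
        .subject_name(name)
        .issuer_name(name)  # issuer == subject, i.e. self-signed
        .public_key(key.public_key())
        .serial_number(x509.random_serial_number())
        .not_valid_before(datetime.datetime.utcnow())
        .not_valid_after(datetime.datetime.utcnow() + datetime.timedelta(days=365))
        .sign(key, hashes.SHA256())
    )
    with open("cert.pem", "wb") as f:
        f.write(cert.public_bytes(serialization.Encoding.PEM))
    with open("key.pem", "wb") as f:
        f.write(key.private_bytes(
            serialization.Encoding.PEM,
            serialization.PrivateFormat.TraditionalOpenSSL,
            serialization.NoEncryption(),
        ))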
Well, depending on bit strength, self-signed certificates are just fine. I was focusing on the general network, so my mind was thinking in terms of the third-party certificate a big-name site would get. Self-signed certificates tend to scare average users (they involve a warning, which involves reading). If you visit Google and are asked to accept a self-signed cert, you should have questions. That's also why I suggested reducing the cost of cert signing, as it's currently an overpriced protection racket in most cases, especially if you want SSL/TLS that is actually safe.
It's because a MITM attack is so much easier. An attacker could place himself in the middle and hand you HIS self-signed certificate while connecting to the legit site on your behalf, reading ALL your data (meaning the SSL is now useless). At least now, with a limited number of signing authorities, it's damn near impossible to do this. If a certificate authority goes rogue, the browsers just need to remove its root certificate.
Besides, StartCom provides FREE class 1 SSL certificates, and its root is trusted by every major browser (except Opera, but I believe they fixed that now). My domain runs off one, and never receiving the “WARNING” when I switch computers or browsers is very reassuring.
If you don’t want the warning prompts from your own self signed certificates, then just install your signing cert into your browser. However, it’s quite a hassle to get visitors to your site if they all get the warning.
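The code-level equivalent of installing the cert is telling the TLS stack to trust exactly that certificate. A hedged Python sketch; the path and host are hypothetical and carry over from the generation example above:

    import socket
    import ssl

    # Trust exactly one certificate: the self-signed cert.pem generated earlier,
    # analogous to importing it into the browser's certificate store.
    ctx = ssl.create_default_context(cafile="cert.pem")
    with socket.create_connection(("ftp.example.org", 443)) as sock:
        with ctx.wrap_socket(sock, server_hostname="ftp.example.org") as ssock:
            print("verified subject:", ssock.getpeercert()["subject"])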
Actually, third-party certificate validation isn't as rock solid as people like to think either. Unless you pay the premium protection-racket fee for a cert that validates all the way back up the chain (usually involving a grand or two in fees and a background check), MITM is still mostly limited only by being able to position oneself between the two ends of the stream.
Mix a little Dan Kaminsky DNS magic with some Moxie Marlinspike SSL MITM and whammo!
So, it’s still down to bit strength and strong cert validation.
As someone else explained, with a self-signed certificate you can't be sure who you are talking to; it may be encrypted, but if it's with the wrong person, then who cares that it's encrypted?
But there are other ways:
https://www.startssl.com/
http://cacert.org/
I know of very few concerns with using https.
1. You need a dedicated address, which probably means an IPv4 address, and we are running out of those. Not good. I would love to see websites adopt: https for IPv6 users only.
2. There needs to be enough entropy to do the encryption. Banks recently had DoS attacks and their https sites were really slow – not because of CPU-bound encryption (for which they possibly already have extra hardware), but because of an entropy shortage.
Vhosts: multiple sites/domains sharing a single IP. When the browser connects to the IP, its Host header lists the domain it wants, and the webserver serves the matching site.
It would be much easier if vhosts could share an IP without sharing an SSL cert. The certificate is bound to the domain name, not the IP it's currently hosted on. This may reduce the trust in certificates, though, as you're still sure of the cert/domain pairing but no longer sure of the location.
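For what it's worth, TLS's Server Name Indication (SNI) extension does essentially this: the client names the vhost in its ClientHello, so the server can pick a per-domain cert on a shared IP. A rough server-side sketch using Python's ssl module; hostnames and PEM paths are made up:

    import ssl

    # One listening IP, two vhosts, two certs: pick the right one from the
    # SNI hostname the client sends during the handshake.
    contexts = {}
    for host in ("alpha.example.org", "beta.example.org"):
        c = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
        c.load_cert_chain("%s.pem" % host)  # cert + key for that vhost
        contexts[host] = c

    def pick_cert(ssock, server_name, initial_context):
        # Called mid-handshake with the hostname from the ClientHello.
        if server_name in contexts:
            ssock.context = contexts[server_name]

    base = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
    base.load_cert_chain("alpha.example.org.pem")  # fallback for non-SNI clients
    base.sni_callback = pick_cert
    # base.wrap_socket(listening_socket, server_side=True) would then serve
    # both vhosts from the single shared IP.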
This seems like an idea:
http://psivision.blogspot.com/2010/03/data-operations-in-encryption…
That is, the target data is encrypted, as are your commands/operators. The only plain text space is within your perimeter.
This means if someone has access to the target data, it is meaningless. Similarly, what you do with it (transform, query, reduce, etc) is also meaningless as the operations/programs are encrypted too, even while they operate on the encrypted data.
This would solve some of the security fears around cloud computing.
Nice idea – could it be implemented? Or is there a theoretical reason this can’t be done?
[plain text, my machine]
|
|
|
===== encrypt/decrypt at perimeter ================
@
@ encrypted command/program/operator
@
@
[encrypted data space]
It's called homomorphic encryption ( http://en.wikipedia.org/wiki/Homomorphic_encryption ). Basically, yes, there exist encryption systems such that a remote computer can be given encrypted data and an arbitrary program, convert the program to run on encrypted data, run it, and send back an encrypted result without ever learning anything about the data. As that article mentions, such cryptosystems are very recent (the first one was discovered less than a year ago), and their inefficiency makes them impracticably slow for many (most?) applications.
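A toy way to see what “computing on encrypted data” means is the multiplicative homomorphism of unpadded (“textbook”) RSA – not one of the recent fully homomorphic schemes described above, and wildly insecure with numbers this small, but it shows the principle: the server multiplies ciphertexts it cannot read, and the owner of the key decrypts the product.

    p, q = 61, 53                      # toy primes -- utterly insecure
    n, e = p * q, 17                   # public key (n = 3233)
    d = pow(e, -1, (p - 1) * (q - 1))  # private exponent (Python 3.8+ modular inverse)

    def enc(m):
        return pow(m, e, n)            # textbook RSA encryption (no padding)

    def dec(c):
        return pow(c, d, n)

    a, b = 7, 6
    product_cipher = (enc(a) * enc(b)) % n  # the server multiplies only ciphertexts
    assert dec(product_cipher) == a * b     # yet the decryption is the product: 42
    print(dec(product_cipher))

Real RSA deployments use padding, which deliberately destroys this property; fully homomorphic schemes extend the idea to arbitrary computation, at the cost of the slowness mentioned above.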
I switched to HTTPS as soon as it was announced. I think I’ve lost suggestions, but that’s an acceptable temporary cost. Basically, I try to cut out as much of the unknown as I can. I know Google is under scrutiny, and seems to police themselves well enough for my purposes. I have no idea if my ISP does deep packet inspection, or if my wireless connection or the cable line it’s connected to have been compromised. Besides that, there’s public wifi and school connections that have no expectation of privacy. I just wish I could get all of Google’s services via SSL, like iGoogle.
I assume anything not encrypted is being read. This assumption comes from my boarding school days when I played around with a packet sniffer. For the span of a few minutes, that assumption was veritably true for my classmates since I was the one doing the reading. Then I realized that I really couldn’t care less if the girl in my French class was cheating on her boyfriend, or the guy down the hall was using AIM. So I never bothered with that again, but always assumed that I wasn’t the only one who ever tried that.
Suggestions work, at least in Firefox.
OK, it's not quite what you're after, but you can at least protect your traffic on open networks such as school with a quick and easy OpenSSH proxy. You'd need *nix at home, but OpenSSH and PuTTY both work on the mobile side. I tend to use the quick/dirty proxy when traveling, as it saves wondering just what is on the untrusted network I connect from.
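For the record, the quick/dirty version is OpenSSH's dynamic port forwarding. A sketch with a hypothetical user and host; once it's running, point the browser's SOCKS proxy at localhost:1080 and the open network sees only the encrypted SSH tunnel:

    import subprocess

    # -D 1080 opens a local SOCKS proxy that tunnels through the home box;
    # -N means no remote command, just forwarding.
    subprocess.run(["ssh", "-N", "-D", "1080", "user@home.example.org"])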
I suspect the real driver behind this https: trial is still to come. I can’t see a huge win for secure search. Using it with search gives it a good high-volume test.
Perhaps an application or music store for Google’s various products…
So my search is secure, but Google is still keeping logs of my searches for 9 months and the cookie for 18 months. I should be able to opt out of that!
This is cool. Now only Google can violate our privacy. Hurray!
There’s also https://www.shroomery.org/ssoogle/ and http://scroogle.org/ if you want to search without Google tracking you as well.
Yeah, the search will be encrypted, but clicking through to the search results will still take you out of the encrypted connection.
Anyone sniffing on your connection will be able to derive your search terms just by looking at your outgoing links.
You can find the top Google keywords for any given link on many sites, e.g. seodigger.com.
I just checked in three browsers (Firefox, Opera, Chrome), and going from https to a different domain at least doesn't send a referrer, so no one can see the search term.
That's at least something.
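If you want to reproduce that check yourself, a tiny harness like this (standard library only, run locally) prints whatever Referer header a browser sends when you click through to it from an https page – browsers omit it on https-to-http navigation:

    from http.server import BaseHTTPRequestHandler, HTTPServer

    class Handler(BaseHTTPRequestHandler):
        def do_GET(self):
            # Expect None when the link was followed from an https:// page.
            print("Referer:", self.headers.get("Referer"))
            self.send_response(200)
            self.end_headers()
            self.wfile.write(b"ok")

    HTTPServer(("", 8000), Handler).serve_forever()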
It would seem this is in direct response to the uncensoring of search results (and subsequent blocking by the Chinese government) that Google recently did in China, no?
I get the feeling it's more about good PR. The Chinese government, like most, asks for top-level SSL certs, which would allow it to sniff your SSL connection. If you're selling encryption to Chinese buyers, you'll need to hand over decryption keys; again, no different from other governments. On the theory that the PRC can already open Google's SSL traffic, I see Google getting more ROI from the good PR. With several slip-ups in the last few months, I'm thinking it's more about company image.
Still, it's another cleartext protocol being replaced by an encrypted one, which should have happened long ago, so I'm not complaining; another domain goes into my auto-https plugin.
There’s no real downside to this that I see. While we’d all love greater security than this provides, and it obviously doesn’t keep Google from recording what we’re up to, it is an extra layer of security at pretty much no cost. I switched my browser to using the new https search immediately upon finding out about it.
You lose image search, but being aware of that, it's easy enough to find and use directly, and it'll probably end up being available over https soon enough anyway.
I see no real downside to this. If anything, it would be nice if more sites used https.