Google is a platform, and like any platform it is a tool that can be abused. That is why generic reassurances such as the lame “Don’t Be Evil” just don’t go far enough. Google seems to have a problem with policy: the kind of policy that allows a user to make their own decisions about the potential risks of using the service. Chris Byrne calls this out clearly in his latest blog post. According to Chris:
The bottom line here is that Google should have been up front in their documentation and disclosed what exactly the install process would do, what ports would be needed and why they would be needed. One of the questions in the help section is “Does Desktop Search install malicious software?” The answer they give is “No. When you download and install Google Desktop Search, you’re just getting Desktop Search. That’s it. End of story.” Like a politician in a debate, the answer really is not as clear as it should be. That is just plain wrong and irresponsible of Google.
Taking Chris’s thoughts to their natural conclusion, I believe that corporate IT departments should immediately make a clear commitment to either
a. fully support Google Desktop Search, including end-to-end security remediation
b. completely ban use of the product on corporate networks
These organizations are already struggling to cope with current security breaches and threats, especially on enterprise desktops. Now Google has added another potential attack surface: a new listener on the desktop. Of course SP2 should block the port, as should a personal firewall, but GDS certainly adds new risk to corporate desktops and to the operational controls enterprises depend on.
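For an admin who wants more than gut instinct, a first sanity check is simply to probe whether that local listener is actually up on a given machine. Here is a minimal sketch in Python; note that port 4664 is an assumption based on GDS’s commonly reported localhost default, so verify the number against your own installation:

```python
import socket

def port_is_listening(host: str, port: int, timeout: float = 0.5) -> bool:
    """Return True if something is accepting TCP connections on host:port."""
    try:
        # create_connection attempts a full TCP handshake; success means
        # a live listener, refusal or timeout means none (or a firewall).
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# GDS is reported to serve its local UI on 127.0.0.1:4664; treat that
# port number as an assumption and confirm it on your own install.
if port_is_listening("127.0.0.1", 4664):
    print("A listener is active on port 4664; investigate what owns it.")
else:
    print("No listener on port 4664.")
```

A connect test like this only tells you that *something* answers on the port; pairing it with `netstat -ano` (on Windows) to map the port to a process is what actually identifies GDS.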
I would advise IT departments to go for b above and work out whether to support the tool later, once the risks can be better assessed and more eyeballs have got to work on the problem. It may seem a shame to lock down end-user apps, especially ones that fall into the “productivity” space. But which is worse: annoying a few end users, or failing a Sarbanes-Oxley audit? I think in this case you may find the CEO is on your side…
As Bruce Schneier keeps trying to help us understand, you can’t remove risk entirely, but you can help people make more informed decisions. Google hasn’t helped its users enough in this case. Google is not just a tool for home users; it is also now an important business application. Google therefore needs to do a better job of helping enterprise customers build usage policies, especially if it really wants to be different from Microsoft, which has had, for example, some email policy issues of its own.
Policy may be boring, but it gives us a basis for decision making that goes beyond gut instinct. “Don’t Be Evil” is just a matter of opinion. “Don’t install GDS on a machine used for corporate business, even at home, for now” is a sensible IT policy.
October 18, 2004 at 6:36 pm
not to disagree with my esteemed colleague, but i don’t agree 😉 it’s not that i dispute the substance of your point, James, or Chris’s; they’re legitimate, if a little overdone, concerns.
but rather, i wonder why there is such a hue and cry about GDS when it’s really not a new application category. the same arguments could be made about X1, Copernic, or any of the other desktop search tools out there, or, even more to the point, about any number of applications that users download, like Kazaa or Limewire.
basically i think IT shops need to make sound decisions for their workstations in general, rather than single out GDS as a security scapegoat. GDS introduces security concerns, true, but it’s hardly the only example of that, and surely not the worst.
Christopher Byrne says:
October 18, 2004 at 8:40 pm
I got some pushback on the IT Governance listserv as well, but I stand by what I said and will post why tonight on the blog. Appreciate the pushback, though!
Prentiss Riddle says:
October 19, 2004 at 3:24 pm
I’m new to this issue, and a sentence or two explaining the risk would have helped me make sense of your post.
A wild guess: does GDS send keywords to Google so that Google can send back a sidebar of Sponsored Links and Related Searches? If so, then yes, I can see how that would constitute a risk. Google should be up front about it and make it easy to turn the feature off.
That said, I just had a similar discussion with a friend about privacy concerns in Gmail. He was suspicious about storing his e-mail on a service whose business model involved narrowcasting advertising, even if the service claimed that the information flow was one-way from marketers to users. My response was that any ISP or mail provider has the same opportunity to abuse its customers’ privacy, and I’d bet that Google has a stronger motivation than some no-name operation to keep its nose clean.
Of course GDS and Gmail are different animals, as are institutional security and personal privacy, so conclusions about one don’t necessarily apply to the other.
Chris Byrne says:
October 20, 2004 at 2:58 pm
culled by JJG from a newsgroup posting (with permission from Chris):
> I have seen a number of organizations do a total lockdown, including
> one that would not allow users to add any shortcut icons to their
> desktop, running a process each night to remove unauthorized
> shortcuts (a bit draconian?), but with mixed results.
> It comes down to a balance of trust and controls, the ability to
> support users’ needs if there is a total lockdown (i.e. handle user
> requests for software installs, updates, etc.), the establishment of a
> clear and consistent policy (including penalties for non-compliance),
> documentation, and a strong political will (i.e. the backing of the C-level people).
> Which leads to this thought: C-level types are many times the biggest
> offenders because they have no fear. You wrote “Let’s also suppose the
> environment is not locked down and people do contact tech support
> after installing a software that is not part of the SOE. What good
> solutions do people have to minimise this load?”
> You might adopt a policy stating that these users cannot receive
> support for their call because it is an unsupported application.
> Does that policy get applied when it is the CEO/CFO/CIO
> calling, the same CEO/CFO/CIO who calls to get technical support for
> their home computers or so that their children can get something done?
> If you use a policy to fire or otherwise penalize an employee for
> non-compliance, what happens when they take you to court because they
> know that C-level people have not been sanctioned for similar actions?