Ethical Use of Data in Business

I had this paper due for class at Babson. It was a longish assignment, asking us to opine on the ethical use of data and the general loss of privacy in the business world today. I’m posting it here, as I’m hosting the Carnival of the Capitalists this week, and I figure I ought to have something to contribute.

In any event, I got an A on the paper, so it must be worth something. Read on.

Privacy has become a major source of contention in the business world today. The standard business maxim that one should “know thy customer” has begun to be put to the test. How much should a business know about its customers, and to what end? Specifically, how does a business balance its needs with its customers’ right to some measure of privacy? And how should that business go about implementing a proper privacy policy?

There are numerous ways in which a company can violate a customer’s privacy. For the purposes of this paper, I will define three types of privacy loss that can occur within a business: Contrived Loss, Custodial Loss, and Legal Loss. Contrived Loss occurs when a business intentionally acts to deprive a customer of his privacy, usually by demanding that he sign away privacy rights in exchange for doing business with (or receiving discounts from) the retailer. The standard End User License Agreement (EULA) or privacy policy of a website is the typical mechanism used online. The second type, Custodial Loss, occurs when private information is lost through negligence on the part of the retailer, theft of the retailer’s data, or some combination thereof. It is not always clear what the business’s liability and responsibility are in the event of a custodial loss. Finally, there is Legal Loss, which occurs in the context of some sort of legal action, such as a new regulation requiring the collection of data, or a legal event such as a bankruptcy or the buyout of a company. The convergence of these losses can have serious consequences for the customer as well as for the business.

On the other hand, there are serious questions about the degree to which consumers truly care about their privacy at all. After all, they sign away their privacy every time they use a credit card, and seem willing to sign away their privacy in exchange for $0.50 discounts at the supermarket. To what degree should businesses care about a consumer’s privacy when the consumer himself, indeed society at large, doesn’t seem to care all that much?

Contrived Loss
I define a contrived loss of privacy as occurring when a business intentionally decides to collect and/or use data in a way that the customer would consider a violation if he were fully aware of it. The most common example of this type of loss is found in EULAs and privacy policies on websites, simply because the policy is published for all to see. These agreements typically ask the consumer to give up some or all of his rights to privacy in exchange for the opportunity to conduct business with the company.

One common example that has sprung up recently is the online freebie. Since the enactment of the Do Not Call list by the FTC, telemarketers have been at a loss as to how to get permission to call people’s houses. One way that they do this is through the online give-away, where a user fills out his name, address and phone number to get a free sample of some product or other, or to be entered into a sweepstakes. The legal verbiage that accompanies such giveaways commonly contains language giving the company (or its affiliates) permission to contact you with additional offers. Typical language may include, “We may sell the personal information that you supply to us and we may work with other third party businesses to bring selected retail opportunities to our members via direct mail, e-mail, and telemarketing.”

Oftentimes, it seems as if companies’ lawyers go overboard in trying to acquire the rights to use data, without giving thought to the obvious business implications of using such heavy-handed language. Such was the case with Hilton Hotels. Its online privacy policy as of June 24th (since removed after an outcry on the Internet), after defining what types of information may be collected and used at the website, including the typical information required to reserve a hotel room and access a website, read, “You agree that HHC (Hilton Hotel Corporation) shall own all Information.” The policy went on to assert that Hilton might go so far as to publish such information, or share it with anyone it saw fit.

Now consider for a moment what information a hotel gathers in the process of taking a reservation over a website. They know your travel plans, including your precise location. They know your credit card number. They may know your travel history if you use a frequent flyer or rewards points program to book your room. After you’ve stayed, they know what you ate at the hotel, and, if there’s a casino there, how much you gambled. Their surveillance cameras may even have taped what you did and whom you spent time with at the hotel. And they reserve the right to publish such information? Traditionally, hotels have prided themselves on maintaining the privacy of their patrons, treating the hotel as a home away from home. What possible business purpose could it serve for a hotel to claim the right to collect and publish any and all data it has about its customers?

The answer, in the end, was that it didn’t, and on July 5th Hilton replaced its policy with a reasonable one. The lesson here is to be sure that the data you collect, and the data you claim a right to use, has an actual business purpose behind it. Far too often, businesses either hand off the task of figuring that out to overzealous lawyers, or get caught up in believing that their assets aren’t the facilities that make up their businesses and the relationships that bind their customers to them, but rather the data generated by people buying their products.

Sometimes, a vendor uses its privacy policy to extract concessions it would never get in a face-to-face transaction. For example, in the United States, companies are often required to offer an “opt-out” option for their customers, allowing them to decline to receive additional offers or to have their information shared with affiliates and partner companies. In Europe, it’s the opposite: companies are required to ask permission, that is, to ask customers to “opt in” to receiving additional offers or having their information shared. One vendor, however, does none of the above.

Ticketmaster, known for having a near monopoly on electronic concert ticket sales, uses the following language in its privacy policy:

By purchasing a ticket, or completing a registration form so that you are able to access a purchase page for a ticket, to a concert, game or other event on the Site, you consent (i.e., you opt-in) to us sharing your personal information with the venues, promoters, artists, teams, leagues and other third parties associated with that concert, game or other event (“Event Partners”). We cannot offer you a separate opportunity to opt-out, or not to consent, to our sharing of your personal information with them.

The policy goes on to say that if you want to opt out of their Event Partners’ programs, you’ll have to contact them directly. Of course, Ticketmaster does not state who these partners are, and gives no means of contacting them, though it does state that credit card numbers are shared only under special circumstances. Ticketmaster says that because it is merely an agent for the venues whose tickets it sells, it is the venues’ privacy policies that apply, and its own responsibility is not relevant.

But Ticketmaster’s policy just doesn’t pass the smell test. Surely, buying a ticket through a travel agent doesn’t yield the same type of policy. Travelocity, an online travel agent, provides a clear and relatively unambiguous policy:

When you reserve or purchase travel services through Travelocity, we provide information about you to the airline, car-rental agency, hotel, travel agency or other involved third party to ensure the successful fulfillment of your travel arrangements. We do not sell individual customer names or other private profile information to third parties and have no intention of doing so in the future. Occasionally, Travelocity will hire a third party to act on our behalf for projects such as market-research surveys and contest-entry processing and will provide information to these third parties specifically for use in connection with these projects. The information we provide to such third parties is protected by a confidentiality agreement and is to be used solely for completing the specific project.

Travelocity goes on to specify their opt-out policy, whereby you can stop being pestered by their email newsletters and the like.

But it isn’t just Ticketmaster that fails to offer an opt-out mechanism to its customers. Ford Credit and Chrysler Financial both refuse to offer opt-outs in their privacy policies. The relevant law here is the Gramm-Leach-Bliley Act, which mandates that a bank send its customers its privacy policy once a year and give them an opportunity to opt out of having information shared with non-affiliated companies. However, the definition of affiliation is what makes this law so easy to evade. Parties involved in joint marketing agreements, or who are service providers to the financial institution, can count as affiliated even though the financial institution in question doesn’t own them. So Ford and Chrysler send out privacy policies that say there’s nothing to opt out of, because, literally, there is nothing to opt out of. The lenders simply structure their agreements so that every vendor they want to share information with is legally an “affiliate,” thus evading the need for an opt-out.

The worst example yet may be that of one vendor whose policy states, “Except as limited below, we reserve the right to use or disclose your personally identifiable information for business reasons in whatever manner desired.” There seems to be a psychology at work here which says that a company should collect any and all data it can, and reserve the right to use it in any way it wishes… just in case. But adopting that tack can lead to unforeseen consequences if that data is lost by the business, that is, stolen or subpoenaed in court. The business may ruin its reputation, and damage its future prospects as a result.

Sometimes, businesses even intentionally try to lose customers. Retailers KB Toys, Sports Authority, and Express are implementing “electronic blacklists” whereby they attempt to steer undesirable customers away from their establishments. In some cases, these are customers who only purchase loss-leading merchandise; in other cases, these are customers who simply return “too many” items. One customer of the Express clothing chain was told that she returns too much merchandise, and would not be allowed to return any more. The customer, who used to spend $2,000 per year at the store, is now too embarrassed to return. This would seem to be a formula for being penny wise and pound foolish with one’s data. After all, even if she weren’t a profitable customer on her own, how many friends did she take with her to the store when she used to shop, friends who will no longer be returning? Were the cost savings acquired by losing this customer worth the bad publicity they generated? Best Buy is also attempting to steer unprofitable customers away from its stores, but is using more conventional means. As its President, Bradbury Anderson, states, “Culturally I want to be very careful. The most dangerous image I can think of is a retailer that wants to fire its customers.” Indeed.

Custodial Loss
Custodial loss is when a customer’s privacy is lost because a vendor did not maintain adequate security over its customer data. Such loss may incur liability for the company. A company, in determining its security policy, should therefore look not only at how secure its data is, but at whether it needs that data in the first place. If there is no legitimate business reason for holding data, whether customer data, employee data, or even company emails, then the company ought to discard it rather than run the risk of being held liable for its loss.

John Patrick, former Vice President of IBM and current board member for several tech companies, writes about one such issue: free online publications requesting registration data from their readers. The problem with doing this, aside from the obvious question of whether people enter valid data in the first place, is the security risk created by asking people to remember so many usernames and passwords. Some sites request an email address as a username, while others request a string of characters, sometimes with a numeric character required. Password requirements differ from site to site as well.

Security experts recommend using a different password for each site visited. But given the number of sites that require passwords, few people are thought to actually do so; instead, they use variants of the same password over and over again. Given that many content sites do not even use secure connections when asking for such passwords, people are undoubtedly sending unencrypted usernames and passwords over the Internet, usernames and passwords which may be the same ones they use for online banking or other commercial purposes. What liability does a content site run when it asks for a username and password for no identifiable purpose? Furthermore, what liability does a retailer run by requiring only a username and a password for identification purposes to complete a transaction?
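One mitigation on the site’s side (not from Patrick’s article, just a minimal sketch of a standard practice) is to store only a salted, stretched hash of each password, so that even a breached site never exposes the raw password a user may be reusing at his bank:

```python
import hashlib
import hmac
import os

def store_password(password: str) -> tuple[bytes, bytes]:
    """Derive a salted, stretched hash; the site keeps only (salt, digest)."""
    salt = os.urandom(16)  # a unique random salt per user defeats precomputed tables
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def check_password(password: str, salt: bytes, digest: bytes) -> bool:
    """Recompute the hash and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return hmac.compare_digest(candidate, digest)

salt, digest = store_password("hunter2")
print(check_password("hunter2", salt, digest))  # True
print(check_password("guess", salt, digest))    # False
```

A thief who steals the site’s database gets only salts and digests, not the passwords themselves, which limits the damage from exactly the kind of custodial loss described above.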

Patrick runs through the steps required for genuine security on the Internet: digital IDs with public key encryption. Such a system would identify the user, authenticate who he is, authorize him to do certain things (and prevent him from doing others), maintain confidentiality, maintain data integrity, and enable non-repudiation, that is, provide proof of the transaction such that the user cannot deny he engaged in it. There is nothing new about such systems. There have been attempts to deploy them commercially, with little success.
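To make the non-repudiation idea concrete, here is a toy sketch of public-key signing, using the classic textbook RSA parameters (far too small to be secure, and purely illustrative, not a description of any deployed system): only the holder of the private exponent can produce a valid signature, so he cannot later deny having signed.

```python
import hashlib

# Textbook RSA with tiny primes (p=61, q=53): n=3233, e=17, d=2753.
# Illustrative only; real systems use 2048-bit moduli and padded hashes.
n, e, d = 3233, 17, 2753

def sign(message: bytes) -> int:
    """Only the private-key holder (who knows d) can compute this."""
    h = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(h, d, n)

def verify(message: bytes, signature: int) -> bool:
    """Anyone with the public key (n, e) can check the signature."""
    h = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(signature, e, n) == h

sig = sign(b"I authorize this purchase")
print(verify(b"I authorize this purchase", sig))  # True
# A tampered message fails verification (barring a collision in the
# tiny demo modulus):
print(verify(b"I never said that", sig))
```

The same key pair also supports the identification and integrity functions Patrick lists; confidentiality would additionally require encrypting with the recipient’s public key.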
One such company was Zero Knowledge. Zero Knowledge had a system called Freedom, which maintained user privacy to such a degree that even Zero Knowledge itself would be unable to violate it. However, the company was unable to sell the system either as a program for individual users to purchase or as an add-on bonus for ISPs to offer.

Adam Shostack, the company’s director of security, wrote a paper entitled “Economic Barriers to the Deployment of Existing Privacy Technologies,” in which he delineated the reasons why such products tend to fail:

It is easy to see that many privacy technologies obey Metcalfe’s law and therefore exhibit network externalities – their marginal value to a user increases with their expected number of users. Anonymous file sharing systems will become truly beneficial to users only when a large array of content can be readily, easily accessed. Anonymous email is unidirectional (and therefore less useful) unless both parties use the anonymizing network. The anonymity offered by such a network is bounded by the number of users. Similarly, electronic cash will only become useful if many merchants accept it. We may infer from this that DRM and other e-commerce systems are unlikely to push the acceptance of cryptographic ecash but rather will continue with existing payment methods such as credit cards.

Arguably, the same could be said for any security system. So long as the marginal cost of securing data exceeds the loss incurred by a breach of that security, companies are unlikely to upgrade their systems. And incompatibilities between security systems only increase that cost to users and companies, further discouraging their use.
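That cost calculation can be made explicit with some back-of-the-envelope expected-loss arithmetic (all figures are hypothetical, chosen only to illustrate the reasoning):

```python
def worth_upgrading(breach_probability: float, breach_loss: float,
                    upgrade_cost: float) -> bool:
    """A firm rationally upgrades only when the expected loss avoided
    exceeds the cost of the upgrade."""
    expected_loss = breach_probability * breach_loss
    return expected_loss > upgrade_cost

# A catastrophic but unlikely breach: a $5M loss at 1% annual probability
# doesn't justify a $200k security upgrade ($50k expected loss < $200k).
print(worth_upgrading(0.01, 5_000_000, 200_000))  # False

# If breaches become ten times as likely, the calculus flips
# ($500k expected loss > $200k).
print(worth_upgrading(0.10, 5_000_000, 200_000))  # True
```

This is why a loss that is catastrophic for the victim can still be rationally underinsured against by every individual firm, the point developed in the next paragraph.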

The central problem with a market-based system for privacy and security is that while the loss of privacy to any one company or individual can be catastrophic, the likelihood that the loss will befall any particular company or individual is minimal. And therefore, only the minimum is spent defending against such loss. Consider “The Club,” the locking bar that people put on their steering wheels to deter car theft. The Club is easy to defeat (just saw through the steering wheel), and yet people use it because it tells the thief to go steal the car that is marginally easier to steal. So long as people, whether consumers or corporate executives, view the privacy and security problem that way, we are sure to remain vulnerable to custodial losses of privacy in the future.

The question of custodial loss liability is a complex one. While liability always rests with the person who steals the data, or, in cases of gross negligence (such as the recent theft of computers from Wells Fargo), with the custodian of the data, in other cases liability is less clear. In large part, that’s because so many of our systems rely on Social Security numbers as both identifier and authenticator in transactions. Noted tech columnist Robert Cringely recently returned home from vacation to find that his identity had been stolen while he was away. Whom should Cringely blame for this? Assuming he didn’t do something stupid like hand out his Social Security number on the back of his business cards, should he blame the credit bureau that gave out his information incorrectly? Or should he blame the unknown institution that revealed his Social Security number without his permission? What recourse does he really have in this matter?

What’s most maddening is that even something as simple as a PIN to go along with the number, verified by the Social Security Administration, would go a long way toward alleviating these problems. But they don’t offer that. And if we, collectively as a society, don’t have the political will to fix this system, which is central to all our financial privacy, if we only have the political will to pursue bank fraud cases over $15,000, then why should a merchant, online or otherwise, have any greater will to do better? Unfortunately, our society has decided to chance our collective information security, and our privacy, on a roll of the dice, and simply hope that it’s the other guy who gets screwed.

Legal Loss
The most immediate thing that comes to mind with legal loss is the loss of privacy engendered when a subpoena reveals customer data that the customer would rather not have revealed. But that’s actually only part of the risk. The principal risk is that the law demands that businesses collect information that they previously had no requirement to collect, that consumers may not otherwise want them to collect, but that may provide a business benefit. These types of losses have the potential to be the most intrusive of all.

The federal government has required that all cell phones sold have location-tracing capabilities so that 911 responders can know where to go in the event of an emergency. While such a feature may benefit the consumer in the rare case of an emergency, by and large most people would not view having their phone report their whereabouts 24/7 as a feature they want. But from the phone company’s perspective, it’s an opportunity to market location-based services to the customer.

A similar thing is happening with automobiles, which the National Transportation Safety Board (NTSB) wants to come equipped with the equivalent of the black boxes found on passenger jets today. But imagine what happens when that data is uploaded to a central database, and the insurance company wants access to it to evaluate your driving. Or when the state government wants the data synchronized with the car’s GPS so it can issue speeding tickets without setting up speed traps. The possibilities are endless, yet many companies seem to view the government requirements as an opportunity to collect data and use it, even if they aren’t sure what they would use it for.

Currently, GM has voluntarily put these black boxes in every car sold since 2004, and only in California is it required to tell customers about them.

So let’s walk ourselves through a hypothetical scenario where these various types of privacy loss converge, and see how they can cause problems. Professor of International Business Bob Smith of Nosbab College has been having marital difficulties. After an argument with his wife one night, he storms out of the house for a few hours. His wife goes through his mail and sees an offer for a package vacation to Thailand, but the offer seems to emphasize the seedier sides of Bangkok, as opposed to a normal tourist package. Suspicious, his wife hires a private investigator, who manages to “acquire” data from the black box and GPS in Bob’s car (now legally required in this scenario). The data shows that the night Bob stormed out of the house, he drove around Chinatown a bit, and eventually parked on a side street, leaving about an hour later. From all the data collected, Bob’s wife concludes that Bob is paying for sex, and wants a divorce.

In fact, here’s what happened. Bob was planning an academic trip to Thailand, and searched around a bit on travel packages. The website he used sold this information to a number of travel vendors, one of whom concluded that a middle aged man looking to travel alone to Bangkok was interested in a sex tour, and took it upon themselves to mail such information to his home address. Meanwhile, the night Bob stormed out, he drove to Chinatown to get a bite with a friend and commiserate over the state of his marriage. When he arrived, he tried to avoid paying for parking, and so he went around the block a few times before he found a spot on a side street. An hour later, he and his friend concluded their conversation, and Bob returned home.

So where can Bob go to settle his grievances? Can he sue his car company for leaving his black box data insecure? Can he sue the website he searched for trips through for sharing his information? The solution is unclear, but the collection and sharing of this data may pose some liability for the companies that Bob does business with, assuming he can determine which ones shared his information, legally or otherwise.

So what is a business to do? On one level, it would appear that consumers are relatively unconcerned about their privacy. Consumers seem to place a higher value on convenience than on privacy. According to Australia’s Federal Privacy Commissioner, only 41% of users even bother to set their browsers to reject cookies when web surfing. If users cannot be bothered to protect themselves even this much, then why should businesses protect users any more than they already do, especially when privacy features may inconvenience the consumer, thereby risking sales the business might otherwise make were the safeguards not in place?

Part of the answer would seem to be to retain only data that serves a real and tangible business need. And that goes for non-customer data as well. Wal-Mart is currently being sued based in part on human resources data discovered during trial, data that is unusually rich and likely serves no real business need. Collecting and retaining such data only opens the company up to potential liability down the road.

Furthermore, as businesses are required to collect more and more data due to legal requirements, they should resist the temptation to use this data for purposes other than fulfilling those requirements. One could argue that businesses have no authority to do anything more, and I suspect that this issue is a lawsuit waiting to happen. So while one could take the advice of Mark Street of Computing Magazine, and make it “the goal of every large corporation or organisation to find out as much about their customers as they can, to help satisfy their customers’ needs,” some caution would be well advised in using data collected for other purposes.

Then again, perhaps the age of privacy was just a transitory phase in human development. Just two hundred years ago, most of us lived in small villages, typically in one room with the rest of our families, where absolutely everyone knew everyone else’s business. Only for a period of about two hundred years, in large cities with minimal data-processing capabilities, have we been able to live lives in which significant knowledge about ourselves is known only to us, and not to our friends, family, and neighbors. Perhaps we are simply returning to what, throughout most of human history, has been the normal state of affairs: a life with minimal privacy. Perhaps, in that case, we simply ought to take Scott McNealy’s advice and just “get used to it.”
