Taylor Rodriguez is ready to fly out. She has uploaded her trip to the cloud, and from there everything is automated: the luggage she leaves outside her door is guarded by intelligent streetlights, contact lenses fitted with a display guide her to her vehicle's designated pickup spot, and the driverless car carries her to the airport. All of this is simple and quick only because every device closely monitors Taylor, watching her face, mood, and gait for any sign of abnormality.
This scenario comes from the White House, specifically the President's Council of Advisors on Science and Technology. Last week the council published a report analyzing the big data scenarios humanity may face. Published alongside it was the assessment from the White House's 90-day big data review, led by White House counselor John Podesta.
Big data has worked wonders in many fields, including health care, criminal investigation, smart homes, education, law enforcement, and employment. But is sacrificing privacy for convenience, as in the Taylor Rodriguez scenario above, really what we want? And if so, how can we guarantee control over our personal information?
On the White House's engagement with big data, the researchers, pundits, and privacy advocates I contacted agree: judging from the assessment, the administration has some understanding of big data's risks. Among those risks, discrimination driven by big data deserves particular attention.
Now that the White House is aware of the problem, we will see what steps it takes next. The assessment also makes a number of recommendations, including amending the Electronic Communications Privacy Act and passing the Consumer Privacy Bill of Rights that President Obama proposed in 2012. Under the former as it stands, law enforcement officers can read other people's emails without a warrant in many circumstances. Clearly, legislation has not kept pace with technology. Experts worry that the institutions that control and sell our personal data will exploit this gap to harm disadvantaged groups; by the time legislators catch up, it may be too late.
You may feel there is nothing to worry about. But in the future, big data may affect your life in the following three ways.
Employment discrimination
Suppose Darnell and Geoffrey are competing for the same job. The boss searches Google for their names to learn more about them. Searching "Darnell" brings up a page full of ads for mugshot lookups and criminal background checks; searching "Geoffrey" brings up innocuous ads for food processors and trips to Honolulu.
The White House assessment cites research by Latanya Sweeney, the Federal Trade Commission's chief technologist, showing that online searches for typically Black names, such as Darnell and Latanya, are more likely to surface ads for arrest records and criminal background checks than searches for typically white names. Algorithm-driven advertising may seem harmless, but first impressions matter, and this kind of ad optimization can put an entire group of people at a disadvantage.
Could name-based advertising discrimination really come to pass? Similar marketing tools already exist and have already done harm. Businesses that make a living selling data, such as Experian and Acxiom, package thousands of consumer profiles into demographic maps and sell them to marketers and other customers. A Senate committee investigation found that data brokers attach labels such as "Ethnic Second-City Strugglers" to the profiles they sell, making it easier for marketers to find their targets.
The data industry is worth billions of dollars. Its companies claim that the data they sell, such as online shopping histories and health status, contains no names. But numerous studies have shown that supposedly anonymized data can easily be re-identified and reassembled into complete personal profiles. Once your personal information reaches employers, retailers, or lenders, you can imagine how they will use it to reach into every corner of your life.
Judicial discrimination
After the unsuccessful interview, Darnell heads straight home, only to find the police waiting at his door. They want him to help with an investigation, because DNA from a crime scene suggests that Darnell or one of his relatives may be involved.
Some U.S. law enforcement agencies have borrowed a big data technique from the British police: when DNA found at a crime scene partially matches a profile in the criminal justice database, investigators act on it. The genetic data in those databases comes from convicted criminals, though some states also collect it from arrestees. Pursuing partial matches, a practice known as familial searching, cuts both ways. On one hand, it can help identify a perpetrator quickly; on the other, it sweeps in many innocent people, such as the relatives of anyone whose DNA is on file. Police in several states, including California and New York, have begun using the method, with mixed results.
"We should be cautious when applying big data, or many more people will be drawn into the criminal justice system," said Alondra Nelson, a sociologist at Columbia University who studies the interaction between the justice system, big data, and forensic DNA technology. Familial DNA searching lets police focus on an entire ethnic group, something very different from traditional stop-and-question tactics. Because ethnic minorities are already overrepresented in the criminal justice system, DNA databases could amplify judicial discrimination on a much larger scale.
According to the assessment, police have also used big data to predict geographic "hot spots" for car theft. Kate Crawford, a researcher at Microsoft and MIT, points out that such algorithm-based predictions can discriminate against the residents of an entire area. "From a justice standpoint, we should be alarmed by this kind of police behavior," she said.
Consumer discrimination
After the failed interview and the police visit, Darnell decides to go for a run to unwind, opening a phone app to record his route. Just as the run finally lifts his mood, the app pops up an ad announcing a discount promotion at a shop up ahead.
Ryan Calo, an assistant professor at the University of Washington School of Law, says this is, in essence, an unethical practice. Discrimination based on mobile and web tracking will one day become reality: once retailers know a consumer's habits, they can set different offers for different people. If businesses are allowed to do this, what will the effect on society be? Calo is worried.
The assessment also calls for advancing the Consumer Privacy Bill of Rights that the White House proposed in 2012. Under it, individuals would have control over how their private information is collected and used.
Calo, however, does not consider the assessment's recommendations sufficient. "I think the assessment clearly recognizes the problem, namely discrimination and the imbalance of power," he said. "But the solution it offers is incomplete. I am concerned that businesses will use big data to profit at the expense of vulnerable consumers. They can not only learn a consumer's details but also engineer the moments of interaction, which opens the door to abuse."