I’m reviewing this article as part of the reading list for the Learning Analytics Open Course held in conjunction with the 2011 Learning and Knowledge Analytics (LAK11) conference (which I presented at), organized by George Siemens of Athabasca University.
According to The Economist article, Untangling the social web, Social Network Analysis (SNA) is having a major impact on the way businesses and governments are run, and on the way decisions are made. For example, influencers (customers with influence over other customers) on a telecom network are tracked because they can affect whether their network of people stays with its current provider or switches. So, telecom companies are targeting these influencers with discounts and promotions. Among the things they have discovered about influencers is that they:
- receive quick callbacks
- do not worry about calling other people late at night
- tend to get more calls at times when social events are most often organized, such as Friday afternoons
- make long calls, while the calls they receive are generally short
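As a rough illustration (and not the telecoms’ actual method), the call-pattern heuristics above could be combined into a simple influencer score over call records. The record layout, thresholds, and equal weights here are all invented for the sketch:

```python
from datetime import datetime

# Each call record: (caller, callee, start time, duration in seconds).
# The field layout, thresholds, and weights are invented for illustration;
# real SNA tools model far richer features than this.
calls = [
    ("ana", "ben", datetime(2010, 9, 3, 16, 30), 600),   # Friday afternoon
    ("ben", "ana", datetime(2010, 9, 3, 16, 45), 60),    # quick callback
    ("carl", "ana", datetime(2010, 9, 3, 23, 50), 900),  # late-night call
    ("ana", "dana", datetime(2010, 9, 6, 10, 0), 480),
]

def influencer_scores(calls):
    """Score each user on the heuristics from the article: quick callbacks
    received, late-night calling, Friday-afternoon incoming calls, and
    long outgoing versus short incoming calls."""
    scores = {}
    for caller, callee, start, duration in calls:
        for user in (caller, callee):
            scores.setdefault(user, 0.0)
        # Influencers make long calls...
        if duration >= 300:
            scores[caller] += 1
        # ...while the calls they receive are generally short.
        if duration < 120:
            scores[callee] += 1
        # Calling late at night suggests a strong tie.
        if start.hour >= 22:
            scores[caller] += 1
        # Incoming calls on Friday afternoon, when events are organized.
        if start.weekday() == 4 and 12 <= start.hour < 18:
            scores[callee] += 1
        # Quick callback: the callee returns the call within 30 minutes.
        for c2, e2, s2, _ in calls:
            if c2 == callee and e2 == caller and \
               0 < (s2 - start).total_seconds() <= 1800:
                scores[caller] += 1
    return scores

print(influencer_scores(calls))
```

Even this toy version shows why the definition matters: change the weights or thresholds and a different customer comes out on top.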
It’s an interesting definition of influence in comparison with the Influence Project that Fast Company ran last year. Would the definition of influence in a network of people change if their behavior patterns were not being used for financial gain? Which significant patterns did not make the list above because they were not closely tied to maintaining or increasing revenue?
It’s big business:
The article notes that significant investment is going into SNA and that it will only increase.
- IBM expects its sales of SNA software to exceed $15 billion by 2015. It has spent $11 billion purchasing SNA software companies in the past five years.
- Gartner rates it as the number two strategic investment for businesses in 2010.
- Richmond, Virginia police save $15,000 of overtime pay on party nights by using SNA to mine Facebook and Twitter for areas that are likely to have criminal activity.
- SPADAC, a company that does country analysis, expects to go from $19 million in revenue last year to over $27 million this year.
- “The capture of Saddam Hussein in 2003 was due in large part to the mapping of the social networks of his former chauffeurs, according to Bob Griffin, the chief executive of i2, a British firm which developed the software used in the manhunt. Senior members of the Iraqi regime were mostly clueless about the whereabouts of the former president, says Mr Griffin, but modelling the social networks of his chauffeurs who had links to rural property eventually led to the discovery of his hideout, on a farm near his hometown of Tikrit.”
- “Ms Carley’s model [Kathleen Carley of Carnegie Mellon University in Pittsburgh], known as ORA, analyses a decade of data on such things as weather, land and water disputes, cabinet reshuffles, reactions to corruption, court cases, economic activity and changes in tribal geographic maps. Within the information that emerges are lists of the locals most likely to co-operate with Westerners, with details of the role each would best play.”
- “Brian Uzzi of Northwestern University in Evanston, Illinois, who advises intelligence agencies on democracy-promotion analytics, says diplomatic services are mapping the “tipping point” when ideas go mainstream in spite of government repression.”
- “A forecasting model developed by Venkatramana Subrahmanian of the University of Maryland does just that. Called SOMA Terror Organization Portal, it analyses a wide range of information about politics, business and society in Lebanon to predict, with surprising accuracy, rocket attacks by the country’s Hizbullah militia on Israel.”
A threat to civil liberties or other forms of fair treatment?
- “In some companies, e-mails are analyzed automatically to help bosses manage their workers. Employees who are often asked for advice may be good candidates for promotion, for example.”
Is it fair or wise to measure people’s candidacy for management positions by how often they are asked for advice? What if a person is particularly competent but in a role where colleagues rarely need to consult them? What if someone is asked for advice over the phone or in person more than by e-mail? Would a company only be promoting people who communicate virtually more than others?
- “The latest version of SAS’s software identifies risky borrowers by examining their social networks and Internal Revenue Service records, she says. For example, an applicant may be a bad risk, or even a fraudster, if he plans to launch a type of business which has no links to his social network, education, previous business dealings or travel history, which can be pieced together with credit-card records. Ms Joyner says the software can also determine if an applicant has associated with known criminals—perhaps his fiancée has shared an address with a parolee. Some insurers reduce premiums for banks that protect themselves with such software.”
This seems like a useful tool, but is it really wise to deny someone a loan because their roommate happened to have a checkered past they didn’t know about? It seems like a mechanical Senator McCarthy, but this time the mass circumstantial accusations are made by software on a grander scale. How would one confront one’s accuser if it is an algorithm? At least with credit scores, some attempt is made to show people what they are being evaluated on, and they have some chance to contest items put into the calculation.
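To make the guilt-by-association concern concrete, here is a minimal sketch of the kind of check the SAS quote describes: the association data, the names, and the “within two hops” rule are all invented for this illustration, not taken from any real product.

```python
from collections import deque

# Toy association graph: shared addresses, co-signed leases, and so on.
# The data and the two-hop rule are invented for this sketch; they stand
# in for whatever proprietary rules commercial tools actually use.
associations = {
    "applicant": ["fiancee"],
    "fiancee": ["applicant", "parolee"],  # shared an address
    "parolee": ["fiancee"],
}
known_criminals = {"parolee"}

def linked_to_criminal(person, graph, criminals, max_hops=2):
    """Breadth-first search: is `person` within `max_hops` association
    links of anyone on the criminal list?"""
    seen = {person}
    queue = deque([(person, 0)])
    while queue:
        current, hops = queue.popleft()
        if hops >= max_hops:
            continue
        for neighbor in graph.get(current, []):
            if neighbor in criminals:
                return True
            if neighbor not in seen:
                seen.add(neighbor)
                queue.append((neighbor, hops + 1))
    return False

print(linked_to_criminal("applicant", associations, known_criminals))
```

Note that the function returns only a True or False flag: the applicant would never learn which link, or which hop, triggered the denial, which is exactly the accountability problem raised above.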
SNA and other analytics methods put immense power into the hands of those who know how to use them. It seems that without standards and laws in place about what types of exclusions and inclusions can be made based on the data that you emit, soon many decisions regarding our lives will be made for us. Without a way to validate and participate in the curation of your own data, it will be difficult to know how to protect yourself from actions that could limit the choices made available to you.
Untangling the social web. (2010, September 2). The Economist.