Whistleblower describes how firm linked to former Trump adviser Steve Bannon compiled user data to target American voters
The data analytics firm that worked with Donald Trump’s election team and the winning Brexit campaign harvested millions of Facebook profiles of US voters, in one of the tech giant’s biggest ever data breaches, and used them to build a powerful software program to predict and influence choices at the ballot box.
A whistleblower has revealed to the Observer how Cambridge Analytica – a company owned by the hedge fund billionaire Robert Mercer, and headed at the time by Trump’s key adviser Steve Bannon – used personal information taken without authorisation in early 2014 to build a system that could profile individual US voters, in order to target them with personalised political advertisements.
Christopher Wylie, who worked with a Cambridge University academic to obtain the data, told the Observer: “We exploited Facebook to harvest millions of people’s profiles. And built models to exploit what we knew about them and target their inner demons. That was the basis the entire company was built on.”
Documents seen by the Observer, and confirmed by a Facebook statement, show that by late 2015 the company had found out that information had been harvested on an unprecedented scale. However, at the time it failed to alert users and took only limited steps to recover and secure the private information of more than 50 million individuals.
The New York Times is reporting that copies of the data harvested for Cambridge Analytica could still be found online; its reporting team had viewed some of the raw data.
The data was collected through an app called thisisyourdigitallife, built by academic Aleksandr Kogan, separately from his work at Cambridge University. Through his company Global Science Research (GSR), in collaboration with Cambridge Analytica, hundreds of thousands of users were paid to take a personality test and agreed to have their data collected for academic use.
However, the app also collected the information of the test-takers’ Facebook friends, leading to the accumulation of a data pool tens of millions-strong. Facebook’s “platform policy” allowed only collection of friends’ data to improve user experience in the app and barred it being sold on or used for advertising. The discovery of the unprecedented data harvesting, and the use to which it was put, raises urgent new questions about Facebook’s role in targeting voters in the US presidential election. It comes only weeks after indictments of 13 Russians by the special counsel Robert Mueller which stated they had used the platform to perpetrate “information warfare” against the US.
Cambridge Analytica and Facebook are one focus of an inquiry into data and politics by the British Information Commissioner’s Office. Separately, the Electoral Commission is also investigating what role Cambridge Analytica played in the EU referendum.
“We are investigating the circumstances in which Facebook data may have been illegally acquired and used,” said the information commissioner Elizabeth Denham. “It’s part of our ongoing investigation into the use of data analytics for political purposes which was launched to consider how political parties and campaigns, data analytics companies and social media platforms in the UK are using and analysing people’s personal information to micro-target voters.”
On Friday, four days after the Observer sought comment for this story, but more than two years after the data breach was first reported, Facebook announced that it was suspending Cambridge Analytica and Kogan from the platform, pending further information over misuse of data. Separately, Facebook’s external lawyers warned the Observer it was making “false and defamatory” allegations, and reserved Facebook’s legal position.
The revelations provoked widespread outrage. The Massachusetts Attorney General Maura Healey announced that the state would be launching an investigation. “Residents deserve answers immediately from Facebook and Cambridge Analytica,” she said on Twitter.
The Democratic senator Mark Warner said the harvesting of data on such a vast scale for political targeting underlined the need for Congress to improve controls. He has proposed an Honest Ads Act to regulate online political advertising the same way as television, radio and print. “This story is more evidence that the online political advertising market is essentially the Wild West. Whether it’s allowing Russians to purchase political ads, or extensive micro-targeting based on ill-gotten user data, it’s clear that, left unregulated, this market will continue to be prone to deception and lacking in transparency,” he said.
Last month both Facebook and the CEO of Cambridge Analytica, Alexander Nix, told a parliamentary inquiry into fake news that the company did not have or use private Facebook data.
Simon Milner, Facebook’s UK policy director, when asked if Cambridge Analytica had Facebook data, told MPs: “They may have lots of data but it will not be Facebook user data. It may be data about people who are on Facebook that they have gathered themselves, but it is not data that we have provided.”
Cambridge Analytica’s chief executive, Alexander Nix, told the inquiry: “We do not work with Facebook data and we do not have Facebook data.”
Wylie, a Canadian data analytics expert who worked with Cambridge Analytica and Kogan to devise and implement the scheme, showed a dossier of evidence about the data misuse to the Observer which appears to raise questions about their testimony. He has passed it to the National Crime Agency’s cybercrime unit and the Information Commissioner’s Office. It includes emails, invoices, contracts and bank transfers that reveal more than 50 million profiles – mostly belonging to registered US voters – were harvested from the site in one of the largest-ever breaches of Facebook data. Facebook on Friday said that it was also suspending Wylie from accessing the platform while it carried out its investigation, despite his role as a whistleblower.
At the time of the data breach, Wylie was a Cambridge Analytica employee, but Facebook described him as working for Eunoia Technologies, a firm he set up on his own after leaving his former employer in late 2014.
The evidence Wylie supplied to UK and US authorities includes a letter from Facebook’s own lawyers sent to him in August 2016, asking him to destroy any data he held that had been collected by GSR, the company set up by Kogan to harvest the profiles.
That legal letter was sent several months after the Guardian first reported the breach and days before it was officially announced that Bannon was taking over as campaign manager for Trump and bringing Cambridge Analytica with him.
“Because this data was obtained and used without permission, and because GSR was not authorised to share or sell it to you, it cannot be used legitimately in the future and must be deleted immediately,” the letter said.
Facebook did not pursue a response when the letter initially went unanswered for weeks because Wylie was travelling, nor did it follow up with forensic checks on his computers or storage, he said.
“That to me was the most astonishing thing. They waited two years and did absolutely nothing to check that the data was deleted. All they asked me to do was tick a box on a form and post it back.”
Paul-Olivier Dehaye, a data protection specialist, who spearheaded the investigative efforts into the tech giant, said: “Facebook has denied and denied and denied this. It has misled MPs and congressional investigators and it’s failed in its duties to respect the law.
“It has a legal obligation to inform regulators and individuals about this data breach, and it hasn’t. It’s failed time and time again to be open and transparent.”
A majority of American states have laws requiring notification in some cases of data breach, including California, where Facebook is based.
Facebook denies that the harvesting of tens of millions of profiles by GSR and Cambridge Analytica was a data breach. It said in a statement that Kogan “gained access to this information in a legitimate way and through the proper channels” but “did not subsequently abide by our rules” because he passed the information on to third parties.
Facebook said it removed the app in 2015 and required certification from everyone with copies that the data had been destroyed, although the letter to Wylie did not arrive until the second half of 2016. “We are committed to vigorously enforcing our policies to protect people’s information. We will take whatever steps are required to see that this happens,” Paul Grewal, Facebook’s vice-president, said in a statement. The company is now investigating reports that not all data had been deleted.
Kogan, who has previously unreported links to a Russian university and took Russian grants for research, had a licence from Facebook to collect profile data, but it was for research purposes only. So when he hoovered up information for the commercial venture, he was violating the company’s terms. Kogan maintains everything he did was legal, and says he had a “close working relationship” with Facebook, which had granted him permission for his apps.
The Observer has seen a contract dated 4 June 2014, which confirms SCL, an affiliate of Cambridge Analytica, entered into a commercial arrangement with GSR, entirely premised on harvesting and processing Facebook data. Cambridge Analytica spent nearly $1m on data collection, which yielded more than 50 million individual profiles that could be matched to electoral rolls. It then used the test results and Facebook data to build an algorithm that could analyse individual Facebook profiles and determine personality traits linked to voting behaviour.
The algorithm and database together made a powerful political tool. It allowed a campaign to identify possible swing voters and craft messages more likely to resonate.
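In rough terms, a model of this kind can be sketched in a few lines. The following is a toy illustration of the general technique – scoring page "likes" by how strongly they co-occur with a quiz-measured trait – and not Cambridge Analytica's actual code; the page names, the labels and the naive log-odds weighting scheme are all invented for the example.

```python
import math
from collections import defaultdict

def train_like_weights(profiles, labels):
    """Given each seed user's set of page likes and a binary trait label
    (e.g. scored high on a personality dimension in the quiz), compute a
    naive log-odds weight for each like. Hypothetical illustration only."""
    like_counts = defaultdict(lambda: [1, 1])  # Laplace-smoothed [pos, neg]
    for likes, label in zip(profiles, labels):
        for like in likes:
            like_counts[like][0 if label else 1] += 1
    n_pos = sum(labels) + 1
    n_neg = len(labels) - sum(labels) + 1
    return {like: math.log((pos / n_pos) / (neg / n_neg))
            for like, (pos, neg) in like_counts.items()}

def score(likes, weights):
    """Sum the evidence each like contributes; above zero leans toward the trait."""
    return sum(weights.get(like, 0.0) for like in likes)

# Invented seed data: four quiz-takers and their trait labels.
seeds = [{"curly fries", "science page"}, {"science page"},
         {"reality tv"}, {"reality tv", "tabloid"}]
labels = [1, 1, 0, 0]
w = train_like_weights(seeds, labels)
print(score({"curly fries"}, w) > 0)  # co-occurs with the trait → True
print(score({"tabloid"}, w) > 0)      # co-occurs with its absence → False
```

Trained on millions of matched profiles rather than four, weights of this general shape are what let a campaign score a voter it has never quizzed, from likes alone.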
“The ultimate product of the training set is creating a ‘gold standard’ of understanding personality from Facebook profile information,” the contract specifies. It promises to create a database of 2 million “matched” profiles, identifiable and tied to electoral registers, across 11 states, but with room to expand much further.
At the time, more than 50 million profiles represented around a third of active North American Facebook users, and nearly a quarter of potential US voters. Yet when asked by MPs if any of his firm’s data had come from GSR, Nix said: “We had a relationship with GSR. They did some research for us back in 2014. That research proved to be fruitless and so the answer is no.”
Cambridge Analytica said that its contract with GSR stipulated that Kogan should seek informed consent for data collection and it had no reason to believe he would not.
GSR was “led by a seemingly reputable academic at an internationally renowned institution who made explicit contractual commitments to us regarding its legal authority to license data to SCL Elections”, a company spokesman said.
SCL Elections, an affiliate, worked with Facebook over the period to ensure it was satisfied no terms had been “knowingly breached” and provided a signed statement that all data and derivatives had been deleted, he said. Cambridge Analytica also said none of the data was used in the 2016 presidential election.
Steve Bannon’s lawyer said he had no comment because his client “knows nothing about the claims being asserted”. He added: “The first Mr Bannon heard of these reports was from media inquiries in the past few days.” He directed inquiries to Nix.
How Cambridge Analytica turned Facebook ‘likes’ into a lucrative political tool
The algorithm used in the Facebook data breach trawled through personal data for information on sexual orientation, race, gender – and even intelligence and childhood trauma
The algorithm at the heart of the Facebook data breach sounds almost too dystopian to be real. It trawls through the most apparently trivial, throwaway postings – the “likes” users dole out as they browse the site – to gather sensitive personal information about sexual orientation, race, gender, even intelligence and childhood trauma.
A few dozen “likes” can give a strong prediction of which party a user will vote for, reveal their gender and whether their partner is likely to be a man or woman, provide powerful clues about whether their parents stayed together throughout their childhood and predict their vulnerability to substance abuse. And it can do all this without any need to delve into personal messages, posts, status updates, photos or all the other information Facebook holds.
Some results may sound more like the result of updated online sleuthing than sophisticated data analysis; “liking” a political campaign page is little different from pinning a poster in a window.
But five years ago psychology researchers showed that far more complex traits could be deduced from patterns invisible to a human observer scanning through profiles. Just a few apparently random “likes” could form the basis for disturbingly complex character assessments.
When users liked “curly fries” and Sephora cosmetics, this was said to give clues to intelligence; Hello Kitty likes indicated political views; “Being confused after waking up from naps” was linked to sexuality.
These were just some of the unexpected but consistent correlations noted in a paper in the Proceedings of the National Academy of Sciences journal in 2013. “Few users were associated with ‘likes’ explicitly revealing their attributes. For example, less than 5% of users labelled as gay were connected with explicitly gay groups, such as No H8 Campaign,” the peer-reviewed research found.
The researchers, Michal Kosinski, David Stillwell and Thore Graepel, saw the dystopian potential of the study and raised privacy concerns. At the time Facebook “likes” were public by default.
“The predictability of individual attributes from digital records of behaviour may have considerable negative implications, because it can easily be applied to large numbers of people without their individual consent and without them noticing,” they said.
“Commercial companies, governmental institutions, or even your Facebook friends could use software to infer attributes such as intelligence, sexual orientation or political views that an individual may not have intended to share.”
To some, that may have sounded like a business opportunity. By early 2014, Cambridge Analytica chief executive Alexander Nix had signed a deal with one of Kosinski’s Cambridge colleagues, lecturer Aleksandr Kogan, for a private commercial venture, separate from Kogan’s duties at the university, but echoing Kosinski’s work.
The academic had developed a Facebook app which featured a personality quiz, and Cambridge Analytica paid for people to take it, advertising on platforms such as Amazon’s Mechanical Turk.
The app recorded the results of each quiz, collected data from the taker’s Facebook account – and, crucially, extracted the data of their Facebook friends as well.
The results were paired with each quiz-taker’s Facebook data to seek out patterns and build an algorithm to predict results for other Facebook users. Their friends’ profiles provided a testing ground for the formula and, more crucially, a resource that would make the algorithm politically valuable.
To be eligible to take the test the user had to have a Facebook account and be a US voter, so tens of millions of the profiles could be matched to electoral rolls. From an initial trial of 1,000 “seeders”, the researchers obtained 160,000 profiles – or about 160 per person. Eventually a few hundred thousand paid test-takers would be the key to data from a vast swath of US voters.
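The article's own figures show how quickly a small paid panel fans out into a mass dataset. A back-of-the-envelope check, using only the numbers reported above (the multiplication is illustrative; overlapping friend networks would shrink the unique total):

```python
# The reported trial: 1,000 paid "seeders" yielded 160,000 profiles.
seeders = 1_000
profiles_from_trial = 160_000
friends_per_seeder = profiles_from_trial // seeders
print(friends_per_seeder)  # 160, matching the "about 160 per person" figure

# Facebook later said around 270,000 people downloaded the app. At the
# same multiplier, friend harvesting lands in the tens of millions --
# the same order of magnitude as the 50m profiles reported, before
# accounting for overlap between friend networks.
test_takers = 270_000
harvested = test_takers * friends_per_seeder
print(harvested)  # 43200000
```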
It was extremely attractive. It could also be deemed illicit, primarily because Kogan did not have permission to collect or use data for commercial purposes. His permission from Facebook to harvest profiles in large quantities was specifically restricted to academic use.
And although the company at the time allowed apps to collect friend data, it was only for use in the context of Facebook itself, to encourage interaction. Selling that data on, or putting it to other purposes – including Cambridge Analytica’s political marketing – was strictly barred.
It also appears likely the project was breaking British data protection laws, which ban sale or use of personal data without consent. That includes cases where consent is given for one purpose but data is used for another.
The paid test-takers signed up to T&Cs, including collection of their own data, and Facebook’s default terms allowed their friends’ data to be collected by an app, unless they had changed their privacy settings. But none of them agreed to their data possibly being used to create a political marketing tool or to it being placed in a vast campaign database.
Facebook denies this was a data breach. Vice-president Paul Grewal said: “Protecting people’s information is at the heart of everything we do, and we require the same from people who operate apps on Facebook. If these reports are true, it’s a serious abuse of our rules.”
The scale of the data collection Cambridge Analytica paid for was so large it triggered an automatic shutdown of the app’s ability to harvest profiles. But Kogan told a colleague he “spoke with an engineer” to get the restriction lifted and, within a day or two, work resumed.
Within months, Kogan and Cambridge Analytica had a database of millions of US voters, with its own algorithm to scan them, identifying likely political persuasions and personality traits. They could then decide whom to target and craft messages likely to appeal to those individuals – a political approach known as “micro-targeting”.
Facebook has suspended the account of the whistleblower who exposed Cambridge Analytica
Tech hath no fury like a multi-billion dollar social media giant scorned.
In the latest turn of the developing scandal around how Facebook’s user data wound up in the hands of Cambridge Analytica — for use in the development of psychographic profiles that may or may not have played a part in the election victory of Donald Trump — the company has taken the unusual step of suspending the account of the whistleblower who helped expose the issues.
In a fantastic profile in The Guardian, Christopher Wylie revealed himself to be the architect of the technology that Cambridge Analytica used to develop targeted advertising strategies that arguably helped sway the U.S. presidential election.
A self-described gay, Canadian vegan, Wylie eventually became — as he told The Guardian — the developer of “Steve Bannon’s psychological warfare mindfuck tool.”
The goal, as The Guardian reported, was to combine social media’s reach with big data analytical tools to create psychographic profiles that could then be manipulated in what Bannon and Cambridge Analytica investor Robert Mercer allegedly referred to as a military-style psychological operations campaign — targeting U.S. voters.
In a series of Tweets late Saturday, Wylie’s former employer, Cambridge Analytica, took issue with Wylie’s characterization of events (and much of the reporting around the stories from The Times and The Guardian).
Meanwhile, Carole Cadwalladr, the Observer journalist who broke the story, noted on Twitter earlier today that she’d received a phone call from the aggrieved whistleblower.
Facebook has since weighed in with a statement of its own, telling media outlets:
“Mr. Wylie has refused to cooperate with us until we lift the suspension on his account. Given he said he ‘exploited Facebook to harvest millions of people’s profiles,’ we cannot do this at this time.
“We are in the process of conducting a comprehensive internal and external review as we work to determine the accuracy of the claims that the Facebook data in question still exists. That is where our focus lies as we remain committed to vigorously enforcing our policies to protect people’s information.”
Facebook suspends Cambridge Analytica, the data analysis firm that worked on the Trump campaign
Facebook announced late Friday that it had suspended the account of Strategic Communication Laboratories (SCL) and its political data analytics firm Cambridge Analytica — which used Facebook data to target voters for President Donald Trump’s campaign in the 2016 election. In a statement released by Paul Grewal, the company’s vice president and deputy general counsel, Facebook explained that the suspension was the result of a violation of its platform policies. The company noted that the very unusual step of a public blog post explaining the decision to act against Cambridge Analytica was due to “the public prominence of this organization.”
Facebook claims that back in 2015 Cambridge Analytica obtained Facebook user information without approval from the social network through work the company did with a University of Cambridge psychology professor named Dr. Aleksandr Kogan. Kogan developed an app called “thisisyourdigitallife” that purported to offer a personality prediction in the form of “a research app used by psychologists.”
Apparently around 270,000 people downloaded the app, which used Facebook Login and granted Kogan access to users’ geographic information, content they had liked, and limited information about users’ friends. While Kogan’s method of obtaining personal information aligned with Facebook’s policies, “he did not subsequently abide by our rules,” Grewal stated in the Facebook post.
“By passing information on to a third party, including SCL/Cambridge Analytica and Christopher Wylie of Eunoia Technologies, he violated our platform policies. When we learned of this violation in 2015, we removed his app from Facebook and demanded certifications from Kogan and all parties he had given data to that the information had been destroyed. Cambridge Analytica, Kogan and Wylie all certified to us that they destroyed the data.”
Facebook said it first identified the violation in 2015 and took action — apparently without informing users of the violation. The company demanded that Kogan, Cambridge Analytica and Wylie certify that they had destroyed the information.
Over the past few days, Facebook said it received reports (from sources it would not identify) that not all of the data Cambridge Analytica, Kogan, and Wylie collected had been deleted. While Facebook investigates the matter further, the company said it had taken the step to suspend the Cambridge Analytica account as well as the accounts of Kogan and Wylie.
Depending on who you ask, UK-based Cambridge Analytica either played a pivotal role in the U.S. presidential election or cooked up an effective marketing myth to spin into future business. Last year, a handful of former Trump aides and Republican consultants dismissed the potency of Cambridge Analytica’s so-called secret sauce as “exaggerated” in a profile by the New York Times. A May 2017 profile in the Guardian that painted the Robert Mercer-funded data company as shadowy and all-powerful resulted in legal action on behalf of Cambridge Analytica. Last October, the Daily Beast reported that Cambridge Analytica’s chief executive Alexander Nix contacted Wikileaks’ Julian Assange with an offer to help disseminate Hillary Clinton’s controversial missing emails.
In an interview with TechCrunch late last year, Nix said that his company had detailed hundreds of thousands of profiles of Americans throughout 2014 and 2015 (the time when the company was working with Sen. Ted Cruz on his presidential campaign).
…We used psychographics all through the 2014 midterms. We used psychographics all through the Cruz and Carson primaries. But when we got to Trump’s campaign in June 2016, whenever it was, there was five and a half months till the elections. We just didn’t have the time to roll out that survey. I mean, Christ, we had to build all the IT, all the infrastructure. There was nothing. There were 30 people on his campaign. Thirty. Even Walker had 160 (it’s probably why he went bust). And he was the first to crash out. So as I’ve said to others of your [journalist] colleagues, clearly there’s psychographic data that’s baked in to legacy models that we built before, because we’re not reinventing the wheel. [We’ve been] using models that are based on models, that are based on models, and we’ve been building these models for nearly four years. And all of those models had psychographics in them. But did we go out and roll out a long-form quantitative psychographics survey specifically for Trump supporters? No. We just didn’t have time. We just couldn’t do that.
The tools that Cambridge Analytica deployed have been at the heart of recent criticism of Facebook’s approach to handling advertising and promoted posts on the social media platform. Nix credits the fact that advertising was ahead of most political messaging and that traditional political operatives hadn’t figured out that the tools used for creating ad campaigns could be so effective in the political arena.
“There’s no question that the marketing and advertising world is ahead of the political marketing, the political communications world,” Nix told TechCrunch last year. “…There are some things which [are] best practice digital advertising, best practice communications which we’re taking from the commercial world and are bringing into politics.”
Responding to the allegations, Cambridge Analytica sent the following statement.
In 2014, SCL Elections contracted Dr. Kogan via his company Global Science Research (GSR) to undertake a large scale research project in the US. GSR was contractually committed to only obtain data in accordance with the UK Data Protection Act and to seek the informed consent of each respondent. GSR were also contractually the Data Controller (as per Section 1(1) of the Data Protection Act) for any collected data. The language in the SCL Elections contract with GSR is explicit on these points. GSR subsequently obtained Facebook data via an API provided by Facebook. When it subsequently became clear that the data had not been obtained by GSR in line with Facebook’s terms of service, SCL Elections deleted all data it had received from GSR. For the avoidance of doubt, no data from GSR was used in the work we did in the 2016 US presidential election.
Under Section 55 of the Data Protection Act (Unlawful obtaining etc. of personal data), a criminal offence has not been committed if a person has acted in the reasonable belief that he had in law the right to obtain data. GSR was a company led by a seemingly reputable academic at an internationally renowned institution who made explicit contractual commitments to us regarding its legal authority to license data to SCL Elections. It would be entirely incorrect to attempt to claim that SCL Elections
illegally acquired Facebook data. Indeed SCL Elections worked with Facebook over this period to ensure that they were satisfied that SCL Elections had not knowingly breached any of Facebook’s Terms of Service and also provided a signed statement to confirm that all Facebook data and their derivatives had been deleted.
Cambridge Analytica and SCL Elections do not use or hold Facebook data.
Facebook is using us. It is actively giving away our information. It is creating an echo chamber in the name of connection. It surfaces the divisive and destroys the real reason we began using social media in the first place – human connection.
It is a cancer.
I’ve begun the slow process of weaning myself off of the platform by methodically running a script that will delete my old content. And there’s a lot. There are likes and shares. There are long posts I wrote to impress my friends. There are thousands of WordPress notifications that tell the world what I’m doing. In fact, I would wager I use Facebook more to broadcast my ego than interact with real humans. And I suspect that most of us are in a similar situation.
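A script of that kind is mostly an age filter. The sketch below shows the core logic as I imagine a minimal version, assuming the post shape the Facebook Graph API returns for a user's feed; the endpoint and deletion permissions are assumptions (in practice, apps can generally delete only content they themselves created), so the actual API call is left as a comment.

```python
from datetime import datetime, timedelta, timezone

def stale_posts(posts, max_age_days=7, now=None):
    """Pick out posts older than the cutoff -- anything beyond about a
    week being, as argued above, just dossier material. `posts` is a
    list of dicts with an ISO-8601 'created_time' field."""
    now = now or datetime.now(timezone.utc)
    cutoff = now - timedelta(days=max_age_days)
    return [p for p in posts
            if datetime.fromisoformat(p["created_time"]) < cutoff]

# Hypothetical feed items for illustration.
now = datetime(2018, 3, 18, tzinfo=timezone.utc)
posts = [
    {"id": "101", "created_time": "2018-03-16T12:00:00+00:00"},  # recent, keep
    {"id": "102", "created_time": "2016-11-08T09:30:00+00:00"},  # old, delete
]
for p in stale_posts(posts, now=now):
    # A real script would issue an authenticated DELETE for this id here.
    print("would delete", p["id"])
```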
There is a method to my madness. I like Facebook Messenger and I like that Facebook is now a glorified version of OAuth. It’s a useful tool when it is stripped of its power. However, when it is larded with my personal details it is a weapon and a liability.
Think about it: any posts older than about a week are fodder for bots and bad actors. Posts from 2016? 2017? Why keep them? No one will read them, no one cares about them. Those “You and Joe have known each other for five years” auto-posts are fun but does anyone care? Ultimately you’ve created the largest dossier on yourself and you’ve done it freely, even gleefully. This dossier reflects your likes, your dislikes, your feelings, and political leanings. It includes clear pictures of your face from all angles, images of your pets and family, and details your travels. You are giving the world unfettered access to your life. It’s wonderful to imagine that this data will be used by a potential suitor who will fall in love with your street style. It’s wonderful to imagine you will scroll through Facebook at 80 and marvel at how you looked at the turn of the century. It’s wonderful to imagine that Facebook is a place to share ideas, dreams, and hopes, a human-to-human connection engine that gives more than it takes.
None of that will happen.
Facebook is a data collection service for those who want to sell you products. It is the definitive channel to target you based on age, sex, geographic location, political leanings, interests, and marital status. It’s an advertiser’s dream and it is wildly expensive in terms of privacy lost and cash spent to steal that privacy. It is the perfect tool for marketers, a user-generated paradise that is now run by devils.
Will you delete Facebook? Probably not. Will I? I’m working on it. I’ve already been deleting old tweets after realizing that border police and potential employers may use what I write publicly against me. I’m clearing out old social media accounts and, as I mentioned before, deleting old Facebook posts, thus ensuring that I will no longer be a target for companies like Cambridge Analytica. But we love our social media, don’t we? The power it affords. The feeling of connection. In the absence of human interaction we cling to whatever dark simulacrum is available. In the absence of the Town Square we talk to ourselves. In the absence of love and understanding we join the slow riot of online indifference.
When Travis Kalanick led his ride-sharing company down the dark path to paranoia, bro culture, and classist rantings we reacted by deleting the app. We didn’t want to do business with that particular brand of company. Yet we sit idly by while Facebook sells us out and its management pummels and destroys all competition.
I wish it didn’t have to be this way. There is plenty of good in these platforms but the dangers far outweigh the benefits. Try to recall the last time you were thankful for social media. I can. It happened twice. First, it happened when I posted on my “wall” a eulogy for my father who died in January. The outpouring of support was heartening in a dark time. It was wonderful to see friends and acquaintances tell me their own stories, thereby taking the sting out of my own. But months later that good feeling is gone, replaced by ads for fancy shoes and political rants. Out of the Facebook swamp sometimes surfaces a pearl. But it sinks just as quickly.
One more sad example: I found out, accidentally, that my friend’s wife died. It appeared on my feed as if placed there by some divine hand and I was thankful it surfaced. It beat out videos of Mister Rogers saying inspiring things and goofy pictures of Trump. It beat out ads and rants and questions about the best sushi restaurant in Scranton. The stark announcement left me crying and breathless. There it was in black and blue, splashed across her page: she was gone. There was the smiling photo of her two little children and there was the outpouring of grief under these once innocuous photos. Gone, it said. She was gone. I found out from her wall where her memorial service would be held and I finally reached back out to my old friend to try to comfort him in his grief. Facebook, in those two instances, worked.
But Facebook isn’t the only thing that can give us that feeling of connectedness. We’ve had it for centuries.
Facebook simply replaced the tools we once used to tell the world of our joys and sorrows, and it replaced them with cheap knock-offs that make us less connected, not more. Decades ago, on one coal-fogged winter morning in Krakow, Poland, where I was living, I passed Kościół św. Wojciecha with its collection of nekrologi – necrologies – posted on a board in front of the church. There you saw the names of the dead – and sometimes the names of the newly born – and it was there you discovered what was happening in your little corner of the world. The church wasn’t far from the central square – the Rynek – and I walked there thinking about the endless parade of humanity that had walked across those cobbles, stopping for a moment in their hustle at the churchyard to see who had died. I stood in the crisp air, flanked by centuries-old brickwork, and imagined who once populated this place. This was the place you met your friends and your future partners. It was there you celebrated your successes and mourned your failures. It was there, among other humans, you told the world the story of your life, but told it slant. You witnessed kindnesses and cruelties, you built a world entire based on the happenings in a few square miles.
No more. Or, at least, those places are no longer available to most of us.
We’ve moved past the superstitions and mythologies of the past. Tools like Facebook were designed to connect us to the world, giving us an almost angelic view of daily happenstance. We replaced the churchyard with the “timeline.” But our efforts failed. We are still as closed, still full of superstition, as we were a hundred years ago. We traded a market square for the Internet but all of the closed-mindedness and cynicism came with it. We still disparage the outsider, we still rant against invisible enemies, and we still keep our friends close and fear what lies beyond our door. Only now we have the whole world on which to reflect our terror.
It doesn’t have to be this way. Maybe some day we’ll get the tools we need to interact with the world. Maybe they’re already here and we just don’t want to use them.
Until we find them, however, it’s probably better for us to delete the ones we use today.
TOOL TO DELETE (ALL!) YOUR FACEBOOK POSTS
====================================
Batch delete Social Book posts/items. Hide/unhide and unlike items. Supports all languages on Social Book.
====================================
The extension is free for everyone to use. Please do not duplicate reviews.

* Facebook (TM) is a registered trademark of Facebook. The author of this extension is in no way associated or affiliated with Facebook. The extension uses no Facebook APIs and is therefore not bound by any Facebook API license. Every occurrence of the word "Facebook" here is for descriptive purposes only.

"Social Book Post Manager" helps you delete your posts through the activity log, including posts made by you and posts made by other people or apps. You may set "Year", "Month", "Text Contains", and "Text Not Contains" filters for the posts to delete. Combined with the activity log filters that Facebook (TM) itself provides, you have full control over which posts to delete and which to keep.

Recent new features:
1) Text filters support AND/OR conditions.
2) Prescan of the activity log: you can select exactly which individual entries to delete, hide, unhide, unlike, or change privacy settings for.
3) Hide/unhide timeline items.
4) Unlike items.
5) Change privacy settings.

All features are free to use and totally unlimited. If you are satisfied, please leave feedback; suggestions and bug reports are also welcome.

>>> The process is slow. The extension simulates your mouse click on each delete button, one by one. This is an intentional limitation imposed by Facebook (TM); there is no way to bypass it. <<<
>>> Other Chrome extensions may conflict with this one. If it is not working, temporarily remove or disable your other Chrome extensions, then restart Chrome and try again. <<<

================================
Notes: The scanning/deletion process takes a LONG time to finish, depending on the number of posts involved. Because Facebook (TM) does not want users to remove posts easily, it provides no function or API for deleting multiple posts at once, so this extension can only filter and delete posts one by one. There is no known way to accelerate the process. Please be patient, sit back, and let the extension do its job. Have a coffee, or better yet, let it run through the night.

================================
Instructions (with the "Prescan" option checked):
1. Log in to Facebook (TM) and go to the Activity Log. Use the activity filters to select a subset of posts to delete.
2. Click the extension's button to open its interface.
3. If needed, set "Year", "Month", "Text Contains", and "Text Not Contains" for the posts you want to delete.
4. Click the "Delete Post" button. The extension scans through your Activity Log and marks every post matching the conditions.
5. The prescan may take a long time, depending on the number of posts to be deleted.
6. When the prescan finishes, a Confirm button appears at the top of the Facebook (TM) page. Verify the selection, uncheck any posts you want to keep, then continue to actually delete the rest.
7. When done, the extension reports the total number of posts deleted.

================================
Instructions (with the "Prescan" option unchecked):
1. Log in to Facebook (TM) and go to the Activity Log. Use the activity filters to select a subset of posts to delete.
2. Click the extension's button to open its interface.
3. If needed, set "Year", "Month", "Text Contains", and "Text Not Contains" for the posts you want to delete.
4. Click the "Delete Post" button. The extension scans through your Activity Log and deletes every post matching the conditions.
5. The deletion process may take a long time, depending on the number of posts to be deleted.
6. A deleted-post counter is displayed in real time. You may stop the deletion at any time by clicking the "Stop Now" button.
7. When done, the extension reports the total number of posts deleted.
================================
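To make the filter model above concrete, here is a minimal sketch of how Year/Month and Text Contains / Text Not Contains filters (with AND/OR text conditions) can select posts for deletion. This is purely illustrative: the `Post` class and `matches` helper are hypothetical names, not the extension's actual code, which works by clicking through the Activity Log rather than operating on post objects.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class Post:
    posted_on: date
    text: str

def matches(post, year=None, month=None, contains=None,
            not_contains=None, require_all=True):
    """Return True if a post passes every active filter.

    `contains` terms combine with AND when require_all is True, OR otherwise;
    any `not_contains` term found in the text rejects the post outright.
    """
    if year is not None and post.posted_on.year != year:
        return False
    if month is not None and post.posted_on.month != month:
        return False
    text = post.text.lower()
    if contains:
        hits = [term.lower() in text for term in contains]
        if not (all(hits) if require_all else any(hits)):
            return False
    if not_contains and any(term.lower() in text for term in not_contains):
        return False
    return True

posts = [
    Post(date(2012, 3, 1), "Spring break photos"),
    Post(date(2012, 3, 9), "Political rant about everything"),
    Post(date(2014, 7, 4), "Fireworks"),
]

# Keep the photos, drop the 2012 rants: only the second post is selected.
to_delete = [p for p in posts if matches(p, year=2012, not_contains=["photos"])]
```

The actual extension applies the same kind of matching while walking the Activity Log page, which is why deletion proceeds one simulated click at a time.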