
U.K. report hammers Facebook, calls for tougher regulation of tech companies

After an 18-month investigation, British lawmakers have called for tougher regulation of big technology companies for failing to protect users’ personal data, allowing that data to be manipulated and facilitating the distribution of fake news.

In its final report into disinformation issued Monday, the House of Commons digital, culture, media and sport committee said “companies like Facebook should not be allowed to behave like ‘digital gangsters’ in the online world, considering themselves to be ahead of and beyond the law.”

The report in particular hammers Facebook, saying the company “continues to choose profit over data security, taking risks in order to prioritize their aim of making money from user data.”

“Among the countless innocuous postings of celebrations and holiday snaps, some malicious forces use Facebook to threaten and harass others, to publish revenge porn, to disseminate hate speech and propaganda of all kinds, and to influence elections and democratic processes—much of which Facebook, and other social media companies, are either unable or unwilling to prevent. We need to apply widely-accepted democratic principles to ensure their application in the digital age.”

The report says social media companies can’t hide behind the claim of being merely a ‘platform’ and maintain that they have no responsibility for regulating the content posted by others on their sites. Tech companies should be forced to assume legal liability for removing content identified as harmful after it has been posted by users, it recommends.

CLICK HERE FOR THE FULL REPORT

According to the Washington Post, Facebook issued a statement saying it had made considerable changes to its business practices and supported regulation in areas including privacy. It also denied it had broken any laws in the country.

The U.K. committee report comes amid news reports that the U.S. Federal Trade Commission wants to levy a record fine of over $1 billion against the company for violating a 2011 data privacy agreement. Facebook has confirmed it is in talks with the FTC, but hasn’t given details.

The committee was helped in part by documents it seized from a U.S. app developer called Six4Three, which is suing Facebook. The documents were sealed by a U.S. court, but a Six4Three official visiting Britain last year was compelled by the committee to turn them over.

“The evidence that we obtained from the Six4Three court documents indicates that Facebook was willing to override its users’ privacy settings in order to transfer data to some app developers, to charge high prices in advertising to some developers, for the exchange of that data, and to starve some developers—such as Six4Three—of that data, thereby causing them to lose their business,” the committee concluded. “It seems clear that Facebook was, at the very least, in violation of its Federal Trade Commission settlement.”

Meanwhile, Canada’s privacy commissioner is expected to soon issue its report into what has become known as the Cambridge Analytica scandal, in which that company used data collected by a third party from Facebook users to target political ads in a number of countries, including the U.S.

The U.K. report also echoes some of the conclusions of the December 2018 report by the Canadian House of Commons’ ethics committee into the impact of social media on democracy.

Much of the U.K. committee’s work has focused on allegations that U.K.-based SCL Group, its Cambridge Analytica division and associated companies misused Facebook data to target advertising and email messages during the 2016 Brexit referendum. The inquiry later widened into an investigation of online disinformation and foreign interference.

The committee heard allegations that SCL was involved in disinformation in elections in Argentina, Trinidad and Tobago, St. Kitts and Nevis and other countries. “The work of SCL and its associates in foreign countries involved unethical and dangerous work,” the committee concluded in its interim report last year.

Victoria, B.C.-based Aggregate IQ Data Services, a digital advertising web and software development company, has been caught up in the controversy over a “political customer relationship management tool” it built for SCL, which has since gone out of business. Cambridge Analytica is now called Emerdata.

Questions raised by the U.K. committee include how much personal data AIQ had and what role, if any, it played in targeted campaigns. AIQ’s Canadian lawyers wrote the committee that “AggregateIQ did not manipulate micro-targeting, nor facilitate its manipulation.” The report also notes that an AIQ executive told a Canadian parliamentary committee that the company asked where the data used to help create the tool came from and was told it came from public data sources. “We were unaware they were obtaining information improperly at the time,” the official said.

However, the committee’s evidence also included work from researcher Chris Vickery of security vendor UpGuard, who found unsecured data on an AIQ website that led the committee to other conclusions.

The final U.K. committee report says “there is clear evidence that there was a close working relationship between Cambridge Analytica, SCL and AIQ. There was certainly a contractual relationship” to develop a software platform for SCL, “but we believe that the information revealed from the [AIQ data] repository would imply something closer, with data exchanged between both AIQ and SCL, as well as between AIQ and Cambridge Analytica.”

The U.K. committee report says evidence from Vickery’s database discovery led it to conclude that AIQ had created what it called the “Database of Truth,” with personal data from a number of sources, including the U.S. Republican Party. That database “could have been used to target specific users on Facebook, using its demographic targeting feature when creating adverts on the Facebook platform,” says the report.

In its interim report in July 2018 the committee said Aggregate IQ had held onto U.K. personal data from the referendum longer than it should have.

AIQ did online advertising work for Brexit-supporting organizations during the referendum, says the report. There was evidence that AIQ had made use of some customized information to email Facebook users, but AIQ said that it was an administrative error, which was quickly corrected.

It also says AIQ had the capability to use the data scraped by Aleksandr Kogan, who surveyed Facebook subscribers through a personality app. Those who participated in that survey unknowingly also had data from their Facebook friends pulled in. Ultimately that data was used by Cambridge Analytica to distribute targeted online election ads.

Kogan’s data “also included U.K. citizens’ data and the question arises whether this was used during the EU referendum,” says the U.K. report. “We know from Facebook that data matching Dr Kogan’s was found in the data used by AIQ’s leave campaign audience files. Facebook believes that this is a coincidence.”

AIQ apparently also had a hand in a Leave campaign data harvesting effort, the report says. Vote Leave offered soccer fans the opportunity to win £50 million if they entered their name, address, email and telephone number, as well as how they intended to vote in the Brexit referendum. The committee believes AIQ processed all the data from the contest, including harvesting Facebook IDs. “There is no evidence to show that this was fraudulent,” the report adds, “but one could question whether data gathered in this way was ethical” if it was later used for targeting political ads.

“From the files obtained by Chris Vickery, and from evidence we received, there seems to be more to the AIQ/Cambridge Analytica/SCL relationship than is usually seen in a strictly contractual relationship,” says the U.K. report. “AIQ worked on both the U.S. Presidential primaries and for Brexit-related organizations, including the designated Vote Leave group, during the EU Referendum. The work of AIQ highlights the fact that data has been and is still being used extensively by private companies to target people, often in a political context, in order to influence their decisions. It is far more common than people think.”

Details of the sources and full use by AIQ of data for political campaigns, including whether it has a “Database of Truth” and for which clients it was used, may come out in the upcoming report by Canadian privacy commissioner Daniel Therrien.

Among the recommendations of the U.K. committee are:

–that what it calls “inferred data” created by companies or political parties from a number of sources — for example, that a person likely supports a political position — should be protected under U.K. privacy law;

–that tech companies must address the issue of shell companies and other professional attempts to hide identity in advert purchasing, especially around political advertising at any time. There should be full disclosure of the targeting used as part of advertising transparency.

Canada’s new election rules force companies to detail sources of political advertising, but only after an election has been called. The federal government also created a task force to prevent covert, clandestine or criminal activities from influencing or interfering with the scheduled fall federal election.

–that the U.K. government pressure social media companies to publicize any instances of disinformation on their sites. Imposing security certificates and authenticating social media accounts would ensure that a real person was behind the views expressed on the account, the committee says;

–finally, the report urges the U.K. government to put more emphasis on digital literacy to help people recognize disinformation and understand their privacy rights. In its interim report the committee suggested a levy on social media companies, to be used, in part, to finance a comprehensive educational framework.

As part of its strategy to protect against disinformation, Ottawa will spend $7 million this year on digital literacy.

The U.K. parliamentary committee also recommends social media companies force users to slow down before making their words live. “More obstacles or ‘friction’ should be both incorporated into social media platforms and into users’ own activities—to give people time to consider what they are writing and sharing. Techniques for slowing down interaction online should be taught, so that people themselves question both what they write and what they read—and that they pause and think further, before they make a judgement online.”

As part of its hearings the committee also created a so-called ‘international grand committee’ of elected politicians from eight countries — including three from Canada — to join in the questioning of Richard Allan, Facebook U.K.’s vice-president of policy solutions.

The U.K. report acknowledged the work of Canada’s standing committee on access to information, privacy and ethics, and the U.S. Senate select committee on intelligence’s ongoing investigation into the extent of Russian interference in the 2016 U.S. elections.

In October 2018 the U.K. information commissioner fined Facebook £500,000 under Britain’s previous data protection law for lack of transparency and security issues over its harvesting of data. Facebook is appealing the fine.

In January SCL Elections Ltd., part of SCL Group, was fined £15,000 for failing to comply with an enforcement notice issued by the U.K. information commissioner.

The U.K. government is expected to release an online harms white paper soon, which will also cover disinformation.
