
Police Scotland served with official notice over cloud-based evidence system

The Scottish Biometrics Commissioner has issued an information notice to Police Scotland requiring the force to demonstrate that the deployment of a cloud-based digital evidence system complies with UK law enforcement data protection regulations.

In early April 2023, Computer Weekly reported that the Scottish Government’s Digital Evidence Sharing Capability (DESC) service, contracted to body-worn video provider Axon for delivery and hosted on Microsoft Azure, was being piloted despite significant data protection concerns, with observers warning that the use of Azure “won’t be legal”.

According to a Data Protection Impact Assessment (DPIA) conducted by the Scottish Police Authority (SPA), which notes that the system will process genetic and biometric information, the risks to data subjects’ rights include US government access via the Cloud Act, which effectively gives the US government access to any data stored anywhere in the cloud by US corporations; Microsoft’s use of generic rather than specific contracts; and Axon’s failure to comply with contractual provisions relating to data sovereignty.

There is also concern that the transfer of personal data to the United States, a jurisdiction with markedly lower data protection standards, could in turn adversely affect people’s rights to correct, delete, and opt out of automated decision making.

The SPA DPIA noted that while the risk of US government access via the Cloud Act was “unlikely… the consequences would be catastrophic.”

Following Computer Weekly’s coverage of the DESC service, Scottish biometrics commissioner Brian Plastow served Police Scotland, the lead data controller for the system, with an information notice on April 22, 2023, requiring the force to provide information about its compliance with data protection requirements by mid-June.

The information notice itself directly references Computer Weekly’s DESC reporting. “I am now sufficiently concerned about the potential consequences of DESC that, in accordance with the provisions of section 16 of the Scottish Biometrics Commissioner Act 2020, I must require Police Scotland to provide me with information so that I can determine whether Police Scotland is complying with the data protection elements of my statutory Code of Practice,” he wrote in the notice.

Plastow also outlined the specific information he wants to receive, including whether biometric data has been transferred through the system; what types of data; in what volumes; and in which country the data is stored.

“If biometrics have been exchanged under DESC, please confirm that Police Scotland are fully complying with Part 3 of the UK Data Protection Act 2018 relating to law enforcement data processing and Principle 10 of the Scottish Biometrics Commissioner Code of Practice,” he said, referring to the statutory code that came into force in Scotland on November 16, 2022, after approval by the Scottish government.

Principle 10 of the code specifically addresses the promotion of privacy enhancing technologies and states that the way in which biometric data is acquired, stored, used and destroyed must ensure that the data is protected from unauthorized access or disclosure.

“To comply with the Code of Practice, Police Scotland need to demonstrate that any use of hyperscale cloud infrastructure incorporating biometrics complies with law enforcement data protection regulations,” Plastow said. “The best way to achieve this is to have a hosting platform located entirely in the UK that meets all the requirements of Part 3 of the Data Protection Act 2018 for law enforcement processing.

“If this is not the case with DESC, then to ensure that public trust and confidence is maintained, Police Scotland need to explain to citizens what using the cloud means for their personal data. This means talking openly with citizens about which country their data will be stored in and, if the answer is not the UK, explaining the obvious risks of this extremely sensitive data being accessed, whether through legal means or malicious intent.”

Responding to the notice, a Police Scotland spokesman said: “Police Scotland takes data management and security very seriously and is working alongside criminal justice partners to ensure that strong, efficient and secure processes are in place to support the development of the DESC system.

“All digital evidence in the DESC system in Dundee is securely stored and only accessible to authorized personnel such as police officers, Crown Office and Procurator Fiscal Service (COPFS) staff and defence agents. Access to this information is fully audited and controlled, and processes are in place to ensure that any risks associated with the data are quickly identified, assessed and mitigated. We will continue to work with the Biometrics Commissioner to provide the necessary safeguards for data protection and security as the Dundee pilot progresses.”

Lack of regulatory approval

According to the notice, Plastow is also seeking information on what discussions Police Scotland has had with the Information Commissioner’s Office (ICO) about international transfers and digital sovereignty, and confirmation of whether all outstanding issues have been resolved to the ICO’s satisfaction.

As part of its coverage of the DESC system, Computer Weekly previously asked the ICO about the dominance of US cloud providers in the UK criminal justice sector and whether their use is compatible with UK data protection regulations. The ICO’s press office was unable to respond and forwarded Computer Weekly’s questions to its freedom of information (FOI) team for a further response.

On April 24, the ICO’s FOI team responded that, while it had received legal advice on the matter, the issue was still under review and the regulator had not yet reached a formal position. The advice itself was withheld on the grounds of legal professional privilege.

The ICO also confirmed that it has “never given formal regulatory approval for the use of these systems in a law enforcement context.”

However, correspondence between the SPA and the ICO, also disclosed under FOI, showed that the regulator broadly agreed with the SPA’s risk assessments, noting that technical support from the US or US government access via the Cloud Act would constitute an international data transfer.

“These transfers are unlikely to meet the relevant transfer conditions,” the correspondence said. “To avoid a potential breach of data protection law, we strongly recommend that you ensure personal data remains in the UK by using UK-based technical support.”

Prior consultation

In separate correspondence with Police Scotland (again disclosed under FOI), the ICO noted: “If there remains a residual high risk in your DPIA that cannot be mitigated, prior consultation with the ICO is required pursuant to Section 65 of the DPA 2018. You may not proceed with processing until you have consulted with us.”

While Plastow welcomed DESC’s strategic goal of digitally transforming how the Scottish justice system manages evidence, he confirmed that his office was never consulted by either the Scottish Government or Police Scotland prior to a meeting on November 29, 2022.

At the meeting, requested by Plastow himself after learning that biometric data could be passed through the system, the commissioner’s professional advisory panel asked Police Scotland for assurances on data security and data sovereignty issues.

Following the force’s presentation, members of the advisory group asked for the DESC slides to be circulated afterwards. However, the superintendent who gave the presentation indicated that he would need to consider the request because some of the slides might contain commercially sensitive information; the slide pack was never received.

A UK-wide problem

The release of the SPA DPIA also casts doubt on the legality of law enforcement and criminal justice cloud deployments across England and Wales, as a number of other DPIAs reviewed by Computer Weekly do not assess the risks the SPA identified with US cloud providers, even though they are subject to the same data protection regulations.

For example, in December 2020 a Computer Weekly investigation found that UK police forces were unlawfully processing the personal data of more than one million people, including biometrics, on the hyperscale public cloud service Microsoft 365, after failing to comply with key contractual and processing requirements of Part 3 of the Data Protection Act 2018, such as restrictions on international transfers.

In particular, the DPIA disclosed by Computer Weekly through freedom of information requests showed that the risks of sending sensitive personal data to a US company subject to the US government’s intrusive surveillance regime were not properly considered.

Other uses of US cloud providers in the UK criminal justice sector include integrating the Ident1 fingerprint database with Amazon Web Services (AWS) on the Police Digital Services (PDS) Xchange cloud platform; and HM Courts and Tribunals Cloud Video Platform, partly hosted on Azure, which processes biometric information in the form of audio and video recordings of court hearings.

In mid-April 2023, Biometrics Commissioner for England and Wales Fraser Sampson told Computer Weekly that UK law enforcement and judiciary should be able to prove that their growing use of public cloud infrastructure complies with law enforcement data protection regulations.

Speaking specifically about the use of hyperscale public cloud providers to store and process sensitive biometric data, Sampson said that “the burden of proof is on the police because [data] controllers have not only to provide information and guarantees, but also to demonstrate that their processing complies with all applicable [data protection] requirements.” He added that the burden of proof is not only about the law, but also about governance, accountability and building public confidence in how the police use new technology.

Giving evidence to Parliament’s Joint Committee on Human Rights in February 2023, Sampson noted that there is a “culture of non-deletion” within UK policing when it comes to retaining biometric information.


Meta returns to growth after struggling with falling sales

The debate over whether Meta is in decline may die down, at least for now.

After three quarters of falling revenue, Meta, formerly known as Facebook, reported on Wednesday that first-quarter revenue rose 3 percent year over year to $28.6 billion. Profit fell 24 percent to $5.7 billion, partly because of restructuring costs.

The results, which beat Wall Street’s expectations and Meta’s own projections, were bolstered by user growth. The company added 37 million daily users to Facebook, its flagship app, up 4 percent from a year earlier, a turnaround from the first-ever decline in users that it reported in early 2022.

“We had a good quarter and our community continues to grow,” said Mark Zuckerberg, CEO of Meta. He added that the company is “becoming more efficient so we can build better products faster and strengthen our position to realize our long-term vision.”

The results come at a tumultuous time for Meta, which is trying to reinvent itself after declining revenue and what Mr. Zuckerberg has described as an overstaffed workforce.

Mr. Zuckerberg has been pushing the company into the so-called immersive world of the metaverse, an untested market. Meta also faces stiff competition from rivals like TikTok, which is siphoning ad dollars from social media companies, and Apple, which has hampered Facebook’s ad targeting with privacy updates to its iOS software.

These issues, after years of rampant growth, have raised questions about the future of Meta and its vulnerabilities.

On Wednesday, Mr. Zuckerberg said on an earnings call that he has no intention of giving up his pursuit of the metaverse and that it remains a long-term goal.

In an effort to turn things around, he has embarked on what he calls a “year of efficiency,” reining in spending and cutting the workforce by more than 21,000 people, or about 30 percent. Meta’s share price, which rose more than 12 percent in after-hours trading, has surged 63 percent since the company announced its first round of layoffs, of 11,000 people, in November.

In March, Meta announced it was laying off another 10,000 people. The company said on Wednesday it would incur severance pay and related staff costs of about $1 billion as a result of the cuts.

“When we started this work last year, our business was not doing as well as I wanted,” Mr. Zuckerberg said on the call with investors. He added that he still believes slowing hiring and simplifying the company’s management structure will improve Meta’s speed and the quality of its work.

But the moves have also damaged employee morale, with workers wondering whether they will be laid off next. Mr. Zuckerberg has said he is trying to eliminate layers of “managers managing managers,” the result of a middle-management glut created by over-hiring during the pandemic era.

The company said it had 77,114 employees as of March 31, down 1 percent from a year earlier.

Despite the latest results, Meta’s problems remain. The company’s first-quarter costs jumped 10 percent year over year to $21.4 billion, outpacing revenue growth.

As hype has shifted from the metaverse to artificial intelligence, Meta is also trying to position itself as a leader in the field, pointing to years of investment. Mr. Zuckerberg and his executive team attend weekly AI strategy meetings. He told investors that artificial intelligence is helping to surface more relevant photos and videos to Instagram and Facebook users.

Mr. Zuckerberg said he expects the new technology to “affect literally every one of our products” in the future. He did not reveal specific plans, but did suggest potential products such as AI-powered chatbots that could help with customer service or small businesses using WhatsApp. According to him, artificial intelligence can also help make photos or videos more attractive.

For now, Meta plans to continue investing heavily in the data centers and infrastructure that support its AI efforts, as other big technology companies are doing.

“Our AI work is delivering good results in our applications and business,” Mr. Zuckerberg said.

US bill would bar children under 13 from joining social media

While all the major Silicon Valley social networks — from Instagram to TikTok — say they are blocking kids from using their apps, these senators say those efforts have been unsuccessful.

“It doesn’t work,” Schatz says. “There is no free-speech right to be fed by an algorithm that upsets you, and these algorithms are making us more and more polarized, dismissive, depressed and angry at one another. It’s bad enough that this is happening to all of us adults; the least we can do is protect our children.”

While the measure is sponsored by progressive Democrats and one of the most vocal conservatives in the Senate, lawmakers from across the ideological spectrum are equally skeptical of the proposal, a sign of the difficult road ahead for any new social media measures, including those aimed at children. Many legislators are torn between protecting children online and preserving the internet as we know it. Naturally, many senators look to their own families for guidance.

“My grandchildren have flip phones. They won’t get smartphones until they’re older,” says Utah Republican Senator Mitt Romney. Romney, who is open to the idea though initially hesitant, says even his own family is divided on the issue.

“I have five sons, so five different families, and they have different approaches,” Romney says. “The youngest son is the strictest, and the eldest son didn’t really consider it much of an issue.”

For Smith, a Minnesota senator who worried about her party coming across as a Big Sister, there was not even uniformity in her own home when, years ago, her boys squabbled over the family’s first desktop computer. Her children also turned out to be (mini) hackers.

“We were trying to figure out how to control their interactions with the computer, and we quickly realized that, at least for them, it’s hard to set hard and fast rules because kids find a way,” says Smith. “And different parents have different rules about what they think is right for their children.”

Although Smith is open to the new measure, she is wary. “I tend to be a little suspicious of hard and fast rules, I guess, because I’m not sure if they work and because I kind of think parents and kids should have the freedom to decide what’s right for their family,” she says.

Although Smith is a progressive Democrat, she currently aligns with Senator Rand Paul, a libertarian Republican from Kentucky, on this new measure. “Parents have some control over what their kids watch online, what they watch on TV, all of those things are important. I’m not sure I want the federal government [involved],” Paul says.

The new measure also has competition. Just last week, Senators Richard Blumenthal, Democrat of Connecticut, and Lindsey Graham of South Carolina, the top Republican on the Senate Judiciary Committee, reintroduced their EARN IT Act, the Eliminating Abusive and Rampant Neglect of Interactive Technologies Act. That measure would strip the current Section 230 protections from any sites that host online child sexual exploitation content. Section 230 remains a highly controversial law because it shields online companies from liability for much of what their users post on their platforms.
