
BIOS update for MSI X670 and B650 boards can cut boot time in half


In a nutshell: MSI has released a BIOS update for its X670 and B650 motherboards with a new option that can reportedly cut boot times in half. In lab testing with an AMD Ryzen 7 7800X3D processor, an MSI MAG X670E Tomahawk WiFi motherboard, and 32GB (2x 16GB) of Kingston DDR5-6000 memory, MSI reduced boot time, measured from the moment the power button was pressed until the Windows 11 desktop loaded, from 43 seconds to 22 seconds. That’s not bad for changing a single setting, but is it worth the risk?

A BIOS option called Memory Context Restore makes all of this possible, but unfortunately the hardware vendor has little to say about what the option actually does.

Tom’s Hardware points to an Asus forum post from a month ago in which a user discussed using this feature on the ROG Crosshair X670E Hero board. The feature also appeared in a TechPowerUp forum discussion at the end of last year, although in both cases it was buried deep in advanced user menus.

Do you have experience with the Memory Context Restore option on your X670 or B650 board, or does it even sound like an option you might be interested in? In my younger years, I would go to great lengths to squeeze out even a modicum of extra performance. These days, however, I value stability above all else. If my computer needs a couple of extra seconds to boot but I know it won’t crash, I’ll take that trade.

Many enthusiasts leave their machines running 24/7, so something like this might not even be worth the hassle or potential stability risk. If it’s not broken, don’t fix it!

Those interested in trying the new setting can grab the latest BIOS update from MSI’s website. Simply navigate to your board’s product page, click Support, then look for the BIOS under Drivers and Downloads. A quick check of the board MSI used for testing shows its latest BIOS update was released on April 14.


Police Scotland served with information notice over cloud-based evidence system


The Scottish Biometrics Commissioner has issued an information notice to Police Scotland requiring the force to demonstrate that the deployment of a cloud-based digital evidence system complies with UK law enforcement data protection regulations.

In early April 2023, Computer Weekly reported that the Scottish Government’s Digital Evidence Sharing Capability (DESC) service, contracted to body-worn video provider Axon for delivery and hosted on Microsoft Azure, was being piloted despite significant data protection concerns, with observers warning that the use of Azure “will not be legal”.

A Data Protection Impact Assessment (DPIA) conducted by the Scottish Police Authority (SPA), which notes that the system will process genetic and biometric information, identifies risks to data subjects’ rights including US government access via the Cloud Act, which effectively gives the US government access to any data stored anywhere in the cloud by US corporations; Microsoft’s use of generic rather than specific contracts; and Axon’s inability to comply with contractual clauses around data sovereignty.

There is also concern that the transfer of personal data to the United States, a jurisdiction with markedly lower data protection standards, could in turn adversely affect people’s rights to correct, delete, and opt out of automated decision making.

While the SPA’s DPIA noted that the risk of US government access through the Cloud Act was “unlikely”, it added that “the consequences would be catastrophic.”

Following Computer Weekly’s coverage of the DESC service, Scottish biometrics commissioner Brian Plastow served Police Scotland (the lead data controller for the system) with an information notice on April 22, 2023, which requires the force to provide information about its compliance with data protection requirements by mid-June.

The information notice itself directly references Computer Weekly’s DESC reporting. “I am now sufficiently concerned about the potential consequences of DESC that, in accordance with the provisions of section 16 of the Scottish Biometrics Commissioner Act 2020, I must require Police Scotland to provide me with information so that I can determine whether Police Scotland is complying with the data protection elements of my statutory Code of Practice,” he wrote in the notice.

Plastow also outlined the specific information he wants to receive, including whether any biometric data has been transferred; what types were transferred; in what volumes; and in which country the data is stored.

“If biometrics have been exchanged under DESC, please confirm that Police Scotland is fully complying with Part 3 of the UK Data Protection Act 2018 relating to law enforcement data processing, and with Principle 10 of the Scottish Biometrics Commissioner Code of Practice,” he said, referring to the statutory code that came into force in Scotland on November 16, 2022, after approval by the Scottish government.

Principle 10 of the code specifically addresses the promotion of privacy enhancing technologies and states that the way in which biometric data is acquired, stored, used and destroyed must ensure that the data is protected from unauthorized access or disclosure.

“To comply with the Code of Practice, Police Scotland needs to demonstrate that any use of hyperscale cloud infrastructure involving biometrics complies with law enforcement data protection rules,” Plastow said. “The best way to achieve this is to have a hosting platform located entirely in the UK that meets all the requirements of Part 3 of the Data Protection Act 2018 for law enforcement processing.

“If this is not the case with DESC, then to ensure that public trust and confidence are maintained, Police Scotland needs to explain to citizens what using the cloud for their personal data means. That means talking openly with citizens about which country their data will be stored in and, if the answer is not the UK, explaining the obvious risks of this extremely sensitive data being accessed, whether through legal means or malicious intent.”

Responding to the notice, a Police Scotland spokesperson said: “Police Scotland takes data management and security very seriously and is working alongside criminal justice partners to ensure that strong, efficient and secure processes are in place to support the development of the DESC system.

“All digital evidence in the Dundee DESC pilot is securely stored and only accessible to authorized personnel such as police officers, Crown Office and Procurator Fiscal Service [COPFS] staff and defence agents. Access to this information is fully audited and controlled, and processes are in place to ensure that any risks associated with the data are quickly identified, assessed and mitigated. We will continue to work with the Biometrics Commissioner to provide the necessary safeguards for data protection and security as the Dundee pilot progresses.”

Lack of regulatory approval

According to the notice, Plastow is also seeking information on what discussions Police Scotland has had with the Information Commissioner’s Office (ICO) about international transfers and digital sovereignty, and confirmation of whether all issues have been resolved to the ICO’s satisfaction.

As part of its coverage of the DESC system, Computer Weekly previously asked the ICO about the dominance of US cloud providers in the UK criminal justice sector and whether their use is compatible with UK data protection rules. The ICO press office was unable to respond, and forwarded Computer Weekly’s questions to its freedom of information (FOI) team for a response.

On April 24, the ICO’s FOI team responded that while the regulator had received legal advice on the issue, the matter was still under consideration and it had not yet reached a formal position. The advice itself was withheld on the grounds of legal professional privilege.

The ICO also confirmed that it has “never given formal regulatory approval for the use of these systems in a law enforcement context.”

However, correspondence between the SPA and the ICO, also disclosed under FOI, showed that the regulator broadly agreed with its risk assessments, noting that technical support from the US, or US government access through the Cloud Act, would constitute an international data transfer.

“These transfers are unlikely to meet the relevant transfer conditions,” it said. “To avoid a potential breach of data protection law, we strongly recommend that you ensure personal data remains in the UK by using UK-based technical support.”

Prior consultation

In separate correspondence with Police Scotland (again disclosed under FOI), the ICO noted: “If there remains a residual high risk in your DPIA that cannot be mitigated, prior consultation with the ICO is required pursuant to Section 65 of the DPA 2018. You may not proceed with the processing until you have consulted us.”

While Plastow welcomed DESC’s strategic goal of digitally transforming how the Scottish justice system manages evidence, he confirmed that his office had not been consulted by either the Scottish Government or Police Scotland prior to a meeting on November 29, 2022.

At the meeting, requested by Plastow himself after learning that biometric data could be passed through the system, the commissioner’s professional advisory panel asked Police Scotland for assurances on data security and data sovereignty issues.

Following the force’s presentation, members of the advisory group asked for the DESC slides to be circulated afterwards. However, the superintendent who gave the presentation indicated that he would need to consider the request because some of the slides might contain commercially sensitive information. The slide pack was never received.

A UK-wide problem

The release of the SPA’s DPIA also casts doubt on the legality of law enforcement and criminal justice cloud deployments across England and Wales, as a number of other DPIAs reviewed by Computer Weekly do not assess the risks the SPA identified with US cloud providers, even though those deployments are subject to the same data protection rules.

For example, in December 2020, a Computer Weekly investigation found that UK police forces were unlawfully processing the personal data of more than one million people, including biometric information, on the hyperscale public cloud service Microsoft 365, after failing to comply with key contractual and processing requirements of Part 3 of the Data Protection Act 2018, such as restrictions on international transfers.

In particular, DPIAs obtained by Computer Weekly through freedom of information requests showed that the risks of sending sensitive personal data to a US company subject to the US government’s intrusive surveillance regime had not been properly considered.

Other deployments of US cloud providers in the UK criminal justice sector include the integration of the Ident1 fingerprint database with Amazon Web Services (AWS) via the Police Digital Service’s (PDS) Xchange cloud platform, and HM Courts and Tribunals Service’s Cloud Video Platform, partly hosted on Azure, which processes biometric information in the form of audio and video recordings of court hearings.

In mid-April 2023, Biometrics Commissioner for England and Wales Fraser Sampson told Computer Weekly that UK law enforcement and judiciary should be able to prove that their growing use of public cloud infrastructure complies with law enforcement data protection regulations.

Speaking specifically about the use of hyperscale public cloud providers to store and process sensitive biometric data, Sampson said that “the burden of proof is on the police as [data] controllers not only to provide information and assurances, but to demonstrate that their processing complies with all applicable [data protection] requirements.” He added that the burden of proof is not only about the law, but also about governance, accountability and building public confidence in how the police use new technology.

Speaking before Parliament’s Joint Committee on Human Rights in February 2023, Sampson noted that there is a “culture of non-deletion” within UK policing when it comes to retaining biometric information.


AI creators must study consciousness, experts warn


An open letter calls on the tech sector to explore consciousness as AI becomes more advanced.


Artificial intelligence and the future of employment


Editor’s Note: In this Future View, students discuss artificial intelligence. Next week we’ll ask, “With the news of the departures of Tucker Carlson and Don Lemon, is cable news in decline? Or is the shake-up a chance for it to rise? How do you consume your news?” Students should click here to submit opinions of fewer than 250 words by May 2. The best responses will be published that night.

When direct-dial telephones replaced switchboards in the 1950s, nearly a quarter of a million telephone operators, many of them the so-called “hello girls,” lost their jobs. More recently, grocery stores and fast-food restaurants have cut staff with self-service kiosks. Today, college-educated professionals in many industries fear being replaced by artificial intelligence.

AI will change the future of work, but not necessarily for the worse. Seemingly stable, permanent jobs could be in jeopardy, according to the World Economic Forum, which projects 85 million job losses by 2025 but also estimates that AI will create 97 million new jobs. Its advice to worried workers? Start retraining.

Automated work will never replace human ingenuity, creativity, or friendships. Jobs that value these skills will become more important with the advent of AI. Workers must see themselves as more than a set of technical skills. AI invites entrepreneurs and pioneers to continue discovering how a changing economy can best appreciate unique human skills, and this can give a competitive edge to those who develop traits we may have taken for granted of late.

— Emily Marsh, Hillsdale College, Economics and Mathematics

The future of education

Artificial intelligence is pushing pedagogy into a new era. Although the first version of ChatGPT launched only a few months ago, it has already had a remarkable impact on teachers and students. Forty-three percent of college students report using AI, but more than half say that using it to complete assignments is cheating. Educators already have a hard time spotting misconduct, and in the next few years it will become nearly impossible. Banning the technology in the classroom makes no sense. Artificial intelligence is the future of education.

AI enables an unprecedented level of personalization. Its adaptability allows educators to create effective learning programs tailored to each student, increasing engagement. AI can also serve as a tutor, explaining complex concepts and questions. In addition, ChatGPT’s simplicity allows anyone with a computer or smartphone to access it.

Many educators seem reluctant to acknowledge that artificial intelligence is changing education. AI won’t take their jobs; good teachers will always be needed at every level of education. AI will only make their jobs easier by streamlining grading and other routine tasks, allowing teachers to focus more on their students.

Educators need to use AI, and students need to be taught how to use it responsibly. Instead of being considered a form of cheating, AI should be seen as an important tool in helping students become better writers and problem solvers.

— Nicholas Rhine, Villanova University, Finance

Revival of the humanities

Many of my peers in the humanities are pessimistic about their future job prospects. For them, the spread of AI is the final nail in the coffin of a discipline already in decline. But I am quite optimistic about the future of AI.

AI is designed to streamline time-consuming tasks — coding, math, data analysis, and more — in an efficient and accurate way so that we don’t have to. But since the machine can only produce results based on input, it cannot create new knowledge on its own. That’s what the humanities are for.

Pushing the boundaries of thought requires a deep curiosity that only humans are capable of, a curiosity cultivated by engaging with the works of great thinkers. Aristotle’s teachings shaped how we conduct science today. Shakespeare’s plays show us the power of understanding character. Orwell’s stories warn us of the dangers of an unfree society.

Thus, in tandem with the rise of AI, there will be a renaissance in the humanities, which will feed AI with new knowledge and define the ethical framework for its use. AI does not detract from the value of the humanities. It revives them.

— Long Tran Bui, Swarthmore College, Politics, Philosophy and Economics.

Our new companion

There is no denying it: AI will have a huge impact on the labor market and productivity. The rapid rise of ChatGPT testifies not only to AI’s popularity but also to its ability to handle all sorts of day-to-day tasks.

But the AI revolution will not be as apocalyptic as many fear. Those who have taken the time to look past the hype and use ChatGPT for themselves know that AI tools are more bark than bite. Despite its human tone, GPT, like other AIs, cannot manage even the most basic logical reasoning.

Of course, future improvements may correct factual errors. But will further training, and more money for AI developers, really teach new skills to programs that already boast more than 175 billion parameters?

I am not sure AI can replace humans at work. By its nature, AI cannot think like a human. But it can digest colossal amounts of information and regurgitate it for quick consumption, which makes it a valuable tool. AI can take its place alongside existing software, making time-consuming data tasks more efficient and streamlining work in analytics, computer science, and management. That would encourage more growth, and therefore more jobs, in many areas of business.

If we learn how to use AI to improve our workflow, we will be ready for the future with AI on our workstations. AI will become a companion for all workers and a boon for the labor market.

— John Manning, University of Notre Dame, History
