
Save 54% on a lifetime of language lessons from Rosetta Stone
TL;DR: As of July 21, get lifetime access to all Rosetta Stone Languages for only
2023-07-21 17:50

Texas A&M University president resigns after Black journalist’s hiring at campus unravels
The president of Texas A&M University has resigned after a Black journalist’s celebrated hiring at one of the nation’s largest campuses unraveled over criticism of her diversity and inclusion work
2023-07-22 06:17

Snag a 2021 iPad for $79 off with this early Prime Day deal
SAVE $79.01: As of June 22, the 2021 iPad (WiFi, 64GB) is on sale at
2023-06-23 03:20

Intel in talks to be anchor investor in Arm IPO - source
Intel is in talks with SoftBank Group Corp's Arm to be an anchor investor in the chip maker's
2023-06-13 09:51

Wrongly arrested because of facial recognition: Why new police tech risks serious miscarriages of justice
On 16 February, Porcha Woodruff was helping her children get ready for school when six Detroit police officers arrived at her door. They told her she was under arrest for a January carjacking and robbery. She was so shocked she wondered for a moment if she was being pranked. She was eight months into a difficult pregnancy and partway through a nursing school programme. She did little else besides study and take care of her kids. She certainly wasn’t out stealing cars at gunpoint, she said.

“I’m like, ‘What?’ I opened my door so he could see my stomach. ‘I’m eight months pregnant. You can see two vehicles in the driveway. Why would I carjack?’” she told The Independent. “‘You’ve gotta be wrong. You can’t have the right person.’”

Her children cried as she asked officers if the suspect was pregnant and insisted they had mistakenly arrested her. She was put in handcuffs and taken to jail, where she had panic attacks and early contractions. She later learned police had identified her as a suspect after running security footage through the department’s facial recognition software, relying on a 2015 mugshot from a past traffic arrest that was then placed into a photo lineup, where the carjacking victim singled out Ms Woodruff as her assailant.

The Detroit Police Department eventually dropped the case, but the arrest has deeply shaken Ms Woodruff. “What happened to the questioning? What happened to me speaking to someone?” she said. “What happened to any of the initial steps that I thought were available to a person who was accused of doing something?”

The case underscores the growing risks of civil rights violations as police departments and law enforcement agencies across the country increasingly adopt facial-recognition and other mass surveillance technologies, often used as an unreliable shortcut around methodical human police work. Criminal justice advocates and the people targeted by this burgeoning police tech argue these programmes are riddled with the same biases and opaque or nonexistent oversight measures plaguing policing at large.

The early results, at least, haven’t been encouraging. At least six people around the US have been falsely arrested using facial ID technology. All of them are Black. These misfires haven’t stopped the technology from proliferating across the country: at least half of federal law enforcement agencies with officers and a quarter of state and local agencies are using it.

“We have no idea how often facial recognition is getting it wrong,” Albert Fox Cahn, executive director of the Surveillance Technology Oversight Project (STOP), told The Independent. “When you have facial recognition being used thousands of times, without any accountability for mistakes, it’s inviting injustice,” he added.

Nowhere has that injustice been more pronounced than in Detroit, a city where Black people have long experienced documented over-policing from law enforcement. Three of the six people mistakenly arrested by facial recognition technology have been in the Motor City, according to the ACLU.

This status quo is why Ms Woodruff is suing DPD, claiming among other things that the agency has engaged in “a pattern of racial discrimination” against her and other Black residents “by using facial recognition technology practices proven to misidentify Black citizens at a higher rate than others in violation of the equal protection guaranteed by” the Michigan civil rights statutes.
“I definitely believe that situation would’ve gone differently had it been another race, honestly, just my opinion. There was no remorse shown to me and I was pregnant. I pleaded,” she told The Independent.

“Being mistaken for something as serious as that crime – carjacking and armed robbery – that could’ve put me in a whole different type of lifestyle,” she added. “I was in school for nursing. Felons cannot become nurses. I could’ve ended up in jail. That could have altered my life tremendously.”

The Independent has requested comment from DPD.

After Ms Woodruff filed her suit, Detroit police chief James White said in a press conference in August that “poor investigative work” led to the false arrest, not facial recognition technology. He claimed that department software gave detectives numerous possible suspects and was only meant to be a “launch” point for further investigation.

“What this is, is very, very poor investigative work that led to a number of inappropriate decisions being made along the lines of the investigation, and that’s something this team is committed to not only correcting, having accountability, having transparency with this community, and in building policy immediately to ensure regardless of the tool being used, this never happens,” Mr White said.

He added that officers won’t be allowed to use images sourced by facial recognition in lineups, and warrants based on facial ID matches must be reviewed by two captains before being carried out.

‘The lead and the conclusion’

Some aren’t convinced these changes will prevent the excesses of what they see as a fundamentally flawed technology. “The technology is flawed. It’s inaccurate,” Philip Mayor, senior staff attorney at the ACLU of Michigan, told The Independent. “Police repeatedly assured us that it’s being used only as an investigative lead, but what we see here in Detroit time and time again is it is both being used as the lead and the conclusion.”

Studies suggest that facial-recognition algorithms, which have been used to capture suspects in high-profile cases like those connected to January 6, also fail to accurately identify Black people and women, driving up inequalities in arrests, because image-training datasets often lack full diversity. However, according to Mr Mayor, police departments make things even worse by failing to do basic training and common-sense investigative work on top of facial recognition tools.

He represents Robert Williams, a Detroit man who was mistakenly arrested for a 2020 theft from a high-end Detroit boutique. A security contractor employed by the store worked with the city and state police and flagged Mr Williams’ name using facial recognition tools. How police came to trust that Mr Williams was the right man reveals the sloppiness of how facial ID tech is used in practice, according to the ACLU attorney.

After the theft, police searched a database containing both past photos of Mr Williams and his present-day driver’s license. “It picks out 486 people who are the most likely perpetrators; not a single one of them is his current driver’s license, even though his current driver’s license is in the database that was searched,” Mr Mayor said. “That seems like an obvious exculpatory fact, the kind of thing that would lead you to say, if you were actually thinking, this isn’t the right guy.”

When these dubious matches are then used to build a line-up, questionable police work attains the gloss of near-fact, and witnesses choose from a group of people who may have no credible tie to the crime that took place but still look something like the person who did.
“This is not me,” Mr Williams told police during his interrogation, according to The New York Times. “You think all Black men look alike?” The father of two, after asking local police to voluntarily stop using facial recognition technology, sued the DPD in 2021.

“This technology is dangerous when it doesn’t work, which is what the cases in Detroit are about. It’s even more dangerous when it does work. It can be used to systematically surveil us when we come and go from every one of the places that are important in our private lives,” the ACLU attorney said. “I don’t think there’s any reason to believe that departments elsewhere right now are not making the same mistakes.”

‘A force multiplier for police racism’

Detroit isn’t the only place grappling with the impacts – and errors – of this technology. In Louisiana, the use of facial recognition technology led to the wrongful arrest of a Georgia man for a string of purse thefts. A man in Baltimore spent nine days in jail after police incorrectly identified him as a match to a suspect who assaulted a bus driver. The Baltimore Police Department ran nearly 800 facial recognition searches last year.

Those cases and others have added to a growing list of misidentified suspects in a new era of racial profiling dragnets fuelled by tech that is rapidly outpacing police and lawmakers’ ability to fix it. Facial recognition software often is “a force multiplier for police racism,” worsening racial disparities and amplifying existing biases, according to Mr Cahn.

It can spur a vicious cycle. Black and brown people are already arrested at disproportionate rates. These arrests mean they are more likely to enter a database of faces being analyzed and used for police investigations. Then, error-prone facial recognition technology is used to comb these databases, often failing to identify or distinguish between Black and brown people, particularly Black women. “So the algorithms are biased, but that’s just the start, not the end of the injustice,” Mr Cahn says.

Such technologies, advocates warn, are embedded in wider mass surveillance programmes that often lack robust public oversight. In New York City, law enforcement agencies relied on facial recognition technology in at least 22,000 cases between 2016 and 2019, according to Amnesty International.

New York City’s Police Department spent nearly $3bn growing its surveillance operations and adding new technology between 2007 and 2019, including roughly $400m for the Domain Awareness System, built in partnership with Microsoft to collect footage from tens of thousands of cameras throughout the city, according to an analysis from STOP and the Legal Aid Society. The NYPD has failed to comply with public disclosure requirements about what those contracts – from facial recognition software to drones and license plate readers – actually include, according to the report.

That money was listed under “special expenses” in the police budget until passage of the Public Oversight of Surveillance Technology Act in 2020. The following year, more than $277m in budget items were listed under that special expenses programme, the report found.

“We’ve seen just concerted pushback from police departments against the sort of oversight that every other type of government agency has because they don’t want to be held accountable,” according to Mr Cahn.
“If we treated surveillance technology vendors the way we treated other technology vendors, it would be like Theranos – police would be arresting some of these vendors for fraud rather than giving them government contracts,” he added. “But there is no accountability.”

On 7 August 2020, New York City Police Department officers in riot gear launched a six-hour siege outside Derrick Ingram’s Hell’s Kitchen apartment. Mr Ingram – a racial justice organiser who is embroiled in a federal lawsuit against the NYPD – was surrounded by more than 50 officers after he allegedly shouted into an officer’s ear at a protest earlier that summer. Police insisted they had a warrant on assault charges, but couldn’t produce one when Mr Ingram asked them to, according to his suit. The whole encounter, in which the NYPD deployed snipers, drones, helicopters, and police dogs, began with facial recognition technology.

“To say that I was terrified is an understatement – I was traumatized, I still am,” Mr Ingram later testified. “I fear deep down in my core that if I opened my door to those officers, my life would be swiftly taken.”

To identify Mr Ingram as a potential suspect, the NYPD relied on facial recognition software “as a limited investigative tool, comparing a still image from a surveillance video to a pool of lawfully possessed arrest photos,” according to a police statement, which added that “no one has ever been arrested solely on the basis of a computer match.” The software pulls from a massive internal database of mugshots to generate possible matches, according to the department.

Civil rights groups and lawmakers criticized the department’s use of facial recognition – initially hailed as a tool to crack down on violent offenders – for being deployed to suppress dissent, and for triggering a potentially lethal police encounter at Mr Ingram’s home.

As for Ms Woodruff in Detroit, she hopes her experience can show the dangers of relying too heavily on facial recognition technology. “It may be a good tool to use, but you have to do the investigative part of using that, too,” she said. “It’s just like everything else. You have your pieces that you put together to complete a puzzle.” Her life would’ve been a whole lot different, she said, if “someone would’ve just taken the time to say, ‘OK, stop, we’re going to check this out, let me make a phone call.’”
2023-09-15 03:47

Cryptoassets increase risk in developing economies, study says
NEW YORK Cryptoassets, peddled as the future of finance, have not only failed to deliver on their promise
2023-08-23 02:15

Get to the next level with these Logitech gaming deals
Gaming is a great way to have fun, relax, and learn to strategize. But we
2023-05-23 00:53

SpaceX Starship: World’s most powerful rocket should launch imminently, Elon Musk says
SpaceX’s Starship should take off for the second time ever this week, Elon Musk has said. The world’s most powerful and tallest rocket is aiming to launch this week, he tweeted. The rocket will attempt to fly around the Earth and then drop into the ocean in a major test. Eventually, SpaceX hopes that Starship will help carry humans to the Moon and on to Mars. But first it must prove that it is safe for orbital flight in an uncrewed test.

Friday’s flight would mark the second launch after a spectacular failure in April that saw the rocket blow up soon after lift-off. Since then, the private space company has been working to secure regulatory approval for another test. Now Elon Musk says that those approvals should be granted in time to launch on Friday, 17 November. Earlier, SpaceX had only said the rocket “could launch as early as Friday”. It may still be delayed, and previous tests have been pushed back mere seconds before launch.

The first orbital test flight was attempted in April this year. Soon after it took off, Starship began to tumble, and the rocket exploded. Since then, SpaceX has been working to fix a number of issues with both the rocket and the launchpad. The FAA required 63 fixes before it would give permission for the rocket to launch again. Those changes have led to a series of improvements that SpaceX says should reduce the chance of another failure, as well as protect the humans who will eventually fly in the spacecraft.

“Starship’s first flight test provided numerous lessons learned that directly contributed to several upgrades to both the vehicle and ground infrastructure to improve the probability of success on future flights,” SpaceX says on its website. “The second flight test will debut a hot-stage separation system and a new electronic Thrust Vector Control (TVC) system for Super Heavy Raptor engines, in addition to reinforcements to the pad foundation and a water-cooled steel flame deflector, among many other enhancements.

“This rapid iterative development approach has been the basis for all of SpaceX’s major innovative advancements, including Falcon, Dragon, and Starlink. Recursive improvement is essential as we work to build a fully reusable transportation system capable of carrying both crew and cargo to Earth orbit, help humanity return to the Moon, and ultimately travel to Mars and beyond.”
2023-11-14 18:16

MTG Ob Nixilis, Captive Kingpin Combo Explained
Magic: The Gathering Standard has a new two-card combo that instantly wins games. Here's how it works.
2023-05-11 08:26

Missed October Prime Day? Here Are the Best Amazon Deals You Can Still Get Online
Some Prime Big Deal Days sales are still going strong, even though October Prime Day is over. Discover the best Amazon deals you can still shop here.
2023-10-17 05:26

OpenAI announces return of Sam Altman as chief executive
Sam Altman will return to OpenAI after an agreement in principle was reached, the company has announced. Posting on X, formerly known as Twitter, OpenAI also announced a new initial board of former Salesforce chief executive Bret Taylor, the former US treasury secretary Larry Summers and Quora chief executive Adam D’Angelo.

Mr Altman also posted, saying “i love openai, and everything i’ve done over the past few days has been in service of keeping this team and its mission together.”

Last week the board of OpenAI, which created the ChatGPT artificial intelligence tool, said it had pushed Mr Altman out after a review found he was “not consistently candid in his communications” with the board.

Greg Brockman, the company’s president and co-founder, who left in protest at Mr Altman’s sacking, said on X: “Amazing progress made today. We will come back stronger and more unified than ever.” “Returning to AI & getting back to coding tonight,” Mr Brockman added.

The previous board of directors, which included Mr D’Angelo and Mr Brockman, refused to give specific reasons why they fired Mr Altman last Friday. This led to mounting pressure within the company to reinstate Mr Altman, including a threatened exodus of nearly all of the company’s 770 employees. Microsoft, which has invested billions of dollars in OpenAI, moved to hire Mr Altman and Mr Brockman on Monday.

In a post on social media on Wednesday morning, the chairman and chief executive of Microsoft, Satya Nadella, said he is “encouraged by the changes to the OpenAI board”. “We believe this is a first essential step on a path to more stable, well-informed, and effective governance.”
2023-11-22 15:46

Astronauts dropped a toolbag in space which you can see with just binoculars
Whilst repairing external parts of the International Space Station (ISS) last week, astronauts dropped a toolbag. And it turns out you just need a pair of binoculars to see it.

The bag is tiny compared to the ISS, but it’s reflective enough that when it catches the Sun’s light it reaches 6th magnitude as seen from Earth, according to EarthSky. Under very dark skies, some powerful binoculars or a small telescope might allow you to see the toolbag.

The bag is moving at almost exactly the same speed as the ISS, on the same path and about a minute ahead of it. Over time, however, its distance from the ISS will grow, making it harder to find. Eventually, its orbit will become low enough that it burns up from friction with the outer atmosphere.

You can find out if you have the ISS passing overhead here if you want to have a chance of seeing the bag. The ISS can only be seen easily when it’s dark on the ground and sunlight is still catching it. It means it’s usually best seen when the skies are not fully dark – so around dusk or dawn.
2023-11-13 22:51
You Might Like...

GameStop shares plummet after fifth CEO exit in 5 years

USNC Selects Gadsden, Alabama for Advanced Microreactor Assembly Plant

SEC dropping claims against Ripple executives - court filing

Binance market share takes regulatory hit, its US affiliate shrinks

From $1 Billion to Almost Worthless: FaZe Clan Runs Out of Hype

Why is Keith Lee's DoorDash hack on TikTok 'vaguely unethical'?

Twitter's new encrypted message feature criticized by security and privacy experts

Can Your PC Handle Mortal Kombat 1's Blood-Soaked System Requirements?