Cyber Daily: Police Use of Purchased Data Raises Constitutional Questions
By Kim S. Nash
Welcome back. Data brokers sprang up to help marketers
and advertisers better communicate with consumers. But over the past
few decades, they have created products that cater to the
law-enforcement, homeland-security and national-security markets.
Police agencies can buy the data in a practice they call open-source
intelligence. Critics denounce it as warrantless surveillance. The
practice is the subject of a vigorous national dispute about government
tracking through data brokers without a judge’s approval, The Wall
Street Journal reports.
Read on for more news.
Kevin Metcalf, a prosecutor in Arkansas, used widely available commercial data to pursue a hunch in the case of a missing girl.
Tracking behavior online and offline: Few
consumers realize how much information their phones, cars and other
connected devices broadcast to commercial brokers and how widely it is
used in finance, real-estate planning and advertising. While such data
has been quietly used for years in intelligence, espionage and military
operations, its increasing use in criminal law raises a host of
potential constitutional questions.
Data purchased on consumer addresses, purchases, and online and offline
behavior has increasingly been used to screen airline passengers, find
and track criminal suspects, and enforce immigration and customs laws.
Privacy advocates on Capitol Hill, led by Sen. Ron Wyden (D., Ore.) and
Sen. Rand Paul (R., Ky.), have proposed a bill—the Fourth Amendment Is
Not for Sale Act—that would curtail warrantless searches by law
enforcement, including by requiring government entities to secure a
court order before buying U.S. cellphone locations and other
commercially available data from data brokers. The proposed bill
applies only to law enforcement, not to volunteers or other nonprofit
groups, which could still pursue open-source-intelligence leads and
turn them over to police.
Air Force software platform struggling with cyber concerns. A
software development environment designed to produce code quickly for
the U.S. Air Force has generated questions about its security. Requests
for documentation regarding Platform One, which launched in 2020,
haven’t been fulfilled, creating distrust among senior officials in
other branches of the military. Some officials have tried to ban code
stored on the platform. (FedScoop)
Ransomware group threatens to delete victims’ data if they hire negotiators.
The Grief Corp ransomware group said it will destroy victims' data
if they hire professional negotiators to try to reduce the ransom
demand. Another ransomware group, known as RagnarLocker, previously
made a similar threat. (The Register)
More Privacy News
Irish regulator investigates TikTok’s data practices. Ireland’s data protection commission is investigating TikTok’s
handling of children’s data and alleged transfer of user information to
China. The inquiry will look at the company’s age-verification measures
for children under 13, and other practices for users under age 18.
TikTok has said it doesn’t share Americans’ data with the Chinese
government. The company is also under investigation by the British
privacy regulator. (Financial Times)
The electrocardiogram function on a Fitbit smartwatch.
FTC urges health apps and devices to disclose breaches. Firms
that collect consumers’ health data must disclose when the information
is improperly accessed, even if they aren’t subject to the Health
Insurance Portability and Accountability Act, the Federal Trade
Commission said on Wednesday. (FTC)
The agency warned makers of health apps and wearable fitness trackers
that a 2009 FTC rule mandates such disclosures, carrying a potential
penalty of $43,792 per violation per day. While the FTC rule has yet to
be enforced, the Biden administration has promised to ramp up scrutiny
of companies’ data practices.
Facial recognition systems can violate human rights, UN warns.
Michelle Bachelet, the United Nations High Commissioner for Human
Rights, called for a government moratorium on artificial intelligence
tech that doesn't comply with human-rights laws, such as face scanners
that track individuals in public. Governments should show that such AI
systems are accurate, don’t discriminate and adhere to privacy laws, she
said. (Associated Press)