The U.S. Food and Drug Administration said it sent warning letters to 11 companies for marketing dietary supplements that don’t meet its guidelines. The agency issued warnings to three companies for marketing dietary supplements containing phenibut, which is sometimes sold as a sleep aid or to treat anxiety. It said that phenibut doesn’t meet the statutory definition of a dietary ingredient, which is generally a vitamin, herb or other natural substance used to supplement the diet. A Wall Street Journal article last week on the $40 billion supplement industry said that phenibut, developed as a drug in the former Soviet Union, was being marketed as a “nootropic,” or brain supplement, in the U.S. A spokeswoman for the FDA said that the agency was already investigating phenibut. The FDA also issued warnings to eight companies for marketing dietary supplements containing DMHA, a stimulant sometimes found in exercise and weight-loss supplements. The companies have 15 business days to inform the FDA of the steps they will take to bring their products into compliance, which could include a decision to recall, reformulate or discontinue sales. Unlike prescription drugs, supplements aren’t tightly regulated by the FDA; dietary-supplement manufacturers don’t need the agency’s approval before introducing their products to the market. Instead, the FDA has authority to take action against a misbranded supplement after it reaches the market.
Federal regulators investigating Facebook for mishandling its users’ personal information have set their sights on the company’s chief executive, Mark Zuckerberg, exploring his past statements on privacy and weighing whether to seek new, heightened oversight of his leadership. The discussions about how to hold Zuckerberg accountable for Facebook’s data lapses have come in the context of wide-ranging talks between the Federal Trade Commission and Facebook that could settle the government’s more than year-old probe, according to two people familiar with the discussions. Both requested anonymity because the FTC’s inquiry is confidential under law. Such a move could create new legal, political and public-relations headaches for one of Silicon Valley’s best-known — and image-conscious — corporate leaders. Zuckerberg is Facebook’s co-founder, chief executive, board chairman and most powerful stockholder, and a sanction from the federal government would be seen as a rare rebuke to him and to the tech giant’s historic “move fast and break things” ethos.
The FDA is at its core a public health agency and should not be bucketed with institutions that regulate commerce and business practice. Yes, the FDA does regulate multi-billion-dollar companies and how they are able to commercialize their products. However, those products’ impact is measured less by financial gain and more by lives saved. All of this necessitates that the FDA be exempt from the directive in this memo, because advancing science, protecting consumers, and bettering patients’ lives are too important to fall victim to a policy misstep.
Creating strong regulations for the technology is going to be an uphill battle, especially because it’s already become widespread, being deployed at airports to make boarding easier and adopted by schools to increase safety. It is even being used at summer camps so parents can automatically receive photos in which their children appear. Threats to our obscurity are growing because technology is making our personal information easy and cheap to aggregate, archive and interpret — with substantial growth in predictive analytics, too. To see what we mean, just look yourself up on the website MyLife and marvel at how much information has been cobbled together from different moments in your life for anyone to see at the click of a button. Even speaking in hushed tones to a friend at a crowded cafe might not be enough to protect your obscurity if cameras are someday equipped with lip-reading artificial intelligence software. Obscurity is vital to our well-being for several reasons. It gives us breathing room to go about our daily routines with little fear of being judged, sent unwanted ads, gossiped about or needlessly shamed.
Algorithms are being used everywhere: in credit decisions, mortgages, insurance rates, who gets a job, which kids get into college, and how long criminal defendants go to prison, to name a few proliferating examples. Messy, complicated human decisions are being made, typically without an explanation or a chance to appeal, by artificial intelligence systems. They provide efficiency, profitability, and, often, a sense of scientific precision and authority. The problem is that this authority has been bestowed too hastily. Algorithms are increasingly found to be making mistakes. Whether it’s a sexist hiring algorithm developed by Amazon, conspiracy theories promoted by the Google search engine or an IBM facial-recognition program that didn’t work nearly as well on black women as on white men, we’ve seen that large companies that pride themselves on their technical prowess are having trouble navigating this terrain. The Democratic bill, introduced in the Senate and House of Representatives last week, would give the Federal Trade Commission power to require big companies to keep track of their algorithms and audit them for fairness and accuracy, and to monitor those procedures. It would apply only to companies with at least $50 million in annual revenue and would pertain even if intellectual property rights are involved, although it appears the companies would have leeway in deciding whether to make the audits publicly available.