Linklist: November 30, 2021
😔 https://pessimistsarchive.org/ – "Welcome to Pessimists Archive, a project created to jog our collective memories about the hysteria, technophobia and moral panic that often greets new technologies, ideas and trends."
💸 Back when it was normal to advertise cocaine gadgets in magazines, 1970-1980
"In the 1970s, cocaine emerged as the fashionable new drug for entertainers and businesspeople. Cocaine seemed to be the perfect companion for a trip into the fast lane. It “provided energy” and helped people stay “up.”
At some American universities, the percentage of students who experimented with cocaine increased tenfold between 1970 and 1980. The height of drug use in the United States was in 1979, when one in 10 people used illegal drugs on a daily basis, according to the FDA. It was a glamorous party drug that fit in with late nights, loud music, and flashy fashion.
Large amounts of the drug were moving into the country from South America; it was cheap, and dealers took advantage by buying large quantities and mixing it with ammonia and baking soda to create an even cheaper, solid version called crack.
While the white powder was winding its way through rich parties, crack – solid, smokable, faster, and much more addictive – found its way into low-income communities.
While traditionally cocaine was a rich man’s drug (due to the large expense of a cocaine habit), by the late 1980s, cocaine was no longer thought of as the drug of choice for the wealthy. By then, it had the reputation of America’s most dangerous and addictive drug, linked with poverty, crime, and death."
"For some patients weathering a temporary crisis, the restful environment was all the treatment they needed, and they left after a short stay. For those suffering from more severe or chronic disorders, the hospital offered comfort and stability. The focus of treatment was on easing symptoms and providing structures that kept patients safe.
By all accounts, Mr. X thrived at Whitfield. He worked in the hospital’s greenhouse, tending to plants and flowers, and he revealed a surprising store of botanical knowledge. In his downtime he played cards with other patients and with staff. He had a knack for complicated games like bridge.
Knowing the names of things is semantic knowledge; knowing how to do things is procedural knowledge. These parts of Mr. X’s mental functioning were intact. What was missing were his autobiographical memories. And without them, who was he? A skilled bridge player who couldn’t remember how or when he’d learned the game; a gardener with no recollection of who’d taught him the names of flowers or which varieties grew in his mother’s yard.
Mr. X spent hours in the hospital’s library, reading every newspaper and magazine he could get his hands on. He told his doctors that he was looking for something that might jog his memory, something that felt familiar. Nothing ever did. He spoke with a genteel Southern accent, which suggested that he’d had some education in his life, or at least had grown up among educated people. Those people—his people—could tell Mr. X who he was. But no one came to Whitfield to claim him."
"“On a bright, cold morning in early February 1980, Jeffrey Flier, a tall, mustachioed young physician, boarded a train in Boston on his way to New Haven to carry out a distinctly disagreeable professional task.” So began a New York Times Magazine story, dated November 1, 1981, entitled, “A Fraud That Shook the World of Science” (Hunt 1981). Three years following a move to Boston to set up my Harvard lab, this widely read story brought attention of a kind I wasn’t seeking, even though my role in the story was, to paraphrase Agatha Christie, to be detective chief inspector of the data. As the article vividly recounts, it was a tragic case of research fraud at Yale into which I was reluctantly drawn as an impartial expert auditor. My amateur forensic analysis revealed that all was not well in biomedical research, even in its highest echelons.
The story began with Helena Wachslicht-Rodbard, a young Brazilian physician who joined the Diabetes Branch at the National Institutes of Health (NIH) in 1977 as a visiting scientist. Her ambition was to be a physician-scientist in the area of metabolism, and joining the leading lab in the emerging insulin receptor field was a logical choice. We shared a lab on the 8th floor of NIH Building 10, where I was among those helping orient Helena to the techniques required for her work. Her aim was to quantitate insulin receptors on patients’ cells to see if their number was affected by clinical states such as diabetes. We collaborated, and published one paper together (Wachslicht-Rodbard et al. 1981).
One project that Helena undertook involved measuring receptors on cells of patients with anorexia nervosa. This disorder, in which young women restrict their food intake for reasons that remain poorly understood, threatens their health and often their lives. Severe starvation causes blood sugar and insulin levels to be very low. Helena’s research would test the hypothesis that in response to low blood insulin levels, the number of insulin receptors on their cells would increase. When I left the NIH in July 1978, Helena was well into her study.
In January 1980, I received a phone call from my NIH mentor Jesse Roth. We had had minimal contact over the prior 18 months, so we caught up on my new lab and on family. Then Jesse got to the point of his call. There was “a problem in the lab,” and he “knew that I was the perfect person” to help solve it. A tangled tale emerged. Helena had submitted her paper on insulin receptors in anorexia nervosa to the New England Journal of Medicine. Several weeks later the editorial decision arrived in the mail: the paper was rejected. One reviewer was positive; the other raised technical issues and thought the paper unsuitable for a general journal like the NEJM. Disappointing news for sure, but such rejections are a normal, if frustrating, part of scientific life. Helena and Jesse spent time considering where to resubmit the paper.
Several months later, Jesse was asked to review a manuscript for another journal, the American Journal of Medicine. Submitted by a group at Yale, the topic and results of the paper sounded remarkably similar to Helena’s rejected anorexia work. The paper arrived as he was leaving for an extended trip, and like many laboratory chiefs heading out on a lecture tour, he left the paper with a mentee to examine—in this case Helena—saying they would discuss it when he returned.
Upon seeing the title, Helena devoured the paper, and within minutes, she was infuriated. The data and conclusions closely paralleled her rejected paper, and it was evident that several paragraphs from her paper had been plagiarized, word for word. She deduced the authors of this paper must have reviewed her paper. Who were they? None other than a highly regarded group at Yale led by Philip Felig, a respected leader in metabolic research. In his mid-40s, Felig was on the fast track to leadership in academic medicine. Unknown to Helena, he was an acquaintance of Jesse Roth from their boyhood days in Brooklyn. Helena concluded that Felig and colleagues were academic criminals, and that justice should be swift."
"The year 1832 in France still conjures up images of rebellion and barricades thanks to the enduring pathos of Victor Hugo’s Les Misérables. For the real-life Parisians, however, who inspired the novel’s iconic characters, it was not only a year of lost causes, bloody street battles, and political disillusionment. It was also, in the parlance of our times, a “pandemic year” during which thousands — more than 18,000 in Paris; 100,000 across France1 — succumbed to a wave of cholera that had been causing havoc throughout Asia, Russia, and parts of East Central Europe since the 1820s. Although germ theory was still in its infancy at the time, people were quick to grasp the contagious nature of the disease and sought to speedily bury their dead as authorities scrambled desperately to meet the demands of an unprecedented public health crisis. A recent transplant to Paris, the German poet Heinrich Heine noted in a letter penned in mid-April 1832 — less than a month after the first recorded case of cholera in the French capital — the “disagreeable” sight of “great furniture wagons used for ‘moving’ now moving about as dead men’s omnibuses . . . going from house to house for fares and carrying them by dozens to the field of rest.”2
Parisians had been vaguely aware of the advancing wave of choléra-morbus, as the disease was then known, since at least 1830, when papers began regularly reporting on outbreaks in the Russian Empire’s eastern provinces.3 By early 1831, news of the ravages caused by the pandemic in Poland and East Prussia was already circulating by word of mouth in the capital. Legitimists (ultra-conservative supporters of the exiled Bourbon dynasty) imagined rootless political radicals spreading the disease with the same ease that they proselytised the “dangerous classes”, while the Church foresaw in the potential upheaval an “opportunity to renew its ties with a population that had shown itself unfaithful to the Catholic religion and the Bourbon dynasty”.4 Medical authorities, meanwhile, were certain that “the topographical situation of France is so advantageous that there is little to fear in this country from choléra-morbus or any other pestilential epidemic”.5
It is less certain what ordinary Parisians made of the alarming news, but those with money and time to spend on leisure turned to popular entertainment for comfort and comic relief. The early 1830s saw a spectacular resurgence of satire in France — the sort that could be both brilliant and crass, as the drawings of Honoré Daumier attest — but what is usually forgotten is that not all satire was printed for an exclusively reading public. Much was meant for the stage, particularly in 1831 when the July Monarchy’s promises of free speech did not yet ring entirely hollow.6 Only the previous year, French theatre had experienced one of the most tumultuous episodes in its venerable history with the so-called Battle of Hernani, a skirmish — at times quite physical — between the Romantic rebels congregated around Victor Hugo and the classicist fogeys who saw in Romanticism only the glorification of deformity and vulgarity.7 That was, however, the “serious” theatre of the Comédie Française which, generally speaking, only the educated bourgeoisie and aristocracy had any real interest in, as Hugo himself was forced to concede.8 A petit-bourgeois office clerk or shop assistant with no particular interest in the rarefied culture wars of the day — the sort that Balzac, and later Flaubert, caricatured without mercy — was much more likely to go for a one-act farce at one of Paris’ several vaudeville theatres."
"A scenario in which harmful pathogens could be hoarded and used as bargaining chips by countries that could unleash a pandemic on the world is sadly not the stuff of fiction. In fact, under certain interpretations of an international agreement generally known as the Nagoya Protocol, countries could choose to keep pathogen data and samples for themselves. This potential scenario, and the need for quick sharing of information, is being considered as part of the World Health Assembly pandemic preparedness discussions.
The Nagoya Protocol is a supplement to the Convention on Biological Diversity. Its stated aim is to enable countries to preserve biodiversity and share in any benefits derived from the use of their “genetic resources” — be they plants, fungi, or various forms of wildlife. It’s a laudable goal.
But several nations have interpreted the Nagoya Protocol to extend to pathogens, and enacted policies that impede sharing either samples of pathogens or data about them even when doing so would save lives.
During an outbreak of Middle East respiratory syndrome (MERS) that began in 2012, Saudi Arabia refused to share samples of the virus with researchers. A similar instance of pathogen withholding occurred after an Ebola outbreak began in western Africa near the end of 2013. In each of these cases, the scientific community’s ability to contain outbreaks, track the spread of disease, and treat patients was impeded."
"The basic idea of proof of stake is fairly simple:
Instead of buying mining equipment for $1000, nodes can lock up $1000 of cryptocurrency worth (“staking”)
Instead of indicating which blocks are valuable by mining on top of them, they can just vote for them on the network, and sign this with a digital signature
Instead of having the block that had the most mining done on it win, the block that had the most votes will win
If nodes misbehave, instead of losing their rewards from the day’s work, they will literally lose their entire stake - as if their entire mining rig farm burned down in a proof of work system.
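The four mechanics above can be sketched in a few dozen lines. This is a hypothetical toy model, not any real protocol: the class names, the single-round voting, and the rule that equivocating (voting for two conflicting blocks) burns the whole stake are all simplifying assumptions made for illustration.

```python
# Toy proof-of-stake round: validators lock a stake, vote for blocks,
# the block with the most stake-weighted votes wins, and a validator
# caught voting for two conflicting blocks loses its entire stake.
from collections import defaultdict


class Validator:
    def __init__(self, name, stake):
        self.name = name
        self.stake = stake      # the locked-up deposit ("staking")
        self.slashed = False


class PoSRound:
    def __init__(self, validators):
        self.validators = {v.name: v for v in validators}
        self.votes = {}         # validator name -> block id voted for

    def vote(self, name, block_id):
        v = self.validators[name]
        if v.slashed:
            return              # slashed validators can no longer vote
        if name in self.votes and self.votes[name] != block_id:
            # Equivocation: the canonical slashable offence. Unlike
            # proof of work, the penalty is the whole deposit, not
            # just forgone rewards.
            v.slashed = True
            v.stake = 0
            del self.votes[name]
            return
        self.votes[name] = block_id

    def winner(self):
        # Tally votes weighted by stake, not by head count.
        tally = defaultdict(int)
        for name, block_id in self.votes.items():
            tally[block_id] += self.validators[name].stake
        return max(tally, key=tally.get) if tally else None


alice, bob, carol = Validator("alice", 1000), Validator("bob", 1500), Validator("carol", 600)
rnd = PoSRound([alice, bob, carol])
rnd.vote("alice", "B1")
rnd.vote("bob", "B2")
rnd.vote("bob", "B1")           # bob equivocates and is slashed
rnd.vote("carol", "B1")
print(rnd.winner(), bob.stake)  # B1 wins; bob's stake is gone
```

Note what the sketch deliberately leaves out, which is exactly the promoters' gap described next: who is allowed to propose blocks, how voters learn about them, and how anyone proves equivocation happened.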
Promoters will then argue that, because these incentives are equal or superior to those of proof of work (this is true), it is also an equally strong or superior system to proof of work (this is a lie). Their problem is that it is not sufficient to write a wish list of incentives, because they also have to create the system that puts them in place.
To use an analogy, it is as if someone would sit down to design a building in the following way: first, they draw how they would like for the exterior to look. Then, they draw how they would like for the interior to look. They make basic measurements, to confirm that the interior does not exceed the exterior in terms of dimensions. They then suggest that the house is plausible, and send it off to the construction workers to build.
Of course, they are missing the most important part: the structural system of beams and load-bearing walls that ensures the building continues being a building! Our heroes have to lay out, in practice, how their system would work, and this is where the fun starts."
"This photograph shows men making pig iron at a place called the Iroquois smelter in Chicago sometime between 1890 and 1901. Molten iron was poured via a central channel to fill the small, regular trenches on the ground to form ingots. Once cooled, the ingots were broken apart from one another and readied for transport or storage.
The smelter's brick structure has church-like arches, and the beams of light flowing through smoke suggest burning incense. The beauty is undeniable.
But this image also contains evil. The same smoke that gives form to light was undoubtedly life-shortening for workers at the smelter.
I’m reminded of this passage from Robert Adams’s essay Photographing Evil from his book Beauty in Photography:
"When we are young, we want art that is filled with the bitter facts, because we believe that evil can be overcome if we face it; when we grow older and begin to doubt this optimistic belief, we want art that does not simply reinforce the pain of our disillusionment.”
Photographs like this one meet the requirements of young and old alike; "[they] urge reform, but seem to suggest that the need for it is not the most important thing to be said of life.""
"Like many fading species, the semicolon has been usurped by a more violent, adaptable rival. “We live,” says Cecelia Watson, author of Semicolon, “in the Era of the Dash.” It’s an accurate observation. The dash is brutal, unsympathetic, slashing, and impossible to be confused about — it takes you, frictionlessly, to the point. The dash invites no ambiguity and wastes no time. The punctuational equivalent of tearing a heart from a chest.
To read back those writers who put the semicolon to best use, like Jane Austen, or Virginia Woolf, is to enter a world where time is abundant. The OED describes the semicolon as “indicating a pause… more pronounced than that indicated by a comma.” The pronounced pause is where the real beauty of a semicolon lies. You can feel it in one of Woolf’s greatest sentences, as she describes Clarissa Dalloway hearing Big Ben: “First a warning, musical; then the hour, irrevocable.”
Put a dash where the semicolon is and a whole moment in time and feeling would evaporate. The loss of such moments is what the end of the semicolon signals."
"Reality shifting (RS) is a trendy mental activity that emerged abruptly following the flare-up of the COVID-19 pandemic in 2020 and seems to be practiced mainly by members of the post-millennial generation. RS, described as the experience of being able to transcend one’s physical confines and visit alternate, mostly fictional, universes, is discussed by many on Internet platforms. One RS forum boasts over 40,000 members and RS clips on some social media platforms have been viewed over 1.7 billion times. The experience of shifting is reportedly facilitated by specific induction methods involving relaxation, concentration of attention, and autosuggestion. Some practitioners report a strong sense of presence in their desired realities, reified by some who believe in the concrete reality of the alternate world they shift to. One of the most popular alternate universes involves environments adopted from the Harry Potter book and film series. We describe the phenomenology of RS as reported online and then compare it to related phenomena such as hypnosis, tulpamancy, dissociation, immersive and maladaptive daydreaming, and lucid dreaming. We propose a theoretical model of interactive factors giving rise to RS, and conclude that it is an important, uninvestigated emerging phenomenon and propose future research directions."