The 21st century belongs to China

Why the new Silk Road threatens to end America’s economic dominance

Beijing is building a trans-Siberian railway system that rivals the Marshall Plan in its ambition and global reach

This piece originally appeared on TomDispatch.

BEIJING — Seen from the Chinese capital as the Year of the Sheep starts, the malaise affecting the West seems like a mirage in a galaxy far, far away. On the other hand, the China that surrounds you looks all too solid and nothing like the embattled nation you hear about in the Western media, with its falling industrial figures, its real estate bubble, and its looming environmental disasters. Prophecies of doom notwithstanding, as the dogs of austerity and war bark madly in the distance, the Chinese caravan passes by in what President Xi Jinping calls “new normal” mode.

“Slower” economic activity still means a staggeringly impressive annual growth rate of 7% in what is now the globe’s leading economy. Internally, an immensely complex economic restructuring is underway as consumption overtakes investment as the main driver of economic development. At 46.7% of the gross domestic product (GDP), the service economy has pulled ahead of manufacturing, which stands at 44%.

Geopolitically, Russia, India, and China have just sent a powerful message westward: they are busy fine-tuning a complex trilateral strategy for setting up a network of economic corridors the Chinese call “new silk roads” across Eurasia. Beijing is also organizing a maritime version of the same, modeled on the feats of Admiral Zheng He who, in the Ming dynasty, sailed the “western seas” seven times, commanding fleets of more than 200 vessels.

Meanwhile, Moscow and Beijing are at work planning a new high-speed rail remix of the fabled Trans-Siberian Railroad. And Beijing is committed to translating its growing strategic partnership with Russia into crucial financial and economic help, if a sanctions-besieged Moscow, facing a disastrous oil price war, asks for it.



To China’s south, Afghanistan, despite the 13-year American war still being fought there, is fast moving into its economic orbit, while a planned China-Myanmar oil pipeline is seen as a game-changing reconfiguration of the flow of Eurasian energy across what I’ve long called Pipelineistan.

And this is just part of the frenetic action shaping what the Beijing leadership defines as the New Silk Road Economic Belt and the Maritime Silk Road of the twenty-first century. We’re talking about a vision of creating a potentially mind-boggling infrastructure, much of it from scratch, that will connect China to Central Asia, the Middle East, and Western Europe. Such a development will include projects that range from upgrading the ancient silk road via Central Asia to developing a Bangladesh-China-India-Myanmar economic corridor; a China-Pakistan corridor through Kashmir; and a new maritime silk road that will extend from southern China all the way, in reverse Marco Polo fashion, to Venice.

Don’t think of this as the twenty-first-century Chinese equivalent of America’s post-World War II Marshall Plan for Europe, but as something far more ambitious and potentially with a far vaster reach.

China as a Mega-City

If you are following this frenzy of economic planning from Beijing, you end up with a perspective not available in Europe or the U.S. Here, red-and-gold billboards promote President Xi Jinping’s much ballyhooed new tagline for the country and the century, “the Chinese Dream” (which brings to mind “the American Dream” of another era). No subway station is without them. They are a reminder of why 40,000 miles of brand new high-speed rail is considered so essential to the country’s future. After all, no less than 300 million Chinese have, in the last three decades, made a paradigm-breaking migration from the countryside to exploding urban areas in search of that dream.

Another 350 million are expected to be on the way, according to a McKinsey Global Institute study. From 1980 to 2010, China’s urban population grew by 400 million, leaving the country with at least 700 million urban dwellers. This figure is expected to hit one billion by 2030, which means tremendous stress on cities, infrastructure, resources, and the economy as a whole, as well as near-apocalyptic air pollution levels in some major cities.

Already 160 Chinese cities boast populations of more than one million. (Europe has only 35.) No less than 250 Chinese cities have tripled their GDP per capita since 1990, while disposable income per capita is up by 300%.

These days, China should be thought of not in terms of individual cities but urban clusters — groupings of cities with more than 60 million people. The Beijing-Tianjin area, for example, is actually a cluster of 28 cities. Shenzhen, the ultimate migrant megacity in the southern province of Guangdong, is now a key hub in a cluster as well. China, in fact, has more than 20 such clusters, each the size of a European country. Pretty soon, the main clusters will account for 80% of China’s GDP and 60% of its population. So the country’s high-speed rail frenzy and its head-spinning infrastructure projects – part of a $1.1 trillion investment in 300 public works — are all about managing those clusters.

Not surprisingly, this process is intimately linked to what in the West is considered a notorious “housing bubble,” which couldn’t even have existed before 1998, when all housing was still owned by the state. Once liberalized, the housing market sent a surging Chinese middle class into paroxysms of investment. Yet with rare exceptions, middle-class Chinese can still afford their mortgages, because both rural and urban incomes have also surged.

The Chinese Communist Party (CCP) is, in fact, paying careful attention to this process, allowing farmers to lease or mortgage their land, among other things, and so finance their urban migration and new housing. Since we’re talking about hundreds of millions of people, however, there are bound to be distortions in the housing market, even the creation of whole disastrous ghost towns with associated eerie, empty malls.

The Chinese infrastructure frenzy is being financed by a pool of investments from central and local government sources, state-owned enterprises, and the private sector. The construction business, one of the country’s biggest employers, involves more than 100 million people, directly or indirectly. Real estate accounts for as much as 22% of total national investment in fixed assets and all of this is tied to the sale of consumer appliances, furnishings, and an annual turnover of 25% of China’s steel production, 70% of its cement, 70% of its plate glass, and 25% of its plastics.

So no wonder, on my recent stay in Beijing, businessmen kept assuring me that the ever-impending “popping” of the “housing bubble” is, in fact, a myth in a country where, for the average citizen, the ultimate investment is property. In addition, the vast urbanization drive ensures, as Premier Li Keqiang stressed at the recent World Economic Forum in Davos, a “long-term demand for housing.”

Markets, Markets, Markets

China is also modifying its manufacturing base, which increased by a multiple of 18 in the last three decades. The country still produces 80% of the world’s air conditioners, 90% of its personal computers, 75% of its solar panels, 70% of its cell phones, and 63% of its shoes. Manufacturing accounts for 44% of Chinese GDP, directly employing more than 130 million people. In addition, the country already accounts for 12.8% of global research and development, well ahead of England and most of Western Europe.

Yet the emphasis is now switching to a fast-growing domestic market, which will mean yet more major infrastructural investment, the need for an influx of further engineering talent, and a fast-developing supplier base. Globally, as China starts to face new challenges — rising labor costs, an increasingly complicated global supply chain, and market volatility — it is also making an aggressive push to move low-tech assembly to high-tech manufacturing. Already, the majority of Chinese exports are smartphones, engine systems, and cars (with planes on their way). In the process, a geographic shift in manufacturing is underway from the southern seaboard to Central and Western China. The city of Chengdu in the southwestern province of Sichuan, for instance, is now becoming a high-tech urban cluster as it expands around firms like Intel and HP.

So China is boldly attempting to upgrade in manufacturing terms, both internally and globally at the same time. In the past, Chinese companies have excelled in delivering the basics of life at cheap prices and acceptable quality levels. Now, many companies are fast upgrading their technology and moving up into second- and first-tier cities, while foreign firms, trying to lessen costs, are moving down to second- and third-tier cities. Meanwhile, globally, Chinese CEOs want their companies to become true multinationals in the next decade. The country already has 73 companies in the Fortune Global 500, leaving it in the number two spot behind the U.S.

In terms of Chinese advantages, keep in mind that the future of the global economy clearly lies in Asia with its record rise in middle-class incomes. In 2009, the Asia-Pacific region had just 18% of the world’s middle class; by 2030, according to the Development Center of the Organization for Economic Cooperation and Development, that figure will rise to an astounding 66%. North America and Europe had 54% of the global middle class in 2009; in 2030, it will only be 21%.

Follow the money, and the value you get for that money, too. For instance, no less than 200,000 Chinese workers were involved in the production of the first iPhone, overseen by 8,700 Chinese industrial engineers. They were recruited in only two weeks. In the U.S., that process might have taken more than nine months. The Chinese manufacturing ecosystem is indeed fast, flexible, and smart — and it’s backed by an ever more impressive education system. Since 1998, the percentage of GDP dedicated to education has almost tripled; the number of colleges has doubled; and in only a decade, China has built the largest higher education system in the world.

Strengths and Weaknesses

China holds more than $15 trillion in bank deposits, which are growing by a whopping $2 trillion a year. Foreign exchange reserves are nearing $4 trillion. A definitive study of how this torrent of funds circulates within China among projects, companies, financial institutions, and the state still does not exist. No one really knows, for instance, how many loans the Agricultural Bank of China actually makes. High finance, state capitalism, and one-party rule all mix and meld in the realm of Chinese financial services where realpolitik meets real big money.

The big four state-owned banks — the Bank of China, the Industrial and Commercial Bank of China, the China Construction Bank, and the Agricultural Bank of China — have all evolved from government organizations into semi-corporate state-owned entities. They benefit handsomely both from legacy assets and government connections, or guanxi, and operate with a mix of commercial and government objectives in mind. They are the drivers to watch when it comes to the formidable process of reshaping the Chinese economic model.

As for China’s debt-to-GDP ratio, it’s not yet a big deal. In a list of 17 countries, it lies well below those of Japan and the U.S., according to Standard Chartered Bank, and unlike in the West, consumer credit is only a small fraction of total debt. True, the West exhibits a particular fascination with China’s shadow banking industry: wealth management products, underground finance, off-the-balance-sheet lending. But such operations only add up to around 28% of GDP, whereas, according to the International Monetary Fund, it’s a much higher percentage in the U.S.

China’s problems may turn out to come from non-economic areas where the Beijing leadership has proven far more prone to false moves. It is, for instance, on the offensive on three fronts, each of which may prove to have its own form of blowback: tightening ideological control over the country under the rubric of sidelining “Western values”; tightening control over online information and social media networks, including reinforcing “the Great Firewall of China” to police the Internet; and further tightening its control over restive ethnic minorities, especially the Uighurs in the key western province of Xinjiang.

On two of these fronts — the “Western values” controversy and Internet control — the leadership in Beijing might reap far more benefits, especially among the vast numbers of younger, well educated, globally connected citizens, by promoting debate, but that’s not how the hyper-centralized Chinese Communist Party machinery works.

When it comes to those minorities in Xinjiang, the essential problem may not be with the new guiding principles of President Xi’s ethnic policy. According to Beijing-based analyst Gabriele Battaglia, Xi wants to manage ethnic conflict there by applying the “three Js”: jiaowang, jiaoliu, jiaorong (“inter-ethnic contact,” “exchange,” and “mixage”). Yet what adds up to a push from Beijing for Han/Uighur assimilation may mean little in practice when day-to-day policy in Xinjiang is conducted by unprepared Han cadres who tend to view most Uighurs as “terrorists.”

If Beijing botches the handling of its Far West, Xinjiang won’t, as expected, become the peaceful, stable, new hub of a crucial part of the silk-road strategy. Yet it is already considered an essential communication link in Xi’s vision of Eurasian integration, as well as a crucial conduit for the massive flow of energy supplies from Central Asia and Russia. The Central Asia-China pipeline, for instance, which brings natural gas from the Turkmen-Uzbek border through Uzbekistan and southern Kazakhstan, is already adding a fourth line to Xinjiang. And one of the two newly agreed upon Russia-China pipelines will also arrive in Xinjiang.

The Book of Xi

The extent and complexity of China’s myriad transformations barely filter into the American media. Stories in the U.S. tend to emphasize the country’s “shrinking” economy and nervousness about its future global role, the way it has “duped” the U.S. about its designs, and its nature as a military “threat” to Washington and the world.

The U.S. media has a China fever, which results in typically feverish reports that don’t take the pulse of the country or its leader. In the process, so much is missed. One prescription might be for them to read The Governance of China, a compilation of President Xi’s major speeches, talks, interviews, and correspondence. It’s already a three-million-copy bestseller in its Mandarin edition and offers a remarkably digestible vision of what Xi’s highly proclaimed “China Dream” will mean in the new Chinese century.

Xi Dada (“Xi Big Bang” as he’s nicknamed here) is no post-Mao deity. He’s more like a pop phenomenon, and that’s hardly surprising. In this “to get rich is glorious” remix, you couldn’t launch the superhuman task of reshaping the Chinese model as a cool-as-a-cucumber bureaucrat. Xi has instead struck a collective nerve by stressing that the country’s governance must be based on competence, not insider trading and Party corruption, and he’s cleverly packaged the transformation he has in mind as an American-style “dream.”

Behind the pop star clearly lies a man of substance that the Western media should come to grips with. You don’t, after all, manage such an economic success story by accident. It may be particularly important to take his measure since he’s taken the measure of Washington and the West and decided that China’s fate and fortune lie elsewhere.

As a result, last November he made official an earthshaking geopolitical shift. From now on, Beijing would stop treating the U.S. or the European Union as its main strategic priority and refocus instead on China’s Asian neighbors and fellow BRICS countries (Brazil, Russia, India, and South Africa, with a special focus on Russia), also known here as the “major developing powers” (kuoda fazhanzhong de guojia). And just for the record, China does not consider itself a “developing country” anymore.

No wonder there’s been such a blitz of Chinese mega-deals and mega-dealings across Pipelineistan recently. Under Xi, Beijing is fast closing the gap on Washington in terms of intellectual and economic firepower and yet its global investment offensive has barely begun, new silk roads included.

Singapore’s former foreign minister George Yeo sees the newly emerging world order as a solar system with two suns, the United States and China. The Obama administration’s new National Security Strategy affirms that “the United States has been and will remain a Pacific power” and states that “while there will be competition, we reject the inevitability of confrontation” with Beijing. The “major developing powers,” intrigued as they are by China’s extraordinary infrastructural push, both internally and across those New Silk Roads, wonder whether a solar system with two suns might not be a non-starter. The question then is: Which “sun” will shine on Planet Earth?  Might this, in fact, be the century of the dragon?

Curing the fear of death

How “tripping out” could change everything

A chemical called “psilocybin” shows remarkable therapeutic promise. Only problem? It comes from magic mushrooms

 


The second time I ate psychedelic mushrooms I was at a log cabin on a lake in northern Maine, and afterwards I sat in a grove of spruce trees for three and a half hours, saying over and over, “There’s so much to see!”

The mushrooms converted my worldview from an uninspired blur to childlike wonderment at everything I glimpsed. And now, according to recent news, certain cancer patients are having the same experience. The active ingredient in psychedelic mushrooms, psilocybin, is being administered on a trial basis to certain participating cancer patients to help them cope with their terminal diagnosis and enjoy the final months of their lives. The provisional results show remarkable success, with implications that may be much, much bigger.

As Michael Pollan notes in a recent New Yorker piece, this research is still in its early stages. Psychedelic mushrooms are presently classified as a Schedule I drug, meaning, from the perspective of our federal government, they have no medical use and are prohibited. But the scientific community is taking some steps that – over time, and after much deliberation – could eventually change that.

Here’s how it works: In a controlled setting, cancer patients receive psilocybin plus coaching to help them make the most of the experience. Then they trip, an experience that puts ordinary life, including their cancer, in a new perspective. And that changed outlook stays with them over time. This last part might seem surprising, but at my desk I keep a picture of the spot where I had my own transcendental experience several years ago; it reminds me that my daily tribulations are not all there is to existence, nor are they what actually matter.

The preliminary research findings are convincing. You could even call them awe-inspiring. In one experiment, an astounding two-thirds of participants said the trip was “among the top five most spiritually significant experiences of their lives.” Pollan describes one cancer patient in detail, a man whose psilocybin session was followed by months that were “the happiest in his life” — even though they were also his last. Said the man’s wife: “[After his trip] it was about being with people, enjoying his sandwich and the walk on the promenade. It was as if we lived a lifetime in a year.”



Which made me do a fist pump for science: Great work, folks. Keep this up! Researchers point out that these studies are small and there’s plenty they don’t know. They also stress the difference between taking psilocybin in a clinical setting — one that’s structured and facilitated by experts — and taking the drug recreationally. (By a lake in Maine, say.) Pollan suggests that the only commonality between the two is the molecules being ingested. My (admittedly anecdotal) experience suggests matters aren’t quite that clear-cut. But even that distinction misses a larger point, which is the potential for this research to help a great many people, with cancer or without, to access a deeper sense of joy in their lives. The awe I felt by that lake in Maine — and the satisfaction and peacefulness that Pollan’s cancer patient felt while eating his sandwich and walking on the promenade — is typically absent from regular life. But that doesn’t mean it has to be.

The growing popularity of mindfulness and meditation suggests that many of us would like to inject a bit more wonder into our lives. As well we should. Not to be a damp towel or anything, but we’re all going to die. “We’re all terminal,” as one researcher said to Pollan. While it’s possible that you’ll live to be 100, and hit every item on your bucket list, life is and always will be uncertain. On any given day, disaster could strike. You could go out for some vigorous exercise and suffer a fatal heart attack, like my dad did. There’s just no way to know.

In the meantime, most of us are caught in the drudgery of to-do lists and unread emails. Responsibility makes us focus on the practical side of things — the rent isn’t going to pay itself, after all — while the force of routine makes it seem like there isn’t anything dazzling to experience anyhow. Even if we’d like to call carpe diem our motto, what we actually do is more along the lines of the quotidian: Work, commute, eat, and nod off to sleep.

With that for a backdrop, it’s not surprising that many of us experience angst about our life’s purpose, not to mention a deep-seated dread over the unavoidable fact of our mortality. It can be a wrenching experience, one that sometimes results in panic attacks or depression. We seek out remedies to ease the discomfort: Some people meditate, others drink. If you seek formal treatment, though, you’ll find that the medical establishment doesn’t necessarily consider existential dread to be a disorder. That’s because it’s normal for us to question our existence and fear our demise. In the case of debilitating angst, though, a doctor is likely to recommend the regimen for generalized anxiety — some combo of therapy and meds.

Both of these can be essential in certain cases, of course; meds tend to facilitate acceptance of the way things are, while therapy can help us, over a long stretch of time, change the things that we can to some degree control. But psychedelics are different from either of these. They seem to open a door to a different way of experiencing life. Pollan quotes one source, a longtime advocate for the therapeutic use of psilocybin, who identifies the drug’s potential for “the betterment of well people.” Psychedelics may help ordinary people, who are wrestling with ordinary angst about death and the meaning of life, to really key into, and treasure, the various experiences of their finite existence.

In other words, psychedelics could possibly help us to be more like kids.

Small children often view the world around them with mystic wonder — pushing aside blades of grass to inspect a tiny bug that’s hidden underneath, or perhaps looking wide-eyed at a bright yellow flower poking through a crack in the sidewalk. (Nothing but a common dandelion, says the adult.) Maybe the best description of psilocybin’s effect is a reversion to that childlike awe at the complexity of the world around us, to the point that we can actually relish our lives.

What’s just as remarkable is that we’re not talking about a drug that needs to be administered on a daily or weekly or even monthly basis in order to be effective. These studies gave psilocybin to cancer patients a single time. Then, for months afterward, or longer, the patients reaped enormous benefit.

(The fact that psychedelics only need to be administered once could actually make it less likely that the research will receive ample funding, because pharmaceutical companies don’t see dollar signs in a drug that’s dispensed so sparingly. But that’s another matter.)

Of course, some skepticism may be warranted. Recreational use of psychedelics has been associated with psychotic episodes. That’s a good reason for caution. And a potential criticism here is that psilocybin is doing nothing more than playing a hoax on the brain — a hoax that conjures up a mystical experience and converts us into spellbound kids. You might reasonably ask, “Do I even want to wander around awe-struck at a dandelion the same way a 3-year-old might?”

So caution is reasonably advised. But what the research demonstrates is nonetheless remarkable: the way the experience seems to shake something loose in participants’ consciousness, something that lets them see beyond the dull gray of routine, or the grimness of cancer, to the joy in being with loved ones, the sensory pleasure of a good meal, or the astounding pink visuals of the sunset.

 

Are America’s High Rates of Mental Illness Actually Based on Sham Science?


The real purpose behind many of these statistics is to change our attitudes and political positions.

About one in five American adults (18.6%) has a mental illness in any given year, according to recent statistics from the National Institute of Mental Health. This statistic has been widely reported with alarm and concern. It’s been used to back up demands for more mental health screening in schools, more legislation to forcibly treat the unwilling, more workplace psychiatric interventions, and more funding for the mental health system. And of course, personally, whenever we or someone we know is having an emotional or psychological problem, we now wonder, is it a mental illness requiring treatment? If one in five of us have one….

But what NIMH quietly made disappear from its website is the fact that this number actually represented a dramatic drop. “An estimated 26.2 percent of Americans ages 18 and older — about one in four adults — suffer from a diagnosable mental disorder in a given year,” the NIMH website can still be found to say in Archive.org’s Wayback Machine. Way back, that is, in 2013.

A reduction in the prevalence of an illness by nearly eight percentage points of America’s population—25 million fewer sufferers in a single year—is extremely significant. So isn’t that the real story? And isn’t it also important that India recently reported that mental illnesses affect 6.5% of its population, roughly one-third the US rate?

And that would be the real story, if any of these statistics were even remotely scientifically accurate or valid. But they aren’t. They’re nothing more than manipulative political propaganda.

Pharmaceutical companies fund the tests

First, that 18.6% comprises a smaller group who have “serious” mental illness and are functionally impaired (4.1%), and a much larger group who are “mildly to moderately” mentally ill and not functionally impaired by it. Already, we have to wonder how significant many of these “mental illnesses” are, if they don’t impair someone’s functioning at all.

NIMH also doesn’t say how long these illnesses last. We only know that, sometime in the year, 18.6% of us met criteria for a mental illness of some duration. But if some depressions or anxieties last only a week or a month, then it’s possible that at any given time as few as 1-2% of the population are mentally ill. That’s a much less eye-popping number that, critics like Australian psychiatrist Jon Jureidini argue, is more accurate.
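The gap between annual and point prevalence is easy to sketch. The episode length below is an illustrative assumption of mine, not a figure NIMH reports, and the model crudely treats episodes as non-overlapping and evenly spread through the year:

```python
# Back-of-envelope conversion from annual prevalence to point prevalence.
# episode_weeks is an assumed average episode length, for illustration only.
annual_prevalence = 0.186   # NIMH: met criteria for a mental illness sometime in the year
episode_weeks = 4           # assumption: average episode lasts about a month

point_prevalence = annual_prevalence * episode_weeks / 52
print(f"{point_prevalence:.1%}")  # -> 1.4%
```

Under that (admittedly rough) assumption, the share of people mentally ill at any given moment lands right in the 1-2% range the critics suggest.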

But even that number may be overblown. That’s because these national-level statistics come from surveys of the general population using mental health screening questionnaires that produce extremely high “false positive” rates.

Virtually all of the screening tools have been designed by people, institutions or companies that profit from providing mental health treatments. The Kutcher Adolescent Depression Scale, for example, will “find” mental illnesses wrongly about seven times as often as it finds them correctly. The screening tool’s author, psychiatrist Stan Kutcher, has taken money from over a dozen pharmaceutical companies. He also co-authored the massively influential study that promoted the antidepressant Paxil as safe and effective for depression in children – a study which, according to a $3 billion US Justice Department settlement with GlaxoSmithKline, had actually secretly found that Paxil was ineffective and unsafe for children. Similarly, the widely used PHQ-9 and GAD-7 adult mental health questionnaires were created by the pharmaceutical company Pfizer.

This year’s NIMH numbers came from population surveys conducted by the Substance Abuse and Mental Health Services Administration (SAMHSA) through its National Survey on Drug Use and Health, which included the Kessler-6 screening tool as a central component — the author of which, Ronald C. Kessler, has received funding from numerous pharmaceutical companies. How misleading is the Kessler-6? It has just six questions. “During the past 30 days, about how often did you feel: 1) nervous, 2) worthless, 3) hopeless, 4) restless or fidgety, 5) that nothing could cheer you up, or 6) that everything was an effort?” For each, responses range from “none of the time” to “all of the time.” If you answer that for “some of the time” over the past month you felt five of those six emotions, then that’s typically enough for a score of 15 and a diagnosis of mild to moderate mental illness. That may sound like the Kessler-6 is a fast way to diagnose as “mentally ill” a lot of ordinary people who are really just occasionally restless, nervous, despairing about the state of the world, and somewhat loose in how they define “some of the time” in a phone survey.

And indeed, that’s exactly what it is.

How 80% accuracy leads to 20 times as much mental illness

Under optimal conditions, the best mental health screening tools like the Kessler-6 have sometimes been rated at a sensitivity of 90% and specificity of 80%. Sensitivity is the rate at which people who have a disease are correctly identified as ill. Specificity is the rate at which people who don’t have a disease are correctly identified as disease-free. Many people assume 90% sensitivity and 80% specificity mean that a test will be wrong around 10-20% of the time. But the accuracy depends on the prevalence of the illness being screened for. So for example if you’re trying to find a few needles in a big haystack, and you can distinguish needles from hay with 90% accuracy, how many stalks of hay will you wrongly identify as needles?

The answer is: a lot of hay. With a 10% prevalence rate of mental illness among 1,000 people, the arithmetic is straightforward. Of the 100 who are mentally ill, a 90% sensitive test will correctly identify 90. Not too bad. However, at 80% specificity, of the 900 who are well, 180 will be wrongly identified as mentally ill. Ultimately, then, our test will flag 270 people out of 1,000 as mentally ill, nearly tripling the 10% prevalence we started with to 27%. And if mental illnesses are less prevalent, the test performs mathematically worse: when only 10 in 1,000 are mentally ill, it will flag more than twenty times that many.
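That haystack arithmetic can be multiplied out directly. This is just a sketch of the base-rate calculation described above; the `screen` helper is my own illustration, not anything from the screening literature:

```python
def screen(population, prevalence, sensitivity, specificity):
    """Return (true positives, false positives) for a screening test."""
    ill = population * prevalence
    well = population - ill
    true_pos = ill * sensitivity            # ill people correctly flagged
    false_pos = well * (1 - specificity)    # well people wrongly flagged
    return round(true_pos), round(false_pos)

# 10% true prevalence: 90 true + 180 false positives = 270 flagged (27%)
tp, fp = screen(1000, 0.10, 0.90, 0.80)
print(tp, fp, tp + fp)   # 90 180 270

# 1% true prevalence: 9 true + 198 false positives = 207 flagged,
# over twenty times the 10 people who are actually ill
tp, fp = screen(1000, 0.01, 0.90, 0.80)
print(tp, fp, tp + fp)   # 9 198 207
```

Note that the test itself never changes between the two runs; only the rarity of the condition does, and the false positives from the large healthy pool swamp the true positives.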

Mental illness diagnosing is a scientific bottomless pit

This is a common problem with most medical screening tests. They are typically calibrated to miss as few ill people as possible, but consequently they also scoop up a lot of healthy people, who then become anxious or depressed while being subjected to increasingly invasive follow-up tests or unnecessary, dangerous treatments. That’s why even far more reliable screening programs, such as mammography, cholesterol measuring, and annual “physicals,” are coming under increasing criticism.

The designers of mental health screening tools acknowledge all this in the scientific literature, if not often openly to the general public. As explained deep in their report, SAMHSA tried to compensate for the Kessler-6’s false positive rates; however, the main method they used was to give a sub-sample of their participants a Structured Clinical Interview for DSM Disorders (SCID).

SCID is the “gold standard” for diagnosing mental illnesses in accordance with the Diagnostic and Statistical Manual of Mental Disorders, SAMHSA stated. In fact, SCID simply employs a much larger number of highly subjective questions designed to divide people into more specific diagnoses. For example, the SCID asks if there’s ever been “anything that you have been afraid to do or felt uncomfortable doing in front of other people, like speaking, eating, or writing.” Answering “yes” puts you on a fast path to having anxiety disorder with social phobia. Have you ever felt “like checking something several times to make sure that you’d done it right?” You’re on your way to an obsessive compulsive disorder diagnosis.

That’s why SCID actually isn’t any more reliable than the Kessler-6, according to Ronald Kessler. He should know; Harvard University’s Kessler is author of the Kessler-6 and co-author of the World Health Organization’s popular screening survey, the World Mental Health Composite International Diagnostic Interview (WMH-CIDI). In their scientific report on the development of the WMH-CIDI, Kessler’s team explained that they simply abandoned the whole idea of trying to create a mental health screening tool that was “valid” or “accurate.”

The underlying problem, they wrote, is that, unlike with cancer, there’s no scientific way to definitively determine the absence of any mental illnesses and thereby verify the accuracy of a screening tool. “As no clinical gold standard assessment is available,” Kessler et al wrote, “we adopted the goal of calibration rather than validation; that is, we asked whether WMH-CIDI diagnoses are ‘consistent’ with diagnoses based on a state-of-the-art clinical research diagnostic interview [the SCID], rather than whether they are ‘correct’.” Essentially, creating an impression of scientific consensus between common screening and diagnostic tools was considered to be more important than achieving scientific accuracy with any one of them.

And where that “consensus” lies has shifted over time. Until the 1950s, it wasn’t uncommon to see studies finding that up to 80% of Americans were mentally ill. Throughout the ’90s, NIMH routinely assessed that 10% of Americans were mildly to seriously mentally ill. In 2000, the US Surgeon General’s report declared that the number was 20%, and the NIMH that year doubled its reported prevalence rates, too. In recent years, NIMH had steadily pushed its rate up to a high of 26.2%, but several months ago it dropped the figure to 18.6% to match the latest SAMHSA rate.

Suicide, mental illness, and other influential sham statistics

Yet as a society we don’t seem to care that there’s a scientific bottomless pit at the heart of all mental illness statistics and diagnosing. One example that highlights how ridiculously overblown, and yet influential, such epidemiological statistics have become is the claim that “over 90% of people who commit suicide are mentally ill.” This number is frequently promoted by the National Alliance on Mental Illness, the American Foundation for Suicide Prevention, the American Psychiatric Association, and the National Institute of Mental Health, and it has dominated public policy discussions about suicide prevention for years.

The statistic comes from “psychological autopsy” studies, in which friends or relatives of people who committed suicide complete common mental health screening questionnaires on behalf of the dead.

As researchers in the journal Death Studies exhaustively detailed in 2012, psychological autopsies are even less reliable than mental health screening tests administered under normal conditions. Researchers doing psychological autopsies typically don’t factor in false positive rates. They don’t account for the fact that questions about someone’s feelings and thoughts in the weeks leading up to suicide couldn’t possibly be reliably answered by someone else, and they ignore the extreme biases that would certainly exist in such answers coming from grieving friends and family. Finally, the studies often include suicidal thinking as itself a heavily weighted sign of mental illness, making their conclusions little more than a tautology: “Suicidal thinking is a strong sign of mental illness; therefore people who committed suicide had a strong likelihood of having been mentally ill.”

Unfortunately, framing suicidal feelings and other psychological challenges this way carries immense political significance, even if it carries little scientific substance. These alleged high rates of mental illness are becoming increasingly influential in policy discussions on issues as diverse as prison populations, troubled kids, pregnant and postpartum women, the homeless, gun violence, and the supposed vast numbers of untreated mentally ill. They draw attention, funding and resources into mental health services and treatments at the expense of other, arguably more important factors in people’s overall psychological wellness that we could be working on, such as poverty, social services, fragmented communities, and declining opportunities for involvement with nature, the arts, or self-actualizing work. At the individual level, we all become more inclined to suspect we might need a therapist or a pill for our troubles, where before we might have organized with others for political change.

And that reveals the real purpose behind many of these statistics: to change our attitudes and political positions. They are public relations efforts coming from extremely biased sources.

The politics of “mental illness”

Why is 18.6% the going rate of mental illnesses in America? SAMHSA’s report takes many pages to explain all the adjustments they made to arrive at the numbers they did. However, it’s easy to imagine why they’d avoid going much higher or lower. If SAMHSA scored 90% of us as mentally ill, how seriously would we take them? Conversely, imagine if they went with a cut-off score that determined only 0.3% were mentally ill, while the rest of us were just sometimes really, really upset. How would that affect public narratives on America’s mental health “crisis” and debates about the importance of expanding mental health programs?

However well-meaning, the professional mental health sector develops such statistics to create public concern and support for their positions, to steer people towards their services, and to coax money out of public coffers. These statistics are bluffs in a national game of political poker. The major players are always pushing the rates as high as possible, while being careful not to push them so high that others skeptically demand to see the cards they’re holding. This year, 18.6% is the bet.

The myth of pure science

It’s all about political, economic, religious interests

Scientific research can flourish only in alliance with some ideology. Even Darwin couldn’t have done it alone

Charles Darwin (Credit: Wikimedia/Salon)
Excerpted from “Sapiens”

The Ideal of Progress

Until the Scientific Revolution most human cultures did not believe in progress. They thought the golden age was in the past, and that the world was stagnant, if not deteriorating. Strict adherence to the wisdom of the ages might perhaps bring back the good old times, and human ingenuity might conceivably improve this or that facet of daily life. However, it was considered impossible for human know-how to overcome the world’s fundamental problems. If even Muhammad, Jesus, Buddha and Confucius – who knew everything there is to know – were unable to abolish famine, disease, poverty and war from the world, how could we expect to do so?

Many faiths believed that some day a messiah would appear and end all wars, famines and even death itself. But the notion that humankind could do so by discovering new knowledge and inventing new tools was worse than ludicrous – it was hubris. The story of the Tower of Babel, the story of Icarus, the story of the Golem and countless other myths taught people that any attempt to go beyond human limitations would inevitably lead to disappointment and disaster.

When modern culture admitted that there were many important things that it still did not know, and when that admission of ignorance was married to the idea that scientific discoveries could give us new powers, people began suspecting that real progress might be possible after all. As science began to solve one unsolvable problem after another, many became convinced that humankind could overcome any and every problem by acquiring and applying new knowledge. Poverty, sickness, wars, famines, old age and death itself were not the inevitable fate of humankind. They were simply the fruits of our ignorance.

A famous example is lightning. Many cultures believed that lightning was the hammer of an angry god, used to punish sinners. In the middle of the eighteenth century, in one of the most celebrated experiments in scientific history, Benjamin Franklin flew a kite during a lightning storm to test the hypothesis that lightning is simply an electric current. Franklin’s empirical observations, coupled with his knowledge about the qualities of electrical energy, enabled him to invent the lightning rod and disarm the gods.

Poverty is another case in point. Many cultures have viewed poverty as an inescapable part of this imperfect world. According to the New Testament, shortly before the crucifixion a woman anointed Christ with precious oil worth 300 denarii. Jesus’ disciples scolded the woman for wasting such a huge sum of money instead of giving it to the poor, but Jesus defended her, saying that ‘The poor you will always have with you, and you can help them any time you want. But you will not always have me’ (Mark 14:7). Today, fewer and fewer people, including fewer and fewer Christians, agree with Jesus on this matter. Poverty is increasingly seen as a technical problem amenable to intervention. It’s common wisdom that policies based on the latest findings in agronomy, economics, medicine and sociology can eliminate poverty.

And indeed, many parts of the world have already been freed from the worst forms of deprivation. Throughout history, societies have suffered from two kinds of poverty: social poverty, which withholds from some people the opportunities available to others; and biological poverty, which puts the very lives of individuals at risk due to lack of food and shelter. Perhaps social poverty can never be eradicated, but in many countries around the world biological poverty is a thing of the past.

Until recently, most people hovered very close to the biological poverty line, below which a person lacks enough calories to sustain life for long. Even small miscalculations or misfortunes could easily push people below that line, into starvation. Natural disasters and man-made calamities often plunged entire populations over the abyss, causing the death of millions. Today most of the world’s people have a safety net stretched below them. Individuals are protected from personal misfortune by insurance, state-sponsored social security and a plethora of local and international NGOs. When calamity strikes an entire region, worldwide relief efforts are usually successful in preventing the worst. People still suffer from numerous degradations, humiliations and poverty-related illnesses, but in most countries nobody is starving to death. In fact, in many societies more people are in danger of dying from obesity than from starvation.

The Gilgamesh Project

Of all mankind’s ostensibly insoluble problems, one has remained the most vexing, interesting and important: the problem of death itself. Before the late modern era, most religions and ideologies took it for granted that death was our inevitable fate. Moreover, most faiths turned death into the main source of meaning in life. Try to imagine Islam, Christianity or the ancient Egyptian religion in a world without death. These creeds taught people that they must come to terms with death and pin their hopes on the afterlife, rather than seek to overcome death and live for ever here on earth. The best minds were busy giving meaning to death, not trying to escape it.

That is the theme of the most ancient myth to come down to us – the Gilgamesh myth of ancient Sumer. Its hero is the strongest and most capable man in the world, King Gilgamesh of Uruk, who could defeat anyone in battle. One day, Gilgamesh’s best friend, Enkidu, died. Gilgamesh sat by the body and observed it for many days, until he saw a worm dropping out of his friend’s nostril. At that moment Gilgamesh was gripped by a terrible horror, and he resolved that he himself would never die. He would somehow find a way to defeat death. Gilgamesh then undertook a journey to the end of the universe, killing lions, battling scorpion-men and finding his way into the underworld. There he shattered the mysterious “stone things” of Urshanabi, the ferryman of the river of the dead, and found Utnapishtim, the last survivor of the primordial flood. Yet Gilgamesh failed in his quest. He returned home empty-handed, as mortal as ever, but with one new piece of wisdom. When the gods created man, Gilgamesh had learned, they set death as man’s inevitable destiny, and man must learn to live with it.

Disciples of progress do not share this defeatist attitude. For men of science, death is not an inevitable destiny, but merely a technical problem. People die not because the gods decreed it, but due to various technical failures – a heart attack, cancer, an infection. And every technical problem has a technical solution. If the heart flutters, it can be stimulated by a pacemaker or replaced by a new heart. If cancer rampages, it can be killed with drugs or radiation. If bacteria proliferate, they can be subdued with antibiotics. True, at present we cannot solve all technical problems. But we are working on them. Our best minds are not wasting their time trying to give meaning to death. Instead, they are busy investigating the physiological, hormonal and genetic systems responsible for disease and old age. They are developing new medicines, revolutionary treatments and artificial organs that will lengthen our lives and might one day vanquish the Grim Reaper himself.

Until recently, you would not have heard scientists, or anyone else, speak so bluntly. ‘Defeat death?! What nonsense! We are only trying to cure cancer, tuberculosis and Alzheimer’s disease,’ they insisted. People avoided the issue of death because the goal seemed too elusive. Why create unreasonable expectations? We’re now at a point, however, where we can be frank about it. The leading project of the Scientific Revolution is to give humankind eternal life. Even if killing death seems a distant goal, we have already achieved things that were inconceivable a few centuries ago. In 1199, King Richard the Lionheart was struck by an arrow in his left shoulder. Today we’d say he incurred a minor injury. But in 1199, in the absence of antibiotics and effective sterilisation methods, this minor flesh wound became infected and gangrene set in. The only way to stop the spread of gangrene in twelfth-century Europe was to cut off the infected limb, impossible when the infection was in a shoulder. The gangrene spread through the Lionheart’s body and no one could help the king. He died in great agony two weeks later.

As recently as the nineteenth century, the best doctors still did not know how to prevent infection and stop the putrefaction of tissues. In field hospitals doctors routinely cut off the hands and legs of soldiers who received even minor limb injuries, fearing gangrene. These amputations, as well as all other medical procedures (such as tooth extraction), were done without any anaesthetics. The first anaesthetics – ether, chloroform and morphine – entered regular usage in Western medicine only in the middle of the nineteenth century. Before the advent of chloroform, four soldiers had to hold down a wounded comrade while the doctor sawed off the injured limb. On the morning after the battle of Waterloo (1815), heaps of sawn-off hands and legs could be seen adjacent to the field hospitals. In those days, carpenters and butchers who enlisted to the army were often sent to serve in the medical corps, because surgery required little more than knowing your way with knives and saws.

In the two centuries since Waterloo, things have changed beyond recognition. Pills, injections and sophisticated operations save us from a spate of illnesses and injuries that once dealt an inescapable death sentence. They also protect us against countless daily aches and ailments, which premodern people simply accepted as part of life. The average life expectancy jumped from around twenty-five to forty years, to around sixty-seven in the entire world, and to around eighty years in the developed world.

Death suffered its worst setbacks in the arena of child mortality. Until the twentieth century, between a quarter and a third of the children of agricultural societies never reached adulthood. Most succumbed to childhood diseases such as diphtheria, measles and smallpox. In seventeenth-century England, 150 out of every 1,000 newborns died during their first year, and a third of all children were dead before they reached fifteen. Today, only five out of 1,000 English babies die during their first year, and only seven out of 1,000 die before age fifteen.

We can better grasp the full impact of these figures by setting aside statistics and telling some stories. A good example is the family of King Edward I of England (1237–1307) and his wife, Queen Eleanor (1241–90). Their children enjoyed the best conditions and the most nurturing surroundings that could be provided in medieval Europe. They lived in palaces, ate as much food as they liked, had plenty of warm clothing, well-stocked fireplaces, the cleanest water available, an army of servants and the best doctors. The sources mention sixteen children that Queen Eleanor bore between 1255 and 1284:

1. An anonymous daughter, born in 1255, died at birth.

2. A daughter, Catherine, died either at age one or age three.

3. A daughter, Joan, died at six months.

4. A son, John, died at age five.

5. A son, Henry, died at age six.

6. A daughter, Eleanor, died at age twenty-nine.

7. An anonymous daughter died at five months.

8. A daughter, Joan, died at age thirty-five.

9. A son, Alphonso, died at age ten.

10. A daughter, Margaret, died at age fifty-eight.

11. A daughter, Berengeria, died at age two.

12. An anonymous daughter died shortly after birth.

13. A daughter, Mary, died at age fifty-three.

14. An anonymous son died shortly after birth.

15. A daughter, Elizabeth, died at age thirty-four.

16. A son, Edward.

The youngest, Edward, was the first of the boys to survive the dangerous years of childhood, and at his father’s death he ascended the English throne as King Edward II. In other words, it took Eleanor sixteen tries to carry out the most fundamental mission of an English queen – to provide her husband with a male heir. Edward II’s mother must have been a woman of exceptional patience and fortitude. Not so the woman Edward chose for his wife, Isabella of France. She had him murdered when he was forty-three.

To the best of our knowledge, Eleanor and Edward I were a healthy couple and passed no fatal hereditary illnesses on to their children. Nevertheless, ten out of the sixteen – 62 per cent – died during childhood. Only six managed to live beyond the age of eleven, and only three – just 18 per cent – lived beyond the age of forty. In addition to these births, Eleanor most likely had a number of pregnancies that ended in miscarriage. On average, Edward and Eleanor lost a child every three years, ten children one after another. It’s nearly impossible for a parent today to imagine such loss.

How long will the Gilgamesh Project – the quest for immortality – take to complete? A hundred years? Five hundred years? A thousand years? When we recall how little we knew about the human body in 1900, and how much knowledge we have gained in a single century, there is cause for optimism. Genetic engineers have recently managed to double the average life expectancy of Caenorhabditis elegans worms. Could they do the same for Homo sapiens? Nanotechnology experts are developing a bionic immune system composed of millions of nano-robots, who would inhabit our bodies, open blocked blood vessels, fight viruses and bacteria, eliminate cancerous cells and even reverse ageing processes. A few serious scholars suggest that by 2050, some humans will become a-mortal (not immortal, because they could still die of some accident, but a-mortal, meaning that in the absence of fatal trauma their lives could be extended indefinitely).

Whether or not Project Gilgamesh succeeds, from a historical perspective it is fascinating to see that most late-modern religions and ideologies have already taken death and the afterlife out of the equation. Until the eighteenth century, religions considered death and its aftermath central to the meaning of life. Beginning in the eighteenth century, religions and ideologies such as liberalism, socialism and feminism lost all interest in the afterlife. What, exactly, happens to a Communist after he or she dies? What happens to a capitalist? What happens to a feminist? It is pointless to look for the answer in the writings of Marx, Adam Smith or Simone de Beauvoir. The only modern ideology that still awards death a central role is nationalism. In its more poetic and desperate moments, nationalism promises that whoever dies for the nation will forever live in its collective memory. Yet this promise is so fuzzy that even most nationalists do not really know what to make of it.

The Sugar Daddy of Science

We are living in a technical age. Many are convinced that science and technology hold the answers to all our problems. We should just let the scientists and technicians go on with their work, and they will create heaven here on earth. But science is not an enterprise that takes place on some superior moral or spiritual plane above the rest of human activity. Like all other parts of our culture, it is shaped by economic, political and religious interests.

Science is a very expensive affair. A biologist seeking to understand the human immune system requires laboratories, test tubes, chemicals and electron microscopes, not to mention lab assistants, electricians, plumbers and cleaners. An economist seeking to model credit markets must buy computers, set up giant databanks and develop complicated data-processing programs. An archaeologist who wishes to understand the behaviour of archaic hunter-gatherers must travel to distant lands, excavate ancient ruins and date fossilised bones and artefacts. All of this costs money.

During the past 500 years modern science has achieved wonders thanks largely to the willingness of governments, businesses, foundations and private donors to channel billions of dollars into scientific research. These billions have done much more to chart the universe, map the planet and catalogue the animal kingdom than did Galileo Galilei, Christopher Columbus and Charles Darwin. If these particular geniuses had never been born, their insights would probably have occurred to others. But if the proper funding were unavailable, no intellectual brilliance could have compensated for that. If Darwin had never been born, for example, we’d today attribute the theory of evolution to Alfred Russel Wallace, who came up with the idea of evolution via natural selection independently of Darwin and just a few years later. But if the European powers had not financed geographical, zoological and botanical research around the world, neither Darwin nor Wallace would have had the necessary empirical data to develop the theory of evolution. It is likely that they would not even have tried.

Why did the billions start flowing from government and business coffers into labs and universities? In academic circles, many are naive enough to believe in pure science. They believe that government and business altruistically give them money to pursue whatever research projects strike their fancy. But this hardly describes the realities of science funding.

Most scientific studies are funded because somebody believes they can help attain some political, economic or religious goal. For example, in the sixteenth century, kings and bankers channelled enormous resources to finance geographical expeditions around the world but not a penny for studying child psychology. This is because kings and bankers surmised that the discovery of new geographical knowledge would enable them to conquer new lands and set up trade empires, whereas they couldn’t see any profit in understanding child psychology.

In the 1940s the governments of America and the Soviet Union channelled enormous resources to the study of nuclear physics rather than underwater archaeology. They surmised that studying nuclear physics would enable them to develop nuclear weapons, whereas underwater archaeology was unlikely to help win wars. Scientists themselves are not always aware of the political, economic and religious interests that control the flow of money; many scientists do, in fact, act out of pure intellectual curiosity. However, only rarely do scientists dictate the scientific agenda.

Even if we wanted to finance pure science unaffected by political, economic or religious interests, it would probably be impossible. Our resources are limited, after all. Ask a congressman to allocate an additional million dollars to the National Science Foundation for basic research, and he’ll justifiably ask whether that money wouldn’t be better used to fund teacher training or to give a needed tax break to a troubled factory in his district. To channel limited resources we must answer questions such as ‘What is more important?’ and ‘What is good?’ And these are not scientific questions. Science can explain what exists in the world, how things work, and what might be in the future. By definition, it has no pretensions to knowing what should be in the future. Only religions and ideologies seek to answer such questions.

Consider the following quandary: two biologists from the same department, possessing the same professional skills, have both applied for a million-dollar grant to finance their current research projects. Professor Slughorn wants to study a disease that infects the udders of cows, causing a 10 per cent decrease in their milk production. Professor Sprout wants to study whether cows suffer mentally when they are separated from their calves. Assuming that the amount of money is limited, and that it is impossible to finance both research projects, which one should be funded?

There is no scientific answer to this question. There are only political, economic and religious answers. In today’s world, it is obvious that Slughorn has a better chance of getting the money. Not because udder diseases are scientifically more interesting than bovine mentality, but because the dairy industry, which stands to benefit from the research, has more political and economic clout than the animal-rights lobby.

Perhaps in a strict Hindu society, where cows are sacred, or in a society committed to animal rights, Professor Sprout would have a better shot. But as long as she lives in a society that values the commercial potential of milk and the health of its human citizens over the feelings of cows, she’d best write up her research proposal so as to appeal to those assumptions. For example, she might write that ‘Depression leads to a decrease in milk production. If we understand the mental world of dairy cows, we could develop psychiatric medication that will improve their mood, thus raising milk production by up to 10 per cent. I estimate that there is a global annual market of $250 million for bovine psychiatric medications.’

Science is unable to set its own priorities. It is also incapable of determining what to do with its discoveries. For example, from a purely scientific viewpoint it is unclear what we should do with our increasing understanding of genetics. Should we use this knowledge to cure cancer, to create a race of genetically engineered supermen, or to engineer dairy cows with super-sized udders? It is obvious that a liberal government, a Communist government, a Nazi government and a capitalist business corporation would use the very same scientific discovery for completely different purposes, and there is no scientific reason to prefer one usage over others.

In short, scientific research can flourish only in alliance with some religion or ideology. The ideology justifies the costs of the research. In exchange, the ideology influences the scientific agenda and determines what to do with the discoveries. Hence in order to comprehend how humankind has reached Alamogordo and the moon – rather than any number of alternative destinations – it is not enough to survey the achievements of physicists, biologists and sociologists. We have to take into account the ideological, political and economic forces that shaped physics, biology and sociology, pushing them in certain directions while neglecting others.

Two forces in particular deserve our attention: imperialism and capitalism. The feedback loop between science, empire and capital has arguably been history’s chief engine for the past 500 years. The following chapters analyse its workings. First we’ll look at how the twin turbines of science and empire were latched to one another, and then learn how both were hitched up to the money pump of capitalism.

Excerpted from “Sapiens: A Brief History of Humankind” by Yuval Noah Harari. Published by Harper. Copyright 2015 by Yuval Noah Harari. Reprinted with permission of the publisher. All rights reserved.

 

http://www.salon.com/2015/02/16/the_myth_of_pure_science_its_all_about_political_economic_religious_interests/?source=newsletter

Obama’s budget proposal cuts $50 million from immunization funding


By Kevin Martinez

16 February 2015

As part of its 2016 budget proposal, the Obama White House announced that it will cut $50 million, or 8 percent, from the $611 million budget of the Department of Health and Human Services’ “317 program.” The 317 program provides free vaccinations to children with and without insurance, as well as to insured adults in response to outbreaks and disaster relief. It also funds the infrastructure needed for high immunization coverage. The announcement comes at a time when measles has spread to 14 states, with over 120 confirmed cases.

The budget proposal also calls for $128 million to be added to the Vaccines for Children program, an entitlement program that covers vaccines for uninsured, underinsured, and Medicaid-eligible children. The Obama administration has argued that the Affordable Care Act (ACA) will expand access to immunizations, decreasing the need for the 317 program.

L.J. Tan, chief strategy officer for the Immunization Action Coalition, told CNN that despite the ACA now covering many children who were previously covered by the 317 program, the budget cuts will be a setback. “The program funds a lot of the states’ infrastructure for vaccine delivery,” he said. The program is also critical in monitoring the spread of the disease and interviewing those who have come in contact with it.

The announcement that the federal immunization program will be cut stands in direct contrast to President Obama’s statement to NBC: “There is every reason to get vaccinated, but there aren’t reasons to not,” adding, “You should get your kids vaccinated.”

While the Affordable Care Act requires insurance providers to pay for vaccines without cost-sharing, it does not eliminate the function of the 317 program, which acts as a safety net for Americans with and without insurance. It also provides for insured children and adults during a major outbreak: in 2012-13, for example, 317 program vaccines were used to immunize privately insured children when pediatricians did not buy enough pediatric influenza vaccines, according to the Centers for Disease Control and Prevention.

Most of the 317 program’s funding goes to state and local health officials to purchase vaccines, educate immunization providers, prepare for and respond to outbreaks, and maintain infrastructure. The program was cut by $51.5 million last year, eliminating $37.5 million for vaccine purchases. Program operations, which include public awareness and immunization provider education, were cut by $14 million.

Despite the proposed $128 million increase for the Vaccines for Children Program, millions of Americans still rely on the 317 program’s vaccines. According to the Kaiser Family Foundation, more than 41 million non-elderly Americans did not have health insurance in 2013. This did not include the underinsured, whose health plans do not cover all vaccines.

Another strain on the country’s immunization system is that fewer physicians are providing the full range of vaccines to insured patients, as vaccines cost more and more and insurance reimbursement rates decline. This means that local public health providers have to pick up those who fall through the cracks.

The National Vaccine Advisory Committee, in its report to Health and Human Services wrote, “As we have learned over the years, insurance coverage alone is not enough to ensure disease control or high vaccination coverage rates. … Current vaccine financing strategies, including those offered now by the ACA, do not address the fundamental resource needs to support the immunization infrastructure.”

Even if the additional funds go toward vaccine purchases, the ability of local health departments to prepare for and respond to outbreaks has been diminished by the cuts. Chris Aldridge, senior director for infectious disease at the National Association of County and City Health Officials, told the Washington Post, “When we’re looking at an outbreak, such as with measles, sometimes the concern is less about, ‘Is that person insured,’ than it is really about getting the vaccine out there and distributing it. There is still a need for vaccine purchasing and making sure we can get out there.”

Meanwhile, the measles outbreak, which began last December at Disneyland, has spread to 17 states and has affected at least 125 people. This year’s outbreak is on track to surpass last year’s total of 644 cases, the highest number since the disease was declared eliminated in the US in 2000.

A private Christian school in Port Angeles, Washington, was quarantined after a 5-year-old kindergartner was diagnosed with measles. Students at Olympic Christian school who cannot show proof of immunity were told to stay home, avoid public places, and have no contact with anyone susceptible to measles until February 27, according to the county health department. According to the state health records, of the 115 students at the school, nearly 16 percent were exempted from the required vaccinations, meaning that some 18 students could be affected by the quarantine.

Three new cases of measles were also reported in Toronto, Canada, bringing the total in that country to 22. Health officials there confirmed that an unvaccinated 14-year-old girl from the Niagara region was infected. Two more cases were also confirmed in Cook County, Illinois, bringing the total there to 13 cases statewide. At least 12 of those cases have been tied to a suburban daycare center in Palatine, mostly occurring in children too young to be vaccinated.

In California, where the disease was thought to have first appeared, two new cases were reported in Ventura County, bringing the county total to 14, and the statewide total to 110. At least one of the cases involved a person who visited Disneyland last December. The amusement park has asked California health officials to reassure the public that the park is safe to attend.

 

http://www.wsws.org/en/articles/2015/02/16/meas-f16.html

Lawyer Says He Has Fool-Proof Method for Dealing With DUI Checkpoints

CIVIL LIBERTIES
Warren Redlich says keep your windows up and remain silent when stopped.

A Florida criminal defense attorney has gone to war against DUI checkpoints, saying the compulsory traffic stops by police violate state laws and civil rights.

Attorney Warren Redlich, a former Libertarian candidate for New York governor, says drivers are not required to roll down their windows at checkpoints to talk to police. Redlich says drivers open themselves up to problems when the police have direct access to them.

Redlich posted a YouTube video on New Year’s Day that has received nearly 2.4 million views. It has spawned several copycat videos by supporters who have filmed police officers after being stopped at checkpoints in various states.

In the video, Redlich identifies a DUI checkpoint run by the Florida Highway Patrol and Levy County Sheriff’s department and drives to it. Attached to the door is a flyer that Redlich says spells out his rights: I WILL REMAIN SILENT/I WANT MY LAWYER/NO SEARCHES, it begins. The flyer also contains his valid registration and insurance information along with a clear pocket for his driver’s license.

Redlich says it is important not to open the window, because then the police can say they smell alcohol or drugs. He also says it’s important to remain silent, because otherwise the police can claim your speech is slurred. Even if you’re innocent, Redlich says, it makes it more difficult for an attorney to mount a defense at a trial.

“I’ve seen innocent people who plead guilty because they couldn’t fight or afford an attorney,” Redlich told a Florida ABC News affiliate.

The YouTube video shows three drivers who approach police checkpoints. When the first driver approaches the checkpoint with the doors locked and the windows rolled up, the police examine his flyer quizzically before letting him go. The second and third drivers are also allowed to proceed.

Redlich cautions that the Fair DUI flyer and the procedures used by the drivers are specific to Florida law. He has published custom flyers and information for 10 other states on his site Fair DUI. Redlich also published a book by the same name in 2013.

“This is not about helping drunks,” says Redlich. “This is about helping innocent people. If some drunk person along the way gets help because of this, I’m perfectly okay with that. I’m a criminal defense attorney.”

Redlich says that following his directions, being patient and remaining silent are essential, so the flyers probably wouldn’t help impaired drivers.


Cliff Weathers is a senior editor at AlterNet, covering environmental and consumer issues. He is a former deputy editor at Consumer Reports. His work has also appeared in Salon, Car and Driver, Playboy, Raw Story and Detroit Monthly among other publications. Follow him on Twitter @cliffweathers and on his Facebook page.

 

http://www.alternet.org/civil-liberties/method-dealing-dui-checkpoints?akid=12795.265072.hilghC&rd=1&src=newsletter1031899&t=15

US faces historic megadrought

“The 21st century predictions make the [previous] megadroughts seem like quaint walks through the garden of Eden”


According to a new study published in the journal Science Advances, the Southwest and Great Plains will very likely be stricken by megadroughts (decades-long periods of extreme drought) the likes of which we haven’t seen for 1,000 years. The study’s co-author Toby Ault of Cornell University says that the chances of this happening by the end of the 21st century are around 80 percent, unless we do something to fight man-made climate change.

USA Today’s Doyle Rice reports:

To identify past droughts, scientists studied tree-rings to find out how much — or little — rain fell hundreds or even thousands of years ago. Scientists then used that historical data in combination with 17 different computer model simulations to predict what changes we may see later this century.

The computers showed robust and consistent drying in the Southwest and Plains, due to a combination of reduced precipitation and warmer temperatures that dried out the soils.

Although the Dust Bowl of the 1930s is what we generally refer to as a worst-case drought scenario, Lisa Graumlich, head of the University of Washington’s College of the Environment, told Climate Central that the drought in the same region from 1100 to 1300 (known as the “Medieval Climate Anomaly”) “makes the Dust Bowl look like a picnic.” The coming drought is expected to surpass both in a major way.



“The 21st century predictions make the [previous] megadroughts seem like quaint walks through the garden of Eden,” said the study’s co-author Jason Smerdon, a researcher at Columbia University’s Lamont-Doherty Earth Observatory, in an interview with The Guardian.

“We haven’t seen this kind of prolonged drought even certainly in modern US history,” he continued. “What this study has shown is the likelihood that multi-decadal events comprising year after year after year of extreme dry events could be something in our future.”

Joanna Rothkopf is an assistant editor at Salon, focusing on science, health and society. Follow @JoannaRothkopf or email jrothkopf@salon.com.

 

http://www.salon.com/2015/02/12/brace_yourself_us_faces_historic_megadrought/?source=newsletter