Sunday, July 31, 2011

Earth's Atmosphere May Be More Efficient at Releasing Energy to Space Than Climate Models Indicate, Satellite Data Suggest

ScienceDaily (July 29, 2011) — Data from NASA's Terra satellite suggest that when the climate warms, Earth's atmosphere is apparently more efficient at releasing energy to space than models used to forecast climate change may indicate, according to a new study.

The result is climate forecasts that warm substantially faster than the real atmosphere does, says Dr. Roy Spencer, a principal research scientist in the Earth System Science Center at The University of Alabama in Huntsville.

The previously unexplained differences between model-based forecasts of rapid global warming and meteorological data showing a slower rate of warming have been the source of often contentious debate and controversy for more than two decades.

In research published this week in the journal Remote Sensing, Spencer and UA Huntsville's Dr. Danny Braswell compared what a half dozen climate models say the atmosphere should do to satellite data showing what the atmosphere actually did during the 18 months before and after warming events between 2000 and 2011.

"The satellite observations suggest there is much more energy lost to space during and after warming than the climate models show," Spencer said. "There is a huge discrepancy between the data and the forecasts that is especially big over the oceans."

Not only does the atmosphere release more energy than previously thought, it starts releasing it earlier in a warming cycle. The models forecast that the climate should continue to absorb solar energy until a warming event peaks.

Instead, the satellite data shows the climate system starting to shed energy more than three months before the typical warming event reaches its peak.

"At the peak, satellites show energy being lost while climate models show energy still being gained," Spencer said.

This is the first time scientists have looked at radiative balances during the months before and after these transient temperature peaks.

Applied to long-term climate change, the research might indicate that the climate is less sensitive to warming due to increased carbon dioxide concentrations in the atmosphere than climate modelers have theorized. A major underpinning of global warming theory is that the slight warming caused by enhanced greenhouse gases should change cloud cover in ways that cause additional warming, which would be a positive feedback cycle.

Instead, the natural ebb and flow of clouds, solar radiation, heat rising from the oceans and a myriad of other factors, combined with the different time lags over which they affect the atmosphere, might make it impossible to isolate or accurately identify which piece of Earth's changing climate is feedback from human-made greenhouse gases.

"There are simply too many variables to reliably gauge the right number for that," Spencer said. "The main finding from this research is that there is no solution to the problem of measuring atmospheric feedback, due mostly to our inability to distinguish between radiative forcing and radiative feedback in our observations."

For this experiment, the UA Huntsville team used surface temperature data gathered by the Hadley Climate Research Unit in Great Britain. The radiant energy data was collected by the Clouds and Earth's Radiant Energy System (CERES) instruments aboard NASA's Terra satellite.

The six climate models were chosen from those used by the U.N.'s Intergovernmental Panel on Climate Change. The UA Huntsville team used the three models with the greatest programmed sensitivity to radiative forcing and the three with the least.
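
At its core, the analysis described above is a lead/lag comparison between surface temperature anomalies and top-of-atmosphere radiative flux. The sketch below is not Spencer and Braswell's code; it is a minimal illustration in Python, using NumPy and synthetic stand-in series (the function and variable names are invented for this example), of how net outgoing flux anomalies might be regressed against temperature anomalies at a range of leads and lags to see when the climate system starts shedding energy relative to a temperature peak.

import numpy as np

def lag_regression(temp_anom, flux_anom, max_lag_months=18):
    """Regression slope of net outgoing flux against surface temperature
    anomalies at each lead/lag in months. A positive lag means the flux
    series is shifted so that flux follows temperature."""
    n = len(temp_anom)
    slopes = {}
    for lag in range(-max_lag_months, max_lag_months + 1):
        if lag >= 0:
            t, f = temp_anom[:n - lag], flux_anom[lag:]
        else:
            t, f = temp_anom[-lag:], flux_anom[:n + lag]
        slopes[lag] = np.polyfit(t, f, 1)[0]  # W/m^2 per degree C at this lag
    return slopes

# Synthetic stand-ins for monthly anomaly series (2000-2011 is roughly 132 months).
rng = np.random.default_rng(0)
temp = rng.normal(0.0, 0.2, 132)
flux = 2.0 * np.roll(temp, 3) + rng.normal(0.0, 0.5, 132)  # flux lags temperature by ~3 months

slopes = lag_regression(temp, flux)
best_lag = max(slopes, key=slopes.get)
print(best_lag, round(slopes[best_lag], 2))  # the slope peaks near lag = +3 in this toy example

With real inputs, the synthetic arrays would be replaced by monthly HadCRUT surface temperature anomalies and CERES net flux anomalies; the sign conventions and the regression itself are assumptions made for illustration, not a restatement of the paper's procedure.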

Saturday, July 30, 2011

Boehner Rides the Dragon | GOPlifer | a Chron.com blog

Boehner Rides the Dragon

If you’re one of those people who pay attention to objective reality you may be asking yourself – “WTF was John Boehner thinking in allowing the debt ceiling extension to become a major partisan battleground?”  After all, the man’s not stupid, and no one takes seriously his claims to be on the side of the Tea Party on this or any other matter.  How did he get himself into this blind alley?

The answer is ambition.

In the spring of 2010, the GOP Speaker of the House gave some fatherly advice to a fellow conservative facing a primary challenge and a series of increasingly bizarre questions from constituents.  Boehner’s advice was a distillation of twenty years of tactics from the rational but increasingly cynical Republican insiders who had up to that point survived the rising tide of weird that has destroyed so much of the Party’s core.

He gave the advice to Rep. Bob Inglis, one of the young Turks of the last great Republican wave in 1994.  Inglis asked Boehner what he should do about committed Republican constituents who were repeating to him some wildly inaccurate crap they had heard on TV, radio, and the Internet.  Inglis described some of the encounters:

“They say, ‘Bob, what don’t you get? Barack Obama is a socialist, communist Marxist who wants to destroy the American economy so he can take over as dictator. Health care is part of that. And he wants to open up the Mexican border and turn [the US] into a Muslim nation.’”

And the best one:

“I sat down, and they said on the back of your Social Security card, there’s a number. That number indicates the bank that bought you when you were born based on a projection of your life’s earnings, and you are collateral. We are all collateral for the banks. I have this look like, “What the heck are you talking about?” I’m trying to hide that look and look clueless. I figured clueless was better than argumentative. So they said, “You don’t know this?! You are a member of Congress, and you don’t know this?!” And I said, “Please forgive me. I’m just ignorant of these things.” And then of course, it turned into something about the Federal Reserve and the Bilderbergers and all that stuff. And now you have the feeling of anti-Semitism here coming in, mixing in. Wow.”

Boehner’s advice was to give them platitudes and let them be: “I would have told them that it’s not quite that bad. We disagree with him [Obama] on the issues.”  Don’t confront them or try to correct any of the dangerous lies they were repeating.  Don’t try to lead.  Inglis was stunned by the advice.

Inglis weakly and indecisively confronted the crazy and lost his primary to someone willing to indulge his constituents’ wildest fantasies.  That guy is now working hard to make sure the US Congress fails to pay the bills it incurred when it passed its budget this past April.  Our highest and best at work.

Since 1994, the ranks of the Republican Party at the national level have been steadily purged of anyone unwilling to ride the dragon of mob anger.  Barry Goldwater once wrote that “We cannot allow the emblem of irresponsibility to attach to the Republican banner.”  That advice has been replaced by the motto that “there is no enemy to the right.”  There are still rational, thinking Republicans in senior national positions in the Party, but staying there has meant making some very dark compromises (see McCain, John).

Boehner is typical of this bunch.  His ambition long ago overwhelmed his good sense. His reasoning has been that he gets to keep his very good job, no matter what happens to yours, so long as he carefully patronizes the nutjob fringe. Throughout the rise of the Tea Party he remained convinced that he could manage the mob.  Republican politics in our time is drowning in the delusion that we can harness the lowest tactics to achieve the highest ends.  In other words, the Party is controlled by people blindly convinced that their own success is what’s best for the country.

Anyone in a position of power in the GOP willing to openly challenge Death Panel Politics has been shown the door.  In the ’90s Uncle Barry also said that “the Republican Party has been taken over by a bunch of kooks.”

The non-kooks who remain now are largely compromised and thoroughly cowed.  Take McCain, for example.  He played ball to earn the 2008 nomination and to defeat a primary challenge in 2010.  Now, too late, he stands on the Senate floor in a pathetic and futile effort to fight back against the same “bizarro” forces he tried to appease a year ago.  It’s depressing to think what that man could have become.

As for Boehner, he is almost certainly entering the final phase of his awakening.  His failure in a critical moment to pass even the most ludicrous of show-bills is demonstrating to anyone with eyes that he cannot continue to steer that mob.  He’ll probably keep his job for the near term, but the smell of that much blood in the water will attract unwelcome attention.  The dragon will eat him in due time.

He, along with the remaining Republican establishment, made a deal with the devil over recent years, compromising credibility for short-term power.  Nothing can repair the Party but a bitter internal fight and unfortunately there is practically no one left at the national level with the courage or credibility to wage it.

A new generation of rational Republicans will probably have to emerge from blue-state environments, places where local Republicans have been building power around sound state and county management and pragmatic politics.  Working around the edges in predominantly Democratic areas to demonstrate what effective Republican leadership can accomplish, these are the kinds of folks who will be positioned to take the Party into the future.  But, blocked by the Neo-Confederates and Fundamentalists at the national level, their presence has not yet been felt.

Things will be better for the Republican Party and the country someday, but perhaps not soon.

Friday, July 29, 2011

Untitled

Increased Muscle Mass May Lower Risk of Pre-Diabetes: Study Shows Building Muscle Can Lower Person's Risk of Insulin Resistance

ScienceDaily (July 28, 2011) — A recent study accepted for publication in The Endocrine Society's Journal of Clinical Endocrinology & Metabolism (JCEM) found that the greater an individual's total muscle mass, the lower the person's risk of having insulin resistance, the major precursor of type 2 diabetes.

With recent dramatic increases in obesity worldwide, the prevalence of diabetes, a major source of cardiovascular morbidity, is expected to accelerate. Insulin resistance, which can raise blood glucose levels above the normal range, is a major factor that contributes to the development of diabetes. Previous studies have shown that very low muscle mass is a risk factor for insulin resistance, but until now, no study has examined whether increasing muscle mass to average and above average levels, independent of obesity levels, would lead to improved blood glucose regulation.

"Our findings represent a departure from the usual focus of clinicians, and their patients, on just losing weight to improve metabolic health," said the study's senior author, Preethi Srikanthan, MD, of the University of California, Los Angeles (UCLA). "Instead, this research suggests a role for maintaining fitness and building muscle. This is a welcome message for many overweight patients who experience difficulty in achieving weight loss, as any effort to get moving and keep fit should be seen as laudable and contributing to metabolic change."

In this study, researchers examined the association of skeletal muscle mass with insulin resistance and blood glucose metabolism disorders in a nationally representative sample of 13,644 individuals. Participants were older than 20 years, non-pregnant and weighed more than 35 kg. The study demonstrated that higher muscle mass (relative to body size) is associated with better insulin sensitivity and lower risk of pre- or overt diabetes.

"Our research shows that beyond monitoring changes in waist circumference or BMI, we should also be monitoring muscle mass," Srikanthan concluded. "Further research is needed to determine the nature and duration of exercise interventions required to improve insulin sensitivity and glucose metabolism in at-risk individuals."

Also working on the study was Arun Karlamangla, PhD, MD, of the David Geffen School of Medicine at UCLA.

Thursday, July 28, 2011

Age-Related Memory Loss Reversed in Monkeys - Technology Review

Wednesday, July 27, 2011

Age-Related Memory Loss Reversed in Monkeys

Research uncovers the cellular defects that cause this type of forgetfulness.

It happens to the best of us: you walk into the kitchen to get a cup of coffee but get distracted by the mail, and then forget what you were doing in the first place. Aging makes people particularly vulnerable to this kind of forgetfulness, where we fail to maintain a thought in the face of distractions.

New research from Yale University uncovers cellular changes that seem to underlie this type of memory loss in monkeys, and shows that it can be reversed with drugs. By delivering a certain chemical to the brain, researchers could make neurons in old monkeys behave like those in young monkeys. Clinical trials of a generic drug that mimics this effect are already underway.

The findings support the idea that some of the brain changes that occur with aging are very specific—rather than being caused by a general decay throughout the brain—and can potentially be prevented. "It helps us understand that the age-related changes in the brain are malleable," says Molly Wagster, chief of the Behavioral and Systems Neuroscience Branch at the National Institute on Aging, which funded the research. "That's a crucial piece of information, and extremely hopeful."

In the study, Amy Arnsten and collaborators recorded electrical activity from neurons in a part of the brain called the prefrontal cortex, a region especially vulnerable to aging in both humans and primates. It is vital for our most high-level cognitive functions, such as working memory and the ability to multitask and inhibit distractions. "The prefrontal cortex is a mental sketch pad, keeping things in mind even if nothing in the environment is telling us what to do," says Arnsten. "It's the building block of abstract thought."

Previous research has shown that neural circuits in this region are organized to create a sustained level of activity that is crucial for working memory. "By exciting each other, the neurons are able to maintain information that isn't currently in the environment," says Arnsten.

By analyzing activity recorded from young, middle-aged, and old monkeys, the researchers found that the firing rate of the neurons in this area declines with age. They found that other neurons, such as those that respond to cues in the environment, still fired normally even as the monkeys aged. The research was published today in the journal Nature.

Arnsten believes the problem is a stress response gone wrong. During stress, even in young animals, these brain cells are flooded with a signaling molecule called cAMP, which dampens activity by opening potassium channels. (She theorizes that this is an evolutionary adaptation that allows the brain to quickly flip control from the prefrontal cortex, "a slow and thoughtful region," to a more primitive region in time of stress.) Normally, enzymes shut off the stress response and the brain goes back to normal. "But we think that in normal aging, the stress signaling pathway becomes dysregulated," says Arnsten.

The researchers were able to rein in the problem by treating the cells with a drug that blocks the potassium channels. After treatment, brain cells in old monkeys fired more rapidly—just like those in their younger counterparts.

The researchers already knew that giving monkeys this drug systemically, rather than delivering it directly into the brain, could reverse age-related deficits in working memory. A clinical trial of the compound, a generic drug called guanfacine, originally used to treat hypertension, is underway at Yale.

The findings bode well for the prospect of slowing age-related cognitive decline in humans. "The more we learn about the synaptic basis of aging, the more we learn it affects very specific elements of what these neurons can do," says John Morrison, a neurologist at Mount Sinai School of Medicine. Morrison was not involved in the research. "Once we understand it, we can identify targets and deal with it," he says.

Now that researchers understand how guanfacine works, they may be able to design drugs that are more powerful or have fewer side effects. Guanfacine can act as a sedative, so people need to slowly build up their tolerance to the drug to avoid this effect.

It's not yet clear if the work has implications for the more serious memory and brain changes that occur in Alzheimer's disease and other types of dementia. (Monkeys don't get Alzheimer's, so researchers know the memory changes they see in these animals are part of the typical aging process.)

However, Morrison believes that these subtle cellular changes may make the brain more vulnerable to the cell death that occurs in Alzheimer's. And as researchers begin to explore ways to intervene earlier with Alzheimer's patients, it may be useful to target these changes early on.  


Wednesday, July 27, 2011

The Cult That Is Destroying America - NYTimes.com

Paul Krugman - New York Times Blog

July 26, 2011, 5:09 pm

The Cult That Is Destroying America

Watching our system deal with the debt ceiling crisis — a wholly self-inflicted crisis, which may nonetheless have disastrous consequences — it’s increasingly obvious that what we’re looking at is the destructive influence of a cult that has really poisoned our political system.

And no, I don’t mean the fanaticism of the right. Well, OK, that too. But my feeling about those people is that they are what they are; you might as well denounce wolves for being carnivores. Crazy is what they do and what they are.

No, the cult that I see as reflecting a true moral failure is the cult of balance, of centrism.

Think about what’s happening right now. We have a crisis in which the right is making insane demands, while the president and Democrats in Congress are bending over backward to be accommodating — offering plans that are all spending cuts and no taxes, plans that are far to the right of public opinion.

So what do most news reports say? They portray it as a situation in which both sides are equally partisan, equally intransigent — because news reports always do that. And we have influential pundits calling out for a new centrist party, a new centrist president, to get us away from the evils of partisanship.

The reality, of course, is that we already have a centrist president — actually a moderate conservative president. Once again, health reform — his only major change to government — was modeled on Republican plans, indeed plans coming from the Heritage Foundation. And everything else — including the wrongheaded emphasis on austerity in the face of high unemployment — is according to the conservative playbook.

What all this means is that there is no penalty for extremism; no way for most voters, who get their information on the fly rather than doing careful study of the issues, to understand what’s really going on.

You have to ask, what would it take for these news organizations and pundits to actually break with the convention that both sides are equally at fault? This is the clearest, starkest situation one can imagine short of civil war. If this won’t do it, nothing will.

And yes, I think this is a moral issue. The “both sides are at fault” people have to know better; if they refuse to say it, it’s out of some combination of fear and ego, of being unwilling to sacrifice their treasured pose of being above the fray.

It’s a terrible thing to watch, and our nation will pay the price.

Tuesday, July 26, 2011

How the Deficit Got This Big - NYTimes.com

July 23, 2011

How the Deficit Got This Big

With President Obama and Republican leaders calling for cutting the budget by trillions over the next 10 years, it is worth asking how we got here — from healthy surpluses at the end of the Clinton era, and the promise of future surpluses, to nine straight years of deficits, including the $1.3 trillion shortfall in 2010. The answer is largely the Bush-era tax cuts, war spending in Iraq and Afghanistan, and recessions.

Despite what antigovernment conservatives say, non-defense discretionary spending on areas like foreign aid, education and food safety was not a driving factor in creating the deficits. In fact, such spending, accounting for only 15 percent of the budget, has been basically flat as a share of the economy for decades. Cutting it simply will not fill the deficit hole.

The first graph shows the difference between budget projections and budget reality. In 2001, President George W. Bush inherited a surplus, with projections by the Congressional Budget Office for ever-increasing surpluses, assuming continuation of the good economy and President Bill Clinton’s policies. But every year starting in 2002, the budget fell into deficit. In January 2009, just before President Obama took office, the budget office projected a $1.2 trillion deficit for 2009 and deficits in subsequent years, based on continuing Mr. Bush’s policies and the effects of recession. Mr. Obama’s policies in 2009 and 2010, including the stimulus package, added to the deficits in those years but are largely temporary.

The second graph shows that under Mr. Bush, tax cuts and war spending were the biggest policy drivers of the swing from projected surpluses to deficits from 2002 to 2009. Budget estimates that didn’t foresee the recessions in 2001 and in 2008 and 2009 also contributed to deficits. Mr. Obama’s policies, taken out to 2017, add to deficits, but not by nearly as much.

A few lessons can be drawn from the numbers. First, the Bush tax cuts have had a huge damaging effect. If all of them expired as scheduled at the end of 2012, future deficits would be cut by about half, to sustainable levels. Second, a healthy budget requires a healthy economy; recessions wreak havoc by reducing tax revenue. Government has to spur demand and create jobs in a deep downturn, even though doing so worsens the deficit in the short run. Third, spending cuts alone will not close the gap. The chronic revenue shortfalls from serial tax cuts are simply too deep to fill with spending cuts alone. Taxes have to go up.

In future decades, when rising health costs with an aging population hit the budget in full force, deficits are projected to be far deeper than they are now. Effective health care reform, and a willingness to pay more taxes, will be the biggest factors in controlling those deficits.

Monday, July 25, 2011

Obama’s Federal Spending Is Slower Today Than From 1965-1985 | AddictingInfo.Org

This report from Wells Capital Management shows that the Republicans are using this recession and the current deficit as political tools to further their agenda of eliminating the social contract created by our grandparents.

In this report, Wells Capital states that federal spending growth is slower today than it was from 1965 through 1985 and is consistent with growth from 1985 through 1995. The same report also shows that we have run budget deficits 86% of the time since 1965. Interestingly, the GOP loves to invoke the Kennedy tax cuts, which passed the Senate in 1964, as a sensible way to decrease the deficit through economic expansion. They try to persuade the public that a lower tax rate spurs business expansion and thus increases revenue into the treasury. This obviously didn't happen. Over the last 46 years, according to this study, our nation has run a federal deficit for 39.5 of those years.

Wells Capital states:

“… the blowout deficit of the last few years was not due to a surge in government spending, but rather is the result of the worst recession in the post-war era. That is, like most post-war deficits, the contemporary deficit is predominantly ‘cyclical.’ … As has been typical throughout the post-war era, the current deficit is primarily the result of weak government receipt growth. Indeed, in the last recession, government receipts declined by more than during any post-war recession.”

This analysis actually reinforces what I have reported here on PoliticusUSA in the past. America needs to grow our economy in order to eliminate the deficit; there is no need for the draconian cuts to our social contract that the regressives in the Tea Party and the Republican Party would like you to believe are necessary.

The big-government-spending Obama is nothing more than a figment of the Republican Party's imagination, invoked to persuade the public into accepting cuts to Medicare, Medicaid and other programs that help the least fortunate in this great country.

Drug Prices to Plummet in Wave of Expiring Patents | TIME Healthland

Drug Prices to Plummet in Wave of Expiring Patents

The cost of prescription medicines used by millions of people every day is about to plummet.

The next 14 months will bring generic versions of seven of the world's 20 best-selling drugs, including the top two: cholesterol fighter Lipitor and blood thinner Plavix.

The magnitude of this wave of expiring drug patents is unprecedented. Between now and 2016, blockbusters with about $255 billion in global annual sales will go off patent, notes EvaluatePharma Ltd., a London research firm. Generic competition will decimate sales of the brand-name drugs and slash the cost to patients and companies that provide health benefits.

Top drugs getting generic competition by September 2012 are taken by millions every day: Lipitor alone is taken by about 4.3 million Americans and Plavix by 1.4 million. Generic versions of big-selling drugs for blood pressure, asthma, diabetes, depression, high triglycerides, HIV and bipolar disorder also are coming by then.

The flood of generics will continue for the next decade or so, as about 120 brand-name prescription drugs lose market exclusivity, according to prescription benefits manager Medco Health Solutions Inc.

"My estimation is at least 15 percent of the population is currently using one of the drugs whose patents will expire in 2011 or 2012," says Joel Owerbach, chief pharmacy officer for Excellus Blue Cross Blue Shield, which serves most of upstate New York.

Those patients, along with businesses and taxpayers who help pay for prescription drugs through corporate and government prescription plans, collectively will save a fortune. That's because generic drugs typically cost 20 percent to 80 percent less than the brand names.

Doctors hope the lower prices will significantly reduce the number of people jeopardizing their health because they can't afford medicines they need.

Dr. Nieca Goldberg, director of The Women's Heart Program at NYU Langone Medical Center in Manhattan, worries about patients who are skipping checkups and halving pills to pare costs.

"You can pretty much tell by the numbers when I check the patient's blood pressure or cholesterol levels," that they've not taken their medications as often as prescribed, she says.

Even people with private insurance or Medicare aren't filling all their prescriptions, studies show, particularly for cancer drugs with copays of hundreds of dollars or more.

The new generics will slice copayments of those with insurance. For the uninsured, who have been paying full price, the savings will be much bigger.

Daly Powers, 25, an uninsured student who works two part-time jobs at low wages, says he often can't afford the $220 a month for his depression and attention deficit disorder pills. He couldn't buy either drug in June and says he's struggling with his Spanish class and his emotions. He looks forward to his antidepressant, Lexapro, going generic early next year.

"It'd make all the difference in the world," says Powers, of Bryan, Texas.

Generic medicines are chemically equivalent to the original brand-name drugs and work just as well for nearly all patients.

When a drug loses patent protection, often only one generic version is on sale for the first six months, so the price falls a little bit initially. Then, several other generic makers typically jump in, driving prices down dramatically.

Last year, the average generic prescription cost $72, versus $198 for the average brand-name drug, according to consulting firm Wolters Kluwer Pharma Solutions. Those figures average all prescriptions, from short-term to 90-day ones.

Average copayments last year were $6 for generics, compared with $24 for brand-name drugs given preferred status by an insurer and $35 for nonpreferred brands, according to IMS Health.

Among the drugs that recently went off patent, Protonix, for severe heartburn, now costs just $16 a month for the generic, versus about $170 for the brand name. And of the top sellers that soon will have competition, Lipitor retails for about $150 a month, Plavix costs almost $200 a month and blood pressure drug Diovan costs about $125 a month. For those with drug coverage, their out-of-pocket costs for each of those drugs could drop below $10 a month.

Jo Kelly, a retired social worker in Conklin, Mich., and her husband, Ray, a retired railroad mechanic, each take Lipitor and two other brand-name medicines, plus some generic drugs. Both are 67, and they land in the Medicare prescription "doughnut hole," which means they must pay their drugs' full cost by late summer or early fall each year. That pushes their monthly cost for Lipitor to about $95 each, and their combined monthly prescription cost to nearly $1,100.

Generic Lipitor should hit pharmacies Nov. 30 and cost them around $10 each a month.

"It would be a tremendous help for us financially," she says. "It would allow us to start going out to eat again."

For people with no prescription coverage, the coming savings on some drugs could be much bigger. Many discount retailers and grocery chains sell the most popular generics for $5 a month or less to draw in shoppers.

The impact of the coming wave of generics will be widespread, and swift.

Insurers use systems that make sure patients are switched to a generic the first day it's available. Many health plans require newly diagnosed patients to start on generic medicines. And unless the doctor writes "brand only" on a prescription, if there's a generic available, that's almost always what the pharmacist dispenses.

"A blockbuster drug that goes off patent will lose 90 percent of its revenue within 24 months. I've seen it happen in 12 months," says Ben Weintraub, a research director at Wolters Kluwer Pharma Solutions.

The looming revenue drop is changing the economics of the pharmaceutical industry.

In the 1990s, big pharmaceutical companies were wildly successful at creating pills that millions of people take every day for long-term conditions, from heart disease and diabetes to osteoporosis and chronic pain. The drugs are enormously profitable compared with drugs that are prescribed for short-term ailments.

The patents on those blockbusters, which were filed years before the drugs went on sale, last for 20 years at most, and many expire soon.

In recent years, many drug companies have struggled to develop new blockbuster drugs, despite multibillion-dollar research budgets and more partnerships with scientists at universities and biotech companies. The dearth of successes, partly because the "easy" treatments have already been found, has turned the short-term prognosis for "big pharma" anemic.

"The profit dollars that companies used to reinvest in innovation are no longer going to be coming," warns Terry Hisey, life sciences leader at consultant Deloitte LLP's pharmaceutical consulting business. He says that raises "long-term concerns about the industry's ability to bring new medicines to market."

But pharmaceutical companies can save billions when they stop promoting drugs that have new generic rivals, and U.S. drug and biotech companies are still spending more than $65 billion a year on R&D.

Drug companies have received U.S. approval for 20 drugs this year and expect approval for other important ones the next few years. Eventually, those will help fill the revenue hole.

For now, brand-name drugmakers are scrambling to adjust for the billions in revenue that will soon be lost. Typically, they raise prices 20 percent or more in the final years before generics hit to maximize revenue. Some also contract with generic drugmakers for "authorized generics," which give the brand-name company a portion of the generic sales.

Brand-name companies also are trimming research budgets, partnering with other companies to share drug development costs and shifting more manufacturing and patient testing to low-cost countries.

Pharmaceutical companies have cut about 10 percent of U.S. jobs in four years, from a peak of about 297,000 to about 268,000, according to Labor Department data. Nearly two-thirds of the cuts came in the last 1 1/2 years, partly because of big mergers that were driven by the need to bulk up drugs in development and boost profits in the short term by cutting costs.

Drug companies also are trying to grow sales by putting more sales reps in emerging markets, such as China and India, and by diversifying into businesses that get little or no generic competition. Those include vaccines, diagnostic tests, veterinary medicines and consumer health products.

As the proportion of prescriptions filled with generic drugs jumped to 78 percent in 2010, from 57 percent in 2004, annual increases in prescription drug spending slowed, to just 4 percent in 2010. According to the Generic Pharmaceutical Association, generics saved the U.S. health care system more than $824 billion from 2000 through 2009, and now save about $1 billion every three days.

The savings are only going to get greater as our overweight population ages. People who take their medicines regularly often avoid costly complications and hospitalizations, says AARP's policy chief, John Rother, and that produces even bigger savings than the cheaper drugs themselves.

In addition, many patients taking a particular brand-name drug will defect when a slightly older rival in the same class goes generic.

Global sales of Lipitor peaked at $12.9 billion in 2006, the year Zocor, an older drug in the statin class that reduces bad cholesterol, went generic. Lipitor sales then declined slowly but steadily to about $10.7 billion last year. That still will make Lipitor the biggest drug to go generic.

For patients, it's a godsend.

Douglas Torok, 59, of Erie, Pa., now spends nearly $290 every three months for insulin for his Type 2 diabetes, plus four daily pills (Lipitor, Plavix and two generics) for his blood pressure and cholesterol problems. The $40,000-a-year foundry supervisor fears not being able to cover the out-of-pocket costs when he retires and doesn't have a generous prescription plan.

In the meantime, once Lipitor and Plavix get generic competition, his copayments will plunge.

"I will pay $16 for 90 days," says Torok, who hopes to travel more. "It's a big deal for me on my income."

LINDA A. JOHNSON

Tax Rates for the 400 Richest Americans

Forget Anonymous: Evidence Suggests GOP Hacked, Stole 2004 Election


The Chart That Should Accompany All Discussions of the Debt Ceiling - Politics - The Atlantic

The Chart That Should Accompany All Discussions of the Debt Ceiling

By James Fallows
It's this one, from yesterday's New York Times. Click for a more detailed view, though it's pretty clear as is.

[Chart from The New York Times: the policies contributing to the federal deficit]

It's based on data from the Congressional Budget Office and the Center on Budget and Policy Priorities. Its significance is not partisan (who's "to blame" for the deficit) but intellectual. It demonstrates the utter incoherence of being very concerned about a structural federal deficit but ruling out of consideration the policy that was the largest single contributor to that deficit, namely the Bush-era tax cuts.

An additional significance of the chart: it identifies policy changes, the things over which Congress and the Administration have some control, as opposed to largely external shocks -- like the repercussions of the 9/11 attacks or the deep worldwide recession following the 2008 financial crisis. Those external events make a big difference in the deficit, and they are the major reason why deficits have increased faster in absolute terms during Obama's first two years than during the last two under Bush. (In a recession, tax revenues plunge, and government spending goes up - partly because of automatic programs like unemployment insurance, and partly in a deliberate attempt to keep the recession from getting worse.) If you want, you could even put the spending for wars in Iraq and Afghanistan in this category: those were policy choices, but right or wrong they came in response to an external shock.

The point is that governments can respond to but not control external shocks. That's why we call them "shocks." Governments can control their policies. And the policy that did the most to magnify future deficits is the Bush-era tax cuts. You could argue that the stimulative effect of those cuts is worth it ("deficits don't matter" etc). But you cannot logically argue that we absolutely must reduce deficits, but that we absolutely must also preserve every penny of those tax cuts. Which I believe precisely describes the House Republican position.

After the jump, from a previous "The Chart That Should..." posting, an illustration of the respective roles of external shock and deliberate policy change in creating the deficit.

UPDATE: Many people have written to ask how the impact of the "Bush-era tax cuts," enacted under George W. Bush and extended under Barack Obama (with the help, as you will recall, of huge pressure from Senate Republicans), is divided between the two presidents. I don't know and have written the creators of the chart to ask. (They have responded to say: it indicates the legacy effects of the changes made by each Administration. Thus, for instance, neither Bush nor Obama is credited with the entire cost of Pentagon spending or entitlements, but only the changes his Administration made, up or down. Thus the long-run effect of cuts initiated by Bush is assigned to him, as any long-run effect of savings he initiated would be.)

But to me it doesn't matter. As I said above, the point of the chart really isn't partisan responsibility. It is the central role of those tax cuts in creating the deficit that is now the focus of such political attention. Call them the "Obama-Extended Tax Cuts" if you'd like: either way, a deficit plan that ignores them fails a basic logic, math, and coherence test.
___
From this item three months ago:

[Chart: the respective roles of external shocks and deliberate policy changes in creating the deficit]


More: For how the Democrats are mishandling both the politics and the substance of this argument, see Joshua Green on "The Democrats Cave."

For how President Obama could use his inherent powers, in a "This is bullshit" way, see Robert Kuttner.


Sunday, July 24, 2011

Vertical Farming: Can Urban Agriculture Feed a Hungry World? - SPIEGEL ONLINE - News - International

SPIEGEL ONLINE


07/22/2011 10:58 AM

Vertical Farming

Can Urban Agriculture Feed a Hungry World?

By Fabian Kretschmer and Malte E. Kollenberg

Agricultural researchers believe that building indoor farms in the middle of cities could help solve the world's hunger problem. Experts say that vertical farming could feed up to 10 billion people and make agriculture independent of the weather and the need for land. There's only one snag: The urban farms need huge amounts of energy.

One day, Choi Kyu Hong might find himself in a vegetable garden on the 65th floor of a skyscraper. But, so far, his dream of picking fresh vegetables some 200 meters (655 feet) up has only been realized in hundreds of architectural designs.

In real life, the agricultural scientist remains far below such dizzying heights, conducting his work in a nondescript three-story building in the South Korean city of Suwon. The only thing that makes the squat structure stand out is the solar panels on its roof, which provide power for the prototype of a farm Choi is working on. If he and his colleagues succeed, their efforts may change the future of urban farming -- and how the world gets its food.

From the outside, the so-called vertical farm has nothing in common with the luxury high-rises surrounding it. Inside the building, heads of lettuce covering 450 square meters (4,800 square feet) are being painstakingly cultivated. Light and temperature levels are precisely regulated. Meanwhile, in the surrounding city, some 20 million people are hustling among the high-rises and apartment complexes, going about their daily lives.

Every person who sets foot in the Suwon vertical farm must first pass through an "air shower" to keep outside germs and bacteria from influencing the scientific experiment. Other than this oddity, though, the indoor agricultural center closely resembles a traditional rural farm. There are a few more technological bells and whistles (not to mention bright pink lighting) that remind visitors this is no normal farm. But the damp air, with its scent of fresh flowers, recalls that of a greenhouse.

Heads of lettuce are lined up in stacked layers. At the very bottom, small seedlings are thriving while, further up, there are riper plants almost ready to be picked. Unlike in conventional greenhouses, the one in Suwon uses no pesticides between the sowing and harvest periods, and all water is recycled. This makes the facility completely organic. It is also far more productive than a conventional greenhouse.

Choi meticulously checks the room temperature. He carefully checks the wavelengths of the red, white and blue LED lights aimed at the tender plants. Nothing is left to chance when it comes to the laboratory conditions of this young agricultural experiment. The goal is to develop optimal cultivation methods -- and ones that can compete on the open market. Indeed, Korea wants to bring vertical farming to the free market.

Nine Billion People by 2050

Vertical farming is an old idea. Indigenous people in South America have long used vertically layered growing techniques, and the rice terraces of East Asia follow a similar principle. But, now, a rapidly growing global population and increasingly limited resources are making the technique more attractive than ever.

The Green Revolution of the late 1950s boosted agricultural productivity at an astounding rate, allowing for the explosive population growth still seen today. Indeed, since 1950, the Earth's population has nearly tripled, from 2.4 billion to 7 billion, and global demand for food has grown accordingly.

Until now, the agricultural industry could keep up well enough -- otherwise swelling population figures would have leveled off long ago. But scientists warn that agricultural productivity has its limits. What's more, much of the land on which the world's food is grown has become exhausted or no longer usable. Likewise, there is not an endless supply of areas that can be converted to agricultural use.

By 2050, the UN predicts that the global population will surpass 9 billion people. Given current agricultural productivity rates, the Vertical Farm Project estimates that an agricultural area equal in size to roughly half of South America will be needed to feed this larger population.

Vertical farming has the potential to solve this problem. The term "vertical farming" was coined in 1915 by American geologist Gilbert Ellis Bailey. Architects and scientists have repeatedly looked into the idea since then, especially toward the end of the 20th century. In 1999, Dickson Despommier, a professor emeritus of environmental health sciences and microbiology at New York's Columbia University, seized upon the idea together with his students. After having grown tired of his depressing lectures on the state of the world, his students finally protested and asked Despommier to work with them on a more positive project.

From the initial idea of "rooftop farming," the cultivation of plants on flat roofs, the class developed a high-rise concept. The students calculated that rooftop-based rice growing would be able to feed, at most, 2 percent of Manhattan's population. "If it can't be done using rooftops, why don't we just grow the crops inside the buildings?" Despommier asked himself. "We already know how to cultivate and water plants indoors."

With its many empty high-rise buildings, Manhattan was the perfect location to develop the idea. Despommier's students calculated that a single 30-story vertical farm could feed some 50,000 people. And, theoretically, 160 of these structures could provide all of New York with food year-round, without being at the mercy of cold snaps and dry spells.
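
As a rough back-of-the-envelope check on those numbers (illustrative only; the city population below is an assumed round figure, not taken from the article), the quoted estimates are internally consistent:

# Rough consistency check of the figures quoted above.
people_per_tower = 50_000       # the students' estimate for one 30-story vertical farm
floors_per_tower = 30
city_population = 8_000_000     # roughly New York City (assumption for this sketch)

people_per_floor = people_per_tower / floors_per_tower
towers_needed = city_population / people_per_tower

print(round(people_per_floor))  # about 1,667 people fed per floor
print(round(towers_needed))     # 160 towers, matching the figure quoted above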

The Power Problem

Despite these promising calculations, such high-rise farms still only exist as small-scale models. Critics don't expect this to change anytime soon. Agricultural researcher Stan Cox of the Kansas-based Land Institute sees vertical farming as more of a project for dreamy young architecture students than a practical solution to potential shortages in the global food supply.

The main problem is light -- in particular, the fact that sunlight has to be replaced by LEDs. According to Cox's calculations, if you wanted to replace all of the wheat cultivation in the US for an entire year using vertical farming, you would need eight times the amount of electricity generated by all the power plants in the US over a single year -- and that's just for powering the lighting.

It gets even more difficult if you intend to rely exclusively on renewable energies to supply this power, as Despommier hopes to do. At the moment, renewable energy sources only generate about 2 percent of all power in the US. Accordingly, the sector would have to be expanded 400-fold to create enough energy to illuminate indoor wheat crops for an entire year. Despommier seems to have fallen in love with an idea, Cox says, without considering the difficulties of its actual implementation.
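
The 400-fold figure follows directly from the two round numbers in the article; here is a minimal sketch of the arithmetic, taking the lighting estimate and the renewable share at face value:

# Arithmetic behind the 400-fold expansion claim (uses the article's round
# numbers, not measured energy statistics).
lighting_demand = 8.0     # indoor wheat lighting, in multiples of total annual US generation
renewable_share = 0.02    # renewables' current share of US generation (about 2 percent)

expansion_factor = lighting_demand / renewable_share
print(expansion_factor)   # 400.0 -> renewables would need roughly a 400-fold expansion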

Getting Closer to Reality

Even so, Despommier still believes in his vision of urban agriculture. And recent developments, like the ones in South Korea, might mean his dream is not as remote as critics say. Ten years ago, vertical farming was only an idea. Today, it has developed into a concrete model. About two years ago, the first prototypes were created.

In fact, the concept seems to be working already, at least on a small scale. In the Netherlands, the first foods from a vertical farm are already stocking supermarket shelves. The PlantLab, a 10-year-old company based three floors underground in the southern city of Den Bosch, has cultivated everything from ornamental shrubs and roses to nearly every crop imaginable, including strawberries, beans, cucumbers and corn. "We manage completely without sunlight," says PlantLab's Gertjan Meeuws. "But we still manage to achieve a yield three times the size of an average greenhouse's." What's more, PlantLab uses almost 90 percent less water than a conventional farm.

As a country which has limited land resources but which possesses much of the necessary technology, the Netherlands seems to be an ideal place to develop vertical farming. This is especially true now that its residents are increasingly demanding organic, pesticide-free foods -- and are prepared to pay more for them.

'The Next Agricultural Revolution'

Despommier believes that entire countries will soon be able to use vertical farming to feed their populations. The South Korean government, at least, is interested in exploring the possibility. At the moment, the country is forced to import a large share of its food. Indeed, according to a 2005 OECD report, South Korea places fifth-to-last in a global ranking on food security. Increasing food prices, climate change and the possibility of natural disasters can compound the problem.

These facts are not lost on the researchers in the vertical farming laboratory in Suwon. "We must be prepared to avert a catastrophe," Choi says.

Still, it will be some time before vertical farming is implemented on a commercial scale in South Korea. Choi's colleague Lee Hye Jin thinks that five more years of research are needed. "Only then will our vertical farm be ready for the free market," he says.


Thursday, July 21, 2011

Fourteen Propaganda Techniques Fox "News" Uses to Brainwash Americans | Truthout

Fourteen Propaganda Techniques Fox "News" Uses to Brainwash Americans

by: Dr. Cynthia Boaz, Truthout | News Analysis

There is nothing more sacred to the maintenance of democracy than a free press. Access to comprehensive, accurate and quality information is essential to the manifestation of Socratic citizenship - the society characterized by a civically engaged, well-informed and socially invested populace. Thus, to the degree that access to quality information is willfully or unintentionally obstructed, democracy itself is degraded.

It is ironic that in the era of 24-hour cable news networks and "reality" programming, the news-to-fluff ratio and overall veracity of information have declined precipitously. Take the fact that Americans now spend on average about 50 hours a week using various forms of media, while at the same time cultural literacy levels hover just above the gutter. Not only does mainstream media now tolerate gross misrepresentations of fact and history by public figures (highlighted most recently by Sarah Palin's ludicrous depiction of Paul Revere's ride), but many media actually legitimize these displays. Pause for a moment and ask yourself what it means that the world's largest, most profitable and most popular news channel passes off as fact every whim, impulse and outrageously incompetent analysis of its so-called reporters. How did we get here? Take the enormous amount of misinformation that is taken for truth by Fox audiences: the belief that Saddam Hussein had weapons of mass destruction (WMD) and that he was in on 9/11, the belief that climate change isn't real and/or man-made, the belief that Barack Obama is Muslim and wasn't born in the United States, the insistence that all Arabs are Muslim and all Muslims are terrorists, the inexplicable perceptions that immigrants are both too lazy to work and are about to steal your job. All of these claims are demonstrably false, yet Fox News viewers will maintain their veracity with incredible zeal. Why? Is it simply that we have lost our respect for knowledge?

My curiosity about this question compelled me to sit down and document the most oft-used methods by which willful ignorance has been turned into dogma by Fox News and other propagandists disguised as media. The techniques I identify here also help to explain the simultaneously powerful identification the Fox media audience has with the network, as well as their ardent, reflexive defenses of it.

The good news is that the more conscious you are of these techniques, the less likely they are to work on you. The bad news is that those reading this article are probably the least in need of it.

1. Panic Mongering. This goes one step beyond simple fear mongering. With panic mongering, there is never a break from the fear. The idea is to terrify and terrorize the audience during every waking moment. From Muslims to swine flu to recession to homosexuals to immigrants to the rapture itself, the belief over at Fox seems to be that if your fight-or-flight reflexes aren't activated, you aren't alive. This of course raises the question: why terrorize your own audience? Because it is the fastest way to bypass the rational brain. In other words, when people are afraid, they don't think rationally. And when they can't think rationally, they'll believe anything.

2. Character Assassination/Ad Hominem. Fox does not like to waste time debating the idea. Instead, they prefer a quicker route to dispensing with their opponents: go after the person's credibility, motives, intelligence, character, or, if necessary, sanity. No category of character assassination is off the table and no offense is beneath them. Fox and like-minded media figures also use ad hominem attacks not just against individuals, but entire categories of people in an effort to discredit the ideas of every person who is seen to fall into that category, e.g. "liberals," "hippies," "progressives" etc. This form of argument - if it can be called that - leaves no room for genuine debate over ideas, so by definition, it is undemocratic. Not to mention just plain crass.

3. Projection/Flipping. This one is frustrating for the viewer who is trying to actually follow the argument. It involves taking whatever underhanded tactic you're using and then accusing your opponent of doing it to you first. We see this frequently in the immigration discussion, where anti-racists are accused of racism, or in the climate change debate, where those who argue for human causes of the phenomenon are accused of not having science or facts on their side. It's often called upon when the media host finds themselves on the ropes in the debate.

4. Rewriting History. This is another way of saying that propagandists make the facts fit their worldview. The Downing Street Memos on the Iraq war were a classic example of this on a massive scale, but it happens daily and over smaller issues as well. A recent case in point is Palin's mangling of the Paul Revere ride, which Fox reporters have bent over backward to validate. Why lie about the historical facts, even when they can be demonstrated to be false? Well, because dogmatic minds actually find it easier to reject reality than to update their viewpoints. They will literally rewrite history if it serves their interests. And they'll often speak with such authority that the casual viewer will be tempted to question what they knew as fact.

5. Scapegoating/Othering. This works best when people feel insecure or scared. It's technically a form of both fear mongering and diversion, but it is so pervasive that it deserves its own category. The simple idea is that if you can find a group to blame for social or economic problems, you can then go on to a) justify violence/dehumanization of them, and b) subvert responsibility for any harm that may befall them as a result.

6. Conflating Violence With Power and Opposition to Violence With Weakness. This is more of what I'd call a "meta-frame" (a deeply held belief) than a media technique, but it is constantly manifested in the ways news is reported. For example, terms like "show of strength" are often used to describe acts of repression, such as those by the Iranian regime against the protesters in the summer of 2009. There are several concerning consequences of this form of conflation. First, it has the potential to make people feel falsely emboldened by shows of force - it can turn wars into sporting events. Second, especially in the context of American politics, displays of violence - whether manifested in war or debates about the Second Amendment - are seen as noble and (in an especially surreal irony) moral. Violence becomes synonymous with power, patriotism and piety.

7. Bullying. This is a favorite technique of several Fox commentators, and the fact that it continues to be employed suggests it has some efficacy. Bullying and yelling work best on people who come to the conversation with a lack of confidence, either in themselves or in their grasp of the subject being discussed. The bully exploits this lack of confidence by berating the guest into submission or compliance. Often, less self-possessed people will feel shame and anxiety when being berated, and the quickest way to end the immediate discomfort is to cede authority to the bully. The bully is then able to interpret that as a "win."

8. Confusion. As with the preceding technique, this one works best on an audience that is less confident and self-possessed. The idea is to deliberately confuse the argument, but insist that the logic is airtight and imply that anyone who disagrees is either too dumb or too fanatical to follow along. Less independent minds will interpret the confusion technique as a form of sophisticated thinking, thereby giving the user's claims veracity in the viewer's mind.

9. Populism. This is especially popular in election years. The speaker identifies themselves as one of "the people" and the target of their ire as an enemy of the people. The opponent is always "elitist" or a "bureaucrat" or a "government insider" or some other category that is not the people. The idea is to make the opponent harder to relate to and harder to empathize with. It often goes hand in hand with scapegoating. A common logical fallacy of right-wing populism is that the accused "elitists" are almost always liberals, a category of political actors who, by definition, advocate for non-elite groups.

10. Invoking the Christian God. This is similar to othering and populism. With morality politics, the idea is to declare yourself and your allies as patriots, Christians and "real Americans" (those are inseparable categories in this line of thinking) and anyone who challenges them as not. Basically, God loves Fox and Republicans and America. And hates taxes and anyone who doesn't love those other three things. Because the speaker has been blessed by God to speak on behalf of all Americans, any challenge is perceived as immoral. It's a cheap and easy technique used by all totalitarian entities, from states to cults.

11. Saturation. There are three components to effective saturation: being repetitive, being ubiquitous and being consistent. The message must be repeated over and over, it must be everywhere and it must be shared across commentators: e.g. "Saddam has WMD." Veracity and hard data have no relationship to the efficacy of saturation. There is a psychological effect of being exposed to the same message over and over, regardless of whether it's true or even makes sense, e.g., "Barack Obama wasn't born in the United States." If something is said enough times, by enough people, many will come to accept it as truth. Another example is Fox's own slogan, "Fair and Balanced."

12. Disparaging Education. There is an emerging and disturbing lack of reverence for education and intellectualism in many mainstream media discourses. In fact, in some circles (e.g. Fox), higher education is often disparaged as elitist. Having a university credential is perceived by these folks as not a sign of credibility, but of a lack of it. In fact, among some commentators, evidence of intellectual prowess is treated snidely and as anti-American. Education and other evidence of being trained in critical thinking are direct threats to a hive-mind mentality, which is why they are so viscerally demeaned.

13. Guilt by Association. This is a favorite of Glenn Beck and Andrew Breitbart, both of whom have used it to decimate the careers and lives of many good people. Here's how it works: if your cousin's college roommate's uncle's ex-wife attended a dinner party back in 1984 with Gorbachev's niece's ex-boyfriend's sister, then you, by extension, are a communist set on destroying America. Period.

14. Diversion. This is where, when on the ropes, the media commentator suddenly takes the debate in a weird but predictable direction to avoid accountability. This is the point in the discussion where most Fox anchors start comparing the opponent to Saul Alinsky or invoking ACORN or Media Matters, in a desperate attempt to win through guilt by association. Or they'll talk about wanting to focus on "moving forward," as though by analyzing the current state of things or God forbid, how we got to this state of things, you have no regard for the future. Any attempt to bring the discussion back to the issue at hand will likely be called deflection, an ironic use of the technique of projection/flipping.

In debating some of these tactics with colleagues and friends, I have also noticed that the Fox viewership seems to be marked by a sort of collective personality disorder whereby the viewer feels almost as though they've been let into a secret society. Something about their affiliation with the network makes them feel privileged and this affinity is likely what drives the viewers to defend the network so vehemently. They seem to identify with it at a core level, because it tells them they are special and privy to something the rest of us don't have. It's akin to the loyalty one feels by being let into a private club or a gang. That effect is also likely to make the propaganda more powerful, because it goes mostly unquestioned.

In considering these tactics and their possible effects on American public discourse, it is important to note that historically, those who've genuinely accessed truth have never berated those who did not. You don't get honored by history when you beat up your opponent: look at Martin Luther King Jr., Robert Kennedy, Abraham Lincoln. These men did not find the need to engage in othering, ad hominem attacks, guilt by association or bullying. This is because when a person has accessed a truth, they are not threatened by the opposing views of others. This reality reveals the righteous indignation of people like Glenn Beck, Bill O'Reilly and Sean Hannity as a symptom of untruth. These individuals are hostile and angry precisely because they don't feel confident in their own veracity. And in general, the more someone is losing their temper in a debate and the more intolerant they are of listening to others, the more you can be certain they do not know what they're talking about.

One final observation. Fox audiences, birthers and Tea Partiers often defend their arguments by pointing to the fact that a lot of people share the same perceptions. This is a reasonable point to the extent that Murdoch's News Corporation reaches a far larger audience than any other single media outlet. But, the fact that a lot of people believe something is not necessarily a sign that it's true; it's just a sign that it's been effectively marketed.

As honest, fair and truly intellectual debate degrades before the eyes of the global media audience, the quality of American democracy degrades along with it.

Tuesday, July 19, 2011

When Astronomy Met Computer Science | Top Stories | DISCOVER Magazine

When Astronomy Met Computer Science

07.19.2011

Digital sky surveys and real-time telescopic observations are unleashing an unprecedented flood of information. Astronomers have recently created new tools to sift through all that data, which could contain answers to some of the greatest questions in cosmology.

by Preston Lerner

For Kirk Borne, the information revolution began 11 years ago while he was working at NASA’s National Space Science Data Center in Greenbelt, Maryland. At a conference, another astronomer asked him if the center could archive a terabyte of data that had been collected from the MACHO sky survey, a project designed to study mysterious cosmic bodies that emit very little light or other radiation. Nowadays, plenty of desktop computers can store a terabyte on a hard drive. But when Borne ran the request up the flagpole, his boss almost choked. “That’s impossible!” he told Borne. “Don’t you realize that the entire data set NASA has collected over the past 45 years is one terabyte?”

“That’s when the lightbulb went off,” says Borne, who is now an associate professor of computational and data sciences at George Mason University. “That single experiment had produced as much data as the previous 15,000 experiments. I realized then that we needed to do something not only to make all that data available to scientists but also to enable scientific discovery from all that information.”

The tools of astronomy have changed drastically over just the past generation, and our picture of the universe has changed with them. Gone are the days of photographic plates that recorded the sky snapshot by painstaking snapshot. Today more than a dozen observatories on Earth and in space let researchers eyeball vast swaths of the universe in multiple wavelengths, from radio waves to gamma rays. And with the advent of digital detectors, computers have replaced darkrooms. These new capabilities provide a much more meaningful way to understand our place in the cosmos, but they have also unleashed a baffling torrent of data. Amazing discoveries might be in sight, yet hidden within all the information.

Since 2000, the $85 million Sloan Digital Sky Survey at the Apache Point Observatory in New Mexico has imaged more than one-third of the night sky, capturing information on more than 930,000 galaxies and 120,000 quasars. Computational analysis of Sloan’s prodigious data set has uncovered evidence of some of the earliest known astronomical objects, determined that most large galaxies harbor supermassive black holes, and even mapped out the three-dimensional structure of the local universe. “Before Sloan, individual researchers or small groups dominated astronomy,” says Robert Brunner, an astronomy professor at the University of Illinois at Urbana-Champaign. “You’d go to a telescope, get your data, and analyze it. Then Sloan came along, and suddenly there was this huge data set designed for one thing, but people were using it for all kinds of other interesting things. So you have this sea change in astronomy that allows people who aren’t affiliated with a project to ask entirely new questions.”

A new generation of sky surveys promises to catalog literally billions and billions of astronomical objects. Trouble is, there are not enough graduate students in the known universe to classify all of them. When the Large Synoptic Survey Telescope (LSST) in Cerro Pachón, Chile, aims its 3.2-billion-pixel digital camera (the world's largest) at the night sky in 2019, it will capture an area 49 times as large as the moon in each 15-second exposure, 2,000 times a night. Those snapshots will be stitched together over a decade to eventually form a motion picture of half the visible sky. The LSST, producing 30 terabytes of data nightly, will become the centerpiece of what some experts have dubbed the age of petascale astronomy—that's 10^15 bytes (what Borne jokingly calls "a tonabytes").
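
To put those figures in perspective, here is a rough back-of-envelope sketch in Python. The 30-terabytes-a-night rate comes from the article; the number of usable observing nights per year is an assumption for illustration only, not an official LSST figure.

```python
# Back-of-envelope estimate of the survey's total raw data volume.
# The ~30 TB/night rate is from the article; the usable-nights figure
# is an illustrative assumption.
TB_PER_NIGHT = 30
NIGHTS_PER_YEAR = 300   # assumption: weather and maintenance cost some nights
YEARS = 10

total_tb = TB_PER_NIGHT * NIGHTS_PER_YEAR * YEARS
total_pb = total_tb / 1000   # 1 petabyte = 1,000 terabytes (decimal units)
print(f"~{total_tb:,} TB, i.e. ~{total_pb:.0f} PB of raw images over the survey")
```

Under those assumptions the raw imagery alone approaches a hundred petabytes, which is why "petascale" is the operative word.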

Mosaic view of the center of the Milky Way, composed from 1,200 images taken over the course of 200 hours by the Very Large Telescope in Cerro Paranal, Chile. (Image credit: ESO/S. Guisard)

The data deluge is already overwhelming astronomers, who in the past endured fierce competition to get just a little observing time at a major observatory. “For the first time in history, we cannot examine all our data,” says George Djorgovski, an astronomy professor and codirector of the Center for Advanced Computing Research at Caltech. “It’s not just data volume. It’s also the quality and complexity. A major sky survey might detect millions or even billions of objects, and for each object we might measure thousands of attributes in a thousand dimensions. You can get a data-mining package off the shelf, but if you want to deal with a billion data vectors in a thousand dimensions, you’re out of luck even if you own the world’s biggest supercomputer. The challenge is to develop a new scientific methodology for the 21st century.”
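
Djorgovski's point about scale is easy to make concrete. The sketch below uses his illustrative figures of a billion objects and a thousand measured attributes each, and assumes 64-bit floating-point values (my assumption), to estimate how much memory the raw table alone would need.

```python
# Why an off-the-shelf data-mining package chokes: the raw table alone
# is far larger than an ordinary machine's memory.
n_objects = 1_000_000_000   # a billion detected objects (figure quoted above)
n_features = 1_000          # a thousand measured attributes each (figure quoted above)
bytes_per_value = 8         # assumption: 64-bit floating-point values

total_bytes = n_objects * n_features * bytes_per_value
print(f"~{total_bytes / 1e12:.0f} TB just to hold the table in memory")
# -> about 8 TB of RAM before any clustering, distances or models
#    are computed, which is exactly the methodological gap described above.
```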

The backbone of that methodology is the data-crunching technique known as informatics. It has already transformed medicine, allowing biologists to sequence the DNA of thousands of organisms and look for genetic clues to health and disease. Astronomers hope informatics will do the same for them. The basic idea is to use computers to extract meaning from raw data too complex for the human brain to comprehend. Algorithms can scour terabytes of data in seconds, highlighting patterns and anomalies, visualizing key information, and even “learning” on the job.

In a sense, informatics merely enables astronomers to do what they have always done, just a lot more quickly and accurately. For example, data mining is useful for classifying and clustering information, two critical techniques in an astronomer's tool kit. Is an object a star or a galaxy? If it is a galaxy, is it spiral or elliptical? If it is elliptical, is it round or flat? Not so many years ago, such questions were addressed by eyeballing photographic plates. Classification is not a big deal when you are working with hundreds of extrasolar planets or thousands of supernovas, but it becomes hugely complicated when you are trying to make sense of billions of objects.
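
As a concrete illustration of the kind of automated classification described here, the sketch below trains an off-the-shelf classifier to separate "stars" from "galaxies" on synthetic stand-in features. The feature set, the random-forest choice and all names are illustrative assumptions, not a description of any survey's actual pipeline.

```python
# Toy star/galaxy classification sketch (synthetic data, illustrative only).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 5000
# Stand-in features, e.g. magnitude, concentration index, ellipticity.
X = rng.normal(size=(n, 3))
# Fake "truth" labels: 0 = star, 1 = galaxy.
y = (X[:, 1] + 0.5 * X[:, 2] + rng.normal(scale=0.5, size=n) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)   # learn from the human-labeled subset
print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")
# The fitted model can then label billions of objects in bulk via
# clf.predict(...), instead of eyeballing plates one by one.
```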

Research scientist Matthew Graham of the Center for Advanced Computing Research at Caltech recalls trying to identify a few hundred quasars in 1996 for his doctoral thesis on large-scale structures in the distant universe. He did it the old-fashioned way—with pencil and paper and laborious trial and error. When the LSST is completed, it will be far simpler to assemble a data set of millions of quasars.


The Antennae, a pair of galaxies in the midst of a violent collision 62 million light-years away, seen in a composite of X-ray, optical and infrared data. (Image credit: NASA)

Setting algorithms loose on larger samples not only makes it easier to recognize patterns but also speeds the identification of outliers. “These days, one in a million objects is a serendipitous discovery,” Graham says. “You just happened to have the telescope pointed at the right place at the right time.” This is often the case in the search for “high-redshift” quasars, extremely distant and luminous objects powered by supermassive black holes. Right now, finding them is largely a matter of luck. With computers powering through a billion objects, astronomers can search more methodically for such extreme quasars—or for any other type of unusual object. This approach is not only faster but more accurate. The ability to say with statistical certainty that something is out of the ordinary allows astronomers to focus on the exceptions that prove the rule.
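
A minimal sketch of how such an automated outlier hunt might look: fit an anomaly detector to the bulk of a catalog and rank objects by how unusual they are, so the rarest ones rise to the top for follow-up. The isolation-forest method and the synthetic features are my own illustrative choices, not those of any particular survey.

```python
# Toy outlier hunt on a synthetic catalog.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(1)
ordinary = rng.normal(size=(100_000, 5))        # the bulk of the catalog
oddballs = rng.normal(loc=6.0, size=(10, 5))    # a handful of extreme objects
X = np.vstack([ordinary, oddballs])

iso = IsolationForest(n_estimators=100, random_state=0).fit(X)
scores = iso.score_samples(X)                   # lower score = more anomalous
candidates = np.argsort(scores)[:20]            # 20 most unusual objects
print(candidates)                               # the injected oddballs dominate this list
```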

On the flip side, informatics is a remarkable tool for collecting statistics on the norm and using the tools of probability to figure out what the universe is like as a whole. For instance, astronomers have traditionally estimated the distances to remote galaxies using a spectrometer, which divides light from an object into its constituent wavelengths. But for every spectrum produced by Sloan, there were about 100 objects without spectra, only images. So Brunner put astroinformatics to work: He developed an algorithm that allows astronomers to estimate an object's distance just by analyzing imagery, giving them a much bigger data set for studying the 3-D structure of the universe. "This will be really important with LSST," he says, "because we won't be able to get spectra for 99 percent of the objects."
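
The general recipe this exemplifies, learning to predict redshift from imaging quantities using the minority of objects that do have spectra, can be sketched as a simple regression problem. The code below is a toy stand-in on synthetic colors, not Brunner's published algorithm.

```python
# Toy photometric-redshift regression: train on objects that have spectra,
# predict for objects that only have imaging. Synthetic data throughout.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(2)
n_spec = 20_000
colors = rng.normal(size=(n_spec, 4))   # stand-ins for colors such as u-g, g-r, r-i, i-z
z_spec = np.abs(0.3 * colors[:, 0] + 0.2 * colors[:, 1]
                + rng.normal(scale=0.05, size=n_spec))   # fake spectroscopic redshifts

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(colors, z_spec)                   # learn color -> redshift on the spectroscopic sample

colors_imaging_only = rng.normal(size=(5, 4))
print(model.predict(colors_imaging_only))   # estimated redshifts for imaging-only objects
```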

The interdisciplinary marriage between computer science and astronomy has not been fully embraced by either family yet, but that is changing. Last May brought a watershed moment, the debut of the Virtual Astronomical Observatory. This international network, 10 years in the making, allows astronomers to use the Internet to assemble data from dozens of telescopes. Then, in June, Caltech hosted the first international conference on "astroinformatics." Astronomers are used to working at the limits of human imagination, but even they have a hard time envisioning the kinds of insights they will be able to pull out of the bounteous new databases. "We've built the roads," Djorgovski says. "Now we need some Ferraris to drive on them."

ZOONIVERSE

Data to the People

Hanny's Voorwerp

In 2007, Oxford doctoral candidate Kevin Schawinski, exhausted from classifying 50,000 galaxies in one week, decided to solicit help from the robust community of amateur astronomers, using a technique known as crowdsourcing. The resulting project, Galaxy Zoo, allowed volunteers to classify images from the Sloan Digital Sky Survey on their home computers.

Within 24 hours of its debut, the site was generating 70,000 classifications an hour. An upgraded Galaxy Zoo 2, launched two years later, collected 60 million classifications from tens of thousands of users in 14 months. On the back end, a statistical process called “cleaning clicks” searched for and eliminated the inevitable bogus and mistaken classifications.
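
The article does not spell out how "cleaning clicks" works internally, but one common way to clean crowdsourced labels is to weight each volunteer by how well they agree with the emerging consensus and then re-vote. The sketch below implements that generic idea on made-up data; it is an assumption about the approach, not Galaxy Zoo's actual procedure.

```python
# Toy consensus cleaning for crowdsourced classifications (assumed approach).
# Start from a straight majority vote, score each volunteer by agreement
# with it, then re-vote with those scores as weights so unreliable clicks
# count for less.
from collections import Counter, defaultdict

def clean_clicks(votes, n_rounds=3):
    """votes maps user -> {galaxy_id: label}."""
    weights = {user: 1.0 for user in votes}
    consensus = {}
    for _ in range(n_rounds):
        tallies = defaultdict(Counter)
        for user, labels in votes.items():
            for gal, label in labels.items():
                tallies[gal][label] += weights[user]          # weighted vote
        consensus = {gal: c.most_common(1)[0][0] for gal, c in tallies.items()}
        for user, labels in votes.items():                    # re-score volunteers
            agree = sum(consensus[gal] == label for gal, label in labels.items())
            weights[user] = agree / max(len(labels), 1)
    return consensus, weights

votes = {
    "alice": {"g1": "spiral", "g2": "elliptical"},
    "bob":   {"g1": "spiral", "g2": "spiral"},
    "carol": {"g1": "spiral", "g2": "elliptical"},
}
print(clean_clicks(votes))
```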

The interface was so intuitive that even Galaxy Zoo participant Matthew Graham's 6-year-old could grasp it. "She thought it was a game," he says. But Galaxy Zoo is much more than a toy. It has produced two dozen scientific papers and identified several previously unknown objects, most notably Hanny's Voorwerp, a peculiar intergalactic blob named after the Dutch schoolteacher who spotted it, and a class of hyperactive galaxies dubbed the Green Peas. "Nonexperts end up discovering weird things because they don't know not to ask, 'Hey, what's that over there in the corner?' " says Lucy Fortson, an associate professor at the University of Minnesota and project manager for the Citizen Science Alliance.

Galaxy Zoo has since morphed into the larger Zooniverse, which oversees more than 380,000 volunteers engaged in a variety of astronomical projects. Moon Zoo is attempting to count every crater on the moon. Its volunteers have so far classified more than 1.7 million images from NASA's Lunar Reconnaissance Orbiter. The Milky Way Project scours infrared data from the Spitzer Space Telescope for evidence of gas clouds: Participants use their computers to draw circles on cloud "bubbles" thought to result from shock waves stirred up by extremely bright young stars. Planet Hunters, meanwhile, puts citizen scientists to work analyzing readings from NASA's Kepler space telescope, designed to find Earth-like planets orbiting other stars. Equally if not more important, scientists are using the classifications made by Zooniverse participants to develop more accurate machine-learning algorithms so that computers will be able to do this kind of work in the future. See for yourself: zooniverse.org


Telescopes Without Borders


To learn as much as possible about distant objects, astronomers observe them with telescopes that “see” in various wavelengths. Unfortunately, the resulting data sets are archived in many locations all over the world, which makes them difficult to access; most are also inherently incompatible, so merging them requires a lot of painstaking labor. About 10 years ago, a group of astronomers started talking about creating a unified, global virtual observatory. Like the Internet, the virtual observatory is more a framework than a physical thing—a research environment linking data from a wide array of telescopes and archives and providing the tools to study them.
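
Much of that painstaking merging work boils down to matching the same object across data sets by sky position, which is exactly the kind of plumbing a virtual observatory standardizes. Here is a small sketch of a positional cross-match with astropy; the catalogs and coordinates are invented for illustration.

```python
# Toy positional cross-match between two catalogs with astropy.
# Real VO tools wrap the same idea behind standard formats and protocols.
import numpy as np
import astropy.units as u
from astropy.coordinates import SkyCoord

rng = np.random.default_rng(3)
ra = rng.uniform(0, 10, 1000)
dec = rng.uniform(-5, 5, 1000)
optical = SkyCoord(ra=ra * u.deg, dec=dec * u.deg)          # "optical" catalog

# "Infrared" catalog: the same sources with small positional scatter.
infrared = SkyCoord(ra=(ra + rng.normal(scale=1e-4, size=1000)) * u.deg,
                    dec=(dec + rng.normal(scale=1e-4, size=1000)) * u.deg)

idx, sep2d, _ = optical.match_to_catalog_sky(infrared)      # nearest neighbor on the sky
matched = sep2d < 2 * u.arcsec                              # accept matches within 2 arcseconds
print(f"{matched.sum()} of {len(optical)} optical sources have an infrared counterpart")
```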

In the United States, an experimental version (the National Virtual Observatory) launched in 2002, but the lack of good data-analyzing tools made it difficult to use. "There was no science involved, just plumbing," says Caltech astronomer George Djorgovski, a member of the virtual observatory's science advisory council. "People who wanted to do science, myself included, got impatient and went to work on their own projects. No results to show, nobody wants to use it. Nobody wants to use it, no results to show." The prospects for virtual astronomy improved dramatically last May when NASA and the National Science Foundation kicked in funding of $27.5 million over five years to finally bring the Virtual Astronomical Observatory (VAO) online and continue to develop tools for sharing data with astronomers worldwide.

The VAO will not produce breakthroughs on its own, but it will make them possible. Kirk Borne likens it to the HTTP protocol used to surf the Internet: "The Internet changed the world. But HTTP made it possible." See for yourself: usvao.org


Smile: The Universe in 1 Trillion Dazzling Pixels


Early this year astronomers with the Sloan Digital Sky Survey released the largest color image of the universe ever made, a trillion-pixel set of paired portraits that covers one-third of the night sky. It includes roughly a quarter of a billion galaxies and about the same number of stars within our home galaxy, the Milky Way. The brownish image at far left—dubbed the “orange spider” by one team member—is one of the portraits, covering the Milky Way’s southern hemisphere. Each point in the image represents multiple galaxies.

A dive into the image’s densely packed imagery reveals astonishing detail. The orange box at far left calls out M33, the Triangulum Galaxy, which at 3 million light-years away is one of our closest galactic neighbors. Zooming in shows M33’s spiral form. A further zoom brings into view green, spidery NGC 604, one of the largest nebulas in M33 and home to more than 200 newly formed stars. “Astronomers can use the data we drew on to create this image as a kind of guidepost,” New York University astronomer Michael Blanton says. And so they are: In the first two weeks after the Sloan team made the map available online, researchers queried the data about 60,000 times.


SLOAN DIGITAL SKY SURVEY

Greatest Mapmaker in the Universe


The Sloan Digital Sky Survey (SDSS), launched in 2000, heralded the modern age of big-picture astronomy. For years, scientists who needed a global sense of what was out there relied on one dominant set of photographs—the Palomar Observatory Sky Survey—created in the 1950s. The Sloan Telescope (located at the Apache Point Observatory in New Mexico) retraced much of the Palomar Survey but replaced photographic plates with digital imagery that could be updated and analyzed electronically, anywhere. “Sloan was the single biggest player in converting people to embrace this approach,” says Caltech astronomer George Djorgovski. “Sky surveys became respectable not only because they brought in so much data but because the content of the data was so high that it enabled so many people to do science.”

Sloan scientists have made some spectacular discoveries. In 2000 the project’s researchers spotted the most distant quasar ever observed. But independent astronomers have authored the vast majority of the 2,000-plus scientific papers based on SDSS; they simply use Sloan public data as the basis of their research. In one dramatic example, astronomers at Cambridge University discovered the “Field of Streams,” a spray of stars stretching nearly one-quarter of the way across the sky. They seem to be the shreds of small galaxies that were cannibalized by the Milky Way.

Data mining and other tools of informatics have been particularly helpful in extracting useful information from basic brightness measurements. Such data were thought to be of secondary importance when Sloan began but actually enabled astronomers to identify 100 times as many objects as expected. University of Illinois astronomer Robert Brunner is still reveling in the Sloan’s expanded view of the universe: “Our techniques allow us to start inquiring into the relationship between dark matter and supermassive black holes and how they influence galaxy formation and evolution.” See for yourself: sdss.org


LARGE SYNOPTIC SURVEY TELESCOPE

Movie Camera to the Stars


The Large Synoptic Survey Telescope (LSST), being built atop Cerro Pachón in Chile, is a $450 million megaproject that will truly cement the relationship between astronomy and informatics. It is designed to probe dark energy and dark matter, take a thorough inventory of the solar system, map the Milky Way in unprecedented detail, and generally watch for anything that changes or moves in the sky.

Armed with an 8.4-meter (27-foot) optical telescope and a 3,200-megapixel camera—the world’s largest—the LSST will record as much data in a couple of nights as the Sloan Survey did in eight years. “For the first time, we’re going to have more astronomical objects cataloged in a coherent survey than there are people on Earth,” says Simon Krughoff, a member of the LSST data management team. (For those keeping score at home, experts project 20 billion objects.)

The numbers are so big and daunting that the LSST is the first astronomical project ever to formally incorporate informatics into its design architecture. “I made the case that we needed a group focused on data mining, machine learning, and visualization research to involve not just astronomers but also computer scientists and statisticians,” says Kirk Borne, who chairs the informatics and statistics team. The LSST will image the entire visible sky so rigorously that it will produce, in effect, a 10-year-long feature film of the universe. This should lead to tremendous advances in time-domain astronomy: studying fast-changing phenomena as they occur—black holes being born, supernovas exploding—as well as locating potentially Earth-threatening asteroids and mapping the little-understood population of objects orbiting out beyond Neptune. See for yourself: lsst.org
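
The time-domain idea, repeatedly imaging the same patch of sky and flagging anything that changes, can be illustrated with a toy light-curve check: compare each object's latest brightness measurement against its own history and raise an alert when the change is statistically significant. The data layout and thresholds below are illustrative assumptions, not the LSST alert pipeline.

```python
# Toy time-domain check: flag objects whose latest brightness measurement
# departs significantly from their own history.
import numpy as np

rng = np.random.default_rng(4)
n_obj, n_epochs = 100_000, 20
baseline = rng.uniform(18, 24, size=(n_obj, 1))     # quiescent magnitudes
mags = baseline + rng.normal(scale=0.02, size=(n_obj, n_epochs))
mags[123, -1] -= 2.5                                # inject one supernova-like brightening

err = 0.02                                          # assumed per-epoch uncertainty
deviation = np.abs(mags[:, -1] - np.median(mags[:, :-1], axis=1)) / err
alerts = np.flatnonzero(deviation > 5)              # flag 5-sigma changes
print("transient candidates:", alerts)              # should include object 123
```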