Sunday, August 11, 2019

Open Markets Institute | Commissioner Chopra's Powerful Dissent

After the FTC Fails to Fix Facebook, Commissioner Chopra’s Powerful Dissent Points to a Path Forward
The original article can be found here.
On Wednesday, the Federal Trade Commission announced the details of its $5 billion fine and settlement with Facebook over charges that it violated a 2011 consent decree with the enforcement agency. In addition to the fine, the FTC and Facebook agreed that Facebook would establish a privacy committee within its board of directors to review all privacy issues and choices along with an “independent assessor.” Other conditions of the settlement include transparency and security measures around gathering, storing, and sharing user information, as well as requiring Facebook Chairman and CEO Mark Zuckerberg to certify Facebook’s ongoing compliance with the FTC’s order. Facebook also announced in its second quarter earnings call that the FTC had opened an antitrust investigation into the social media giant. 
Enforcers and policymakers should pay particular attention to FTC Commissioner Rohit Chopra’s powerful dissenting statement. Chopra condemned the settlement, writing, “This framework does not protect the public — it protects Facebook” and “ratifies Facebook’s governance structure instead of changing it.” But Chopra’s statement also lays out where policymakers and law enforcers should go in the future — which is to directly address the dependence of Facebook’s business model on “surveillance and manipulation.” 
Federal Trade Commission Commissioner Rohit Chopra penned a forceful dissenting statement. 
“The case against Facebook is about more than just privacy – it is also about the power to control and manipulate,” Chopra wrote. “Global regulators and policymakers need to confront the dangers associated with mass surveillance and the resulting ability to control and influence us. The behavioral advertising business incentives of technology platforms spur practices that are dividing our society. The harm from this conduct is immeasurable, and regulators and policymakers must confront it.”
Facebook being, after all, a private for-profit corporation, Chopra added, “We should reasonably assume it seeks to advance its own financial gains. Here, Facebook’s behavioral advertising business model is both the company’s profit engine and arguably the root cause of its widespread and systemic problems. Behavioral advertising generates profits by turning users into products, their activity into assets, their communities into targets, and social media platforms into weapons of mass manipulation. We need to recognize the dangerous threat that this business model can pose to our democracy and economy.”
Read Chopra’s full dissent here and Open Markets’ statements on the settlement here and here.
Also see the letter from Sens. Edward J. Markey, D-Mass., Josh Hawley, R-Mo., and Richard Blumenthal, D-Conn., calling the settlement “woefully inadequate.” And read reactions to the initial reports of the FTC’s $5 billion fine from Sens. Amy Klobuchar, D-Minn., Mark Warner, D-Va., and Elizabeth Warren, D-Mass., and Reps. David Cicilline, D-R.I., and Jan Schakowsky, D-Ill.
See coverage of the FTC’s settlement quoting the Open Markets Institute from PBS, HuffPost, The Hill, and CNN Business.
Barry C. Lynn, Open Markets Institute

Monday, July 22, 2019

Carl Sagan on the gift of Apollo

Today, we are celebrating one of the greatest days in human history: the day we set foot on the surface of the Moon. To celebrate with you, I wanted to share some wise words from my old astronomy professor, Carl Sagan. He contributed the following article in 1994 while serving as President of The Planetary Society. It's a great reflection on the past, with a new perspective to take with us into the future:
"The gates of Heaven are open wide; off I ride..."
Ch'u Tz'u (China, ca. 3rd century B.C.E.)
It's a sultry night in July. You've fallen asleep in the armchair. Abruptly, you startle awake, disoriented. The television set is on, but not the sound. You strain to understand what you're seeing. Two ghostly white figures in coveralls and helmets are softly dancing under a pitch-black sky. They make strange little skipping motions, which propel them upward amid barely perceptible clouds of dust. But something is wrong. They take too long to come down. Encumbered as they are, they seem to be flying—a little. You rub your eyes, but the dreamlike tableau persists.
Of all the events surrounding Apollo 11's landing on the Moon on July 20, 1969, my most vivid recollection is its unreal quality. Yes, it was an astonishing technological achievement and a triumph for the United States. Yes, the astronauts—Neil Armstrong, Buzz Aldrin and Mike Collins, the last keeping solitary vigil in lunar orbit—displayed death-defying courage. Yes, as Armstrong said as he first alighted, this was a historic step for the human species. But if you turned off the byplay between Mission Control and the Sea of Tranquility, with its deliberately mundane and routine chatter, and stared into that black-and-white television monitor, you could glimpse that we humans had entered the realm of myth and legend.
We knew the Moon from our earliest days. It was there when our ancestors descended from the trees into the savannahs, when we learned to walk upright, when we first devised stone tools, when we domesticated fire, when we invented agriculture and built cities and set out to subdue the Earth. Folklore and popular songs celebrate a mysterious connection between the Moon and love. Especially when we lived out-of-doors, it was a major—if oddly intangible—presence in our lives.
The Moon was a metaphor for the unattainable: "You might as well ask for the Moon," they used to say. For most of our history, we had no idea what it was. A spirit? A god? A thing? It didn't look like something big far away, but more like something small nearby—something the size of a plate, maybe, hanging in the sky a little above our heads. Walking on the Moon would have seemed a screwball idea; it made more sense to imagine somehow climbing up into the sky on a ladder or on the back of a giant bird, grabbing the Moon and bringing it down to Earth. Nobody ever succeeded, although there were myths aplenty about heroes who had tried.
Not until a few centuries ago did the idea of the Moon as a place, a quarter million miles away, gain wide currency. And in that brief flicker of time, we've gone from the earliest steps in understanding the Moon's nature to walking and joyriding on its surface. We calculated how objects move in space; liquefied oxygen from the air; invented big rockets, telemetry, reliable electronics, inertial guidance and much else. Then we sailed out into the sky.
The Moon is no longer unattainable. A dozen humans, all Americans, have made those odd bounding motions they called "moonwalks" on the crunchy, cratered, ancient gray lava—beginning on that July day in 1969. But since 1972, no one from any nation has ventured back. Indeed, none of us has gone anywhere since the glory days of Apollo except into low Earth orbit—like a toddler who takes a few tentative steps outward and then, breathless, retreats to the safety of his mother's skirts.
Once upon a time, we soared into the solar system. For a few years. Then we hurried back. Why? What happened? What was Apollo really about?
The scope and audacity of John Kennedy's May 25, 1961, message to a joint session of Congress on "Urgent National Needs"—the speech that launched the Apollo program—dazzled me. We would use rockets not yet designed and alloys not yet conceived, navigation and docking schemes not yet devised, in order to send a man to an unknown world—a world not yet explored, not even in a preliminary way, not even by robots—and we would bring him safely back, and we would do it before the decade was over. This confident pronouncement was made before any American had even achieved Earth orbit.
As a newly minted PhD, I actually thought all this had something centrally to do with science. But President Kennedy did not talk about discovering the origin of the Moon, or even about bringing samples of it back for study. All he seemed to be interested in was sending someone there and bringing him home. It was a kind of gesture. Kennedy's science advisor, Jerome Wiesner, later told me he had made a deal with the president: if Kennedy would not claim that Apollo was about science, then he, Wiesner, would support it. So if not science, what?
The Apollo program is really about politics, others told me. This sounded more promising. Nonaligned nations would be tempted to drift toward the Soviet Union if it was ahead in space exploration, if the U.S. showed insufficient "national vigor." I didn't follow. Here was the United States, ahead of the Soviet Union in virtually every area of technology—the world's economic, military and, on occasion, even moral leader—and Indonesia would go Communist because Yuri Gagarin beat John Glenn to Earth orbit? What's so special about space technology? Suddenly I understood.
Sending people to orbit the Earth or robots to orbit the Sun requires rockets—big, reliable, powerful rockets. Those same rockets can be used for nuclear war. The same technology that transports a man to the Moon can carry nuclear warheads halfway around the world. The same technology that puts an astronomer and a telescope in Earth orbit can also put up a laser "battle station."
Even back then, there was fanciful talk in military circles, East and West, about space as the new "high ground," about the nation that "controlled" space "controlling" the Earth. Of course strategic rockets were already being tested on Earth. But heaving a ballistic missile with a dummy warhead into a target zone in the middle of the Pacific Ocean doesn't buy much glory. Sending people into space captures the attention and imagination of the world. You wouldn't spend the money to launch astronauts for this reason alone, but of all the ways of demonstrating rocket potency, this one works best. It was a rite of national manhood; the shape of the boosters made this point readily understood without anyone actually having to explain it. The communication seemed to be transmitted from unconscious mind to unconscious mind without the higher mental faculties catching a whiff of what was going on.
When President Kennedy formulated the Apollo program, the Defense Department had a slew of space projects under development—ways of carrying military personnel up into space, ways of conveying them around the Earth, robot weapons on orbiting platforms intended to shoot down satellites and ballistic missiles of other nations. Apollo supplanted these programs. They never reached operational status. A case can be made then that Apollo served another purpose—to move the US-Soviet space competition from a military to a civilian arena. There are some who believe that Kennedy intended Apollo as a substitute for an arms race in space. Maybe.
Six more missions followed Apollo 11, all but one of which successfully landed on the lunar surface. Apollo 17 was the first to carry a scientist. As soon as he got there, the program was canceled. The first scientist and the last human to land on the Moon were the same person. The program had already served its purpose that July night in 1969. The half-dozen subsequent missions were just momentum.
Apollo was not mainly about science. It was not even mainly about space. Apollo was about ideological confrontation and nuclear war—often described by such euphemisms as world "leadership" and national "prestige." Nevertheless, good space science was done. We now know much more about the composition, age and history of the Moon and the origin of the lunar landforms. We have made progress in understanding where the Moon came from. Some of us have used lunar cratering statistics to better understand the Earth at the time of the origin of life. But more important than any of this, Apollo provided an aegis, an umbrella under which brilliantly engineered robot spacecraft were dispatched throughout the solar system, making that preliminary reconnaissance of dozens of new worlds. The offspring of Apollo have now reached the planetary frontiers.
If not for Apollo—and, therefore, if not for the political purpose it served—I doubt whether the historic American expeditions of exploration and discovery throughout the solar system would have occurred. The Mariners, Vikings, Voyagers, Magellan, Galileo and Cassini are among the gifts of Apollo. Something similar is true for the pioneering Soviet efforts in solar system exploration, including the first soft landings of robot spacecraft—Luna 9, Mars 3, Venera 8—on other worlds.
Apollo conveyed a confidence, energy and breadth of vision that did capture the imagination of the world. That too was part of its purpose. It inspired an optimism about technology, an enthusiasm for the future. If we could go to the Moon, what else was now possible? Even those who were not admirers of the United States readily acknowledged that—whatever the underlying reason for the program—the nation had, with Apollo, achieved greatness.
When you pack your bags for a big trip, you never know what's in store for you. The Apollo astronauts on their way to and from the Moon photographed their home planet. It was a natural thing to do, but it had consequences that few foresaw. For the first time, the inhabitants of Earth could see their world from above—the whole Earth, Earth in color, Earth as an exquisite spinning white and blue ball set against the vast darkness of space. Those images helped awaken our slumbering planetary consciousness. They provide incontestable evidence that we all share the same vulnerable planet. They remind us of what is important and what is not.
We may have found that perspective just in time, just as our technology threatens the habitability of our world. Whatever the reason we first mustered the Apollo program, however mired it was in Cold War nationalism and the instruments of death, the inescapable recognition of the unity and fragility of Earth is its clear and luminous dividend, the unexpected final gift of Apollo. What began in deadly competition has helped us to see that global cooperation is the essential precondition for our survival.
Travel is broadening.
It's time to hit the road again.
- Carl Sagan
Founder and First President of The Planetary Society
This article was adapted from a chapter of Carl Sagan's book, Pale Blue Dot: A Vision of the Human Future in Space. It was originally featured in the May/June 1994 issue of the Planetary Society member magazine, The Planetary Report.

Saturday, July 06, 2013

Quote of the Day...

Will McAvoy (WillMcAvoyACN): "Belief in a cruel God makes a cruel man." -- Thomas Paine (Sent via Seesmic)

Monday, February 18, 2013

Big Box Implosion

Kunstler holds forth on the coming collapse of the "big box" marketing model, albeit using some shaky assumptions along the way. Bottom line: chain stores go away and the local economy returns.

Scale Implosion - Clusterfuck Nation

Monday, January 14, 2013

Climate Change Set to Make America Hotter, Drier and More Disaster-prone | Alternet

Published on Alternet
The Guardian / By Suzanne Goldenberg

New Report Outlines Our Future: Climate Change Set to Make America Hotter, Drier and More Disaster-prone

January 14, 2013
Future generations of Americans can expect to spend 25 days a year sweltering in temperatures above 100F (38C), with climate change on course to turn the country into a hotter, drier, and more disaster-prone place.

The National Climate Assessment, released in draft form on Friday, provided the fullest picture to date of the real-time effects of climate change on US life, and the most likely consequences for the future.

The 1,000-page report, the work of more than 300 government scientists and outside experts, was unequivocal on the human causes of climate change, and on the links between climate change and extreme weather.

"Climate change is already affecting the American people," the draft report said. "Certain types of weather events have become more frequent and/or intense, including heat waves, heavy downpours and, in some regions, floods and drought. Sea level is rising, oceans are becoming more acidic, and glaciers and Arctic sea ice are melting."

The report, which is not due for adoption until 2014, was produced to guide federal, state and city governments in America in making long-term plans.

By the end of the 21st century, climate change is expected to result in increased risk of asthma and other public health emergencies, widespread power blackouts, mass transit shutdowns, and possibly shortages of food.

"Proactively preparing for climate change can reduce impacts, while also facilitating a more rapid and efficient response to changes as they happen," said Katharine Jacobs, the director of the National Climate Assessment.

The report will be open for public comment on Monday.

Environmental groups said they hoped the report would provide Barack Obama with the scientific evidence to push for measures that would slow or halt the rate of climate change – sparing the country some of the worst effects.

The report states clearly that the steps taken by Obama so far to reduce emissions are not close to sufficient to prevent the most severe consequences of climate change.

"As climate change and its impacts are becoming more prevalent, Americans face choices," the report said. Beyond the next few decades, the amount of climate change will still largely be determined by the choices society makes about emissions. Lower emissions mean less future warming and less severe impacts. Higher emissions would mean more warming and more severe impacts.

As the report made clear, no place in America had gone untouched by climate change, and nowhere would be entirely immune from the effects of future climate change.

A heatwave swept across the US in 2011, with temperatures reaching over 110F (43C). Photograph: Timothy A Clary/AFP

Some of those changes are already evident: 2012 was by far the hottest year on record, fully a degree hotter than the last such record – an off-the-charts rate of increase.

Those high temperatures were on course to continue for the rest of the century, the draft report said. It noted that average US temperatures had increased by about 1.5F since 1895, with more than 80% of this increase since 1980.

The rise will be even steeper in future, with temperatures projected to be 2 to 4 degrees warmer in most areas within the next few decades. By 2100, if climate change continues on its present course, the country can expect to see 25 days a year with temperatures above 100F.

Night-time temperatures will also stay high, providing little respite from the heat.

Certain regions are projected to heat up even sooner. West Virginia, Maryland and Delaware can expect a doubling of days hotter than 95 degrees by the 2050s. In Texas and Oklahoma, the draft report projected a doubling of the probability of extreme heat events.

Those extreme temperatures would also exact a toll on public health, with worsening air pollution, and on infrastructure, increasing the load on ageing power plants.

This 8 November 2011 image shows a storm bearing down on Alaska. Photograph: Ho/AFP/Getty Images

But nowhere will see changes as extreme as Alaska, the report said.

"The most dramatic evidence is in Alaska, where average temperatures have increased more than twice as fast as the rest of the country," the draft report said. Of all the climate-related changes in the US, the rapid decline of Arctic sea ice cover in the last decade may be the most striking of all.

Other regions will face different extreme weather scenarios. The north-east, in particular, is at risk of coastal flooding because of sea-level rise and storm surges, as well as river flooding, because of an increase in heavy downpours.

A flooded farm along the Mississippi River is seen in Cairo, Illinois. Photograph: Stephen Lance Dennee/AP

"The north-east has experienced a greater increase in extreme precipitation over the past few decades than any other region in the US," the report said. Between 1958 and 2010, the north-east saw a 74% increase in heavy downpours.

The midwest was projected to enjoy a longer growing season – but also an increased risk of extreme events like last year's drought. By mid-century, the combination of temperature increases and heavy rainfall or drought was expected to pull down yields of major US food crops, the report warned, threatening both American and global food security.

The report is the most ambitious scientific exercise ever undertaken to catalogue the real-time effects of climate change, and predict possible outcomes in the future.

It involved more than 300 government scientists and outside experts, compared to around 30 during the last such effort when George W Bush was president. Its findings were also much broader in scope, Jacobs said.

There were still unknowns though, the report conceded, especially about how the loss of ice from Greenland and Antarctica will affect future sea-level rise.

Campaign groups said they hoped the report would spur Obama to act on climate change in his second term. "The draft assessment offers a perfect opportunity for President Obama at the outset of his second term," said Lou Leonard, director of the climate change programme for the World Wildlife Fund. "When a similar report was released in 2009, the Administration largely swept it under the rug. This time, the President should use it to kick-start a national conversation on climate change."

However, the White House was exceedingly cautious on the draft release, noting in a blogpost: "The draft NCA is a scientific document—not a policy document—and does not make recommendations regarding actions that might be taken in response to climate change."


Wednesday, September 26, 2012

Alzheimer's: Diabetes of the Brain? | Suzanne de la Monte

By Dr. Suzanne de la Monte
Alpert Medical School, Brown University
Neuropathologist, Rhode Island Hospital

Although we’ve always known that Alzheimer’s disease is typically associated with numerous tangles and plaque in the brain, the exact cause of these abnormalities has been hard to pin down. Now, we may be closer to an answer.


In many respects, Alzheimer’s is a brain form of diabetes. Even in the earliest stages of disease, the brain’s ability to metabolize sugar is reduced. Normally, insulin plays a big role in helping the brain take up sugar from the blood. But, in Alzheimer’s, insulin is not very effective in the brain. Consequently, the brain cells practically starve to death.


How is that like diabetes?

These days, most people with diabetes have Type 2 diabetes mellitus. Basically, cells throughout the body become resistant to insulin signals. In an effort to encourage cells to take up more sugar from the blood, the pancreas increases its output of insulin. Imagine having to knock louder on a door to make the person inside open up and answer. The high levels of insulin can damage small blood vessels in the brain and eventually lead to poor brain circulation. This problem could partly explain why Type 2 diabetes harms the brain. In Alzheimer’s, the brain, especially the parts that deal with memory and personality, becomes resistant to insulin.


Why does the brain need insulin?

As in most organs, insulin stimulates brain cells to take up glucose or sugar, and metabolize it to make energy. Insulin also is very important for making chemicals known as neurotransmitters, which are needed for neurons to communicate with each other. Insulin also stimulates many functions that are needed to form new memories and conquer tasks that require learning and memory.


Where does the insulin come from in the brain?

Very sensitive tests showed that insulin is made in the brain. It’s made in neurons, and the hormone made in the brain is the same as that produced in the pancreas. This point may seem surprising, but if you consider the fact that every other gut hormone is also made in the brain, it only makes sense that insulin would be among them. Insulin that’s made by the pancreas and present in blood does get into the brain as well.

Are people with diabetes more likely to get Alzheimer’s?

Absolutely. Their risk is doubled, at least. Obesity also increases the risk of cognitive impairment, or mental decline. This doesn’t mean that everyone who has diabetes will develop Alzheimer’s or that all people with Alzheimer’s have diabetes. The important thing to recognize is that there is considerable overlap between Alzheimer’s and diabetes.

I’ve never heard that. Is this idea new?

In reality, before about 1980, there was very little overlap between Alzheimer’s and diabetes. In fact, up until then, deaths from diabetes were declining in the United States, probably because of improvements in medical treatment. But between 1980 and now, deaths from Alzheimer’s and diabetes have skyrocketed at alarming rates. The diabetes story is especially frightening because everyone agrees that today we have much better medical treatments for diabetes than we did in the 1960s and 1970s – so why should the death rates be so high now?


Maybe people are just living longer. Isn’t that the case?

People are living longer, but more important, they are surviving with various diseases that used to be fatal. On the surface, this argument might explain the increasing death rate trends for diabetes and Alzheimer’s. But, closer examination of the data demonstrated something entirely different and, in fact, surprising.


We compared the Alzheimer’s death rates in 1980 to those in 2005, but instead of looking at the entire population as a single group, we examined the death rates according to age group. We looked at Alzheimer’s death rates in people between 45 and 54 years old, 55 and 64, 65 and 74, and so on. We found that within every single age group, the Alzheimer’s death rate was much higher in 2005 than it was in 1980. In other words, deaths from Alzheimer’s were considerably higher for 60-year-olds in 2005 than they were in 1980. Worse yet, the Alzheimer’s death rates have continued to climb, year by year, ever since. Diabetes death rates increased sharply within each age group, just as they did for Alzheimer’s.


Most people think Alzheimer’s is caused by a gene problem.

Alzheimer’s disease occurrences are not strictly genetic. In fact, the vast majority of Alzheimer’s occurs sporadically.

If it’s not genetic, what else could be the cause of Alzheimer’s?

Truly genetic diseases do not change over a 30-year period. That interval is too short to affect rates of genetic diseases that arise only in middle-aged or elderly people; the human cycle of breeding, growth, development and aging is much longer than 30 years. In contrast, diseases like HIV/AIDS and lung cancer are clearly exposure-related, so their mortality rates can change within a short period if exposure to the disease-causing agents is reduced.


Could diabetes and Alzheimer’s be caused by some types of exposures?

We have reasonable evidence that human exposure to nitrosamines is at the root of not only Alzheimer’s, but several other insulin-resistance diseases, including Type 2 diabetes, fatty liver disease (also known as NASH), and visceral obesity. 


The elimination of local farms in favor of mega-farms requires transport of food for long distances. To prolong shelf-life, preservatives are added. The problem is worsened with transport of “fresh” foods from across the Pacific Ocean. Nitrites are added to meats and processed foods for flavor and coloring. High levels of nitrates added to fertilizers can be incorporated into produce and then converted to nitrites and finally nitrosamines in the body.


Nitrosamines contaminate many processed foods, including fish, cheeses, hotdogs, ground beef, smoked meats like bacon, smoked turkey and ham, and beer. Originally, nitrites were added to food as preservatives to prevent salmonella infection from contaminated meat. The policy remains in place. Although efforts have been made to reduce the levels, nitrites are still added as preservatives. Over time, Western societies, particularly in the US, have been chronically exposed to increasing amounts of nitrosamines through continuous consumption of processed foods.


Nitrosamines are well-recognized cancer-causing agents. In high doses, they cause cancers in many organs. One of the main toxins in tobacco is a nitrosamine. However, low chronic exposures have cumulative effects. 


Years ago, a few scientists suggested that nitrosamines might cause diabetes. The concept was not pursued until now. We performed experiments in the laboratory and showed that very low, limited exposures to nitrosamines (the type found in food) cause Alzheimer’s-type brain degeneration, dementia, diabetes, fatty liver disease and obesity. Adding high fat to the diet made the disease-causing effects of nitrosamines much worse.

How were these findings reached?

We were working on the idea that insulin resistance in the brain was an important cause of disease and injected another drug into the brain to see what would happen. Instead of getting what we were looking for, we found Alzheimer’s. Very soon after that, I realized that the drug I used was a nitrosamine. A bell went off in my head and suddenly I understood the problem.  All of the major diseases related to insulin resistance, which are now epidemic in the United States, could be caused by exposure to low doses of nitrosamines over a period of years.


How can I reduce my risk?

For now, the main message is to stop getting exposed. There are small steps and larger ones. Protect yourself by looking for sodium nitrite on food labels. Avoid processed foods. Eat organically grown foods. Push policies to return farming back to local environments to gain control over how food is produced and eliminate requirements for toxic preservatives. Educate children and provide only healthful food choices. Learn to cook and teach cooking in public schools. Pack a healthful lunch the night before for easy grab-and-go in the morning.



Tuesday, September 11, 2012

Thomas Jefferson, science enthusiast | Guardian News

Whatever flawed versions of Thomas Jefferson are peddled by the American right, we know he loved his science.

If you want to enter an alternative reality, all you need to do is type words like Jefferson, religion and history into Google. The American right wing's attitude to some aspects of science is deeply troublesome, but so too is their rewriting of their national history. The Jefferson Lies: Exposing the Myths You've Always Believed About Thomas Jefferson, by David Barton, is a case in point. It is endorsed by Glenn Beck despite enormous criticism from historians and publishers.

In July, readers of History News Network voted it the Least Credible History Book in Print for its distortion of history and of Jefferson's views. The particular issues are identified around religion, slavery and the relationship between church and state, which Barton presents among seven lies told about the third president of the United States.

Barton has little to say about Jefferson's intense interest in science. He would have done if Jefferson had lived a couple of generations later, as the statesman then might have accepted the geological evidence of the Earth's age (which he was not inclined to do in the first decade of the century, when there was much dispute among geologists) and Darwin's theory of evolution, and Barton might have had an eighth lie to deal with.

But there is little in established 18th and early 19th century science that the Tea Party would feel the need to reject. This is a reminder of the fact that in Jefferson's time there was no perception of a war between science and religion and, indeed, that the American right do not necessarily have a blanket anti-science approach, but theological, political and ideological issues with particular fields.

However, where Barton does bring up science, he goes rather wrong. The main passage focuses on this Jefferson quote:

Bacon, Locke and Newton, I consider them as the three greatest men that have ever lived, without any exception, and as having laid the foundation of those superstructures which have been raised in the Physical and Moral Sciences.

Quite rightly, of course, Barton can point to the religiosity of these heroes of science, but he glosses over Newton's unorthodoxy, denies Locke's and presents this quote as part of his argument against the lie that Jefferson promoted secular education. This is quite bizarre, turning a blind eye to Locke's advocacy of religious tolerance and the separation of church and state. A quick read of Locke's A Letter Concerning Toleration would put him right.

An interest in science and advocacy of secularism in public life were, and are, by no means necessary bedfellows. Likewise, the interest of leaders and politicians of all stripes in many or most aspects of science and technology, which underpin national and military success in so many areas, goes without saying. Yet Jefferson's interest in science was part of his personal identity in a way that it is hard to imagine the likes of Glenn Beck celebrating.

This summer I visited the American Philosophical Society (APS) in Philadelphia, of which Jefferson was a key early member, to do some research into the Lewis and Clark Expedition across the American continent in 1804-06, a scientific and imperialistic venture that was Jefferson's pet project.

What I found fascinating in reading about this expedition was not just Jefferson's support for a prestige national project, but his close involvement in the scientific training of Meriwether Lewis, his secretary, in preparation, and his ready input to discussions about instrumentation. Jefferson was, according to an article on the instruments of the expedition, inordinately fond of an equatorial theodolite he owned, made by the London instrument-maker Jesse Ramsden, and thought the expedition should take something similar.

At the APS I dipped into some of Jefferson's correspondence with Robert Patterson, professor of mathematics at the University of Pennsylvania in Philadelphia. More than once Jefferson wrote to thank Patterson for copies of the Nautical Almanac (the small books of astronomical tables for navigation published by the British Board of Longitude), as well as for other scientific tracts and for advice on buying and repairing instruments.

On 21 March 1811 he added:

before I entered on the business of the world I was much attached to Astronomy & had laid a sufficient foundation at College to have pursued it with satisfaction and advantage. but after 40 years of abstraction from it, and my mathematical acquirement coated over with rust, I find myself equal only to such simpler operations & practices in it as serve to amuse me. but they give me great amusement, and the more as I have some excellent instruments...

I don't suppose that there is anything here that would particularly challenge the Tea Partyers. It is not climate science or evolution, but an enthusiasm for tracking Jupiter's satellites. In any case, Barton's claims have already been thoroughly taken down by historians. And yet, throwing up an image of a Founding Father who enjoyed tinkering with precision instruments, perusing astronomical tables and corresponding with university professors seems as good a response as any to some of the painfully bad history being produced.

© 2012 Guardian News and Media Limited or its affiliated companies. All rights reserved.