
Smart Growth: Destroying Housing Opportunity from California to Australia (and Beyond)

Two recent stories provide further evidence of the extent to which urban containment policies (also called smart growth, growth management, compact city policy, livability and urban consolidation) raise house prices relative to incomes, thereby reducing housing affordability. Because housing represents the largest element of household budgets (not transportation, as a US government website implies), urban containment policy reduces discretionary income — the money households have left over after taxes and paying for necessities. This leads to a lower standard of living and more poverty, and it violates the fundamental purposes of urban planning, described by former World Bank principal planner Alain Bertaud as follows:

“Increasing mobility and affordability are the two main objectives of urban planning. These two objectives are directly related to the overall goal of maximizing the size of a city’s labor market, and therefore, its economic prosperity.”

Two recent stories describe the effects of urban containment policy on the standard of living:

The Economist and Urban Containment “Fat Cats”

“Free Exchange” in The Economist came down strongly on the side of economics in a review of housing affordability.

According to The Economist, the unusually high cost of housing in San Francisco (and other places) is principally the result of tight land use regulation, which makes it expensive or impossible to build. If “local regulations did not do much to discourage creation of new housing supply, then the market for San Francisco would be pretty competitive.” Add to that Vancouver, Sydney, Melbourne, Toronto, Portland and a host of additional metropolitan areas, where urban containment policy has driven house prices well above the 3.0 median multiple indicated by historic market fundamentals.

The Economist explains the issue in greater detail: “We therefore get highly restrictive building regulations. Tight supply limits mean that the gap between the marginal cost of a unit of San Francisco and the value to the marginal resident of San Francisco (and the market price of the unit) is enormous. That difference is pocketed by the rent-seeking NIMBYs of San Francisco. However altruistic they perceive their mission to be, the result is similar to what you’d get if fat cat industrialists lobbied the government to drive their competition out of business.” (Our emphasis).

Of course, urban planning interests have long denied that rationing land is associated with higher housing prices (read: greater poverty and a lower standard of living). Nonetheless, urban containment policies not only drive up the price of land, but do so even as they reduce the amount of land used for each new residence, driving prices per square foot of land up as well.

The Economist notes that unless the direction is changed, housing policy will continue to be “an instrument of oligarchy. Who knows. But however one imagines this playing out, we should be clear about what is happening, and what its effects have been.”

Land Prices Skyrocket as Residential Lot Sizes Fall in Australia

The extent to which smart growth policy (urban containment policy or urban consolidation policy) is associated with higher land (and house) prices is illustrated by a recent press release from RP Data in Australia. The analysis examined vacant building lot prices for the period from 1993 to 2013.

During that period, the median price of a vacant lot rose 168 percent after adjustment for inflation. This is nearly five times the increase in the median household incomes of the seven largest capital cities (Sydney, Melbourne, Brisbane, Perth, Adelaide and Canberra among them).

But it gets worse. Over the same period, the median lot size fell nearly 30 percent. Smaller lots did not mean cheaper lots: this should put paid to the myth that urban containment reduces lot prices as it reduces their sizes, since the combination implies that the real price per unit of land area rose even faster than the headline lot price (see the sketch below). The same dynamic has been observed in the United States.
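For readers who want to check the arithmetic, here is a minimal sketch in Python, using only the two RP Data percentages cited above, of what rising lot prices combined with shrinking lot sizes imply for the price of land per square metre:

    # Per-area arithmetic implied by the RP Data figures cited above:
    # a 168% real rise in the median lot price alongside a roughly 30%
    # fall in the median lot size over 1993-2013.
    price_factor = 1 + 1.68   # median lot price grew to 2.68x its 1993 level (real terms)
    size_factor = 1 - 0.30    # median lot size shrank to 0.70x its 1993 level

    per_area_factor = price_factor / size_factor
    print(f"Real land price per square metre rose about {per_area_factor:.2f}x "
          f"(+{per_area_factor - 1:.0%})")
    # -> about 3.83x, a real increase of roughly 280 percent per square metre

This treats the two medians as if they described the same typical lot, which is a simplification; the sketch is meant only to illustrate the direction and rough magnitude of the effect, not to reproduce RP Data's methodology.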

Australia has been plagued by huge increases in house costs relative to incomes in association with urban containment policy. Before the adoption of urban containment policy, it was typical for house prices to average three times household income or less. Now, Sydney has the highest median multiple (median house price divided by median household income) of any major metropolitan area in the New World except Vancouver and San Francisco. Melbourne, the second largest metropolitan area in Australia, has a median multiple of 8.4, making it the fifth most costly in the New World, behind San Jose. All of Australia’s major metropolitan areas are “severely unaffordable,” including slow-growing Adelaide (6.3), as well as most smaller areas.
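The median multiple itself is simple division. A minimal sketch in Python, using hypothetical price and income figures (not drawn from the survey) chosen only to reproduce the 8.4 multiple cited for Melbourne:

    # Demographia-style median multiple:
    # median house price divided by median household income.
    def median_multiple(median_house_price, median_household_income):
        return median_house_price / median_household_income

    # Hypothetical figures, chosen only to illustrate the calculation;
    # they reproduce the 8.4 multiple cited for Melbourne above.
    print(median_multiple(630_000, 75_000))   # -> 8.4

By the historic norm cited above, a multiple of 3.0 or less indicates an affordable market; Demographia rates markets at 5.1 and above as “severely unaffordable.”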

Getting Priorities Right

Research on these impacts led London School of Economics professor Paul Cheshire to conclude that urban containment policy is irreconcilable with housing affordability. This means that urban containment policy is irreconcilable with a better economic future for households, including those in poverty.

Urban containment policy is largely driven by a particular vision of urban form and by a manifestly wrongheaded belief that rationing land and limiting mobility can contribute materially to reducing greenhouse gas emissions. The real issue is neither urban design nor the expensive and ineffective strategies of urban containment. People are more important — their standard of living and reducing the number living in poverty. There is a compelling need to reorient urban policy in this direction (see Toward More Prosperous Cities).

—–

For a complete listing of median multiples by major metropolitan area, see the 10th Annual Demographia International Housing Affordability Survey.

Additional information on the RP Data research is available at Australian Property Through Foreign Eyes.

—–

Note: This article is adapted from contributions by the author to newgeography.com.

Is the EPA Playing Politics?

According to reports, the Environmental Protection Agency (EPA) may have held back on the publication of a new energy regulation in order to protect Democrats in the 2014 midterm elections.

In June 2013, President Obama asked the EPA to issue rules regulating carbon dioxide emissions from power plants, and the EPA proposed such standards on September 20, 2013.

But this is where the agency started to deviate from normal procedure.

Typically, rulemaking goes like this: The EPA proposes a new rule (usually, it announces this on its website for the public to see) and then — generally within five days — it submits the rule to the Federal Register for publication. Once published in the Federal Register, the public has a limited number of days to comment on the proposal. Notably, the Clean Air Act requires the EPA to finalize emissions rules for new power plants within one year of publication in the Federal Register.

But the September emissions rule was not sent to the Federal Register within five days. It was not even sent there a month later. Instead, a full 66 days after the rule was proposed, the EPA finally sent the proposal to the Federal Register for publication, on November 25, 2013.
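A quick date calculation, sketched here in Python using only the dates reported in this post, confirms the 66-day gap and shows how the publication date drives the statutory deadline discussed below:

    from datetime import date, timedelta

    proposed = date(2013, 9, 20)     # rule proposed on the EPA website
    submitted = date(2013, 11, 25)   # rule finally sent to the Federal Register
    print((submitted - proposed).days)   # -> 66 days, versus the usual ~5

    # The Clean Air Act's one-year finalization clock runs from publication;
    # approximating a year as 365 days:
    published = date(2014, 1, 8)     # publication date the EPA ultimately requested
    print(published + timedelta(days=365))   # -> 2015-01-08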

What does this have to do with the 2014 midterms?

Because publication starts the one-year clock for finalizing the rule, pushing the publication date to January 2014 meant that the controversial rule would not have to become final until January 2015.

But had the EPA followed protocol and submitted the rule for publication in September, within the usual one-to-five-day window, the rule would likely have been finalized just before the November 2014 midterm elections. Senator James Inhofe (R-OK) made exactly this point in a letter to EPA Administrator Gina McCarthy:

The costs of the President’s [greenhouse gas] regulations are going to be enormous with far-reaching and irreparable impacts on our electricity generation capacity, affordability and reliability. With this in mind, it makes sense that the American public would react negatively to the finalization of this first round of [greenhouse gas] regulations…This makes the timing of your proposal very important. If the rule was finalized by September 20, 2014, the American people would have about six weeks to consider the negative impact of the rule on the economy prior to going to the polls. In addition to this, my colleagues and I would have been able to force a vote on a resolution of disapproval against the final rule…This possibility of electioneering is deeply troubling.

Had the EPA submitted the rule in a timely fashion, lawmakers could have been forced to take a stand on it prior to the November elections.

The EPA has blamed the delay on consistency needs and “formatting” (it is not entirely clear what would make this rule so unique that it would require two months of formatting…), as well as the government shutdown. But the shutdown did not start until October 1, seven working days after the rule was proposed. As Sen. Inhofe said:

If the EPA had followed…protocol, the [New Source Performance Standards] rule would have been submitted to the Federal Register’s office two full working days before the shutdown.

Perhaps most damning of all, Administrator McCarthy, testifying before the Senate Environment and Public Works Committee, said, “I will assure you that as soon as that proposal was released, we had submitted it to the Federal Register office.” She went on: “The delay was solely the backup in the Federal Register office.”

The Office of the Federal Register disagrees. Federal Register Director Charles Barth said that his office did not receive the EPA’s proposal until November 25. Moreover, once his office did receive it and scheduled it for publication on December 30, the EPA requested that the publication date be pushed back even further, to January 8, 2014!

Politico reports that the EPA says it pushed for the delay because it did not want to release the rule during the holidays. So the government wanted to wait and release controversial news when the public was actually paying attention? That would be a first.

Government Subsidies Help Distort Science

A $500,000 study released this past Sunday in the peer-reviewed journal Nature Climate Change detailed how corn-based biofuels release seven percent more greenhouse gases in the initial five-year time frame than conventional gasoline. The study, which was paid for by the federal government, found that regardless of how much corn residue is taken off the field, the process contributes to global warming.

But administration officials, who have devoted more than a billion dollars of taxpayer funds to biofuels, disagree, as does the biofuel industry. DuPont claims that the ethanol it will produce will be 100 percent better than gasoline in terms of greenhouse gas emissions. The Environmental Protection Agency (EPA) says the study “does not provide useful information relevant to the life cycle greenhouse gas emissions from corn stover ethanol.”

But there are reasons to doubt DuPont and the EPA. DuPont is getting billions in subsidies to produce biofuels; federal subsidies help its stock price, and the company would be foolish not to defend biofuels. Meanwhile, an Associated Press investigation last year found that the EPA’s analysis of corn-based ethanol failed to accurately predict the environmental consequences. California regulators earlier declared that corn ethanol would not reduce global warming and might in fact make it worse. Other federal studies have reached the same conclusion. David Tilman, a University of Minnesota researcher who has studied biofuel emissions from the farm to the tailpipe, says the recent study is the best he has seen on the issue.

This controversy highlights several problems. Despite claims to the contrary, politics seems to play a part at the EPA. The agency could simply have released a statement that research in this area is still developing and that it is sticking with its initial conclusion that biofuels improve the environment. Instead, by issuing such a strong rebuke, it suggests it is not open to new information. Real scientists know new discoveries come along all the time. Scientists do not offer blanket statements, but politicians do.

Further, no matter how well intentioned, subsidies distort the market. Reducing carbon emissions is a good goal, but when government picks a winner, everybody else loses. We do not know if there is a better solution than corn-based ethanol. We do not know if the EPA is investing in real science or attaching itself to its preferred winner. The EPA’s role should be to judge the best solutions the private sector develops. When the EPA provides subsidies to one technology over another, taxpayer money and possibly scientific integrity are lost forever.

Global “Clean” Energy Expenditures Are Down (and Respect for Economic Realities Is Up) in 2013

It is refreshing to see that environmentalists and liberal governments are beginning to recognize the economic realities they face when manipulating energy markets to promote clean, renewable energy sources. For example, a recent Time Magazine article investigates why total public and private funding of “clean power” in the global renewable energy industry fell 14% in 2013, a decline of 23% from the peak of such spending in 2011. The data came from a study by the Frankfurt School-UNEP and Bloomberg New Energy Finance.
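Those two percentages chain multiplicatively rather than adding. A quick sketch in Python of what they jointly imply for 2012 (a derived figure, not one reported in the article):

    # Spending relative to the 2011 peak (normalized to 1.0):
    level_2013 = 1 - 0.23   # 2013 spending was down 23% from the 2011 peak
    drop_2013 = 0.14        # the 14% fall that occurred during 2013
    level_2012 = level_2013 / (1 - drop_2013)
    print(f"Implied 2012 level: {level_2012:.3f} of the 2011 peak "
          f"(a {1 - level_2012:.1%} decline during 2012)")
    # -> roughly 0.895, i.e. about a 10.5% decline in 2012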

This study points out that Europe decreased its spending on clean, renewable energy sources by 44%, while the U.S. decreased its spending by 10%. These reductions were found to arise largely from three economic realities:

  • The declining costs of producing “clean” energy.
  • The significant reduction in public subsidies.
  • Increased competition from renewable but “unclean” biofuel power sources.

Economic Reality #1: Subsidizing an activity can drive down the unit cost of production by creating economies of scale. For example, the average cost of installing a photovoltaic solar cell in the U.S. declined 60% in the last few years. Indeed, despite the reductions in total spending in 2013, global clean energy capacity (from renewable sources other than existing hydroelectric power) remained the same in 2013 as it was in 2012. However…

Economic Reality #2: Public sector funding sources are scarce. As Europe slowly recovers from the recent global recession, its national governments are finding it very difficult to justify costly public investments in clean energy subsidies when other popular social programs compete for survival in an environment of shrinking public sector budgets. In fact, Spain and Bulgaria made their subsidy cuts retroactive, shuttering their clean energy industries despite the falling unit costs of providing clean energy. Further…

Economic Reality #3: Every choice has an opportunity cost that cannot be avoided. Clean power is defined as power from renewable, sustainable fuel sources that create very low or no pollution or greenhouse gas emissions. Environmental scientists are beginning to realize that subsidized biofuel production:

  • Pushes up global food prices, because the fuel is grown with water and land that could otherwise be used for growing food, making biofuels less “sustainable.”
  • Increases water pollution from pesticides and insecticides, making biofuels “unclean.” Indeed, an article in Scientific American notes that the “U.N. Intergovernmental Panel on Climate Change has for the first time acknowledged the risks of uncontrolled biofuels development.”

We seem to be living in a world where national governments are intent on accelerating our adoption of clean energy sources along a timeline not supported by private energy markets. At least it is refreshing to see that both governments and environmentalists are slowly (if only involuntarily) admitting to economic reality: the true scarcity of valuable resources in our world places real and unavoidable limits on the efficacy of government policies designed to accelerate clean energy industry development. We cannot simply wave the magic wand of “hope” to force the hand of the market in a manner that ignores such economic realities.

New IPCC Report: Death and Destruction!

The IPCC’s latest report (Climate Change 2014: Impacts, Adaptation, and Vulnerability) is now available, and it is full of observations and predictions of calamity.

Just a scan of the news headlines reveals the catastrophe once again forecast by the IPCC: “Climate change to leave no one on planet ‘untouched,’ IPCC chief”; “New Climate Change Report Warns of Dire Consequences”; “New U.N. Report: Climate Change Risks Destabilizing Human Society”; “Climate change a threat to security, food and humankind – IPCC report”; “Panel’s Warning on Climate Risk: Worst Is Yet to Come.”

Ahh!

The IPCC may be full of gloom and doom, but not everyone is on board. Joseph Bast over at Forbes looked at the 8 main risks the IPCC listed in the report as “reasons for concern.” He puts them alongside conclusions from the Nongovernmental International Panel on Climate Change (NIPCC). Founded by atmospheric physicist Fred Singer, the NIPCC assesses global warming science and conducts independent reviews of the IPCC reports.

Just a few examples of the differences between the IPCC and NIPCC reports:

  • Food insecurity? Yes, says the IPCC. Little or no risk, says the NIPCC.
  • Severe harm for urban populations due to flooding? Yes, says the IPCC. No, says the NIPCC.
  • Systemic risks due to extreme weather events? Yes, says the IPCC. There is no support that precipitation in a warmer world becomes more variable and intense, says the NIPCC.
  • Risk of mortality, morbidity, and other harms? Yes, says the IPCC. No, says the NIPCC: Modest warming will actually result in a net reduction of human mortality.

The NIPCC reports are peer-reviewed, produced by scientists from 20 countries, and cite thousands of peer-reviewed studies. The latest report runs over 1,000 pages, and anyone can go online and view it.

Bast asks,

So is man-made global warming a crisis? Don’t just wonder about it, understand it yourself. Read one or a few chapters of one of the NIPCC reports, and ask if what you read is logical, factual, and relevant to the debate. See if the UN or its many apologists take into account the science and evidence NIPCC summarizes, and then decides whether its predictions ‘of death, injury, and disrupted livelihoods’ is science or fiction.

Matt Ridley over at the Wall Street Journal notes that the IPCC report predicts 70 percent more warming by the end of this century than the best science actually suggests. He then asks — what distinguishes the global warming “crisis” from the other crises we’ve been warned about in the past?

There remains a risk that the latest science is wrong and rapid warming will occur with disastrous consequences. And if renewable energy had proved by now to be cheap, clean and thrifty in its use of land, then we would be right to address that small risk of a large catastrophe by rushing to replace fossil fuels with first-generation wind, solar and bioenergy. But since these forms of energy have proved expensive, environmentally damaging and land-hungry, it appears that in our efforts to combat warming we may have been taking the economic equivalent of chemotherapy for a cold.

Almost every global environmental scare of the past half century proved exaggerated, including the population “bomb,” pesticides, acid rain, the ozone hole, falling sperm counts, genetically engineered crops and killer bees. In every case, institutional scientists gained a lot of funding from the scare and then quietly converged on the view that the problem was much more moderate than the extreme voices had argued. Global warming is no different.

Are we really willing to transform our economies based on reports derived from faulty, ill-constructed models? Unless more people delve into these IPCC reports and look at the evidence presented by the NIPCC and others, we’re likely to do just that.

Electric Car Subsidies Distort Market, Without Reducing Pollution

Many states still rely on coal-burning power plants to generate over half of their electricity; as a result, electric cars are actually responsible for more greenhouse gas emissions per mile driven than hybrid cars, and are no better for the environment than comparable conventional vehicles. The hybrid Toyota Prius produces less carbon dioxide than the plug-in Nissan Leaf. The heavily subsidized Chevrolet Volt produces just as much carbon dioxide in electric mode as it does when it operates in gas mode.

Lithium, the key material in electric car batteries, can be resource intensive to mine, and since supplies of lithium are limited, prices are expected to increase. Further, lithium batteries need copper and aluminum to work correctly, and mining these elements requires significant chemicals, energy, and water.

Meanwhile, conventional vehicles are becoming more fuel-efficient. For the 2013 model year, new cars averaged 23.5 miles per gallon, up from only 16.0 miles per gallon in 1980. With higher gasoline prices, manufacturers are scrambling to create even more fuel-efficient vehicles in the future.

Further, consumers are hardly demanding electric cars. Despite a $7,500 federal subsidy for buyers (and numerous state incentives), Chevrolet sold only 23,000 electric-powered Volts in 2012. The automaker sold more than 10 times as many Chevrolet Cruzes, the Volt’s gas-powered sister vehicle. By contrast, Ford sells 58,000 F-Series trucks a month.

Further, these programs fail to increase total car sales. Instead, they incentivize buyers to purchase a particular type of car — a Volt instead of a Cruze. Since consumers would buy a car anyway, this subsidy is a waste of precious resources.

Local municipalities like electric vehicle programs since much of the subsidy money comes from federal and state sources. But this is not a federal freebie; it is a waste of taxpayers’ hard-earned money — money that could instead be spent on actual programs that improve transportation or the environment, or better yet refunded to taxpayers.

Is the Science Really Settled?

The last 17 years without warming have been a disappointment to climate activists insistent that the globe is getting hotter thanks to human activity. So what happens when the climate acts in ways that do not fit the human-caused global warming narrative? How do warming proponents respond?

The IPCC released its latest climate change report in September, stating with 95 percent probability that “human influence has been the dominant cause of the observed warming since the mid-20th century.” But Benjamin Zycher at the American Enterprise Institute dove into the document and detailed exactly why the IPCC report “is a political document first and a (partial) summary of the scientific literature only secondarily.” (And if you are wondering where that 95 percent probability comes from, see Kenneth Green’s piece.)

USSC Hears Case on EPA Power

The Supreme Court heard oral arguments yesterday in a case that revolved around the EPA’s authority to regulate greenhouse gases under the Clean Air Act.

In 2010, the EPA regulated emissions from vehicles in its so-called “tailpipe rule.” The agency said that promulgating the tailpipe rule triggered authority within the Clean Air Act to regulate stationary sources that also emit GHGs — such as factories and plants, but even stoves, fireplaces, and campfires.

The issue in the case is basically whether it is permissible for the EPA to regulate stationary sources based on this separate regulation of vehicles.

Notably, that part of the Clean Air Act that would, according to proponents, justify greenhouse gas regulation sets emission thresholds at such low levels that schools and small businesses would be covered by the rule. To remedy this, the EPA simply raised those emissions thresholds. Was that a reasonable move, or an illegal exercise of authority? Swing vote Justice Anthony Kennedy told the solicitor general, “I couldn’t find a single precedent that strongly supports your position,” and Justice Alito said that there existed no precedent for such unilateral revision in “the entire history of federal regulation.”

Those who heard the oral argument report that Kennedy appears once again to be the swing vote on the issue. The Washington Times has the story, and SCOTUSblog has even more details.

The decision should be out later this spring.

Confusion Over How to Criticize Greenhouse Gas Emissions

When environmentalists get concerned about the negative influence of human activity on climate patterns around the globe, they typically point an accusatory finger at the nations with the highest levels of aggregate greenhouse gas production. Because greenhouse gases are a direct side effect of economic activity, the guilty countries are always those with the largest economies. For example, a recent Global Post news article by Sarah Wolfe notes that the four biggest aggregate emitters of greenhouse gases per year are China (6 billion tons), the U.S. (5.9), Russia (1.7) and India (1.3).

Texas is Right to Fight EPA

Texas officials “are mad as hell, and…aren’t going to take it anymore.” The Lone Star State’s attorney general says the EPA has no authority to regulate the state’s greenhouse-gas emissions.