Big trees grow faster, sequester more carbon

A few years ago, I spent a lovely week in Fort Collins, Colorado. Two great things came out of my time there. First, I was introduced to one of my all-time favourite beers, Fat Tire Amber Ale. Yum. While Fat Tire was immediately gratifying, the second has only borne fruit today, in the form of this paper in Nature. Yay.

The paper, led by Nathan Stephenson and Adrian Das of the USGS, is the culmination of many hours of work by 38 authors from institutions all over the world.

Trail past ancient Sequoiadendron giganteum (Giant Sequoia) tree.

The Fort Collins workshop, held at the USGS’s John Wesley Powell Center for Analysis and Synthesis, was convened to tackle a big question in forest ecology: what is the relationship between mass growth and tree size? More specifically, how fast do the biggest trees grow, and how much carbon do they sequester? Surprisingly, no one had looked at this before, apart from the odd study on a handful of tree species. This was probably due to a combination of scarce data (there just haven’t been that many really big trees measured) and the fact that the answer seemed obvious. The prevailing wisdom was that big trees slow down their growth. The bigger they are, it was thought, the slower they grow, until eventually they just stop growing and die. Evidence for this assumption can be found in studies showing that old forests, and the leaves of old trees, are less productive. But until enough people with enough data had been stuck in the same room together, no one could really know whether the assumption held for most trees or most of the world’s forests.

Altogether, the Fort Collins group had access to growth measurements from a staggering 673,046 trees, belonging to 403 tree species from tropical, subtropical and temperate regions across six continents. Our analysis of these data showed that, for most tree species, not only did the growth of the biggest trees not slow down, but mass growth rate increased continuously with size. In fact, some trees were growing so fast that in a single year they could put on the mass of an entire 20cm-diameter tree. In other words, if they had grown that fast as seedlings, their trunks would have reached 20cm wide within a year! Of course, these trees are already so big that we just don’t notice the extra mass they are putting on. It is amazing nonetheless, and not what most people would have expected, though perhaps increasing mass growth shouldn’t come as a complete surprise.

Though individual trees may be growing faster than we thought, their mass growth doesn’t necessarily translate to increased forest-level productivity. Older forests have fewer trees in them, because many trees die. So while the biggest trees are getting heavier more quickly than we thought, the forest as a whole may not be.

Still, I expect this paper might cause a bit of a stir and may have future ramifications for forestry and carbon policy. As carbon sinks, big old trees might suddenly have got a whole lot more valuable.


Profiting from pilot studies

A Greybox transplant experiment

I graduated with a Master of Philosophy from the University of Melbourne earlier this year. And the first paper to come out of that work, to paraphrase Lil’ Wayne, has just “dropped like it’s hot”.

In our new paper (in early view at Basic & Applied Ecology) Pete, Mick and I illustrate the use of Bayesian informative priors to recover the inferential and predictive power of otherwise unusable pilot study data.

You can open pretty much any textbook on experimental design and one of the first things it’ll tell you is to do a pilot study. Pilot studies are standard practice and we do them so that we don’t waste resources doing an experiment only to find we had the wrong study design. The same textbooks will also tell you that if your pilot study indicates that you have to change the design of your study, forget about the data you’ve collected and get on with the new experiment. We argue that this default stance can be wrong and that otherwise unusable pilot study data can be used to construct a Bayesian prior for an analysis of a subsequent experiment.

We showed how data from a pilot study can be used in a case study on eucalypt seedling mortality during a transplant experiment conducted in Goulburn-Broken catchment in central Victoria, Australia. Using a pilot study to construct a prior prediction of mortality rate during a subsequent larger experiment, we found that including the informative prior effectively saved us thousands of dollars. Had we ignored the pilot study and included a flat prior in the final model, we would have needed to spend the extra money on monitoring hundreds more seedlings if we wanted the same amount of information as we did when including the informative prior.
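For the curious, here is a minimal sketch of the basic idea with made-up numbers (the models in the paper are more involved than a simple beta-binomial): the pilot data are summarised as a beta prior on the mortality rate, which is then updated by the main experiment.

# Minimal sketch, with hypothetical counts, of folding pilot data into an
# informative prior for a binomial mortality rate.
pilot_deaths <- 12
pilot_n      <- 40
# Starting from a flat Beta(1, 1), the pilot data give a
# Beta(1 + deaths, 1 + survivors) posterior, used as the prior below.
prior_a <- 1 + pilot_deaths
prior_b <- 1 + (pilot_n - pilot_deaths)
# Main experiment (again hypothetical counts)
main_deaths <- 55
main_n      <- 200
# 95% credible intervals for the mortality rate under each prior
qbeta(c(0.025, 0.975), prior_a + main_deaths, prior_b + (main_n - main_deaths))
qbeta(c(0.025, 0.975), 1 + main_deaths, 1 + (main_n - main_deaths))

The narrower interval under the informative prior is the ‘profit’: getting the same precision with a flat prior would mean monitoring more seedlings.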

Morris, W.K., Vesk, P.A. & McCarthy, M.A. (2012, early view). Profiting from pilot studies: Analysing mortality using Bayesian models with informative priors. Basic & Applied Ecology.


Probably the best PhD I’ll ever do

Probably the best restaurant in town

This is just the sort of qualified optimism I intend to carry all the way through my PhD (Photo courtesy of my friend Hugh Rabinovici).

I have done three interesting things in the past couple of months. I graduated from my Master’s degree (stay tuned for the papers), moved from Melbourne, Australia to New York City, USA (don’t worry, I’ll be back soon enough… maybe), and started my PhD.

My PhD: Learning, planning and decision making for vegetation management

Over the next few years I plan to tackle (other than crippling poverty)… Um… lots of things? Well, so far only one thing in particular.


Chapter 1: Value of information for Box Ironbark woodland management

My first chapter continues on from work that colleagues and I published last year, which I blogged about a while back.

In this chapter I am undertaking a value of information analysis. In essence, a value of information analysis is a decision-theoretic tool for assessing the cost-effectiveness of doing science to aid decision making. The issue of doing science (or not) before making a decision has come up quite a bit lately. How often have you heard a politician say “before we do such-and-such, we need to find out more about such-and-such… blah, blah, blah”? I suspect they rarely stop to ask whether the ‘science’ needs to be done, or even whether it is worth doing in the first place.

Central to a value of information analysis are the quantities expected value of sample information (EVSI) and expected net gain of sampling (ENGS). EVSI is the average gain you expect to achieve given you have performed an experiment (or made some observations) and are subsequently able to make a better decision. ENGS is then the difference between EVSI and the cost of obtaining the new information.

More formally, a value of information analysis asks: given a set, A, of possible actions, a, you could take, each of which could result (probabilistically) in any outcome, \theta, from a set of outcomes, \Theta, what is the value of performing an experiment, e, from a set of possible experiments, E, given that the experiment could have any outcome, z, from a set of experimental outcomes, Z? In the mathematics of decision theory, the aim is to maximise the utility function:

   u(e,z,a,\theta).

In other words, if you are faced with a decision problem under uncertainty, which experiment should you undertake (where doing no experiment at all counts as an option) to maximize the difference between the change in your expected gain from taking an action and the cost of doing the experiment itself?
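For concreteness, here is the standard textbook way of writing the two quantities in the notation above, treating the cost of experimentation separately via a cost function c(e) rather than folding it into u (this is my sketch of the formulation, not a quote from the chapter):

   EVSI(e) = E_z [ max_a E_{\theta|z} u(e, z, a, \theta) ] - max_a E_\theta u(e_0, z_0, a, \theta)

   ENGS(e) = EVSI(e) - c(e)

where e_0 is the null experiment (do nothing) with its trivial outcome z_0, E_z averages over the possible experimental outcomes, and E_{\theta|z} averages over the outcomes of an action given what the experiment revealed.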

I am applying a value of information analysis to the management of Box Ironbark Woodlands. An objective of Victoria’s Box Ironbark Woodland managers is to improve the biodiversity value of the forests they manage. They have a number of management actions at their disposal, each of which has a different cost. But the effect of management, and the dynamics of the system itself, are highly uncertain. My aim is to help them decide whether doing experimental work to resolve some uncertainty is a cost-effective strategy that can aid them in achieving their objectives.
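To give a feel for the sort of calculation involved, here is a toy sketch in R with entirely made-up utilities and probabilities (nothing to do with the real Box Ironbark numbers), using the expected value of perfect information (EVPI) as the simplest special case:

# Toy value-of-information sketch: two management actions, two possible
# system states, and made-up utilities. EVPI is the simplest case; a full
# EVSI calculation would average over imperfect survey outcomes instead.
u <- rbind(thin = c(good = 10, poor = 2),
           burn = c(good = 4,  poor = 8))      # utility of each action in each state
p_state <- c(good = 0.5, poor = 0.5)           # prior belief about the state
ev_prior   <- max(u %*% p_state)               # act now on current information
ev_perfect <- sum(apply(u, 2, max) * p_state)  # act knowing the true state
evpi <- ev_perfect - ev_prior                  # the most any experiment could be worth
evpi
# If a proposed survey costs more than EVPI it cannot pay for itself;
# ENGS for a real survey is its EVSI minus its cost.

The real analysis is messier, with more actions and states and a model of how each candidate experiment would shift beliefs, but the logic is the same.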

Further reading:


Rchievement of the day #3: Bloggin’ from R

I have become a complete knitr addict of late and have been using it in combination with RStudio’s R markdown support on a regular basis. In fact I wrote this post using it! It then dawned on me how great it would be if I could upload the post directly from R/RStudio. It turned out that wasn’t too hard at all. Here’s how.


How to update your WordPress.com blog from R

Installing the RWordPress package

First I need to install (if I haven’t already done so) and load the “RWordPress” package from www.omegahat.org. The “RWordPress” package uses XML-RPC to connect to the WordPress blogging engine.

if (!require(RWordPress)) {
    install.packages("RWordPress", repos = "http://www.omegahat.org/R")
    library(RWordPress)  # load it after a fresh install
}

## Loading required package: RWordPress


Connecting to WordPress.com

Next I set my username, password and website URL as R global options.

options(WordPressLogin = c(wkmor1 = "password"),
        WordPressURL = "http://wkmor1.wordpress.com/xmlrpc.php")

That was not my real password by the way!

The “RWordPress” package provides a bunch of functions (see ?RWordPress) based on methods from the WordPress and associated APIs.

For example I can use the getUsersBlogs function to retrieve metadata about my blogs.

getUsersBlogs()

## $isAdmin
## [1] TRUE
##
## $url
## [1] "http://wkmor1.wordpress.com/"
##
## $blogid
## [1] "25365214"
##
## $blogName
## [1] "William K. Morris’s Blog"
##
## $xmlrpc
## [1] "http://wkmor1.wordpress.com/xmlrpc.php"


Making knitr output compatible with WordPress.com

If I were hosting a WordPress-based blog myself, I could post knit2html-built content straight to my blog. But a WordPress.com blog is a bit more restrictive when it comes to content, so I’ll have to preprocess the HTML before I upload it using “RWordPress”.

I have written a little function to extract the body of a knit2html-built page and replace the code-block markup with WordPress.com’s sourcecode shortcode blocks. Note this function requires the “XML” package, which is available from CRAN.

knit2wp.com <- function(file) {
    require(XML)
    post.content <- readLines(file)
    # Keep spacing around tags by converting it to non-breaking spaces
    post.content <- gsub(" <", "&nbsp;<", post.content)
    post.content <- gsub("> ", ">&nbsp;", post.content)
    # Parse the page and pull out just the contents of the <body> element
    post.content <- htmlTreeParse(post.content)
    post.content <- paste(capture.output(print(post.content$children$html$children$body,
        indent = FALSE, tagSeparator = "")), collapse = "\n")
    post.content <- gsub("<?.body>", "", post.content)
    post.content <- gsub("<p>", "<p style=\"text-align: justify;\">", post.content)
    # Swap the <pre><code> markup for WordPress.com sourcecode shortcodes
    post.content <- gsub("<?pre><code class=\"r\">", "\\[sourcecode language=\"r\"\\]\\\n ",
        post.content)
    post.content <- gsub("<?pre><code class=\"no-highlight\">", "\\[sourcecode\\]\\\n ",
        post.content)
    post.content <- gsub("<?/code></pre>", "\\\n\\[/sourcecode\\]", post.content)
    return(post.content)
}


Compiling the R markdown file and posting as blog content

Now it’s time to compile this document using knitr. Note that my working directory is set to the directory containing the .Rmd file.

 knit2html("Rchievement_of_the_day_3_Bloggin_from_R.Rmd")

Obviously I couldn’t actually run this code chunk from within the .Rmd file, as it would have created a crazy infinite loop, possibly ending the universe, much like Australia’s new carbon tax.

Now all that is left to do is publish the post using the “RWordPress” newPost function. The newPost function expects a list of blog parameters, which can include the content (named description), title, categories and tags (named mt_keywords). Setting publish=FALSE uploads the post as a draft.

newPost(
    list(
        description = knit2wp.com('Rchievement_of_the_day_3_Bloggin_from_R.html'),
        title       = 'Rchievement of the day #3: Bloggin&rsquo; from R',
        categories  = c('Programming', 'R'),
        mt_keywords = c('rstats', 'blogging', 'XML-RPC', 'R', 'knitr', 'markdown')
    ),
    publish = FALSE
)

Again, this code chunk was set to eval=FALSE.

You can get the original .Rmd file of this post here or as a gist from:
git://gist.github.com/3027253.git



My first paper

After almost a year of blogging and five posts I’ve yet to write something about any of my publications. Luckily (or maybe unluckily) I’ve only authored four papers so I can start at the beginning. Here goes.

A Box Ironbark

My first paper, “Quantifying variance components in ecological models based on expert opinion”, came out last year in the Journal of Applied Ecology. The lead author was Christina Czembor, a post-grad and former member of the QAEG lab (we miss you Chrissy; hopefully you’ll join us again some day!), and it was the second paper to come out of her masters thesis, entitled “Incorporating uncertainty into expert models for management of Box-Ironbark forests and woodlands in Victoria, Australia”.

In a nutshell, Chrissy built a bunch of state-and-transition models of Box-Ironbark woodlands under alternative management scenarios, parameterised via expert opinion. How she managed to get any work done inside a nutshell I’ll never know. The aim of the work we outlined in our paper was to suss out the influence of different sources of uncertainty on the STM predictions. In particular, we wanted to know whether, when you ask multiple experts, the total uncertainty stems more from disagreement among experts or from the uncertainty of individual experts.
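As a rough illustration of the kind of partitioning involved (a toy sketch with invented numbers, not Chrissy’s actual analysis), the law of total variance splits the spread of a prediction into a between-expert component (disagreement) and a within-expert component (individual uncertainty):

# Toy sketch: partition prediction uncertainty into between-expert and
# within-expert components using the law of total variance.
expert_means <- c(0.20, 0.35, 0.50, 0.30, 0.60)  # each expert's best estimate
expert_sds   <- c(0.05, 0.10, 0.08, 0.12, 0.07)  # each expert's own uncertainty
n_experts    <- length(expert_means)
# Treat the experts as equally weighted, so the between-expert variance is
# the population (not sample) variance of their means.
between <- var(expert_means) * (n_experts - 1) / n_experts
within  <- mean(expert_sds^2)
total   <- between + within
between / total   # proportion of total uncertainty due to expert disagreement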

I’ll let you read the paper to find out what the answer was.

Czembor, C.A., Morris, W.K., Wintle, B.A. & Vesk, P.A. (2011). Quantifying variance components in ecological models based on expert opinion. Journal of Applied Ecology 48:736–745.


Does your flora and fauna come with a warranty?

My good, law-abiding friends at the Environmental Defenders Office have recently released a report on the implementation and enforcement of Victoria’s Flora and Fauna Guarantee Act. The FFG is the principal piece of legislation dealing with biodiversity conservation and threatened species management in Victoria. Among the report authors’ key recommendations were that the Victorian Government should provide adequate resources to develop and implement action statements for listed taxa, and that transparency and accountability be improved when it comes to implementing the FFG and collecting data on listed taxa.

The EDO held a public forum last month to launch their report. I congratulate everyone who worked on it and hope those who should be paying attention do so. The report is part of a series by the EDO, the next of which will investigate the Native Vegetation Framework, another controversial area of environmental policy in our state. Stay tuned.


You better get a lawyer, Victoria’s forest biodiversity. You’re gonna need a real good one!

Threatened Victorian forest-dependent species may soon find themselves with almost no legal protection from logging, outside of the state’s national parks and conservation reserves, if the State Government goes ahead with proposed changes to the Code of Practice for Timber Production 2007.

Many species, such as the Leadbeater’s Possum (Gymnobelideus leadbeateri), one of Victoria’s faunal emblems, and the Spotted Tree Frog (Litoria spenceri), are restricted to the forests of south-eastern Australia and have, in the past, enjoyed the protection of both the Commonwealth Environment Protection and Biodiversity Conservation Act 1999 and Victoria’s Flora and Fauna Guarantee Act 1988. The Regional Forest Agreements of the late 90s and early 2000s removed the protection afforded by the EPBC Act quicker than the Democrats could say “GST”. Now the proposed changes to the Code of Practice for Timber Production 2007 could all but do the same to the FFG Act.

Leadbeater’s Possum

As it currently stands (pun intended) the Code is the primary instrument making FFG Action Statements enforceable, with the Code itself made binding under the Sustainable Forests (Timber) Act 2004. There is an FFG Action Statement for each species listed under the FFG Act, and the statements outline what can and can’t be done by loggers in Victorian State Forests. For instance, the Action Statement for Leadbeater’s Possum stipulates that logging cannot occur in areas of critical habitat, deemed Zone 1A.

But if the changes to the Code are implemented, then a proponent could lobby the Secretary to the Department of Sustainability and Environment to have an FFG Action Statement not apply. The proposed variations to the Code are vague, to say the least, stipulating that the Secretary to the DSE should consider seven factors when deciding whether the requirements of an FFG Action Statement should apply to a logging coupe or group of contiguous coupes:

  1. The known listed taxa or communities identified at the proposed location or coupe(s).
  2. The known range of the listed taxa or communities identified at the proposed location or coupe(s).
  3. The known habitat type(s) required by the listed taxa or communities identified at the proposed location or coupe(s).
  4. The amount and quality of suitable habitat for the listed taxa or communities identified at the proposed location or coupe(s) that is already protected from timber harvesting in national parks and other conservation reserves.
  5. The population size required to maintain the viability of the listed taxa or communities identified at the proposed location or coupe(s) (i.e. to prevent extinction of the taxon or community at the species and landscape level).
  6. The amount, quality and connectivity of habitat that is required to provide for the viability of the listed taxa or communities identified at the proposed location or coupe(s) at the species and landscape level.
  7. The results of any threatened species surveys undertaken at the location or coupe(s) in the past 18 months that have been verified by the Secretary to the Department of Sustainability and Environment.

Quite apart from the lack of clarity around the actual process by which the Secretary would consider these factors, there are serious holes in the list itself. There is no scope in the proposed variations to the Code for the Secretary to consider genetic factors, the dynamism of critical habitat distribution, or other threatening processes, such as climate change and disease, that may act synergistically with habitat loss due to forestry. There is also no mention of uncertainty – it reads as if they think someone will be able to easily quantify these factors with high precision.

But perhaps most worrying of all is the lack of any transparency or public accountability in the process. It seems that it may be possible for a proponent to make an application to have an FFG action statement not apply, and for the Secretary to make a decision, with only those two parties ever knowing the application was made.

The proposed changes to the Code come on the back of findings by Supreme Court Justice Robert Osborn that the current reserve system and the regulations, which are supposed to protect listed species like Leadbeater’s Possum, may be inadequate. In the recent case MyEnvironment v VicForests, Justice Osborn found that the 2009 Black Saturday bushfires had affected Leadbeater’s Possum habitat to such a degree as to raise questions over the adequacy of their protection and to cause him to call for a review of the reserve system:

“The 2009 bushfires have materially changed the circumstances in which the existing system was planned and implemented and there is, on the evidence, an urgent need to review it”

The changes to the Code do the opposite of what is obviously needed to protect threatened Victorian forest-dependent species. Rather than strengthen the legal protection and status of the state’s reserve system, the changes will very likely weaken them and may lead to the further decline of many forest-dependent species.

The proposed changes were open for public comment until 2 February 2012. The final approval of the variations to the Code is expected to be made public any day now.
