Century of Endeavour

Biomass for Energy

(c) Roy Johnston 1999

(comments to rjtechne@iol.ie)

AN ANALYTICAL APPROACH TO BIOMASS PRODUCTION AND CONVERSION TO ENERGY IN IRELAND

Paper by the present writer (RHW Johnston) delivered at the Limerick Conference of the Operations Research Society of Ireland, November 1980; I have added a few retrospective comments in italics. RJ 2001.

Background

This work began in March 1979 on the initiative of the National Board for Science and Technology (NBST).

The Energy Division, under Denis Kearney, is engaged in developing a general macro-economic energy model, with a view to evaluating alternative 'energy menus' in a future of declining oil-availability. One component in this 'Menu' is energy from biomass. The purpose of this project is to attempt to evaluate the main parameters of this component.

The main feature of bio-conversion of solar energy to fixed carbon is its relatively low efficiency (~1%). There is scope for improvement of energy crop performance by genetic procedures, both traditional and novel.

Botanists, agriculturists and foresters have been considering various feasible energy crop systems for some years, paying particular attention to energy inputs and outputs, as well as conversion efficiency.

Botanists classify energy crops into 'soft' and 'hard'; basically annual crops like the food-crops and long-cycle crops like timber.

Agricultural crops like potatoes, sugar beet and corn carry a heavy energy penalty in the cultivation process. They lend themselves to 'wet' processes like fermentation, giving an additional energy penalty in the form of latent heat of distillation. A strong farm lobby in the US has made 'gasohol' popular, i.e. gasoline laced with corn-spirit. The energetics of this are marginal; it amounts to a roundabout way of converting coal to alcohol.

Perennial crops like tropical grasses are better, but if one goes by the 'wet' road (as in Brazil) there remains the latent heat penalty. In Brazil they extract the sugar for fermentation and burn the bagasse for process heat. The overall efficiency however is poor.

In Puerto Rico a tropical grass (bamboo) has been selected for fibre. It is dried in the sun, and there is a high yield of high-quality fuel (or feedstock) at low cost, for use in dry processes. The energy gain is about 15-20, given that the bamboo is perennial and no further cultivation is required once the plantation is established.

In temperate climates the closest to this is forestry.

Traditional forestry involves a 40-year cycle, with thinnings and waste as by-products. To get an effective energy source using forest by-products poses considerable logistic problems. Pulp-mills in the US are beginning to convert to use forest waste as energy-source. The ESB is currently looking at this option.

It is possible to modify traditional forestry by planting at high density for maximum early yield in a single 'take-all' harvesting operation. There is however a re-planting cost penalty each cycle, and the optimal cycle is long, about 10 years. Nor is the conversion efficiency good, as leaf cover in the early years is incomplete.

Botanists with a quantitative feel for energy flows and plant physiology have homed in on coppicing hardwood species as being most likely to give good performance. Planting is a once-only operation; when the first cycle is harvested the second begins immediately by sprouting from the stump, with early lateral spread of leaf cover. Re-cycling of nutrients takes place via leaf fall, so that if harvesting is done in winter only minimal fertiliser input is needed. Some species (eg alder) even go so far as to fix nitrogen.

The informed consensus then is that the 'energy farm' of the future will be a managed plantation of a coppicing species, cropped on a fairly short cycle (3-5 years) using a mobile mechanical harvester-chipper which does a clean cut.

The harvested chips or 'chunks' will require an air-drying stage, like peat. There is a double penalty associated with moisture in wood; not only does the latent heat of evaporation impose a penalty, but the effective heat output is determined by the efficiency of the combustion system and this is dependent on the net energy-density of the fuel. So a buyer of wood chips would pay a premium for dryness.
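The double penalty can be made concrete with a small sketch. The calorific and latent-heat figures below are illustrative assumptions, not values from the paper:

```python
# Sketch of the moisture penalty on wood fuel, using illustrative figures
# (LHV of oven-dry wood ~19 MJ/kg, latent heat of evaporation ~2.44 MJ/kg;
# both numbers are assumptions for illustration only).

LHV_DRY = 19.0  # MJ per kg of oven-dry wood (assumed)
H_VAP = 2.44    # MJ per kg of water evaporated (assumed)

def net_heating_value(moisture_wet_basis):
    """Net MJ per kg of wet fuel: the dry matter is diluted by water,
    and evaporating that water costs latent heat."""
    w = moisture_wet_basis
    return LHV_DRY * (1.0 - w) - H_VAP * w

for w in (0.0, 0.25, 0.50):
    print(f"moisture {w:.0%}: {net_heating_value(w):.1f} MJ/kg")
```

Halving the moisture content recovers well over half the net energy density, which is why a buyer would pay a premium for dryness.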

We therefore envisage a system to supply acceptably dry biomass classified by moisture for use by various conversion processes, eg:

- combustion to give steam, process heat, electricity;

- pyrolysis at low temperature to give low BTU gas, oil (creosote) and char. (This could supply a local industry);

- high-temperature pyrolysis giving medium to high BTU gas, or 'synthetic natural gas' (SNG), which can be fed to the gas grid, or used as a feedstock for petrochemical-type processes, or converted to methanol, ammonia or whatever;

- liquefaction processes giving petroleum-like substances by various chemical routes.

The above are in increasing order of exoticism, with implied increasing cloudiness of process cost data. All the exotics have been done on the lab or pilot scale, and one can expect scale-up data to become available eventually.

Model Development Philosophy

We decided to opt out of 'wet' systems and to model an energy production and conversion system based on short-cycle coppicing. The model as it stands will also work for traditional forestry in shortened cycles ('single stem' as distinct from coppice; ie replanting each cycle) but requires modification if it is to deal with traditional forests with various products and by-products over a long cycle.

In 'Phase 1' of the project, we have run the production model with real data in the abstract (ie not yet site-specific) and established the economics of a feedstock supply. The conversion model we have developed but not yet used, except with such dubious data as have so far been available to us.

In 'Phase II', which is now commencing, we will be running the production model site-specifically with inputs from Bord na Mona and Forestry, and we will be beginning to activate the conversion model with data on 'familiar' conversion technologies by courtesy of the ESB, while attempting to build up the data-bank on the exotics.

In the event, due perhaps to other pressures, the NBST lost interest and alas we never got to run Phase II.

When developing the software we considered that the main problem was the fluidity of the data-base. We therefore paid great attention to the organisation and structure of the input data, and took advantage of the flexibility offered by the small computer which was available (Digital PDP 11/34) as regards random access to disc files and 'conversational mode' operation with the VDU.

We introduced specialist professional programming expertise into the data-base construction and the development of fully-documented structured programmes, with an eye to future portability. The advantage of this approach is that the model can be run by non-specialists (as regards computing) who can develop an immediate and close feel for the numbers relating to the selected scenario, which can be set up instantly from the wide range of site-specific and technology-specific files available.

Production System Model

The production model consists of three parts:

(1) the data-maintenance system whereby the data-base can be set up, displayed and altered if necessary;

(2) the model proper, which runs off those elements of the data-base selected 'conversationally', together with policy variables put in 'conversationally' at run-time, all output being stored on disc;

(3) a financial analyser, which takes the output of the run and applies a cost-structure to it, giving finally a DCF analysis over the life of the project.

Looking in greater detail at the operational simulation, input structure includes:

1. plot data: quality, price, location relative to centre and to next plot. A plantation can be compact, in which case distance to next plot is zero. This is relevant when moving harvesting equipment.

2. growth data, in the form of cumulative yields each year for first, second and third coppicing cycles (the pattern tends to favour increased early growth as root system becomes established). Subsequent cycles are modified by a factor reflecting conjectured long-term performance and eventual decline. Cumulative yield is modified for each plot by the plot quality factor. Annual increments are modified by stored 'year quality' factors; we added this facility because we felt it might be significant, given that year-to-year growths as shown by 'tree-rings' appear to show considerable variation. So far we haven't used it (i.e. year quality factor taken as unity). A cumulative growth curve is taken for a given species, at a given fertiliser level at a given location. We await a good 'growth model' to generate a growth curve given land quality, climatic factors and fertiliser level for a given species. The USDOE(1) has produced a catalogue of 69 such models; each forester writes his own in Fortran, so there is a selection problem. This constitutes a valid area of research. Until a good growth model has been distilled from all the available data, with ability to predict over a validated climatic range, we will have to depend on inspired conjecture fortified by slowly developing site-specific experiments.

3. Weather data is of two types:

(a) general year quality factors, of which we can store a sequence of say 50 or so, based perhaps on integrated degree-days.

(b) short-term data on drying conditions. The Met Office produce 10-day average values of a composite parameter called 'potential evapotranspiration' which we have used provisionally. This has determined our time-frame for the harvesting, drying, and transportation operations: we divide the year into 36 10-day 'weeks' to fit the 'drying conditions' data. It goes back over 10 years, giving an acceptably variable range of wet and dry summers.
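As a minimal sketch of how the stored growth curves and quality factors of item 2 above might combine: the yield figures, cycle count and decline factor here are invented for illustration, not the real data.

```python
# Hypothetical sketch of combining stored growth data with the plot-quality
# and year-quality factors described above. All numbers are invented.

# Cumulative yield (t/ha dry matter) at end of each growth year, one curve
# per coppice cycle; cycles beyond the third reuse the third curve, scaled
# by a conjectured long-term decline factor.
CUMULATIVE_YIELD = {
    1: [4.0, 10.0, 18.0, 27.0],   # first cycle: root system still establishing
    2: [6.0, 14.0, 23.0, 33.0],   # second cycle: faster early growth
    3: [6.0, 14.0, 23.0, 33.0],
}
LATE_CYCLE_FACTOR = 0.9           # conjectured eventual decline

def annual_increment(cycle, year, plot_quality=1.0, year_quality=1.0):
    """Dry-matter increment for one growth year on one plot."""
    curve = CUMULATIVE_YIELD[min(cycle, 3)]
    factor = LATE_CYCLE_FACTOR if cycle > 3 else 1.0
    prev = curve[year - 2] if year > 1 else 0.0
    return (curve[year - 1] - prev) * factor * plot_quality * year_quality
```

So far the year-quality factor has been taken as unity, exactly as in the text; the structure is there to soak up a real growth model when one becomes available.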


We take up to 200 plots, in a sequence of plantings over an N-year cycle. We accumulate biomass growth on each plot according to the (appropriately modified) growth data. We harvest mechanically after the Nth growth year, using equipment characterised by a harvesting rate and an inter-plot travelling rate. Enough equipment units are generated automatically to do the job in the time allotted (ie the non-growth season). If we were to harvest during the growth season we would have to monkey with the growth curves, introducing a new order of uncertainty.
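The equipment-sizing step just described can be sketched roughly as follows; the harvesting rate, travelling rate, season length and plot sizes are all invented for illustration:

```python
import math

# Rough sketch of the automatic equipment-sizing step: enough harvester
# units are generated to clear all mature plots within the non-growth
# season. All rates, areas and distances below are assumptions.

def harvester_units_needed(plot_areas_ha, inter_plot_km,
                           harvest_rate_ha_per_day=1.5,
                           travel_rate_km_per_day=20.0,
                           season_days=120):
    """Total machine-days of work and travel, rounded up to whole units."""
    work_days = sum(a / harvest_rate_ha_per_day for a in plot_areas_ha)
    travel_days = sum(d / travel_rate_km_per_day for d in inter_plot_km)
    return math.ceil((work_days + travel_days) / season_days)

# e.g. 40 ten-hectare plots, 2 km apart on average
print(harvester_units_needed([10.0] * 40, [2.0] * 39))
```

For a compact plantation the inter-plot distances are zero and the travel term drops out, as noted under the plot data above.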

We stack the chunks on the edge of the plot, generating state-variables for volume and moisture for this harvest on this plot. Moisture is updated after each 10-day 'week' in accordance with the drying conditions. When harvesting is complete, transportation to the centre can begin. Various tactics are possible:

* begin immediately with the first lot cut;
* begin after a decent interval;
* begin if moisture is below a specified level.

The importance of this tactical problem depends on the way the market price reflects moisture. We have not yet explored this trade-off in detail; it may become important in future site-specific analyses. The three tactics give respectively higher proportions of dry wood in the central store, at increasing transport cost penalty.

The drying algorithm is the simplest we could think of; we take the updated moisture M(I+1) = M(I) - C*[M(I) - F{P(I)}], where F{P(I)} is the moisture level in equilibrium with 'drying condition measure' value P(I), and C is a constant reflecting the physics of the drying process (possibly related to the surface-volume ratio and the ease with which air can circulate).

The values of C and the form of F(P) have been determined by a rough fit of 'potential evapotranspiration' data for the summer of 1978 and winter of 1978-9 to some drying curves for wood chips and blocks in the open produced by G Lyons at An Foras Taluntais, Oakpark. We are under no illusions about the accuracy, but we did get a reasonable fit and so we think the methodology is valid in principle and can soak up data from other experimental work as it becomes available. If anyone can come up with a better algorithm we will be pleased to plug it in and throw this one away.
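A minimal implementation of the recurrence, with a placeholder equilibrium function F and an assumed value of C; the values actually fitted to the Oakpark drying curves are not reproduced here:

```python
# Minimal sketch of the drying recurrence M(I+1) = M(I) - C*[M(I) - F{P(I)}].
# Both C and the form of F below are invented placeholders, standing in for
# the rough fit to the Oakpark drying-curve data described in the text.

C = 0.3  # assumed drying-rate constant per 10-day 'week'

def equilibrium_moisture(p):
    """Hypothetical F(P): better drying conditions give a lower equilibrium."""
    return max(0.15, 0.45 - 0.01 * p)

def update_moisture(m, p, c=C):
    """One 10-day step of the exponential approach to equilibrium."""
    return m - c * (m - equilibrium_moisture(p))

m = 0.50                      # freshly cut wood, 50% moisture (assumed)
for p in [20, 25, 30, 10]:    # four 'weeks' of drying-condition values
    m = update_moisture(m, p)
print(round(m, 3))
```

Note that the same recurrence allows rewetting: in a wet 'week' the equilibrium moisture can sit above the current stack moisture, and the stack moves back towards it.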

It is worth noting that C is probably a non-linear function of chunk size, and it may have an optimum reflecting a compromise between the 'surface-volume ratio effect' and the 'free circulation of air between chunks' effect. Other effects come in for small chips, such as microbial self-heating. This causes drying, at the expense of loss of carbon to CO2 (up to 10% it is conjectured; this is a serious problem for milled peat). We have not allowed for this phenomenon in the model. The consensus is that the best chunk size is a 10cm cube. This is not the size given by traditional chipping machinery orientated towards pulp or chipboard, which produces a 1-2 cm chip that tends to self-heat. The agricultural/forestry engineering consensus is that you need an initial saw-type action to leave a basically clean cut (for best re-sprouting), plus a guillotine for size-reduction. Such a system would use substantially less energy than a pulping chipper.

Financial Analysis

Returning from this digression into hardware we note that we run the operational model over the whole life-cycle of the plantation, generating a huge file on disc of operational detail.

The financial analysis model works through this, multiplying the operational data by the appropriate unit costs and coming up with a current cash flow analysis for each quarter, as well as a version discounting everything to net present worth.

Any unit-cost can be played with in conversational mode for sensitivity analysis. We exhibit one such analysis, done by Trevor Gibbons, in which he took the price of land as a key variable. He also played with the price of oil relative to inflation (this is credibly assumed to feed into the price realised for output) and with all non-land costs. It is not easy to reproduce this here, but basically he gave a set of curves showing the break-even land price per acre as a function of assumed discount rate, ranging from 4% to 10%, with each curve calculated for an assumed oil price escalation rate of 0%, 3% or 6%. To get break-even for land at £2000 per acre you would need to assume a 7% discount rate if the oil price was appreciating at 3%.
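The break-even calculation can be sketched in miniature; the revenue and cost figures below are invented placeholders for the model's full operational cost structure:

```python
# Illustrative reconstruction of the break-even idea: the land price per
# acre at which the discounted net revenue over the plantation life is
# just repaid. The cash-flow figures are invented, not from the model.

def breakeven_land_price(discount_rate, oil_escalation,
                         annual_revenue=300.0,   # £/acre/yr at today's prices (assumed)
                         annual_cost=120.0,      # £/acre/yr non-land costs (assumed)
                         life_years=20):
    """Net present value per acre of revenue (escalating with the oil
    price) less non-land costs; this is what land purchase must not exceed."""
    npv = 0.0
    for t in range(1, life_years + 1):
        revenue = annual_revenue * (1 + oil_escalation) ** t
        npv += (revenue - annual_cost) / (1 + discount_rate) ** t
    return npv

print(round(breakeven_land_price(0.07, 0.03)))
```

The qualitative behaviour matches the curves described: break-even land price falls as the discount rate rises, and rises with the assumed oil price escalation.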

Conversion Model

We adopted the same programming philosophy as for the production model, with added features as follows:

1) we allow technology characterisation data to be in units as specified in the country of origin (there are many curious units in the US), and we convert to standard units specified by the analyst (e.g. metric), using conversion factors held in the data-base.

2) we accept up to five feed-stocks and produce up to five products in any conversion centre.

3) we set up a network linking many production centres to many markets via many conversion centres.

As the economics of the traditional technologies are well-established and obvious, we feel that this end of the model will not become useful until the exotics begin to become important. However it will be useful to go through a validation loop using the traditional process data, and we expect to be able to do this in Phase II with the aid of the ESB.

We never got around to doing this. RJ 2001.

Using the production model, we conclude that energy farming by managed coppiced hardwood on a short cycle is good economics now, and we are pleased to see that the State agencies are already beginning to take it seriously on the pilot scale (5-10 MW(e)) so as to get a feel for the practicalities.

Team Structure

The way this project developed can be described as somewhat opportunistic. When we took it on it was not yet well specified. We felt our way. There was no-one in the TCD Statistics and OR Lab free, so we used the 'old-boy network' and involved Aoileann nic Gearailt in Tralee RTC; we developed the basic concepts of the model with a view to working in batch mode and doing a one-off analysis. We were then joined by Dr Alasdar Mullarney who is a skilled conversational mode operator and the production model rapidly took its present form. By this time Dr Trevor Gibbons had joined SORL with a track-record of investment analysis, and he took over the sensitivity analysis from Aoileann; all the action was at least now in one place.

The opportunity arose of attending a USDoE conference in Golden, Colorado, in June 1979; I attended this with Robert Friel, who is an engineer turned computer scientist (so incidentally is Dr Mullarney). This gave us some insights into conversion technologies, enabling Robert to develop a reasonable structure for the conversion model. This was fortified by a visit to an IEA meeting in Stockholm.

Finally we are indebted to Dr Mike Jones of TCD Botany Dept for his production of reasonable guesstimates of some growth curves, and to Gerry Lyons of Oakpark and Gerry Healy of Bord na Mona for useful criticism and discussions.

The Phase II team will be centred in the TCD Dept of Computer Science and will involve recruitment. There will be continuity as regards the key people. Or so we thought at the time of the OR conference; we never got as far as Phase II, and the PDP 11/34 mini-computer went a-begging, to the extent that the Applied Research Consultancy Group sold it to the Computer Laboratory, where it did good service for its useful life. RJ 2001.

Notes and References

1. Inventory of US Forest Growth Models, Environmental Science Division, Oak Ridge National Laboratory, USA.
