A half-solution for two (or more) y-axes with ggplot

I've been teaching R, and especially ggplot, to beginners in the language this week, and predictably the topic of how to put two separate y-axes (with a common x-axis) on the same plot came up.

Unfortunately, the answer is "not easily", since the inability to do this is intentional (Hadley Wickham gives his reasons here, for example). Actually putting one y-axis on the left side of the graph and a different y-axis on the right can be done, but it requires some delving into the heart of ggplot which is beyond my understanding at the moment.

What is easier - and in my opinion, preferable in most cases - is to use facetting or a package like gridExtra to produce separate stacked panels. But gridExtra (specifically the grid.arrange() function) misaligns plots which have expressions (subscripts and superscripts) in the axis titles - and facetting by default doesn't make it easy to label axes the way I want (again, because I often need super/subscripts in labels), or to rescale the y-axes of individual facets to the values I want.

I had a think about it after the discussion we had in class, and managed to reach a reasonable compromise with the facetting approach, which is fairly straightforward and doesn't require any extra packages.

I'll demonstrate this with some arbitrary functions with very different ranges of y values:

library(ggplot2)
library(dplyr)
library(tidyr)

x <- seq(from = -5, to = 5, by = 0.05)

df <- data.frame(
  x = x,
  fun_a = sin(x^2),
  fun_b = 50 * sin(x)
)

To make use of facet_grid() this data needs to be converted to "long" format, which is easily accomplished with tidyr::gather():


df2 <- df %>%
  gather(key = fun, value = y, -x)

Now we can ggplot() this data with the two functions in separate facets, making use of the scales = "free_y" argument:

ggplot(df2) +
  geom_path(aes(x = x, y = y, color = fun)) +
  facet_grid(fun ~ ., scales = "free_y")

[Plot: the two functions in stacked facets with free y-axis scales]

This is fine, but what if I want to plot data series which have different units? I'd prefer to have an axis title on the left for each facet. There's only one y-axis title here, and I can't easily change that - but what I can do instead is change the facet labels, move them and make them look like axis titles.

The easiest thing to do seems to be to change the column names with dplyr::rename() before gather(). To show the superscripts, etc, the column names have to have the form of expressions, like they would if you were to do the same thing with ylab().

df3 <- df %>%
  rename(`sin~(x^2)` = fun_a, `'50'~sin~(x)` = fun_b) %>%
  gather(key = fun, value = y, -x)

Now I can remake the plot with a couple of extra arguments to facet_grid(), and some theme() modifications to make the strip.text (facet label) look the same as the x-axis label.

ggplot(df3, aes(x = x, y = y, color = fun)) +
  geom_path() +
  facet_grid(fun ~ ., scales = "free_y", 
             labeller = label_parsed,
             switch = "y") +
  theme(strip.background = element_blank(),
        axis.title.y = element_blank(),
        strip.text = element_text(size = rel(1))) +
  guides(color = FALSE)

[Plot: facet labels moved to the left with switch = "y", styled as y-axis titles]

Great! This is more or less what I'm after. I still have some grumbles though. One of the main ones is that I can't easily rescale the y-axis on an individual facet - I'm stuck with scales = "free_y".

The most I can do - as far as I know - is force it to rescale a bit outside the range of the data by making some dummy data which I include in the plot as an invisible geom_blank. Like this:

dummy <- data.frame(
  x = 0,
  y = c(-1.5, 2),
  fun = "sin~(x^2)"
)

ggplot(df3, aes(x = x, y = y, color = fun)) +
  geom_path() +
  geom_blank(data = dummy) +
  facet_grid(fun ~ ., scales = "free_y", 
             labeller = label_parsed,
             switch = "y") +
  theme(strip.background = element_blank(),
        axis.title.y = element_blank(),
        strip.text = element_text(size = rel(1))) +
  guides(color = FALSE)

[Plot: the sin(x^2) facet rescaled to (-1.5, 2) via the invisible geom_blank data]

But I don't know if rescaling the y-axis of a single facet to a range within that of the data is easily achievable. I would also like to be able to change the facet heights manually - maybe that's possible with gtable, for example, but that's out of my expertise. Here's hoping for an easier implementation in a future version of ggplot!
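On the facet-heights point, gtable does appear to make this possible, though I haven't explored it deeply. Here's a minimal sketch using the standard gtable layout-indexing approach - this is my own illustration, not code from the analysis above:

```r
library(ggplot2)
library(grid)

# Build a faceted plot and convert it to a gtable
p <- ggplot(mtcars, aes(x = wt, y = mpg)) +
  geom_point() +
  facet_grid(cyl ~ .)
g <- ggplotGrob(p)

# Find the rows of the layout that contain the panels...
panel_rows <- g$layout$t[grepl("panel", g$layout$name)]

# ...and make the first facet twice as tall as the others
g$heights[panel_rows[1]] <- unit(2, "null")

grid.newpage()
grid.draw(g)
```

The "null" units share out the remaining space proportionally, so a value of 2 against the default 1 doubles that panel's share.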


Could wind + batteries really replace a nuclear power plant?

tl;dr: Probably not. The intermittency and seasonal variation of wind is so severe that, even optimistically, it would be considerably more expensive and, if it were to use Li-ion batteries for storage, would require as many batteries as have been produced in the entire world over the last four years.

The UK government's recent decision to delay the final decision on the planned Hinkley Point C nuclear power plant has somewhat rekindled the debate on whether the UK should have the plant at all. Some have gone as far as to suggest that given the projected cost, the guaranteed price of the energy produced and the timescale of the project, it should be scrapped and allowed to be replaced with a combination of renewable energy sources and with energy storage, both of which are dropping in cost.

[Jeremy Leggett, the founder of solar panel maker Solarcentury] is delighted that others are picking up on arguments he has been making for years. "Finally the message is getting through that Hinkley, and indeed nuclear, make no sense today simply because wind and solar are cheaper. If we accelerate renewables in the UK, we can get to 100% renewable power well before 2050," he says.

I'm sure we can all agree that cheap, low-carbon renewable electricity would be a great thing. But if you are proposing to eventually remove nuclear and fossil fuels entirely, are renewables still as "cheap" when we need to rely on them to maintain demand? From that same Guardian article:

The Economist believes improved electricity storage is a key answer to the frequently repeated criticism of wind and solar that it is intermittent, and points out that battery technology is fast improving.

First, let's be clear. It is not a mere "criticism" of wind and solar that they are intermittent - it is a cold, hard fact. It's physically impossible to generate solar power at night, or significant wind power when the wind's not blowing. In order for electricity generation to meet or follow demand, excess generation must be curtailed somewhere, or the energy stored. Similarly, insufficient generation must be supported by some other form of energy generation to avoid blackouts. At the moment, this role is largely filled by gas in the UK. But in the absence of other conventional means of generation, it needs to be filled by some form of storage, perhaps the much-vaunted batteries. This is especially important if neighbouring countries make similar moves towards renewables, since available wind speed and sunlight do not tend to vary much across neighbouring countries - those neighbours may not be able to export energy when generation is barely sufficient across an entire continent.

But how much storage would be needed, say, to effectively convert intermittent renewable power into providing baseload power equivalent to that which would be provided by Hinkley Point C - a constant 3.2 GW? Well, this can be estimated with some crude analysis of publicly available data. I was interested to see how it turned out, and I figured it was worth reproducing here. I did the analysis with R, and have included the code (except for the code generating the plots) and the data here so that it can be reproduced.


I'm going to use existing wind generation data for this analysis, since the UK already has a significant amount of wind power, and on the assumption that large-scale deployment of solar power would not be all that sensible for one of the darkest countries in the world. The data I've used is the energy production data for the UK for the entire year 2015 - from gridwatch.co.uk - which I've reuploaded to this website.

The data is in the standard CSV format, and I use a couple of add-on packages for the analysis.

library(dplyr)
library(lubridate)

gridwatch <- read.csv("http://lacey.se/dl/gridwatch-2015.csv")


Let's check what it looks like:

##       id            timestamp demand frequency coal nuclear ccgt wind
## 1 377525  2015-01-01 00:00:04  28809    50.090 9079    8049 3360 5251
## 2 377526  2015-01-01 00:05:02  28645    50.092 8947    8053 3369 5254
## 3 377527  2015-01-01 00:10:02  28768    50.116 8843    8052 3372 5272
## 4 377528  2015-01-01 00:15:02  28917    50.045 8763    8047 3339 5303
## 5 377529  2015-01-01 00:20:02  28964    50.030 8818    8051 3386 5223
## 6 377530  2015-01-01 00:25:02  29055    50.006 8906    8055 3392 5189
##   french_ict dutch_ict irish_ict ew_ict pumped hydro oil ocgt other
## 1        582       900       -72   -136     15   443   0    0  1157
## 2        586       898      -100   -134      0   441   0    0  1157
## 3        586       898      -100   -134      0   440   0    0  1157
## 4        586       898      -100   -134      0   439   0    0  1155
## 5        586       898      -100   -134      0   440   0    0  1155
## 6        586       898      -100   -134      0   441   0    0  1155

The data is quite thorough, but all I really want for now is the wind data. First I'm going to convert the timestamp to POSIXct date/time format with the appropriate function from the lubridate package, then I can select out the columns I need.

gridwatch$timestamp <- ymd_hms(gridwatch$timestamp)
df1 <- select(gridwatch, timestamp, wind)

Check again:

##             timestamp wind
## 1 2015-01-01 00:00:04 5251
## 2 2015-01-01 00:05:02 5254
## 3 2015-01-01 00:10:02 5272
## 4 2015-01-01 00:15:02 5303
## 5 2015-01-01 00:20:02 5223
## 6 2015-01-01 00:25:02 5189

The wind column shows the power generated in units of MW.

[Plot: UK wind power generation over 2015]

Straight away you can see the issue with intermittency. Wind production averages about 2.6 GW over the whole year, but this can be in excess of 6 GW during windy times, and almost nothing during some lulls in the summer. At the end of 2015, the UK had a total of 13.6 GW of capacity installed, indicating a capacity factor of 19%, which seems reasonable.
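That 19% figure is simple arithmetic, and can be sanity-checked directly (both numbers come from the text above, not from the dataset):

```r
# Capacity factor = average generation / installed capacity
mean_wind_mw <- 2600    # approximate average UK wind output over 2015, MW
capacity_mw  <- 13600   # installed UK wind capacity at the end of 2015, MW
round(mean_wind_mw / capacity_mw, 2)  # 0.19
```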

I'll make two new columns in this data frame - one for the time increment (in seconds) and then use that to integrate the power column to get the total energy generated in MWh.

df1$difftime <- c(0, as.numeric(diff(df1$timestamp), units = "secs"))
df1$totwind <- cumsum(df1$wind * df1$difftime / 3600)

So, I want to see how a constant 3.2 GW baseload can be generated by wind, with excess energy stored and then released when the wind isn't sufficient. We can reasonably assume that with more turbines the power generated will scale linearly. We can make a new table from the same data, but adjust the wind so that the total energy generated by wind power throughout the year will be equal to 3.2 GW x 24 hours x 365 days.

df2 <- select(gridwatch, timestamp, wind)
df2$difftime <- c(0, as.numeric(diff(df2$timestamp), units = "secs"))

df2$wind <- df2$wind * (3200 * 24 * 365 / (max(df1$totwind)))
df2$totwind <- cumsum(df2$wind * df2$difftime / 3600)

The average should come out to be about 3200 MW now, so let's check that's the case:

##    Min. 1st Qu.  Median    Mean 3rd Qu.    Max. 
##   77.92 1506.00 2889.00 3201.00 4805.00 8022.00

For plotting purposes I'll include equivalent columns for the constant 3.2 GW.

df2$base <- 3200
df2$baseenergy <- cumsum(3200 * df2$difftime / 3600)

Now the table looks like this:

##             timestamp     wind difftime   totwind base baseenergy
## 1 2015-01-01 00:00:04 6294.875        0    0.0000 3200     0.0000
## 2 2015-01-01 00:05:02 6298.472      298  521.3735 3200   264.8889
## 3 2015-01-01 00:10:02 6320.050      300 1048.0443 3200   531.5556
## 4 2015-01-01 00:15:02 6357.213      300 1577.8120 3200   798.2222
## 5 2015-01-01 00:20:02 6261.309      300 2099.5878 3200  1064.8889
## 6 2015-01-01 00:25:02 6220.550      300 2617.9669 3200  1331.5556

Let's see how the total amount of energy generated from wind over the year would look like compared to a constant 3.2 GW.

[Plot: cumulative energy generated by wind vs. a constant 3.2 GW over the year]

The energy generated by wind is shown in red. The difference between these two lines will show us roughly the difference between the energy generated by wind power and the energy consumed by the constant 3.2 GW we're looking for. So let's do that:

df2$diffpower <- df2$wind - df2$base
df2$diffenergy <- cumsum(df2$diffpower * df2$difftime / 3600)

The power requirements will look like this (positive values indicate excess wind, so the batteries would be charging; negative values indicate that wind is insufficient, so the batteries need to take over to maintain 3.2 GW):

[Plot: power difference between the scaled-up wind and the 3.2 GW baseload]

The total energy stored or released looks like this:

[Plot: cumulative energy stored or released over the year]

It's clear from this plot there is a huge seasonal variation in wind power, with greater generation in the winter, and the storage needed as backup in the summer. The difference between those minimum and maximum peaks (in March and October respectively) is the total amount of energy we would need to backup the wind power - therefore it's the capacity of the storage we would need to guarantee a constant 3.2 GW baseload without relying on other methods of generation. In this case, we can see it's of the order of about 4 TWh (that's TERAwatt-hours).
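The ~4 TWh figure is just the span of the cumulative difference column (df2$diffenergy above): its maximum minus its minimum. Wrapped as a small helper for clarity - the function name is my own invention - it looks like this, demonstrated on a toy two-step series:

```r
# Storage needed to back up an intermittent source against a constant baseload:
# the range of the cumulative (generation - baseload) energy, in MWh
storage_span_mwh <- function(power_mw, base_mw, dt_s) {
  diffenergy <- cumsum((power_mw - base_mw) * dt_s / 3600)
  max(diffenergy) - min(diffenergy)
}

# Toy check: one hour at 1000 MW above baseload, then one hour at 1000 MW below,
# means 1000 MWh is stored and then released again
storage_span_mwh(c(4000, 2000), base_mw = 3000, dt_s = c(3600, 3600))  # 1000
```

Applied to the real data, storage_span_mwh(df2$wind, 3200, df2$difftime) / 1e6 gives the storage requirement in TWh.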

4 TWh is a colossal amount of energy - roughly the yield of 35 Trident nuclear missiles, roughly the total battery capacity of about 45 million Tesla Model S cars, and about 100 years worth of Li-ion batteries at the current rate of production.

More than that, at a target price of $100/kWh for batteries (=$100bn/TWh) this would cost of the order of $400 bn for the batteries alone. This price for usable battery storage is considerably cheaper than currently available for any of the available chemistries, even if you're Tesla.

Meanwhile, based on a capacity factor of about 20%, we would need wind power with a capacity of 16 GW, which based on estimates of $1.3m - $2.2m per MW would cost between $20.8 and $35.2bn. The cost of Hinkley Point C by comparison is estimated to be of the order of $24 bn. So yes, the wind turbines could potentially be cheaper to install than the nuclear power plant, but even then, they can't supply power on the same basis without additional - and possibly insane - storage capability.
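These cost figures follow from straightforward multiplication; a minimal check:

```r
# Battery cost: 4 TWh = 4e9 kWh, at a target price of $100/kWh
4e9 * 100 / 1e9                 # $400 bn

# Wind cost: 16 GW = 16,000 MW, at estimates of $1.3m-$2.2m per MW
16000 * c(1.3, 2.2) / 1000      # $20.8 bn to $35.2 bn
```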

More wind turbines?

Ok, maybe it's not viable to try to store every last bit of energy produced by the wind turbines. Perhaps we can just disconnect them during very windy periods, and store only the energy needed to cover the times when wind generation falls short. How many batteries might we need then?

To estimate this I went back and scaled up the wind further, so that the average power is 20% over the 3.2 GW we actually want. The rest of the code is the same.

df4 <- select(gridwatch, timestamp, wind)
df4$difftime <- c(0, as.numeric(diff(df4$timestamp), units = "secs"))

df4$wind <- df4$wind * (1.2 * 3200 * 24 * 365 / (max(df1$totwind)))
df4$totwind <- cumsum(df4$wind * df4$difftime / 3600)

df4$base <- 3200
df4$baseenergy <- cumsum(3200 * df4$difftime / 3600)

df4$diffpower <- df4$wind - df4$base
df4$diffenergy <- cumsum(df4$diffpower * df4$difftime / 3600)

If I remake the same plot and rescale it a bit:

[Plot: cumulative energy stored or released with 20% extra wind capacity]

Then the difference between the minimum and maximum values here is the amount of energy that needs to be stored, which is about 2 TWh - halving the cost of batteries needed for just 20% more wind turbines. This seems more reasonable, so what if you keep adding more wind power?

Well, I won't reproduce the plot here - you can do it yourself - but the cost does keep shrinking. If you double the amount of wind power - to 32 GW, giving an average of 6.4 GW over the year, you can still see that there are periods of insufficient wind to meet 3.2 GW that still require about 200 GWh worth of storage. This is still a huge amount, equivalent to the total storage of a few million electric cars and about four years worth of production of Li-ion batteries. For comparison, the world's largest grid storage battery opened earlier this year in Japan - a sodium-sulfur battery with a capacity of 300 MWh (that's 0.3 GWh).
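The "few million electric cars" comparison assumes roughly 85 kWh per car - the Model S pack size of the time, an assumption on my part:

```r
# 200 GWh of storage = 200e6 kWh, expressed as 85 kWh EV battery packs
200e6 / 85    # about 2.35 million cars
```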

The cost of installing this would probably then be in the range:

  • Wind turbines: 32 GW capacity, $41.6 - $70.4bn
  • Battery storage: 200 GWh, ~$20bn

This would imply an installation cost of maybe three times as much as Hinkley Point C, for a system with a considerably shorter lifetime, and most likely with much more expensive electricity for the consumer. I doubt it would get considerably cheaper than this if I kept going, since at this point the cost of the wind turbines is already well in excess of the cost of the batteries.

So what can we conclude? I think the main thing this thought experiment shows is just how important it is to have a diverse mix of technologies in energy generation. I think this fact is usually lost on people with stubbornly anti-nuclear and anti-fossil fuel views. I know as well that I have not considered other things like tidal or hydroelectric power here, but these are very much geography-dependent and not always a viable option.

I don't intend to say that there is no place for renewables at all in our energy mix, but I cannot see a target of 100% solar or wind grid production as being anything except ruinously expensive, and it seems wildly improbable that battery storage could make it work.

Bear in mind too, that this is just one power plant. Average demand in the UK was 32.8 GW in 2015!

The last word

At this point it is reasonable to point out that the seasonal trend in solar generation runs (very roughly) opposite to that of wind. That is, it is less windy in the summer, but it is sunnier in the summer, so a mix of solar and wind might reduce the requirement for storage. This is true, but some level of storage would still be required, and my intuition tells me that it would not be a significant difference (and other analyses done elsewhere suggest as much). At the moment I don't have data for UK solar generation to play around with, but this would be interesting for a future post.


Can't underestimate luck

I took this photo of a soap bubble on Sunday:

Rural Sweden in a bubble

It received a surprising amount of attention on Flickr, with almost 8,000 views in one day, way more than any photo of mine has ever had. At the time I didn't even really think about what the photo would look like, and the perfect reflection of the house and the colours in the bubble itself were an unexpected bonus!


Don't believe in miracles

The above was one of the closing points in an excellent talk I saw earlier today by Dr Jef Ongena, the chairman of the Energy group at the European Physical Society. Based on the email announcement for the talk I was expecting to hear mostly about current progress in nuclear fusion research, but what we got was partly a thought-provoking and fiercely critical overview of the European approach towards renewable energy, and partly a call for better public awareness and consideration of wider context. It really deserved a larger audience than it had so I thought I’d write a bit about it!

“Europe alone cannot save the world” was the first key message, giving examples such as Germany: around 1 trillion Euros has been committed to the Energiewende, and has so far brought about very little meaningful reduction in Germany’s CO2 emissions, which themselves contribute only ~2.5% of the world’s emissions (interestingly enough, the decade following the collapse of the GDR saw German CO2 emissions drop by a quarter, simply because of the loss or modernisation of East German industry). Globally, this reduction is insignificant, because emissions from countries such as India and China have increased to a greater extent. More than this, as a result of this policy, German energy prices are among the highest in the world.

Dr Ongena was also keen to point out that at least a portion of the EU’s reduced emissions are effectively an accounting trick – we import more goods from China rather than produce them here, meaning the associated emissions appear as Chinese emissions and not European. It’s worth reading the EPS Energy Group’s position paper on this topic.

What was most interesting for me was the stark look at what the consequences of an energy system based 100% on renewable energy would be. At this point I am reminded of this excellent post on the situation in Scotland, since this is exactly what the Scottish government is aiming for in the short term (the blog at which that post resides, Energy Matters, is excellent in general, by the way). The biggest issue with renewable energy such as solar and wind is intermittency; wind is unpredictable, and solar production is typically out of phase with consumption, so some energy storage is essential. But how much?

It’s also one thing to cope with diurnal (day/night) variation in production and consumption, but if you plan for a future in which solar is a large part, or the major part, of the energy production mix, then the huge seasonal variation in energy production becomes a big problem.

Dr Ongena gave an example of a study looking at German energy production as it would look in 2050 based on current plans. Unfortunately, I didn’t note down the reference, but the short version is that coping with the seasonal variation in energy production and consumption would require of the order of 33 TWh of storage capacity for Germany alone! To put this into context, if this energy was to be stored using batteries, 27 cubic kilometers(!!) of space would be needed to store the batteries themselves. How much space is that? Well, I calculated that myself, and it would take an aircraft hangar tall enough to fit an Airbus A380, this big:

What 900 km² looks like

Or, about 2,000 buildings the size of the Boeing Everett Factory, the largest building in the world by volume. I won’t even try to estimate the impossible cost of such a solution.

This idea of “knowing your numbers” was I think the main scientific point in the talk, which pleased me greatly: this is something I think is really important, and is something I try to prioritise in any teaching that I do. Dr Ongena gave a few interesting factoids – for example, that the energy consumed by satellite TV boxes in Belgium is something like 17 times more than the energy consumed lighting all the country’s roads – but the main point was about the power density of energy production.

Power density, as in power per unit area, tells you about the land area needed to produce energy, and in this respect all renewable energies are considerably more “dilute” than conventional (fossil fuel, nuclear) technologies. I’ve since found some good references for this, notably this one, so I won’t write anything else on this except to say that it is often and easily forgotten that wind power, for example, requires 400-500 times the land area to provide the same power as nuclear (and optimistically about 50 times for solar). It’s long been a mystery to me as to why some types of environmental destruction (e.g., large scale pumped hydro or vast fields of massive on-shore wind turbines) are apparently preferable to others (e.g., storing relatively small amounts of nuclear waste, or fracking).

Although it was a talk with relatively few crumbs of comfort for the future energy landscape (fusion power was not even discussed), it was still rather refreshing to get such a brutal reality check and plenty of food for thought. Unfortunately, my suspicion is that politics and emotions will always trump science in any decision-making process, but I would be happy to be proved wrong on that.


Energy density and Li-S batteries

Yesterday I created a new Shiny app for estimating the gravimetric energy density of Li-S cells based on ten different parameters of the materials that make up those cells. You can find the page describing the app here (or you can jump straight to the app here).

I wrote some accompanying text giving some background to the app – more specifically about the gap between theory and practice and why wild promises surrounding new battery technologies never seem to come true. That page got a bit long, so I’ve split off that text into this post. It’s still a bit long, but I hope it can be of some interest!

Gravimetric energy density, or more properly specific energy – the amount of energy stored for a given mass – is one of the most important characteristics of a battery for any portable application, whether it’s for a laptop or an electric vehicle. It’s especially important for the latter, because the energies required to move something as heavy as a car over distances of hundreds of kilometers currently require exceptionally heavy batteries. This is one of the main motivations for research into new rechargeable battery chemistries which have much higher theoretical energy densities than Li-ion batteries, the current state-of-the-art.

The theoretical specific energy (hereafter referred to simply as energy density) of the lithium-sulfur (Li-S) battery system, for example, is given in various papers, review articles, news articles, etc, to be about 2,600 Wh/kg. This number comes from simply converting the Gibbs free energy of formation of Li2S (-432 kJ/mol) into the units of Wh/kg, and isn’t actually based on anything relating to the construction of a battery (it doesn’t consider, for example, an electrolyte, without which a battery can’t function). The real energy density of a battery is the energy released from the electrochemical reaction divided by the masses of everything in that battery – both electrodes, the separator, the electrolyte, the current collectors, and the packaging.
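That conversion is easy to reproduce: take the magnitude of the free energy, divide by the molar mass of Li2S, and convert joules to watt-hours:

```r
# Theoretical specific energy of Li-S from the Gibbs free energy of Li2S formation
dG_J_per_mol <- 432e3                  # |dGf| of Li2S, J/mol
M_Li2S <- 2 * 6.941 + 32.066           # molar mass of Li2S, g/mol (~45.95)
dG_J_per_mol / M_Li2S / 3600 * 1000    # Wh/kg: ~2600
```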

Most often, the number of interest is the energy density on the cell level, that is, the energy density of a single cell. This is of course much lower than the theoretical energy released from a perfectly efficient reaction of the reactants in that cell, because all the components besides the active materials (sulfur or lithium, in this case) do not contribute to the energy density, even if the battery can’t work without them. However, this fact hasn’t stopped a large number of researchers and journalists describing the system from writing things like:

The theoretical energy of the Li-S battery is 2,600 Wh/kg, which is much higher than for Li-ion batteries, currently 150-180 Wh/kg.

This is enormously misleading, and I am of the opinion that even unintentionally making these inappropriate comparisons – and the implicit wild promises of magic technology to come – does not do the reputation of the field any favours. Making cells with high energy densities is very hard: there are two companies that I know of (Sion Power and OXIS Energy) which have been developing these batteries since long before the current rush of academic interest, and which currently produce real cells with energy densities of >300 Wh/kg. Both claim to be able to deliver 400 Wh/kg in the near future. For comparison, the highest-energy Li-ion battery in production that I’m aware of is the 243 Wh/kg Panasonic NCR18650B.

I’m also of the opinion that many researchers in the field do not really appreciate what a remarkable achievement companies like Sion Power and OXIS have made in producing 300+ Wh/kg cells that can actually be recharged for more than a few cycles. One of the key conclusions I’ve come to in the time that I’ve been working in this field is that the more you work to make a cell that will actually have a high energy density, the more you realise the system really doesn’t want to work nicely under those conditions.

Making a high-energy density battery is hard

What I mean by “a cell that will actually have a high energy density” is one where the dead weight – the weight in the cell which is not active lithium or sulfur, in this case – is minimised as far as possible. The electrolyte and current collectors are big contributors to this, but other electrode additives and any excess on the part of one of the electrodes also contribute. More importantly, the electrochemistry of the lithium-sulfur battery is extremely sensitive to many of these factors in ways that Li-ion batteries simply aren’t.

As far as I can see, the most serious issues regarding long-term rechargeability (cycle life) are a direct result of the instability of the negative electrode and destruction of the electrolyte. In most academic work when results from test batteries are reported, the electrode “loading” (the amount of sulfur on the positive electrode per unit area) is usually low, and the electrolyte and negative (lithium) electrode are in huge excess. This minimises the effect of capacity loss due to these serious issues. This is not an issue in itself: you can deliberately test cells in this way, so as to look at the stability of the positive electrode itself (in what we would typically call a half cell). However, in most cases it is not obviously deliberate: even though there are a number of papers now which have demonstrated how fundamentally important the electrolyte volume is, most papers in the recent past do not even report how much electrolyte was used. And from my own experience, I have so far only reviewed one or two articles on Li-S batteries where I have not had to ask the authors to include the electrolyte volume (or more specifically, the electrolyte/sulfur ratio). The thickness of the negative (lithium) electrode is reported even less frequently, and is also important. This situation of unreported experimental parameters does seem to be slowly improving, though.

It is also, simply, more convenient to make test batteries like this. For example, it is harder to coat thicker positive electrodes. Very thin Li foil (e.g. tens of µm thick) has also only recently become available, is difficult to work with and is relatively expensive. It is also not trivial to work with realistically small electrolyte-to-active material ratios for such test batteries either, because the volumes are usually very small (perhaps less than 30 µL, for batteries with a few mAh of capacity). For all these reasons, the rapid capacity fade that comes with having the combination of a low electrolyte volume, a thick positive electrode and a thin negative electrode, is usually not seen.

It is common now to see journal articles reporting test Li-S batteries as completing hundreds and hundreds of cycles with little capacity fade. This frequently comes with the implication, intended or not, that a major hurdle with the system has been cleared and we are well on our way to long-lived, high energy batteries with a dizzying range of applications. More disturbingly, I have seen a number of papers report relatively unremarkable results along with a statement to the effect of:

“We estimate that this corresponds to an energy density of 750 Wh/kg in a complete cell”

or something similarly unsubstantiated, where this projection is made on the basis of assuming masses of other components which may not be realistically achievable. I do not know if those who would write this or similar things actually believe it, but certainly many readers would, and would assume it to be the truth, not least because these sorts of statements pass peer review. I know the urge and need to promote and spin one’s research is a strong one, but this is a bad habit that really needs to be kicked.

This all sounds very bad!

I’m not trying to play down the potential of the lithium-sulfur system at all, in case that’s what it looks like. As much as I believe that wild, unrealistic promises are likely to eventually kill off interest from funding agencies and industry when those promises can’t be fulfilled, I also believe that direct and brutal criticism of these promises may achieve the same result. I’m not going to say that a 500 Wh/kg Li-S battery is impossible. On the contrary, I think it’s quite realistic! I could tell you some values we need to reach in order to get there, although I couldn’t tell you how to actually get to them (that’s what research is for!). I will say that I’m fairly sure it’s not going to happen without better awareness of the limitations of our experiments and what conclusions can be drawn from them. It’s just as Feynman said (quoted on the top “Science” page on this site):

The first principle is that you must not fool yourself - and you are the easiest person to fool.

Follow this link to comment...