^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Always wanted to do that! :)
A while back I posted a pictorial representation of the effects of temperature and SOC on battery degradation. The pic did not have a timeline, but I did as much as I could to emphasize that TIME at these conditions is what should be avoided. That did not go over well, so I am back to try again. Plus, I have a confession to make.
First off: my contention was that a major portion of the degradation we were seeing has more to do with time at high SOC than with heat alone. There are simply too many people seeing heavy degradation in areas with little in the way of oppressive climate, including one guy with very modest transportation needs and a 2016 LEAF (2015 build) who lost more than 10% capacity in less than a year. The real shocker? He lives in Northwest Oregon! It became apparent to me that the general conception of "degradation hot" needed an adjustment. I began to see a pattern where even temps as mild as the 80s seemed to matter, but I see those all Summer, so why was I not affected?
Now, if we go to Battery U, there are some interesting insights, and with the now-infamous LEAF degradation uproar, they naturally wanted to nose into this as well. But the real shocker for me was this statement:
"Batteries chosen for an electric powertrain go through strenuous life cycle testing and Nissan selected a manganese-based Li-ion for the Leaf EV because of solid performance. To beat the clock, the test protocol mandated a rapid charge of 1.5C (less than 1 hour) and a discharge of 2.5C (20 minutes) under a temperature of 60°C (140°F). Under these harsh conditions, a heavy-duty battery is expected to lose 10 percent after 500 cycles, which represents 1–2 years of driving. This emulates driving an EV through the heat of a biblical hell, leaving rubber marks from aggressive driving, and still coming out with a battery that boasts 90 percent capacity."

It's no wonder Nissan was so confident in their pack. They put the pack in an oven for a month and it came out in fairly good shape! But what happened? The cycle test obviously failed to resemble reality, but this much off?
So a few labs dissected degraded LEAF packs to find out what the major malfunction was, and determined that there are several types of degradation and that they are detectable. The LEAF packs suffered from extended time at high SOC COMBINED with high temps.
"The cathode (positive electrode) develops a similar restrictive layer known as electrolyte oxidation. Dr. Dahn stresses that a voltage above 4.10V/cell at elevated temperature causes this, a demise that can be more harmful than cycling a battery. The longer the battery stays in a high voltage, the faster the degradation occurs."

By now, I was beginning to realize that although heat is a major player, it's not as significant by itself as the general consensus holds. It's heat AND high SOC AND time that is the real killer. What does the bolded statement above mean? The obvious: the higher the heat and the longer the time at high SOC, the more we need to avoid the combination. But the statement also implies the RATE of degradation increases with heat. To compare: the cycle stress test used a fairly wide range of SOC, probably running between 15-100% (a search of the site provided no specifics), lasted roughly a month, and produced losses of less than 5% per "rated" year. Most manufacturers recommend long-term storage at 15ºC (59ºF) and 40% SOC. But at the test's 60ºC, even a cell stored at that gentle 40% SOC loses about 25% capacity in a year, and a pack sitting on the shelf at that same temperature at full charge loses roughly 40% of its capacity in just 3 months, over 13% per month.
Well, avoiding heat is not possible. The climate is what it's going to be, and by and large it's getting warmer, so we need to look at the other two factors, the ones we can control. The reason the cycle test yielded favorable results is that the pack was immediately discharged to a lower, less critical SOC, in 20 minutes! So there were high SOC and high temps, but minimal TIME at high SOC, making temperature a lesser factor due to the transient nature of the fast cycle testing. This also implies that temperature alone is not the major factor, since the oven was at 140ºF the entire month! (A back-to-back 500-cycle test would actually run roughly 28 days.)
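As a sanity check on that timeline, here is a back-of-the-envelope sketch. It assumes the charge leg takes the full hour Battery U allows at 1.5C ("less than 1 hour") and that the cycles run back-to-back with no rest periods, so treat it as a rough bound, not a test-protocol fact:

```python
# Rough duration of the 500-cycle stress test, run back-to-back.
# Assumptions: ~1 hour per 1.5C charge, 20 minutes per 2.5C
# discharge, no rest between cycles.
charge_min = 60      # 1.5C charge, "less than 1 hour" per Battery U
discharge_min = 20   # 2.5C discharge
cycles = 500

total_min = cycles * (charge_min + discharge_min)
days = total_min / (24 * 60)
print(f"{days:.1f} days")  # ~27.8 days, i.e. roughly a month in the oven
```

That lines up with the pack spending about a month at 140ºF, yet losing only ~10%, because it never sat at high SOC.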
In the previous blog, one of the statements that caused the greatest uproar was my positing that a major cause of degradation was free workplace charging: likely done during the hottest part of the day, sometimes uncovered, and an extended time at high SOC. Extended, you say? Yes. With level 2 charging, even if you are lucky enough to time your exit just as the charge completes, the charge is so slow that you still spend as much as two hours in the heat if you recharged to full. Back to the bolded statement above: can we say for certain that two hours at high temps/high SOC is twice as bad as one hour... or is it more?
Now, all of this is something I have suspected for years, mostly because people in areas like San Diego, where anything above the mid 80s is not common, were seeing heavy degradation. But what IS common there is sunny days and asphalt: an apparently lethal combination.
Ok, so I mentioned that I had a confession to make. By now most of you know that I have used a LOT of free charging, compliments of Nissan's 2-year free charging program, NCTC. What you don't know is that when the weather turned in mid May, I stopped fully charging on level 2 and resorted to almost all public fast charging, frequently during the hottest part of the day. Crazy, you say? Well, maybe, but...
Another experiment, started in February, was seeing how bad multiple QCs heating up the pack really was. I can't change my climate, but generating heat was easy, so on many multi-fast-charge days I was hitting 9, 10, even 11 temperature bars (TBs) on a regular basis. During this time, I only encountered one instance where I think my charge speed was limited due to heat: a fast charge on an AV station that should have run at 50 kW only ran at 30 kW, and I was only at 9 TBs when I started. I have started several fast charges at 10 TBs without any slowdowns, so maybe a station glitch?
As luck would have it, Summer came early to Western WA and ended up being the longest Summer in my 32 years here, by a long, long shot. The general consensus is maybe a week of Summer-like weather by the 4th of July, so having it start in mid May was a pleasant surprise! The 9-ish holiday days (the 3-day Memorial and Labor Day weekends plus the 4th) were all hot, sunny and gorgeous. There may have been one Memorial Day with 3 good days in my 32 years, but mostly I remember wishing for one "decent" day!
This meant no more charging to full on level 2, and we had 6 months of warmth! In reality, I did charge to full 12 times, but each time my departure from home was before 3 AM. I left the garage door open at night until bedtime to help it cool, and only a few times did the garage temp remain in the 80s, so... not too bad. This was great for the electric bill, since the only time I plugged in was when I got home below the blinking bar, and then only for an hour.
Since my experiment is based on time at high SOC, rather than heat, as a main cause of degradation, it was essential to:
1) Heat up the pack A LOT and keep it hot, which I did a fairly good job of.
2) Still have enough range to average over 2,500 miles a month, which I did (obviously, since there were no rides of shame to report!).
3) Never let the car sit at high SOC, which I accomplished for the most part.
4) Keep the LEAF at the lowest SOC possible as much as possible. I will admit a bit of concern plugging into a QC needing at least 75 miles of range while already at 10 TBs during the hottest part of a very warm Summer day, but all in the name of Science!!
On most days, I parked my LEAF for the day at an SOC ranging from 30 to 70%, with battery temps at 9-11 bars. Unlike my previous LEAFs, which NEVER got that hot (mostly due to pathetically slow fast-charge rates), my 30 kWh pack seems to cool off faster, especially when the battery was over 110º. Several times I would leave a QC in the mid-to-upper 120s and be down in the 110-115º range within a half hour, while driving home during the hottest part of the day. Naturally, on the hottest days dissipation was much slower. I did manage to keep minimum pack temps in the 90s for a good stretch of the Summer, mostly thanks to very early morning charging and 60-hour work weeks!
At least twice, I parked the LEAF with greater than 90% SOC and battery temps over 110º due to last-minute work cancellations. I resisted the strong urge to drive around town to reduce the SOC; I am thinking being 10,000+ miles over my lease miles played a part in that decision...
When the weather got cold the last week of October, I ended the experiment and started charging at home again; as expected, the reduced usage allowed the battery stats to drop.
New: 363 GIDs, 28.1 kWh available, 82.34 Ahr, 100% SOH, 102-ish Hx
Lowest: 363 GIDs, 28.1 kWh available, 79.55 Ahr, 100% SOH, 95.35 Hx
Work has also slowed down, combined with several personal days scheduled weeks in advance, but I did see a bump in numbers almost back to new levels with just a 2-day flurry of driving that included several QCs, so I venture to say my real loss is probably around 4%, give or take.
Now the debate becomes:
** It's only 4% solely because of where I live.
** Despite my experiment, heat was still not a factor for me.
** The high number of QCs bolstered my numbers, which is why my degradation, despite more than double the miles, is nearly 3X less than another LEAF driver living 100 miles south of me in a similar climate?
** What is the definition of hot?
In retrospect, there is other data I should have collected, including battery temps several times a day. This would have been difficult to do at any set time, but it would have given an idea of the lowest temps the pack reached daily.
To summarize:
YMMV. This is my car, my experiment, my situational driving. My weird hours mean large portions of my driving consist of higher-speed driving VERY early in the morning before traffic builds up, and a TON of crawling in the afternoon trying to get home through gridlock, which made it tough to run my SOC down quickly after a QC, especially when averaging less than 20 mph.
30 kWh packs should be more durable simply because they are bigger, cycle less, etc. So why didn't that happen for most? Is it charging to 97.7% SOC instead of 97.3%? Is it the steeper fast-charge profile? Or simply a hiccup in the process? There is a lot of talk about cell voltages, and after looking at several screenshots I found that they seem to vary, I'm guessing due to temperature, with my pack running from 4.112 to 4.136 V. I spent quite a bit of time looking at LEAF Spy data trying to determine what voltage corresponds to 90% SOC, 70%, etc., and it's not consistent. That kind of clears up why customized SOC settings are not all that straightforward. My respect for the legions of aftermarket people working on this has attained new heights!
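To illustrate why a fixed voltage target is shaky, here is a sketch of voltage-to-SOC interpolation with a crude temperature shift bolted on. This is not anyone's actual tool, and every voltage breakpoint in the table is hypothetical, not measured LEAF data; the point is only that a small voltage shift moves the estimated SOC:

```python
# Illustrative sketch: the voltage-to-SOC curve shifts with
# temperature (and load), so one cell voltage maps to different
# SOCs on different days. Breakpoint values are HYPOTHETICAL.

def soc_from_voltage(v_cell, temp_offset_v=0.0):
    """Estimate SOC (%) from cell voltage by linear interpolation.

    temp_offset_v: crude stand-in for temperature effects; a warm
    pack reads a slightly different voltage at the same SOC.
    """
    # (voltage, SOC%) breakpoints -- a made-up curve for illustration
    table = [(3.50, 0), (3.65, 20), (3.80, 50), (3.95, 70),
             (4.05, 90), (4.13, 100)]
    v = v_cell - temp_offset_v
    if v <= table[0][0]:
        return 0.0
    for (v1, s1), (v2, s2) in zip(table, table[1:]):
        if v <= v2:
            return s1 + (s2 - s1) * (v - v1) / (v2 - v1)
    return 100.0

# Same cell reading, with and without a mere 10 mV temperature shift:
print(soc_from_voltage(4.05))         # 90.0
print(soc_from_voltage(4.05, 0.010))  # ~88 -- a 2-point SOC swing
```

A 10 mV drift moving the estimate a couple of SOC points is exactly the kind of inconsistency that makes a "charge to 70%" setpoint harder to implement than it sounds.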
Options: this isn't written for anyone out shopping for an EV; it's for those who have already made the commitment. I am here to tell you that you don't have to roll over and take it. There are options; obviously not easy ones, but the consequences of doing nothing are hardly a cakewalk either. The key is that even if the 40 kWh pack has no improvements in this area, it becomes a very viable option for many, simply because charging to 70 or 80% and still being able to get where you need to go is much easier.
[Chart: Li-ion cathode chemistry comparison. The LEAF uses LMO.]
Breakpoints? In the storage experiment, there was a degradation difference of only 2% for the 40% SOC pack between 0 and 25ºC (77ºF). At 100% SOC, the difference was 14% (94% remaining versus 80%). But as always, Battery U is very good at giving us a very, very small picture of what we need to see. A chart showing, say, 70% SOC at 15ºC would have cleared up a lot. Then again, the site is huge; the info could well be there and I simply haven't stumbled across it yet. The other thing is that this is a different chemistry, so the numbers will vary a bit, but the mechanisms apply to all Li-ion.
On the flip side, Winter reduces range, so staying on the high side of the SOC range then was not only a minimal compromise on longevity but simply a good idea, erring on the side of convenience and safety.
So there you have it. Go to Battery U for a LOT more detail, graphs, charts, etc., but realize there is no one chart, statement, graph or experiment that will tell you everything you want to know. It requires reading between the lines, extrapolating, etc. So did I take leaps here? I had to; there is no other way to get there. Am I right? I think I am, at least for now.
It wouldn't be the first time I was certain of something only to find out later I was simply in the right place but the wrong zip code...
Now, did I charge to 100%? Of course I did. It's too hard to drive 26,000 miles a year without doing that. In fact, I charged to 100% on CHAdeMO at least 2-3 dozen times, and yeah, the pack was a bit "warm," but 20-30 miles down the road, the pack was still hot but the LEAF was already moving out of the "SOC danger" zone.
So DIYers: we need that custom charging app!!
Excellent material, well researched. Thanks.
Thanks! Unfortunately, I was not able to connect every dot, but at the same time, I don't think the leaps were out of bounds either!
The 2011 used Manganese and Nickel in a spinel structure. The 40 kWh uses Mn, Ni, and Cobalt in a layered structure: http://www.nissan-global.com/EN/TECHNOLOGY/OVERVIEW/li_ion_ev.html Do you have any thoughts about that difference with regard to your study?
That is a major improvement, but an expensive one as well. Check out the chart above; cobalt should be much more heat tolerant. I think cost was the reason Nissan didn't do this earlier. They are probably banking on costs dropping enough to make it feasible.
I had read elsewhere that cobalt is much more expensive than Ni and Mn, but that it improves battery function significantly. I was disappointed with my 2011 Leaf: it lost the 4th bar at 65 months and 55K miles (it got totaled in a low-speed rear-end collision), and I swore I'd never buy another vehicle from Nissan. Now I am trying to decide whether to go for a 2018 or wait for the 2019 with 60 kWh and (hopefully) TMS. I live in Albuquerque and every Saturday travel to Los Alamos. I start at 5,058 feet elevation, ending at 7,309. In between I go over mountains, with a total climb of around 4,443 feet and descent of 2,188 to gain 2,255 feet.
I arrive by 9:00 am and leave no earlier than 7:00 pm, with access to L2 chargers there. Currently I drive my wife's Volt, using about 1.5 gallons going and 0.5 gallons returning.
We get a couple of weeks above 100 degrees, I park in the driveway, and my daily commute is 30 miles, so I wouldn't have to charge above 50% during the week.
Winter never gets below the teens Fahrenheit. Do you think the 40 kWh battery would maintain enough capacity to make the trip in the Winter for several years?
Cobalt is supposedly "near" best. Titanium is the best, but VERY expensive and somewhat heavy, at least for car applications. Cobalt, though, seems to be much more heat tolerant while maintaining high power. Right now it's all about politics, which is the main reason cobalt is so expensive.
As far as longevity: in my previous blog I did an experiment where I only charged on DC all Summer. This meant an SOC of no more than 70% when parked, with only a handful of exceptions.
DeleteAs far as your trip? Looks like its 100 miles going which will be the challenge. Hard to say with the elevation but using the Volt as a guide, 1.5 gallons @ 50% efficiency says you would be roughly needing 26 kwh to get there and obviously not a lot to get home.
But weather will complicate things, and that altitude will have snow a good 3-4 months at the very least, so traction tires will cut your range, snow on the roads, etc.
What you want to do is get shade! And don't charge your car to more than you will need. With 40 kWh, it will be a lot easier to charge to 60-70% or so and still do what you need to do.