
Climate Change

No doubt many will have heard about the “bushfire crisis” that has basically been wreaking havoc for the past month. Here in Brisbane things haven’t been too bad, but we’ve had our fair share of smoke haze and things of course are exceptionally dry.

From where I sit, this is a situation we have let ourselves get into. Some argue that this is all because of the lack of back-burning, and to a certain extent this is true.

Back-burning doesn’t make it rain, however. The lack of back-burning is a casualty of a few things: partly a lack of firefighting resources and, more significantly, a hotter, drier climate.

Climate change has been known about for a long time. When I was growing up in the early 90s, the name used was the “greenhouse effect”. The idea was that all the “greenhouse gases” we were generating were causing heat to be trapped in the atmosphere like a greenhouse, and thus heating up the planet.

Back then, there didn’t seem to be any urgency to combat the problem.

So, we’ve just continued the way we always have since the start of the industrial revolution. Some things have improved: electric vehicles, for instance, just weren’t practical then; now they are slowly gaining traction.

Large-scale PV generation in the 90s would have been seen as a joke, now we have entire paddocks dedicated to such activities. Renewable power generation is big business now. Whilst it won’t displace all traditional methods, it has an important place going forward.

Yet, in spite of all this progress, we’ve still got people in government, and in big corporate organisations who cling to the “business as usual” principle.

When South Australia announced they were going to install a big battery to help back up their power supply, the idea was pooh-poohed, with many saying it wouldn’t be big enough to make a difference. Yet what it lacks in running time, it makes up for in very fast responsiveness to load changes.

A coal-fired power station operates by using the thermal energy produced by burning coal to boil water, producing steam which drives turbines that in turn drive electric generators. A nuclear station isn’t much different; the thermal source is the only bit that changes. Geothermal is basically using a nuclear station that mother nature has provided.

The thing all these systems have in common is rotating mass. It takes significant energy to cause a step-change in the rotational speed of the turbine. If the turbine is still, you’re going to have to pump a lot of energy in, somehow, to get it spinning. If it’s spinning, it’ll take a lot of energy to stop it. Consequently, they are not known for fast reaction times. Cold starts for these things in the realm of a day are not unknown. They also don’t take kindly to sudden changes of load, and it is during these times that the emissions from such generators are at their worst.

Solar is great during the day when it’s fine, but on a cloudy day like today the output is likely to be greatly diminished, and it’ll be utterly useless at night. If we had big enough battery storage, then yes, we could theoretically capture enough during the sunny days to carry us over the nights and cloudy days. That’s a big if.

So I still see the traditional methods being a necessary evil. The combination of all three options though (renewables, traditional generation and battery storage) could be a winner. Let the older stations carry the evening base-load and keep the battery topped up, ramp them down a bit when we’re getting good renewable output, use the batteries to cover the load spikes.

Nuclear could be an option; however, to my mind it has two big problems:

  1. Public perception
  2. Commissioning time

Without a doubt, the modern designs for these things have greatly improved on what graced the sites of Chernobyl, Three Mile Island and Fukushima. They still generate waste, but in many cases the half-life and quantity of this waste is greatly reduced. The biggest problem though is public perception, as there are many who will not differentiate between the designs, and will immediately respond: “not in my back yard!”

Even if you could win people’s trust, you’ve got a second problem: getting them built and commissioned in time. If we had started in the 90s, then maybe they’d be doing useful things for us now. That boat has long set sail and is dipping over the horizon now.

Transportation is another area where we, as a nation, are addicted to fossil fuels. It’s not hard to see why though. Go outside a major capital city, and the infrastructure for a purely electric vehicle disappears.

Moreover, the manufacturers, stuck in their echo-chamber, don’t see larger electric vehicles as worth the investment.

Back in 2007, my father was lucky enough to win the Multicap Art Union, and so replaced the Subaru station wagon he’d owned since 1982 with a Holden Rodeo ute (we had the choice between that or a Toyota).

This vehicle was chosen with the intent of towing a caravan with it — something he later purchased. The caravan weighs about two tonnes. Yes, an electric vehicle could theoretically tow it, and could even do a better job, but at the time, no such vehicle was available from any of the available suppliers.

To my knowledge, this is still the case. Few, if any of the electric vehicles on the market here in Australia, have the necessary facilities to tow a caravan even if the motor is capable of it.

Then there’s infrastructure to consider. A pure electric vehicle would probably be impractical outside of major regional centres and capital cities. Once you get away from the network of high-power chargers, you’d better plan on staying a few days in each town where you charge, because it will take that long to charge that battery from a 240V 10A socket!
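To put a rough number on that claim, here’s a quick back-of-the-envelope calculation; the pack size and charger efficiency are assumed figures for illustration, not taken from any particular vehicle:

```python
# Rough sketch: how long does a domestic 240V 10A socket take to refill a
# typical EV traction battery?  (Assumed figures, not any specific vehicle.)

battery_kwh = 60.0          # assumed usable pack capacity
socket_watts = 240 * 10     # 2.4 kW from a standard Australian outlet
charger_efficiency = 0.9    # assumed losses in the on-board charger

hours = battery_kwh * 1000 / (socket_watts * charger_efficiency)
print(f"~{hours:.0f} hours for a full charge")   # roughly 28 hours
```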

Diesel-electric though, could be a winner since diesel engines similarly operate most efficiently at constant speed and could drive a generator to charge battery storage.

A return of the gas turbine engine could also be a good option. This was tried before, but suffered from the typical characteristic of turbines: they don’t like changing speed quickly. Poor throttle response is a deal-breaker when the engine is providing the traction, but it is a non-issue in a generator. They run on a wide variety of fuel types, including petroleum and diesel, so could utilise existing infrastructure, and the engines are generally simpler designs.

Is there research going into this? Not from what I’ve seen. Instead, they trot out the same old style vehicles. Many people buy them because that’s all that’s on offer that fulfils their requirements. Consequently this inflates the apparent desire for these vehicles, so the vehicle makers carry on as usual.

The lack of cycle infrastructure also pushes people into vehicles. When I do ride to work (which I’ve been trying to do more of), I find myself getting up early and getting on the road before 4:30AM to avoid being a nuisance to other road users.

In particular, road users who believe: “I paid vehicle registration, therefore this road is MINE!” I needn’t waste space rebutting that assertion: the Queensland government raised about $557M in revenue (page 14) from vehicle registration in 2018-19, whilst the DTMR’s expenditure at that time was over $6bn (page 15).

The simple truth is that a lot of these initiatives are seen as nothing but a “cost”. Some simple-minded people even say that the very concept of climate change is invented simply to slug the developed world. We need to get past this mentality.

The thing is, business as usual is costing us more. We’re paying for it big time with the impact on the climate that these emissions are having. Yes, climate does go in cycles, but what we’re experiencing now is not a cycle.

I can remember winters that got down to the low single digits here in Brisbane. I have not experienced those sorts of conditions here for a good 15 years now. Yes, this is a land of drought and flooding rain; however, we seem to be breaking climate records that have stood longer than any of us have been alive, and by big margins.

The “fire season”, which is used to determine when back-burning should take place, has also been lengthening. It will get to a point where there just isn’t a safe time to conduct back-burning, as theoretically every day of the year will have “fire season” conditions.

This is costing us.

  • It will cost us with property being destroyed.
  • It will cost us with work being disrupted.
  • It will cost us with food production being threatened.
  • It will cost us with health issues due to increasing ambient temperatures and air pollution issues.

Lately I’ve been suffering as a result of the smoke haze that has been blowing through the Brisbane area. I recognise that it is nowhere near as bad as what Sydney has to put up with. Whilst not severely asthmatic, I have had episodes in the past and can be susceptible to bronchitis.

On one occasion, this did lead to a case of pneumonia.

About a fortnight ago I started to go down with a bout of bronchitis. I’ve had two visits to the doctor already and been prescribed antibiotics and a puffer; normally my symptoms would be subsiding by now. This time around, that has not been the case. Whilst the previous bouts have been stress-related, I think this time it is smoke-induced.

I think once the smoke clears, I’ll recover. I am not used to this level of air pollution however, and I think if it becomes the new “normal”, it will eventually kill me. If I lived in Sydney, no question, that level probably would kill me.

This is a wake-up call. Whilst I don’t plan to join the Extinction Rebellion (I don’t think blocking up traffic is doing anyone any favours), I do think we need to change direction on our emissions. If we carry on the way we are now, things are only going to get worse.

Solar Cluster: Re-wiring the rack

It’s been on my TO-DO list now for a long time to wire in some current shunts to monitor the solar input, replace the near useless Powertech solar controller with something better, and put in some more outlets.

Saturday, I finally got around to doing exactly that. I meant to also add a low-voltage disconnect to the rig; I’ve got the parts for this but haven’t yet built or tested them, and ideally I’d have waited until I had done both, but I needed the power capacity. So I’m running a risk without the over-discharge protection, but I think I’ll take that gamble for now.

Right now:

  • The Powertech MP-3735 is permanently out, the Redarc BCDC-1225 is back in.
  • I have nearly a dozen spare 12V outlet points now.
  • There are current shunts on:
    • Raw solar input (50A)
    • Solar controller output (50A)
    • Battery (100A)
    • Load (100A)
  • The Meanwell HEP-600C-12 is mounted to the back of the server rack, freeing space from the top.
  • The janky spade lugs and undersized cable connecting the HEP-600C-12 to the battery have been replaced with a more substantial cable.

This is what it looks like now around the back:

Rear of the rack, after re-wiring

What difference has this made? I’ll let the graphs speak. This was the battery voltage this time last week:

Battery voltage for 2019-05-22

… and this was today…

Battery voltage 2019-05-29

Chalk-and-bloody-cheese! The weather has been quite consistent, and the solar output has greatly improved just replacing the controller. The panels actually got a bit overenthusiastic and overshot the 14.6V maximum… but not by much thankfully. I think once I get some more nodes on, it’ll come down a bit.

I’ve gone from about 8 hours off-grid to nearly 12! Expanding the battery capacity is an option, and could see the cluster possibly run overnight.

I need to get the two new nodes (the two new NUCs) and the Netgear switch onto battery power. Actually, I’m waiting on a rack-mount kit for the Netgear, as I have misplaced the one it came with; failing that, I’ll hack one up out of aluminium angle, which doesn’t look hard!

A new motherboard is coming for the downed node, that will bring me back up to two compute nodes (one with 16 cores), and I have new 2TB HDDs to replace the aging 1TB drives. Once that’s done I’ll have:

  • 24 CPU cores and 64GB RAM in compute nodes
  • 28 CPU cores and 112GB RAM in storage nodes
  • 10TB of raw disk storage

I’ll have to pull my finger out on the power monitoring; all the shunts are in place now, so I have no excuse but to make up those INA-219 boards and get everything going.

Solar Cluster: Dusty solar panels?

So recently, I had a melt-down with some of the monitor wiring on the cluster… to counteract that, I have some parts on order (RS Components annoyingly seem to have changed their shipping policies, so I suspect I’ll get them Monday)… namely some thermocouple extension cable, some small 250mA fast-blow fuses and suitable in-line holders.

In the meantime, I’m doing without the power controller, just turning the voltage down on the mains charger so the solar controller does most of the charging.

This isn’t terribly reliable, and for a few days now my battery voltage has just sat at a flat 12.9V, which is the “boost” voltage set on the mains charger.

Last night we had a little rain, and today I see this:

Battery voltage today… the solar charger is doing some work.

Something got up and boogied this morning, and it was nothing I did to make that happen.  I’ll re-instate that charger, or maybe a control-only version of the #High-power DC-DC power supply which I have the parts for, but haven’t yet built.

Solar Cluster: When things get hot

It’s been a while since I posted about this project… I haven’t had time to do many changes, just maintaining the current system as it is keeps me busy.

One thing I noticed is that I started getting poor performance out of the solar system late last week.  This was about the time that Sydney was getting the dust storms from Broken Hill.

Last week’s battery voltages (40s moving average)

Now, being in Brisbane, I didn’t think that this was the cause, and since the days were largely clear, I was a bit miffed as to why I was getting such poor performance.  When I checked on the solar system itself on Sunday, I was getting mixed messages looking at the LEDs on the Redarc BCDC-1225.

I thought it was actually playing up, so I tried switching over to the other solar controller to see if that was better (even if I know it’s crap), but same thing.  Neither was charging, yet I had a full 20V available at the solar terminals.  It was a clear day, I couldn’t make sense of it.  On a whim, I checked the fuses on the panels.  All fuses were intact, but one fuse holder had melted!  The fuse holders are these ones from Jaycar.  10A fuses were installed, and they were connected to the terminal blocks using a ~20mm long length of stranded wire about 6mm thick!

This should not have gotten hot.  I looked around on Mouser/RS/Element14, and came up with an order for 3 of these DIN-rail mounted fuse holders, some terminal blocks, and some 10A “midget” fuses.  I figured I’d install these one evening (when the solar was not live).

These arrived yesterday afternoon.

New fuse holders, terminal blocks, and fuses.

However, it was yesterday morning, whilst I was having breakfast, that I could hear a smoke alarm going off.  At first I didn’t twig to it being our smoke alarm.  I wandered downstairs and caught a whiff of something.  Not silicon, thankfully, but something had burned, and the smoke alarm above the cluster was going berserk.

I took that alarm down off the wall and shoved it under a doonah to muffle it (seems they don’t test the functionality of the “hush” button on these things), switched the mains off and yanked the solar power.  Checking the cluster, all nodes were up, the switches were both on, and there didn’t seem to be anything wrong there.  The cluster itself was fine, running happily.

My power controller was off; at first I thought this odd.  Maybe something had burned out there, perhaps the 5V LDO?  A few wires had sprung out of the terminal blocks, a frequent annoyance, as the terminal blocks were not designed for CAT5e-sized wire.

By chance, I happened to run my hand along the sense cable (the unsheathed green pair of a dissected CAT5e cable) to the solar input, and noticed it got hot near the solar socket on the wall.  High current was flowing where it was not planned for or expected, and the wire’s insulation had melted!  How that happened, I’m not quite sure.  I got some side-cutters, cut the wires at the wall-end of the patch cable and disconnected the power controller.  I’ll investigate it later.

Power controller with crispy wiring

With that rendered safe, I disconnected the mains charger from the battery and wound its float voltage back to about 12.2V, then plugged everything back in and turned everything on.  Things went fine, and the solar even behaved itself (in spite of the melty fuse holder on one panel).

Last night, I tore down the old fuse box, hacked off a length of DIN rail, and set about mounting the new holders.  I had to do away with the backing plate due to clearance issues with the holders and re-locate my isolation switch, but things went okay.

This is the installation of the fuses now:

Fuse holders installed

The re-located isolation switch has left some ugly holes, but we’ll plug those up with time (unless a friendly mud wasp does it for us).

Solar isolation switch re-located, and some holes wanting some putty.

For interest’s sake, this was the old installation, partially dismantled.

Old installation, terminal strips and fuse holders.

You can see how the holders were mounted to that plate.  The holder closest to the camera has melted rather badly.  The fuse case itself also melted (but the fuse is still intact).

Melted fuse holder detail

The new holders are rated at 690V AC, 30A, and the fuses are rated to 500V, so I don’t expect to have the same problems.

As for the controller, maybe it’s time to retire that design.  The high-power DC-DC converter project is ultimately the future replacement, and a first step may be to build an ATTiny24A-based controller that can poll the current shunt sensors and switch the mains charger on and off that way.

Solar Cluster: Reverting back to the Powertech MP-3735

So, for the past few weeks I’ve been running a Redarc BCDC-1225 solar controller to keep the batteries charged.  I initially found I had to make my little power controller back off on the mains charger a bit, but was finally able to prove conclusively that the Redarc was able to operate in both boost and float modes.

In the interests of science, I have plugged the Powertech back in.  I have changed nothing else.  What I’m interested to see, is if the Powertech in fact behaves itself, or whether it will go back to its usual tricks.

The following is the last 6 hours.

Next week, particularly Thursday and Friday, are predicted to have similar weather patterns to today. Today’s not a good test, since the battery started at a much higher voltage, so I expect that the solar controller will be doing little more than keeping the battery voltage up to the float set-point.

For reference, the settings on the MP-3735 are: Boost voltage 14.6V, Float voltage 13.8V. These are the recommended settings according to Century’s datasheets for the batteries concerned.

Interestingly, no sooner do I wire this up than the power controller reaches for the mains. The MP-3735 definitely likes to flip-flop. Here’s a video of its behaviour shortly after connecting up the solar (and after I turned off the mains charger at the wall).

Looking now, it’s producing about 10A, much better than the 2A it was doing whilst filming.  So it can charge properly when it wants to, but it’s intermittent, and inside you can sometimes hear a quiet clicking noise as if it’s switching a relay.  At 2A it’s wasting time, as the cluster draws nearly 5× that.

The hesitation was so bad that the power controller kicked the mains charger in for about 30 minutes; since then, the MP-3735 seems to be behaving itself.  I guess the answer is to see what it does tomorrow, and later this week, without me intervening.

If it behaves itself, I’m happy to leave it there, otherwise I’ll be ordering a VSR, pulling out the Powertech MP-3735 and re-instating the Redarc BCDC-1225 with the VSR to protect against over-discharge.


Update 2018-10-28… okay, overcast for a few hours this morning, but by 11AM it had fined up.  The solar performance however was abysmal.

Let’s see how it goes this week… but I think I might be ordering that VSR and installing the Redarc permanently now.


Today’s effort:

Each one of those vertical lines was accompanied by a warning email.

Solar Cluster: Further power bill reductions

So, since the last power bill, our energy usage has gone down even further.

No idea what the month-on-month usage is (I haven’t spotted it), but this is a scan from our last bill:

GreenPower? We need no stinkin’ GreenPower!

This won’t take into consideration my tweaks to the controller where I now just bring the mains power in to do top-ups of the battery. These other changes should see yet further reductions in the power bill.

Solar Cluster: Making the BCDC1225 get up and boogie!

So, I’ve been running the Redarc controller for a little while now, and we’ve had some good days of sunshine to really test it out.

Recall in an earlier posting with the Powertech solar controller I was getting this in broad daylight:

Note the high amount of “noise”: this is the Powertech solar controller PWMing its output. I’m guessing output filtering is one of the corners they cut; I expect to see empty footprints for juicy big capacitors that would have been in the “gold” model sent for emissions testing. It’ll be interesting to tear that down some day.

I’ve had to do some further tweaks to the power controller firmware, so this isn’t an apples-to-apples comparison, maybe next week we’ll try switching back and see what happens, but this was Tuesday, on the Redarc controller:

You can see that overnight, the Meanwell 240V charger was active until a little after 5AM, when my power controller decided the sun should take over. There’s a bit of discharging, until the sun crept up over the roof of our back-fence-neighbour’s house at about 8AM. The Redarc basically started in “float” mode, because the Meanwell had done all the hard work overnight. It remains so until the sun drops down over the horizon around 4PM, and the power controller kicks the mains back on around 6PM.

I figured that, if the Redarc controller saw the battery get below the float voltage at around sunrise, it should boost the voltage.

The SSR controlling the Meanwell was “powered” by the solar, meaning that by default, the charge controller would not be able to inhibit the mains charger at night as there was nothing to power the SSR. I changed that last night, powering it from the battery. Now, the power controller only brings in the mains charger when the battery is below about 12.75V. It’ll remain on until the battery has been above 14.4V for 30 minutes, then turn off.
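The logic is simple enough to sketch. Something like the following Python mock-up captures it, for illustration only; the real controller is the little ATTiny24A, and the helper functions and poll interval here are made up:

```python
import time

# Python mock-up of the mains-charger hysteresis described above; the real
# controller is an ATTiny24A, and read_battery_volts() / set_mains_charger()
# are illustrative placeholders.

MAINS_ON_BELOW_V = 12.75     # bring the mains charger in below this
MAINS_OFF_ABOVE_V = 14.4     # ... and drop it once the battery has held
HOLD_TIME_S = 30 * 60        # ... above this for 30 minutes

def control_loop(read_battery_volts, set_mains_charger, poll_s=10):
    mains_on = False
    above_since = None
    while True:
        v = read_battery_volts()
        if not mains_on and v < MAINS_ON_BELOW_V:
            mains_on = True
            set_mains_charger(True)
            above_since = None
        elif mains_on:
            if v >= MAINS_OFF_ABOVE_V:
                if above_since is None:
                    above_since = time.monotonic()
                elif time.monotonic() - above_since >= HOLD_TIME_S:
                    mains_on = False
                    set_mains_charger(False)
            else:
                above_since = None   # dipped below 14.4V: restart the hold timer
        time.sleep(poll_s)
```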

In the last 24 hours, this is what the battery voltage looks like.

I made the change at around 8PM (can you tell?), and so the battery immediately started discharging, then the charge-discharge cycles began. I’m gambling on the power being always available to give the battery a boost here, but I think the gamble is a safe one. You can see what happened 12 hours later when the sun started hitting the panels: the Redarc sprang into action and is on a nice steady trend to a boost voltage of 14.6V.

We’re predicted to get rain and storms tomorrow and Saturday, but maybe Monday, I might try swapping back to the Powertech controller for a few days and we’ll be able to compare the two side-by-side with the same set-up.


It’s switched to float mode now having reached a peak boost voltage of 14.46V.  As Con the fruiterer would say … BEEEAAUUTIFUUUL!

DC-DC Converter: Researching designs

So, I’ve been pondering doing a more capable power controller for the purpose of enhancing or even outright replacing the solar controllers I have now on the Solar-powered cloud computing cluster.

The idea started as a straight DC power meter with a Modbus interface, as there’s pretty much nothing on the market.  Lots of proprietary jobbies, or display-only toys, but nothing that will talk an open protocol.  I started designing a circuit.  I thought: it’d be handy to have digital inputs and outputs.

Lots of mains energy meters have them, and they’re handy for switching loads.  My remote reset facility when porting the mainline kernel to the TS-7670 was a digital output on the CET PMC-519.  If I crashed the TS-7670, I basically fired up ipython, loaded pymodbus, connected to a Modbus/TCP gateway, then issued a few write-coil commands to power-cycle the TS-7670.
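Roughly, that exercise looked like this (sketched with the pymodbus 2.x API; the gateway address, unit ID and coil number below are made-up placeholders, not the real site’s values):

```python
from time import sleep
from pymodbus.client.sync import ModbusTcpClient   # pymodbus 2.x import path

client = ModbusTcpClient("192.0.2.10", port=502)   # the Modbus/TCP gateway
client.connect()

RELAY_COIL = 0     # digital output wired in series with the TS-7670's supply
METER_UNIT = 1     # Modbus unit ID of the meter behind the gateway

client.write_coil(RELAY_COIL, False, unit=METER_UNIT)   # power off
sleep(5)
client.write_coil(RELAY_COIL, True, unit=METER_UNIT)    # power back on
client.close()
```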

Often the digital inputs are hooked to water or gas pulse meters to meter usage: you get a pulse every N litres of water or every N cubic metres of gas.

A meter with digital I/O that’s programmable would be just perfect for the job my little power controller is doing.

I could make the channels PWMable, thus be able to step down the voltage.  Put an INA219 on there, and I’d have current measurement and power control.  The idea evolved to putting the INA219 and a MOSFET on a board, so it was a separate module: just to make board layout easier and to reduce the size of the boards.

For a buck converter, you just add an inductor and a few smoothing capacitors.  Better yet, two INA219s and an MCU would let me measure power in and out, and have localised brains.  Thus the idea of a separate, smart module was born.  For kicks, I’m also adding the ability to boost as well by tacking a boost converter onto the end.

The principle is really quite simple.  A buck converter has this topology (source, Wikipedia):

If you swap out that diode for another MOSFET, you get a synchronous buck converter, which looks like this (same Wikipedia article):

It’s worth noting that a MOSFET switch has a body diode that might be exploitable; I’ll have to check.  You drive the two switches with complementary outputs.  Both these circuits will step down a voltage, which is what I want to do 99% of the time, but it’s also useful to go up.  A boost converter looks like this (source: Wikipedia):

Again, that diode can be replaced with another MOSFET.  This will step up a voltage, but not down.
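In ideal continuous-conduction terms (ignoring losses), the buck gives V_out = D × V_in and the boost gives V_out = V_in / (1 − D), where D is the duty cycle of the driven switch.  Each topology only goes one way, hence the interest in the options below that can go both ways.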

There are a few options that allow going both ways:

  • Flyback: uses a transformer and provides galvanic isolation.  The downside is the secondary must take all the output current, and finding a transformer that can do 50A is EXPENSIVE!  So scratch that.  (And no, I am not winding my own transformers… tried that back at uni when a lecturer asked us to build a flyback converter.  Bugger that!)
  • SEPIC is basically a boost plus a buck-boost.  Efficiency is apparently not their strong suit, although there are ways to improve it.  I’m not sure the added complexity of two boost converters sandwiching a buck converter is worth it.
  • Ćuk is no good here because it inverts its output, unless you want an isolated one, which is going to be $$$$ because of the transformer needed.  Yes, I’m chickening out!
  • Split-Pi looks real interesting, in that it’s bi-directional.  In the event I ever decide to buy a front-wheel motor for my bicycle, I could use one of these to do regenerative braking and save some wear on my brake pads.  I haven’t seen many schematics for this though, just generalised ones like the one above.
  • Zeta also looks interesting, but it’s pretty unknown, and has a higher parts requirement.  It could be worth looking at.  It’s a close relative of the SEPIC and Ćuk.

The route I’m looking to try first is the 4-switch buck-boost, which looks like this:

The goal will be to have a simple microcontroller-based switch-mode power supply module with the following characteristics:

  • Up to 50A current switching capability
  • 9-30V input voltage range
  • 12-15V output voltage range
  • 250kHz switching frequency

99% of the time, conditions will be:

  • input voltage: 18~20V
  • output voltage: 13.8~14.6V

So I’m thinking we design for a buck converter, then the boost just comes along for the ride.  There’s a handy primer here for designing a buck converter.  For a 20V input and 14.6V output, their formulas suggest a 33µH inductor should cut the mustard. One of these which can handle >50A is not a big or expensive component.
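As a rough sanity check on those numbers, here’s the textbook continuous-conduction buck arithmetic (which may not be exactly the method the linked primer uses):

```python
# Sanity-check of the buck figures above using the textbook
# continuous-conduction equations (not necessarily the primer's exact method).
V_IN = 20.0      # V, nominal panel voltage
V_OUT = 14.6     # V, boost charge voltage
F_SW = 250e3     # Hz, switching frequency
L = 33e-6        # H, the suggested inductor

duty = V_OUT / V_IN                              # on-time fraction, ~0.73
ripple_pp = (V_IN - V_OUT) * duty / (F_SW * L)   # peak-to-peak inductor ripple, A

print(f"duty cycle ≈ {duty:.2f}")
print(f"inductor ripple ≈ {ripple_pp:.2f} A peak-to-peak")   # ≈ 0.5 A
```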

In the above system, we need to be able to drive two pairs of complementary outputs at high speed.  We can forget the hacker darling ATTiny85: with 5 pins, it barely has enough to do the SPI interface, let alone talk I²C and drive four MOSFETs.

A good candidate though is the chip I used for the #Toy Synthesizer: the ATTiny861.

This chip has the same high-speed PWM, and I already know how to drive it.  It doesn’t have a lot of brains, but I think it’ll do.  The challenge will be in-circuit programming.  There are just 3 PWM outputs that don’t clash with ICSP.

Don’t be fooled by the presence of two DI/DO/USCK pins: it’s the one USI interface, you can just switch which pins it uses.  That’ll be handy for talking I²C, so I’ve earmarked those pins in purple.  The nRESET pin is marked in green, the PWM pins in blue.  When nRESET is pulled low, PB[012] switch to the functions marked in red.

This doesn’t matter for the toy synthesizer as I only need two PWM channels, and so I chose OC1B and OC1D.  Here, I need nOC1D (no problem) and nOC1B (uh oh).

During run-time, I could put my SPI interface on PA[012] and bit-bang the I²C, but I don’t want MOSFETs chattering when flashing new firmware.  Thus, when in reset, I need to inhibit nOC1B somehow.

The other way of doing this is to just forget about nOC1B altogether.  Consider this:

  • When stepping up: SW1 will be held on , SW2 will be held off , and the PWM will drive SW3/SW4.
  • When stepping down: SW3 will be held off , SW4 will be held on , and the PWM will drive SW1/SW2.

We’re only PWMing two MOSFETs at a time, so we could get away with just one complementary pair, OC1D/nOC1D.  We ignore OC1B and can now use it for something else.  We just need a means of “selecting” which pair of MOSFETs we’re driving.  A PWM output with 8-bit resolution and a 250kHz cycle frequency has a minimum pulse width of about 15ns (1/64MHz).
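That 15ns figure falls straight out of the numbers:

```python
# Where the ~15ns minimum pulse width comes from.
steps = 256                       # 8-bit PWM resolution
cycle_freq = 250e3                # Hz, PWM cycle frequency
timer_clock = steps * cycle_freq  # = 64 MHz, matching the ATTiny861's PLL-derived PWM clock
min_pulse_ns = 1e9 / timer_clock  # one timer tick
print(f"{min_pulse_ns:.1f} ns")   # ≈ 15.6 ns
```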

An SN74AHC244 tri-state buffer will do nicely.  We can use two GPIOs to control the two enable pins, switching between the OC1D/nOC1D pair and a pair of other pins under manual control.  Perhaps PA[01] to select where the PWM signals get routed, and PA[23] to control the non-PWMed MOSFETs.

Due to switching speed requirements, we will need to run the ATTiny861 at 5V.  So maybe make some room for level shifting of the SPI interface to support 3V master interfaces, as this runs at a much lower speed than the PWM output.

This leaves PB3, PB6, PA1 and PA3 free for GPIOs to use however we wish.  Two of these should probably be made part of the host interface, one for chip select (PB3 seems a good choice) and one for an interrupt (maybe PB6).  Add the 5V/0V power rails, and nRESET, and the same interface can be used for ICSP too.

The idea looks doable.  The challenge will be making the control algorithm work within the constraints of the ATTiny861, but given what people have done with the ’85 which has the same core, I’m sure it can be done.

Solar Cluster: Jury still out on solar controller, thinking of PSU designs

So, the last few days it’s been overcast.  Monday I had a firmware glitch that caused the mains supply to be brought in almost constantly, so I’d disregard that result.

Basically, the moment the battery dropped below ~12.8V for even a brief second, the mains got brought in.  We were just teetering on the edge of 12.8V all day.  I realised that I really did need a delay on firing off the timer, so I’ve re-worked the logic:

  • If battery drops below V_L, start a 1-hour timer
  • If battery rises above V_L, reset the 1-hour timer
  • If the battery drops below V_CL or the timer expires, turn on the mains charger

That got me better results.  It means V_CL can be quite low, without endangering the battery supply, and V_L can be at 12.8V where it basically ensures that the battery is at a good level for everything to operate.
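In a small Python sketch, the re-worked turn-on decision looks something like this (the real firmware runs on the ATTiny24A; the V_CL value and the helper’s shape here are placeholders for illustration):

```python
V_L = 12.8        # start the 1-hour timer below this
V_CL = 12.0       # critical-low threshold (placeholder value)
TIMER_S = 3600    # 1 hour

def mains_decision(v_batt, timer_started_at, now):
    """Return (turn_on_mains, new_timer_started_at)."""
    if v_batt >= V_L:
        return False, None                    # healthy: reset the timer
    if v_batt < V_CL:
        return True, None                     # critically low: mains on immediately
    if timer_started_at is None:
        return False, now                     # just dropped below V_L: start timing
    if now - timer_started_at >= TIMER_S:
        return True, None                     # low for an hour: mains on
    return False, timer_started_at            # still waiting
```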

I managed to get through most of Tuesday until about 4PM; there was a bit of a hump which I think was the solar controller trying to extract some power from the panels.  I really need a good sunny day like the previous week to test properly.

This is leading me to consider my monitoring device.  At the moment, it just monitors voltage (crudely) and controls the logic-level enable input on the mains charger.  Nothing more.  It has done that well.

A thought is that maybe I should re-build this as a Modbus-enabled energy meter with control.  This idea has evolved a bit, enough to be its own project actually.  The thought I have now is a more modular design.

If I take the INA219B and a surface-mount current shunt, I have a means to accurately measure input voltage and current.  Two of these, and I can measure the board’s output too.  Stick a small microcontroller in between, some MOSFETs and other parts, and I can have a switchmode power supply module which can report on its input and output power and vary the PWM of the power supply to achieve any desired input or output voltage or current.
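Reading one of those INA219s is straightforward.  A rough sketch follows (register map per the INA219 datasheet; the bus number, device address and shunt value are assumptions for illustration):

```python
# Rough sketch of reading one INA219 over I²C with smbus2.  Register map is
# from the INA219 datasheet; bus number, address and shunt value are assumed.
from smbus2 import SMBus

I2C_BUS = 1
INA219_ADDR = 0x40          # default address with A0/A1 tied low
REG_SHUNT = 0x01            # shunt voltage, LSB = 10 µV, signed
REG_BUS = 0x02              # bus voltage, LSB = 4 mV in bits 15..3
SHUNT_OHMS = 0.00075        # e.g. a 75 mV / 100 A shunt

def read_reg(bus, reg):
    raw = bus.read_word_data(INA219_ADDR, reg)
    return ((raw & 0xFF) << 8) | (raw >> 8)     # INA219 is big-endian; swap

with SMBus(I2C_BUS) as bus:
    shunt_raw = read_reg(bus, REG_SHUNT)
    if shunt_raw & 0x8000:                      # sign-extend the shunt reading
        shunt_raw -= 0x10000
    shunt_v = shunt_raw * 10e-6
    bus_v = (read_reg(bus, REG_BUS) >> 3) * 4e-3
    print(f"bus {bus_v:.2f} V, shunt {shunt_v*1e3:.2f} mV, "
          f"current {shunt_v / SHUNT_OHMS:.2f} A")
```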

The MCU could be the ATTiny24As I’m using, or an ATTiny861.  The latter is attractive as it can do high-speed PWM, but I’m not sure that’s necessary in this application, and I have loads of SOIC ATTiny24As.  (Then again, I also have loads of PDIP ATTiny861s.)

The board would expose the ICSP pins plus two more for interrupt and chip select, allowing for a simple jig for reprogramming.  I haven’t decided on a topology yet, but the split-pi is looking attractive.  I might start with a buck converter first though.

This would talk to a “master” microcontroller which would provide the UI and Modbus interface.  If the brains of the PSU MCU aren’t sufficient, this could do the more grunty calculations too.

This would allow me to swap out the PSU boards to try out different designs.

Solar Cluster: Return of the Redarc BCDC1225

Well, I’ve had the controller working for a week or so now… the solar output has never been quite what I’d call “great”, but it seems it’s really been on the underwhelming side.

One of the problems I had earlier before moving to this particular charger was that the Redarc wouldn’t reliably switch between boosting from 12V to MPPT from solar.  It would get “stuck” and not do anything.  Coupled with the fact that there’s no discharge protection, and well, the results were not a delight to the olfactory nerves at 2AM on a Sunday morning!

It did okay as an MPPT charger, but I needed both functions.  Since the thinking was I could put an SSR between the 12V PSU and the Redarc charger, we tried going the route of buying the Powertech MP3735 solar charge controller to handle the solar side.

When it wants to work, it can put over 14A in.  The system can run on solar exclusively.  But it’s as if the solar controller “hesitates”.

I thought maybe the other charger was confusing it, but having now set up a little controller to “turn off” the other charger, I think I can safely put that theory to bed.  This was the battery voltage yesterday, where there was pretty decent sunshine.

There’s an odd blip at about 5:40AM; I don’t know what that is, but the mains charger drops its output by a fraction for about 50 seconds.  At 6:37AM, the solar voltage rises above 14V and the little ATTiny24A decides to turn off the mains charger.

The spikes indicate that something is active, but it’s intermittent.  Ultimately, the voltage winds up slipping below the low voltage threshold at 11:29AM and the mains charger is brought in to give the batteries a boost.  I actually made a decision to tweak the thresholds to make things a little less fussy and to reduce the boost time to 30 minutes.

The charge controller re-booted and turned off the mains charger at that point, and left it off until sunset, but the solar controller really didn’t get off its butt to keep the voltages up.

At the moment, the single 120W panel and 20A controller on my father’s car is outperforming my 3-panel set-up by a big margin!

Today, no changes to the hardware or firmware, but still a similar story:

The battery must’ve been sitting just on the threshold, which tripped the charger for the 30 minutes I configured yesterday.  It was pretty much sunny all day, but just look at that moving average trend!  It’s barely keeping up.

A bit of searching suggests this is not a reliable piece of kit, with one thread in particular suggesting that this is not MPPT at all, and many people having problems.

Now, I could roll the dice and buy another.

I could throw another panel on the roof and see if that helps; we’re considering doing that actually, and may do so regardless of whether I fix this problem or not.

There are several MPPT charger projects on this very site.  DIY is a real possibility.  A thought in the back of my mind is to rip the Powertech MP3735 apart, re-purpose its guts, and make it a real MPPT charger.

Perhaps one with Modbus RTU/RS-485 reporting so that I can poll it from the battery monitor computer and plot graphs up like I’m doing now for the battery voltage itself.  There’s a real empty spot for 12V DC energy meters that speak Modbus.

If I want a 240V mains energy meter, I only have to poke my head into the office of one of my colleagues (who works for the sister company selling this sort of kit) and I could pick up a little CET PMC-220, which, with the addition of some terminating resistors (or just running comms at 4800 baud), would work just fine.  As soon as you want DC though, yeah, sure there’s some for solar set-ups that do 300V DC, but not humble 12V DC.

Mains energy meters often have extra features like digital inputs/outputs, so this could replace my little charge controller too.  This would be a separate project.

But that would leave me without a solar controller, which is not ideal, and I need to shut everything down before I can extract the existing one.  So for now, I’ve left the Powertech one in place with its solar input disconnected, so that it now just works as a glorified VSR and voltmeter/ammeter, as that bit works fine.

The Redarc is now hooked up to solar, with its output going into a spare socket going to the batteries.  This will cost me nothing to see if it’s the solar controller or not.  If it is, then I think some money on a VSR to provide the low-voltage protection, and re-instating the Redarc charger for solar duty will be the next step.  Then I can tear down the Powertech one at my leisure and figure out what they did wrong, or if it can be re-programmed.

The Meanwell charger is taking care of things as I type this, but tomorrow morning, we should hopefully see the solar set-up actually do some work…

… maybe. 🙂