Solar Cluster: Considering options for over-discharge protection

Right now, the cluster is running happily with a Redarc BCDC-1225 solar controller, a Meanwell HEP-600C-12 acting as back-up supply, and a small custom-made ATTiny24A-based power controller which manages the Meanwell charger.

The earlier purchased controller, a Powertech MP-3735, is now relegated to the role of over-discharge protection relay.  The device is many times the physical size of a VSR, and isn’t a particularly attractive device for that purpose.  I had tried it recently as a solar controller, but it’s fair to say it’s rubbish at the job.  On a good day, it struggles to keep the battery above “rock bottom”, and by about 2PM I’ll have Grafana pestering me about the battery slipping below the 12V minimum voltage threshold.

Actually, I’d dearly love to rip that Powertech controller apart and see what makes it tick (or not, in this case).  It’d be an interesting study in what they did wrong to produce such terrible results.

So, if I pull that out, the question is, what will prevent an over-discharge event from taking place?  First, I wish to set some criteria, namely:

  1. it must be able to sustain a continuous load of 30A
  2. it should not induce back-EMF into either the upstream supply or the downstream load when activated or deactivated
  3. it must disconnect before the battery reaches 10.5V (ideally it should cut off somewhere around 11-11.5V)
  4. it must not draw excessive power whilst in operation at the full load

With that in mind, I started looking at options.  One of the first places I looked was of course, Redarc.  They do have a VSR product, the VS12 which has a small relay in it, rated for 10A, so fails on (1).  I asked on their forums though, and it was suggested that for this task, a contactor, the SBI12, be used to do the actual load shedding.

Now, deep inside the heart of the SBI12 is a big electromechanical contactor.  Many moons ago, working on an electric harvester platform out at Laidley for Mulgowie Farming Company, I recall we were using these to switch the 48V supply to the traction motors in the harvester platform.  The contactors there could switch 400A and the coils were driven from a 12V 7Ah battery, which in the initial phases, were connected using spade lugs.

One day I was a little slow getting the spade lug on, so I was making-breaking-making-breaking contact.  *WHACK*… the contactor told me in no uncertain terms it was not happy with my hesitation and hit me with a nice big back-EMF spike!  I had a tingling arm for about 10 minutes.  Who knows how high that spike was… but it probably is higher than the 20V absolute maximum rating of the MIC29712s used for power regulation.  In fact, there’s a real risk they’ll happily let such a rapidly rising spike straight through to the motherboards, frying about $12000 worth of computers in the process!

Hence why I’m keen to avoid a high back-EMF.  Supposedly the SBI12 “neutralises” this … not sure how, maybe there’s a flywheel diode or MOV in there (like this), or maybe instead of just removing power in a step function, they ramp the current down over a few seconds so that the back-EMF is reduced.  So this isn’t an issue for the SBI12, but may be for other electromechanical contactors.

The other concern is the power consumption needed to keep such a beast actuated.  There’s an initial spike as the magnetic field ramps up and draws the armature of the contactor closed, then the current can drop back once contact has been made.  The figures on the SBI12 are ~600mA initially, then ~160mA when holding… give or take a bit.

I don’t expect this to be turned on frequently… my nodes currently have up-times around 172 days.  So while 600mA (7~8W at 12V nominal) is high, that’ll only be for a second at most.  Much of the current will be holding current at, let’s call it 200mA to be safe, so about 2~3W.

That 2-3W is going to be the same, whether my nodes collectively draw 10mA, 10A or 100A.

It seemed like a lot, but then I thought, what about an SSR?  You can buy a 100A DC SSR like this for a lot less money than the big contactors.  Whack a nice big heat-sink on it, and you’re set.  Well, why the heat-sink?  These things have both a fixed voltage drop and an on-resistance.  In the case of the Jaycar one, the drop is about 350mV and the on-resistance is about 7mΩ.

Suppose we were running flat chat at our predicted 30A maximum…

  • Power from the fixed voltage drop: 30A × 350mV = 10.5W
  • Power from the on-resistance: (30A)² × 7mΩ = 6.3W
  • Total power dissipation: 10.5W + 6.3W = 16.8W OUCH!

16.8W is basically the power of an idle compute node.  The 3W of the SBI12 isn’t looking so bad now!  But can we do better?

The function of a solid-state relay, amongst other things, is to provide galvanic isolation between the control and switching sides.  This is a feature I really don’t need, so I could reduce costs by just using a bare MOSFET.

The earlier issues I had with the body diode won’t be a problem here: there’s a definite “source” and “load”, so there’ll be no current flowing out of the load back to the source to confuse some sensing circuit on the source side.  The same body diode might be an issue for dual-battery systems, as the auxiliary battery can effectively supply current to a starter motor via this body diode, but in my case, it’s strictly switching a load.

I also don’t have inductive loads on my system, so a P-channel MOSFET is an option.  One candidate for this is the Infineon AUIRFS3004-7P.  The Ron on these is supposedly in the realm of 900µΩ-1.25mΩ, and of course, being that it’s a bare MOSFET and not an SSR, there’s no fixed voltage drop, just the resistive loss.  Thus my power dissipation at 30A is predicted to be a little over 1W.
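To put the three options side by side, here’s a quick back-of-the-envelope comparison in C (a rough sketch only: I’m assuming the ~200mA contactor holding current at 12V nominal, the Jaycar SSR figures above, and the AUIRFS3004-7P worst-case Ron):

```c
#include <stdio.h>

int main(void)
{
    const double load_a = 30.0;    /* predicted maximum continuous load, amps */
    const double batt_v = 12.0;    /* nominal battery voltage */

    /* Electromechanical contactor (SBI12-style): only the holding coil dissipates */
    double p_contactor = 0.200 * batt_v;

    /* 100A DC SSR: fixed ~350mV drop plus ~7 milliohm on-resistance */
    double p_ssr = (load_a * 0.350) + (load_a * load_a * 0.007);

    /* Bare MOSFET (AUIRFS3004-7P, worst-case Ron ~1.25 milliohm) */
    double p_fet = load_a * load_a * 0.00125;

    printf("Contactor holding coil: %.1f W\n", p_contactor); /* ~2.4 W  */
    printf("100A DC SSR:            %.1f W\n", p_ssr);        /* ~16.8 W */
    printf("Bare MOSFET:            %.2f W\n", p_fet);        /* ~1.1 W  */
    return 0;
}
```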

There are others too with even smaller Ron values, but they are in teeny tiny 5mm square surface-mount packages.  The AUIRFS3004-7P looks dead-buggable, just bend up the gate pin so I can solder direct to it, and treat the others as single “pins”, then strap the sucker to a big heatsink (maybe an old PIII heatsink will do the trick).

I can either drive this MOSFET with something of my own creation, or with the aforementioned Redarc VS12.  The VS12 still does contain a (much smaller) electromechanical relay, but at 30mA (~400mW), it’s bugger all.

The question though was what else could be done?  @WIRING_SOLUTIONS suggested some units made by Victron Energy.  These do have a nice feature in that they also have over-voltage protection, and conveniently, it’s 16V, which is the maximum recommended for the MIC29712s I’m using.  They’re not badly priced, and are solid-state.

However, what’s the Ron, what’s the voltage drop?  Victron don’t know.  They tell me it’s “minimal”, but is that 100nV, 100mV, 1V?  At 30A, 100mV drop equates to 3W, on par with the SBI12.  A 500mV drop would equate to a whopping 15W!

I had a look at the suppliers for Victron Energy products, and via those, found a few other contenders such as this one by Baintech and the Projecta LVD30.  I haven’t asked about these, but again, like the Victron BatteryProtect, neither of these list a voltage drop or Ron.

There’s also this one from Jaycar, but this is the same place that sold me the Powertech MP-3735, and before that the original Powertech MP-3089, then provided a replacement for that first one, then also replaced the replacement under RMA.  The Jaycar VSR also has practically no specs… yeah, I think I’ll pass!

Whitworths Marine sell this; it might be worth looking at, but the cut-out voltage is a little high, and they don’t actually give the holding current (a 330mA “engage” current sounds like it’s electromechanical), so I have no idea how much power this would dissipate either.

The power controller isn’t doing a job dissimilar to a VSR… in fact it could be repurposed as one, although I note its voltage readings seem to drift quite a lot.  I suspect this is due to the choice of 5% tolerance resistors on the voltage sensing circuit and my use of the ~1.1V internal voltage reference.  The resistors will drift a little bit, and the voltage reference can be anywhere from 1.0 to 1.2V.

Would an LM311N with good-quality 1% resistors and a proper voltage reference be “better”?  Who knows?  Maybe I should try an experiment and see if I can get minimal drift out of an LM311N.  It’s either the resistors, the voltage reference, or a combination of the two that’s responsible for the power controller’s drift.

Perhaps I need to investigate which is causing the problem and see what can be done in the design to reduce it.  If I can get acceptable results, then maybe the VS12 can be dispensed with.  I may be able to do it with another ATTiny24A, or even just a simple LM311N.

Solar Cluster: Making the BCDC1225 get up and boogy!

So, I’ve been running the Redarc controller for a little while now, and we’ve had some good days of sunshine to really test it out.

Recall in an earlier posting with the Powertech solar controller I was getting this in broad daylight:

Note the high amount of “noise”: this is the Powertech solar controller PWMing its output. I’m guessing output filtering is one of the corners they cut; I expect to see empty footprints for the juicy big capacitors that would have been in the “gold” model sent for emissions testing. It’ll be interesting to tear that down some day.

I’ve had to do some further tweaks to the power controller firmware, so this isn’t an apples-to-apples comparison, maybe next week we’ll try switching back and see what happens, but this was Tuesday, on the Redarc controller:

You can see that overnight, the Meanwell 240V charger was active until a little after 5AM, when my power controller decided the sun should take over. There’s a bit of discharging, until the sun crept up over the roof of our back-fence-neighbour’s house at about 8AM. The Redarc basically started in “float” mode, because the Meanwell had done all the hard work overnight. It remains so until the sun drops down over the horizon around 4PM, and the power controller kicks the mains back on around 6PM.

I figured that, if the Redarc controller saw the battery get below the float voltage at around sunrise, it should boost the voltage.

The SSR controlling the Meanwell was “powered” by the solar, meaning that by default, the charge controller would not be able to inhibit the mains charger at night as there was nothing to power the SSR. I changed that last night, powering it from the battery. Now, the power controller only brings in the mains charger when the battery is below about 12.75V. It’ll remain on until it’s been at above 14.4V for 30 minutes, then turn off.

In the last 24 hours, this is what the battery voltage looks like.

I made the change at around 8PM (can you tell?), and so the battery immediately started discharging, then the charge-discharge cycles began. I’m gambling on the power being always available to give the battery a boost here, but I think the gamble is a safe one. You can see what happened 12 hours later when the sun started hitting the panels: the Redarc sprang into action and is on a nice steady trend to a boost voltage of 14.6V.

We’re predicted to get rain and storms tomorrow and Saturday, but maybe Monday, I might try swapping back to the Powertech controller for a few days and we’ll be able to compare the two side-by-side with the same set-up.


It’s switched to float mode now having reached a peak boost voltage of 14.46V.  As Con the fruiterer would say … BEEEAAUUTIFUUUL!

DC-DC Converter: Controller designed

So this is what I’ve come up with for the core controller.

There’s provisions for two versions on this board, one with an ATTiny861 which does high-speed (250kHz) PWM and can drive a buck, boost or buck-boost DC-DC converter.  It features differential I²C interfaces for the input and output INA219 boards, and LVDS for controlling the MOSFET boards.

The other version is built around the ATTiny24A, and just features the ability to turn MOSFETs on and off.  It can drive two statically, or PWM one (at a much lower speed), with the user supplying the driver logic.  Since this device does not do high-speed switching, I’ve forgone the LVDS control in favour of a simple current loop.  The I²C is still differential though, as that could be some distance away and is still somewhat high frequency.

The layout of the board is a small 5×5cm 4-layer PCB.

I had to go 4-layer as I needed to route signals both sides and didn’t want to interrupt the power planes.  The two inner layers are VCC and GND.  There’s de-coupling capacitors galore, although the two power planes will probably function as a decent capacitor in their own right.  ICSP is via the interface header at the bottom.

DC-DC Converter: Splitting up the project

I was originally thinking of one monolithic board which would have everything it needed.

There was provision for the lot, including a separate ATTiny24A so that you could omit all but one of the MOSFETs, swap the remaining MOSFET for a P-channel, drop the MOSFET drivers, one of the INA219s, and the ATTiny861, and you’d have just a monitoring board with a (low-speed) PWMable switch.  It’d plug into the same place and use the same host interface.  The one board could be made into just a boost, or just a buck.  Flexibility.

There was just one snag.  That’ll work for small power supplies with maybe up to 5A capability (~100W) but not for the 50A version.  The MOSFETs will fit, but the tracks will need to be huge, the board will be hideously expensive to make, and they don’t make inductors big enough.

Looking around for inductors, I did see these.  They’re not massive like the 10uH one I saw, and they’re not expensive.  The downside is they’re about 10% of what I really need.  I guess I’ll just make do.  They’re also not PCB-mount (mind you, a 40kg inductor doesn’t PCB-mount either).

Thus, it may be more sensible to separate the MOSFETs and high-power stuff from the controller.  Now here’s the rub: we’re dealing with PWM and sub-15ns pulses.

Years ago for my industrial experience, I did work on an electric harvester platform.  The system ran 48V.  The motors were rated at 20kW, and were made in house using windings wound from 5mm diameter enamelled copper wire and neodymium magnets.

We had loads of issues with MOSFETs blowing.  The MOSFET driver was mounted close to the MOSFETs, as I’m proposing to do here, but between the DSP and the driver was a long-ish run of ribbon cable. @Bil Herd posted this article covering the challenges of inductance in PCB layout.  That same problem applies to “long” cable runs too.

10 years ago, when we were working on this project, I remember asking whether we had considered using coax cable instead of a ribbon cable.  The idea was rubbished at the time.  Given we were PWMing 400A, I think there might’ve been something in that suggestion.

That ribbon (10~20cm of it) would have had a lovely inductance all of its own.  I have no idea what frequency the PWM was running at (I might have the code somewhere but I can’t be stuffed digging it up), but we were fundamentally driving a single-ended signal over a fairly long distance.  Yes, ground was close, but probably not close enough; a twisted pair would have been better, though even that isn’t perfect.  We blew many MOSFETs on that project.  Big TO-263s!

An earlier article on differential signalling got me thinking: why not use LVDS for the PWM?  A quick search has revealed this receiver and transmitter (Mouser says two receivers on it, but I think that’s a typo).  The idea being that I send the PWM down a differential pair using LVDS.  155Mbps should be plenty fast enough (the ATTiny861 can only do 64MHz) and these parts will run at the 5V needed for fast switching.  In fact they require it.

Using twisted pairs, the inductance should cancel.  I’ll make a MOSFET board that just has these signal pairs:

  • +12V (for the MOSFET driver) and 0V
  • +5V (for the LVDS receiver) and 0V
  • High side PWM + and –
  • Low side PWM + and –

There’s a ground-loop I need to be wary of between the 12V and 5V rails, really it’s the same 0V rail for both.  I suspect they’ll still need to be connected at both ends.  Add in more of those screw terminals to take the input and output power off-board, and I think we should be set.

Similarly, the INA219 should probably be a separate board, with scope for having a chassis-mounted current shunt.  The connection to the current shunt’s sense output is a low-power connection, so no issue there.  You want to keep it short for accuracy reasons, but a simple twisted pair will work fine.

Solar Cluster: Jury still out on solar controller, thinking of PSU designs

So, the last few days it’s been overcast.  Monday I had a firmware glitch that caused the mains supply to be brought in almost constantly, so I’d disregard that result.

Basically, the moment the battery dropped below ~12.8V for even a brief second, the mains got brought in.  We were just teetering on the edge of 12.8V all day.  I realised that I really did need a delay on firing off the timer, so I’ve re-worked the logic:

  • If the battery drops below V_L, start a 1-hour timer
  • If the battery rises above V_L, reset the 1-hour timer
  • If the battery drops below V_CL, or the timer expires, turn on the mains charger
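In C terms, that logic boils down to something like the following (a rough sketch, not the actual firmware: the threshold values, the 1-second tick and the names are placeholders):

```c
#include <stdint.h>
#include <stdbool.h>

#define V_L_MV       12800u  /* low threshold (V_L), millivolts */
#define V_CL_MV      11800u  /* critical-low threshold (V_CL), millivolts */
#define TIMER_TICKS   3600u  /* 1-hour timer, assuming a 1-second tick */

static uint16_t low_timer;

/* Called once per second with the measured battery voltage in millivolts.
 * Returns true when the mains charger should be switched on. */
bool mains_charger_wanted(uint16_t batt_mv)
{
    if (batt_mv < V_CL_MV)
        return true;              /* critically low: bring the mains in now */

    if (batt_mv < V_L_MV) {
        if (low_timer < TIMER_TICKS)
            low_timer++;          /* below V_L: let the 1-hour timer run */
    } else {
        low_timer = 0;            /* back above V_L: reset the timer */
    }

    return (low_timer >= TIMER_TICKS);
}
```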

That got me better results.  It means V_CL can be quite low, without endangering the battery supply, and V_L can be at 12.8V where it basically ensures that the battery is at a good level for everything to operate.

I managed to get through most of Tuesday until about 4PM.  There was a bit of a hump, which I think was the solar controller trying to extract some power from the panels.  I really need a good sunny day like the previous week to test properly.

This is leading me to consider my monitoring device.  At the moment, it just monitors voltage (crudely) and controls the logic-level enable input on the mains charger.  Nothing more.  It has done that well.

A thought is that maybe I should re-build this as a Modbus-enabled energy meter with control.  This idea has evolved a bit, enough to be its own project actually.  The thought I have now is a more modular design.

If I take the INA219B and a surface-mount current shunt, I have a means to accurately measure input voltage and current.  Two of these, and I can measure the board’s output too.  Stick a small microcontroller in between, some MOSFETs and other parts, and I can have a switchmode power supply module which can report on its input and output power and vary the PWM of the power supply to achieve any desired input or output voltage or current.
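To give a feel for the monitoring side, here’s a minimal sketch of deriving power from an INA219’s raw shunt and bus voltage registers (register addresses and LSB sizes as I read them from the INA219 datasheet; the i2c_read_reg16() helper is hypothetical and would sit on whatever I²C/USI driver the MCU ends up with):

```c
#include <stdint.h>

/* INA219 register map (per the datasheet) */
#define INA219_REG_SHUNT  0x01   /* shunt voltage, LSB = 10uV, signed */
#define INA219_REG_BUS    0x02   /* bus voltage in bits 15..3, LSB = 4mV */

/* Hypothetical helper: read a 16-bit big-endian register over I2C. */
extern uint16_t i2c_read_reg16(uint8_t i2c_addr, uint8_t reg);

/* Return the measured power in milliwatts, given the shunt resistance in
 * milliohms (e.g. 100 for a 0.1 ohm shunt). */
uint32_t ina219_power_mw(uint8_t i2c_addr, uint16_t shunt_mohm)
{
    int16_t  shunt_raw = (int16_t)i2c_read_reg16(i2c_addr, INA219_REG_SHUNT);
    uint16_t bus_raw   = i2c_read_reg16(i2c_addr, INA219_REG_BUS);

    int32_t  shunt_uv  = (int32_t)shunt_raw * 10;       /* microvolts across the shunt */
    uint32_t bus_mv    = (uint32_t)(bus_raw >> 3) * 4;  /* bus voltage, millivolts */

    int32_t current_ma = shunt_uv / (int32_t)shunt_mohm; /* I = Vshunt / Rshunt */
    if (current_ma < 0)
        current_ma = -current_ma;

    return (bus_mv * (uint32_t)current_ma) / 1000UL;     /* P = Vbus x I */
}
```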

The MCU could be one of the ATTiny24As I’m using, or an ATTiny861.  The latter is attractive as it can do high-speed PWM, but I’m not sure that’s necessary in this application, and I have loads of SOIC ATTiny24As.  (Then again, I also have loads of PDIP ATTiny861s.)

The board would expose the ICSP pins plus two more for interrupt and chip select, allowing for a simple jig for reprogramming.  I haven’t decided on a topology yet, but the split-pi is looking attractive.  I might start with a buck converter first though.

This would talk to a “master” microcontroller which would provide the UI and Modbus interface.  If the brains of the PSU MCU aren’t sufficient, this could do the more grunty calculations too.

This would allow me to swap out the PSU boards to try out different designs.

Solar Cluster: Return of the Redarc BCDC1225

Well, I’ve now had the controller working for a week or so… the solar output has never been quite what I’d call “great”, but it seems it’s really been on the underwhelming side.

One of the problems I had earlier before moving to this particular charger was that the Redarc wouldn’t reliably switch between boosting from 12V to MPPT from solar.  It would get “stuck” and not do anything.  Coupled with the fact that there’s no discharge protection, and well, the results were not a delight to the olfactory nerves at 2AM on a Sunday morning!

It did okay as a MPPT charger, but I needed both functions.  Since the thinking was I could put a SSR between the 12V PSU and the Redarc charger, we tried going the route of buying the Powertech MP3735 solar charge controller to handle the solar side.

When it wants to work, it can put over 14A in.  The system can run on solar exclusively.  But it’s as if the solar controller “hesitates”.

I thought maybe the other charger was confusing it, but having now set up a little controller to “turn off” the other charger, I think I can safely put that theory to bed.  This was the battery voltage yesterday, where there was pretty decent sunshine.

There’s an odd blip at about 5:40AM, I don’t know what that is, but the mains charger drops its output by a fraction for about 50 seconds.  At 6:37AM, the solar voltage rises above 14V and the little ATTiny24A decides to turn off the mains charger.

The spikes indicate that something is active, but it’s intermittent.  Ultimately, the voltage winds up slipping below the low voltage threshold at 11:29AM and the mains charger is brought in to give the batteries a boost.  I actually made a decision to tweak the thresholds to make things a little less fussy and to reduce the boost time to 30 minutes.

The charge controller re-booted and turned off the mains charger at that point, and left it off until sunset, but the solar controller really didn’t get off its butt to keep the voltages up.

At the moment, the single 120W panel and 20A controller on my father’s car is outperforming my 3-panel set-up by a big margin!

Today, no changes to the hardware or firmware, but still a similar story:

The battery must’ve been sitting just on the threshold, which tripped the charger for the 30 minutes I configured yesterday.  It was pretty much sunny all day, but just look at that moving average trend!  It’s barely keeping up.

A bit of searching suggests this is not a reliable piece of kit, with one thread in particular suggesting that this is not MPPT at all, and many people having problems.

Now, I could roll the dice and buy another.

I could throw another panel on the roof and see if that helps, we’re considering doing that actually, and may do so regardless of whether I fix this problem or not.

There’s several MPPT charger projects on this very site.  DIY is a real possibility.  A thought in the back of my mind is to rip the Powertech MP3735 apart and re-purpose its guts, and make it a real MPPT charger.

Perhaps one with Modbus RTU/RS-485 reporting so that I can poll it from the battery monitor computer and plot graphs up like I’m doing now for the battery voltage itself.  There’s a real empty spot for 12V DC energy meters that speak Modbus.

If I want a 240V mains energy meter, I only have to poke my head into the office of one of my colleagues (who works for the sister company selling this sort of kit) and I could pick up a little CET PMC-220, which, with the addition of some terminating resistors (or just running the comms at 4800 baud), would work just fine.  As soon as you want DC though… sure, there’s some for solar set-ups that do 300V DC, but not humble 12V DC.

Mains energy meters often have extra features like digital inputs/outputs, so this could replace my little charge controller too.  This would be a separate project.

But that would leave me without a solar controller, which is not ideal, and I need to shut everything down before I can extract the existing one.  So for now, I’ve left the Powertech one in-place, disconnected its solar input so that now it just works as a glorified VSR and voltmeter/ammeter, as that bit works fine.

The Redarc is now hooked up to solar, with its output going into a spare socket going to the batteries.  This will cost me nothing to see if it’s the solar controller or not.  If it is, then I think some money on a VSR to provide the low-voltage protection, and re-instating the Redarc charger for solar duty will be the next step.  Then I can tear down the Powertech one at my leisure and figure out what they did wrong, or if it can be re-programmed.

The Meanwell charger is taking care of things as I type this, but tomorrow morning, we should hopefully see the solar set-up actually do some work…

… maybe. 🙂

Solar Cluster: Power controller installed

Well, I finally got around to installing that power controller.

Yes, the top of that rack is getting to be a pigsty. Even the controller isn’t my best work:

You can see above I’ve just tacked wires onto the points I need and brought those out to a terminal strip.  There’s provision there for some PWM-controlled fans, but right now this is unused.  I’ve omitted the parts not required for the application.  If this works out, I might consider doing another board, this time better dedicated to the task at hand.

With that controller in place, I’ve now wound the charger back up.  In fact I made a whoopsie at first: I forgot that the Vout pot on the HEP-600C-12 sets the float voltage, and wound that right up to 14.4V, which meant a boost voltage of 15V!

Thankfully I looked over at the volt meter on the solar controller and realised my mistake quickly.  15 seconds won’t hurt anything, and it is now set at 13.6V.  You don’t even see it on the 40-sample average.  The controller should let the mains charger sit there for an hour before it reconsiders the need for mains.

I think my next step … there’s a yard that could do with a hair cut… I’ll drag the mower out and chase that around the yard for a bit.  Then we’ll see what it looks like.


Okay, back from a little mowing… and sure enough, the controller is mostly doing the right thing.  I think I’ll need to tweak some set-points, maybe set the solar threshold lower.  Thankfully the “inhibit” LED is just an indication that it considers the solar voltage low, the solar is going to be on no matter what.

Yes, that SSR is massive for the job. It’s what I had on hand at the time. I’ll probably replace it with something small, maybe a reed relay since they’re cheap.

Right at this point, I have the SSR’s inputs connected between the solar V+ and the BC-547B on the board, so when the sun *does* go down, the mains power will be turned back on no matter what the controller thinks.

A close-up look of the status LEDs shows what mode we’re in:

We’ll ignore the temperature ones. Ultimately they indicate the state of the fan controller, and in this state, it’d be running the fans, as indicated by the Fan PWM LED to the left. Temperature is measured by the sensor in the ATTiny24A on-board, so not highly accurate.

The other mystery “LED” is the shiny surface on the BC-547B to the left of the two source status LEDs.

Here, I suspect the sun ducked behind a cloud so the voltage dropped, hence both “inhibit” LEDs came on. Earlier, “Float” was lit (you can sort-of make it out in the previous photo): the charger was actually actively trying to charge the battery, but to the controller it looked to be done. It let it go for the hour as programmed, then turned off the mains charger to let the solar panel take over.

The idea is that during the day, if it gets low, give it a boost from mains, then go back to solar. We only want to rely on mains at night.

Now, it should stay in that state until tonight, when the lack of sun should bring the mains charger online (by sheer fact that the solar panels power the “coil” of the relay).


So, I saw that, and had a look… sure enough, the controller is still asserting that the mains charger should be off. I think I need to bump the battery thresholds up a bit; although the voltage is still safely above the danger zone, it’s lower than I’m comfortable with.

Right up until 5:58PM, it seems the MCU just held on, thinking the battery voltage was “good enough”, so no need for a charge yet. I might want to drop the solar threshold down some so it doesn’t “flap” when broken cloud passes over, then raise the minimum battery threshold a bit.

Even now, the thing that’s turning the mains charger on is the fact that the 1.5V coming off the panels is not sufficient for activating the solid-state relay I’m using. I’m thankful I wired the SSR to Vsolar and made the MCU output open-collector. This is a useful little safety feature, making it impossible for the MCU to latch-up and hold the mains charger off, as the sun will eventually set, and that will force the mains charger to turn on like it did tonight.

Solar Cluster: Return of the power controller

Well, I’ve been tossing up how to control the mains charger for a while now.

When I first started the project, my thinking was to use an old Xantrex charger we had kicking around, and just electrically disconnect it from the batteries when we wanted to use the solar power.  I designed a 4-layer PCB which sported a ATTiny24A microcontroller, MOSFETs (which I messed up) and some LEDs.

This was going to be a combined fan controller and power management unit.  It had the ability (theoretically) to choose a supply based on the input voltage, and to switch if needed between supplies.

It didn’t work out; the charger got really confused by the behaviour of the controller.  I was looking to re-instate it using the Redarc solar controller, but I never got there.  In the end, it was found that the Redarc controller had problems switching sources and would do nothing whilst the batteries went flat.

We’ve now replaced both ends of the system.  The solar controller is a Powertech MP3735 and integrates over-discharge protection.  The mains charger is now a MeanWell HEP-600C-12 (which has not missed a beat since the day it was put in).

Unlike my earlier set-up, this actually has a 5V logic signal to disable it, and my earlier controller could theoretically generate that directly.

Looking at the PCB of my earlier power controller attempt, it looks like this could still work.

Above is the PCB artwork.  I’ve coloured in the sections and faded out the parts I can omit.

In green up the top-left we have the mains control/monitoring circuitry.  We no longer see the mains voltage, so no point in monitoring it, so we can drop the resistor divider that fed the ADC.  This also means we no longer need the input socket P2.

Q2 and Q7 were the footprints of the two P-channel MOSFETs.  We don’t need the MOSFETs themselves, but the signals we need can be found on pin 1 of Q2.  This is actually the open-drain output of Q1, which we may be able to hook directly to the REMOTE+ pin on the charger.  A pull-up between there and the charger’s 5V rail, and we should be in business.

In yellow, bottom left is the solar monitoring interface.  This is still useful, but we won’t be connecting solar to the battery ourselves, so we just keep the monitoring parts.  The LED can stay as an indicator to show when solar is “good enough”.

In purple, occupying most of the board, is the controller itself.  It stays for obvious reasons.

In red, is the fan control circuitry.  No reason why this can’t stay.

In blue is the circuitry for monitoring the battery voltage.  Again, this stays.

The main advantage of doing this is that I already have the boards, and a number of microcontrollers already present.  There’s a board with all except the big MOSFETs populated, with the MOSFETs replaced by 3-pin KK sockets.

How would the logic work?  Much the same as the analogue version I was pondering.

  • If battery voltage is low, OR, the sun has set, enable the mains charger.

What concerned me about an analogue solution was what would happen once the charger got to the constant-voltage stage.  We want to give it a bit of time to keep the battery topped up.  Thus it makes sense to shut down the charger after a fixed delay.

This is easy to do in a microcontroller.  Not hard with analogue electronics either, it’s fundamentally just a one-shot, but doing it with an MCU is a single-chip solution.  I can make the delay as long as I like.  So likely once the battery is “up to voltage”, I can let it float there for an hour, or until sunrise if it’s at night.
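As a sketch of how that might look (names, the tick rate and the one-hour figure are placeholders; the real firmware would drive this from the ADC readings):

```c
#include <stdint.h>
#include <stdbool.h>

#define HOLD_TICKS 3600u   /* let the charger float ~1 hour, assuming a 1-second tick */

static bool     charging;
static uint16_t hold_timer;

/* Called once per second.  batt_low: below the low threshold.
 * batt_full: at or above the target voltage.  sun_up: solar is usable. */
bool mains_charger_enable(bool batt_low, bool batt_full, bool sun_up)
{
    if (batt_low || !sun_up) {
        charging   = true;   /* low battery, or the sun has set: enable mains */
        hold_timer = 0;
    }

    /* Once up to voltage, let it float for the fixed delay.  At night the
     * timer keeps being reset above, so the charger stays on until sunrise. */
    if (charging && batt_full && ++hold_timer >= HOLD_TICKS)
        charging = false;

    return charging;
}
```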

Solar Cluster: Charge controller testing, battery bank dimensioning and other thoughts

So, further progress on the charge controller.

Thinking about the problem … I realised that I really do not want to be testing for V_BN ≥ V_H when entering the CHARGE_CHECK state, as it’ll prematurely terminate the charge while the battery is being bulk-charged.

Better to wait until the charger decides to stop … which we’ll see as the battery voltage ceasing to increase. We do want to check we’re not critically high however, so we can swap out V_H for V_CH there.

Next, when we find that V_BN ≤ V_BL, meaning the battery is not charging, we can check for V_BN ≥ V_H and stop charging at that point.

That change worked pretty well, but it was still flapping between sources. A little state management helped this. If we declare another state variable, charger_warning, we can flip this to 1 upon first detecting that V_BN ≤ V_BL, then wait a little longer. If after a time-out this is still the case, then we can take action. Thus we define a new timer, t_CWARN, which delays acting on the not-charging case.
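In code terms, the debounce might look something like this (a sketch only: the t_CWARN value and the names are placeholders):

```c
#include <stdint.h>
#include <stdbool.h>

#define T_CWARN_TICKS 12u  /* how many polls to wait before acting on "not charging" */

static bool     charger_warning;
static uint16_t warn_timer;

/* Called on each poll while a charging source is selected.
 * Returns true when it's time to give up on the current source. */
bool charge_not_progressing(uint16_t v_bn, uint16_t v_bl)
{
    if (v_bn > v_bl) {             /* battery voltage still rising: all good */
        charger_warning = false;
        warn_timer      = 0;
        return false;
    }

    if (!charger_warning) {        /* first sign of "not charging": start t_CWARN */
        charger_warning = true;
        warn_timer      = T_CWARN_TICKS;
        return false;
    }

    if (warn_timer > 0)
        warn_timer--;

    return (warn_timer == 0);      /* still flat after t_CWARN: take action */
}
```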

A bit of threshold tweaking, and things are behaving themselves. I’m using an el-cheapo 3-way camping fridge as the stand-in for the cluster. This is an Aldi special bought some years ago that draws about 5-6A… and when running on 12V power, features no thermostat.

We’re finding that its cooling capacity is no match for Brisbane’s early autumn weather anyway, so it would pretty much be a constant load even if the thermostat worked on 12V.

The battery we’re using is an old 105Ah AGM battery… which is one of two batteries from the caravan we have here. They were the original batteries, and this battery’s mate had failed when both were replaced. We’ve noted this battery getting warm whilst charging, so we think it might now be on the way out too.

What to replace it with? LiFePO₄ is AU$1000 for 100Ah, so too expensive. AGM is still the better bet. I can get 300Ah AGM batteries, but they weigh nearly as much as I do. I can just manage the 105Ah, so we’ll stick with those. I will need more than one long-term.

That brings the thorny issue of connecting them. I am not keen to hook batteries in parallel for various reasons. At least not permanently.

Now, the charger I’m using for mains is a 3-channel charger. I can make additional charge controllers (with the caveat that I need heatsinks for the MOSFETs…sigh!) and I can look for 3-channel solar chargers, or just get multiple chargers for the extra batteries. This can be done.

The load is the elephant in the room. I’d ideally like to manage it as a single load, although conceivably, I could put the switch and one storage node on one battery, a second storage node and a compute node on a second, and the final storage and compute nodes on the third. If the switch goes however, my cluster is toast.

I can put batteries in parallel, but this really does need to be done with care, using carefully matched batteries. So the better solution is to have a controller that chooses the battery with the highest voltage.

There might be an analogue means of implementing this, but a microcontroller is a single-chip solution. The ATTiny24As have up to 8 ADC/GPIO pins and three non-ADC GPIO pins. It’s what I’m already using for the charge controller, so is an easy choice.

The cluster will not tolerate a break-before-make switch-over. I thought about using a capacitor bank to keep the cluster alive during a brief (~1sec max) switch-over. Back-of-the-envelope calculations suggested I would need a 10F capacitor bank. I can get a 16V 470mF capacitor for AU$70 each… and would need 20 of these. Ouch!
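For what it’s worth, the rough sizing behind that figure goes like C = I×Δt ÷ ΔV: assuming the full 30A load, a one-second hold-up and about 3V of allowable droop before the regulators run out of head-room, that’s 30A × 1s ÷ 3V = 10F.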

A small battery is another option, maybe a 7Ah, but that has its own maintenance issues, and represents a single point of failure.

I can get Schottky diodes capable of 40A, but they still present a ~0.6V voltage drop. At 30A, that represents about 18W! For comparison, a relay with a 225ohm coil resistance will draw ~60mA when the battery is at the maximum of 15V, representing a load of about 1W.

Or I can use more MOSFETs like the ones I’m already using, which draw even less power; poor man’s solid-state relays. Latching relays also exist, but they can be rather expensive, more so than a solid-state relay.

I can probably get away with temporary parallel connections, so a make-before-break arrangement would let me switch sources. Or, I could place my switch across a Schottky, meaning I put up with that ~18W for a brief moment while I switch sources.

So more to think about, but we are getting close. I can defer this decision until I get a second battery, but I am getting close to the point where the cluster will be running full-time.

Solar Cluster: Charge control flow control diagram

So, as promised, the re-design of the charge controller… now under the influence of a few glasses of wine, so this should be interesting…

As I mentioned in my last post, it was clear that the old logic just wasn’t playing nice with this controller, and that using this controller to maintain the voltage to the nodes below 13.6V was unrealistic.

The absolute limits I have to work with are 16V and 11.8V.

The 11.8V comes from the combination of regulator voltage drop and the ATX PSU input range limits: the PSUs don’t operate below about 10.8V, and if you add the ~700mV regulator drop, you get 11.5V… plus you want to allow yourself some head room. Cycling the battery that deep does it no good either.

As for the 16V limit… this is again a limitation of the LDOs, they don’t operate above 16V. In any case, at 16V, the poor LDOs are dropping over 3V, and if the node is running flat chat, that equates to 15W of power dissipation in the LDO. Again, we want some headroom here.

The Xantrex charger likes pumping ~15.4V in at flat chat, so let’s go 15.7V as our peak.

Those are our “extreme” ranges.

At the lower end, we can’t disconnect the nodes, but something should be visible from the system firmware on the cluster nodes themselves, and we can thus do some proactive load shedding, hibernating virtual instances and preparing nodes for a blackout.

Maybe I can add a small 10Mbps Ethernet module to an AVR that can wake the nodes using WOL packets or IPMI requests. Perhaps we shut down two nodes, since the Ceph cluster will need 2/3 up, and we need at least one compute node.

At the high end, the controller has the ability to disconnect the charger.

So that’s worked out. Now, we really don’t want the battery getting that critically low. Thus the time to bring the charger in will be some voltage above the 11.8V minimum. Maybe about 12V… perhaps a little higher.

We want it at a point that when there’s a high load, there’s time to react before we hit the critical limit.

The charger needs to choose a charging source, switch that on, then wait … after a period check the voltage and see if the situation has improved. If there’s no improvement, then we switch sources and wait a bit longer. Wash, rinse, repeat. When the battery ceases to increase in voltage, we need to see if it’s still in need of a charge, or whether we just call it a day and run off the battery for a bit.

If the battery is around 14.5~15.5V, then that’s probably good enough and we should stop. The charger might decide this for us, and so we should just watch for that: if the battery stops charging, and it is at this higher level, just switch to discharge mode and watch for the battery hitting the low threshold.

Thus we can define four thresholds, subject to experimental adjustment:

  Symbol   Description             Threshold
  V_CH     Critical high voltage   15.7V
  V_H      High voltage            15.5V
  V_L      Low voltage             12.0V
  V_CL     Critical low voltage    11.8V

Now, our next problem is the waiting… how long do we wait for the battery to change state? If things are in the critical bands, then we probably want to monitor things very closely, outside of this, we can be more relaxed.

For now, I’ll define two time-out settings… which we’ll use depending on circumstances:

  Symbol   Description                      Period
  t_LF     Low-frequency polling period     15 sec
  t_HF     High-frequency polling period    5 sec

In order to track the state, I need to define some variables… we shall describe the charger’s state in terms of the following variables:

  Symbol   Description                                                      Initial value
  V_BL     Last-known battery voltage, set at particular points.            0V
  V_BN     The current battery voltage, as read by the ADC using an
           interrupt service routine.                                       0V
  t_d      Timer delay: a timer used to count down until the next event.    t_HF
  S        Charging source, an enumeration:                                 0
             • 0: no source selected
             • 1: main charging source (e.g. solar)
             • 2: back-up charging source (e.g. mains power)

The variable names in the actual code will be a little more verbose and I’ll probably use #defines for the enumeration.
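Purely as an illustration of that, the declarations might end up looking something like this (names and types are placeholders):

```c
#include <stdint.h>

/* Charging source "enumeration" */
#define CHG_SRC_NONE    0   /* no source selected */
#define CHG_SRC_MAIN    1   /* main charging source (e.g. solar) */
#define CHG_SRC_BACKUP  2   /* back-up charging source (e.g. mains power) */

/* Polling periods, in seconds */
#define T_POLL_LF      15   /* low-frequency polling period */
#define T_POLL_HF       5   /* high-frequency polling period */

static volatile uint16_t vbatt_now;                   /* V_BN: updated from the ADC ISR */
static uint16_t          vbatt_last;                  /* V_BL: latched at particular points */
static uint16_t          timer_delay = T_POLL_HF;     /* t_d: countdown to the next event */
static uint8_t           charge_src  = CHG_SRC_NONE;  /* S */
```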

Below is the part-state-machine, part-flow-chart diagram that I came up with. It took a few iterations to describe this accurately; I was going to use a state machine syntax similar to what my workplace uses, but in the end found that ye olde flow chart showed it best.

In this diagram, a filled in dot represents the entry point, a dot with an X represents an exit point, and a dot in a circle represents a point where the state machine re-enters the state and waits for the main loop to iterate once more.

You’ll note that for this controller, we only care about one voltage, the battery voltage. That said, the controller will still have temperature monitoring duties, so we still need some logic to switch the ADC channel, throw away dummy samples (as per the datasheet) and manage sample storage. The hardware design does not need to change.
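For reference, the channel switch with a dummy-sample throw-away might look something like this on the ATTiny24A (avr-libc register names; a single discarded sample and a polled read are shown for brevity, whereas the real firmware does this from the ADC interrupt):

```c
#include <avr/io.h>
#include <stdint.h>

/* Switch the ADC multiplexer to the given channel, keeping the reference
 * selection bits, discard the first conversion after the switch, and
 * return the next result. */
static uint16_t adc_read_channel(uint8_t channel)
{
    ADMUX = (ADMUX & 0xC0) | (channel & 0x3F);  /* keep REFS1:0, set MUX5:0 */

    for (uint8_t i = 0; i < 2; i++) {
        ADCSRA |= _BV(ADSC);                    /* start a conversion */
        while (ADCSRA & _BV(ADSC))
            ;                                   /* wait for it to complete */
    }
    return ADC;                                 /* second conversion's result */
}
```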

We can use quiescent voltages to detect the presence of a charging source, but we do not need to, as we can just watch the battery voltage rise, or not, to decide whether we need to take further action.