
Solar Cluster: Re-working the charge controller, setting up a home

So, having knocked the regulation on the LDOs down a few pegs… I am closer to the point where I can leave the whole rig running unattended.

One thing I observed prior to the adjustment of the LDOs was that the controller would switch in the mains charger, see the voltage shoot up quickly to about 15.5V, before going HOLYCRAP and turning the charger off.

I had set a set point at about 13.6V based on two facts:

  • The IPMI BMCs complained when the voltage rose above this point
  • The battery is nominally 13.8V

As mentioned, I’m used to my very simple, slow charger, which trickle charges at constant voltage with a maximum current output of 3A. The Xantrex charger I’m using is quite a bit more grunty than that. So re-visiting the LDOs was necessary, and there I have some good results, albeit with a trade-off in efficiency.

Ahh well, can’t have it all.

I can run without the little controller, as right now I have no panels. Well, I’ve got one, a 40W panel, which puts out 3A on a good day. That’s a good match for my homebrew charger when charging batteries in the field, but not a good match for a cluster that idles at 5A. I could just plug the charger directly into the battery and be done with it for now, deferring this until I get the panels.

But I won’t.

I’ve been giving some thought to two things: the controller and the rack. On the rack, I found I can get a cheap one for $200. That is cheap enough to be considered disposable, and while, sure, it’s meant for DJ equipment, two thoughts come to mind:

  • AV equipment, with all its audio transformers and linear power supplies, is generally not light
  • It’s on wheels, meant to be moved around… think roadies and such… not a use case that is gentle on equipment

Thus I figure it’ll be rugged enough to handle what I want, and is open enough to allow good airflow. I should be able to put up to 3 AGM batteries in the bottom, the 3-channel charger bolted to the side, with one charge controller per battery. There are some cheap 30A Schottky diodes, which would let me parallel the batteries together to form one redundant power supply.

The downside is that this would drop about 20-25W through the diode. Alternatively, I could make another controller that just chooses the highest-voltage source, with a beefy capacitor bank to handle the switch-over. Or I could parallel the batteries together directly, something I am not keen to do.
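As a sanity check on that figure, here’s the sort of back-of-envelope sum involved. The ~0.7V forward drop is an assumption for a generic 30A Schottky near full current, not a number from any particular datasheet:

    #include <stdio.h>

    /* Rough OR-ing diode loss: P = I * Vf.  The 0.7V forward drop is an
     * assumed figure for a generic 30A Schottky near full current; the
     * real part's datasheet will give the proper curve. */
    int main(void)
    {
        const double vf = 0.7;                       /* assumed volts          */
        const double amps[] = { 5.0, 20.0, 30.0 };   /* idle, charging, rated  */

        for (int i = 0; i < 3; i++)
            printf("%4.1fA through the diode -> %4.1fW lost as heat\n",
                   amps[i], amps[i] * vf);
        return 0;
    }

At the cluster’s 5A idle that’s only a few watts, so the 20-25W figure is really the worst case at the diode’s full rating.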

I spent some time going back to the drawing board on the controller. The good news is that the existing hardware will do the job, so no new board is needed. I might even be able to simplify the logic, since it’s the battery voltage that matters, not the source voltages. But right now I need to run, so perhaps tomorrow I’ll go through the changes. 😉
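In the meantime, the simplified decision really is just hysteresis on the battery voltage. Below is a minimal sketch of the idea only; the thresholds are placeholders along the lines of the set-points mentioned in these posts, not the values in the real firmware:

    #include <stdio.h>
    #include <stdint.h>
    #include <stdbool.h>

    /* Placeholder thresholds in millivolts. */
    #define VBATT_LOW_MV   12000   /* bring the charger in below this */
    #define VBATT_HIGH_MV  13600   /* drop the charger out above this */

    /* Pure hysteresis: given the battery voltage and the charger's current
     * state, decide whether the charger should be on. */
    static bool charger_should_run(uint16_t batt_mv, bool currently_on)
    {
        if (batt_mv < VBATT_LOW_MV)
            return true;            /* battery sagging: charge */
        if (batt_mv > VBATT_HIGH_MV)
            return false;           /* battery full enough: stop */
        return currently_on;        /* in the dead band: leave it alone */
    }

    int main(void)
    {
        /* Walk a pretend battery down and back up to show the dead band. */
        const uint16_t samples[] = { 12400, 11900, 12500, 13400, 13700, 13200 };
        bool on = false;

        for (unsigned i = 0; i < sizeof(samples) / sizeof(samples[0]); i++) {
            on = charger_should_run(samples[i], on);
            printf("%umV -> charger %s\n", (unsigned)samples[i], on ? "ON" : "off");
        }
        return 0;
    }

The source voltages only matter insofar as the charger or panels need to be healthy enough to lift the battery; the decision itself needs nothing more than the battery rail.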

Solar Cluster: Re-working the node power regulators

So… in the last test, I tried setting up the nodes with the ATTiny24A power controller attempting to keep the battery between 11.8 and 13.8V.

This worked… moreover it worked without any smoke signals being emitted.

The trouble was that the voltage on the battery shot up far faster than I was anticipating. During a charge, as much as 15.5V is seen across the battery terminals, and the controller was doing exactly as programmed: it shut down power the moment it saw the high-voltage set-point exceeded.

This took all of about 2 seconds. Adding a timeout helped, but it still cycled on-off-on-off over a period of 10 seconds or so. Waay too fast.

So I’m back to making the nodes more tolerant of high voltages.

The MIC29712s are able to tolerate up to 16V being applied with peaks at 20V, no problem there, and they can push 7.5A continuous, 15A peak. I also have them heatsinked, and the nodes so far chew a maximum of 3A.

I had set them up to regulate down to approximately 13.5V… using a series pair of 2.7kΩ and 560Ω resistors for R1, and a 330Ω for R2. Those values were chosen as I had them on hand… 5% tolerance ¼W carbon film resistors. Probably not the best choice… I wasn’t happy about having two in series, and in hindsight, I should have considered the possibility of value swing due to temperature.

Thinking over the problem during the last week or so, the issue seemed to lie in this set point: I was too close to the upper bound, and so the regulator was likely to overshoot it. I needed to knock it back a peg. It turns out there were better options for my resistor selection without resorting to a trim pot.

Normally I stick to the E12 range, which I’m more likely to have lying around. The E12 series goes …2.7, 3.3, 3.9, 4.7, 5.6… so the closest I could get was by combining resistors. The E24 range, however, includes values like 3.0 and 3.6.

Choosing R1=3.6kΩ and R2=390Ω gives Vout ≈ 12.7V. Jaycar sell 1% tolerance packs of 8 resistors at 55c each. While I was there today, I also picked up some 10Ω 10W wire-wound resistors… before unleashing this on an unsuspecting AU$1200 computer, I’d try it out with a dummy load made with four of these resistors in parallel… making a load that would consume about 5A for testing.
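For the record, here’s the arithmetic. The 1.240V figure is the adjust-pin reference I’m assuming for the MIC29712, so check the datasheet before trusting it:

    #include <stdio.h>

    #define VREF 1.240   /* assumed MIC29712 adjust-pin reference voltage */

    /* Adjustable LDO output: Vout = Vref * (1 + R1/R2) */
    static double ldo_vout(double r1, double r2)
    {
        return VREF * (1.0 + r1 / r2);
    }

    int main(void)
    {
        double v_old = ldo_vout(2700.0 + 560.0, 330.0);  /* original series pair */
        double v_new = ldo_vout(3600.0, 390.0);          /* single E24 values    */

        printf("old divider: %.2f V\n", v_old);          /* ~13.5 V */
        printf("new divider: %.2f V\n", v_new);          /* ~12.7 V */

        /* Dummy load: four 10 ohm 10 W resistors in parallel = 2.5 ohm */
        printf("dummy load draw at %.1f V: %.1f A\n", v_new, v_new / 2.5);  /* ~5 A */
        return 0;
    }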

Using a variable-voltage power supply, I found that the voltage could hit 12.7V but no higher… and was at worst 0.7V below the input. Good enough.

At 16V in, the regulator would be dropping 3.3V while passing a worst-case 3A, for a power dissipation of about 10W out of the 48W total draw. That’s about 80% efficiency.

Not quite what I had hoped for… but this is a worst case scenario, with the nodes going flat chat and the battery charger pumping all the electrons it can. The lead acid battery has a nominal voltage of 13.8V… meaning we’re dropping 1.1V.
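Spelling those two cases out explicitly (the same numbers as above, just run through end to end):

    #include <stdio.h>

    /* Linear regulator: whatever isn't delivered to the load is burned as
     * heat, so efficiency is simply Vout/Vin (ignoring ground-pin current). */
    static void ldo_case(const char *label, double vin, double vout, double amps)
    {
        printf("%s: drop %.1fV, %.1fW of %.1fW in lost as heat, ~%.0f%% efficient\n",
               label, vin - vout, (vin - vout) * amps, vin * amps,
               100.0 * vout / vin);
    }

    int main(void)
    {
        ldo_case("charger flat out, 16.0V in", 16.0, 12.7, 3.0);  /* ~10W, ~79% */
        ldo_case("battery nominal, 13.8V in ", 13.8, 12.7, 3.0);  /* ~3W,  ~92% */
        return 0;
    }

So at the nominal 13.8V we’re back above 90%, and the ugly figure only applies while the charger is flat out.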

On a related note, I also overlooked this little paragraph in the motherboard handbook:

(*Do not use the 4-pin DC power @J1 when the 24-pin ATX Power @JPW1 is connected to the power supply. Do not plug in both J1 and JPW1 at the same time.)

Yep, guess what I’ve done. Being used to motherboards that provided both connectors and needed both, I plugged them both in.

No damage done, as all nodes work fine… (or they did last time I tried them… I have yet to fire them up since this last bit of surgery). It is possible there is no isolation between the on-motherboard PSU and the external ATX one, and that if you did feed in power from two different sources, you could get problems.

In a way if I had spotted this feature before, I could have done without those little PSUs after all, just needing a Molex-style power adaptor cable to plug into the motherboard.

Still… this works, so I’m not changing it. I have removed that extra connection though, and they’ve been disconnected from the PSUs so they won’t cause confusion in future.

I might give this a try when things cool down a bit … BoM still reports it being about 32°C outside (I have a feeling where I live is a few degrees hotter than that) and so I don’t feel energetic enough to drag my cluster out to the workbench just now. (Edit: okay, I know…those in NSW are facing far worse. Maybe one of the mob in New Holland should follow the advice of Crowded House and take the weather with them over here to the east coast! Not all of it of course, enough to cool us off and reduce their flood.)

Solar Cluster: Load testing… with the new power modules

Well, I’ve finally dragged this project out and plugged everything in to test the new power modules out.

I’ll be hooking up the laptop and getting the nodes to do something strenuous in a moment, but for now I have them just idling on the battery, with the battery charger being switched by the charge controller, built around an ATTiny24A and, this time, a separate power module rather than one integrated on the controller board.

I’ve had it going for a few hours now… and so far, so good. The PSU is getting turned on and off more often than I’d like, but at least the smoke isn’t escaping now. The heatsink for the power modules is warm, but still not at the “burn your fingers off” stage.

That to me suggests the largish heatsink was the right one to use.

Two things I probably need to address though:

  • In spite of the LDOs, the acceptable voltage range of the computers is still rather narrow… I’m not sure if it’s just the IPMI BMC being fussy or if the LDOs need to be knocked down a peg to keep the voltage within limits. Perhaps I should use the same resistor values as I did for the Ethernet switch.
  • The thresholds seem to get reached very quickly, which means the timeouts still need lengthening. Addressing the LDO settings should help with this, as it’ll mean I can bump my thresholds higher.

If I can nail those last two issues, then I might be at risk of having the hardware aspect of this project done and having a workable cluster to do the software side of the project. Shock horror!

Solar Cluster: Separating concerns

So, my last attempt at a fully integrated power controller was a smouldering failure. Q7 decided it wasn’t happy about where things were going, and let the world know the only way it knew how: smoke signals!

Curiously, only one of the MOSFETs in use seems to be damaged. When we look at its mate on the other side, Q2, sure, it’s discoloured, which could be an indication that it has been stressed, or maybe it just got burned by the other MOSFET.

It goes without saying that the pair in the background are fine: no current flowed through them during this test.

This got me thinking, did it get too hot, or did something else go wrong? This is the schematic and PCB layout of that part of the board.

Now looking at it, one thing strikes me. Seems I might have the source pins connected back-to-back, not the drain pins. Could that be it? I think I intended it the other way but didn’t pay enough attention to the schematic symbol. This could be a factor, or maybe the other MOSFET might’ve blown instead.

One thing is certain: with the tabs tied to the drains, I cannot bolt the two of them to the same heat-sink like I was intending with this schematic. Mea culpa!

Thinking about the design, the idea of putting the MOSFETs on the controller board itself might be a tad naïve, as they are by far the most likely component on the board to fail. The concept of a separate power module would work better, since it’s then easy to just rip one out and put another in its place. Designed right, it could even be hot-pluggable, incorporate a heat-sink, and make use of larger or smaller MOSFETs for different applications.

Luckily, the existing board layout will accommodate this just fine. We put 3-pin KK connectors in place of Q2 and Q4, and we can jumper across Q7 and Q8. This means the feed to the battery only needs to be a light-gauge wire, sufficient to power the controller and measure the battery voltage.

The pin-out isn’t ideal for this: it would be better to have a pin connecting to 0V instead of the battery +12V, but it’s workable. The gate pin becomes an open-collector output, and can theoretically drive (low-current) relays or MOSFETs.

As for what to build this power module on? Well, without going and buying a heat-sink, I’m spoiled for choice:

All of these dwarf the TO-220 package transistor I’m using… okay the one shown is an IRF-9540, but it’s still a TO-220 put there for scale reference. Most of these are for Intel CPUs that are long obsolete, and the top right has the CPUs in question still firmly attached.

The Pentium CPU heat-sink/fan would be the closest in size. I was hoping I might’ve had a 486 heat-sink lying around; I’m of the opinion that if the power module needs a fan on any of these heat-sinks, I’m doing it wrong. That might not be the case if I wanted the full 70A capability, but I’m pushing for 30A, which is less than 50% of that rating.

The only passive ones I have, and happen to have multiple of, are the ones on the lower right, which were extracted from dead Netgear (Bay Networks) switches. The BGA package still stuck to one of them is a Broadcom BCM5308A2KTB Ethernet switch SoC… it talked to a couple of SRAM packages (duly harvested) and a number of Ethernet PHYs.

The thought is that two MOSFETs could be fixed to the underside with a small PCB. (Well okay, there’s room for all four, but then I’ve got to somehow electrically insulate the two pairs.)

A connector of some sort, either a PCB edge connector or perhaps specially keyed Anderson Power Pole connector pairs (which can be rotated 90°), could connect power and control in one secure mounting. Two 30A connectors and one 15A connector would serve this job well, and the housings come in a range of colours, so I can avoid the typical red/black colouring and prevent confusion.

Solar Cluster: Full load test: FAIL

So, I drag the cluster, battery and 20A charger out to the deck to do a full load test. This is the first time I’ve fired this newly built controller on a full load. Uncharted territory so far.

Here’s the set up.

The charge controller that I built earlier is in a nice shiny re-purposed case that I had laying around. Just the right size too. These were USB extender devices that my father’s workplace had used in a project: they wanted the innards, so we got the empty boxes. I just mounted the PCB on a small piece of plastic to insulate it from the case, and routed my DC leads out through a hole in the case originally intended for an RJ-45 jack.

At this point, everything is humming along fine. Our battery charger is on stand-by.

The meter is showing a sedate 12.4V and the controller is happy with this. That said, I’ll have to work on the visibility of those LEDs. The two on the power MOSFET control lines are off at this stage.

So I give the system a bit of curry. I transfer a copy of a Linux kernel git repository to each, tell them to update their working copies from that, and build a version of kernel v4.8.5. This makes the current jump up to about 10A. So far so good; the battery is holding.

Then, about 30 seconds in, the controller decides the voltage is getting a bit too low, so it kicks the charger on. There are a few false starts, as the charger delays its start-up a bit. Eventually though they get into sync and charging starts. At this point, the charger is carrying the load of both the battery and the cluster.

Great. So it continues for a minute, then the controller decides it wants to shut the charger down, which it does, followed by a moment of oscillation. It seems the controller is too impatient for the charger while waiting for the power to come on… but then… what’s that smell???

Oops, guess that MOSFET got just a little too hot. I was hoping to avoid the need for heatsinks by over-dimensioning the MOSFETs. These are supposedly able to take 70A; I realise that’s with a heatsink, but I thought that at 20A they would be able to cope.

One somewhat roasted MOSFET says otherwise. Interestingly, only one has visible marks; its mate looks okay, but likely isn’t.

The MOSFETs don’t have to be mounted directly on the PCB; we can relocate them to wherever a heatsink will fit, if I can’t squeeze one in between the two where they are. The original thought was that each pair would have a heatsink between them. More pondering to do, it seems.
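For next time, here’s the back-of-envelope sum I should have done before skipping the heatsinks. The on-resistance and the free-air thermal figure are illustrative typical values, not numbers from the actual part’s datasheet, and remember there are two MOSFETs back-to-back in the path, so whatever this gives is doubled:

    #include <stdio.h>

    /* Conduction loss in a fully-on MOSFET: P = I^2 * Rds(on).
     * Rds(on) and the junction-to-ambient figure are assumed "typical"
     * values for illustration only. */
    int main(void)
    {
        const double rds_on = 0.020;   /* assumed 20 milliohm on-resistance   */
        const double rth_ja = 62.0;    /* typical bare TO-220, deg C per watt */
        const double amps[] = { 5.0, 10.0, 20.0 };

        for (int i = 0; i < 3; i++) {
            double p = amps[i] * amps[i] * rds_on;
            printf("%4.1fA -> %4.1fW -> ~%3.0f deg C rise with no heatsink\n",
                   amps[i], p, p * rth_ja);
        }
        return 0;
    }

Because the loss goes with the square of the current, a part that shrugs off 5A can be in real trouble at 20A without somewhere for the heat to go.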

Solar Cluster: Light testing of the charge controller

So, late yesterday afternoon, I devised a light test of the controller to see how it would perform.

For this I disconnected all but one of the nodes, and hooked up one of my old 10Ah LiFePO₄ packs and my mains-powered 3A charger. The LM2576-based charger is just able to hold this load and provide 1A of charging current.

The first thing I noticed is that the fan seemed to turn on and off a lot… this could be a difference in the temperature sensors between the DIP version of the ATTiny24A that the prototype used and the SOIC version which the new controller used.

The test ran overnight. The node basically was idling, as were the two Ethernet switches. But, it served the purpose. I now know the logic is sound, although I might want to adjust my set-points a little.

That’s the output data from a small digital power meter that was hooked up in circuit. This device is unable to display negative current, so the points at which the battery was charging are shown as 0A. The left axis is voltage, the right is current. You can see that the charger gets brought in when the battery dips below 12V and clicks off just before 13.2V.

I can probably go a little higher than that, maybe about 13.6V. I may also need to re-visit the fixed resistor settings on the linear regs inside the nodes to knock them down a few more pegs to prevent the BMCs whining about the high voltage.

Next weekend, I might consider hooking up the 20A mains charger and giving it a full load test.

Solar Cluster: Boards Arrived

So, after a couple of email enquiries, the truth is unveiled. As it turns out, Swiss Post send parcels via one or more transit countries which, due to a quirk in their tracking UI, may appear as the destination.

Asendia were in touch last night informing me that: “Please kindly note that sometimes tracking website will show the wrong destination. Canada is only the transit country.” Ahh okay, no problem then. Confusing, but that’s fine. 🙂

I know now for later not to panic if it says Canada.

This morning, a surprise parcel arrived at work, containing 6 PCBs. Bonus! Guess I had better get the other parts on order. I won’t have them ready for this weekend, but I predict much solder smoke in my future next weekend.

Solar Cluster: Getting PCBs made

So a few weeks ago, I gave the charge controller a test, seeing if it in fact reacted in the manner I expected before deciding whether to proceed with the existing prototype or whether I should iterate the design.

In the end, I decided I’d tweak the design and get new boards built. By using SMD parts and a 4-layer board, I was able to shrink my design down to a 5×5cm square, which is relatively inexpensive to have fabricated.

I’ll be getting a few boards which means I can have some spares in case something goes bang or if I want to scale out my battery bank.

The updated design is published in the files section. This also incorporates @K.C. Lee‘s advice regarding back-to-back MOSFETs.

After some fun and games, with one PCB fab house telling me to “check my passwords match” (when I know for certain that they did) and another seemingly ignoring the inner two layers, I settled on a PCB manufacturer (thanks to PCBShopper) and got the boards ordered.

I put down my home address for the billing address and my work address as the delivery address. Both given as being in “Queensland, Australia”.

This is a learning experience for me: I’m used to just drawing my circuits out with a Dalo pen, but unfortunately my skills aren’t up to producing a board for SOICs that way.

They reported that they shipped the boards on the 21st, and had previously estimated about 2-3 weeks for delivery. No problem there.

Just one niggling concern…

Not familiar with Swiss Post procedures, I’d have expected it to show Hong Kong → Australia, but maybe that’s how they do things. I do hope someone didn’t get Queensland, Australia mixed up with Quebec, Canada!

Update: Just been in touch; no, the manufacturer didn’t get it mixed up, and it’s the right tracking number. They’re chasing it up with Swiss Post.

Solar Cluster: Load test using the power controller

So, last night I started doing some light testing of the power controller. I installed a 5A fuse and hooked it up to a 10Ah LiFePO₄ battery and a homebrew 3A charger (LM2576-based with a 16V IBM laptop PSU as mains source) set to its maximum voltage (~15V).

The controller came to life and immediately started flashing its “high voltage” and “low temperature” LEDs, indicating that:

  • It thought the temperature was high enough to warrant a fan turning slowly (above ~20°C)
  • It thought the battery voltage was too high for comfort (IPMI complains when the voltage gets much above 13.6V.)

In order to get the battery to discharge, I plugged in an old Icom IC-706MkIIG transceiver and set it to receive. I didn’t have an antenna attached, so this would have represented a very light load compared to what production use would be.

The battery started discharging, and after a few tens of minutes, the high voltage warning LED had stopped flashing and instead the “good voltage” LED was staying constantly on. So the battery was in a range the controller was happy with.

It was going to take a long time though, for that set to drain a 10Ah battery in receive mode.

This morning, I got out the big guns. I plugged the actual cluster in and fired it up. After about 30 minutes of run time, the battery had drained sufficiently that the controller started flashing the “good voltage” LED, indicating the battery was getting low. It had also turned on the MOSFET controlling the charger I had plugged in, and the charger was desperately trying to keep up with the ~5A load that the cluster was drawing. (Did I mention this was a 3A charger?)

So far so good. I powered off the cluster and unplugged it. The controller continued to let the charger do its job, and after another short while, the battery had regained some charge. It kept the charger on until, momentarily, the voltage peaked over the “high voltage” threshold. The “high voltage” LED blinked a few times, then the MOSFET turned off and the “good voltage” LED remained solid.

The battery was sitting at 13V. A little short of my 13.5V set-point, but close enough given it was on a fairly weak charger. So that ATTiny24A is doing exactly what I intended.

Maybe the high set-point could do with some adjustment, or a turn-off delay added (with a separate “high critical” set-point for immediate shut-off), but so far, so good.
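Roughly what that tweak would look like, sketched with made-up set-points and a made-up poll rate rather than anything in the current firmware:

    #include <stdio.h>
    #include <stdint.h>
    #include <stdbool.h>

    /* Made-up set-points (millivolts) and hold-off, just to show the shape. */
    #define VBATT_HIGH_MV      13600  /* above this: plan to stop charging     */
    #define VBATT_CRIT_MV      14500  /* above this: stop charging immediately */
    #define HIGH_HOLDOFF_TICKS 5      /* polls the "high" state must persist   */

    static uint16_t high_ticks;

    /* Returns true when the charger should be switched off this poll. */
    static bool charger_overvolt_check(uint16_t batt_mv)
    {
        if (batt_mv >= VBATT_CRIT_MV)
            return true;                                /* critical: cut it now */
        if (batt_mv >= VBATT_HIGH_MV)
            return ++high_ticks >= HIGH_HOLDOFF_TICKS;  /* high: wait it out    */
        high_ticks = 0;                                 /* in range: reset      */
        return false;
    }

    int main(void)
    {
        /* A brief excursion above the high set-point, then a critical spike. */
        const uint16_t samples[] = { 13500, 13700, 13650, 13400, 14800 };

        for (unsigned i = 0; i < sizeof(samples) / sizeof(samples[0]); i++)
            printf("%umV -> %s\n", (unsigned)samples[i],
                   charger_overvolt_check(samples[i]) ? "CHARGER OFF" : "keep charging");
        return 0;
    }

The brief peak gets ridden out, while anything genuinely scary still drops the charger straight away.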

It might be worth me getting some PCBs fabricated and shrinking the prototype board down using SMD parts, but this controller works so far.

Solar Cluster: Power controller firmware taking shape

So after a long hiatus, some of it involving some yak shaving (e.g. the #Open-source debugWire debugger… yes, I’ll get back to that), I managed to get a version of the firmware together for the power controller that seems to be doing what I ask of it.

The means of overcoming the road block was knocking up a very crude (and slow!) UART driver so I could print data out on the serial port. I avoided doing this previously because I didn’t have an easy way to interface to a TTL serial port. Recently though, I bought some FTDI serial cables, one 5V and one 3.3V, so now I had little excuse.

I feel these will give me some valuable insights into tackling the debugWire project.

I was able, though, to bit-bang a UART using avr-libc’s _delay_us and get a respectable 4800 baud serial stream out. This obviously dropped to 300 baud when I had other tasks running, but still, that’s enough to do what I’m after. (Once upon a time, that was considered fast!)
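The transmit side of that is about as simple as serial gets. What follows is a reconstruction of the idea rather than the code I actually ran; the clock speed, port and pin are all assumptions:

    #define F_CPU 8000000UL               /* assumed clock; the real fuses may differ */

    #include <avr/io.h>
    #include <util/delay.h>

    #define SOFT_UART_PORT PORTA
    #define SOFT_UART_DDR  DDRA
    #define SOFT_UART_BIT  PA5            /* assumed TX pin */
    #define BIT_TIME_US    (1000000UL / 4800UL)   /* ~208us per bit at 4800bd */

    static void soft_uart_init(void)
    {
        SOFT_UART_DDR  |= _BV(SOFT_UART_BIT);
        SOFT_UART_PORT |= _BV(SOFT_UART_BIT);     /* idle high */
    }

    /* 8N1, LSB first.  Timing is only as good as _delay_us and whatever else
     * runs in the middle of a byte -- hence 4800bd sagging to 300bd once the
     * rest of the firmware got busy. */
    static void soft_uart_tx(uint8_t c)
    {
        SOFT_UART_PORT &= ~_BV(SOFT_UART_BIT);    /* start bit */
        _delay_us(BIT_TIME_US);

        for (uint8_t i = 0; i < 8; i++) {
            if (c & 1)
                SOFT_UART_PORT |= _BV(SOFT_UART_BIT);
            else
                SOFT_UART_PORT &= ~_BV(SOFT_UART_BIT);
            c >>= 1;
            _delay_us(BIT_TIME_US);
        }

        SOFT_UART_PORT |= _BV(SOFT_UART_BIT);     /* stop bit */
        _delay_us(BIT_TIME_US);
    }

    int main(void)
    {
        soft_uart_init();
        for (;;) {
            soft_uart_tx('U');                    /* 0x55 makes a handy test pattern */
            _delay_ms(100);
        }
    }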

After figuring out where I was going wrong… perhaps I had been sniffing too much solder smoke that day… I re-wrote my firmware, using this UART library as a means of debugging the code. I set up Timer1 to run at 1.2kHz, which meant I could also use it as a baud rate generator for my software UART, upping the baud rate to 1200bps.
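The timer side of that, again reconstructed and again assuming an 8MHz core clock, comes down to Timer1 in CTC mode with OCR1A set to the clock divided by the tick rate:

    #define F_CPU 8000000UL               /* assumed core clock */

    #include <avr/io.h>
    #include <avr/interrupt.h>
    #include <stdint.h>

    #define TICK_HZ 1200UL                /* 1.2kHz: task tick and soft-UART bit clock */

    static volatile uint8_t tick;         /* set by the ISR, consumed by the main loop */

    static void timer1_init(void)
    {
        TCCR1A = 0;                                /* no output compare pins      */
        TCCR1B = _BV(WGM12) | _BV(CS10);           /* CTC mode, clk/1             */
        OCR1A  = (uint16_t)(F_CPU / TICK_HZ) - 1;  /* 8MHz / 1200 ~= 6666 counts  */
        TIMSK1 = _BV(OCIE1A);                      /* interrupt on compare match  */
    }

    ISR(TIM1_COMPA_vect)                  /* note: TIM1_, not TIMER1_, on the tiny24 */
    {
        tick = 1;                         /* one bit-time / one scheduler tick */
    }

    int main(void)
    {
        timer1_init();
        sei();
        for (;;) {
            if (tick) {
                tick = 0;
                /* ...shift out the next soft-UART bit, poll the ADC, etc... */
            }
        }
    }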

Some further work on a breadboard, and I had more-or-less working firmware.

I’ve thrown the code up on GitHub; it’s very much in a raw state. I might do a second revision of the PCB, since this prototype seems to be more or less on the money now.