Balcony solar panels can save 30% on a typical household’s electricity bill, and with vertical surface area in cities exceeding roof space, the appeal is clear.
The transformers, power lines, roads, trucks, and maintenance teams needed to move power from large-scale plants to houses don’t grow on trees either, and if maintenance in remote places doesn’t happen, it can burn a lot of them.
Sometimes large-scale plants make sense, but as the backup to microgeneration, in places where the cost of the infrastructure to move power from unpopulated to populous areas is justified.
I’m also a fan of less inverted power in microgeneration, though. More and more of our power usage is DC anyway, so there’s less need to convert to AC as much, IMHO, but that’s my far more radical take.
Microgeneration purely in DC only really makes sense in things like campers and RVs, where you’re going to be using primarily nearby, low-power-consumption devices.
AC is still better, and modern switching technology, while still fairly expensive, is considerably more efficient now. If you’re doing AC you also get a number of other benefits: practically every existing appliance and device runs on AC voltages, the entire standard around electricity and home wiring is built on AC mains, and all of the accessible hardware is produced for AC mains too. Not that you can’t use it for something else, it’s just not intended for that.
Certain appliances use induction motors and similar tech based directly on the AC sine wave (clocks, for example, often use the grid frequency to keep time). You could still run them on DC, it’s just significantly sillier. Transmission efficiency is also a BIG loss for low-voltage DC (even now with modern solid-state switching components, it’s still not ideal); granted, that’s less of a problem at micro-grid scale, but it’s still a concern and a potential restriction. Nothing beats the simplicity and reliability of a simple wire-wound, iron-core transformer. There are a handful of other technical benefits, and drawbacks, as well, but they’re fairly minor.
Having a dedicated DC supply side might be nice for a home environment, but the question is what you standardize on. DC/DC voltage conversion is already fairly efficient. Converting from AC to DC is incredibly easy and not particularly inefficient at lower power consumption; it’s more of a problem with higher-draw devices. But you can easily get around that by converting down from a higher voltage.
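The transmission-loss point in the last two comments can be put into numbers: resistive loss scales with the square of the current, so raising the bus voltage for the same delivered power cuts wire loss quadratically. A minimal sketch; the wire gauge, run length, and load are made-up illustrative values, not recommendations from the thread:

```python
# Rough I^2*R loss comparison: same power, same copper, different DC bus voltages.

def line_loss_watts(power_w, voltage_v, wire_ohms_per_m, run_m):
    """Resistive loss in a two-conductor run delivering power_w at voltage_v."""
    current = power_w / voltage_v              # I = P / V
    resistance = wire_ohms_per_m * run_m * 2   # out and back
    return current ** 2 * resistance           # P_loss = I^2 * R

# ~12 AWG copper is roughly 0.0052 ohm/m; 15 m (~50 ft) one-way run, 500 W load.
for v in (12, 48, 120):
    loss = line_loss_watts(500, v, 0.0052, 15)
    print(f"{v:>3} V bus: {loss:6.1f} W lost ({loss / 500:.1%})")
```

At 12 V this example load loses over half its power in the wire (in practice the wire would just overheat), which is exactly why higher bus voltages, whether 48 VDC or AC mains, win for anything beyond short runs.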
Agreed. I may be a radical DC-home evangelist, but yeah, AC still has its place, and it being THE standard for home appliances is a good example of the power of scale.
So far for my home usage I’m standardizing on 48 VDC, because that’s the last multiple of 12 before you go above OSHA’s low-voltage threshold.
From there I really want to standardize further on the USB Power Delivery spec, because I just love the idea of a smart grid for my home. I can then have dispersed batteries around the house, either in devices whose primary benefit is being portable but that don’t always need to be (laptop, power-tool batteries, little robot things, the car, etc.), or as a way to shave some crazy limited-time power draws (servers starting up, the oven running for an hour a day, etc.).
From there, maybe just micro-adapters for a few standard circuits so the outlets work the same.
Yeah, I’m definitely not as aggressive on that, but then again I also don’t really like having a lot of things on my network, or connected to my grid, so I suppose I just sort of optimize that problem out. Plus, like I said, convenience: running 120 V and 240 V is going to be significantly more beneficial for me, since I primarily use high-wattage devices that benefit from more efficient transmission and conversion (servers and any high-power switching power supply, basically).

I’ve thought about doing a low-voltage network, but that really only seems like it’s going to be a bigger mess for no real significant gain. I have to have central DC conversion and regulation now? I’m just not sure it’s worth it, unless I’m pulling straight from a dedicated battery bank or something, but that doesn’t really make sense to me. I might end up using lower-voltage LED products for a lot of lighting, but I think I would rather have a handful of high-quality, high-efficiency power supplies than one global supply and some weird 48 V system where I need to convert from AC natively (unless I’m doing some really weird stuff) and then down/up convert for each device as needed. It seems like a bit much for removing the AC conversion part of the problem, but that’s just me, I guess.
One of the nice things about 120/240 is that our grid is sort of designed for it, so there are some clever ways to utilize it. Certain plug specs use both hot legs plus neutral (and ground), so you can technically pull both 120 V and 240 V out of a single receptacle, which is quite the trick. You could also fairly easily wire both into more standardized outlet receptacles (although I don’t know what the electrical code looks like for that one).
My ultimate goal would be a decentralized, off-grid production/storage solution, so high efficiency at higher draws is going to be really important, as is the ability to standardize on a widely accepted voltage standard. The only real advantage I can think of for a DC grid is that it would be safer, but that’s largely a solved problem, so I don’t know.
Personally I’m not huge on smart-grid stuff, though I do like the idea of smart grid management: being able to do “useful” things with excess generated power, or pull from storage banks at will under a rule set defined in a smart-home system, is way too convenient to ignore.
Reducing the money spent on DC-AC conversion is my main thought. If my generated power is all DC, my battery storage is all DC, my servers are DC, my lights and water pumps can be DC, and my car is DC, then converting to AC just to switch back to DC 20–40 feet later just doesn’t make sense to me.
I’d actually like to find a better formula than the napkin math I’ve done, to say when it does and doesn’t have a benefit.
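That napkin math can be made slightly less napkin-y by treating each conversion stage as a multiplicative efficiency and comparing the chains. The stage efficiencies below are placeholder assumptions for illustration, not measured values for any real inverter or PSU:

```python
# Sketch of the "when does staying DC win" comparison.
# All efficiency numbers are assumed placeholders.

def delivered_fraction(*stage_efficiencies):
    """Fraction of source power that survives a chain of conversion stages."""
    frac = 1.0
    for eta in stage_efficiencies:
        frac *= eta
    return frac

# DC source (panel/battery) -> inverter -> AC mains -> device AC/DC PSU:
ac_path = delivered_fraction(0.95, 0.90)
# DC source -> one DC/DC converter -> device DC rail:
dc_path = delivered_fraction(0.95)

print(f"AC path delivers {ac_path:.1%}, DC path delivers {dc_path:.1%}")
print(f"Staying DC saves {dc_path - ac_path:.1%} of generated power")
```

Under these assumed numbers the all-DC path keeps roughly an extra tenth of the generated power; plugging in datasheet efficiencies at your actual load points would give the real break-even.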
I really want to get my hands on an Open Compute rack for my next server build and have the UPS and power rail all be 48 V too (as per the spec). Again: why have another component that can fail and draw power if I don’t need it?
Is the multiple of 12 thing just for ease of lead-acid storage?
Partly a remnant of that. Being able to use standard lead-acid batteries is a perk, but primarily I find that equipment in that ~20–50 VDC range comes in those 12 V increments too. With Power Delivery (PD) Extended Power Range (EPR) going up to 48 V right now, the upper fixed voltages in that spec (36 V and 48 V) again land on multiples of 12, matching where the industry is now.
With Adjustable Voltage Supply (AVS) it might matter less (since it can step in 100 mV increments instead of a few fixed voltages), but I haven’t messed with that myself yet.
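For reference, as I understand the USB PD 3.1 spec, the fixed voltage rungs and the power each tops out at (assuming a 5 A-rated EPR cable; the lower rungs are limited to 3 A) work out as follows:

```python
# Max power at each USB PD 3.1 fixed voltage, assuming a 5 A-rated (EPR) cable.
fixed_levels = [
    # (volts, max_amps)
    (5, 3), (9, 3), (15, 3), (20, 5),   # standard power range (SPR)
    (28, 5), (36, 5), (48, 5),          # extended power range (EPR)
]
for volts, amps in fixed_levels:
    print(f"{volts:>2} V @ {amps} A -> {volts * amps:>3} W")
```

The 48 V / 5 A top rung is where the spec’s 240 W ceiling comes from, which is also what makes it line up so neatly with a 48 VDC house bus.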
The PC industry has been trying to get rid of the 3.3 V and 5 V rails for over a decade now, trying to get everyone on board with 12 V only. The only holdout in a modern PC should be SATA, at 5 V; the mainboard already doesn’t care and GPUs definitely don’t. Also, no -12 V any more. Any year now. Not that SATA will die that quickly, but the mainboard knows how many SATA connectors it has and can provide sufficient 5 V to power your disks.
PD’s default comms voltage is 5 V at the moment, too.
I’m for moving the default voltage up, but that’s a naive take on my part. It just sounds right; I have no idea of the actual pros and cons at that low a level, whether it messes with components, what insulation to expect, etc.