The “wall wart” is one of humanity’s worst inventions (not counting the inventions actually intended to kill and maim people, I’ll admit). AC-plug power supplies are a cheap workaround to various engineering, economic, and regulatory problems that manufacturers face, and they solve those problems by pushing them off onto end users. So what can we do about it? OSAlert takes a look at an ingenious workaround to the wall wart problem, and at some hopeful trends that might make them a thing of the past.
It’s generally more expensive to design a small electronic device with the power supply built in, especially if heat dissipation and electrical isolation are an issue, and it’s harder to get a product with a built-in power supply through CE and FCC approval than it is to buy an already-approved external power supply and build on that. And of course, since AC electrical infrastructure is such a crazy quilt of voltages and plug types around the world, it’s easier for manufacturers to design a single DC-based device and then include the proper adapter for each country they want to ship it to.
So now you know why every small electronic device has one of these. But what’s so bad about them?
- It essentially divides your device into two parts, and if you lose or damage the power supply, it won’t work until you find a replacement.
- There’s no standardization of voltage, polarity, or terminal type or size, so while it may be possible to use a wall wart from one device with another, there are so many variations that even if the plug fits, it may not work, or may even ruin the device.
- If you want to travel with a bunch of devices, particularly ones that need to be recharged, you can find yourself lugging around a whole briefcase full of awkward-to-store, bulky power supply cords.
- They are inefficient, turning precious electricity into useless heat. In many cases, the devices they power need just a trickle of current, but a torrent of electricity is burned off as heat, even when the device isn’t in use. More efficient power supplies exist, but they’re not as common since they’re more expensive.
- And finally, and most annoyingly, housing the power supply in an oversized black wall plug makes it difficult to plug several devices into the same outlet or power strip, and makes it easy for the plug to fall out if the wall wart is particularly heavy.
Wall warts have become such a universal nuisance that, besides giving them an insulting slang name, we’ve pretty much accepted them as part of how the world works and have developed ingenious ways of coping. You can buy a bunch of small “outlet saver” extension cords to make room on your power strip, or you can buy a strangely-shaped power strip, such as the squid, flying saucer, pyramid, or just plain big and bulky.
One of the most elegantly-designed wall-wart-friendly devices I’ve seen recently is the wall adapter from 360electrical. It’s a company that’s headquartered close to where I live, and I asked them to send me their latest product, the four-outlet rotating surge protector. As you can see from the picture below, each outlet rotates 360 degrees so your wall warts can be tilted away from each other. It’s the perfect accompaniment to the outlet behind my nightstand, where a lamp, cordless phone, mobile phone, and sometimes a laptop need to be plugged in. It replaced a very bulky traditional surge protector/power strip, and I love it.
If you’re really cool, you could install 360electrical’s built-in outlets into your wall (though I honestly don’t think they’re as compelling, partly because the 360-degree rotation matters most when all four outlets are in use, and partly because surge protection is worth having).
But enough capitulation to an unnecessary nuisance! What we really need to do is get rid of wall warts altogether. On this front, there is a very promising trend that’s worth examining: USB power. No, I’m not talking about the proliferation of ridiculous USB-powered desktop appliances, such as eye massagers, room fresheners, or foot warmers (see this link). I’m talking about regular, useful devices that draw their power from a standard USB outlet. The two that I have are the iPhone and the Jawbone Bluetooth headset. Each can be plugged into a computer’s USB port or an included (small) AC adapter. When I travel, I only need to bring the cables, which I can plug into my laptop at night. If I need one in the car, I only need an auto-USB adapter, not a special cable. Very handy. However, both of these devices are lacking the final step: they still require a special cable, because they use proprietary connectors. The iPhone, at least, uses the iPod connector, which is proprietary but ubiquitous. The Jawbone uses a nifty magnetic connector, but if I lose it, I’m screwed.
Back in February, at the GSMA Mobile World Congress, 17 handset manufacturers indicated that they would be standardizing their chargers on the micro-USB format. Then, last week, the International Telecommunication Union threw its weight behind the initiative.
Though this standardization push covers only phones, phone accessories and other small peripherals will undoubtedly follow, and the micro-USB plug is small enough to be a good fit even for featherweight accessories like Bluetooth headsets. Early in the process, RIM, Palm, and Apple weren’t on board, but now the newest RIM and Palm devices use micro-USB, and I’ve seen some reports that say Apple has signed on to the initial agreement, but no hard evidence that this is so. Apple is alone among handset manufacturers in that its decade-long drive to make the iPod connector ubiquitous has made things quite convenient for iPod and iPhone owners; buyers of any theoretical Apple device with only a micro-USB connector would therefore be at a major disadvantage, unable to use the thousands of iPod-enabled accessories. So Apple has two bad options: embrace the new standard and make a staggeringly large installed base of accessories obsolete, or ignore the standard and slowly watch accessories move to micro-USB, requiring Apple device users to carry an adapter.
Despite all this, even if not everyone can agree to have micro-USB on one end of the cable, if they can at least agree to have standard USB on the other end, it will make a big difference. If the trend continues, I could even imagine homes being wired with a centralized DC power supply sending power to wall-mounted USB power ports throughout the house. Now, USB is limited to 500 mA, so more power-hungry DC devices, such as halogen lamps, wouldn’t be able to use that kind of infrastructure, but many devices around the house would, such as all the various battery chargers, cordless phone bases, postage scales, clocks, and office appliances. Making this move would not only be convenient, but it would save a lot of electricity, replacing a houseful of inefficient, cheap power supplies with one efficient one. Perhaps even devices like TVs and audio equipment that are notorious for sucking power while on standby could be configured to draw from a central DC source on standby and only pull AC when in operation.
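To put a rough number on the “one efficient supply instead of a houseful of cheap ones” claim, here is a back-of-envelope sketch. The device list, load figures, and efficiency numbers are assumptions chosen for illustration, not measurements:

```python
# Rough sketch: energy used by several cheap adapters vs. one efficient
# central DC supply feeding the same loads. All values below are assumed.

loads_w = {                    # average draw of the powered devices, in watts
    "cordless phone base": 2.0,
    "phone charger": 3.0,
    "clock": 1.0,
    "postage scale": 0.5,
}

cheap_eff = 0.65               # assumed efficiency of an old linear wall wart
central_eff = 0.90             # assumed efficiency of one good switching supply
hours_per_month = 24 * 30

def monthly_kwh(total_load_w, efficiency):
    """Energy drawn from the wall per month for a given delivered load."""
    return total_load_w / efficiency * hours_per_month / 1000.0

total_load = sum(loads_w.values())
cheap = monthly_kwh(total_load, cheap_eff)
central = monthly_kwh(total_load, central_eff)

print(f"Cheap adapters: {cheap:.1f} kWh/month")
print(f"Central supply: {central:.1f} kWh/month")
print(f"Saved:          {cheap - central:.1f} kWh/month")
# Roughly 7.2 vs 5.2 kWh/month for this small set of loads -- modest per
# household, but it adds up across millions of homes.
```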
It’s bad enough that most people leave the darn things in the wall 24/7/365, drawing a bit of power every second. Think of that times 80 million. That’s a lot of energy.
The second bad part is that they’re made to operate at 50 Hz. You have to design them that way, since you can’t run a 60 Hz transformer on 50 Hz, but the other way around is, well, OK; you simply lose efficiency. So a lot of energy is lost even if you are smart enough to put them on a power strip where they belong.
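For what it’s worth, the reason a transformer designed for 60 Hz struggles on 50 Hz comes down to core flux: by the standard transformer EMF equation, peak flux scales roughly with V/f, so the same voltage at a lower frequency drives the core closer to saturation. A small sketch of that ratio, purely illustrative:

```python
# Peak core flux in a transformer scales as B_max ~ V / (4.44 * f * N * A).
# With voltage and construction held fixed, moving from 60 Hz down to 50 Hz
# raises the peak flux by the ratio 60/50.

f_design, f_used = 60.0, 50.0
flux_ratio = f_design / f_used
print(f"Peak flux at {f_used:.0f} Hz is {flux_ratio:.2f}x the 60 Hz design value")
# ~1.20x -- enough to push a tightly designed core toward saturation, which
# means extra magnetizing current and heat. The reverse (a 50 Hz design run
# on 60 Hz) lowers the flux, so it works, just not at its best efficiency.
```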
That said, the same goes for many other devices that use energy every second, on or off. TVs, microwave ovens, and radios are energy leeches.
(Didn’t GW Bush get laughed at for wanting to get rid of these wall warts?)
I keep reading these statements. And yet… my Kill A Watt fails to register even 1 watt of consumption on any of the units I have tested, as long as the device it is supposed to power is not plugged into it. I declare the “vampire electronics” status of modern wall warts to be a myth.
Most wall warts I have are warm to the touch when they’re plugged in, even if the associated device isn’t attached. If it’s warm, it’s got to be using at least some small amount of power – and if you multiply that minuscule amount by the millions of wall warts in use, it adds up to a sizable amount.
Well… I just picked a wall wart that I had plugged in and that felt warm. I plugged it into the Kill A Watt and it registers 0 watts. Looking at volt-amps, it registers 5 VA. And the power factor reading is 0.14. Current is 0.04 amps.
5 VA * 0.14 pf yields 0.7 watts. (That gives us a somewhat more precise figure to work with.)
0.7 watts * 24 hrs/day * 30 days/month = 504 watt-hours per month, or 0.5 kWh per month. Now… I’m pretty parsimonious when it comes to electric usage, and I still use about 650 kWh/month on average. Most households use a lot more. 0.5 kWh/month compared to 650 kWh/month is pretty insignificant, warm wall wart or no. I suggest that we stop giving people the false impression that if they unplug their wall warts when they are not using them, they can keep driving their SUVs.
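For perspective, the same arithmetic can be stretched to a houseful of adapters and a yearly bill. A quick sketch; the 0.7 W figure is the one measured above, while the adapter count and electricity rate are assumptions:

```python
# Generalize the measured ~0.7 W idle draw to many adapters over a year.
# The adapter count and price per kWh are assumptions, not measurements.

idle_draw_w = 0.7        # measured above for one warm wall wart
adapters = 20            # assumed number left plugged in around a house
price_per_kwh = 0.10     # assumed residential rate, USD

hours_per_year = 24 * 365
kwh_per_year = idle_draw_w * adapters * hours_per_year / 1000.0
print(f"{kwh_per_year:.0f} kWh/year, about ${kwh_per_year * price_per_kwh:.0f}/year")
# ~123 kWh/year, roughly $12 -- real, but small next to a 650 kWh/month bill.
```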
650 kWh/month is quite a lot to my standards.
Is that amount used by your household, or by you only?
If it’s for a five-person household, I would say it is not particularly energy-saving, but OK.
We have a 2 person household, and need 250 kWh/month, so that’s 125 for one person. We use electricity for “normal use”, and for “warm water”. No air conditioning is needed where I live, and we do not heat electrically.
Our household total energy consumption (every form of energy the household consumes) is approximately 500 kWh/month. So if you don’t use any other form of energy than electricity, and you are more than one person it’s probably OK.
I was wondering if someone would challenge me on that.
Well, it’s me, my dog, and my cat.
Here, we do need A/C. (Thermostat setting is 80-82 degrees F.) I keep the hot water turned off except for about 0.5 to 1.0 hours a day. (I turn it on 30 minutes before my shower, and turn it off afterwards. It’s set to 115 degrees F, which is as low as the thermostat goes.) We need heat during the winter, too. Heat pump. I live in a townhouse, and keep the upstairs doors and vents blocked off year round. Windows are blocked off with 0.5 inch styrofoam… except for the sliding section of the patio door. We pretty much live downstairs. The heat pump had a pretty good rating (10.5 EER) back when it was installed. And the charge is still good, as measured by the sweat on the evaporator return line in the summer.
Yes, everything is electric here. But it would be nice if you would enlighten the rest of us as to exactly how you manage a two-person household on 250 kWh/month. It would be appreciated, and potentially beneficial.
I’ll chip in here. I use about 80 kWh per month in my apartment (staying alone). My strategy is to be conscious of power-guzzlers and only switch things on as I need them. My wireless router and fridge stay on round-the-clock. I switch on the TV, laptop and 1-2 lights for a few hours each day. Water heating on when I shower, washing machine once a week. I’m living in Singapore but my apartment is cool enough to be comfortable with a fan. Heating of course is never needed.
I could switch off the router during the day, but that involves climbing under my desk.
We have a funky extension cable with the switch about a foot down from the plug and the majority of its ten feet or so between the switch and the outlet end. The idea is that the switch stays out near the wall socket, so you flip it rather than digging around behind things to plug or unplug the Christmas lights. Something similar may work with your router: move its power switch out to an easily accessible place.
My similar problem was switching between headphones and speakers; both are 5.1, making use of multiple ports on my sound card. The solution was to buy one USB and four stereo extensions, then lash them together into a custom extension cable for 5.1 (I’ll have to add a strand if I upgrade the audio to 7.1 outputs).
If the port or switch is not accessible; move it.
Someday I’ll have a house that is totally off the grid. Would be so nice. I think we use around 500 kWh a month as well at my house. I need to double-check that.
My current computer pulls about 200 watts idling. Maybe I should get a more energy-efficient one.
btw, check your computer with it off but still plugged in. It’ll probably be pulling around 5 watts.
My PSU gives some measurements but I really need to stick a meter on it and see what my rig is burning away. It’s rarely idle even if I am.
That’s interesting. I’ve never used a Kill A Watt before. Shouldn’t wall warts waste some energy, since many do seem to generate heat even when the attached device is not being used?
Well worth the investment. And you get to learn all about how your intuition has been right on some things and totally wrong on others.
http://tinyurl.com/yzmu5ss
Well, as I said, the unit I spot-checked looks like it would add 0.5 kWh per month to your electric bill. Multiply that by the number of units you might leave plugged in and compare it to your average monthly electric consumption. Here in Oklahoma City, the average monthly residential usage is about 1200 kWh.
Of course, that’s just one data point. I just checked the power supplies for my netbook and cell phone (without the netbook and cell phone plugged into them) and got these results:
————-
Netbook
Power: 0 watts
Volt-amps: 0 VA
Current: 0.00 amps
————–
Phone
Power: 0 watts
Volt-amps: 0 VA
Current: 0.00 amps
————–
In my opinion, much of the “vampire electronics” hype I’ve seen is rubbish. And counterproductive, since it makes people think they are saving significant energy when they are not.
I misread the title and thought this was going to be about how Thom hates Walmart. Which was even more confusing since that company isn’t even in Thom’s home country.
Now that the confusion is past, I think something like this would be nice for power cords: http://www.thinkgeek.com/computing/accessories/93ad/
Looking forward to the cable nightmare being over…
There’s a link in the article to the AC Squid, which was around before the USB squid was.
…touché. Thank you, not sure how I missed that.
Yeah, me too. It happens with all stories about wall warts. My mind just inverts the w.
But more related to this story: I blew up my first handheld video game by using the wrong wall wart. Ye olde Tandy electronic football. I was sick of buying batteries for it, and I saw it had a DC connector. I found one that fit, plugged it in, and the game worked! … For about 30 seconds, before a capacitor blew up, cracking the screen.
It was about that time that I started to try to understand electricity before just plugging random stuff in.
“I misread the title and thought this was going to be about how Thom hates Walmart. Which was even more confusing since that company isn’t even in Thom’s home country.”
You read my mind; I had the same confusion.
Useless stuff: I haven’t been able to log in for ages; weird that it worked now, but I guess at least now I can post comments…
Useful stuff: but why buy that from ThinkGeek for 20 dollars plus shipping when you can get it for around 5 with free worldwide shipping?
4.55 dollars: http://www.dealextreme.com/details.dx/sku.28670
5.31 dollars with more power and one mini-USB: http://www.dealextreme.com/details.dx/sku.13526
Why do wall outlets even output AC? I understand that it makes much more sense to distribute power over the grid with AC, but inside your home, just about everything uses DC.
Why can’t each house have a large AC-to-DC converter hidden away in the basement and have the outlets output DC? Normal outlets would probably be at about 12 V. Special outlets would be needed for things like ovens that require more voltage, but this would eliminate “wall warts”, and probably improve efficiency.
Man, that’s a really good idea. About half the stuff we have in the home could run off of a low-amperage 12 V DC circuit, though I’d suggest a three-prong plug – ground, +5 V DC, and +12 V DC – so lower-power devices wouldn’t have to down-convert.
Though I think it would be better to build these into the outlets themselves. Just put a high-quality converter in the outlet, and have it shut off if no devices are plugged in.
Which brings to mind an excellent product idea: a replacement outlet that has 4-5 USB ports in it and just provides 5 V to devices that charge via USB.
A former colleague of mine equipped her newly-built house with a 12 V grid, as well as the standard 230 V grid. Worked like a charm.
The only device I use that eats raw 230 V AC is a water kettle. Even my light bulbs need only 12 V.
Update: just realized that we have a fridge.
Don’t forget the vacuum cleaner and the power tools.
The TV, hi-fi, computers, etc.
You’ll need very thick cables in the wall to provide that kind of power on a low-voltage network.
Modern wall warts consume less power when idle and are not as big as the old ones, so they take up less space around the power outlet.
It wouldn’t be feasible. Have a look at the wiring in your car to see how complicated and expensive it would be. Lower voltage means much higher amperage. This requires very thick wiring and relays on all switches to prevent arcing or contact melting. An electric kettle or toaster would need a cable as thick as a garden hose and would need to be hard-wired.
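To put rough numbers on the kettle example, here is a quick sketch; the 2 kW rating is an assumed, typical figure and 230 V mains is assumed:

```python
# Current needed to deliver the same power at mains voltage vs. a 12 V DC bus.
# The 2 kW kettle rating is an assumed, typical value.

power_w = 2000.0
for volts in (230.0, 12.0):
    amps = power_w / volts
    print(f"{power_w:.0f} W at {volts:.0f} V -> {amps:.0f} A")
# ~9 A at 230 V (ordinary house wiring) vs. ~167 A at 12 V --
# jumper-cable territory, which is why heating loads stay on AC.
```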
Not to mention DC is pretty dangerous if, God forbid, you get shocked by it. AC pulses, and many people get kicked back by it; DC just freezes you up – it’s always delivering – and only 0.02 amps can stop a weak heart if delivered at the wrong time. I’ve seen someone at work get seriously injured working on a DC motor he didn’t know was live; the motor was smaller than most household fans, and ran off of 12 V DC.
That is actually how things used to be done a long time ago, when the grids were first being developed in some places. It’s an interesting read.
Power transfer efficiency is very low at low voltages and with small-gauge wire: the higher the resistance of the wire (which goes up as the wire gets thinner), the longer the wire, and the lower the voltage, the higher the power losses. Why do you think power companies transfer power over very high-voltage, high-tension lines that measure 100 kV+ for the big ones, reduce down to around 14,400 V on smaller poles, and then to 240 V (US) in homes – and, oh, it’s AC?
Even if you had efficient power supplies to plug your USB cable into, USB cables just aren’t an energy-efficient way to transport power at such a low DC voltage. One of the reasons AC became the standard rather than DC was the transmission efficiency possible over power lines, because DC can’t readily be transformed to higher or lower voltages with corresponding changes in current; the other big reason is that if the DC voltage and current are high enough and you get electrocuted, DC will hold you locked there with certainty, while AC at least gives you a bit more of a chance. So there goes any reasonable way of making USB cables an efficient means of transporting power, to avoid dealing with the volt-on-a-rope or wall wart, while still being safe.
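To make the resistance argument concrete, here is a rough sketch of what happens to 5 V over a thin cable. The wire gauge, lengths, and load are assumptions; 28 AWG is a common conductor size in cheap USB leads:

```python
# Voltage drop and loss along a thin cable carrying 5 V DC.
# Assumed values: 28 AWG conductors (~0.21 ohm/m), 0.5 A load.

ohms_per_m = 0.21      # approximate resistance of 28 AWG copper per metre
current_a = 0.5        # a full USB 2.0 high-power load
supply_v = 5.0

for length_m in (2.0, 10.0):            # short desk cable vs. an in-wall run
    loop_r = ohms_per_m * length_m * 2  # out and back
    drop_v = current_a * loop_r
    loss_w = current_a ** 2 * loop_r
    print(f"{length_m:>4.0f} m: drop {drop_v:.2f} V "
          f"({100 * drop_v / supply_v:.0f}%), loss {loss_w:.2f} W")
# ~0.4 V (8%) lost on a 2 m lead, ~2 V (42%) on a 10 m run -- at 5 V there is
# simply no headroom, which is why long low-voltage runs need fat copper or a
# higher distribution voltage.
```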
Now, I’ll throw this one out there, just for fun: the people (at least in the US) that have the ultimate solution are the Amish: they don’t have to fight the problem of wall warts/volt-on-a-rope at all!
Thank you for a nice little piece, with real food for thought. A refreshing change from the usual flamebait.
The links below seem relevant:
http://www.dailytech.com/article.aspx?newsid=16609
http://news.bbc.co.uk/1/hi/technology/8323018.stm
This standard could be used for many other small devices. Laptops and other higher-power applications could use a 12 V, 5 A standard.
Finally, in the future, old copper cables will be a thing of the past, replaced by laser light delivered via fibre optics. However, getting companies to agree on a wavelength would be the main challenge.
I wish wall warts would all use a short AC cable from the brick so all the strip’s outlets would stay usable. I also wish that vendors would use distinct colors on the power wiring, so a quick glance tells me which device is on which power wart. At least Apple uses white cables and a better-looking PSU on the Mac Mini.
The vast majority of my power bricks are cold, and the Kill A Watt device tells me that the power draw is negligible.
Only the speaker system and the PC in its typical shutdown state use about 4 W each, so it might be worth turning off the strip, but it is not going to save the earth.
As Jonathon said, distributing DC around a house is a nice idea on paper but a bad idea if you ask the electrician or the house inspector.
I’d like to do it too, as I have added some AC wiring and outlets around the house, but my EE background stops me from going further with DC.
More and more lights are going to low-power CFLs that draw hardly any current, but I still must use 14-gauge Romex for regulation’s sake, just in case the next owner in 20 years wants to use high-amperage lighting (if that even exists then). I could easily see running 12 V DC wiring short distances off a DC power bar up and down the center of the house, to enable direct use of LED strips when those become affordable, as well as DC computing.
I can also imagine one day having battery-backed solar PV panels on the roof and sending that power into the house for use only by 12 V DC computing or entertainment devices, without messing with the AC system at all. So the AC half stays on-grid and the DC half goes off-grid. If the power goes out, the huge AC equipment won’t work, but the computer stuff will.
I think technology is just moving way too fast to pick any particular plug standard. If we had done this 20 years ago we might all have DIN plugs or RCA jacks all over the walls. I bet many young’uns don’t even know about DIN. In 20 years USB might be largely forgotten too. It might be better to let the electrical industry pick an eternal DC plug standard and use adapters from that.
A good reason for using DC is to avoid some of the losses involved in PSUs when AC is converted to DC, but DC wiring would have losses too, so I think we are stuck with AC for a long time.
A more long term solution is probably going to be wireless power, maybe at very high AC frequencies and also perhaps Supercaps (EESTOR hopefully) replacing batteries.
I currently use a 15 watt and a 9 watt solar panel to charge two 18 Ah batteries. I use a USB car adapter to charge my phone and Bluetooth headset, and a 350 watt inverter to run my room’s lights and charge my netbook.
Great article, I agree wholeheartedly, and it’s good to see at least the phone manufacturers making a move – hopefully they’ll have enough clout to move the rest of the gadget makers.
I do think Apple has a third option beyond the two you mention: include a micro-USB port in addition to their proprietary connector for a couple of generations – it’s small enough for this to be feasible, would add little in the way of cost or weight, and would allow them to march forward with USB while retaining compatibility with devices featuring their proprietary connector (why Apple insists on supporting oddball connectors forever is beyond me – die, FireWire!).
Also note that the USB 3.0 standard increases the maximum power draw possible, especially for “configured” devices. From the (possibly wrong) Wikipedia article:
“The bus power spec has been increased so that a unit load is 150 mA (+50% over minimum using USB 2.0). An unconfigured device can still draw only 1 unit load, but a configured device can draw up to 6 unit loads (900 mA, an 80% increase over USB 2.0 at a registered maximum of 500 mA). Minimum device operating voltage is dropped from 4.4 V to 4 V.”
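Translating those quoted figures into watts makes the difference easier to picture; a quick sketch, with the nominal 5 V bus voltage assumed for the conversion:

```python
# Power budgets implied by the quoted USB figures. The bus is nominally 5 V;
# the 4.0-4.4 V figures are the minimum a device must tolerate at its end.

nominal_v = 5.0
usb2_max_a = 0.5                    # 500 mA
usb3_unit_load_a = 0.150            # 150 mA per unit load
usb3_max_a = 6 * usb3_unit_load_a   # 900 mA for a configured device

print(f"USB 2.0: {nominal_v * usb2_max_a:.1f} W max")
print(f"USB 3.0: {nominal_v * usb3_max_a:.1f} W max for a configured device")
# 2.5 W vs 4.5 W -- still only enough for small gadgets, not household loads.
```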
I don’t understand why there has to be a difference between configured and unconfigured devices – using Ubuntu makes it impossible for me to charge my BlackBerry for example, as I can’t get it “configured” (without painful workarounds) – that’s just silly. I also can’t plug it in to a friend’s laptop without installing a bunch of drivers, so I have to travel with the USB wall wart all the time.
EDIT: Installing “barry” from Synaptic now allows me to charge my BlackBerry from within Ubuntu, so that’s progress! Autodetection would have been nice though…
The issue is that if the device is “dumb” and can’t manage its own power draw, then your computer has to protect its chips from overcurrent. Most cheap computers are designed at the lower end of the power requirements and may support one high-powered device, but they typically share the power wiring among all the devices.
On the flip side, the trouble I have is that many DEVICES won’t charge from “dumb” powered hubs… that’s the big flaw in the plan. I have several phone devices (from companies beginning with A or M) that won’t charge except with THEIR USB charger, even though the charger has a standard USB or mini-USB port available.
The idea of USB power is great because then we could buy one or two highly efficient chargers for our home or office for all our devices. Custom cords don’t really bother me because they’re small and standard at one end. Chargers would be optimized for our local power grid and would output standard USB power, so we wouldn’t need to carry them with us. They could also have advanced switching that detects when ports are not being used and turns off circuitry, or even powers off the whole unit, to save “vampire” power.
Actually, I’d never heard the term “wall wart” before but it certainly does fit nicely. I’m not sure which is worse, the wall warts or the half-and-half power cables with a big power brick in the middle. The wall warts make it harder to plug in a lot of devices, but those power bricks seem to tangle with one another especially if you have a lot of computer equipment and, if you have carpet, those power bricks can get really hot if you’re not careful.
I like USB power, and I actually have a surge protector that includes USB power ports in addition to the typical electrical ports. It puts out more power than the ports on my computer and my devices charge twice as fast when plugged into the surge protector. I have to wonder though, would USB power be sufficient to handle anything beyond small devices like phones or headsets? I can’t imagine, say, a refrigerator running off of USB. Still, something along the same lines would be nice if it could be done though the wires would have to be able to handle a lot more power than the typical USB cable.
I too am delighted when I can get a device that accepts USB power, but there is a problem with them.
I have had several USB plugs break from normal use. The little plastic part in the middle breaks out of the socket. I usually am able to carefully glue it back in. But bottom line, USB isn’t a durable enough plug to become the standard adapter, in my opinion. However, maybe if we can start moving towards a single DC plug solution, we can refine it into something better in the future.
So I found this Green Plug group that was posted about a long time ago, and their ideas really make a lot of sense. Rather than a central AC/DC converter running the whole house on DC, sell an AC/DC converter that will identify the AC in use, negotiate the DC with the remote device, and provide what’s needed on all sides.
http://www.greenplug.us/
Check it out and let me know what everyone here thinks.
Also, while we’re talking about this, I’d recommend an internal DC voltage of 60 V be the standard, as it divides evenly down to 12 V, 10 V, 6 V, and 5 V, in case integer-ratio conversion is ever made efficient in DC.
You don’t see many devices requiring more than 30 V DC in the retail market, as such high voltage is dangerous. Of course, some devices can use high voltages internally. However, they won’t make it available as output. These devices are usually labelled as not customer-serviceable for this reason.
Switched DC-DC converters are quite efficient (up to 98%). You don’t need integer factors: they won’t give you more efficiency, and sources won’t give you an integer voltage anyway. You can even generate a DC output with a higher voltage than the input (e.g. 12 V DC from a 5 V DC source; step-up/boost configuration), although the output would have a lower current capacity than the input.
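A small sketch of that step-up case, with assumed numbers; the 95% efficiency and the 0.5 A load are illustrative figures, not specs:

```python
# A switched boost converter stepping 5 V up to 12 V: the current it pulls
# from the source for a given output load, assuming 95% efficiency.

v_in, v_out = 5.0, 12.0
i_out = 0.5               # amps drawn by the load at 12 V (assumed)
efficiency = 0.95

p_out = v_out * i_out                 # 6 W delivered
p_in = p_out / efficiency             # ~6.3 W drawn from the source
i_in = p_in / v_in                    # ~1.26 A from the 5 V side
print(f"Delivers {p_out:.1f} W; draws {i_in:.2f} A at {v_in:.0f} V")
# Note the input current (~1.26 A) exceeds the output current (0.5 A):
# stepping voltage up trades current for voltage, as described above.
```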
My pet hate about wall warts is that (at least in South Africa) they usually come with those awful round two-pin plugs bonded on, so you can’t cut them off and put a proper standard three-pin plug on.
So as well as a wall wart, you have to plug in an adapter, most of which (in South Africa) are of appalling quality. Scanner not working at a friend’s house? Oh, you have to wobble the wall wart. Buy a new appliance and play “find the two-pin adapter”.
I predict that within 5 years this problem will have solved itself through the use of induction chargers.
I’m thinking of something like an induction mat on which you place all your equipment that needs to be charged.
There’s already a standard on its way [1].
Think of the consequences this has for product design and form:
You no longer need an opening for the charging pin. Also, the batteries can be embedded much more deeply.
[1] http://www.wirelesspowerconsortium.com/news/press-releases/release-…
… every plug has a switch on it – turn off the switch and the power to all attached devices is cut. No need to unplug, and no need to do anything special. Coupled with a separate fuse in every plug and a mandatory third pin (usually used for earth, but it also opens the socket’s shutter mechanism, which prevents kids from electrocuting themselves unless they are very determined to do so), it sort of makes all of this fairly moot. Sure, people will still leave stuff plugged in and turned on, but that is more a matter of social engineering.
The real travesty is that the round DC plug connectors weren’t standardized fifty years ago!
“Solar panel to power” – it took more energy to make the solar panel than you could ever recover from it.
So you guys are smart enough to take one wall wart and measure it. What you didn’t add up is the 20 or 30 that most people have, and the electricity it takes to air-condition that space. It matters somewhat less if you are using heat, but in that case you are paying for electric heat.
The simple solution is not to attempt whole-house DC. The simple solution is to put them on a power strip and turn them all off at the same time. I have six on my computer, and they all get turned off when I shut down the system.
I agree that you are not making a great statement if you waste energy in other areas.
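As a rough sanity check on the “add them up, plus the air conditioning” argument, here is a sketch using the ~0.7 W idle figure measured earlier in the thread; the adapter count and the air conditioner’s coefficient of performance are assumptions:

```python
# Many idle adapters, plus the extra air-conditioning needed to pump their
# waste heat back out. All figures are assumptions for illustration.

adapters = 25
idle_draw_w = 0.7          # per adapter, as measured earlier in the thread
ac_cop = 3.0               # assumed coefficient of performance of the A/C

heat_w = adapters * idle_draw_w     # 17.5 W of continuous waste heat
ac_extra_w = heat_w / ac_cop        # extra compressor power to remove it
total_w = heat_w + ac_extra_w

kwh_per_month = total_w * 24 * 30 / 1000.0
print(f"{total_w:.1f} W continuous -> {kwh_per_month:.1f} kWh/month")
# ~17.5 W + ~5.8 W of A/C overhead, about 17 kWh/month while cooling --
# noticeable, though still small next to a 650-1200 kWh/month bill.
```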