I was just thinking: if you are on your computer anyway, would charging your phone from it just use some of the excess electricity your computer would have wasted, or would it be worse than charging your phone from a wall charger while using your laptop separately?

  • litchralee@sh.itjust.works · 26 points · edited · 5 months ago

    The other answers have touched upon the relative efficiencies between a phone charger and a desktop computer’s PSU. But I want to also mention that the comparison may be apples-to-oranges if we’re considering modern smartphones that are capable of USB Power Delivery (USB PD).

    Without any version of USB PD – or its competitors like Quick Charge – the original USB specification only guaranteed 5 V and up to 500 mA. That’s 2.5 W, which was enough for USB keyboards and mice, but is pretty awful to charge a phone with. But even an early 2000s motherboard would provide this amount, required by the spec.

    The USB Battery Charging (USB BC) spec brought the limit up to 1500 mA, but that’s still only 7.5 W. And even in 2024, there are still (exceedingly) cheap battery banks that don’t even support USB BC rates. Motherboards are also a mixed bag, unless they specifically say what they support.
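
    For reference, those ceilings fall straight out of P = V × I; a quick sketch, with one common 9 V PD profile added as an example:

```python
# Power ceilings of the USB levels mentioned above, from P = V * I.
usb_tiers = {
    "USB 2.0 default (5 V, 500 mA)": 5 * 0.5,   # 2.5 W
    "USB BC (5 V, 1500 mA)":         5 * 1.5,   # 7.5 W
    "USB PD 9 V @ 3 A profile":      9 * 3.0,   # 27 W
}
for name, watts in usb_tiers.items():
    print(f"{name}: {watts} W")
```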

    So take, for example, the Samsung S20 (the last smartphone generation to ship a charger in the box): the included charger is capable of 25 W charging, and so is the phone. Unless you bought the S20 Ultra, which ships with the same charger but supports 45 W charging.

    Charging the S20 Ultra from a 2004-era computer will definitely be slower than with the stock charger. But charging with a 2024-era phone charger would be faster than with the included one. And your latest-gen laptop might supply 60 W, but because the phone maxes out at 45 W, it makes no difference.

    You might think that faster charging must always be less efficient, but it’s more complex than that, since all charging beyond ~15 W uses higher voltages on the USB cable. This is allowed because even the thinnest wire insulation in a USB cable can tolerate 9 V or even 20 V just fine. Higher voltage means lower current for the same power, which reduces resistive losses in the cable.
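
    A quick sketch of that effect, assuming a hypothetical 0.2 Ω round-trip cable resistance (the real figure varies by cable):

```python
# Cable loss for the same delivered power at different USB PD voltages.
# The 0.2 ohm round-trip cable resistance is an assumed, illustrative value.
def cable_loss_w(power_w, voltage_v, resistance_ohm=0.2):
    current_a = power_w / voltage_v          # I = P / V
    return current_a ** 2 * resistance_ohm   # P_loss = I^2 * R

for volts in (5, 9, 20):
    print(f"27 W at {volts:>2} V: {cable_loss_w(27, volts):.2f} W lost in the cable")
```

    Tripling the voltage cuts the resistive loss by roughly a factor of nine, which is why PD negotiates 9 V or 20 V for higher wattages.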

    The gist is: charging is a patchwork of compatibility, so blanket statements on efficiency are few and far between.

  • NeoNachtwaechter@lemmy.world · 23 up / 1 down · 5 months ago

    There is no such thing as ‘excess electricity’ in a modern (switching) power supply unit. It draws only as much power as the load needs. There are a few percent of losses inside the device; no big deal.

    Some desktop computers are less efficient because they have oversized PSUs (lots of reserve for a future “gaming” graphics card) built in.

    • dudeami0@lemmy.dudeami.win · 2 points · 5 months ago

      This would only affect the 12 V rail, though, no? It’s not like they beef up the 5 V rail that supplies your USB ports by excessive amounts. Picking a random pair of PSUs from PCPartPicker, the Corsair RM650e and RM1200e (650 W vs. 1200 W) both have a +5V@20A rail. There would be no need for a larger 5 V rail to support gaming cards.

      Also, correct me if I am wrong, but most PSUs are most efficient at 20-50% utilization, not 100%. I’m basing this on the higher ratings for 80 Plus at those loads.

  • Robin@lemmy.world · 18 points · 5 months ago

    It’ll depend on how efficient your phone charger is vs. your PC’s PSU. Looking at some charts, it’s a very close battle, but the phone charger generally seems to win out; probably because it’s optimized for its maximum power output, whereas the PSU needs to support a wider range of loads.

    https://silentpcreview.com/power-lost-a-better-way-to-compare-psu-efficiency/

    https://www.electronicdesign.com/technologies/power/whitepaper/21129264/gan-fulfills-promise-of-efficiency-and-simplified-power-adapter-design

  • oo1@lemmings.world · 11 up / 1 down · 5 months ago

    If you’re bothered about overall waste, consider that some batteries degrade more slowly if you charge them more slowly. I tend to prefer a slow charger when I can.

  • bladerunnerspider@lemmy.world · 10 points · 5 months ago

    Charging at higher voltage is more efficient (at least on electric vehicles), but your phone’s battery is also so relatively small that it barely matters. Avoid wireless charging, as it is extremely inefficient.

    • Gamers_Mate@kbin.run (OP) · 3 points · 5 months ago

      I was just curious; also, I was thinking more of energy efficiency for environmental reasons.

      • Jako301@feddit.de · 7 points · edited · 5 months ago

        Not to discourage such thoughts in the future, but your single post asking here probably used up more electricity than what you would save over the course of the next ten years.

        • Gamers_Mate@kbin.run (OP) · 2 points · 5 months ago

          I know the fediverse needs a green push and some instances are using renewable energy, but really, ten years’ worth from one post?

          • Serinus@lemmy.world · 3 points · 5 months ago

            I think that’s more about the scale than it is about the fediverse.

            Effectively you’re asking about a quarter cup of water where the answer isn’t even clear. Wireless charging is a bit wasteful though.

            I still appreciate your asking, because there’s been interesting discussion in the answers.

          • Jako301@feddit.de · 1 point · 5 months ago

            That was a bit of hyperbole; I have no idea about the exact amount.

            Let’s say you charge your 2000 mAh battery every day and your PSU is 10% more efficient than your charger (the difference is most likely not even that big).

            2 Ah × 5 V × 365 d = 3.65 kWh

            3.65 kWh × 0.1 = 365 Wh

            365 Wh would be the difference per year; that’s about 12 ct per year.
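
            The same back-of-envelope math as a quick script (the battery size, the 10% gap, and the electricity price are all assumed numbers):

```python
# Back-of-envelope: yearly cost of a 10% efficiency gap when charging daily.
battery_ah = 2.0           # assumed 2000 mAh battery
voltage_v = 5.0
days_per_year = 365
efficiency_gap = 0.10      # assumed 10% PSU-vs-charger difference
price_eur_per_kwh = 0.33   # assumed electricity price

yearly_kwh = battery_ah * voltage_v * days_per_year / 1000  # 3.65 kWh
wasted_kwh = yearly_kwh * efficiency_gap                    # 0.365 kWh
print(f"{wasted_kwh * 1000:.0f} Wh/year, about {wasted_kwh * price_eur_per_kwh * 100:.0f} ct")
```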

            Now, estimating the power usage of fediverse messages is very hard to do, since it depends on a lot of different factors (your device, cellular or Wi-Fi data, the number of hops needed to reach you, the general state of your nearby network, your instance’s infrastructure).

            The only even remotely similar thing I could find was emails with pictures producing about 20-40 g of CO2, which only slightly increases with more recipients, and Reddit usage coming in at about 2.5 g per minute. Comparing those two numbers just shows that all such estimates are pretty much useless for us, since we have no idea how they were done.

            But if we go with a low estimate of 0.1 g (slightly above SMS and somewhere around spam-mail level) per user seeing it, and a few hundred to a thousand users seeing this even if they just scroll past, we reach the CO2 equivalent of 1 kWh pretty fast, without even talking about long-term storage and future indexing. Not to mention that comments produce something too, since they need to be federated, albeit not as much as the post itself.

            So while 10 years was a bit much, 2-3 years is very much in the realm of possibility, but no one knows or can even properly estimate the actual numbers.

      • Fester@lemm.ee · 4 points · 5 months ago

        In that case, you can make it a point to charge when the grid is “cleaner” - usually overnight. Your electricity costs may be cheaper then anyway.

        The Apple Home app shows a grid forecast for your location, with cleaner times highlighted in green. I’m sure they pull this info from the utility company, so the info should be available in other smart home apps or maybe even your utility’s website.

        But like others said, phone charging is very minimal. We’re talking about a 20W charger vs. say, a 1500W air fryer. Running larger appliances off-hours is a bigger deal - dishwasher, laundry, etc.
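
        To put rough numbers on that scale difference (assumed usage times, not measurements):

```python
# Rough scale comparison: one phone top-up vs. one air-fryer session.
phone_wh = 20 * 0.5        # assumed: 20 W charger running ~30 min -> 10 Wh
air_fryer_wh = 1500 * 0.5  # assumed: 1500 W air fryer for 30 min -> 750 Wh
print(f"one air-fryer session ~= {air_fryer_wh / phone_wh:.0f} phone top-ups")
```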

        • Gamers_Mate@kbin.run (OP) · 1 point · 5 months ago

          Overnight? I thought it would be cleaner during the day, because that is when the sun shines. I haven’t had an iPhone in a while, but I will have a look into grid forecasts. I still use an air fryer; not sure what the wattage is, though I would assume it is similar to an oven’s.

          • Robin@lemmy.world · 2 points · 5 months ago

            Kinda depends on where you live, but there is often an excess of hydro and wind power overnight.

  • dudeami0@lemmy.dudeami.win · 3 points · 5 months ago

    Your computer doesn’t “waste” electricity; power usage is on-demand. A PSU generally has three “rails”: 12 V (this powers most of the devices), 5 V (for peripherals/USB), and 3.3 V (IIRC, memory modules use this). Modern PSUs are switched-mode power supplies built around a switching voltage regulator, which is more efficient than a traditional linear regulator.

    The efficiency of the PSU/transformer is what determines whether one or the other is more wasteful. Most PSUs (I would argue any PSU of quality) have an 80 Plus rating that defines how efficiently they convert power. I am not familiar enough with modern wall chargers to know what they’re tested at… I could see low-end wall chargers using more wasteful designs, but a high-quality rapid wall charger is probably close to, if not on par with, a PC PSU. Hopefully someone with more knowledge of these can weigh in.

  • banfin@feddit.org · 1 point · 5 months ago

    It is a little bit more efficient, because your PC’s power supply is not very efficient when you don’t draw much power. By using more of it, it becomes more efficient, peaking somewhere around 50% load. But: we are talking about very small differences here, and only for desktop PCs. A laptop’s power supply is far less powerful, so you already use more of it just by using the laptop; in that case, I would rather use the charger. To be perfectly honest with you: all of this is not really worth thinking about. It’s like opening your fridge for only 6 seconds instead of 8. Yes, it saves power, but there are better ways to do so. Driving just 1 kilometer less with your electric car is about the same as charging your phone 10 times.

  • Blue_Morpho@lemmy.world · 1 point · edited · 5 months ago

    Modern gallium-nitride (GaN) based phone chargers are 95% efficient.

    The very best, most expensive PC power supplies on 115 V AC will only reach 94%, and only at the very specific load of 50% of the power supply’s rated wattage. So if you have a 500 W power supply and aren’t drawing almost exactly 250 W, you aren’t getting that 94% efficiency. Regular power supplies under normal, variable load conditions will be somewhere in the 80% range. If the PC is idle, that efficiency can drop to 20% (but it’s fine, because it’s only a few watts).

    https://forum.level1techs.com/t/super-high-efficiency-300-400w-psu/184589/2

    So using a modern GaN standalone charger will be more efficient. It will be far more efficient if you use that standalone charger instead of charging off your PC while the PC is idle.

    • nowwhatnapster@lemmy.world · 1 point · 5 months ago

      Counterpoint: most computer power supplies have an efficiency curve that peaks somewhere north of 50% load. If your PC is substantially below that peak, then adding load (the phone) could raise the PSU’s efficiency, say from 80% to 85% (I’m making up numbers), which would affect the overall efficiency of the entire PC’s load.
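
      A sketch of that nuance with made-up numbers (the efficiency curve below is illustrative, not from any real PSU datasheet):

```python
# Does a ~25 W phone load cost more or less than 25 W at the wall,
# once it nudges an underloaded PSU up its efficiency curve?
def psu_efficiency(load_w, rated_w=650):
    # Made-up piecewise-linear curve over load fraction, 80 Plus-ish shape.
    points = [(0.05, 0.70), (0.20, 0.87), (0.50, 0.90), (1.00, 0.86)]
    frac = min(max(load_w / rated_w, points[0][0]), 1.0)
    for (f0, e0), (f1, e1) in zip(points, points[1:]):
        if frac <= f1:
            t = (frac - f0) / (f1 - f0)
            return e0 + t * (e1 - e0)
    return points[-1][1]

idle_dc_w = 60   # assumed DC load of a near-idle PC
phone_w = 25     # assumed phone charging load
wall_before = idle_dc_w / psu_efficiency(idle_dc_w)
wall_after = (idle_dc_w + phone_w) / psu_efficiency(idle_dc_w + phone_w)
print(f"extra wall draw for the phone: {wall_after - wall_before:.1f} W")
```

      Running this with different idle loads shows the tradeoff: the PSU’s average efficiency rises with the extra load, but the marginal wall cost of the phone depends on where you sit on the curve.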

      I think your answer is still probably correct, but it’s an interesting nuance to think about.

      Side notes: some PSUs do use gallium nitride, e.g. the Corsair AX1600i, though by and large most do not. Also, if you’re in the EU, then you’re working with 220/240 V PSUs, which adds some efficiency, but that applies to the phone charger as well.

  • Paragone@lemmy.world · 2 up / 3 down · 5 months ago

    If you have a high-efficiency charger, then I’d say it’s probably more efficient to use that charger.

    The warmer you run your computer, the less efficient it becomes, and the shorter the lifespan of the hottest chips in it (though this effect shouldn’t be significant).

    E.g., running a CPU 10 °C hotter can roughly cut its lifespan in half.

    By having more heat-generating stuff going on in your computer, you impair the cooling of your CPU & GPU (slightly, probably), and that may affect your computer’s time-to-failure.

    Fan-bearings may dry-out sooner, too.

    hth, eh?

    _ /\ _

      • Gamers_Mate@kbin.run (OP) · 2 points · 5 months ago

        If the heat is negligible, I would assume it should not matter, as long as you do not charge while your PC is doing a task that uses too many resources?

    • Gamers_Mate@kbin.run (OP) · 2 points · 5 months ago

      Thank you, I will charge them separately. There are some good ways to save electricity, but I guess this is not one of them.