Why are batteries rated in ampere-hours, but household electricity usage is measured in kilowatt-hours?


20

I was reading about energy use in batteries and can't quite understand why it is measured in different units than household electricity use. An ampere-hour doesn't include a measure of volts. But my understanding is that any given battery has a fixed voltage (1.5 V, 9 V, ...), just like household electricity (120 V, 220 V, ...). So I don't see why they have different units by which they are measured.


FYI. It looks like your question touches on the subject of battery gauging. Here's a recent overview article: batteryuniversity.com/learn/article/...
Nick Alexeev

Answers:


18

kWh is a measure of energy, which is what grid customers are billed for; it usually shows up on your invoice as easily understood numbers (0-1000, rather than 0-1 or very large numbers, ranges which would unfortunately confuse many people).

Ah is a measure of electrical charge. A battery (or capacitor) can store more or less a certain amount of charge regardless of its operating conditions, whereas its output energy can change. If the voltage curve for a battery in certain operating conditions is known (circuit, temperature, lifetime), then its output energy is also known, but not otherwise, though you can come up with some pretty good estimates.

To convert from Ah to kWh for a constant voltage source, multiply by that voltage; for a changing voltage and/or current source, integrate over time:

Energy [Wh] = ∫ from t1 to t2 of I(t)·E(t) dt,  with 1 kWh = 1000 Wh ;  E in [V], I in [A], t1, t2 in [h]
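As a rough illustration of that integral (this sketch is not from the original answer), here is a minimal Python example that trapezoid-integrates a made-up discharge log; the sample currents and voltages are assumptions, not data for any real cell:

```python
# A minimal sketch of the integral above, assuming a made-up discharge log of
# (time, current, voltage) samples; real numbers would come from a data logger.

# (time in hours, current in A, voltage in V) for a hypothetical lithium cell
samples = [
    (0.0, 2.0, 4.15),
    (0.5, 2.0, 3.95),
    (1.0, 2.0, 3.80),
    (1.5, 2.0, 3.65),
    (2.0, 2.0, 3.30),
]

charge_ah = 0.0   # integral of I(t) dt       -> amp-hours
energy_wh = 0.0   # integral of I(t)*E(t) dt  -> watt-hours

# Trapezoidal integration between consecutive samples
for (t0, i0, v0), (t1, i1, v1) in zip(samples, samples[1:]):
    dt = t1 - t0
    charge_ah += 0.5 * (i0 + i1) * dt
    energy_wh += 0.5 * (i0 * v0 + i1 * v1) * dt

print(f"charge ≈ {charge_ah:.2f} Ah")
print(f"energy ≈ {energy_wh:.2f} Wh = {energy_wh / 1000:.4f} kWh")
```

For a truly constant voltage the integral collapses to Ah × V, which is the simple multiplication described above.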

Grid customers are billed for both power and energy. A battery's "output energy" can't change (neither created nor destroyed). I think you mean "output power". The energy storage capacity of a secondary (i.e. rechargeable) battery can change over the lifetime of the battery due to changes in its structure (because work has been done on it).
Argyle

A battery's output energy and charge does change, @Josh, most notably with respect to changes in temperature and current draw, but the number of coulombs it took to deliver that energy is more constant. I wouldn't mind if you'd explain in chat how one could be billed for power alone.
tyblu

My bad, your usage is correct. I thought it was important to draw the distinction between power and energy here, but they are directly related. Energy output will vary with power output because energy is the integral of power. The whole point of a battery is to store energy and of course its output will diminish as you use it. Power isn't usually billed alone to residential customers (though part of what we pay does go to the infrastructure), but it is billed to large customers as "Demand Charges".
Argyle

5

A note about battery voltage: Rated battery voltage is "nominal". A fully charged 12 volt lead-acid battery actually starts out around ~14.4 volts and drops off as you draw energy from it. The actual battery voltage depends on a number of factors, including but not limited to state of charge, battery age, load profile, chemistry, etc. For instance, a lithium-ion battery rated 3.7 V (nominal) may start out at 4.15 volts and diminish to ~2.7 volts before requiring a recharge.

Watt-hours (or kWh) are an indicator of the energy storage capacity of the battery, whereas amp-hours would refer to how many amps minimum you can draw from a battery at full charge for an hour before it was no longer capable of providing that level of flow (perhaps at or above the rated voltage?). They are closely related, but not equivalent. Some batteries are designed more for high-current-draw devices, whereas others are designed to last a long time for lower-current-draw devices.

Appended: Now that I look at my cell phone battery, I notice that it has all three ratings printed on it. It is a lithium-ion battery whose nominal voltage rating is 3.7 V. Its energy capacity is marked as 4.81 watt-hours. Its electric charge rating is 1300 milliamp-hours. This seems to indicate that Energy = Voltage × Electric Charge (at least in terms of the battery ratings), though I think this equation is hiding the fact that there is an integration of P = VI going on and that V is more like an average value than a constant, which probably gives a pretty good approximation.
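Just to sanity-check those three label numbers against each other (this small calculation is mine, not part of the label), the relationship holds almost exactly:

```python
# Cross-check of the ratings quoted above: energy [Wh] ≈ nominal voltage [V] × charge [Ah]
nominal_voltage_v = 3.7          # V, from the label
charge_mah = 1300                # mAh, from the label

energy_wh = nominal_voltage_v * (charge_mah / 1000)
print(f"{energy_wh:.2f} Wh")     # prints 4.81 Wh, matching the printed rating
```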


The voltage of a fully charged battery (that has been left alone for 24 hours) is pretty close to 13 V rather than 14.4 V. 14.4 V is the voltage the charger applies during the charging phase.
Gunnish

1
This is wrong: "amp-hours would refer to how many amps minimum you can draw from a battery at full charge for an hour". You can easily get a 2000 mAh battery that will never produce 2 amps, but it could probably produce 500 mA for 4 hours.
rocketsarefast

4

The way a battery works, the total coulombs it can push around falls out more directly than the total energy it can store. The voltage is not constant. It varies by state of charge for one, and the relationship between the two can be quite different between battery chemistries. All this is to say that A-h is more relevant to battery manufacturers than W-h or Joules.

Joules can of course be relevant to circuit design, so this information is available, just not included in the 2 second sound bite called the Amp-hour rating. Battery datasheets can get quite complex. As with most things, there are a host of tradeoffs and thorough information is more than a single number. If you do have to pick just two numbers to quickly characterize a battery, Volts and Amp-hours are as good as any, and are what the industry has converged on.


The problem with Ah is that it is dependent on the load, and it is usually not specified in the short info printed on the battery; it's not even always available from the manufacturer.
Gunnish

@Gunnish: All battery datasheets I've seen give you some idea of what the battery can produce over a variety of conditions, with A-h being very common. Yes, the A-h a battery can deliver varies with the current profile, temperature, and other parameters, but so does the total energy. Everything is a function of lots of other parameters when it comes to batteries, but the total deliverable charge is probably more constant than others, like the total deliverable energy.
Olin Lathrop

@Josh: No, it's not that easy. The total charge a battery can deliver does vary with load. Good battery datasheets will show a curve of the Ah or the derating factor from nominal for discharge current. Your 25 Ah battery may be rated at that at 25 degC over 10 h discharge (2.5 A). At 25 A you may get 75% of that, for example. At lower temperature it will go down additionally. This all varies differently between battery chemistries, construction details, and various other parameters. Batteries are complicated. Go look at a datasheet some time.
Olin Lathrop
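As an illustration of the kind of derating Olin describes (the model is mine, not from the comment), one common empirical fit is Peukert's law; the sketch below assumes an exponent chosen so that the hypothetical 25 Ah, 10 h battery delivers roughly the 75% figure mentioned above at a 25 A draw:

```python
# Hedged sketch: Peukert's law as one common way to model capacity derating at
# higher discharge currents. The exponent k is an assumption for illustration;
# real values come from the battery's datasheet and vary with chemistry.
rated_capacity_ah = 25.0    # rated capacity (Ah) over the rated discharge time
rated_time_h = 10.0         # rated discharge time (h), i.e. a 2.5 A draw
k = 1.125                   # assumed Peukert exponent (chemistry dependent)

rated_current_a = rated_capacity_ah / rated_time_h   # 2.5 A

def effective_capacity_ah(discharge_current_a: float) -> float:
    """Approximate deliverable capacity at a constant discharge current."""
    return rated_capacity_ah * (rated_current_a / discharge_current_a) ** (k - 1)

for current in (2.5, 10.0, 25.0):
    cap = effective_capacity_ah(current)
    print(f"{current:5.1f} A draw -> ~{cap:4.1f} Ah ({cap / rated_capacity_ah:.0%} of rated)")
```

Temperature effects are not modeled here; as the comment notes, they reduce the deliverable capacity further.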

3

A battery's voltage changes over its lifetime. The current is set by the circuit it is connected to.

Because the current is a constant, known value that can be predicted, and the voltage cannot be, the units are given in the quantity that can be predicted.

Your electricity supply is a constant voltage and can be predicted.


I feel that this concise answer is much better than the accepted answer; it gets right to the heart of "why?", which is what the asker actually wanted to know.
Ryan V. Bissell

It doesn't entirely get to the heart of why, but if the answer is saying "The number of coulombs of electrical charge (ampere-seconds, or 1/3.6 mAh) that can be stored in and recovered from a battery is roughly constant over its lifetime, whereas the number of joules of energy (J, or 1/3,600,000 kWh) that can be stored in and recovered from the same battery varies considerably over its lifetime", then I would agree.
codeshot

3

One factor not yet mentioned is that because batteries have a certain amount of internal resistance, drawing more current will cause the voltage to sag. Suppose, hypothetically, that a particular battery that's been discharged a certain amount will supply 12 volts when supplying 10mA, or 10 volts when supplying 100mA. Drawing 10mA from the battery for 10 hours will discharge it about as much as drawing 100mA for an hour, but in the former scenario the battery would have supplied 20% more "useful" energy. Key point: a larger fraction of the energy in a battery will be lost when trying to drain it quickly than when trying to drain it slowly.
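The arithmetic behind that hypothetical is easy to verify; here is a quick sketch using the answer's own numbers:

```python
# Same charge drawn slowly vs. quickly, different useful energy delivered
# (numbers taken from the hypothetical battery described above).
scenarios = {
    "slow (10 mA)":  {"voltage_v": 12.0, "current_a": 0.010, "hours": 10.0},
    "fast (100 mA)": {"voltage_v": 10.0, "current_a": 0.100, "hours": 1.0},
}

for name, s in scenarios.items():
    charge_mah = s["current_a"] * s["hours"] * 1000
    energy_wh = s["voltage_v"] * s["current_a"] * s["hours"]
    print(f"{name}: {charge_mah:.0f} mAh drawn, {energy_wh:.2f} Wh delivered")

# Both cases remove 100 mAh of charge, but the slow discharge delivers
# 1.2 Wh versus 1.0 Wh -- the ~20% difference described in the answer.
```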

Power lines also have a certain level of resistance, and similar factors may apply, but the line voltage reaching a residential customer's meter is generally not appreciably affected by that customer's usage. Supplying one amp at 126 volts takes about 20% more energy (per unit time) than supplying one amp at 105 volts, so if customers were billed per amp-hour, the power companies would have an incentive to supply their energy at the lowest possible voltage. Billing per kWh means the customer's billable usage will be proportional to the amount of energy the power company has to generate to supply it. Incidentally, some devices (e.g. induction motors) will often draw less current at higher voltages (while doing the same amount of work), while other devices like incandescent lamps and heaters will draw more current at higher voltages (while producing substantially more light and heat).
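To make that last point concrete (the load values here are my own assumptions, not from the answer), compare an idealized constant-power load with a simple resistive one at the two line voltages mentioned above:

```python
# Idealized comparison: a constant-power load (motor doing fixed work) draws
# less current as the voltage rises, while a fixed resistance (heater) draws more.
def constant_power_current(power_w: float, voltage_v: float) -> float:
    return power_w / voltage_v          # I = P / V

def resistive_current(resistance_ohm: float, voltage_v: float) -> float:
    return voltage_v / resistance_ohm   # I = V / R

for v in (105.0, 126.0):
    i_motor = constant_power_current(1000.0, v)   # assumed 1 kW constant-power load
    i_heater = resistive_current(14.4, v)         # assumed 14.4 ohm heater (~1 kW at 120 V)
    print(f"{v:5.1f} V: constant-power load {i_motor:.2f} A, resistive load {i_heater:.2f} A")
```

Real induction motors are not perfectly constant-power loads, but the direction of the effect is the same as in this sketch.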


0

Simply put, an "Amp.Hour" is not a scientific unit or an SI unit. Amp.hr is a rating that battery manufacturers use, but because one ampere = one coulomb per second, multiplying by one hour makes the two time factors cancel, and the result is simply 1 Amp.Hr = 3600 coulombs of charge, with no time factor involved. So it's a bit of smoke and mirrors from the battery manufacturers. If you want to really know how your battery is going to perform, you will have to look a little deeper than taking the word of the sales people...!
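Spelled out as unit algebra (nothing here beyond what the answer states):

```python
# 1 A = 1 C/s, so 1 Ah = 1 C/s * 3600 s = 3600 C; the time units cancel.
def amp_hours_to_coulombs(amp_hours: float) -> float:
    return amp_hours * 3600.0

print(f"1.0 Ah = {amp_hours_to_coulombs(1.0):.0f} C")   # 3600 C
print(f"2.0 Ah = {amp_hours_to_coulombs(2.0):.0f} C")   # 7200 C
```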


The amp-hour is scientific in that both the ampere and the hour are scientifically defined. The amp-hour is in very common use in engineering. Note that the question has an accepted answer from seven years ago.
Transistor
Licensed under cc by-sa 3.0 with attribution required.