Ask Jonathan Koomey About 'Koomey's Law'

A few weeks back, we posted a story here that described Koomey's Law, which (in the spirit of Moore's Law) identifies a long-standing trend in computer technology. While Moore's prediction centers on the transistor density of microprocessors, Jonathan Koomey focuses instead on computing efficiency — in a nutshell, computing power per watt, rather than only per square nanometer. In particular, he asserts that the energy efficiency of computing doubles every 1.5 years. (He points out that calling this a "law" isn't his idea, or his doing — but it's sure a catchy turn of phrase.) Koomey has agreed to respond to your questions about his research and conclusions in the world of computing efficiency. Please observe the Slashdot interview guidelines: ask as many questions as you want, but please keep them to one per comment.
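For a concrete sense of what that doubling time implies, here is a minimal sketch in Python; the 2010 baseline of 1.0 is an illustrative normalization, not one of Koomey's measured data points:

```python
# Koomey's trend: computing efficiency (computations per unit of energy)
# doubles roughly every 1.5 years. Baseline normalized to 1.0 in 2010,
# for illustration only.

def relative_efficiency(year, base_year=2010, doubling_years=1.5):
    """Efficiency relative to the base year, per the doubling rule."""
    return 2 ** ((year - base_year) / doubling_years)

for year in (2010, 2013, 2016, 2019):
    print(f"{year}: {relative_efficiency(year):6.1f}x the 2010 baseline")
```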
  • by eldavojohn ( 898314 ) * on Monday September 26, 2011 @10:04AM (#37515746) Journal
    What is your take on futurists -- like Raymond Kurzweil -- who extrapolate these 'laws' out to extreme distances [wikipedia.org]?
  • by Anonymous Coward

    Find an arbitrary pattern or trend, then name it after yourself.

  • by PPH ( 736903 ) on Monday September 26, 2011 @10:16AM (#37515940)

    ... and see where the Babbage Engine [slashdot.org] fits on the curve.

  • This one doesn't seem to have fundamental physical limits, so long as we eventually transition to reversible computing [wikipedia.org], in which the computer does not use up useful energy because every process it uses is fully reversible (i.e. the original state could be inferred).

    All the limits on computation (except regarding storage) that you hear about (e.g. the Landauer limit) are on irreversible computing, which is how current architecture works. It is the irreversibility of an operation that causes it to increase entropy.
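    For concreteness, a quick sketch of the Landauer bound mentioned above; the temperature is an assumed room-temperature value:

```python
import math

# Landauer's principle: irreversibly erasing one bit dissipates at least
# k_B * T * ln(2) of energy. Reversible computing avoids this floor
# because no information is destroyed.
k_B = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0           # assumed room temperature, K

e_per_bit = k_B * T * math.log(2)
print(f"Landauer limit at {T:.0f} K: {e_per_bit:.2e} J per erased bit")
# ~2.87e-21 J, many orders of magnitude below the switching energy of
# present-day transistors.
```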

    • by DriedClexler ( 814907 ) on Monday September 26, 2011 @10:29AM (#37516102)

      Sorry, the question there wasn't clear. How about "Could the whole process be bypassed by the near-infinite efficiency of reversible computers?"

    • All the limits on computation (except regarding storage) that you hear about (e.g. Landauer limit) are on irreversible computing, which is how current architecture works. It is the irreversibility of an operation that causes it to increase entropy.

      No. The loss of power as heat has nothing whatsoever to do with the reversibility of computing operations.

      Power is lost (and thermal entropy increased) because of the electrical resistivity of the materials from which our CPUs are made.

      If you discover a room-temperature superconductor, please let the rest of the world know, because researchers haven't found one yet. Bonus points if you can fabricate a modern CPU using that superconducting material.
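      To make the resistive-loss point concrete, a back-of-the-envelope sketch; the current and resistance figures are made-up illustrative values, not measurements of any real CPU:

```python
# Joule heating in a resistive conductor: P = I^2 * R.
# Both numbers below are illustrative placeholders.
current = 50.0     # amperes drawn by a hypothetical CPU power rail
resistance = 0.02  # ohms of effective resistance in that path

heat = current ** 2 * resistance
print(f"Dissipated as heat: {heat:.0f} W")  # 50 W for these numbers
# With a room-temperature superconductor, R -> 0 and this term vanishes,
# though switching losses in the transistors themselves would remain.
```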

      • Power is lost (and thermal entropy increased) because of the electrical resistivity of the materials from which our CPUs are made.

        Right, like I said: it produces entropy because you are using an irreversible process to do it -- in this case, a wire that heats up.

        Yes, it's *difficult* right now to get the computation speeds we want using only reversible processes. I didn't intend to suggest otherwise.

  • A lot of consumer-grade machines have begun focusing on multicore chips at a lower frequency to provide the same or better perceived computing performance than a high-frequency single-core chip. What happens when a technology like this subverts our craving for higher transistor density [wikipedia.org]? Can you argue that your "law" is immune to researchers focusing on some hot new technology like a thousand-core processor [slashdot.org] or a beefed-up system on a chip in order to improve end-user experience over pure algorithm crunching?
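    A sketch of the trade-off the question gestures at, using the standard CMOS dynamic-power approximation P ≈ C·V²·f; all numbers are illustrative, not measurements of any real chip:

```python
# CMOS dynamic power scales roughly as P ~ C * V^2 * f, and running at a
# lower frequency usually permits a lower voltage as well. Illustrative
# numbers only.

def dynamic_power(c_eff, voltage, freq_hz):
    """Approximate switching power of one core."""
    return c_eff * voltage ** 2 * freq_hz

C_EFF = 1e-9  # effective switched capacitance in farads (assumed)

one_fast = dynamic_power(C_EFF, 1.2, 3.0e9)      # one core at 3.0 GHz
two_slow = 2 * dynamic_power(C_EFF, 0.9, 1.5e9)  # two cores at 1.5 GHz

print(f"1 core  @ 3.0 GHz: {one_fast:.2f} W")   # ~4.32 W
print(f"2 cores @ 1.5 GHz: {two_slow:.2f} W")   # ~2.43 W
# Same nominal aggregate clock, roughly half the power -- provided the
# workload actually parallelizes across both cores.
```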
    • by LWATCDR ( 28044 )

      I do not understand how multi-core subverts our craving for transistor density. You kind of need that density to increase the core count. The bottleneck comes from the lack of really good tools for parallel programming.

      • >> I do not understand how multi-core subverts our craving for transistor density

        Look, I do understand what you just said, but think with the herd...

        Back in the early '80s there was a phrase used in corporate America: "no one got fired for buying IBM."
        Sometime in the mid-to-late '80s the 80386 came out, and corporate America wanted those PCs before they were even on the market.
        Back then it was a MHz race.

        People will no longer brag about speed; they will brag about core count, with the thinking that more cores means a better machine.

        • by LWATCDR ( 28044 )

          Well, I do feel that we are at that speed threshold. Mobile will drive down power, so we are already there. I am waiting for the next big jump. There is no reason why a Tegra II could not power the average desktop PC.

            • Really? I feel that the speed threshold is very near. It's not that we already use all the power the chip can deliver, but consumers will want all the extra power they can get. I believe we are nearing the start of widespread parallel programming to take advantage of all the extra cores; most likely it will be the audio/visual segment of the market that creates the software for it (game designers, I am willing to bet).

              We have seen this done already by splitting off the graphics to its own dedicated processor.

  • I would like to see not only the Babbage engine on your curve, but also the abacus and slide rule. Maybe the physical rod, too, which used to be used in surveying. (Hey, you try calculating property area using pencil, paper, and a deed.)
    • But... why? Those aren't even remotely comparable. What kind of energy do you suppose would be measured, the amount of effort it takes to groan when someone makes a comment like this that they think is witty?
      • by xkr ( 786629 )
        Not witty at all. Evolution is continuous. For example, one can compare energy costs backwards from nuclear to coal to cutting wood. People use energy; it's not that hard to make an estimate for slide-rule energy costs. There used to be people who were paid to work 8 hours a day doing calculations by hand (including military ballistic tables). Why would *you* assume that tube-based computers are comparable to an iPad? The fact that there are general "rules" that appear to apply across an extremely wide range of technologies is exactly what makes the question worth asking.
        • Koomey's law only relates to the amount of power required to operate an electronic device [technologyreview.com]. The very purpose of laws like Koomey's and Moore's is to describe advances in electronics. While perhaps the amount of energy involved in the unwinding of the mainspring of a mechanical computer can be analysed, I think you'll find that you'd be hard-pressed to get meaningful figures for the energy involved in the operation of an abacus or slide rule, which aren't even complete calculating devices and rely very heavily on the human operator.

          • by dtmos ( 447842 ) *

            They're not actually computing machines.

            Well, not generally programmable computing machines.

            • The difference engine wasn't generally programmable, either, but we can still establish a sense of its power usage based on how much kinetic energy was required to perform its calculations. Automatic mechanical calculators are much closer to being true computers than completely manual devices like the slide rule, the abacus, or a book of trigonometric tables.
            • by xkr ( 786629 )

              OK, I concede they are not programmable. They certainly (in my opinion) should be considered computing machines. However, I left off of my "request list" both programmable analog computers and plug programmable punch card equipment. Today's engineers may laugh, but I was able to do some pretty amazing things with both of those types of hardware. You work with the tools you have.

              I don't know if these fit his proposed curve or not. I would just like to see the result of thinking about that question.

  • OK, J.K., here is a list of moral/ethical implications of the path we're on, as seen in your law. You saw the path clearly enough to define a time-based law. Are there any issues I'm not seeing on our current path?

    1) Lower energy consumption at the point of use
    2) Higher energy consumption at the point of manufacture
    3) Faster CPU = bigger programs = more bugs = lower quality of life
    4) Faster CPU = stronger DRM possibilities
    5) Better processing * battery life = better medical devices
    6) Better processing * battery l

  • Hey J.K., have you run into a law relating battery capacity (either per kg or per L) vs. processor speed over time? I bet there is some kind of interesting curve for mobile devices. Or maybe not; dunno, that's why I'm asking a guy with previous success at data analysis in a closely related field...

  • I have one:

    Gates' Law: "The bloatedness of software keeps pace exactly with the increase in power of hardware, to ensure that no actual improvements occur in the end user experience."

  • by yakovlev ( 210738 ) on Monday September 26, 2011 @01:03PM (#37517960) Homepage
    Mr. Koomey, if we take your numbers from the attached article, which may not have been quoted correctly...

    Feynman indicated that there was approximately 100 billion times efficiency improvement possible, and 40,000 times improvement has happened so far.

    If we take Feynman's number at face value, this means that if computing efficiency improvements continue at the current rate (doubling every 18 months), we will reach the theoretical maximum in 2043.
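    A quick check of that arithmetic, taking the quoted 40,000x and 100-billion-x figures at face value:

```python
import math

remaining = 100e9 / 40_000        # factor of improvement still available
doublings = math.log2(remaining)  # doublings needed to exhaust it
years = doublings * 1.5           # at one doubling per 18 months

print(f"{doublings:.1f} doublings left, about {years:.0f} years")
print(f"Theoretical ceiling reached around {2011 + years:.0f}")
# ~21.3 doublings, ~32 years -> roughly 2043, matching the estimate above.
```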

    Based on that, do you believe that we will see a dramatic reduction in efficiency improvements in the next 10-20 years as we approach the theoretical limit, or do you think Feynman was conservative in his estimate?

    Thanks!
