SevenInchScrew
Nov 12, 08:01 PM
We've been given release dates for this game before, so until this game is in my PS3 and I'm actually playing it, I don't believe them. ;)
It will be nice to actually USE my PS3 again, though, so bring it on, Sony.
Drew n macs
Apr 7, 10:40 PM
On topic, I called Best Buy and was told that unless I pre-ordered before the day of the sale, I could not get an iPad 2. My co-worker walked in last week off the street and purchased one. Why the inconsistent message? I don't get it.
The same thing happened to me at Best Buy. Inventory showed they had iPads available, so I went to the store, and none were available. I called a couple of hours later and they said they had the 32GB available, so I trekked back to BB, and by the time I got there they were all gone. Interesting; I don't know what to believe.
faroZ06
Apr 27, 08:54 AM
http://online.wsj.com/article/SB10001424052748704123204576283580249161342.html
Ah, I see. I wasn't checking the WSJ, only Macrumors.
Bill McEnaney
Apr 29, 01:04 PM
Would you start a new thread about this please? You've really taken this off course.
As to your second point, it's pointless. I called you out on your assertion that liberals do more of the name calling.
I'll start a new thread. I wasn't talking about liberals in general. I said that most of the name-callers I knew of were liberals.
zelet
Aug 25, 04:10 PM
I'm sure I'll get a lot of "pro-Apple Kool-Aid drinker" attacks from this, but this doesn't make me any less of an Apple enthusiast. - iMikeT
Amen to that! I'm a huge fan of Apple, but I won't let them polish a turd and tell me it's a diamond.
zac4mac
Nov 29, 12:47 PM
I also wanted to add... go onto UNIVERSAL MUSIC GROUP (http://new.umusic.com/flash.aspx) and see how many groups you would be missing if ITUNES didn't offer Universal.
If you need "98 DEGREES" on your iPOD, then you better start freaking out...
Otherwise, don't sweat it. Universal has nothing to threaten Apple with. No worries here.
I went there, made it through the "D"s, and came up with these artists in my digital collection:
Aaron Neville
Al Jarreau
BB King
Big Bad Voodoo Daddy
Blues Traveller
Bob Marley and the Wailers
Cardigans
Counting Crows
Cowboy Mouth
Cranberries
David Benoit
Def Leppard
Del Amitri
There's a boatload more there; yes, they're a BIG label.
Z
Eidorian
Jul 27, 12:07 PM
Hate to be repetitive, but this tells me what I already know.
But the question comes with this line:
Does anyone know if the chips that are actually shipping are the same as the prototype chips?
Again, sorry for the repetition, but I'd really like to drop one of these in my mini and don't want to find out AFTER I tear apart the mini that the new chips won't fit! http://www.pcper.com/article.php?aid=276&type=expert&pid=3
There might be a voltage issue. It will fit though.
Tom359
Apr 25, 04:14 PM
This is why we need a "loser pays" system.
This would get rid of the "I'm going to sue you so you pay money to go away, because it's cheaper than paying the legal bills" tactic. Our system has been corrupted by these nuisance lawsuits.
Vulpinemac
Apr 19, 09:07 PM
Yes. People here are failing to understand the difference between the traditional patents we usually hear about here and design patents. I believe what Apple is suing over is infringed design patents. That the Galaxy S has an icon grid method for selecting applications is irrelevant in that case. They tried to copy the general design and likeness of the iPhone, which is against the design patents.
Also, whoever it was arguing it previously... Let's not trot out the whole "Apple lost the 'look and feel' argument against Microsoft" thing. That was a different case. Design patents still get filed and granted all the time. This is a new case.
To clarify even further, the Microsoft "look and feel" lawsuit was a breach-of-copyright suit that Apple lost, not a patent suit. Apple took to patenting its 'look and feel' in order to have a more solid foundation on which to base future lawsuits.
fenderbass146
Apr 8, 12:56 AM
I agree, this rumor is sketchy. It looks like they have one unreliable source. Still, I don't see why BB is good for Apple stuff unless the Apple store is too crowded.
I agree; if I'm shopping for Apple stuff, I'd prefer an Apple Store. However, there is a Best Buy everywhere. I live in northwest Indiana, the nearest Apple Store is 40 minutes away, and I'm sure a lot of people have it worse. It would be absolutely idiotic of Apple to quit supplying Best Buy, because Best Buy reaches more people than Apple does.
infidel69
Apr 11, 11:33 AM
Big mistake if true.
TripHop
Jun 17, 05:51 AM
West Coast corporate store. At 10 AM Pacific they had white codes to order with, but no orders were possible due to the overload. So my store manager put all the orders on paper and manually placed them with corporate later in the day over the telephone. He thinks he's getting one WHITE 32 for me and will let me know Tuesday when he gets a copy of the shipping manifest. :eek:
Evangelion
Sep 14, 08:56 AM
On the server side.
Plenty of people ran NT on their desktops.
Nevertheless, ok. Windows did it first.
Admission of your mistakes is a good step in becoming a better person.
NoSmokingBandit
Aug 19, 02:25 PM
All that I get from that quote is that they are using older models, but that they will, obviously, be rendered in the new GT5 engine. So, the marketing team can say all they want, but actual screen shots of Standard™ cars do not show much improvement, if any at all, resolution increase notwithstanding.
Based on what, old gameplay footage? Games are often tested with old resources while the new models are being built. God of War used a stick man with a sword until they got Kratos done.
Look at this pic:
http://us.gran-turismo.com/c/binary/images/5294/gamescom2010_029a.jpg
That RX-7 looks tons better than anything GT4 ever had, but it's still not as nice as the "premium" cars. I am assuming, of course, that this is live-rendered, and I believe it is due to the jaggies on the rear of the RX-7, which I can't imagine they would let slide in a pre-rendered shot.
Time will tell, of course, but I'm certain they didn't just import models from GT4. What the hell would they have been doing for the past 5 years?
Jimmy James
Apr 6, 02:12 PM
I used to own an iPad 1, gave it away, didn't want an iPad 2. Why do I need two devices of the same OS where the UI was designed for the iPhone (smaller device) to begin with?
As was pointed out by a previous poster, iOS was developed for tablet use.
Perhaps you should own an iPad and an Android phone?
hyperpasta
Nov 28, 06:27 PM
And I don't understand why they should... Can somebody explain it?
The rationale is that iPods are used only for stolen music (which they aren't) and this will help offset the losses (which it won't).
ergle2
Sep 15, 12:50 PM
More pedantic details for those who are interested... :)
NT actually started as OS/2 3.0. Its lead architect was OS guru Dave Cutler, famous for architecting VMS for DEC, and naturally that design influenced NT. And the N-10 (where "NT" comes from: "N-Ten") Intel RISC processor was never intended to be a mainstream product; Dave Cutler insisted that the development team NOT use an x86 processor, to make sure they would have no excuse to fall back on legacy code or thought. In fact, the N-10 build that was the default work environment for the team was never intended to leave the Microsoft campus. NT over its life has run on x86, DEC Alpha, MIPS, PowerPC, Itanium, and x64.
IBM and Microsoft worked together on OS/2 1.0 from 1985-1989. Much maligned, it did suck because it was targeted at the 286, not the 386, but it did break new ground -- preemptive multitasking and an advanced GUI (Presentation Manager). By 1989 they wanted to move on to something that would take advantage of the 386's 32-bit architecture, flat memory model, and virtual machine support. Simultaneously they started OS/2 2.0 (extending the current 16-bit code into a 16/32-bit hybrid) and OS/2 3.0 (a ground-up, platform-independent version). When Windows 3.0 took off in 1990, Microsoft had second thoughts and eventually broke with IBM. OS/2 3.0 became Windows NT -- in the first days of the split, NT still had OS/2 Presentation Manager APIs for its GUI. They ripped those out and created the Win32 APIs. That's also why, to this day, NT/2K/XP support OS/2 command-line applications, and there was also a little-known GUI pack that would support OS/2 1.x GUI applications.
All very true, but beyond that -- if you've ever looked closely at VMS and NT, you'll notice it's a lot more than just "influenced". The core design was pretty much identical -- the way I/O worked, the interrupt handling, the scheduler, and so on -- they're all practically carbon copies. Some of the names changed, but how things work under the hood hadn't. Since then it's evolved, of course, but you'd expect that.
Quite amusing, really... how a heavyweight enterprise-class OS of the 80's became the desktop of the 00's :)
Those who were around in the dim and distant past will recall that VMS and Unix were two of the main competitors in many marketplaces in the 80's and early 90's... and today we have OS X, Linux, FreeBSD, Solaris, etc. vs. XP, W2K3 Server, and (soon) Vista -- kind of ironic, dontcha think? :)
Of course, there's a lot still running VMS to this very day. I don't think HP wants them to tho' -- they just sent all the support to India, apparently, to a team with relatively little experience...
MacRumors
Jul 27, 09:34 AM
http://www.macrumors.com/images/macrumorsthreadlogo.gif (http://www.macrumors.com)
Intel announced (http://www.macworld.com/news/2006/07/27/core2duo/index.php) the long-anticipated Core 2 Duo processors today: 10 new chips, including 5 designed for laptops (Merom) and 5 for desktops (Conroe).
Core 2 Duo runs at slower clock speeds than Pentium-era chips but is still more productive because it handles more calculations per clock cycle, said Sean Tucker, a product manager at HP. Thanks to that slower speed, Core 2 Duo chips need less electricity, drawing just 65 watts compared to the Pentium 4's 95 watts and the Pentium D's 130 watts.
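To make the clock-versus-IPC tradeoff concrete, here is a minimal sketch of the arithmetic; the IPC values below are illustrative assumptions, not published Intel figures:

```python
# Back-of-the-envelope throughput proxy: work per second = clock * IPC.
# The IPC values are assumptions for illustration, not Intel specifications.

def throughput_proxy(clock_ghz: float, ipc: float) -> float:
    """Rough throughput proxy: clock (GHz) times instructions per cycle."""
    return clock_ghz * ipc

pentium4 = throughput_proxy(clock_ghz=3.8, ipc=1.5)   # high clock, low IPC (assumed)
core2duo = throughput_proxy(clock_ghz=2.4, ipc=2.5)   # lower clock, higher IPC (assumed)

print(f"Pentium 4 proxy:  {pentium4:.2f}")  # 5.70
print(f"Core 2 Duo proxy: {core2duo:.2f}")  # 6.00 -- ahead despite the lower clock
```

Under these assumed numbers, the lower-clocked chip comes out ahead, and the lower clock is also what lets it draw 65 watts instead of 95-130.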
Intel has already started shipping Core 2 Duo chips to manufacturers, so the first Core 2 Duo desktop machines should reach consumers in early August, while Core 2 Duo laptops will reach consumers by the end of August.
Conroe and Merom are successors to the Core Duo processor, which was introduced by Intel early this year. The Core Duo (Yonah) was the first Intel chip used in Apple's switch to Intel earlier this year.
At present Apple's lineup is as follows:
Intel: MacBook, MacBook Pro, iMac, Mac mini: Core Duo or Core Solo (Yonah)
PowerPC: PowerMac, Xserve: PowerPC 970 (G5)
Newer processors from Intel sharing a new architecture now include:
Core 2 Duo mobile (Merom)
Core 2 Duo desktop (Conroe)
Xeon 5100 (Woodcrest)
Woodcrest is rumored (http://www.macrumors.com/pages/2006/07/20060711225142.shtml) to be used in the Mac Pro, which is expected to be released at WWDC 2006. Apple's use of the Core 2 Duo is not yet clear, but the Core 2 Duo mobile (Merom) is pin-compatible (http://www.macrumors.com/pages/2006/06/20060613185240.shtml) with the current Core Duo (Yonah). This means that Apple could easily upgrade the existing Intel-based Macs to the newer processor with no design changes.
louis Fashion
Apr 11, 12:01 PM
Hope to see VZ convergence in 2012. Hate to wait tho.....
notabadname
Mar 22, 04:06 PM
It's simple: Apple is always behind hardware-wise because they like to prioritize aesthetics and appearance.
Android phones are selling more than iPhone.
I've only bought the first iPad because there were no competitors at that time (and I hate netbooks), but now things are different. To be honest, A LOT different.
1st point: It's factually inaccurate to make your first statement, as evidenced by your last statement. Kind of funny, don't you think?
In your second statement, you are comparing all Android-running phones to a single model/product line, the iPhone. Each iPhone generation has outsold, over its life, any single phone model (generation) offered by any other hardware manufacturer.
Your comparison is like saying Toyota has sold more cars than Ford has sold F-150s. That may be true, but the F-150 is still the number one selling truck in the US, even though it does not outsell the sum total of all other trucks by all other manufacturers.
You should compare a single phone model, say the Motorola Droid or HTC Incredible. You are simply talking software. Apple is primarily a hardware company that happens to make the software for its hardware (yes, I know about FCP and other software). They do not license iOS to other manufacturers, so a comparison to Google's OS and the number of DIFFERENT phones it runs on is really irrelevant to whether any hardware manufacturer has had a more successful phone than the iPhone.
Dark K
Jun 19, 03:29 PM
If anyone can answer me this question, it would be most appreciated :D
Does anyone know how many iPhone 4s Radioshack will be getting apart from those that they "reserved"?
studiomusic
Nov 29, 12:08 PM
Does she appear on emusic?
Why yes, she does!
Got a few people from the SLC here I see...
shamino
Jul 22, 12:18 PM
So I read in this thread that Kentsfield and Clovertown ARE compatible with Conroe and Woodcrest sockets (respectively) (Cloverton or Clovertown?)
Well, people here have mentioned it. I haven't seen any sources for these claims, however.
It's worth noting that the Pentium 4 shipped in several different socket packages over the years. The fact that the cores might be electrically compatible does not necessarily mean you're going to be able to perform a chip-swap upgrade on your Mac!
Hope for upgrading an iMac to Quad Core is kindled! At least if Apple releases Conroe iMacs.
And assuming they don't solder the chip to the motherboard, or hardwire the clock-multiplier chips, or hard-wire the voltage regulator settings, etc.
There are a lot of things that can be done to a motherboard to make these kinds of upgrades painful or even impossible.
With any kind of rumor like this, "I'll believe it when I see it" should be your mantra. Sure, these kinds of upgrades would be great, and it may even be possible to perform them on generic PC motherboards, but this doesn't necessarily mean it will be easy or even possible on the systems Apple ends up shipping.
BTW, in my opinion, one thing a person should never, ever say is that some computer has too much power and that it will never be needed.
"Never" is always too strong a word. But there are plenty of good reasons to say "useless for today's applications" or "not worth the cost".
When applications start demanding more, and when costs come down, then the equations change. As they always do.
When we will be able to download our entire lives, and even consciousness, into a computer, as is said to happen in about 40 years (very much looking forward to it)...
You're looking forward to this? Let's hope for your sake that Microsoft has nothing to do with the system software.
I don't think it will be possible, even in 40 years, despite what sci-fi authors are predicting. And there's no way I'd ever have such a system installed even if it did become possible. The possibility of dying or becoming comatose, or even worse, as a result of a software glitch is something I'm not going to allow. To quote McCoy from Star Trek: "Let's see how it scrambles your molecules first."
So as a conclusion to my most recent rant: please, never tell me a computer is too powerful, has too many cores, or has too much storage capacity. If it is there to be used, it will be used. It always is.
But do you want to be the first person to have to pay for it?
Hellhammer
Apr 8, 09:01 AM
The trouble is, I find the TDP numbers for Sandy Bridge very misleading. For example, the previous 2.66GHz dual-core i7 had a TDP of 35W and the current 2.2GHz quad-core i7 has a TDP of 45W. Theoretically, it should only use 10W more during CPU-intensive tasks, but according to AnandTech, who measured it, the quad-core Sandy Bridge i7 was using almost 40W more when running Cinebench.
http://www.anandtech.com/show/4205/the-macbook-pro-review-13-and-15-inch-2011-brings-sandy-bridge/14
It just doesn't make any sense. Going by those figures, if the i7 dual core was 35W, the i7 Sandy Bridge quad core would be around 70W.
Not sure how this relates to potential MacBook Air Sandy Bridge processors, but keep in mind... there must be a reason why Samsung went for the ULV processor in their 13" laptop instead of the LV one.
The CPU isn't the only thing that changed. The AMD 6750M (~30W) has a higher TDP than the NVIDIA GT 330M (~23W). I had to put ~ because their TDPs are not officially stated by AMD or NVIDIA, so they're just estimates based on previous GPUs and their TDPs. The point is that the AMD 6750M has a higher TDP.
There is also another thing: TDP is not the maximum power draw. Maximum power dissipation is usually 20-30% more than the stated TDP. While MPD is rarely reached, since it requires maximum voltage and temperature, it can (nearly) be reached with heavy benchmarking applications.
For example, the combined TDP of the quad-core SB and the AMD 6750M is 75W. If we take 20% extra as the MPD, that is 90W, just from the CPU and GPU! Of course those parts are not using 90W in that test, because things like the screen, HD, RAM, etc. need power too. Since MPD scales as a percentage of TDP, that can explain why the difference is so big in watts.
40W sounds a bit too much to explain with MPD though. IIRC the GT 330M is underclocked but I'm not 100% sure. You have a valid point that the SBs may be using more power than their predecessors. To make this more accurate, we should compare them with C2Ds though ;)
I guess we will have to wait and see, but an ULV in 13" would be more than a disappointment.
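For anyone following along, here is a minimal sketch of the TDP-versus-MPD arithmetic above; the 20% margin and the ~30W GPU figure are the post's assumptions, not vendor specifications:

```python
# Sketch of the TDP-vs-MPD estimate discussed above. The 20% margin and the
# ~30W GPU figure are assumptions from the post, not official numbers.

def estimated_mpd(tdp_watts: float, margin: float = 0.20) -> float:
    """Estimate maximum power dissipation as TDP plus a fixed margin."""
    return tdp_watts * (1.0 + margin)

cpu_tdp = 45.0  # quad-core Sandy Bridge i7, stated TDP
gpu_tdp = 30.0  # AMD 6750M, approximate (not officially published)

combined = cpu_tdp + gpu_tdp
print(f"Combined TDP:  {combined:.0f} W")                 # 75 W
print(f"Estimated MPD: {estimated_mpd(combined):.0f} W")  # 90 W
```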