View Full Version : Intel Broadwell
Timid
01-17-2013, 01:06 PM
Hi,
as you all know, there have been some rumors that Intel is going to kill the desktop by not offering socketed CPUs. Semi Accurate also had an article about it, claiming that there won't be any desktop Broadwell, but instead only an improved Haswell (Broadwell only for mobile applications).
But according to these sites, there will be a socketable Broadwell CPU for desktop:
http://www.techpowerup.com/177817/Intel-Haswell-and-Broadwell-Silicon-Variants-Detailed.html
http://gamingio.com/2012/12/intel-haswell-and-broadwell-processors-detailed/
So what's true now?
I'm also a little bit puzzled since Semi Accurate also had an article about how Intel could kill dedicated GPUs. It was said that Broadwell would only offer 4 PCIe 2.0 lanes.
So if there will be a socketable Quad-Core Broadwell for Desktop, will it also only support PCIe 2.0 x4?
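For scale, the gap that worry implies can be estimated from the per-lane signaling rates (theoretical one-direction maxima, ignoring protocol overhead; a back-of-the-envelope sketch, not official figures):

```python
def pcie_bandwidth_mb_s(gt_per_s, payload_bits, line_bits, lanes):
    """Approximate one-direction PCIe bandwidth in MB/s (1 MB = 1e6 bytes)."""
    bits_per_s = gt_per_s * 1e9 * payload_bits / line_bits
    return bits_per_s / 8 / 1e6 * lanes

# PCIe 2.0: 5 GT/s per lane, 8b/10b line coding -> 500 MB/s per lane
print(pcie_bandwidth_mb_s(5, 8, 10, 4))             # 2000.0 MB/s for x4
# PCIe 3.0: 8 GT/s per lane, 128b/130b coding -> ~985 MB/s per lane
print(round(pcie_bandwidth_mb_s(8, 128, 130, 16)))  # 15754 MB/s for x16
```

So a Gen2 x4 link would offer roughly an eighth of the bandwidth a Gen3 x16 slot gives a discrete GPU, which is why the rumor reads as "killing dedicated GPUs".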
Joe212
01-17-2013, 03:58 PM
Hi,
I'm also a little bit puzzled since Semi Accurate also had an article about how Intel could kill dedicated GPUs. It was said that Broadwell would only offer 4 PCIe 2.0 lanes.
So if there will be a socketable Quad-Core Broadwell for Desktop, will it also only support PCIe 2.0 x4?
I would highly doubt that limit on LGA1150 desktop versions. I think it will support as many lanes as Haswell, and as many as the socket supports.
The BGA package will likely be scaled down for laptops and NUC types, which don't allow full PCIe expansion in the first place, and maybe that BGA package / pinout only supports 4 PCIe lanes.
And, there will likely be a high end Broadwell-E desktop at some point, sharing socket with servers, that will definitely need all the PCIe lanes that servers need.
In a story that SemiAccurate has been following for several months, Broadwell will not come in an LGA package, so no removable CPU.
Any LGA, consumer or business, for desktop/mobile = non-story
charlie
01-17-2013, 10:01 PM
Hi,
as you all know, there have been some rumors that Intel is going to kill the desktop by not offering socketed CPUs. Semi Accurate also had an article about it, claiming that there won't be any desktop Broadwell, but instead only an improved Haswell (Broadwell only for mobile applications).
But according to these sites, there will be a socketable Broadwell CPU for desktop:
http://www.techpowerup.com/177817/Intel-Haswell-and-Broadwell-Silicon-Variants-Detailed.html
http://gamingio.com/2012/12/intel-haswell-and-broadwell-processors-detailed/
So what's true now?
I'm also a little bit puzzled since Semi Accurate also had an article about how Intel could kill dedicated GPUs. It was said that Broadwell would only offer 4 PCIe 2.0 lanes.
So if there will be a socketable Quad-Core Broadwell for Desktop, will it also only support PCIe 2.0 x4?
You notice they all copied a chart from someone Spanish, and no one has any real info. That should tell you something.
-Charlie
dragontamer5788
01-18-2013, 01:24 AM
At the risk of shaving too far with Occam's Razor, what about the possibility that Broadwell is going to be only a Laptop / Tablet part?
Fewer PCIe lanes, no socket, relatively powerful integrated GPU... Intel seems to be moving away from a classic PC design.
Drunkenmaster
01-18-2013, 01:56 AM
The question marks next to the names of the Broadwell chips fill you with confidence in the graph... no really ;)
Timid
01-18-2013, 05:30 AM
You notice they all copied a chart from someone Spanish, and no one has any real info. That should tell you something.
-Charlie
Another question: You wrote that Intel is likely to bring out socketed processors with the Skylake architecture again.
Do you believe that Intel is going to offer the full amount of PCIe 3.0 with Skylake, or will they be already using this CPU generation to kill dedicated GPUs from the desktop market (and only offer a limited amount of PCIe lanes - e.g. PCIe 2.0 x4)?
charlie
01-18-2013, 11:09 AM
At the risk of shaving too far with Occam's Razor, what about the possibility that Broadwell is going to be only a Laptop / Tablet part?
Fewer PCIe lanes, no socket, relatively powerful integrated GPU... Intel seems to be moving away from a classic PC design.
Nearly 100%. There are AIW/embedded versions though.
-Charlie
Timid
01-18-2013, 01:13 PM
Nearly 100%. There are AIW/embedded versions though.
-Charlie
Is it possible that this LGA Broadwell for Desktop in those slides is in reality just a renamed Haswell, just like there are renamed GPUs all of the time? (as you said, Intel might bring out Haswell-CPUs for PC with slightly increased clock for Desktop)
Could you also answer my previous question: You wrote that Intel is likely to bring out socketed processors with the Skylake architecture again.
Do you believe that Intel is going to offer the full amount of PCIe 3.0 with Skylake, or will they be already using this CPU generation to kill dedicated GPUs from the desktop market (and only offer a limited amount of PCIe lanes - e.g. PCIe 2.0 x4)?
sdlvx
01-18-2013, 02:00 PM
Is it possible that this LGA Broadwell for Desktop in those slides is in reality just a renamed Haswell, just like there are renamed GPUs all of the time? (as you said, Intel might bring out Haswell-CPUs for PC with slightly increased clock for Desktop)
Could you also answer my previous question: You wrote that Intel is likely to bring out socketed processors with the Skylake architecture again.
Do you believe that Intel is going to offer the full amount of PCIe 3.0 with Skylake, or will they be already using this CPU generation to kill dedicated GPUs from the desktop market (and only offer a limited amount of PCIe lanes - e.g. PCIe 2.0 x4)?
Yes
http://money.cnn.com/2013/01/17/technology/enterprise/intel-earnings/index.html
I remember getting laughed at for saying Intel would be in trouble. I forget who, but someone owes me an apology.
Alexko
01-18-2013, 03:10 PM
They still made a very comfortable amount of profit.
So did HTC a couple of years ago. Things can change.
-dan
Timid
01-18-2013, 03:21 PM
Yes
http://money.cnn.com/2013/01/17/technology/enterprise/intel-earnings/index.html
I remember getting laughed at for saying Intel would be in trouble. I forget who, but someone owes me an apology.
I'm not quite sure which one of my two questions you wanted to answer :)
Anyway, these were my questions:
1: Is it possible that this LGA Broadwell for Desktop in those slides (those which I linked to in my first post) is in reality just a renamed Haswell, just like there are renamed GPUs all of the time? (as Semi Accurate wrote, Intel might bring out Haswell-CPUs for PC with slightly increased clock for Desktop)
2: Semi Accurate wrote that Intel is likely to bring out socketed processors with the Skylake architecture again.
Will Intel offer the full amount of PCIe 3.0 with Skylake, or will they be already using this CPU generation to kill dedicated GPUs from the desktop market (and only offer a limited amount of PCIe lanes - e.g. PCIe 2.0 x4)?
Hope it's clearer now :)
sdlvx
01-18-2013, 10:12 PM
I'm not quite sure which one of my two questions you wanted to answer :)
Anyway, these were my questions:
1: Is it possible that this LGA Broadwell for Desktop in those slides (those which I linked to in my first post) is in reality just a renamed Haswell, just like there are renamed GPUs all of the time? (as Semi Accurate wrote, Intel might bring out Haswell-CPUs for PC with slightly increased clock for Desktop)
Sorry, I wasn't very clear. Intel will have to do something with the chips that can't be binned for mobile parts. I am guessing Intel will just shovel the reject mobile chips onto desktop.
Intel has the enthusiast and DIY markets nearly conquered right now, yet profits are declining and the market is shrinking. You can't keep serving a dying market on its own and expect profits.
Also, given Intel's history, they're going to go where the money is. Intel has never been a company that keeps doing things for enthusiasts or whatever. If they are losing money on enthusiast platforms, they will scale back.
Look how much goes into the LGA2011 platform and look how few people you see building LGA2011 rigs. We will be running into a time when the enthusiast platform is two generations behind the commoner platform. If there were money in LGA2011 rigs, Intel would be all over it and we'd already have IB-E.
I expect that attitude to trickle down to LGA1150 and I expect BGA to trickle up from the mobile platforms until every DIY guy is coerced into LGA2011. LGA1150 will start to get the new stuff much slower than the mobile platforms, and it will slowly be consumed by LGA2011.
2: Semi Accurate wrote that Intel is likely to bring out socketed processors with the Skylake architecture again.
Will Intel offer the full amount of PCIe 3.0 with Skylake, or will they be already using this CPU generation to kill dedicated GPUs from the desktop market (and only offer a limited amount of PCIe lanes - e.g. PCIe 2.0 x4)?
Hope it's clearer now :)
I'm going to guess that by the time Skylake comes around, Intel will no longer have the platform currently occupied by LGA1155, and all the CPUs that would fall into that category (like the Core i5-3570K and lower) will be BGA.
Skylake will probably be the update to the enthusiast platform (replacing LGA2011), after LGA2011 lives off of IB-E and *maybe* Haswell-E in the meantime.
When Broadwell comes out, I'm assuming Intel is planning on the jump to 14nm eliminating the need to bin reject chips as desktop parts, and the focus on that platform will shrink even more.
Of course, all of this is just speculation. Lots of enthusiast computers are built for gaming, and these rigs are all playing games designed for consoles that are very old. When the new consoles come out, we may find a huge need to upgrade CPUs, and it may lift the market back up. I would imagine the new builds that are getting Phenom II and Core i3 chips will face-plant by that time. However, I don't know how many people who are building these very cheap rigs will be able to jump onto Intel's more expensive platforms if Intel tries to coerce everyone onto LGA2011 or its successor.
All of this is speculation and conjecture. I'm open for criticism but please be nice about it.
Timid
01-19-2013, 05:59 AM
Thanks sdlvx for your answer!
I expect that attitude to trickle down to LGA1150 and I expect BGA to trickle up from the mobile platforms until every DIY guy is coerced into LGA2011. LGA1150 will start to get the new stuff much slower than the mobile platforms, and it will slowly be consumed by LGA2011.
Would it be profitable for Intel to release a Broadwell LGA 1150 because of its smaller (14nm) process and therefore smaller die (less cost per die)?
I'm going to guess that by the time Skylake comes around, Intel will no longer have the platform currently occupied by LGA1155, and all the CPUs that would fall into that category (like the Core i5-3570K and lower) will be BGA.
There was an article about socketed CPUs on Semi Accurate a few weeks ago. As far as I know it was said that Intel would reintroduce socketed CPUs with Skylake, but it wasn't unveiled if this would be LGA 1150 or LGA 2011 :(
sdlvx
01-19-2013, 02:28 PM
Would it be profitable for Intel to release a Broadwell LGA 1150 because of its smaller (14nm) process and therefore smaller die (less cost per die)?
Yes. Intel is more than likely making an absolute killing on i3 dual cores. It's something like a 90mm^2 die, and it's competing (by price) with the AMD FX 4300 and FX 6300, which are much larger dies with parts disabled.
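The cost-per-die intuition can be sketched with the usual first-order dies-per-wafer approximation. The 60 mm^2 figure below is an illustrative assumption for a hypothetical 14nm shrink, and real cost also depends on wafer price and yield, both of which are worse early in a node:

```python
import math

def dies_per_wafer(wafer_diameter_mm, die_area_mm2):
    """First-order dies-per-wafer estimate (ignores yield and scribe lines)."""
    r = wafer_diameter_mm / 2
    return int(math.pi * r**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

# A ~90 mm^2 die vs a hypothetical ~60 mm^2 shrink on a 300 mm wafer:
# more candidate dies per wafer, so lower cost per die *if* wafer cost
# and yield were equal.
print(dies_per_wafer(300, 90))  # ~715
print(dies_per_wafer(300, 60))  # ~1092
```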
I'm personally conflicted because I don't know when Intel is going to push everyone into enthusiast platform or how high up that's going to go. It's possible Intel would just make the i3s.
However, notice how dual-core LGA is missing at Broadwell? It's all part of Intel slowly getting rid of the LGA mainstream socket. I suppose the whole rumor of Intel sockets disappearing was released intentionally to gauge how the public would react and how quickly Intel could transition the mainstream to BGA.
Broadwell LGA is going to exist solely for quad-core Broadwells. How long do you think Intel will keep a platform alive for a single CPU with different bins? Not very long.
I also don't see why Intel would transition the mainstream to BGA and then turn around and go, "oh, we changed our minds! We're bringing back the mainstream socket again!" I definitely think the Skylake socket is a pure enthusiast socket at that point. No dual-core Broadwell on LGA is proof that Intel is heading in that direction.
There was an article about socketed CPUs on Semi Accurate a few weeks ago. As far as I know it was said that Intel would reintroduce socketed CPUs with Skylake, but it wasn't unveiled if this would be LGA 1150 or LGA 2011 :(
It will be LGA 2011 or whatever comes after that, I can promise you that. Those roadmaps point clearly to the fact that Intel is slowly killing mainstream sockets from the bottom up.
charlie
01-20-2013, 01:04 PM
Another question: You wrote that Intel is likely to bring out socketed processors with the Skylake architecture again.
Do you believe that Intel is going to offer the full amount of PCIe 3.0 with Skylake, or will they be already using this CPU generation to kill dedicated GPUs from the desktop market (and only offer a limited amount of PCIe lanes - e.g. PCIe 2.0 x4)?
Last I heard, desktop Sky Lakes were still being debated internally, so this is all predicated on an 'if' scenario. If they do, then yes, it would have some PCIe lanes, 3 or 4 I am not sure, but it would have them. That is when you start talking about bifurcating the low end SKUs again, and then it gets really messy.
-Charlie
charlie
01-20-2013, 01:06 PM
1: Is it possible that this LGA Broadwell for Desktop in those slides (those which I linked to in my first post) is in reality just a renamed Haswell, just like there are renamed GPUs all of the time? (as Semi Accurate wrote, Intel might bring out Haswell-CPUs for PC with slightly increased clock for Desktop)
That's my current guess. I am 100% certain that Intel will put out bumped Haswells in that time period. What they call them is unknown, but they will be haswell or haswell step+1 cores.
-Charlie
Last I heard, desktop Sky Lakes were still being debated internally, so this is all predicated on an 'if' scenario. If they do, then yes, it would have some PCIe lanes, 3 or 4 I am not sure, but it would have them. That is when you start talking about bifurcating the low end SKUs again, and then it gets really messy.
-Charlie
This makes the articles about Intel going BGA / "killing off the desktop and GPUs" kind of worthless. So it will be exactly like people thought: a staggered change over 3-4 years. By then no one is going to care about Intel in the consumer market.
Joe212
01-20-2013, 03:04 PM
Anyway, these were my questions:
1: Is it possible that this LGA Broadwell for Desktop in those slides (those which I linked to in my first post) is in reality just a renamed Haswell, just like there are renamed GPUs all of the time? (as Semi Accurate wrote, Intel might bring out Haswell-CPUs for PC with slightly increased clock for Desktop)
Broadwell is a 14nm shrink of Haswell. It will enter production in the 2nd half of 2013, with release probably in the 1st half of 2014.
I am not sure what you mean by renamed. There will most likely be no changes to the CPU core from Haswell to Broadwell - if that is what you mean by renamed. There may be changes to the graphics part of the die, since the shrink will allow more GPU resources on the die.
As far as clock speed goes, the Ivy Bridge shrink brought pretty much nothing in terms of clock speed scaling - so far. We will see what Ivy Bridge-E brings, but probably not a lot.
As far as absolute (single-thread) performance increases go, the Broadwell shrink will likely not bring any more than Haswell, so if you are looking for single-thread CPU performance increases, Haswell will be it for a while...
With the Broadwell die shrink, I would expect Intel to invest a big portion of the die savings into GPU resources and performance. With Bay Trail (the next Atom), Intel will have a CPU that targets low power budgets. Does Broadwell need to go there as well? More likely, Intel will be targeting the GPU performance levels of Kaveri with Broadwell at more mainstream power budgets... Just my opinion...
charlie
01-21-2013, 10:52 AM
This makes the articles about Intel going BGA / "killing off the desktop and GPUs" kind of worthless. So it will be exactly like people thought: a staggered change over 3-4 years. By then no one is going to care about Intel in the consumer market.
I disagree, but feel free to disregard that. We shall see.
-Charlie
Timid
01-24-2013, 05:50 AM
Last I heard, desktop Sky Lakes were still being debated internally, so this is all predicated on an 'if' scenario. If they do, then yes, it would have some PCIe lanes, 3 or 4 I am not sure, but it would have them. That is when you start talking about bifurcating the low end SKUs again, and then it gets really messy.
-Charlie
PCIe 2.0?
And are you talking about LGA 1150 (or whatever its successor will be called) or LGA 2011?
And what about servers? Will they have enough PCIe lanes? And if not: how exactly would Intel enable their customers to plug in one of their Larrabees/Phis?
Copper
01-24-2013, 10:10 AM
Servers are always a different ballgame.
charlie
01-24-2013, 11:39 AM
PCIe 2.0?
And are you talking about LGA 1150 (or whatever its successor will be called) or LGA 2011?
And what about servers? Will they have enough PCIe lanes? And if not: how exactly would Intel enable their customers to plug in one of their Larrabees/Phis?
It won't be called anything because it won't exist. :P
And as Copper said, this is desktop, not server.
-Charlie
EvilBlitz
01-24-2013, 03:28 PM
So if servers will have the extra PCIe lanes, what is stopping someone from taking a server chipset and making a desktop board out of it? It's already been done with Skulltrail.
Might be pricey for the low end, but I could see an enthusiast board.
Edit. Does this change mean they will be killing off the K series CPUs as well?
rich wargo
01-24-2013, 03:55 PM
Sure, and I could purchase an IBM zSeries mainframe and use it to run games on, too. Or maybe one of those "old" DOE supercomputers. Helluva gaming system.
Besides, what are you going to do once TinyFlaccid decides to "fix" consumer Windoze so it won't run on server CPUs? You'd have to buy Windoze Server, which, lo and behold, won't support GPUs. I see that coming. Windoze becomes even more bifurcated. Actually trifurcated: one Windoze for phones/tablets, another Windoze for laptops/desktops, and one for servers, all tailored to their environment with NO crossover.
Megol
01-25-2013, 07:12 AM
Sure, and I could purchase an IBM zSeries mainframe and use it to run games on, too. Or maybe one of those "old" DOE supercomputers. Helluva gaming system.
That's a hell of a strawman you built there.
Hint: servers aren't all super computer level...
Besides, what are you going to do once TinyFlaccid decides to "fix" consumer Windoze so it won't run on server CPUs? You'd have to buy Windoze Server, which, lo and behold, won't support GPUs. I see that coming. Windoze becomes even more bifurcated. Actually trifurcated: one Windoze for phones/tablets, another Windoze for laptops/desktops, and one for servers, all tailored to their environment with NO crossover.
Why would they? They have zero reason not to support workstations _with_ GPUs. Why wouldn't they support GPUs on servers?!?
Do you really think MS would do extra work just to spite people that want powerful computers without any chance of earning extra money?
EvilBlitz
01-25-2013, 10:25 PM
Rich, you don't seem to remember the multiple instances in the past of server/workstation or even mobile parts being used in the enthusiast market when that was not their intended market.
The Opteron 100 series CPUs (you probably helped make quite a few of them) sold like mad to enthusiasts.
That is why I asked if they will be getting rid of K series CPUs as well, as I don't see the point in unlocked CPUs if you are locking down the platform so much.
I would expect Intel to further lock it down or restrict partners etc, not MS. I do not believe the OS side would be an issue.
Joe212
01-27-2013, 02:20 PM
Rich, you don't seem to remember the multiple instances in the past of server/workstation or even mobile parts being used in the enthusiast market when that was not their intended market.
The instances I remember were the result of motherboard companies not providing a full set of drivers for all the OSes, or not supporting, for example, plain un-buffered DRAM in mobos intended for processors that do support un-buffered DRAM.
I never came across an instance where Microsoft would prevent you from installing whatever version of the OS on whatever version of hardware. Microsoft supports developers, and makes it relatively easy for developers to set up a test environment that suits them...
charlie
01-27-2013, 08:03 PM
The instances I remember were the result of motherboard companies not providing a full set of drivers for all the OSes, or not supporting, for example, plain un-buffered DRAM in mobos intended for processors that do support un-buffered DRAM.
I never came across an instance where Microsoft would prevent you from installing whatever version of the OS on whatever version of hardware. Microsoft supports developers, and makes it relatively easy for developers to set up a test environment that suits them...
MS has locked out all other OSes from machines that get Win8 logo'd.
-Charlie
testbug00
01-27-2013, 09:42 PM
MS has locked out all other OSes from machines that get Win8 logo'd.
-Charlie
Ah! That must be why Lenovo is still selling so many ThinkPads (all of them, currently?) with Windows 7 by default :)...
I love my T61... if only it didn't have the Quadro NVS 140M driver/overheat clusterfk :(
tgrech
01-28-2013, 11:29 AM
MS has locked out all other OSes from machines that get Win8 logo'd.
-Charlie
I thought a workaround was made for installing Linux on Windows 8 UEFI Secure Boot PCs, with an official one in the works, and installs of older Windows versions are fine? Obviously there are a few more OSes in circulation that aren't Windows NT or Linux based and won't pass Secure Boot, but I doubt the type of people who need/use an OS like that would be buying a Windows 8 PC first to install it on.
rich wargo
01-28-2013, 11:34 AM
Do you really think MS would do extra work just to spite people that want powerful computers without any chance of earning extra money?
I think TinyFlaccid will do whatever it can get away with to maximize their profit. If that means having multiple "versions" of Windoze, they will gladly do that.
It's not a matter of spite, which is an emotional response; it's a matter of profit, which is a business response.
rich wargo
01-28-2013, 11:40 AM
Rich, you don't seem to remember the multiple instances in the past of server/workstation parts or even mobile being used in the enthusiast market when that was not their intended market.
The Opteron 100 series cpus you probably helped make quite a few of sold like mad to enthusiasts.
That is why I asked if they will be getting rid of K series CPUs as well, as I don't see the point in unlocked CPUs if you are locking down the platform so much.
I would expect Intel to further lock it down or restrict partners etc, not MS. I do not believe the OS side would be an issue.
1. I remember the introduction of the IBM OS/360. And pretty much everything since.
2. Nobody cares about enthusiasts; the money is made elsewhere. Who cares what a few maniacs with too much time on their hands do? Businesses are focused on profits.
3. I don't know or care about Intel CPUs, or if they produce unlocked SKUs or not.
4. If TinyFlaccid thought they could get away with it, they would provide a tailored version of Windoze for each Intel SKU, and fix it so that it would only work with that particular SKU. However, TinyFlaccid is wary of FTC and DOJ. Not scared, just wary. Remember, it's all about maximizing profits.
charlie
01-28-2013, 12:40 PM
I thought a workaround was made for installing Linux on Windows 8 UEFI Secure Boot PCs, with an official one in the works, and installs of older Windows versions are fine? Obviously there are a few more OSes in circulation that aren't Windows NT or Linux based and won't pass Secure Boot, but I doubt the type of people who need/use an OS like that would be buying a Windows 8 PC first to install it on.
It needs an MS-signed shim, which they can arbitrarily and retroactively pull at any time. Other OSes operate with their permission now.
-Charlie
Joe212
01-28-2013, 02:25 PM
3. I don't know or care about Intel CPUs, or if they produce unlocked SKUs or not.
Intel does have unlocked CPUs, but is there any point to overclocking?
CPUs have automatic overclocking (Turbo or whatever), and you can set memory speed independently. That's all I need, and I used to overclock.
One way to look at it is that Intel and AMD have brought hassle-free overclocking to the masses...
rich wargo
01-28-2013, 02:27 PM
A reason to overclock is that the interchip links are tied to the CPU base clock, not to the CPU multiplier.
It needs an MS-signed shim, which they can arbitrarily and retroactively pull at any time. Other OSes operate with their permission now.
-Charlie
http://install-climber.blogspot.com/2013/01/windows8-howtodisablesecureboot-bootfromcddvdinstallationmedia.html
You can still disable secure boot.... for now.
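For what it's worth, the current Secure Boot state is visible from a running Linux system via efivarfs. A minimal sketch, assuming the standard efivarfs layout (a 4-byte attributes header before the variable data); the sample byte strings below are illustrative, not read from a real machine:

```python
def secure_boot_enabled(raw: bytes) -> bool:
    """Interpret the raw contents of the SecureBoot efivarfs file.

    Under Linux's efivarfs, each variable file starts with a 4-byte
    (little-endian) attributes field; the variable data follows. For
    SecureBoot the data is a single byte: 1 = enabled, 0 = disabled.
    """
    if len(raw) < 5:
        raise ValueError("unexpectedly short efivar contents")
    return raw[4] == 1

# Sample contents as they might be read from
# /sys/firmware/efi/efivars/SecureBoot-8be4df61-93ca-11d2-aa0d-00e098032b8c
print(secure_boot_enabled(b"\x06\x00\x00\x00\x01"))  # True  (enabled)
print(secure_boot_enabled(b"\x06\x00\x00\x00\x00"))  # False (disabled)
```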
Joe212
01-28-2013, 03:39 PM
A reason to overclock is that the interchip links are tied to the CPU base clock, not to the CPU multiplier.
The main link where an overclock made a difference is CPU -> main memory. That one can now be set independently.
Another, the link to the memory controller, is no longer a bottleneck, since the memory controller is on the CPU die.
So what is left is the link to the South Bridge (no reason to overclock) and PCIe - also no reason to overclock.
What an overclock can give you is the ability to run all cores simultaneously at a higher clock speed. Let's say 3.8 vs. 3.2 GHz. But outside of folding etc., it is not that common for a CPU to have all cores running at 100% utilization. If only one core is at 100%, Turbo may give you as much as an overclock, with no fiddling whatsoever, and complete stability.
Intel and AMD already picked the low-hanging fruit of overclocking and gave it to their customers...
podspi
01-28-2013, 10:29 PM
http://install-climber.blogspot.com/2013/01/windows8-howtodisablesecureboot-bootfromcddvdinstallationmedia.html
You can still disable secure boot.... for now.
Wasn't it actually a requirement (from Microsoft) that Secure Boot on x86 had to have an off switch?
For ARM devices it's locked down, yeah. But that is the norm for ARM devices that aren't a Nexus.
charlie
01-29-2013, 02:06 PM
http://install-climber.blogspot.com/2013/01/windows8-howtodisablesecureboot-bootfromcddvdinstallationmedia.html
You can still disable secure boot.... for now.
Disabling isn't the problem, now, but it could be. The fact that users don't have access to the keys on their own system is the problem.
-Charlie
podspi
01-29-2013, 11:32 PM
Disabling isn't the problem, now, but it could be. The fact that users don't have access to the keys on their own system is the problem.
-Charlie
That doesn't appear to be the case on my ATIV 500t. Can set and export keys.
And the ATIV's BIOS is an abomination, so they can't be the only ones who allow this functionality.
charlie
01-30-2013, 02:04 PM
That doesn't appear to be the case on my ATIV 500t. Can set and export keys.
And the ATIV's BIOS is an abomination, so they can't be the only ones who allow this functionality.
I was told many don't let you, and MS is pressuring hard to end that 'loophole'.
-Charlie
podspi
01-30-2013, 06:34 PM
I was told many don't let you, and MS is pressuring hard to end that 'loophole'.
-Charlie
It wouldn't surprise me. I am very disappointed at the direction the tech industry is heading -- locked down devices and walled-garden app stores :mad:
chipshot64
10-18-2013, 09:22 AM
"The delay of Intel's next-gen Broadwell chip means that PC and Mac upgrade cycles may get longer and longer."
http://news.cnet.com/8301-1001_3-57608046-92/waiting-for-windows-8.1-on-intel-broadwell-second-half-2014/
With AMD Kaveri desktop APUs confirmed shipping around Nov/Dec 2013, and the intro of mobile Kaveris at CES 2014 with shipping in Q1, this is an enormous window for AMD to get widespread OEM wins (hello Apple MacBook, iMac?).
The release of a wide range of Kaveri-compatible FM2+ motherboards was a hint that many OEMs believe in the new APU, and it looks like AMD has the greatest opportunity to dominate from mid-range mobile to desktops in 2014.
nissangtr786
01-17-2014, 10:30 AM
This is Intel's Haswell + Bay Trail NUC boxes. I was surprised that a whole motherboard, CPU, RAM, and HDD or SSD fits in this thing.
http://www.youtube.com/watch?v=iU-cfBgt_JM
NUCs are cute, but do people actually BUY these things? They're always marked up too high for my taste.
Anonaru
01-17-2014, 11:38 AM
It wouldn't surprise me. I am very disappointed at the direction the tech industry is heading -- locked down devices and walled-garden app stores :mad:
This. And it's always met with hate. See how the Kindle opened up to the full Android market :D
Can't wait until our computers are as locked down as pre-smartphone era cell phones.
elemein
01-17-2014, 02:23 PM
NUCs are cute, but do people actually BUY these things? They're always marked up too high for my taste.
I'm sure someone who needs an x86 HTPC is gobbling these up like Tic-Tacs. Sure, there are other boxes of a similar size, like the ZBOX (my dad owns one with an E-350) and the Gigabyte Brix, but at that point they're at a similar price point, and Intel's CPU power would be superior to something like a lower-power Jaguar box.
So there is a market for it, but it's niche.
rich wargo
01-17-2014, 02:28 PM
I'm sure someone who needs an x86 HTPC is gobbling these up like Tic-Tacs.
So are we talkin' 5 vs. 10 people here? Ain't too many folks what want spend all that dosh for an x86 HTPC. People like that generally wait for Fruityco (Apple) and their overpriced toys.
testbug00
01-17-2014, 02:38 PM
NUCs are cute, but do people actually BUY these things? They're always marked up too high for my taste.
I actually know someone who is getting one of these (!)
Although, that is because it is on "sale" (i3 model) which is around $180 right now, and apparently that is the same price they are paying (they live in Poland). (http://www.newegg.com/Product/Product.aspx?Item=N82E16856102001)
They are going to be streaming to it, and they really wanted a solution that they could easily mount to the back of their TV.
They already have a mSATA drive ready, so they just need 2-4 GB of RAM.
-Q
elemein
01-17-2014, 02:41 PM
So are we talkin' 5 vs. 10 people here? Ain't too many folks what want spend all that dosh for an x86 HTPC. People like that generally wait for Fruityco (Apple) and their overpriced toys.
You'd be surprised how many folks would do exactly that. While demand isn't super high, a lot of folks would definitely want a low-power HTPC that is already compatible with their existing software and games. It's really a perk to have x86 hardware with the current ecosystem. Android is nice, but really only for basic streaming.
testbug00
01-17-2014, 02:57 PM
You'd be surprised how many folks would do exactly that. While demand isn't super high, a lot of folks would definitely want a low-power HTPC that is already compatible with their existing software and games. It's really a perk to have x86 hardware with the current ecosystem. Android is nice, but really only for basic streaming.
Yeah, but his point was that a Mac Mini does what the NUC does, except you also get OSX, and don't have to deal with *generally* overpriced parts.
It costs ~380 dollars to get a NUC that is slower (CPU side) than the cheapest Mac Mini (in the US).
~$40 for dual channel RAM (4GB for each)
~$400 for a 480GB hard drive (Mac Mini still has 20GB more)
~$ a few for a power cable for the NUC
Now, you could save ~100 dollars going to an i3 (well, currently with the crazy deal, you can save about $200)
But either way, you are still paying about the same cost as a Mac Mini, and you have:
1. less ports
2. less storage
3. smaller setup
4. integrated power supply (I do not believe the newest Mac Mini has integrated PSU... could be wrong).
5. slower CPU
6. slower GPU
7. CPU + GPU probably throttle when you try to run something that is demanding on both.
Now, if you want approx equal CPU speed (and a faster GPU), you are spending an extra $200+
-Q
elemein
01-17-2014, 03:07 PM
Yeah, but his point was that a mac mini does what the NUC does, except you also get OSX, and don't have to deal with *generally* overpriced parts.
It costs ~380 dollars to get a NUC that is slower (CPU side) than the cheapest Mac Mini (in the US).
~$40 for dual channel RAM (4GB for each)
~$400 for a 480GB hard drive (Mac Mini still has 20GB more)
~$ a few for a power cable for the NUC
Now, you could save ~100 dollars going to an i3 (well, currently with the crazy deal, you can save about $200)
But either way, you are still paying about the same cost as a Mac Mini, and you have:
1. less ports
2. less storage
3. smaller setup
4. integrated power supply (I do not believe the newest Mac Mini has integrated PSU... could be wrong).
5. slower CPU
6. slower GPU
7. CPU + GPU probably throttle when you try to run something that is demanding on both.
Now, if you want approx equal CPU speed (and a faster GPU), you are spending an extra $200+
-Q
That's all very true. I'm not denying that at all and I'm by no means an Intel spokesperson. You could probably get even cheaper than the Mac Mini. I'm just saying HTPC owners or small form factor lovers who dislike Apple or are diehard Intel fans or something surely want it. The niche is small but it's gotta be there, no?
testbug00
01-17-2014, 03:25 PM
That's all very true. I'm not denying that at all and I'm by no means an Intel spokesperson. You could probably get even cheaper than the Mac Mini. I'm just saying HTPC owners or small form factor lovers who dislike Apple or are diehard Intel fans or something surely want it. The niche is small but it's gotta be there, no?
Certainly, I would imagine most people would also use a small mSATA drive + external drive (which could pretty much cut the price to equal).
Of course, I chose the Mac mini because it is basically what Intel is aiming to make with the NUC (I think...)
-Q
laurent
01-18-2014, 04:06 AM
What does this NUC discussion have to do with Broadwell?
elemein
01-18-2014, 09:35 AM
What does this NUC discussion have to do with Broadwell?
We'll probably be seeing a Broadwell NUC in the works at some point, but I can definitely see where this is going a bit off track.
OT:
So does anyone have any idea if Broadwell will be just a die shrink with SoC (but not major core) changes? Or will it be like the transition from Sandy to Ivy where the CPU architecture changed slightly?
testbug00
01-18-2014, 10:26 AM
We'll probably be seeing a Broadwell NUC in the works at some point, but I can definitely see where this is going a bit off track.
OT:
So does anyone have any idea if Broadwell will be just a die shrink with SoC (but not major core) changes? Or will it be like the transition from Sandy to Ivy where the CPU architecture changed slightly?
It *should* be a shrink.
Remember, Intel is (still?) on its Tick Tock model:
Tick = shrink
Tock = new architecture.
Of course, Charlie says differently (about what model they are on) http://semiaccurate.com/2013/01/29/why-intels-tick-tock-model-had-to-die/
-Q
Really haven't heard much about Broadwell other than the iGPU is supposed to get around a 30% uplift.
NTMBK
01-18-2014, 04:26 PM
Really haven't heard much about Broadwell other than the iGPU is supposed to get around a 30% uplift.
The main feature Intel are touting is 30% power reduction, which should put the current i5-4300Y level of performance into fanless tablets.
Vithren
01-18-2014, 05:01 PM
Surface 3 Pro should be a nice thingy. Pricey, but nice.
juanrga
01-18-2014, 07:03 PM
PCIe 2.0?
And are you talking about LGA 1150 (or whatever it's successor will be called) or LGA 2011?
And what about Servers? Will they have enough PCI lanes? And if not: How exactly would Intel enable their customers to plug in one of their Larrabees/Phi?
Is not Intel saying that the PCIe Phi is the past and the socketed Phi the future?
TemplarGR
01-19-2014, 01:17 AM
Really haven't heard much about Broadwell other than the iGPU is supposed to get around a 30% uplift.
Theoretically, a GPU can almost double its performance with each die shrink. But Intel is targeting lower power usage for the mobile market, so I wouldn't be surprised if they didn't upgrade their iGPU at all for Broadwell...
NTMBK
01-19-2014, 04:59 AM
Is not Intel saying that the PCIe Phi is the past and the socketed Phi the future?
I'd expect them to keep pushing both; a PCIe accelerator card is still a good fit for speeding up somebody's workstation, while a standalone socketed version gives maximum density for HPC.
(On a related note, I also expect that NVidia will make a standalone Tesla server driven by the integrated ARM Denver cores, while still selling PCIe graphics cards. This is just pure speculation, mind.)
NTMBK
01-19-2014, 05:00 AM
Theoretically, a GPU can almost double its performance with each die shrink. But Intel is targeting lower power usage for the mobile market, so I wouldn't be surprised if they didn't upgrade their iGPU at all for Broadwell...
Actually, the Broadwell GPU is pretty heavily redesigned. There was a commit log to the Linux open source drivers listing loads of the changes. It's similar in scope to the Sandy Bridge->Ivy Bridge transition.
TemplarGR
01-19-2014, 07:24 AM
Actually, the Broadwell GPU is pretty heavily redesigned. There was a commit log to the Linux open source drivers listing loads of the changes. It's similar in scope to the Sandy Bridge->Ivy Bridge transition.
That is bad news for AMD then...
Vithren
01-19-2014, 07:47 AM
That is bad news for AMD then...
When was the last time any Intel move was good news for AMD? ;)
NTMBK
01-19-2014, 09:08 AM
When was the last time any Intel move was good news for AMD? ;)
Prescott?
/10char
juanrga
01-19-2014, 09:12 AM
I'd expect them to keep pushing both; a PCIe accelerator card is still a good fit for speeding up somebody's workstation, while a standalone socketed version gives maximum density for HPC
(On a related note, I also expect that NVidia will make a standalone Tesla server driven by the integrated ARM Denver cores, while still selling PCIe graphics cards. This is just pure speculation, mind.)
I expect Intel to push the card format for legacy customers before migrating completely to the socket version.* Intel is very clear about what the future is for the Phi
http://cdn3.wccftech.com/wp-content/uploads/2013/11/Xeon-Phi-Knights-Landing-GPU-CPU-Form-Factor-635x358.png
* AMD and Nvidia will do the same with GPUs. Both will migrate to socket by 2018 or so.
elemein
01-19-2014, 09:29 AM
I expect Intel to push the card format for legacy customers before migrating completely to the socket version.* Intel is very clear about what the future is for the Phi
http://cdn3.wccftech.com/wp-content/uploads/2013/11/Xeon-Phi-Knights-Landing-GPU-CPU-Form-Factor-635x358.png
* AMD and Nvidia will do the same with GPUs. Both will migrate to socket by 2018 or so.
Tomorrow? So 14nm CPUs are coming tomorrow?!
Sweet!
;)
sirroman
01-19-2014, 10:14 AM
I expect Intel to push the card format for legacy customers before migrating completely to the socket version.* Intel is very clear about what the future is for the Phi
* AMD and Nvidia will do the same with GPUs. Both will migrate to socket by 2018 or so.
What is your basis for that prediction?
Dahakon
01-19-2014, 10:17 AM
What is your basis for that prediction?
His prediction is somewhat off; AMD has already been doing it since this week, with Kaveri.
sdlvx
01-19-2014, 01:02 PM
What is your basis for that prediction?
I've been saying the same thing as well for a while. The basis is that PCIe simply can't meet the demands of GPGPU and HSA style computers.
The way I have been viewing it, is Intel and AMD have been making APUs for HSA and OpenCL because that's currently the only feasible way to get the CPU and GPU to talk to each other without massive bottlenecks.
A 4ghz QPI link could muster up 40GB/s theoretical performance. That's more than enough to handle DDR3 (at least two channels).
The final specs for PCIe 4.0 aren't due until sometime this year or in 2015. That's a long time to wait just to be able to push HSA and similar things, specifically when we're at a time when Intel is cutting back employees and not moving fabs to new nodes because of lack of demand.
I am expecting both of them to do this. Intel will probably give us Xeon Phi in a socket and I wouldn't be surprised if AMD came out with a better version of hypertransport and revived the PCIe pin compatible HTX slot, but in a far more aggressive form with all their dGPUs supporting HSA and HTX slot. AMD is probably tied a little more than Intel, because AMD would need to keep PCIe compatibility to work with Intel platforms for gamers and stuff. I don't think it'd be feasible for AMD to tell everyone that you need to jump on the AMD platform if you want to play games, unless you go Nvidia + Intel only.
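Those link figures are easy to sanity-check. Here is a back-of-the-envelope sketch of the theoretical peaks, using round illustrative numbers (a 4.0 GHz double-pumped 16-bit QPI link, PCIe 3.0's 128b/130b encoding, dual-channel DDR3-1600); these are assumptions for illustration, not vendor specs, and real links lose a bit more to protocol overhead:

```python
# Theoretical peak bandwidths, in GB/s. Illustrative assumptions, not vendor specs.

def qpi_gbs(clock_ghz=4.0, width_bits=16):
    """QPI is double-pumped: 2 transfers per clock, width_bits per direction."""
    per_direction = clock_ghz * 2 * width_bits / 8  # GB/s, one way
    return per_direction, per_direction * 2         # (one way, both ways)

def pcie3_gbs(lanes=16):
    """PCIe 3.0: 8 GT/s per lane with 128b/130b encoding, 1 bit/transfer/lane."""
    per_lane = 8.0 * (128 / 130) / 8  # ~0.985 GB/s per lane, one way
    return per_lane * lanes

def ddr3_gbs(mts=1600, channels=2):
    """DDR3: 64-bit (8-byte) bus per channel."""
    return mts / 1000 * 8 * channels

one_way, both_ways = qpi_gbs()
print(f"QPI 4.0 GHz, 16-bit: {one_way:.1f} GB/s per direction, {both_ways:.1f} GB/s aggregate")
print(f"PCIe 3.0 x16:        {pcie3_gbs():.2f} GB/s per direction")
print(f"DDR3-1600, 2ch:      {ddr3_gbs():.1f} GB/s")
```

Counted both directions, a 4 GHz QPI link lands around 32 GB/s theoretical, so the ~40 GB/s figure above depends on exactly what clock and overhead bits you count; either way it comfortably covers two channels of DDR3.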
themassau
01-19-2014, 01:35 PM
I am expecting both of them to do this. Intel will probably give us Xeon Phi in a socket and I wouldn't be surprised if AMD came out with a better version of hypertransport and revived the PCIe pin compatible HTX slot, but in a far more aggressive form with all their dGPUs supporting HSA and HTX slot. AMD is probably tied a little more than Intel, because AMD would need to keep PCIe compatibility to work with Intel platforms for gamers and stuff. I don't think it'd be feasible for AMD to tell everyone that you need to jump on the AMD platform if you want to play games, unless you go Nvidia + Intel only.
Maybe they can make it pin compatible and run PCIe on an Intel platform and HTX on an AMD platform? Could they also have a wider slot, like x46 (this probably doesn't help, but still)?
gzubeck
01-19-2014, 01:38 PM
Maybe they can make it pin compatible and run PCIe on an Intel platform and HTX on an AMD platform? Could they also have a wider slot, like x46 (this probably doesn't help, but still)?
You could manufacture an adapter card if they were system compatible except for the slot design.
themassau
01-19-2014, 03:19 PM
You could manufacture an adapter card if they were system compatible except for the slot design.
Why wouldn't it be possible? You have the PCIe x2 part, a little space, then the other x14 part; couldn't you add another space after that and put some extra lanes on it?
juanrga
01-20-2014, 05:08 AM
Tomorrow? So 14nm CPUs are coming tomorrow?!
Sweet!
;)
:D
What is your basis for that prediction?
See below.
I've been saying the same thing as well for a while. The basis is that PCIe simply can't meet the demands of GPGPU and HSA style computers.
The way I have been viewing it, is Intel and AMD have been making APUs for HSA and OpenCL because that's currently the only feasible way to get the CPU and GPU to talk to each other without massive bottlenecks.
A 4ghz QPI link could muster up 40GB/s theoretical performance. That's more than enough to handle DDR3 (at least two channels).
The final specs for PCIe 4.0 aren't due until sometime this year or in 2015. That's a long time to wait just to be able to push HSA and similar things, specifically when we're at a time when Intel is cutting back employees and not moving fabs to new nodes because of lack of demand.
I am expecting both of them to do this. Intel will probably give us Xeon Phi in a socket and I wouldn't be surprised if AMD came out with a better version of hypertransport and revived the PCIe pin compatible HTX slot, but in a far more aggressive form with all their dGPUs supporting HSA and HTX slot. AMD is probably tied a little more than Intel, because AMD would need to keep PCIe compatibility to work with Intel platforms for gamers and stuff. I don't think it'd be feasible for AMD to tell everyone that you need to jump on the AMD platform if you want to play games, unless you go Nvidia + Intel only.
In fact Intel says the socketed version of Knights Landing will be faster than the PCIe card version:
will significantly reduce programming complexity and eliminate 'offloading' of the data, thus improving performance and decreasing latencies caused by memory, PCIe and networking.
http://www.theregister.co.uk/2013/11/19/intel_says_bootable_knights_landing_cpu_will_be_a_game_changer/
However at exascale level the problem is not solved by inventing a faster PCIe bus or a new HTX slot.
During development of exascale-level supercomputers, research has identified that the current supercomputer (CPU+dGPU) architecture doesn't scale up, regardless of the slot used. Even a hypothetical heterogeneous dual-socket configuration with a CPU in one socket and a GPU in the other, both connected by an ultra-fast bus, doesn't work at that scale.
One of the challenges in developing an exascale-level supercomputer is the principle of locality. A CPU+dGPU doesn't satisfy this principle, and that is why both Nvidia and AMD exascale designs are based around APUs/SoCs. The Nvidia design is a 20TFLOP (DP) 300W APU (SoC). AMD proposes a ~10TFLOP 200-250W APU. Both use stacked RAM on die and have 8 CPU cores, and both APUs are scheduled for 2018 or so.
Due to the principle of locality a traditional CPU+dGPU of similar performance, 10--20TFLOP DP, would consume about 10x more power than their APUs. Or alternatively, at the same level of power consumption the APUs would be about 10x faster.
Therefore dGPUs will be crushed both by the low end APUs (cheaper) and by the top end APUs (faster). I repeat: the fastest designs made by both Nvidia and AMD are APUs, not dGPUs.
MNVolts
01-20-2014, 05:37 AM
During development of exascale-level supercomputers, research has identified that the current supercomputer (CPU+dGPU) architecture doesn't scale up, regardless of the slot used. Even a hypothetical heterogeneous dual-socket configuration with a CPU in one socket and a GPU in the other, both connected by an ultra-fast bus, doesn't work at that scale.
It was only a matter of time; trace lengths between components impose more of a penalty than people realize. Direct integration provides the lowest latency possible, and the less time data spends moving around the better. I wouldn't be too surprised to see storage move on die as well a few years down the line. If we don't have on-die memristor-based storage by 2025 I'll eat my own boot.
Theoretically, a GPU can almost double its performance with each die shrink. But Intel is targeting lower power usage for the mobile market, so I wouldn't be surprised if they didn't upgrade their iGPU at all for Broadwell...
As far as power goes, it's more efficient to double the GPU cores and run them slower. The CPU side doesn't really need any modification.
maybe they can make it pin compatible and run PCIe when on an intel platform and HTX when on an AMD platform, could they also have a wider slot like x46 (this probably doesn't help but still)?
They already offer HTX/PCIe compatibility in the same port. It's specified in the latest Steamroller guide.
"HyperTransport3. HyperTransport3 increases the aggregate link bandwidth to a maximum of 25.6 Gbyte/s (16-bit link)."
Which falls between PCIe 3 and 4 speeds, at 12.8 GB/s per direction on a 16-bit link.
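The 25.6 Gbyte/s figure in that quote is the usual aggregate (both directions) number. A quick sketch of the arithmetic, assuming HyperTransport 3.1's nominal 3.2 GHz double-pumped link clock:

```python
# HyperTransport 3.1, 16-bit link: 3.2 GHz clock, double-pumped (2 transfers/clock).
clock_ghz = 3.2
gt_per_s = clock_ghz * 2            # 6.4 GT/s on the wire
per_direction = gt_per_s * 16 / 8   # 16 bits per transfer -> 12.8 GB/s one way
aggregate = per_direction * 2       # 25.6 GB/s, the figure quoted in the guide
print(per_direction, aggregate)
```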
Megol
01-21-2014, 05:27 AM
Sockets don't really scale better than slots, so moving to them wouldn't be significantly better. Multi-chip modules soldered to the motherboard are the only reasonable way to significantly improve the communication bandwidth. An unpopular view on this site, yes, but technically sound.
MCMs can be combined with sockets like IBM have done but the costs are enormous - their MCMs have > 100 interconnection layers and huge amounts of pins. The mainstream market can never pay the costs for doing something similar.
Drashek
01-25-2014, 08:28 AM
What are the chances of Broadwell supporting TrueAudio? I have read that it will feature a DSP for low power playback and voice/speech recognition and they already use Tensilica (Cadence) IP in other processors (http://www.tensilica.com/news/24/330/Tensilica-Confirms-New-Intel-Media-Processor-for-Consumer-Electronics-Devices-Uses-Company-s-HiFi-2-Audio-Processor.htm). If they were going to use Tensilica then it would make sense for them to use the same processor (HiFi2 EP).
testbug00
01-25-2014, 09:31 AM
What are the chances of Broadwell supporting TrueAudio? I have read that it will feature a DSP for low power playback and voice/speech recognition and they already use Tensilica (Cadence) IP in other processors (http://www.tensilica.com/news/24/330/Tensilica-Confirms-New-Intel-Media-Processor-for-Consumer-Electronics-Devices-Uses-Company-s-HiFi-2-Audio-Processor.htm). If they were going to use Tensilica then it would make sense for them to use the same processor (HiFi2 EP).
Considering TrueAudio is done via the graphics driver, I think none. AMD hasn't announced if TrueAudio is open or not (although, I would imagine it would be open...)
The equivalent of TrueAudio now...
-Q
Drashek
01-25-2014, 05:49 PM
Considering TrueAudio is done via the graphics driver, I think none. AMD hasn't announced if TrueAudio is open or not (although, I would imagine it would be open...)
The equivalent of TrueAudio now...
-Q
Yes that is what I was asking (supporting all of the features). I thought that I had read it was going to be open (or partially) (http://semiaccurate.com/?s=trueaudio) but it is 420 here a lot so I could have memory errors.
Does the driver used (Graphics or Mantle driver used for TA?) make a difference as long as they are integrated with similar resources (link with a coprocessor and access to a shared pool of memory iirc).
I have only seen the audio DSP for Broadwell referred to as "Intel Smart Sound Technology (SST)" (https://www.google.co.uk/?q=intel+sst+dsp#q=intel+sst+dsp) and have seen no mention of Tensilica IP (complete conjecture), but they have used it before in the CE 3100 (PDF) (http://download.intel.com/design/celect/downloads/ce3100-product-brief.pdf) media processor / SoC. If they were to use Tensilica IP they'd (imo) likely go with the same DSP (low power, audio decoding / encoding, speech and voice processing).
testbug00
01-25-2014, 06:43 PM
Yes that is what I was asking (supporting all of the features). I thought that I had read it was going to be open (or partially) (http://semiaccurate.com/?s=trueaudio) but it is 420 here a lot so I could have memory errors.
Does the driver used (Graphics or Mantle driver used for TA?) make a difference as long as they are integrated with similar resources (link with a coprocessor and access to a shared pool of memory iirc).
My understanding is that Mantle and trueaudio are both part of the Radeon drivers.
Also, I don't see anything about Trueaudio being open source in the article that talks about it. (http://semiaccurate.com/2013/10/09/like-mantle-amds-trueaudio-massive-step-forward/) Could you perhaps link to where you saw it?
Aside from supporting open-source codecs, that is.
-Q
Drashek
01-25-2014, 08:02 PM
My understanding is that Mantle and trueaudio are both part of the Radeon drivers.
Also, I don't see anything about Trueaudio being open source in the article that talks about it. (http://semiaccurate.com/2013/10/09/like-mantle-amds-trueaudio-massive-step-forward/) Could you perhaps link to where you saw it?
Aside from supporting open-source codecs, that is.
-Q
No, you are right, but it was good reading over those articles again nonetheless. Mantle is going to be an open standard / API, isn't it, and it can utilise the TA (TrueAudio) processor. As long as the (DSP) configuration is similar I don't believe it would even have to be Tensilica IP; you'd just have to write a Mantle driver.
Fottemberg
02-12-2014, 01:40 PM
http://seekingalpha.com/article/2014501-intel-is-this-rumor-true
pmoses
02-12-2014, 02:35 PM
If I remember correctly, Charlie or some other authority said that they are on track. Even if they have sub-par yields (lower than 50%), they should make some premium mobile chips, just to get OEMs ready for the back-to-school season.
rich wargo
02-12-2014, 04:06 PM
First time I've ever heard of Digitimes referred to as a "reputable trade press firm." Usually, their modus operandi is to throw tons of rumors at the wall to see what sticks.
No, you are right, but it was good reading over those articles again nonetheless. Mantle is going to be an open standard / API, isn't it, and it can utilise the TA (TrueAudio) processor. As long as the (DSP) configuration is similar I don't believe it would even have to be Tensilica IP; you'd just have to write a Mantle driver.
IIRC, given that there is currently no standard API under Windows to wrap audio accelerators the way GPUs are wrapped, support for TrueAudio's DSP is done only via audio engine plugins and is incompatible with others. It should not be the business of the Mantle graphics API in this case, as TrueAudio is just a DSP integrated with the GPU, pretty much like other integrated blocks such as UVD and VCE.
elemein
02-13-2014, 10:21 AM
If I remember correctly, Charlie or some other authority said that they are on track. Even if they have sub-par yields (lower than 50%), they should make some premium mobile chips, just to get OEMs ready for the back-to-school season.
Wait, do you have a link to this? As someone who's going to be looking for a premium mobile laptop for networking in college, this sounds perfect... A nice 13.3" ultrathin with 6 hrs of battery life and an i5-5200U and some amazing iGP (for an ultrathin iGP)... I'll pay a tidy sum for that. I don't even need a touch screen.
pmoses
02-13-2014, 11:40 AM
Wait, do you have a link to this? As someone who's going to be looking for a premium mobile laptop for networking in college, this sounds perfect... A nice 13.3" ultrathin with 6 hrs of battery life and an i5-5200U and some amazing iGP (for an ultrathin iGP)... I'll pay a tidy sum for that. I don't even need a touch screen.
Xbitlabs:
“By the time we enter the back to school selling season we have nearly 70 unique 2-in-1 designs with outstanding battery life and performance across a range of price points in those markets,” said Brian Krzanich, chief executive officer of Intel, during a conference call with investors and analysts.
http://www.xbitlabs.com/news/cpu/display/20140120233109_Intel_on_Track_with_Broadwell_Production_in_Q1_Company.html
laurent
02-13-2014, 01:46 PM
BK has bullshi*ed so much I'm not sure he can be trusted as far as schedules are concerned. As an example, he promised a dozen Android Bay Trail tablets by the end of last year; there are still none. And he said in Q3 that the 14nm process was OK, only to admit a month later that they had issues.
Fottemberg
02-13-2014, 02:13 PM
http://www.electronicsweekly.com/news/general/intels-14nm-mobile-delayed-till-2015-2014-02/
Intel’s 14nm delay must be worrying Altera which had been promised access to the process this year.
Rumours have persisted lately that Altera is talking about ditching Intel as a foundry and returning to its old foundry partner, TSMC.
BK has to explain a lot of things ...
testbug00
02-13-2014, 04:02 PM
BK has bullshi*ed so much I'm not sure he can be trusted as far as schedules are concerned. As an example, he promised a dozen Android Bay Trail tablets by the end of last year; there are still none. And he said in Q3 that the 14nm process was OK, only to admit a month later that they had issues.
Maybe they figured that, because of Bluestacks, Windows 8 could count as Android (since an AMD APU can run Android apps), so they counted the few dozen Windows tablets sold as Android tablets?
-Q
Third Eye
02-13-2014, 08:46 PM
Maybe they figured that, because of Bluestacks, Windows 8 could count as Android (since an AMD APU can run Android apps), so they counted the few dozen Windows tablets sold as Android tablets?
-Q
I thought that was Rory's formula :D
Seriously if you think Intel is in bad shape in Android world, AMD is in far worse shape.
testbug00
02-13-2014, 09:09 PM
I thought that was Rory's formula :D
Seriously if you think Intel is in bad shape in Android world, AMD is in far worse shape.
I am not saying AMD is doing well, and if you think I am, you are wrong.
I am trying to find excuses BK could use.
"Windows can run Android applications (shows it on unnamed device), so that means Windows tablets could count as Android tablets"
-Q
Stuckey
02-18-2014, 05:34 AM
I thought that was Rory's formula :D
Seriously if you think Intel is in bad shape in Android world, AMD is in far worse shape.
How so? Aren't AMD the ones working with Bluestacks? And rumor has it MS are seriously considering adding Android app support to Windows anyway? Either of those outcomes would probably leave AMD with the best dual OS/dual app support? Certainly as far as Mullins vs Bay Trail goes?
Personally I believe those dual OS tablets that were shown off at CES are going to be the norm for non ARM tablets in future. There doesn't seem much point having an x86 chip that doesn't run both if it can. The only question is which one will be the better solution.
elemein
02-19-2014, 02:04 PM
How so? Aren't AMD the ones working with Bluestacks? And rumor has it MS are seriously considering adding Android app support to Windows anyway? Either of those outcomes would probably leave AMD with the best dual OS/dual app support? Certainly as far as Mullins vs Bay Trail goes?
Personally I believe those dual OS tablets that were shown off at CES are going to be the norm for non ARM tablets in future. There doesn't seem much point having an x86 chip that doesn't run both if it can. The only question is which one will be the better solution.
Stupid question but I've always wondered why exactly CPUs like the Atom can run Android... How does it run ARM software? Is it via a virtual machine that is transparent to the user or something?
Stupid question but I've always wondered why exactly CPUs like the Atom can run Android... How does it run ARM software? Is it via a virtual machine that is transparent to the user or something?
Android is built upon Linux, which has run on x86 since forever, so it's usually just a recompile to run on Atom.
elemein
02-19-2014, 02:33 PM
Android is built upon Linux, which has run on x86 since forever, so it's usually just a recompile to run on Atom.
Doesn't that make projects like Android x86 sort of redundant though?
NTMBK
02-19-2014, 05:29 PM
Stupid question but I've always wondered why exactly CPUs like the Atom can run Android... How does it run ARM software? Is it via a virtual machine that is transparent to the user or something?
Most Android software runs on a Java virtual machine known as Dalvik. Port Dalvik to a new platform, and the majority of Android software will run on it.
However- some software does use native ARM code (things like high performance 3D games). This is the trickier situation, but Intel has produced an ARM->x86 binary translator which will convert the binary to an x86 program, which obviously then runs on Atom. It's not an ideal solution, but it works.
There are also various libraries which use native ARM code under the hood to provide high performance, and these will need to be ported to x86.
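As a concrete illustration of the split described above: an APK is just a zip file, Dalvik bytecode sits in classes.dex (and runs wherever Dalvik runs), while native code sits under lib/&lt;abi&gt;/ and is what needs an x86 port or binary translation. A hypothetical sketch using Python's zipfile; the file names and contents are made up for illustration:

```python
import io
import zipfile

def native_abis(apk_bytes):
    """Return the set of ABIs an APK ships native libraries for.
    An APK with no lib/<abi>/ entries is pure Dalvik bytecode and is portable."""
    abis = set()
    with zipfile.ZipFile(io.BytesIO(apk_bytes)) as apk:
        for name in apk.namelist():
            parts = name.split("/")
            if len(parts) >= 3 and parts[0] == "lib":
                abis.add(parts[1])
    return abis

# Build a toy "APK" in memory (hypothetical contents).
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as apk:
    apk.writestr("classes.dex", b"dex\n035\x00...")          # Dalvik bytecode
    apk.writestr("lib/armeabi-v7a/libgame.so", b"\x7fELF")   # native ARM code

print(native_abis(buf.getvalue()))  # {'armeabi-v7a'} -> needs x86 build or translation
```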
elemein
02-19-2014, 05:57 PM
Most Android software runs on a Java virtual machine known as Dalvik. Port Dalvik to a new platform, and the majority of Android software will run on it.
However- some software does use native ARM code (things like high performance 3D games). This is the trickier situation, but Intel has produced an ARM->x86 binary translator which will convert the binary to an x86 program, which obviously then runs on Atom. It's not an ideal solution, but it works.
There are also various libraries which use native ARM code under the hood to provide high performance, and these will need to be ported to x86.
Ah, that's very interesting, and it seems like a good solution since Java is so widely supported from platform to platform. It's a little hard to believe that the level of performance and fluidity offered by Android is provided by something that runs entirely in a Java virtual machine (since most Java programs I'm familiar with as a consumer are not optimized well at all, i.e. most games and production software), save for the cases that do not run in the VM.
Definitely interesting though. I thought Android was entirely written and compiled in ARMv6/7/8 and ran without the aid of any virtualization.
Thanks for the info!
Definitely interesting though. I thought Android was entirely written and compiled in ARMv6/7/8 and ran without the aid of any virtualization.
Thanks for the info!
Well Android is a child of Google and they're processor agnostic. They're supporting MIPS as well.
elemein
02-19-2014, 07:29 PM
Well Android is a child of Google and they're processor agnostic. They're supporting MIPS as well.
True, though I still wonder if that makes projects like Android for x86 redundant.
Either way that's a bit off topic, so thanks for allowing the minor derail :p Back to business as usual
ThirdEye
02-20-2014, 06:24 PM
Ah, that's very interesting, and it seems like a good solution since Java is so widely supported from platform to platform. It's a little hard to believe that the level of performance and fluidity offered by Android is provided by something that runs entirely in a Java virtual machine (since most Java programs I'm familiar with as a consumer are not optimized well at all, i.e. most games and production software), save for the cases that do not run in the VM.
Definitely interesting though. I thought Android was entirely written and compiled in ARMv6/7/8 and ran without the aid of any virtualization.
Thanks for the info!
Actually it is similar but different.
Dalvik is a virtual machine of Google's running on top of a Linux kernel. That environment is called Android.
So apps that need to run on Android get executed in the Dalvik VM, just like Java apps need a JVM.
While Sun made Java open source, if you build a JVM you need to get it certified by Sun. And only the Standard Edition was open source; the Mobile Edition was not. So there was J2ME, which Sun expected companies to license.
Instead, Android Inc (founded by Andy Rubin), after Google's acquisition, went a different way. They basically discovered a loophole in the differing license schemes between J2SE and J2ME.
The source code looks like Java, but the Android software development kit compiles the Java code to intermediate Java classes and then converts those classes into Dalvik Executables (.dex) that run in a Dalvik VM, rather than a Java executable in a Java VM.
There is quite a bit of architectural difference between a Dalvik VM and a Java VM.
That was the principal reason ORCL sued Google (and lost): they claimed GOOG ripped off Java (by converting Java classes to DEX instead of JEX, running afoul of ORCL's intellectual property) instead of being willing to pay ORCL for that privilege.
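The .class vs. .dex split described above is visible right in the file headers: both formats begin with a fixed magic number. A small sketch (the "035" version string is an assumption matching Dalvik of that era):

```python
# Magic numbers at offset 0 identify the bytecode container.
JAVA_CLASS_MAGIC = bytes.fromhex("CAFEBABE")  # every Java .class file starts with this
DEX_MAGIC = b"dex\n035\x00"                   # .dex header: "dex\n" + version + NUL

def classify(header: bytes) -> str:
    """Identify a bytecode container from its first bytes."""
    if header.startswith(JAVA_CLASS_MAGIC):
        return "Java .class (runs on a JVM)"
    if header.startswith(DEX_MAGIC):
        return "Dalvik .dex (runs on the Dalvik VM)"
    return "unknown"

print(classify(b"\xca\xfe\xba\xbe\x00\x00\x00\x34"))  # Java class file header
print(classify(b"dex\n035\x00" + b"\x00" * 8))        # Dalvik executable header
```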
elemein
02-20-2014, 09:09 PM
Actually, it is similar but different.
Dalvik is Google's virtual machine, running on top of a Linux kernel. That whole environment is called Android.
So apps that need to run on Android get executed in the Dalvik VM, just like Java apps need a JVM.
While Sun made Java open source, any JVM implementation needed to be certified by Sun. And only the Standard Edition was open source; the Mobile Edition was not. So there was J2ME, which Sun expected companies to license.
Instead, Android Inc. (founded by Andy Rubin) went a different way after Google's acquisition. They basically discovered a loophole arising from the differing license schemes of J2SE vs. J2ME.
The source code was similar to Java, but the Android software development kit compiles the Java code to intermediate Java classes,
and then converts those Java classes into Dalvik Executables (.dex) that run in a Dalvik VM, rather than Java executables in a Java VM.
There is quite a bit of architectural difference between a Dalvik VM and a Java VM.
That was the principal reason ORCL sued Google (and lost): they claimed that GOOG ripped off Java (arguing that by converting Java classes to DEX instead of standard Java bytecode, Google ran afoul of ORCL's intellectual property) rather than paying ORCL for the privilege.
So, if I understand this correctly:
The Android environment/OS itself is not virtualized but runs entirely natively on the ARM device; however, most apps (unless they're hardcoded against the ARM ISA) run in a virtualized environment called Dalvik?
That still doesn't explain why projects like Android for x86 aren't entirely redundant, though, if you can just recompile Android for x86 from the original kernel and have full functionality right out of the box (minus drivers)...
Third Eye
02-21-2014, 12:11 AM
So, if I understand this correctly:
The Android environment/OS itself is not virtualized but runs entirely natively on the ARM device; however, most apps (unless they're hardcoded against the ARM ISA) run in a virtualized environment called Dalvik?
That still doesn't explain why projects like Android for x86 aren't entirely redundant, though, if you can just recompile Android for x86 from the original kernel and have full functionality right out of the box (minus drivers)...
Because while the source code of Android is free, released by Google as the AOSP project, someone has to create the Dalvik JIT that ultimately converts the DEX "bytecodes" to native/near-native x86 instructions.
Google does that for all the required ARM ISAs. For x86 it is done by the Android-x86 project. The Dalvik JIT was contributed by Intel to the Ax86 project.
Similarly for MIPS Android, there is a Myriad Dalvik JIT, and so on....
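To make the "someone has to write a JIT per ISA" point concrete, here is a toy sketch (invented opcodes, nothing to do with the real Dalvik JIT internals): portable bytecode gets translated once into a directly callable form for the target, and that translation step is exactly what has to be rewritten for x86, MIPS, and so on.

```python
# Toy "JIT": translate portable bytecode (standing in for DEX) once into
# a directly callable Python closure (standing in for native code).
def jit_compile(bytecode):
    ops = {"add": lambda acc, y: acc + y, "mul": lambda acc, y: acc * y}
    steps = [ops[name] for name in bytecode]  # opcode lookup happens once

    def compiled(x, y):
        acc = x
        for step in steps:  # no per-opcode dispatch by name at run time
            acc = step(acc, y)
        return acc

    return compiled

f = jit_compile(["add", "mul"])  # computes (x + y) * y
print(f(2, 3))  # 15
```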
elemein
02-21-2014, 08:21 AM
Because while the source code of Android is free, released by Google as the AOSP project, someone has to create the Dalvik JIT that ultimately converts the DEX "bytecodes" to native/near-native x86 instructions.
Google does that for all the required ARM ISAs. For x86 it is done by the Android-x86 project. The Dalvik JIT was contributed by Intel to the Ax86 project.
Similarly for MIPS Android, there is a Myriad Dalvik JIT, and so on....
Ohhh, alright, I think I'm getting the hang of it.
So the Android environment itself can be recompiled back and forth between ARM and x86, since it's based on the Linux kernel.
Dalvik is not open source, and the purpose of the Android for x86 project is to create a suitable environment for the closed-source Dalvik programs.
Here's the part I definitely didn't see coming: Android tablets using Intel chips use Android for x86 (I never knew such a project was used commercially! That's really unexpected.)
Thanks for explaining.
laurent
02-21-2014, 09:02 AM
Because while the source code of Android is free, released by Google as the AOSP project, someone has to create the Dalvik JIT that ultimately converts the DEX "bytecodes" to native/near-native x86 instructions.
Google does that for all the required ARM ISAs. For x86 it is done by the Android-x86 project. The Dalvik JIT was contributed by Intel to the Ax86 project.
Are you sure Intel contributes anything to the Android-x86 project? They contribute to Google directly for sure (see Dalvik x86 trace JIT support (https://android.googlesource.com/platform/dalvik/+/0c2dc522d0e120f346cf0a40c8cf0c93346131c2) for instance).
They also have their own open-source Android distro: https://01.org/android-ia/
And last but not least: they provide copyrighted, closed-source tools and utilities (for instance the ARM simulator Houdini).
Third Eye
02-22-2014, 10:33 AM
Are you sure Intel contributes anything to the Android-x86 project? They contribute to Google directly for sure (see Dalvik x86 trace JIT support (https://android.googlesource.com/platform/dalvik/+/0c2dc522d0e120f346cf0a40c8cf0c93346131c2) for instance).
You are right. My bad. Intel did contribute the x86 Dalvik JIT, but to AOSP only, not to the Android-x86 project. Thanks for correcting.
They also have their own open-source Android distro: https://01.org/android-ia/
And last but not least: they provide copyrighted, closed-source tools and utilities (for instance the ARM simulator Houdini).
Notice how Intel calls it "Android on Intel Architecture", not "Android on x86". Intel wants to support Android on its own processors alone rather than on any x86 chip (meaning AMD). That goes against the spirit of the Android-x86 initiative (which is Android on any x86).
So I am not surprised they have their own Android distro.
Apparently Intel is only interested in Android for its "Atom" series.
Stuckey
02-22-2014, 01:07 PM
Which probably explains why AMD went with BlueStacks instead? As the chances of an 'Android on x86' version for HSA are slim to none?
jamie965
02-22-2014, 07:38 PM
Right now the only x86 Android image that comes with the Android SDK for emulation is for Atom processors, and it doesn't run on non-Intel systems.
NTMBK
02-22-2014, 09:00 PM
So long as Intel is contributing to AOSP, I don't see how they can lock out AMD. It's open source, so they should be able to remove any 'GenuineIntel' checks.
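As a hypothetical sketch of what such a check looks like (my own toy code, not from any actual Android source): on Linux/x86 the CPU vendor string is exposed via /proc/cpuinfo, and a gate on it is a one-liner that an open-source tree would let you patch out.

```python
# Read the CPU vendor string on Linux; returns "unknown" elsewhere.
def cpu_vendor():
    try:
        with open("/proc/cpuinfo") as f:
            for line in f:
                if line.startswith("vendor_id"):
                    return line.split(":", 1)[1].strip()
    except OSError:
        pass
    return "unknown"

# The kind of lockout being discussed: a single, trivially removable gate.
if cpu_vendor() != "GenuineIntel":
    print("feature disabled on", cpu_vendor())  # delete this gate to 'unlock'
else:
    print("feature enabled")
```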
laurent
02-23-2014, 03:49 AM
So long as Intel is contributing to AOSP, I don't see how they can lock out AMD. It's open source, so they should be able to remove any 'GenuineIntel' checks.
Yes, but I bet all the existing Android phones and tablets are using the Intel specific distro I mentioned. The ARM simulation lib is Intel specific and is still mandatory to run some apps.
sdlvx
02-23-2014, 12:26 PM
So long as Intel is contributing to AOSP, I don't see how they can lock out AMD. It's open source, so they should be able to remove any 'GenuineIntel' checks.
AMD has BlueStacks. They don't need their own version of Android when they can run Android applications on top of Windows. I've used it and it works rather well. In fact, I would take this over some sort of dual-boot setup, and I am sure I am not alone in thinking that.
Intel could easily cripple AMD in mobile Android. Think about Intel making power-saving features work only on Intel hardware. You can do that just by configuring the Linux kernel differently.
nissangtr786
03-27-2014, 01:58 PM
http://www.techoftomorrow.com/2014/pc/intel-reinvents-the-desktop-for-2014-and-introduces-4-new-overclocking-products/
An 8-core Haswell (the dream), new TIM materials for overclocking, the first 14nm Broadwell CPUs, and a Pentium CPU that can overclock.
Joe212
04-02-2014, 09:30 PM
http://www.techoftomorrow.com/2014/pc/intel-reinvents-the-desktop-for-2014-and-introduces-4-new-overclocking-products/
An 8-core Haswell (the dream), new TIM materials for overclocking, the first 14nm Broadwell CPUs, and a Pentium CPU that can overclock.
From the slide:
"For the first time - bringing Intel IrisPro Graphics to desktop socketed unlocked processors"
I thought Intel was killing off the desktop: Intel kills off the desktop, PCs go with it (http://semiaccurate.com/2012/11/26/intel-kills-off-the-desktop-pcs-go-with-it/)
NTMBK
04-03-2014, 03:27 AM
From the slide:
"For the first time - bringing Intel IrisPro Graphics to desktop socketed unlocked processors"
I thought Intel was killing off the desktop: Intel kills off the desktop, PCs go with it (http://semiaccurate.com/2012/11/26/intel-kills-off-the-desktop-pcs-go-with-it/)
Hey, let's not complain about the shift in strategy. Should be a great gaming-HTPC part, if the price is right... I hope they release a dual-core + Iris Pro SKU, because using an entire i7 to drive that level of graphics is a bit silly.
juanrga
04-03-2014, 07:33 AM
Hey, let's not complain about the shift in strategy. Should be a great gaming-HTPC part, if the price is right... I hope they release a dual-core + Iris Pro SKU, because using an entire i7 to drive that level of graphics is a bit silly.
I believe it will be available only in i5 and i7 models, aka the "K-models".
Iris Pro is really only interesting if the price comes down. Can't wait to see how the new TIM performs, though.
nissangtr786
04-03-2014, 02:51 PM
Intel really needs to lower temps on its laptop CPUs, as they are at the limit.
pmoses
04-04-2014, 12:26 AM
Hey, let's not complain about the shift in strategy. Should be a great gaming-HTPC part, if the price is right... I hope they release a dual-core + Iris Pro SKU, because using an entire i7 to drive that level of graphics is a bit silly.
Intel tries to fight against A4-5000? ;)
NTMBK
04-04-2014, 03:08 AM
Intel tries to fight against A4-5000? ;)
Given that an i3 beats Kaveri in most CPU benchmarks and games (even without turbo core), I would expect them to aim a little higher than Kabini...
sdlvx
04-04-2014, 04:48 AM
Given that an i3 beats Kaveri in most CPU benchmarks and games (even without turbo core), I would expect them to aim a little higher than Kabini...
The only thing Intel has against Kabini is the i3 Y and U series. Which, might I add, cost significantly more than Kabini.
Bay Trail comes kind of close, but the GPU isn't close at all.
Yes, Intel can get Kabini performance no problem, but it's in a $1000+ ultrabook.
The top-of-the-line Kabini laptop I know of is the ThinkPad X140e, for around $550, equipped with an A4-5000.
So Intel is either fighting Kabini with Bay Trail, which is very close in CPU performance and way behind in GPU, or with the i3, which is better but costs significantly more.
https://en.wikipedia.org/wiki/List_of_Intel_Core_i3_microprocessors#.22Haswell-MB.22_.2822_nm.29
Take a look at the i3-4005U. Yes, it's 15 "Intel watts" while the A4-5000's are real watts, but even then you can practically purchase a complete A4-5000 laptop for the price of the i3-4005U alone.
Maybe with a die shrink Intel can get their die sizes to be more competitive with Kabini, but even if they could, do you really think they'd go and try to compete with AMD in that price range? The $300 to $550 range doesn't seem like an area where Intel wants to be competing with complete laptops.
NTMBK
04-04-2014, 05:34 AM
The only thing Intel has against Kabini is the i3 Y and U series. Which, might I add, cost significantly more than Kabini.
Bay Trail comes kind of close, but the GPU isn't close at all.
Yes, Intel can get Kabini performance no problem, but it's in a $1000+ ultrabook.
The top-of-the-line Kabini laptop I know of is the ThinkPad X140e, for around $550, equipped with an A4-5000.
So Intel is either fighting Kabini with Bay Trail, which is very close in CPU performance and way behind in GPU, or with the i3, which is better but costs significantly more.
https://en.wikipedia.org/wiki/List_of_Intel_Core_i3_microprocessors#.22Haswell-MB.22_.2822_nm.29
Take a look at the i3-4005U. Yes, it's 15 "Intel watts" while the A4-5000's are real watts, but even then you can practically purchase a complete A4-5000 laptop for the price of the i3-4005U alone.
Maybe with a die shrink Intel can get their die sizes to be more competitive with Kabini, but even if they could, do you really think they'd go and try to compete with AMD in that price range? The $300 to $550 range doesn't seem like an area where Intel wants to be competing with complete laptops.
Intel fights Kabini with Celerons and Pentiums, not the i3. Even a crippled Ivy Bridge Pentium completely wipes the floor with Kabini in CPU performance, and pretty much matches it in GPU performance: http://www.anandtech.com/show/6974/amd-kabini-review/5 And of course Haswell brings significant improvements to the GPU, as the lowest-end models go from 6 EUs to 10 EUs. Yes, it has a higher TDP, but the majority of Kabini parts are going into 15" craptops, which can easily handle a slightly higher TDP. And if you need the lower TDPs, there are plenty of 15W Celeron-U SKUs which drop some CPU speed but still beat Kabini in CPU performance.
I do agree that Bay Trail's GPU is extraordinarily disappointing. It's a good CPU part, but the GPU lets it down. Thankfully Cherry Trail significantly improves the GPU, so hopefully they will have a worthy Puma competitor on the GPU side. But Intel certainly has plenty of options to fight Kabini with in the craptop market.
EDIT: The low-end desktop market doesn't look that promising for Kabini either. AMD likes to trumpet the fact that you can get Kabini in a socket, which means you'll be able to upgrade. Well, for the same price you can get an H81 motherboard and a Celeron with the same performance as that Kabini, but with the option down the road of upgrading to anything up to and including a Core i7.
Don't get me wrong, I want AMD to do well. I seriously hope that the Puma update will finally give them a tablet processor worth using; Temash was a big disappointment. But chasing the low end isn't going to sustain them forever: Intel's process lead will give them cheaper transistors, as well as faster ones.
Hey, let's not complain about the shift in strategy. Should be a great gaming-HTPC part, if the price is right... I hope they release a dual-core + Iris Pro SKU, because using an entire i7 to drive that level of graphics is a bit silly.
Intel IrisPro Graphics to desktop. Socketed. Unlocked processors. Let that ridiculousness sink in for a bit.
You should complain about that strategy, as it means the death of the desktop and likely of dGPUs.
...
I do agree that Bay Trail's GPU is extraordinarily disappointing. It's a good CPU part, but the GPU lets it down. Thankfully Cherry Trail significantly improves the GPU, so hopefully they will have a worthy Puma competitor on the GPU side. But Intel certainly has plenty of options to fight Kabini with in the craptop market.
EDIT: The low-end desktop market doesn't look that promising for Kabini either. AMD likes to trumpet the fact that you can get Kabini in a socket, which means you'll be able to upgrade. Well, for the same price you can get an H81 motherboard and a Celeron with the same performance as that Kabini, but with the option down the road of upgrading to anything up to and including a Core i7.
Don't get me wrong, I want AMD to do well. I seriously hope that the Puma update will finally give them a tablet processor worth using; Temash was a big disappointment. But chasing the low end isn't going to sustain them forever: Intel's process lead will give them cheaper transistors, as well as faster ones.
Why is Intel perpetually one chip/process away from having a great mobile product? :D
People choosing the low end chose it because it's cheap and good enough. No one buying a Celeron is going to upgrade to an i7 later. They just want the next generation of low end, i.e. socketed Bay Trail upgraded to socketed Cherry Trail. Intel was forced to enable Quick Sync all the way down to the small cores, so even a transcoding HTPC doesn't need anything beyond the cheapest option.
I don't think AMD is chasing the low end but creating a new market for socketed ARM. No one is there and Intel can never compete against that. It gets even worse for Intel if the socket can do x86 and ARM. Doubly worse for Intel if AMD pushes it everywhere and invites other ARM players to release their compatible chips.
juanrga
04-04-2014, 12:40 PM
Intel IrisPro Graphics to desktop. Socketed. Unlocked processors. Let that ridiculousness sink in for a bit.
You should complain about that strategy, as it means the death of the desktop and likely of dGPUs.
Yes, Intel plans to kill dGPUs, but that is a fantastic thing, because both AMD and Nvidia plan the same. The laws of physics are the same for all three players. :rolleyes:
rich wargo
04-04-2014, 01:00 PM
Yes, Intel plans to kill dGPUs, but that is a fantastic thing, because both AMD and Nvidia plan the same. The laws of physics are the same for all three players. :rolleyes:
You sure that's true for nVidia? They don't seem to think so... :D
testbug00
04-04-2014, 01:19 PM
Yes, Intel plans to kill dGPUs, but that is a fantastic thing, because both AMD and Nvidia plan the same. The laws of physics are the same for all three players. :rolleyes:
How does Nvidia plan to chase the majority of the market, i.e. people who buy laptops, then?
Nvidia leaving dGPUs would be suicide until, well, I would give a date, but that would probably result in a new roadmap coming out a day later...
-Q
Joe212
04-04-2014, 05:54 PM
Hey, let's not complain about the shift in strategy. Should be a great gaming-HTPC part, if the price is right... I hope they release a dual-core + Iris Pro SKU, because using an entire i7 to drive that level of graphics is a bit silly.
It would be interesting enough just for the 128 MB L4 cache. I have seen the 4770R beat the 4770K in several benchmarks despite a 300 MHz nominal clock speed deficit and roughly half the thermal budget for the CPU.
NTMBK
04-05-2014, 08:14 AM
Why is Intel perpetually one chip/process away from having a great mobile product? :D
I like how, to get to that conclusion, you had to snip away the big chunk of my post where I talked about Haswell Celerons being competitive. :rolleyes:
People choosing the low end chose it because it's cheap and good enough. No one buying a Celeron is going to upgrade to an i7 later. They just want the next generation of low end, i.e. socketed Bay Trail upgraded to socketed Cherry Trail. Intel was forced to enable Quick Sync all the way down to the small cores, so even a transcoding HTPC doesn't need anything beyond the cheapest option.
Why would you bother upgrading to the next-gen of low end? Gains will be pretty minimal, compared to what you could get from e.g. stepping up to a Core i3. Double the GPU, hyperthreading (which adds a lot on Haswell), higher clocks, AVX2 support.
I don't think AMD is chasing the low end but creating a new market for socketed ARM. No one is there and Intel can never compete against that. It gets even worse for Intel if the socket can do x86 and ARM. Doubly worse for Intel if AMD pushes it everywhere and invites other ARM players to release their compatible chips.
1. You can't produce a market just with hardware. You need a good software stack- Android isn't built to be a desktop OS, and Chrome OS is just a glorified web browser. Not to mention they both run just as well on x86. And Windows RT is a crippled joke compared to its x86 sibling.
2. Why would anyone care about a socket which can do both ARM and x86? "Yay, I upgraded my CPU and now I need to install a new OS and none of my software works." It might help with stock management, but that's about it...
3. The ARM SoC market isn't some utopia where everyone is helping each other out. It's an insanely cut throat, thin margin market. No way in hell that any of them would give a rival a foothold like a common socket. It just makes no business sense- do all the hard work and R&D of designing a motherboard and socket spec, and then let Qualcomm swoop in and steal all of the SoC sales. There's a good reason why Intel made their sockets incompatible with their rivals, and that was because their rivals kept undercutting them.
NTMBK
04-05-2014, 08:20 AM
How does Nvidia plan to chase the majority of the market, i.e. people who buy laptops, then?
Nvidia leaving dGPUs would be suicide until, well, I would give a date, but that would probably result in a new roadmap coming out a day later...
-Q
NVidia needs to do something to push integration, or they're dead. The potential performance and efficiency improvements from a tightly integrated GPU and CPU are massive. On the one hand they're pushing NVLink, and on the other they're pouring money into their ARM SoCs. They really wanted to make an x86 core, but Intel locked them out so they're stuck trying to make a market on ARM or POWER.
juanrga
04-05-2014, 08:58 AM
You sure that's true for nVidia? They don't seem to think so... :D
Not only do they think so, they talk about it ;)
How does Nvidia plan to chase the majority of the market for people who buy laptops than?
Simple. They plan to use APUs... just as AMD.
testbug00
04-05-2014, 11:20 AM
Simple. They plan to use APUs... just as AMD.
Yes, for all those people who buy laptops running Windows for ARM and Android :rolleyes:
Nvidia has no market strategy for exiting dGPUs in the mobile space. The whole "we can run x86 code" angle was the only way in they had, and that did not work.
I mean, Nvidia might have products aimed at the segment, but it does not matter if no one buys them.
EDIT: Intel Broadwell is far more competitive than anything Nvidia can throw out currently. Now is this post considered on topic? *Trollface*
-Q
Copper
04-05-2014, 11:22 AM
Just a small hint guys: topic above is INTEL BROADWELL. Feel free to start new threads of appropriate subject matter.
juanrga
04-05-2014, 12:20 PM
Yes, for all those people who buy laptops running Windows for ARM and Android :rolleyes:
Of course not. :rolleyes:
Nvidia, AMD, and Intel are all making the same technology transition. Why do you believe the most important upgrade in Intel Broadwell will be on the integrated GPU side? The most conservative reports claim a 40% faster GPU for the Broadwell K-series; others put the gain as high as 80%.
testbug00
04-05-2014, 12:47 PM
Of course not. :rolleyes:
Nvidia, AMD, and Intel are all making the same technology transition. Why do you believe the most important upgrade in Intel Broadwell will be on the integrated GPU side? The most conservative reports claim a 40% faster GPU for the Broadwell K-series; others put the gain as high as 80%.
Broadwell can sell in x86 laptops, desktops, and maybe some silly tablets. People will be fine with it because it works with all their programs.
AMD's x86 APUs will do the same, and their ARM parts might get some sales in Android tablets/desktops/laptops/phones and perhaps some ARM Windows tablets/desktops.
Nvidia's ARM/Transmeta-thingy APU will not do the same. It cannot run x86 code. If any OEMs are willing to use Nvidia, they might sell a few Android/Windows ARM desktops/tablets/laptops.
I am not saying that integration is not where we are heading.
I am saying Nvidia integrates stuff that no one will use. They burned the large mobile OEMs, and they don't have what consumers want in the laptop/desktop space. Their largest recent design win probably involved giving Microsoft free chips for the Surface 2.
Saying "look, we have integrated parts like them, except none of your old programs will run" is not a good sales pitch.
-Q
juanrga
04-06-2014, 07:50 AM
Broadwell can sell in x86 laptops, desktops, and maybe some silly tablets. People will be fine with it because it works with all their programs.
AMD's x86 APUs will do the same, and their ARM parts might get some sales in Android tablets/desktops/laptops/phones and perhaps some ARM Windows tablets/desktops.
Nvidia's ARM/Transmeta-thingy APU will not do the same. It cannot run x86 code. If any OEMs are willing to use Nvidia, they might sell a few Android/Windows ARM desktops/tablets/laptops.
No. Moreover, I note you didn't really answer my question about the Broadwell GPU. As I said before, the laws of physics are the same for the three players: Intel, AMD, and Nvidia.
Adam Kudelski
04-07-2014, 03:53 AM
http://fudzilla.com/home/item/34418-broadwell-will-miss-back-to-school-spree
In essence, we should see the first Broadwell products in time for the holiday season, but mass availability is probably coming a bit later, in early 2015.
If mobile Broadwell will be a Dec '14 / Jan '15 product, then maybe... desktop Broadwell will be on the market later than Carrizo?
testbug00
04-07-2014, 09:06 AM
No. Moreover, I note you didn't really answer my question about the Broadwell GPU. As I said before, the laws of physics are the same for the three players: Intel, AMD, and Nvidia.
1. Okay, Intel is putting more effort into shrinking die size than into raising GPU performance, I guess? I really don't know why.
2. That does not mean the companies are "doing the same technology transition": AMD and Intel are transitioning to all-in-one APUs for consumers. Nvidia is transitioning into Tegra, except it seems like they are moving all their bets to automotive and leaving the consumer market.
-Q
elemein
04-07-2014, 11:30 AM
2. That does not mean the companies are "doing the same technology transition": AMD and Intel are transitioning to all-in-one APUs for consumers. Nvidia is transitioning into Tegra, except it seems like they are moving all their bets to automotive and leaving the consumer market.
-Q
Don't forget that Intel is also in automotive (even more so than nVidia, IIRC).
Tegra, as we've seen for years now, just keeps displacing itself further out of consumer view, expecting to drag consumers along with it, and getting seriously frantic when the consumers DON'T follow.
Who wants a tablet with 1-hour battery life and the heat of the core of the sun, anyway?
Thrillseeker
04-07-2014, 11:57 AM
Who wants a tablet with 1-hour battery life and the heat of the core of the sun, anyway?
It's pretty cold here in April, so I DO!
elemein
04-08-2014, 10:21 AM
It's pretty cold here in April, so I DO!
It's still cold up here in Canada in April. We still see negative temps.
But if we got one of those K1's without a heatsink and just placed it on the ground, we could have the climate of a very large, airy Jamaica.
Okay now we're just getting silly ;) Back OT.
They'll come in handy for the next Ice Age. Assuming we have enough fusion/fission plants available to power them. ;)
juanrga
04-08-2014, 12:46 PM
1. Okay, Intel is putting more effort into shrinking die size than into raising GPU performance, I guess?
No. Intel has modified their tick-tock strategy to account for their recent emphasis on the graphics part.
Broadwell's main improvement will be on the graphics side. According to some leaks, the Broadwell GPU will be 40% faster than Haswell's; according to other sources, 80% faster.
Joe212
04-08-2014, 07:12 PM
No. Intel has modified their tick-tock strategy to account for their recent emphasis on the graphics part.
Broadwell's main improvement will be on the graphics side. According to some leaks, the Broadwell GPU will be 40% faster than Haswell's; according to other sources, 80% faster.
Intel seems to alternate CPU and GPU refreshes. Since Haswell got the CPU core refresh, Broadwell would be due a GPU refresh.
As for GPU performance, gains can come from two sources: a refreshed GPU architecture, and the shrink, which allows more units in the same real estate.
juanrga
04-08-2014, 08:36 PM
Intel seems to alternate CPU and GPU refreshes. Since Haswell got the CPU core refresh, Broadwell would be due a GPU refresh.
As for GPU performance, gains can come from two sources: a refreshed GPU architecture, and the shrink, which allows more units in the same real estate.
Yes. In the new tick-tock cadence, the tick is a die shrink and the introduction of a new GPU, and the tock focuses on the CPU.
http://www.extremetech.com/computing/170291-intels-14nm-broadwell-gpu-takes-shape-indicates-major-improvements-over-haswell
Some Broadwell GPUs will receive their performance gain from a third source: L4 cache.
chipshot64
09-04-2014, 07:35 PM
Intel discontinues three Core M Broadwell CPUs
"Intel has announced that it will discontinue Core M 5Y70, 5Y10 and 5Y10A Broadwell processors before they are even launched. The company states:
Market demand for the products listed in the “Products Affected/Intel Ordering Codes” table below have shifted to other Intel products. The products identified in this notification will be discontinued and unavailable for additional orders after the “Last Product Discontinuance Order Date”
If you look at the bold text, it states that demand has shifted to other Intel products. Well, this does not indicate that Intel will skip Broadwell mobile CPUs to make way for Skylake; instead there is a different story. Core M Broadwell CPUs went into mass production earlier this year, before Intel was able to fix the TSX issue causing unpredictable system behavior, so the company moved on to a new stepping."
http://www.chiploco.com/intel-discontinues-core-m-broadwell-cpus-35950/
After launching Bay Trail last year with many issues, like power management and graphics driver problems, Intel shows that with Broadwell... worse is yet to come?
bladerash
09-05-2014, 08:26 PM
It has a die size of 82 mm^2, so there are over 700 dies on a wafer.
Those are going for $281 each if working.
I was expecting it to be somewhat more expensive than a high-end ARM SoC or an Atom Bay Trail. 2x maybe... or 4x, but 10x as much? Ehhh...
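The "over 700 dies" figure is consistent with the usual first-order dies-per-wafer estimate (a rough sketch only: it ignores yield and scribe lines, and assumes a roughly square die on a standard 300 mm wafer):

```python
import math

# First-order dies-per-wafer estimate: wafer area over die area, minus a
# correction term for partial dies lost at the wafer edge.
wafer_diameter = 300.0  # mm (standard wafer, assumed)
die_area = 82.0         # mm^2 (figure quoted above)

wafer_area = math.pi * (wafer_diameter / 2) ** 2
dies = wafer_area / die_area - (math.pi * wafer_diameter) / math.sqrt(2 * die_area)
print(int(dies))  # 788, i.e. "over 700" before yield losses
```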
It has a die size of 82 mm^2, so there are over 700 dies on a wafer.
Those are going for $281 each if working.
I was expecting it to be somewhat more expensive than a high-end ARM SoC or an Atom Bay Trail. 2x maybe... or 4x, but 10x as much? Ehhh...
The Haswell i5 Y parts are up to $300. This is a bargain. ;)
http://www.cpu-world.com/CPUs/Core_i5/Intel-Core%20i5-4300Y%20Mobile%20processor.html
vBulletin® v3.8.7, Copyright ©2000-2014, vBulletin Solutions, Inc.