[sticky post] Welcome note
Welcome! You've reached a place where you can read what is going on behind closed doors and learn the future of the game development industry, mostly from the Microsoft side of things. Be informed on that topic; it is fun to read the official news afterwards. Here we discuss things that will happen within a 1-3 year scope. Be careful: you will need some time before you realize that this is all not fake. All insider info is under the "Insider Daily" header. All other analysis and topics are from blog contributors.

There will be some typos, as English is not my native language. Forgive me, I want to improve; that is also one of the reasons I run this blog :) Insider posts are encrypted and don't have ideal English either, for a reason.

Major posts about Xbox One tech (read everything and draw your own conclusions)

Xbox One Stacked hardware proof
1. Did Mistercteam find the stacked die in the Xbox One SOC image? (stacking proof #1)
2. Insider Daily. Driver update will bring more than 50% more power. More multiple-SOC evidence received (stacking proof #2).
3. Mistercteam Daily. TSV 3D stacking found on the Xbox One SOC (stacking proof #3)
4. @MSFTnerd confirms Insider's 2012 info about Xbox TV and ARM SoCs (stacking proof #4)
5. Someone used Microsoft exec names to confirm Insider info and Mistercteam's investigations

Inside Xbox One (2014 tech - full HSA/DX12)
1. Insider Daily. 50% more powerful SDK for 3rd parties confirmed. More than 32MB of ESRAM on the way.
2. Insider Daily. Full HSA 2014 tech is the next big thing. Old 3D engines need to be re-engineered.
3. Xbox One's hardware Tiled Resources in more detail. A huge Xbox One advantage over PS4.
4. Inside Xbox One: Using HW Tiled Resources with ESRAM in more detail. Prepare to be amazed.
5. Insider Daily. Xbox One display planes in more detail. It is like 2-3 systems in one.
6. Insider Daily. Hana 2 has G-Sync-like tech, and Xbox One is not a 1.31TF GPU

Sony's lies and brainwashing
1. "Industry insider" CBOAT is actually a neogaf.com moderator. Fraud disclosed.
2. Neogaf.com and n4g.com are owned by Sony, and non-Sony fans could do nothing but ignore them
3. Insider Daily. Every company fights till the end. Now it is Sony's turn to overpromise & underdeliver
4. Sony isn't afraid to lie because they already know they will fail and PS4 is their last console
5. Insider Daily. MS officially calls Sony insiders liars. GroupKarma finds the liars are being paid.

1. Gamertags exchange.
2. Next Gen War Scores Dashboard - key statistics about who is winning next gen right now, and how (currently: Sony 5:3 Microsoft).
3. True history of current generation of video games 2003-2011.

Mistercteam Daily. PC 2.0 aka DX12 aka full HSA in more detail, or why XB1's 5TF DP will last 10 years
Misterx: Hats off to Mistercteam for laying out the insider info on why Xbox One is 5-6 TF DP and the first full DX12 hardware, and why DX12 is like PC 2.0 - a "4 generational leap" meant to "last 10 years".

Mistercteam: The Story so far

Part 1. DX12 will improve performance by >100%. That is impossible unless there is hardware capable of it

There has to be hardware in there, and in fact I keep finding more and more evidence of it. Let me start by recalling my previous slide: there is no way DX12 alone gives a boost of more than 100%. What it does is super-optimize the CPU+GPU+scheduler mechanism by improving how commands are submitted to the GPU. Commands are submitted using bundles (which, as they stated at GDC, X1 has been using since day one, but with PSO not yet activated).

Bundles can be seen as a list of commands - just as I said, moving SIMT or SIMD toward MIMD/SPMD. Plus there is definitely something different about the X1 GPU. I will show it later.
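The bundle idea (validate a command list once, replay it cheaply many times) can be illustrated with a tiny cost model. This is purely illustrative Python, not the real D3D12 API, and the per-command microsecond costs are made-up assumptions, not measurements:

```python
# Illustrative model of command bundles: an individual submission pays a fixed
# validation overhead every frame; a bundle is validated once at record time
# and then replayed cheaply, so per-frame CPU cost drops sharply.

VALIDATE_US = 5.0   # hypothetical cost to validate one command (microseconds)
REPLAY_US = 0.2     # hypothetical cost to replay one pre-validated command

def naive_submit_cost(num_commands, frames):
    # every command is re-validated every frame
    return num_commands * VALIDATE_US * frames

def bundle_submit_cost(num_commands, frames):
    # validate once when recording the bundle, then replay it each frame
    record = num_commands * VALIDATE_US
    replay = num_commands * REPLAY_US * frames
    return record + replay

cmds, frames = 10_000, 60
print(naive_submit_cost(cmds, frames) / 1e6, "s of CPU per 60 frames (naive)")
print(bundle_submit_cost(cmds, frames) / 1e6, "s of CPU per 60 frames (bundles)")
```

With these (assumed) numbers, 60 frames of 10,000 commands cost 3.0 s of CPU time submitted naively but only 0.17 s with bundles, which is the kind of CPU-side saving the bundles concept is about.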

Part 2. The CP, ACE, Shader engine, R9 290x as an example

Misterx: That slide is great for understanding.

Part 3. NOC (Network on Chip) - the local cloud

Let's say the 2 Compute CPs are responsible for at least 3 key components:
a. A group of CUs that will certainly have performance of ~2.6TF DP (48 CUs)
b. A group of CPUs + accelerators that will have performance of ~1.3TF DP (8 cores + 6 small ARM cores, probably to add floating-point performance)
c. A block of eSRAM to help process local data as fast as possible; locality is the key, as moving data costs more than computing

a+b+c will be based on a tile-based architecture and the NOC concept - a supercomputer-style design. It is a local cloud, cloud-powered.

Cloud is a data-center term; it also covers the concept of connecting each processing element using QoS and the NOC concept.

I have already posted why it is called a local cloud.

Let's take a look back at why it is called a local cloud / NOC design: it comes from people related to Xbox development, plus from D. Burger, who pushed AMD/Nvidia toward tile-based designs like Echelon and AMD Pirate Islands (Volcanic Islands).

The gfx part will still be Sea Islands based and will not be the most advanced part of this.

Part 4. The top SOC view - there are no circuit lines visible on X1


Part 5. ROPs - 64-bit, 128-bit or 256-bit? A bit deeper on this

Even Kaveri with 8 CUs can have 64 ROPs at 64-bit, or 16 ROPs at 256-bit.

Basically, like I stated, next gen is like resetting the numbering. The CU count looks lower, but each CU means more than double; the same goes for ROPs,
and for the CP - just as with CPUs nowadays, where 1 "core" can mean 6 small cores inside.
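The "fewer but wider" point is just arithmetic: total per-clock write width is unit count times unit width, so configurations with very different unit counts can be equivalent. A one-line check (the configurations are hypothetical examples, not confirmed specs):

```python
# Total per-clock write width = number of ROPs x width of each ROP in bits.
# The point: fewer, wider units can equal more, narrower ones.
def total_bits_per_clock(rops, width_bits):
    return rops * width_bits

wide = total_bits_per_clock(16, 256)    # 16 ROPs at 256-bit
narrow = total_bits_per_clock(64, 64)   # 64 ROPs at 64-bit
print(wide, narrow)  # both 4096 bits/clock
assert wide == narrow == 4096
```

This is why a raw unit count on a spec sheet is not, by itself, a throughput comparison.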

Part 6. X1 ALUs seem doubled per CU - some preliminary comparison with PS4

1. If DX12 moves to the bundles concept, there is in fact an opportunity to make the ALUs more compact, reduce internal registers, etc.

2. AMD stated that they can pack more by using HDL (high-density libraries), but this means MS must also have customized the layout.

Now the SIMD/ALU comparison. Notice how the X1 tiles are much smaller - count the tiles, and surprise!

**) The tiles represent parts of the ALUs. As each ALU is like a very small CPU, it needs registers/L1/L0; that is why we can count/approximate the ALUs by counting the tiles. By reducing the SRAM/L1/L0 size, plus a more efficient layout, logically more can be packed in.

the source:
Trinity :
X1 & PS4

Part 7. Another user makes a very good representation of the Xbox One SIMD differences

xone_br33: It's clear in that figure from Chipworks that you posted earlier. I wonder why these web sites didn't even speculate about the differences between the two architectures; the Radeon cores are totally different between PS4 and XB1. I made a diagram based on that figure (it's quite simple, but it's clear how XB1 has double the tiles of PS4):

After that, and considering each little tile approximates the number of ALUs, the "PS4 1152 vs 768 ALUs on XB1" argument was dead in the water for me.

Part 8. Dual render is a must for next-gen context switching

Dual render is a must for context switching - a next-gen feature, and hinted at all along

Part 9. The GFX CP - the better explanation from AMD's own VP, Eric Demers' slide

Back then, people were in fact not confused about this. It is funny how today (3 years later) people have forgotten it: people forget 16 ROPs could be 128-bit or 256-bit; people forget each Gfx CP will always have its own CUs. When we ask people today about the X1 CP, they say PS4 has more CPs, LOL, when PS4 only has 1 CP (8 ACEs + Gfx).

Then, back in 2011, they already showed this per single Gfx CP. In fact, that label and diagram resemble the X1 Hot Chips slide, where 1 Gfx CP unit always controls N CUs. A Gfx CP will have ACE + Gfx.

And they only hinted at the time that someday compute would be more important; by going that way they can remove all the middle components that relate to gfx but not to compute, and the CP is then better named a Compute CP, since no Gfx commands pass through it to the CUs.

And X1 is more advanced in 2 key areas: its Gfx CP is now dual render, and its Compute CP is not only dual render but also has latency cores (the CPU part); plus the Compute CP is tied to the fast eSRAM too.

Part 10. The most important puzzle + fact

And this 2011 info from MisterX's insider is so much in line with it (how could he have known?)

With all that info, for me, we can answer
what the architecture is, for example --> 2 Compute CP units, plus why MS stated 8 cores doing 6 ops each, and the 768 ops/cycle figure.
I will focus on the Compute CP part; people can extrapolate the diagram I will provide later on for the Gfx CP too.

Part 11. The command processor, from 1 render pipe to dual render to compute CP

Srenia Ia: GCN is stateless. Compute units run in parallel with different shaders, asynchronously, and parallel to the graphics pipeline. The change is coming soon. :-)

Insider Daily. Do Nvidia employees continue to spread false info about consoles?
Insider: Witcher 3 is 1080p 30fps on both PS4 and X1 at this very point in time. DX12 features on X1 are weather and texture based. PS4 is using its ACEs to combat X1's DX12 acceleration features via OpenCL / the Metal API... neck and neck. Textures will have more sharpness on X1. Whichever one avoids frame stutter will be crowned the king of this heat. The devs did the right thing by pushing it back for all gamers; it's too grand to rush.

Misterx: What is your source for witcher 3?
Insider: Right before the delay, both were running identical 1080p@30. X1 won't be downgraded.

Misterx: Those Russian guys at gametech.ru are well-known Microsoft haters. They said last week that the console versions of Witcher 3 are equal to the PC's minimum settings, claiming PS4 is 900p and Xbox One is 720p. Nvidia and PCs profit from this leak.

Gametech.ru could have contacts at CD Projekt or Nvidia Labs Russia. I think CD Projekt sucks up to Nvidia, as Nvidia helps them and could fund them a little. Or it is an Nvidia Russia employee "leak" to Gametech.ru to trash the consoles.

Nvidia Russia labs help game developers optimize their engines for Nvidia GPUs and do optimization work for many games. Gametech.ru could have private contacts with some Nvidia Russia employees; Nvidia Russia has been very open to game journos recently. Some Nvidia employees could have "leaked" false info about the console versions of Witcher 3 to Gametech.ru. They have been doing this bad PR job on consoles for ages, ever since they lost all the console contracts.

Don't forget this official Nvidia slide from back in Feb 2013. Is Nvidia still butthurt about consoles, even after Microsoft licensed DX12 tech to them? That is strange...

Maybe CD Projekt wants MS/Sony money to take this game to 1080p. They made some media noise via controlled "leaks" and "Nvidia helps us optimize for PC" talk to convince Microsoft/Sony to fund the work to reach 1080p. If CD Projekt RED remains silent over the next several days instead of debunking this rumor, then that is the case for me. Anyway, Witcher is not my type of game, as it feels a little cheap to me (it lacks a blockbuster feel, since, as I understand it, CD Projekt RED is a relatively small studio). But many enjoy it.

Follow what's next - Mistercteam is finding the last missing bits and will post a summary soon.

Misterx: Visited a track day @ Moscow Raceway for the first time. Forza Motorsport came to life for me, and the feeling was amazing :) I never thought my car could do that, even though these were my first several laps, learning the track and finding the limits.

Golf GTI MK5 - 0-100 km/h in 6.5 seconds. This is my wife's former car, and I enjoy it more than my previous, bigger car. I bought my wife an SUV and took this car away from her. Happy me :) Amazing handling and smoothness in everything - Top Gear's Jeremy Clarkson's personal choice in the hot hatch class. Now I plan to upgrade to the MK7 GTI.

Insider Daily. Why Xbox One acts like a local cloud. Ryse is nothing compared to what is coming.
Insider: Why would we use a local cloud alongside the external cloud? Think of the big data center analogy...

Because its the same principle that is used when looking at the xbox one architecture.

How does an Azure server function? Where are the sent data and the processed returns stored?

If you send data packets to another processor and back to the same device, is it really true cloud processing, or just offloading?

The xbox one has supercomputer design aspects. It truly is a unique design.

1) The main SOC is designed to emulate the principles of a large data-processing computer.

The DSP inside X1 uses the same software code that Azure servers use for offload. The CPU can also process GPU branching - far more efficiently than even the Cell design aspired to in its original form. Audio is processed externally from the GPU/CPU. CUs can write directly to the command buffer. There are two memory functions, large caches, accelerator data move engines, and dual logic.

The x1 design really is a big data center.

It is a server with extra offloaded CPU and GPU computational processing.

Rather than calling the console a server, it is better to refer to the design as a local cloud with offload.

A cloud analogy means the device itself is not processing the code; it is all done on a different architecture.

Saying there is only an external cloud is wrong. It does not give the correct picture once X1 is brought into the sum. It should be called external cloud offloading.

The data move engines are logic engines that move data around a processing unit; they can also feed data directly to the command buffer.

The software architecture is just catching up. There are still things shut off, but that relates more to the new CPU modification and to when more about the ring bus becomes available.

Microsoft put the first local cloud server with offload in people's homes. When the devices arrive that use the local cloud - not SmartGlass; think Wii U, think streaming TV boxes, portables - then this paragraph will make perfect sense.

Like e3 :)

But people, feel free to use whatever terms suit your needs.

I told you, misterx... :) (misterx note: after the DX12 100% boost claims by other devs)

E3 will probably be the last time we chat in these technology debates. They are definitely going to show the software and technology, but they're going to let the console do the talking. Amazing stuff - you guys have never seen anything like this. It's on the magnitude of 4x Ryse in gfx @ 1080p @ 60fps refresh rate. CGI-level gfx.

And this blog goes down as the first front for truth, not petty paid twisted lies. I think the web site is a good start for a new community for Xbox fans.

And after the ash has settled and the SDF realise their attitude towards others is why the industry is in disarray, we can all get back to the true reason we are all here - and that is gaming. No doubt we'll see amazing stuff from Nintendo and Sony this generation too, because no matter what, it is always about the games. :)

Misterx: Industry on DX12

AMD's Raja Koduri: "And it's not a small benefit. It's… like getting four generations of hardware ahead with this API."

Intel's VP of Engineering, Eric Mentzer: "This is absolutely, I think, the most significant jump in technology in a long, long time."

Nvidia's VP of Content and Technology, Tony Tamasi: "existing cards will see orders of magnitude improvements from DirectX 12's release, going from hundreds of thousands to millions and maybe tens of millions of system draws in a second."

There are two main features of DX12 so far:
- Reduced CPU overhead. The insider said it will easily give DX11 cards - or the main SOC / 1st layer on Xbox One - a performance boost of 50%.
- New hardware for DX12.
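Tamasi's draw-call numbers are really a statement about per-draw CPU cost. A sketch with hypothetical per-draw costs (the microsecond figures below are illustrative assumptions, not measured driver numbers) shows how lower overhead plus multicore command recording moves you from hundreds of thousands to tens of millions of draws per second:

```python
# Illustrative only: hypothetical CPU cost per draw call, not measured values.
DX11_US_PER_DRAW = 5.0    # heavy state validation in the driver, single-threaded
DX12_US_PER_DRAW = 0.25   # thin command recording, validated up front

def draws_per_second(us_per_draw, cores=1):
    # DX12-style command lists can be recorded on several cores in parallel,
    # so total throughput scales with the number of submitting cores.
    return int(cores * 1e6 / us_per_draw)

print(draws_per_second(DX11_US_PER_DRAW))           # single-core, heavy driver
print(draws_per_second(DX12_US_PER_DRAW, cores=8))  # 8 cores, thin API
```

With these assumed costs, the model gives 200,000 draws/s for the heavy single-core path and 32,000,000 for the thin 8-core path, the same order-of-magnitude jump the quote describes.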

Brad said:
-"Box One is the biggest beneficiary; it effectively gives every Xbox One owner a new GPU that is twice as fast as the old one."
-"it's not literally (it's software, not hardware) but yes, dx12 games will likely by more than 2x as fast."
-"it didn't. It was/is still basically a single core stack. With DX12 all 8 cores will be able to split the work." clarified Brad.

Wider CUs working in parallel. A supercomputer architecture with effective parallelism. PC 2.0.

Brad teases us but cannot tell the whole story yet before MS does, and he is even backpedaling a little, saying PS4 hardware is still better.

It is not consistent to claim more than a 2x boost for XB1 and still say PS4 is more powerful:
1. DX12 hardware is better than DX11 hardware. DX12 is not only the CPU-overhead reduction that MS haters and fools think it is, but also new hardware features.
2. Doubling performance would mean Xbox One is already better on paper than PS4. But Brad then backpedals and says PS4 is still better. By his logic, Xbox One currently uses less than 0.6TF for Ryse and Forza 5, while PS4 uses all its power and still struggles to maintain 60fps. It is nonsense that Xbox One now uses only 0.6TF and with DX12 will reach the full 1.31TF.

Insider Daily. Is the CLOUD demo shown actually the "local CLOUD" - the _offline_ Xbox One's true power potential?
Insider: Arrrr, not 32MB of ESRAM... somebody help the devs. There are 47MB of known on-chip memory. CPU / GPU have read and write. What's duplex mode :) what's duplex 64MB...

GPU + CPU + ESRAM = 2x... CUs 4x. Are we seeing something, A12... 768x4 :)
The local-cloud demo has no frame drops vs 2fps on a standard PC architecture. The building is rendered on both; CPU/GPU are coherent with memory.
A car goes fast around the track, but the one without an accelerator can't power into the corners and out. Turbo/supercharged... justified. Exploited at E3.

One thing computers need to go fast... fill it in... A12, next gen. Next gen is now!!! 2014

More, lots more... :)

Why X1 has 4 CPs. From the DX12 session, per CP = 12 CUs; Mantle = hardware engine
Basically, each CP this time has its own dedicated share of CUs,
unlike previous CPs.

Yes, older-gen CPs can still work in this paradigm,
as the CP is programmable, but to fully utilize it
a new CP paradigm is needed.

X1, in simple terms, is:

There are 2 CU versions on X1:
one is from Sea Islands (DX11.1),
the other is from Pirate Islands (XTX).

1 GFX CP (DX11.1 CUs) = universal (does all Gfx + Compute + DMA)
1 GFX CP (DX12 beefed-up CUs) = universal
2 Compute CPs (DX12 beefed-up CUs) = focused on compute

I will make a slide later.
Plus, DX12 is tiled-resources capable.
Tiled resources need Pirate Islands XTX to be used effectively;
old CUs can do it, but not as effectively as PI XTX.
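The tiled-resources idea mentioned above is easiest to see in code: a huge virtual texture where only the tiles actually touched are backed by real memory. This is a minimal Python sketch of the concept (the `SparseTexture` class and its sizes are illustrative; only the 64 KB tile size matches D3D tiled resources):

```python
# Minimal sketch of tiled resources: a large virtual texture where only
# the tiles actually accessed are mapped to physical memory.
TILE_BYTES = 64 * 1024  # 64 KB tiles, as in D3D tiled resources

class SparseTexture:
    def __init__(self, tiles_x, tiles_y):
        self.tiles_x, self.tiles_y = tiles_x, tiles_y
        self.resident = {}            # (x, y) -> tile data, mapped on demand

    def touch(self, x, y):
        # map the tile only the first time it is needed
        if (x, y) not in self.resident:
            self.resident[(x, y)] = bytearray(TILE_BYTES)
        return self.resident[(x, y)]

    def virtual_bytes(self):
        return self.tiles_x * self.tiles_y * TILE_BYTES

    def resident_bytes(self):
        return len(self.resident) * TILE_BYTES

tex = SparseTexture(256, 256)        # a 4 GB virtual texture
for x in range(8):                   # the camera only ever sees a small window
    for y in range(8):
        tex.touch(x, y)
print(tex.virtual_bytes() // 2**20, "MB virtual")
print(tex.resident_bytes() // 2**20, "MB resident")
```

Here a 4096 MB virtual texture ends up needing only 4 MB of resident memory, which is why tiled resources pair so naturally with a small, fast pool like the 32 MB of ESRAM.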

gforce1981: Hi guys,

Isn't it fascinating that as each week goes by we are pleasantly greeted with more bits of information from MS showing what is going to be possible with not only the Xbox One, but the power of the cloud.

It is also interesting to note how 'slowly' they release more information (which suggests the Xbox One was released earlier than they had hoped), but as they release more, it fits more and more in line with the research, development, and partnerships that were put together to bring us the Xbox One.

If you are a Sony pony (not just a regular fan), it has got to start hurting right about now, because "EVERYTHING" they have been trying to hold on to has been falling apart.

The power of the cloud was always real, but like MS said, they have to demonstrate it, and they were clearly just not ready previously.

Then we add on top of this the special fixed-function pieces like the Move engines, esram, and the network attached to the SOC - it is all coming together.

I think this is why Sony tried to get in early with the launch of their console: they wanted that 'early' rush of success, as they know they are simply not going to be able to compete.

So we know already the following now;

Cloud can do the following;

Dedicated Servers
AI
Physics & Compute.

that then only leaves a few things that the box (Xbox One) has to do;

Rendering & Audio

We already know the audio in it is MASSIVELY powerful, everything is pointing to the fact that the SOC is stacked in some shape or form, and there is the possibility of other (special-function) hardware for things like forms of ray tracing, etc.

Then we haven't even seen the benefits yet of Tiled Resources and how that is going to save a MEGA TON of bandwidth and make all of it even better again.

I think it is really hard for ANYONE to think that MS hasn't put some SERIOUS thinking and R&D into this box and it is only getting better.

Such a great time to be a gamer.... (xbox gamer that is)


gforce1981: Isn't it interesting looking back at things?
Have a look at this:
"In addition to those compute and graphics cores, the Xbox One chip also has 15 specialized processing cores, Sell said, to help with video, audio, display, and other chores, and to offload chunks of work from the CPU and GPU. Those 15 helpmeets also share the same memory as do the CPU and GPU"

15 other 'cores'. We know that 4 of those 15 are the audio, and we also know from Cadence that those 4 cores are programmable.

"Cadence Design Systems, Inc. (NASDAQ: CDNS), a leader in global electronic design innovation, today announced that Microsoft utilized four Tensilica(R) processors in the Xbox One audio subsystem as noted in the Linley Group Microprocessor Report entitled, "Inside the Xbox One Mega-SoC"
So are the 'move engines' actual cores? Because if they are not, there is more that we are not aware of.
Does this mean there are another 11 cores (because this is what the MS guy, Sell, said), and could these too be programmable?
And none of the current-gen engines would be utilising any of this extra stuff yet.
We know that the 4 cores that do the audio are 'super' powerful for that task. All I can say is that if there are 11 other 'cores', as this guy said, and they are special-function (and programmable)... bring it, is all I can say.

then he says this
"Almost every aspect has been customized," said Sell, "either a little or in some cases a lot."

This is why they are saying the caches are important, and having the right data in the right caches, because when the whole system is taken advantage of, it will be sweet!

"There's also 47MB of cache and other on-die storage"

See the bit 'and other' - so there's more than 47MB of on-die storage.

Am I wrong, or was esram considered a scratchpad sort of memory and NOT cache? Because that could mean there are eDRAM caches attached to those other 15 cores! (right data... in... the right... caches...) :)

Also, Isn't this funny, http://ip.cadence.com/about/customer-profiles/graphics

Inside AMD's new Radeon R7 and R9 graphics chips you'll find their TrueAudio technology, which is based on a Tensilica HiFi EP DSP and two Tensilica HiFi 2 DSPs.

Funny how both the Xbox One and the new AMD GPUs use Cadence's DSPs for audio offloading :)

Then this :) http://pixelrant.com/2013/10/08/everything-you-wanted-to-know-about-amds-new-trueaudio-technology/
"MPC: I’m not up to speed with Tensilica hardware but why the need for the Xtensa SP? Isn’t it an FPU? I thought the calculations were done in the GPU?

AMD: The Tensilica Xtensa HiFi EP provides single and double precision floating point assistance for calculating accurate simulations of the intended audio environment. Compute resources from the GPU or CPU pipelines are not required, and that’s the intention of AMD TrueAudio: 100 percent offloading to preserve or even improve system performance, even with superior audio"

.... 100 percent offloading to 'preserve or even improve system performance.... even with SUPERIOR audio'



"TrueAudio leverages the significant bandwidth and low access latency of a Radeon’s memory pools, and that bandwidth and latency is critically important when offloading audio tasks from the CPU to the TrueAudio engine"

bandwidth and latency is critically important when offloading audio tasks from the CPU to the TrueAudio engine

... Can someone tell me why MS used DDR3 again... and esram... :) Looks like 'everyone' is saying how important latency is with OFFLOADING...

Just look at the amount of stuff the CPU is responsible for, then have a good hard look at what MS has done with the Xbox One :)

gforce1981: And the GOLD.

from the reviewer "MPC: I’m guessing Xbox One will not have TrueAudio as Microsoft has its SHAPE audio engine but what about PS4?"
"AMD: Sony and Microsoft would be in a better position to comment on the functionality of their audio hardware. I wouldn’t want to answer on their behalf."
And only in Feb was it announced that the Xbox One also uses 4 cores from Cadence for audio offloading.
(Isn't it funny how in line MS is with the future of AMD?)
MPC: For TrueAudio to work, developers must use and support TrueAudio’s API correct?

AMD: The game developers themselves don’t necessarily have to use or support the TrueAudio API. Because we’re working with library and plugin developers directly, game developers only have to choose between a software-accelerated plugin or the TrueAudio-enabled plugin when they’re working with their library of choice. Devs are making plugin/SFX choices for their games anyhow, so TrueAudio just becomes another choice on the menu, meaning it’s very simple and unchallenging to implement.

This is the same thing MS is doing with Tiled Resources: they are working with the 'middleware' companies whose tools other game devs then put into their engines!

Also, notice how the Audio Cores are programmable.
Notice how the other cores would most likely be programmable.
Notice how the GPU is said to be .... programmable.
The picture just gets clearer and clearer.

Also notice that Cadence was only allowed to release the info about their work on the Xbox One in Feb. (There will be more to come.)
And looking at this picture again...
Can you imagine what the CPU can then also be used for when you no longer need it to do
audio, graphics feeding, compute, physics, and A.I. (everything that MS has made offloading possible for, plus the cloud, which has been confirmed)?
Oh man, the future is looking good, and this is why they can claim the long 10-year cycle.

This is also why it is actually NEXT GEN, and the PS4 is simply a shiny old gen system.

marcberry: WOW!!! Good job...
I will add this from AMD's HSA page: "The net result was a 2.3x relative performance gain at a 2.4x reduced power level*." X1 is even better.

xboxfandev: The cloud offloading bandwidth and latency argument, debunked
The arguments I've seen on the internet appear to assume the cloud is rendering some part of the scene, not just calculating the subdivision of the building into chunks after collisions with gunfire. I just couldn't understand how someone could think that after watching the presentation. The local system renders the entire scene using only the calculations the cloud provides. The comments about this on the internet are pathetic.

The entire geometry of the scene can be pre-stored in the cloud. If anyone enters the area in the game, they only need to make a copy of it using local bandwidth. The same copy and calculations could be used for all players if the game is multiplayer. The only things that need to be sent from the client to the cloud are the initial position, trajectory, and acceleration of the shots. 20 bytes is sufficient to transmit all of this data with enough precision per bullet. Since a weapon usually does not fire more than one shot at a time, we need 20 bytes per transfer. A machine gun firing 10 shots per second leads to about 200 bytes per second of payload, plus an IP header per transfer (20 bytes for IPv4, or 40 for IPv6 plus extension/optional headers) - about 400 bytes per second at one packet per shot, and less if shots are batched. That is a negligible bandwidth requirement; a 56k modem connection could handle it.

The returned data required by the local system will normally be greater in volume, but it can be estimated in much the same way. In addition to the previous information about the bullet, which can be treated as a point, it is necessary to send the geometry of the chunk created by the collision and the chunk's rotation. The geometry of the object, its initial position, and its centripetal acceleration are the only data the local system needs. I'll estimate 100 bytes per chunk as an approximation. The data only needs to be sent at the moment the chunk separates. At the peak of the video there were 36,000 chunks, which would still fit a 50Mbps download connection even if all of that data were retransmitted every 100ms - but these chunks were not generated simultaneously. No more than about 100 chunks are generated at the same time, so a connection that can send 10,000 bytes every 100ms is all that is needed. A 1Mbps connection would handle it easily.

I used a 100ms response time, which is good enough for physics and is also consistent with today's latencies. My local latency to the Titanfall Azure servers is between 19ms and 33ms. It is pretty reasonable to do, no?
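These back-of-envelope numbers can be rechecked with a short script. All the byte sizes are xboxfandev's estimates, not protocol facts, and the one-packet-per-shot uplink is a worst-case assumption:

```python
# Re-running the back-of-envelope cloud-physics bandwidth estimate.
# All payload sizes are the post's estimates, not measured values.
BULLET_BYTES = 20        # estimated position + trajectory + acceleration per shot
IPV4_HEADER = 20         # IPv4 header size in bytes
CHUNK_BYTES = 100        # estimated geometry/rotation payload per debris chunk

# Uplink: machine gun at 10 shots/s, worst case of one packet per shot.
shots_per_s = 10
uplink_bytes_per_s = shots_per_s * (BULLET_BYTES + IPV4_HEADER)
print(uplink_bytes_per_s, "B/s uplink")          # trivial even for a 56k modem

# Downlink: at most ~100 chunks spawned per 100 ms window.
chunks_per_window = 100
window_s = 0.1
downlink_bps = chunks_per_window * CHUNK_BYTES / window_s * 8
print(downlink_bps / 1e6, "Mbps downlink")       # well under a 1 Mbps line
```

The script gives 400 B/s up and 0.8 Mbps down, supporting the post's conclusion that the demo's numbers fit comfortably inside an ordinary home connection.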

I beg you all to check whether I have talked a lot of nonsense before using this text, please. English is not my first language, but I'm a developer with experience in low-level game programming (I've developed a homebrew game and my own engine and graphics tools) and I know something about cloud use.

anothertech: So, the Xbox One now has:

- DX12 - confirmed
- 4 generational hardware leap - confirmed
- pc/mobile/xbox one app compatibility - confirmed
- cloud compute power - confirmed
- Phil Spencer new Xbox Head - confirmed
- secret sauce - confirmed
- arm chip - hinted
- 360 emulation - hinted

They said there was no such thing as magic or power. They laughed at the prophets of old (the insider). They said there was no such thing as secret sauce. And now Jesus has landed with his army of angels, and the atheists (ponies) are screaming as they fall into their own fiery abyss. LoL.

Reading the ponies all over the net right now, they are imploding and turning on each other like wild, ravenous animals.
Craziness. Like a digital armageddon, but it's all in their minds lol.

Microsoft right now
Microsoft at E3
Meanwhile Sony and the SDF...

List of major insider info (as of April 2014)

- full DX12 inside Xbox One (insider info since 2012; confirmed by Phil Spencer, Microsoft, Mar 2014)

Almost confirmed
- Inside Xbox One is not a 7770-class 1.31TF GPU. That is a logical conclusion from the first item (info since 2013).
- 3rd-party Xbox hardware license (Dell) (info since 2012).

- No problems with 1080p in the future (confirmed by many devs).
- Tons of exclusives (Microsoft leads this gen in quality exclusives).
- 360 game emulation on Xbox One (shown by Microsoft at Build, April 2014. The insider said back in 2011 that some 360 games would be downloadable on Xbox One via Xbox Live, with automatically improved GFX thanks to forward compatibility).

Expected to be confirmed in a 1-3 year scope
- Stacked SOC (I personally still stand by the claim that we do not see the full GPU on the 1st layer. The insider confirmed that not all CUs are on the Hot Chips slides; Mistercteam has gathered tons of proof. The main idea: what we see now on the 1st layer is the CPU in the HSA paradigm, and the next layer is the GPU. So, in the game of words, Albert Penello was again semi-right).
- 5-6TF DP total power for Xbox One. The GPU inside Xbox One is 2-3x more powerful than PS4's. Plus Xbox One has offloading hardware PS4 does not have.

- 32MB of ESRAM is not all the embedded RAM on Xbox One. There is another type of embedded RAM besides ESRAM.
- Xbox Tablet
- Xbox TV console
- Xbox Fortazzella AR glasses
- Games to blow our minds on Xbox One.
- 1080p patches for some existing games when the "stereo" driver is released for Xbox One.
- If Steam machines get DX12 hardware, then Xbox One will get some sort of Steam app. Microsoft doesn't want to hand their R&D results for free to companies that trash Microsoft.
- NDAs to end, and developers will talk more openly about how amazing Xbox One and DX12 are. DX12 is the invention of PC 2.0.
- PS4 could be Sony's last console - they could go software-only, because they no longer have the money or brains to compete with the Americans in AAA gaming.
- Sony's lies and brainwashing will stop fast after MS shows their cards.
- State-of-the-art hardware upscaler inside Xbox One. The upscaler can add a bit of AA and raise FPS by adding frames.

- Ray tracing capable via special hardware.

360 major confirmed predictions
Back in 2009:
- 360 will blow our minds in GFX thanks to the X-Engine (the insider's self-created name).
Halo 4, Forza 4, and Forza Horizon proved there was much juice left in the 360. There is still no PS3 game as open-world and beautiful as those games. Halo 4 sometimes looked next-gen, especially that corridor level. PS3 games constantly use tricks like narrow corridor design, low draw distance, and static cameras to look better. 360 games usually don't use those tricks, and when they do, they are miles ahead. Compare the corridor level of Halo 4 with Uncharted 3.
- There is no Kinect lag problem. Later, with software updates, Kinect lag came down to only 20-40 ms.
- Forward compatibility is an option. All games that use the X-Engine (special DX guidelines and libraries) could later be downloaded to Xbox One with GFX improvements. It will be free, as you already own the game. You will just need the original disc inserted, or you can re-download the game on XB1 if it was digital.

- I will add more as I remember.

Insider Daily. One big Xbox One surprise has arrived. An even bigger surprise is coming
Insider: Universal apps... I guess we discussed this. Now you see how Steam and all those apps you wanted are going to come to X1. There may be a few restrictions, but nothing fancy.

They are not just Metro apps either, and if you purchase an app there is no paywall either. 32-bit and 64-bit binaries too, but those apps need to be compiled to run on X1. No more than a few hours and a few license agreements. I guess Amazon didn't see this coming with Fire TV... :)

Some things take time. You'd better get ready for the next justification :)

Insider Daily. Many MS haters in the industry will try to block DX12's domination of full HSA tech
Insider: The day has come... when a leader is put in charge who is a driving force for quality and integrity, and a face every Xbox platform holder can reckon with, as he has been with Xbox through thick and thin. And with a huge blank check for specific content, i.e. games. So many games got the green light today. Rare is back. The Bungie deal is going through. A lot of Xbox heritage is happening, plus an abundance of new IP. And like I always say: why ship a game console when the suits aren't into gaming in general? Now Xbox will win this generation with confidence and content. There couldn't be a better day for the brand. Gaming got the boost it really needed. And this will really be the worst thing so far for Sony product relations. :)

Now... something nobody would think of. Capcom has a driving action sim in development. It's really been in dev for 2 years, and it's a 1st/2nd-party effort. Its story anatomy is at the heart of all the big action films too, but it's not a GTA rip. It definitely has ray-tracing attributes, but could be a little while off. :) E3, maybe, but we will have to see what is on the horizon first, as that's going gold.

How can Tim Sweeney be so down on Microsoft? (latest on Neogaf http://www.neogaf.com/forum/showthread.php?t=794444). Microsoft still doesn't allow Steam boxes to have HSA/DX12? :) And Tim wants to support Gabe? -1 to Tim.
Insider: Tim is Tim. He has massive shares in his company and doesn't like to be undermined. Consoles are not his focal point, due to technology refresh cycles. It's all about the latest, greatest technology. But the man is an asset to all GFX leaps. Sometimes, though, other R&D companies aim for other attributes.

It's a real shame Unreal Engine is not supporting the future architecture upfront. 4.5 should take a different approach to direct API control, and that could be just what Unreal needs. The console is not a refreshed PC card, and that aspect alone is making geniuses angry that they didn't get direct control of steering it; instead they have to work towards it.

Misterx: Can you talk about the latest info on Steam OS and MS? Will MS allow DX12/HSA for Steam machines? It seems like many are against DX12 & HSA and have started their hate campaigns. Even AMD with Mantle... to steer HSA away from DX12 and not let DX12 dominate HSA hardware... Hope MS has a strong veto on HSA licensing.

Also, it seems like Tim refused to support full HSA in Unreal 4 and MS backed Crytek. But you also said UE4 is OK on Xbox One. Why wait for UE4.5? For which functions to be implemented?
Insider: The PC market is totally different from the console market in regards to ease of use, and a console consumer gets 5+ years of life out of the hardware. PC and Steamboxes: 12 months of PC hardware, and then cost. No Steambox will match X1 on hardware cost, $499 vs $1200, because ray tracing tier 2 is going to cost big $$$ on the PC market.

A Steambox is about taking expensive hardware and streamlining its use. Basically goodbye keyboard and mouse; direct and online 24/7. Sounds like what the Xbox brand has been for years. Except they don't have the ability to ward off mass DDoS attacks or account intrusions. And if 80 million users get all their credit cards or accounts hacked, goodbye Steam, because the PC guys aren't that forgiving when it comes to incompetence.

But here is the big echo throughout the industry: Windows 8 ports to X1 take a mere day to cross-port. X1 to PC is not so, until Windows or PC devs have the resources to test code on hundreds and thousands of different hardware configurations, something that even Valve cannot achieve either. Not just that; you now have different approaches to APIs, engines, and constructor values. It is simply a problem that cannot be unified between hardware manufacturers. They are never going to agree 100%, and therein lies the true weakness of PC evolution. Consoles have them beat.

Misterx: All banned users are unbanned for a second chance here; be polite. If you don't agree with something, please provide backing FACTS and LINKS, like we do here when we speculate on something pro-Xbox. Not your empty opinions that we are supposed to take as definitive truth.

Insider Daily. If I was an indie or mobile developer, I would want my games on this new X-hardware
Insider: Theoretical performance is always going to be higher, due to the ideal test environment and machine code. The "lab" does not equal the real world, no matter how anybody spins it. 5TF theoretical is more like 4.1 real-world, plus the closed nature and the offload accelerator CPU all factor into the 5TF (Misterx: 5TF DP) figure, without external cloud...

The Ryse article sums up the nature of the development environment perfectly: mono driver, no DX12, and no accelerator. One pass was GFX; one pass was 1080p resources, GUI, and sound. No offload, and certainly no DX12 or hardware acceleration, although they did use ESRAM for deferred rendering. For real clues, I'd dig into this part of the article:

"Finally, an enormous static shadow map is generated only once when each level loads or when transitioning to a different area, taking advantage of the increased memory of the Xbox One. It includes all the static objects in the level and avoids re-rendering distant objects with every frame. The shadow map is 8192×8192 16 bit, weighs 128 mb and covers an 1 square kilometer area of the game’s world, providing sufficient resolution. This saves between 40 and 60% draw calls in shadow map passes"
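The figures in the quoted article are internally consistent. A quick sanity check (the inputs come straight from the quote; the variable names are mine):

```python
# Sanity check of the shadow-map size quoted above from the Ryse article.
# Inputs (8192x8192 texels, 16-bit format) are taken directly from the quote.
width = height = 8192              # texels per side
bytes_per_texel = 2                # 16-bit depth values
size_bytes = width * height * bytes_per_texel
size_mib = size_bytes / (1024 ** 2)
print(size_mib)                    # 128.0, matching the article's "128 mb"
```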

They are not using DDR3 for shadow maps either, or were they? I'd dig into this part of the article... as this is a DX11.1 game with some custom GFX code... but they did do most of their target rendering, including the shadow map, to the ESRAM cache.

Tiled resources were not used in the game. Constructor hint :)
So?

The word with Dell is that they are making a TV box soon, and it's a joint venture. Ever wondered why the X1 has 4 of those new Bluetooth and WiFi protocols? Hmm, X-TV 1. But there is a multi-hardware deal here, and it's all about on-the-go, I guess. Nintendo should be worried about this one, because it's going to force them to mobile. Look at the changes with developers; even Sony has a boat with captains jumping ship to mobile indie. If I was an indie or mobile developer, I would want my games on this... something new in town @E3. Not sure it's going to drop in public, but I am sure it's going to leak.

Mistercteam: Then look at the MSnerd rumor that, as of today, keeps lining up.

REMEMBER THIS rumour from 2012!!!

Build 2014
Win9 RTM (IE12)
Xbox PU preview

E3 2014
Kinect HP2 announce
Xbox PU announce

Nov 2014
Win9, Win9M, Kinect HP2 launch

My interpretation:
Xbox PU = Xbox Portable Unit = X-Surface
Kinect 2HP = Kinect 2 High Precision

Misterx: I think Kinect 2HP is the AR Fortazella glasses.


Flashback Post
A year-old but interesting post for naysayers :)

"Seems like Microsoft wants everyone to be happy and will allow their DX12 tech to be used by Nvidia too.

Xbox 720 will use DX12 tech first, with, as I believe, an AMD GPU.
Later, Nvidia and AMD will create PC DX12 hardware for those who still enjoy PCs.
What will DX12 be about? It should have hardware support for RayTracing via HSA (i.e. efficiency and full system load with easy scalability) and gigantic memory bandwidth (1TB/s levels) via stacked or wide RAM.

Also, there is an interesting link about Microsoft having a veto on deals involving more than 30% of Nvidia stock. Seems like Nvidia agreed to that in order to get this DX12 tech and not lose to AMD too much in 3-5 years from now."


List of major confirmed info from Insider
- Full DX12 inside Xbox One (info since 2012; confirmed by Phil Spencer, Microsoft, Mar 2014)

Almost confirmed
- Inside Xbox One is not a 7770-class 1.31TF GPU. That is a logical conclusion from the first line (info since 2013).
- 3rd-party Xbox hardware license (Dell) (info since 2012).
- No problems with 1080p in the future (confirmed by many devs).
- Tons of exclusives (Microsoft leads this gen in quality exclusives).

Expected to be confirmed in a 1-3 year scope
- stacked SOC (I personally still stand by the view that what we see on the 1st layer is not the full GPU. Insider confirmed that not all CUs are visible on the Hot Chips slides; Mistercteam has gathered tons of proof). The main idea is: what we see now on the 1st layer is the CPU in the HSA paradigm, and the next layer is the GPU. So Albert Penello was again semi-right.
- 5-6TF DP total power for Xbox One. The GPU inside Xbox One is 2-3 times more powerful than the PS4's. Plus, Xbox One has offloading hardware the PS4 does not.
- Xbox Tablet
- Xbox TV console
- Xbox Fortazzella AR glasses
- Games that will blow our minds on Xbox One.
- 1080p patches for some existing games once the "stereo" driver is released for Xbox One.
- If Steam machines get DX12 hardware, then Xbox One will get some sort of Steam app. Microsoft doesn't want to let its R&D results be used against Microsoft for free.
- NDAs will end and developers will talk more openly about how amazing Xbox One and DX12 are. DX12 is the invention of PC 2.0.
- PS4 could be Sony's last console: they could go software-only, because they no longer have the money or brains to compete with the Americans in AAA gaming.
- Sony's lies and brainwashing will stop fast after MS shows their cards.
- State-of-the-art hardware upscaler inside Xbox One. The upscaler can add a bit of AA and raise FPS by interpolating frames.
- RayTracing capable via special hardware.

Insider Daily. Xbox One is a 2.6TF DP GPU @ 853MHz; special hardware makes it 4-5TF DP for the whole system

Insider: No, milk is not correct; the second-layer stuff is inaccurate...

I need to go over the information and break down the way the CUs work: the ACEs, render output, TMUs, and shader cores. Because of the command and compute customisation, the numbers are very traversed.

It's not a 7770; closer to an 8870 or R280 GPU core, but not the same, as it is so custom. You can clearly see the X1 XPU core is larger than the 7780 from the PS4 diagram.

I know 3072 is the raw CU performance, but that won't translate into shaders either... this is extremely hard to put a number on, because the engine can choose either value. ROPs are also able to output large values; no need for 32...

There is no native Kinect processing hardware on the SOC. It is time-sliced into CPU/GPU cycles, a 10% allocation that does not need to be there any more. It will be engine-specific in the next update, as Kinect will process straight to the command buffer. Kinect is 95% offload; the 10% is only there because of the beta Kinect drivers...

More on the 2nd layer soon.

Misterx: We are still confused by your DP meaning.

Is it a) 2.6 DP, or b) DP is how you get 1.31 x 2 = 2.6?

If a), then 2.6 could be multiplied even further, by x2-x4 depending on the engine?

If b), that would mean your DP meaning is x2 stereo.
Insider: It is hard to give the answer; all I know is:

Mono: one GPU thread, 1.3, no offload, SP

Stereo: two GPU threads, 1.3 + 1.3 = 2.6TF DP @ 853

1GHz would be 1.52, I believe; could be a little more per thread. Stereo mode is with offload. Without offload, the second thread becomes resource management, audio processing, dot matrix, and other things. This link is old tech but will give some basic idea of the principles, in regards to a few questions.
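If FLOPS scale linearly with clock (a standard assumption), the insider's 1GHz per-thread figure can be checked against the widely published 1.31TF @ 853MHz number. This sketch is mine, not from the source:

```python
# Linear clock scaling of the published per-thread Xbox One GPU figure.
tflops_at_853 = 1.31              # TFLOPS at 853 MHz
scaled = tflops_at_853 * (1000 / 853)   # hypothetical 1 GHz clock
print(round(scaled, 2))           # 1.54, in the same ballpark as the "1.52" claim
```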


The accelerator is different from the CPU. The accelerator is start and finish. New technology requires new software and new ways of thinking. Like Cell, but more of a whole-system approach, with games and GFX computations in mind. A supercomputer on a chip... SOC :)

Misterx: Can you resend your last mail? I've deleted it.

Again, 2.6 DP... it seems that only in stereo mode could 2.6 be further multiplied by 2 or 4 (depends on the engine). It seems there is:
- 1.31TF SP + 1.31 DP
- 1.31 SP + 1.31 SP x2 or x3 in stereo mode.

In mono it is only 1.31TF SP.

Misterx: I liked the latest info from go_lantern_op. Slightly different wording from mistercteam, but the same result.

go_lantern_op: 1.3TF DP now, but it can be upclocked/unlocked to 2.75TF DP (10-12x the power of the 360, as MS has stated).

1.3TF x 4 = 5.2TF, but split into 4 APU threads, with 3 threads acting GPU-focused and 1 thread acting CPU-focused, though this can change based on devs' needs.

2.75TF x 4 = max power

It is why Insider says the top layer is 1.3TF @ 853MHz with mono and stereo. DP splits the dual-logic/thread GPU into 2 more XPU threads.

They chose the top layer as the CPU layer because DXcores are older CUs but still capable of dual logic; the underlying CU tech (nested parallelism: Samsung CPU + XCUs) uses new core tech.

X1 is the perfect fusion of GPU and CPU ----> XPU; H-CU changes to XCU.

I guess what I'm trying to say:

2.75TF dual-logic 280X GPU underclocked to 1.3TF. Forza/Ryse use dual logic = 2.6TF (2 XPU threads). QB/SO use stereo/DP + dual logic = 5.2TF (4 XPU threads), plus 1.3TF headroom.

Misterx: What do you mean when you type 2.6 DP? DP... is it already applied to get 2.6 SP in stereo, or could it be applied further to 2.6, making 2.6 DP = 2.6 x4 or x2 SP?



Insider: Double precision is with stereo. It works extremely effectively on the X1 architecture due to the dual memory read/write nature of ESRAM caching.

The fundamentals that lie beneath are also able to do 128-bit double-double floating point. But it is technically impossible to explain or to understand the GPU MMU until DX12 and C++ AMP are implemented in hardware. HSA is the turning point.

A SIMD can do 4 passes at 32-bit, 2 at 64-bit, or 1 at 128-bit, although 128 would be slower and take more memory caching. Memory caching is key to feeding the GPU and the GPU MMU: having your code in the right place at the right time.

So the theoretical number for the CUs/SUs is 12. Each CU has 4 SIMDs, and each of the 4 SIMDs has 4 EX processors, so that is 16 EX per CU/SU.
So each CU can do 4 passes at 32-bit: 12 x 4 = 48; 48 x 16 = 768; 768 x 4 32-bit passes = 3072. Although the sweet spot for the X1 architecture is 64-bit, due to the cache and data move engines: 1536 SC. And the CUs do not have a problem doing parallel processing separately from the GPU, because the cache can read and write both ways. The GPU can also offload to the CPU CUs... HSA coherent memory caching.

Dig into this, MRC... I have left a few things out, but you people can fill in the details :)
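Taking the insider's CU counts above purely at face value (the grouping does not match any public GCN documentation, so treat this as a literal restatement of the claim, not a hardware fact), the arithmetic chains together like this:

```python
# Literal restatement of the insider's CU arithmetic; the structure is
# theirs, not AMD's documented GCN layout.
cus = 12                  # "theoretical number for the cu / su is 12"
simd_per_cu = 4           # "each cu has 4 smid"
ex_per_simd = 4           # "each 4 smid have 4 ex processors"
ex_per_cu = simd_per_cu * ex_per_simd   # 16 ("16ex per cu/su")
step1 = cus * simd_per_cu               # 12 x 4 = 48
step2 = step1 * ex_per_cu               # 48 x 16 = 768
ops_32bit = step2 * 4                   # 768 x 4 = 3072 (the quoted raw figure)
ops_64bit = ops_32bit // 2              # 1536, the claimed 64-bit "sweet spot"
print(ops_32bit, ops_64bit)             # 3072 1536
```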

Misterx: It is clearer about 2.6 DP now. Thanks. So when you said no more than 4-5TF for the whole system, it was DP!
Insider: The 5TF DP figure is theoretical... performance will vary. PS4 is also able to hit higher theoretical performance, because of the ACE queues, but nothing like the X1 architecture. Devs will have to work hard, due to the memory functions and balancing the accelerator, but they will see huge gains.

E3 will change minds...

Misterx: That is pretty big news... Insider has just confirmed 5TF DP theoretical power for the whole system, 4TF for real-life use: a 2.6 DP GPU @ 853MHz + special HW + upclock.

Misterx: 4TF DP sure can do some ray tracing and last 10-12 years. "Xbox Infinity" was a planned name for the Xbox One. I hope you know how much SP is in DP. I am afraid to post this number now, in 2014, as 95% still will not understand. AMD said last week at GDC: DX12 is like a 4-generation leap.

Misterx: And last but not least: AMD has just announced their new video card... 2.6 DP performance = 5.2 SP. Strange that they talk about DP, when nobody has used this term for ages except us, to measure performance. All of mistercteam's digging and the Insider info about 2.6 DP is almost confirmed.

2.6TF DP on new CU = 5.2TF SP !!!!!

AMD confirms, like I already said 10000 times: AMD W9100 2.6TF DP = 5.2TF SP. Basically what AMD said is that to get 2.6TF DP you don't need 10TF (Misterx: as in GCN 1.0).
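For reference, the W9100 numbers being cited work out as follows. The 1/2 DP rate is the published ratio for that professional card (older GCN consumer parts ran DP at 1/16 of the SP rate), and this arithmetic says nothing about the Xbox One itself:

```python
# FirePro W9100 published figures: ~5.24 TF single precision,
# with double precision at 1/2 the SP rate.
sp_tflops = 5.24
dp_rate = 1 / 2                  # W9100's DP:SP ratio
dp_tflops = sp_tflops * dp_rate
print(round(dp_tflops, 2))       # 2.62, the "2.6 TF DP" quoted above
```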

Misterx: Congratulations, MisterC! Well done, well done. Two days before AMD's announcement today, Insider confirmed to me that a 1 DP : 2 SP ratio is more balanced on the X1 architecture. My bad that I am posting this only now.

Misterx: Why would MS talk about hardware at E3? Games, games, games... Also, an MS team member said they planned to talk about the true Xbox One power only in 2015. I do not think they will talk about hardware too much this E3 either... another event where we will be disappointed. MS opted to hide it for dumb reasons we can only imagine. They chose to lose profit now and gain more later.

Maybe SDF corrected their plans :) That is why I say: the more SDF lies, the better for Xbox fans :)
Insider: A link to the Microsoft employee discussing hardware-specific details coming in 2015, please.

Also, E3 will be games, games, games, and more games. But I can say this: there is no way a 1.3TF console is doing these games. Mini 2.6, mini...
I think they will open more discussion at Build and PAX, but E3 will see hardware discussions, just not a deep dive.

Misterx: Here is the link:

Former Xbox One design team member: You guys should be detectives. I was part of the initial design of the Xbox One and have since left to form a startup. The goal was to have a 10-year lifecycle for the console and the only way to do so is to stack silicon and slowly unlock power over the course of its lifetime.

I feel we made a bit of a mistake in the first rev in not releasing enough power. It's a simple firmware tweak and from what I understand speaking to my ex-coworkers a tweak is planned from the first major firmware update to bring it in parity with the PS4 in terms of power. I understand that games like Assassins's Creed 4 and Battlefield 4 will get updates to unlock 1080p 60fps but I don't have much insider knowledge outside of Microsoft on that matter.

It may seem like a crazy play, but just think of five years from now, when the PS4 is completely tapped out and Sony is forced to release a new console: all we do is turn the final knobs and have a console that is 3-4x more powerful at a $200 price point. Management has said it before and I will back it up, this is not a sprint -- it's a marathon. By E3 the true power of the Xbox One will be realized, mark my words.

Former Xbox One design team member: Just a little trivia. We called this effort "Project 22" internally based on passage 22 from The Art of War by Sun Tzu:

"22. If your opponent is of choleric temper, seek to irritate him. Pretend to be weak, that he may grow arrogant."

I pinged one of my previous co-workers this morning and he was really reluctant to share any new information. It seems they are a little up in arms at the moment with all of the leaks. This information was supposed to be a big surprise that was not going to be revealed until E3 2015 at the earliest. I have to admit Sony did fool us into thinking they had weaker hardware, but as long as we unlock just 20% of our reserved capacity we will not only be at parity with the PS4, we will exceed its performance in most cases. Keep in mind, that's with only 20% of the hidden power unlocked. We still have 80% to unleash over the next 10 years.

I am convinced that books will be written about this strategy one day. Never before has a consumer electronics device pulled off such an amazing feat of secrecy and long-term customer value.

Misterx: Great info.

Hope 2015 will not be too late, and that you guys have a plan for how to conquer all the negativity and lies about Xbox. Social media and FUD could kill any brand nowadays. (Misterx disclaimer: I believe Sony first-party employees have used lies and brainwashing in full force on the internet since 2009, as they know MS will dominate them no matter what and will create the better system and infrastructure. Sony is almost bankrupt and preparing to go software-only.)

insidr1234: I've signed up with an account so I can keep posting and not be anonymous. Like I said, they are looking at releasing an update that will put them at performance parity with the PS4 with the next major firmware release. This will be before E3 for sure.

...the only reason I visited..XboxNeo @ Union Video Game forums.

In response to someone asking for an easier explanation of DX12 and the XB1...

"That would be a bit like trying to explain the higgs-boson to you. You would have to know what a g-buffer is, and the role it plays in deferred rendering, and why with Esram and DX12, g-buffers are a hack that is no longer needed on the x1."

"Which is a really amazing development, but not really easy to explain without all the fundamental discussion about global illumination strategies."

"Think elegant solution rather than brute force..."

"The PS4 is a Lamborghini, yes. But the X1 is a Cessna 172. It only does 130mph, but it can fly. The Lambo is just a car... The Cessna will still always get there first."

...About PS4/MLB...

"Already happening. Apparently a waxy face at 1080p is cutting-edge graphics development, whereas subsurface scattering for natural-looking skin (Ryse) at only 900p is some crushing defeat. lol."

"Well I was hoping for good graphics and animation... But with the latest Sony offering not being as good as an X1 launch title in either the good graphics or animation department..."

"But hey, that's 1080p using last-gen game engines... woooooo... yeah... Last-gen games on a new TV, selling 1080p from the failing 1080p TV company... go gamer... Characters who look like they are molded out of wax in all their 1080p glory..."

"DDR3 is plenty fast as a 1080p render target. Rendering a 1080p screen at 60fps only takes about half a gig per second of bandwidth. No need to put it in ESRAM..."
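The poster's "half a gig" bandwidth figure does check out for a single 32-bit 1080p target written once per frame at 60fps. Note it ignores overdraw, depth/stencil traffic, and texture reads, which raise real-world bandwidth considerably:

```python
# Bandwidth to write one 32-bit 1080p frame buffer 60 times per second.
width, height = 1920, 1080
bytes_per_pixel = 4          # 32-bit RGBA render target
fps = 60
bytes_per_sec = width * height * bytes_per_pixel * fps
print(bytes_per_sec / 10**9)  # ~0.5 GB/s, matching the claim
```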

"Because up until now, z-buffer rasterizers were the way to go. However, now that there is a paradigm shift under way, expect to see more APUs instead of discrete components. For example, DX12 preserves the scene geometry for the entire duration of the render pipeline, eliminating the need for a g-buffer. That's a huge shift right there, and one that is basically incompatible with the existing infrastructure."

"Is there a single PS4 game yet that has hundreds of characters on the screen at once?"

And no one is arguing against the idea that the PS4 is clearly the superior console at rendering the waxy and stiff-looking figures that you see in games like Infamous: Second Son.

"I am arguing that the PS4 can't match the visual fidelity of games like Ryse (or it would have by now...) or even get the character count of games like Dead Rising 3, or match the online experience of Titan Fall..."

"In that regard it has taken last gen concepts to their logical conclusion, while the rest of the world is moving onto other things..."

The PS4 is the very best last-gen console you can buy. (smash: I'll add the BOOM here... lol)

Smash: ...after a laughable amount of trolling by the ponies, Neo gets hot and lets loose a truth cannon...

"I can't wrap my head around people who will go back and forth saying "no it isn't" every time Microsoft has the temerity to make a statement about the x1."

"Look, if you don't like Microsoft, fine. I stay out of the PS side of this site for exactly that reason. The PS4 is obsolete tech junk. All the next gen stuff is happening on the x1. Nothing next gen is happening on the PS4. The PS4 is literally exactly like the old consoles, with nothing new what so ever added (Morpheus is DOA, just like the last virtual reality head set that Sony released...) The PS4 is designed to do nothing but create a reason for people to get 1080p televisions if they haven't already. That is why they are sacrificing good graphics for 1080p on every single game."

"Again, man... The PS4 has done nothing to further the most recent development in z-buffer technology: physically based deferred shading... which, as you know, is a heavily latency-sensitive operation."

...ponies started jumping off of buildings and calling for back up after this backhander...lol

"It took a Titan running on a $5000 PC with DX12 to get the same level of fidelity as Forza, a launch game... Just saying... I think that pretty much defines "obsolete" now, as opposed to this summer..."

Best gif posted there in response to this...


ms_xbotx1: Hidden Power

Prove the PS4 is more powerful than the XB1.

MS has the only 1080p/60fps game that actually has textures so far this gen.

The Order needs 2 black bars on screen, forcing the player to play within an 800p gaming area, just so Sony can say the game is 1080p. Linear gameplay and no multiplayer co-op.

Driveclub... went back to the drawing board, as the devs were having a hard time keeping a solid 30fps.

Infamous is an empty city with no life, full of fog to blur the textures so Sony can hit 1080p, and it's still only 30fps.

Sony LIED about KZ being 1080p/60fps. Not only did KZ turn out to struggle at 30fps; Titanfall's BETA had a higher resolution than KZ.

So I ask you: what proof do you have that the PS4 is more powerful than the XB1?

Multiplats? LMFAO. Since when do multiplats dictate the strengths of a system? Especially last-gen ports that are using last-gen dev tools.

I hate ignorant people.

Insider Daily. Everyone thinks the Xbox One GPU is 1.31TF SP. In reality it is 2.6TF DP
Misterx: Also, you told us about a high frequency for Xbox One. Official numbers show a much lower frequency...
Is turbo-clock tech (a future clock upgrade) possible and planned? Would that mean 1.3TF x2 DP is not the final number for the real GPU?

Insider: It is not easy to explain the architecture with so little information available directly from MS to developers, only because it is a new design; the software was still in alpha, and the tools were old or still being designed. The SOC GPU is power-gated and is dark silicon, which means different clock frequencies can be adjusted over time to suit development needs. Special-purpose hardware.

GPU, or, as MRC puts it, XPU, would be a better fit for the GPU terminology, as it is not a standard part.
But the XPU is dual-logic; each thread has the same clock and the same GFX bandwidth.

High priority is 1.3TF @ 853, mono driver.
Low priority is also 1.3TF @ 853, stereo driver.
I have been told 1GHz clock with power gating. The CPU, however, is totally different and does not have to match the GPU "FSB" (old term!!!).

Having two threads in dual logic: the stereo driver is the same as DP, as one render pipe (mono) would be SP.

This XPU can also have compute processing sent straight to the GPU MMU,
and it can also use both ESRAM/EDRAM and DDR, but this all comes under the HSA architecture design aspects. And the direct command buffer, which is a major part of the Oban accelerator.

This part has been hard to understand from my computer-science or evolution-of-compute-processing perspective, but this should give a clue to the CU and accelerator: 12 CU x 16 EX = 192. The CUs are different; they can do 4x max op/s.
12 CU x 16 EX = 192; x4 = 768 op/s. Because the ESRAM is not one direct cluster but hundreds of very small caches, the CUs can constantly feed the max bandwidth, read and write. This aspect is still hard to understand without enough information on the Oban side, other than that it is an accelerated unit that everything is made up from. But it may come under nested parallel accelerated computing.

These may help?


Of the 14 CUs that are there, two are unavailable at this point. Not sure if they will become available; the two were there for redundancy purposes.

The CPU has a DXcore and a small cache between
the CPU quad cores. Each quad core has an older type of CU, 2 per 4-core quad. That is why people are saying the CPU is superior to the PS4 CPU in every way, and also why there is a 20-gig read/write bus.
It is still complicated to understand it all, but the Oban design is key to this aspect, so I have been told. I would dig into this information; it is all in your hands and in front of your eyes. There are still things unknown by a lot of people, even some working in control of HW/SW management.

The CPU reads and writes 30GB on X1, if I remember correctly. I will do a CPU segment soon.

smash_mouth: ...DX12 is a game changer, though he tries really hard to show otherwise... too hard. Someone in the comment section hits it straight on, in direct contrast to the article...

"AMD's Raja Koduri said on stage: "And it's not a small benefit. It's... like getting four generations of hardware ahead with this API.""

"Intel's Vice President of Platform Engineering Eric Mentzer shared a similar sentiment with, "This is absolutely, I think, the most significant jump in technology in a long, long time.""

"Nvidia's Tomasi echoed a similar sentiment: existing cards will see orders-of-magnitude improvements from DirectX 12's release, he said, "going from 100s of thousands to millions and maybe tens of millions of system draws a second.""

marcberry: For the newbies who visit this blog:

"This next generation of GPU is... it is kind of a break from the last generation of GPUs, which were very microcode-optimization sensitive. You know, things like the order of codes and the shaders made a huge difference in how fast they ran. This time, the chip architecture is based on supercomputer-like technologies and is much more about data flow. Having the right data in the right caches, in the right places, at the right time, is what will make all the difference in the world for taking advantage of these chips."
From the Microsoft architectural panel with Major Nelson.

1. "Next generation of GPU" (if it was an HD7790, it's not a new gen) +1

2. "It is kind of a break from the last generation of GPU" (that would be an HD7790) +1

3. "The chip architecture is based on supercomputer like technologies" (looking at the Hot Chips GPU layout, I cannot find one GPU on the market anywhere on the planet, from AMD or Nvidia, that looks like this.) +1

Me 3 - PS4 0. "...for taking advantage of these chips"?

mistercteam: http://wccftech.com/microsoft-teased-directx-12-features-reserved-generation-gpus-unveil/

"Microsoft “Only Teased” DirectX 12 – More Features Reserved For Next Generation GPUs, Will Unveil Later"

Certainly not 7790

anothertech: Guys, we need to look at the picture as a whole, and keep our expectations in line.

I think a lot of people here get into 'megaton hype mode' way too early and then go ballistic when their X1 hopes and dreams don't come to fruition at an upcoming convention or conference.

For instance. We just had GDC come and go. We were told countless times that DX12 and Xbox One confirmation was immanent. And they did indeed touch on that at the conference.

Along with that, we were rewarded with countless ways DX12 will revolutionize the next generation of games coming in a year or so, as well as confirmation that recently released games will benefit too, with only a small amount of work by the devs to bring game engines up to par. We also heard the PS4 will NOT be getting this kind of update in the slightest. This was all CONFIRMED at that conference. Great, great info.

But even after all that, it was all dampened down to oblivion by a simple tweet of "no stacked GPU"

OMG OMG OMG! All hell broke loose over here, and all around the web as well; GAF, IGN, and Gamespot all wrote articles about that silly tweet like it was some kind of megaton.

The only truth to that was AP reconfirming what he said before, as he feels a stacked/discrete GPU is already a confirmed moot point.

I agree, it was a bit of a letdown that new GPU hardware wasn't confirmed for the X1 at the GDC conference, but like many of the level-headed were saying here, we shouldn't really have expected that. Realistically, they don't do that at GDC. Also, there weren't really any other game reveals, but they are doing that at E3, which is coming up quickly.

Now, I personally am telling everyone WE NEED TO KEEP EXPECTATIONS within the limits of reality. I'm saying this as a buffer for everyone waiting for a 'MEGATON' at E3. And there's a reason for this.

Through the grapevine, I and many others are already hearing that this 'resolutiongate' bullshi* is still in effect. There are a few big multiplats coming to both Xbox One and PS4 in the not-too-distant future, and many of the games coming through the end of the year ARE, at this point, lower resolution on X1.

This shouldn't be a big surprise. We know the tools haven't been ready for developers for some time, games have been suffering from this since launch, and it's been confirmed that some features of the Xbox One need the DX12 driver update before they can be used.

So please, please KEEP YOUR EXPECTATIONS realistic. Don't go ballistic in the next few months when these cross-gen games are still doing better on the PS4.

I see some X1 fans posting things like "the X1 is already at 2.6tf! DX12 tools will make it 4 or 5tf output!" and I kind of cringe.

I know that with cloud and DX12 tools our games will definitely get up there, maybe even 5tf x2 and beyond, but it's definitely NOT running at 2.6 or more tf RIGHT NOW. At E3 we will probably see some vindication, with game demos running on better tools, but those games are not likely to come out this year.
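For context on the teraflop numbers being thrown around: the commonly cited 1.31 TF figure for the stock X1 GPU is not insider info, just the standard peak-throughput arithmetic (shader cores × 2 ops per cycle for fused multiply-add × clock speed). A quick sanity check:

```python
# Back-of-envelope peak FLOPS for the publicly documented Xbox One GPU:
# 768 shader cores at 853 MHz, 2 floating-point ops per cycle (FMA).
shaders = 768
clock_ghz = 0.853
ops_per_cycle = 2  # a fused multiply-add counts as two ops

peak_tflops = shaders * ops_per_cycle * clock_ghz / 1000
print(f"{peak_tflops:.2f} TFLOPS")  # ~1.31
```

Software like DX12 reduces CPU/driver overhead and improves how much of that peak games actually reach; it cannot raise the hardware peak itself, which is why "2.6tf right now" doesn't add up.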

As the tools are unlocked and more developers are using DX12 tech, ESRAM tiled resources, and more innovative ways to use our huge memory pipelines, we WILL see the benefits.
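To make the tiled-resources point concrete, here is a toy sketch of the idea (plain Python with hypothetical names, not the real D3D Tiled Resources API): a huge virtual texture is split into fixed-size tiles, and only the tiles actually sampled in a frame are kept resident in a small fast-memory pool, ESRAM-style.

```python
# Toy model of tiled resources: resident set vs. full texture size.
TILE = 64  # tile edge in texels (illustrative)

class TiledTexture:
    def __init__(self, width, height, budget_tiles):
        self.total_tiles = (width // TILE) * (height // TILE)
        self.resident = {}           # (tx, ty) -> tile data
        self.budget = budget_tiles   # how many tiles fit in the fast pool

    def sample(self, x, y):
        key = (x // TILE, y // TILE)
        if key not in self.resident:
            if len(self.resident) >= self.budget:
                # evict the oldest resident tile to stay within budget
                self.resident.pop(next(iter(self.resident)))
            self.resident[key] = f"tile{key}"  # stand-in for loading texel data
        return self.resident[key]

# A 16K x 16K texture is 65536 tiles fully resident, but if a frame only
# touches a handful of them, the resident set stays tiny:
tex = TiledTexture(16384, 16384, budget_tiles=4)
for x, y in [(10, 10), (100, 500), (9000, 9000)]:
    tex.sample(x, y)
print(len(tex.resident), "of", tex.total_tiles)  # 3 of 65536
```

The real mechanism lives in hardware page tables and the graphics API, but the win is the same: memory cost scales with what is visible, not with the full texture.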

But we need to have patience. Something none of us wants to hear, I know.

2015 will be the year of the X1. It will be like a tidal wave of good news on multiplats and exclusives that PS4 tech won't be able to touch. But for the near future, even past E3, we are going to get shat upon by the media and Ponies. (this is nothing new lol)

Remember, we are already winning the games war. That is what matters. PS4 has 3 exclusives to speak of, and their 'savior' Infamous isn't everything they thought it would be. It's a shiny toy, but nothing revolutionary. Not like Titanfall. Believe me when I say, they hate us for that lol.

But in the next few months they will be playing dirty, especially when the next few multiplats hit. Be ready for that.

My personal belief is that the Samsung accelerator is in our box, and they will be unlocking the tools to use it sometime in 2015, when the games are going to be able to make use of it.

Misterx: With the latest Insider info and MS's 10-year plan, I think MS will unlock 1TF every year to stay on par with PCs until year 8 or 9. Game engines can't go directly from 1TF engines to 9TF engines... they will smoothly evolve to amaze us step by step, and take our money every year for the same but slightly better-looking stuff.

AMD: "It is like 4 generations ahead"... also, raytracing-enabled engines need a LOT of power to render pictures like this:

Toscana, Italy :)

