Xbox Series X versus PlayStation 5 Performance

I have always been an Xbox fan, ever since the original Xbox. I very much enjoyed having the superior technology in the days of the OG Xbox versus the PS2 and the Xbox 360 versus the PS3.

But unfortunately, with the Xbox One, Microsoft made some miscalculations: it was an underpowered console compared to the PS4 – until the Xbox One X came along.

The Xbox One X is truly a marvel of engineering: Microsoft ran extensive computer modelling to understand how to improve the behaviour of both the CPU and GPU – especially when it came to their registers and caching – to achieve a 6TFLOP (TF) system that acted like one with more than 6TFs. Even Mark Cerny famously said that he believed 6TFs would NOT be enough to run games at full 4K. He was both right and wrong. He was right because if Microsoft had simply scaled the GCN-based GPU from the original Xbox One to create a 6TF GPU, those 6TFs would not have been enough. But Microsoft's extensive modelling revealed why a simply scaled 6TF CPU and GPU would stall and how to avoid those stalls. As part of this work, they tweaked and scaled registers and caches to remove the bottlenecks and allow the CPU and GPU to operate closer to their theoretical maximums. The Xbox One X was one hell of an engineering feat!

So it is interesting, then, that Microsoft might not have applied the lessons learnt by the Xbox One X engineering team to the Xbox Series X, while Sony seems to have taken note.

After all, the engineering team that worked on the Xbox One X might not have joined the Xbox Series X development team until they were done with the Xbox One X – which could have been as late as halfway through the Series X's development.

So how come the PS5, with its 10TF GPU, is able to compete with the Xbox Series X's 12TF GPU? Let's examine some of the possible issues. It is likely not any single one of them, however, but a mixture of all these issues and challenges.


Firstly, let's have a look at the hardware, starting with the CPU. We know from released documents that the Xbox Series X CPU has two clusters of 4 cores – 8 cores in total – with each cluster utilising a separate block of L3 cache.

In the case of the PS5, there are strong indications that the L3 cache is unified on the CPU. Why is this important? Well, a unified L3 is one of the main contributors to the higher IPC (Instructions Per Clock) achieved in the move from AMD's Zen 2 to Zen 3 architecture, due to the lower latency when cores need to share data.

Additionally, with a unified cache you do not need to duplicate data between the cache blocks when two cores work on the same data, therefore achieving better cache utilisation. While the PS5 can use all of its cache for unique data, the Xbox Series X effectively has less overall cache, as there will be some duplication.

This can lead to the CPU stalling less and therefore achieving higher execution efficiency at the same clock speed – that is, a higher IPC.

So in practical terms, what advantage could this give to the PS5?

  1. The CPU would be able to feed the GPU faster when it comes to traditional rasterisation – this might matter less once mesh shading (whether through RDNA2 mesh shaders or the PS5's Geometry Engine) becomes more commonplace, but until then it may result in higher frame rates.
  2. The CPU would be able to feed the GPU more consistently, as it is less likely to stall and its wait times are more uniform. With a split cache, some waits will be longer than others, resulting in less consistency in maintaining the frame rate or simulation rate.
  3. The same applies to other types of execution, such as world simulation, AI, etc. They can be done with more consistency and less delay, again resulting in more consistent or higher frame rates – especially if the world simulation is closely tied to frame rate, and some engines are more sensitive to this than others.
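To put rough numbers on the duplication point, here is a toy model – the cache size and sharing fraction below are invented for illustration, not the consoles' real figures:

```python
# Toy model of split vs unified L3 capacity. The 8 MB size and the
# 25% shared working set are invented for illustration only.

def effective_l3(total_l3_mb: float, shared_fraction: float, unified: bool) -> float:
    """MB of L3 left for unique data.

    With a split cache, data shared by cores in different clusters
    ends up duplicated in both halves, wasting capacity.
    """
    if unified:
        return total_l3_mb                    # one copy of everything
    half = total_l3_mb / 2
    duplicated = half * shared_fraction       # shared data mirrored across halves
    return total_l3_mb - duplicated

print(effective_l3(8.0, 0.25, unified=True))   # → 8.0 MB of unique data
print(effective_l3(8.0, 0.25, unified=False))  # → 7.0 MB of unique data
```

The more data the two clusters share, the bigger the unified cache's effective-capacity advantage becomes.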

Can Xbox do anything to combat this?

Well, developers are able to organise the work across the different CPU cores – and threads – so that as little data sharing as possible is required between the CPU clusters. However, there is a bit of an issue with expecting developers to do this: it requires a lot more foresight and more optimisation. When the dominant platform – which PlayStation is, due to its install base – does not require developers to do this, multiplatform games can suffer in performance when code is ported. In fact, these performance issues can also happen when porting PC code.

To resolve this fully, Microsoft may need to either:

  1. Allow developers to see where this data sharing occurs and let them very easily move execution of code between CPU clusters – even dynamically, using certain conditions. If they make this tedious, it will require more work from developers or will be less successful.
  2. Get the GDK to do this automatically when it recognises that major and continuous data sharing is occurring between threads from the two clusters – dynamically shifting the load between CPU clusters to prevent a stall. This would be a difficult thing to do, but it would allow code to be shifted between the two platforms much more easily.
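The first of these ideas can be sketched in a few lines: given knowledge of which threads exchange data heavily, greedily co-locate partners on the same 4-core cluster. The task names and sharing pairs below are invented; a real scheduler would work from profiling data rather than a hand-written set:

```python
# Sketch: co-locate threads that share data onto the same 4-core cluster,
# so cross-cluster traffic (and its latency penalty) is minimised.
# Task names and sharing pairs are made up for illustration.

def assign_clusters(tasks, shares, cores_per_cluster=4):
    """shares: set of frozenset pairs of tasks that exchange data heavily."""
    clusters = [[], []]
    for task in tasks:
        # Prefer the cluster already holding a sharing partner of this task.
        def affinity(cluster):
            return sum(1 for t in cluster if frozenset((t, task)) in shares)
        best = max((c for c in clusters if len(c) < cores_per_cluster),
                   key=affinity)
        best.append(task)
    return clusters

tasks = ["physics", "collision", "audio", "render_submit",
         "ai", "pathfinding", "streaming", "net"]
shares = {frozenset(p) for p in [("physics", "collision"),
                                 ("ai", "pathfinding")]}
print(assign_clusters(tasks, shares))
```

Each sharing pair lands on one cluster, so their traffic stays inside a single L3 block instead of crossing the cluster boundary.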

I think Microsoft will probably optimise this, and the fact that they are able to enable multi-threading (SMT) on the CPU is going to aid them in closing the gap. But this will certainly need work.


Cache Coherency

So how come a 10TF GPU seems to be performing like a 12TF GPU? Well, just like Microsoft when engineering the Xbox One X, Sony has put a LOT of focus on reducing the latency of data access within the GPU, and went beyond the cache management of RDNA2 by introducing cache coherency engines (the cache scrubbers) – and possibly other tweaks we will never know about.

Whether it is a 10TF or a 12TF GPU, it will not achieve its full compute performance if the CUs (Compute Units) stall while waiting for data to be fetched from main memory or the SSD. Make no mistake, there are stalls even in the most well-engineered piece of silicon. The questions are these:

  1. How can you ensure you have as few stalls as possible?
  2. How can you ensure that when stalls happen, they are as short as possible?

These two things contribute greatly to the IPC of the CUs, and virtually every strategy chip designers employ to improve IPC revolves around them: cache sizes, cache coherency, execution pipeline prediction and so on.
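A back-of-the-envelope model shows how those two questions interact; all the numbers here are illustrative, not measured figures for either console:

```python
# Effective throughput under stalls. peak is useful ops/cycle when never
# stalled; stall_rate is the fraction of ops that miss; stall_cycles is
# the average penalty per miss. All values are illustrative.

def effective_throughput(peak: float, stall_rate: float, stall_cycles: float) -> float:
    cycles_per_op = 1 + stall_rate * stall_cycles
    return peak / cycles_per_op

print(round(effective_throughput(2.0, 0.02, 100), 2))  # baseline       → 0.67
print(round(effective_throughput(2.0, 0.01, 100), 2))  # fewer stalls   → 1.0
print(round(effective_throughput(2.0, 0.02, 50), 2))   # shorter stalls → 1.0
```

Halving either the stall frequency or the stall length recovers the same throughput in this model – which is why fewer stalls (cache sizing, coherency) and shorter stalls (latency reduction) get attacked together.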

I do wonder if Microsoft forgot this core lesson from their Xbox One X engineering team and simply trusted AMD to design something amazing here. I also do wonder if Sony was paying attention and outsmarted Microsoft engineers at their own game when it came to cache coherency by implementing cache scrubbers that allow better cache utilisation. It was a gamble which seems to be paying off majorly.

As it stands, it very much looks like Sony's GPU stalls less and performs much better than its 10TFs would imply. It also looks like the Xbox Series X performs like two Xbox One Xs, which of course checks out – going from 6TFs to 12TFs. So is it possible that the Xbox One X learnings with regard to registers and caches WERE implemented, but not appropriately scaled, or that those ideas were not taken further?

I think there's no better reminder than the Xbox One's eSRAM that certain strategies provide a massive competitive advantage in one generation while becoming a liability in the next. After all, the Xbox 360's high-speed on-die eDRAM provided a great competitive advantage when paired with a unified high-speed memory interface, while the Xbox One's eSRAM did exactly the opposite when paired with main RAM speeds that were not state of the art compared to the competition. We can conclude, then, that context – both within the hardware itself and relative to the competition – is critical.

Cache Coherency – What can Xbox do to resolve this?

Well, luckily, cache behaves differently in a GPU than it does in a CPU, in the sense that both developers and Microsoft have more control over how and with what it gets filled. A CPU doesn't really allow such control – apart from changing the microcode, I guess – if that.

Again there are two ways:

  1. Allow developers lots of control over the caches through the GDK. However, this again means spending lots of time on optimisation to get the best performance out of the GPU, not to mention it could pose technical complexity for forward compatibility. Remember that Microsoft tried this with the eSRAM in the Xbox One as a way to fix the horrible performance of the console and achieve resolution parity with the PS4 – among other things. It did allow developers – those willing to spend the time required on optimisation – better performance and resolution parity with the PS4. Developers who couldn't be bothered or didn't get the hang of the eSRAM were less likely to achieve that.
  2. Do the hard work the team did during the Xbox One X project and dust off those computer models. See how different types of code execution require different caching strategies (cache policies) for best performance, then implement those in the GDK. Allow developers to switch the GPU between the different cache policies during code execution, or detect the change in workload automatically and switch on the fly. Yes, this is hard work to develop, but it may very well allow much easier porting of code while achieving excellent performance.

Clock Speed and CUs

The clock speed of the PS5 GPU is higher than that of the Xbox Series X by roughly 22%; however, the Xbox Series X has roughly 44% more Compute Units.

PS5: 36 CUs running at 2.23 GHz (max speed)

Xbox Series X: 52 CUs running at 1.825 GHz

This cuts both ways: all the other functions of the chip run at a higher frequency on the PS5, so some steps in the pipeline will be faster thanks to the higher clock. However, a stall will affect the PS5 worse, because main memory – or the SSD, as it were – is more clock cycles away. This is why Sony has focused so much on cache coherency and on ensuring the GPU doesn't need to wait around.
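The "more cycles away" point falls straight out of the clock speeds. Assuming, purely for illustration, a fixed 100 ns round trip to main memory:

```python
# A fixed memory latency costs more GPU cycles at a higher clock.
# The 100 ns round-trip figure is a placeholder, not a measured number.

def latency_in_cycles(latency_ns: float, clock_ghz: float) -> float:
    return latency_ns * clock_ghz   # ns × cycles-per-ns

print(f"{latency_in_cycles(100, 2.23):.1f}")   # PS5 at 2.23 GHz       → 223.0
print(f"{latency_in_cycles(100, 1.825):.1f}")  # Series X at 1.825 GHz → 182.5
```

Same wall-clock wait, but the faster-clocked GPU loses about 22% more cycles of potential work per stall – hence the value of scrubbing caches so those trips happen less often.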

The difference in CU counts also has an effect: it is more difficult to keep 52 CUs occupied at all times than to do the same with 36 CUs. What is more, code that is heavily optimised for 36 CUs might well run pretty badly on a 52-CU system:

36 × 2 / 52 ≈ 1.38

In very simplistic terms, just to demonstrate the issue: if you were to send batches of work sized for 36 CUs and then run them on 52 CUs, the 52-CU GPU – without any optimisation lower in the stack, such as in the drivers or the code itself – would run at roughly 70% of its full speed. It would run a full cycle with all CUs utilised, then a cycle with only about 40% of the CUs busy.

Now of course in practice, this may not happen due to optimisation lower in the software stack or the code itself but the question is, was code ported from PS5 to Xbox Series X with such batching / CU utilisation issues? If so, that can seriously hurt performance.
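The batching arithmetic above can be checked in a few lines. This is a deliberately naive model – it assumes each batch must drain completely before the next is issued, with no driver-level merging:

```python
import math

# Naive dispatch model: batches sized for `batch_size` CUs run on a GPU
# with `num_cus` CUs, and each batch drains before the next is issued.

def utilisation(batch_size: int, num_cus: int) -> float:
    cycles_per_batch = math.ceil(batch_size / num_cus)
    return batch_size / (cycles_per_batch * num_cus)

print(round(utilisation(36, 36), 2))  # batches tuned to the GPU    → 1.0
print(round(utilisation(36, 52), 2))  # PS5-sized batches on 52 CUs → 0.69
```

Merging consecutive batches (so 16 idle CUs pick up work from the next batch) is exactly the kind of fix that drivers or the compiler can do lower in the stack.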

Clock Speed and CUs – What can Xbox do to resolve this?

I am certain that Microsoft already catches some of these batching issues lower in the stack. However, it is very possible that those code paths are not yet fully performant.

Another option is that Microsoft leaves it fully to developers to resolve such batching / CU utilisation issues, in which case the questions are:

  1. Do the tools support the developer in easily but fully untangling such issues?
  2. Is it easy to resolve such issues in each case?
  3. Can Microsoft do more lower in the stack to optimise for this?

Such issues may well disappear as developers get the hang of developing for the Xbox Series X and code is no longer ported between the two platforms with so little time allowed for performance optimisation.

GDK – The Tools

Microsoft re-wrote the XDK (Xbox Development Kit) for the Xbox One, and it took a few years to improve its tools and performance to a point where the Xbox One wasn't at a major disadvantage to the PS4. Reading this article – or simply looking at the XDK change history – is quite enlightening, I think.

Now Microsoft has decided to unify PC and Xbox development under the same umbrella, calling the SDK and related toolkit the GDK – the Game Development Kit.

However, to understand how performance is impacted on the Xbox platform, we need to look at the full software stack which is – in a simplified way – below:

  1. Firmware and microcode (we don’t strictly call this software but it is code that does get updated and impacts performance)
  2. Hypervisor (as both the game and the dashboard are virtualised)
  3. Operating System or OS (the dashboard which is constant and the game which loads its own instance of a cut-down OS) – we COULD argue that the game OS is part of the GDK.
  4. Graphics Drivers (this is bundled per game on console for stability but is system-wide for PC) – we COULD argue this is part of the GDK on console but not on PC
  5. Various APIs (GDK)
  6. Development tools (GDK)

Considering Microsoft did not have the chip taped out in its final form until sometime in 2020, and the full stack needed to be re-written for the new platform, we can very easily conclude that the stack is not yet fully mature and optimised.

  1. Firstly, the original Xbox One needed years to fully mature, even though its chips contained fewer complex new technologies.
  2. AMD has just released their RDNA2 based drivers for their own cards. Graphics drivers usually improve greatly in both stability and performance during a generation of cards – for both Nvidia and AMD.
  3. Microsoft didn't only need to adapt AMD's graphics drivers to the Xbox platform; they needed to re-write / adapt the whole software stack above. Ouch!

So what kind of issues can this throw up?

  1. Inefficiencies and errors in the code causing stalls, performance issues, crashes, etc. We saw this early on and while I think major issues have been fixed, performance optimisation of a new software stack takes time: not a few months, but years, as there are so many new technologies to integrate and optimise.
  2. Developers have to move their code to a new SDK, and re-tooling takes time and is error-prone. To add insult to injury, the GDK is not compatible with earlier versions of Visual Studio, so if a developer has not yet moved to the new tools, it adds extra complexity and possible compile issues. However, the performance discrepancy cannot be explained by this point alone – putting it all down to re-tooling is, to put it mildly, a gross over-simplification, and that is partly why I wanted to write this article.
  3. A fixed console is NOT a PC. The main advantage of a console is that it is fixed hardware: it will behave exactly the same across all its instances and across multiple runs of code. It is extremely consistent. A game running on PC, by contrast, needs to expect multiple performance profiles, environments, etc. What this means is that developer tools for a fixed platform can implement very specific codepaths and optimisations that maximise platform performance. Since Microsoft only completed the GDK in a stable form in June 2020, I wonder whether they have had time to implement any Xbox-specific optimisation, or whether they treat it like a PC.

The last point is a crucial one, as I think this could be why developers are saying that the new GDK is not as easy to develop with. Microsoft is a software company and has a tendency to favour flexibility in its tools. Unfortunately, flexibility oftentimes means leaving it up to the developer to figure out the performance profile of the platform and letting them optimise to their heart's content.

However, GENERALISING the GDK is a dangerous path to take when Sony has doubled down on SPECIALISING their SDK for the PS5’s performance profile. It can mean that getting max performance from the PS5 requires less work, rather than more, partly because of the hardware cache coherency and partly because of SDK specialisation.

I seriously think – as I said earlier – that Microsoft needs to get back to computer modelling and start developing specialisation of the Xbox Series X|S performance profiles such as semi-automatic thread management and updated caching policies on both the CPU and GPU to improve their IPC performance.

I fear that focusing on cross-platform development may have hurt the Xbox in the short term, although I have no doubt it will become an advantage in the long term – provided Microsoft develops the Xbox Series X|S code paths for maximum performance, with as little intervention and laborious optimisation required of the developer as possible.

GPU – RDNA2 Advantage?

Now let’s look at the GPU features of each console one by one and see what advantage next-generation games could see in each.

Mesh Shaders versus Geometry Engine

We don’t have a lot of information about relative performance between the two implementations. However, we can speculate.

At worst, the Geometry Engine in the PS5 is simply a rebranding of mesh shading. At best, Sony tweaked the mesh shaders in RDNA2 just as they did the caches, and will achieve the same or higher throughput than the Xbox Series X.

We can be sure of one thing, however: the cache coherency work will also aid mesh shading / the Geometry Engine, so we will likely not see a massive performance differential for this feature.

However, if Sony tweaked the Geometry Engine ON TOP OF improving caching performance, then Microsoft will have to work extra hard to maintain the performance lead when mesh shading is fully utilised.

I think you can all see where this is going…

Variable Rate Shading (VRS)

As far as we know, the PS5 does not have the VRS Tier 2 feature. However, VRS can be implemented without dedicated hardware support. I would refer you to this interview with Ori developer Moon Studios, where they describe how they implemented VRS on all platforms without using the specific VRS API calls. Granted, their use case was relatively easy due to how they sliced the image up, but similar techniques can be implemented in software in other game engines.

VRS Tier 2 on the Xbox Series X can improve performance by between 5% and 30% depending on the use case. Some scenes in a game will benefit more than others. However, in MOST instances it will result in SOME image quality degradation. The aim is to restrict this degradation to parts of the image where it is not visible – dark, fast-moving, blurry parts. It is akin to using lossy compression on an image, however.
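The idea of steering the quality loss into dark or fast-moving regions can be sketched as a per-tile decision. The "WxH" rate notation follows the general VRS scheme, but the thresholds below are invented for illustration:

```python
# Sketch of the software-VRS idea: pick a coarser shading rate for tiles
# that are dark or fast-moving, where the quality loss is least visible.
# The luminance/motion thresholds are invented for illustration.

def shading_rate(tile_luminance: float, tile_motion: float) -> str:
    """Return a shading rate as 'WxH' (1x1 = full rate)."""
    if tile_luminance < 0.1 or tile_motion > 0.8:
        return "4x4"   # very dark or very fast: shade 1 pixel in 16
    if tile_luminance < 0.3 or tile_motion > 0.5:
        return "2x2"   # dim or moving: shade 1 pixel in 4
    return "1x1"       # bright, static detail: full rate

print(shading_rate(0.05, 0.0))  # dark tile      → 4x4
print(shading_rate(0.2, 0.1))   # dim tile       → 2x2
print(shading_rate(0.7, 0.2))   # bright, static → 1x1
```

On Tier 2 hardware the rate map is consumed directly by the rasteriser; a software implementation has to apply the same decision itself, for example by shading at lower resolution and upsampling those tiles.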

Now, for example, in Dirt 5 the developer has applied it to parts of the image that are darker AND moving so fast that detail is difficult to see. However, even with VRS enabled, and on title update 2.0, the PS5 is neck and neck in performance and has slightly better image clarity (the Xbox is a few frames ahead, hence I said it's pretty much a wash).

So it does look like VRS was used on this title to offset the performance disadvantage of the game code running on the Xbox Series X, because of the possible compounding issues detailed previously.

If we assume for a second that Xbox could get the CPU and GPU closer to its performance targets without serious optimisation work on behalf of the developer, and that the developer is not going to bother implementing VRS in software on PS5, then yes, VRS has the possibility to provide a 5-30% performance boost over PS5. At the moment, it seems like it is used to offset the performance disparity – at least in Dirt 5.

Sampler Feedback Streaming (SFS)

Well, Sampler Feedback Streaming is an interesting one, because we don't know whether Sony has an equivalent OR whether they are simply relying on brute force between the SSD and memory to stream textures in on the fly.

A few things are for certain:

  1. SFS "in the lab" can reduce memory usage by up to 10x (so think of it as up to a 10x memory multiplier compared to a solution cobbled together in software). Sony most certainly does not have an SSD interface that is 10x as fast as Microsoft's. Even if both were quoting sustained speeds – which they are not – Sony's interface is only 2x faster, and in practice it looks to be only about 1.5x the raw speed, judging by load times on equivalent software. Now the question is, can a skilled developer reduce the memory footprint advantage to around 2x, as opposed to 10x? (See UE5 below for a possible answer.)
  2. Keep in mind both companies implemented hardware compression, and the algorithms – no matter what fancy names they give them – are pretty close. Even a 50% improvement on state-of-the-art compression maths takes 5+ years to achieve, and both use the state of the art. Anyone saying one has a 2x multiplier over the other with regard to data compression does not understand data compression. Keep in mind the hardware may well allow developers a lot of flexibility in updating the compression algorithms they use over the generation, as learnings from previous generations show it is advantageous to allow this.
  3. Not having to wait for textures to stream in allows the GPU to continue working and it can be the difference between a stall or no stall. However, all modern game engines account for this and have a lower res texture available ready in memory. The only difference here is that the transition is smoothed over using a new algorithm and the texture pop-in is minimised to a frame as opposed to multiple frames. Useful? Sure! Revolutionary? We shall see.
  4. UE5 was shown running on the PS5, streaming ultra-high-resolution textures without a hitch! So whether Sony has an SFS-like solution or not, it is obvious that Epic has figured out a way to do this consistently on the PS5 hardware. Now, does this mean UE5 cannot benefit even more from SFS, reducing the memory footprint and thereby allowing higher-detail / higher-resolution textures? It is all possible, but rather theoretical at this point – maybe even Microsoft doesn't know the answer until they see the games running. After all, would we be hitting other limits within the GPU that would bottleneck this at reasonable frame rates?

So I think SFS is likely the odd one out: it is not yet proven technology, and we don't know how the best software implementation performs versus hardware assist. Also, as new technology it will take a while to figure out.

Additionally, it looks like Sony has implemented a completely new chip for the SSD controller, which might well house similar technology – or uses brute force to achieve the same.

You can read more about SFS here from Microsoft if you are interested.

Ray Tracing

Ray Tracing hardware seems to be matched between the two – as in, one Ray Accelerator per CU. However, since the Xbox Series X has more CUs, it also has more Ray Accelerators, albeit running at a lower clock speed. From tests it would seem that – although cache coherency has some effect on ray tracing – the effect is not as large as on the rest of the pipeline, so ray tracing will likely be more performant on the Xbox Series X.

Additionally, AI/ML can be utilised for faster and better de-noising required for ray-tracing so if Microsoft is able to implement a technique that runs on INT8 or INT4, then they would have an additional speed advantage here. See more on this under AI/ML below.


AI/ML

So now we come to the Xbox Series X's trump card – in fact, the only major customisation beyond RDNA2 that we know Microsoft made to the GPU.

A consumer GPU executes shader instructions in single-precision floating-point format, called FP32. AI/ML operations, however, normally use FP16 (half precision) or, more often, integer operations (INT8 and INT4).

The PS5 is able to do FP32 and FP16; however, Microsoft customised the Xbox Series X silicon to also perform INT8 and INT4 operations, at the following rates:

  1. FP32: 12 TFLOPS
  2. FP16: 12 × 2 = 24 TFLOPS
  3. INT8: 12 × 4 = 48 TOPS
  4. INT4: 12 × 8 = 96 TOPS

So the Xbox is able to perform AI/ML operations up to 8x as fast as FP32 and up to 4x as fast as FP16 (and therefore up to 4x as fast as the PS5). This is significant because it opens the way to technologies like Nvidia's Deep Learning Super Sampling (a type of super-resolution algorithm), meaning the Xbox could render a scene at a lower resolution and then upscale it to a higher one (e.g. 1080p to 4K). This could be used for backwards compatibility, but also to let games run with full ray tracing at 60 FPS without cutting back on ray tracing quality.
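The rate table above is just operand packing: each time the operand width halves, twice as many operations fit into the same 32-bit lanes. Using the article's rounded 12 TFLOPS figure:

```python
# Mixed-precision peak rates from operand packing: rate doubles each
# time the operand width halves. 12 TFLOPS is the rounded FP32 figure.

FP32_TFLOPS = 12.0

rates = {}
for fmt, bits in [("FP32", 32), ("FP16", 16), ("INT8", 8), ("INT4", 4)]:
    rates[fmt] = FP32_TFLOPS * (32 // bits)   # lanes per 32-bit slot
    unit = "TFLOPS" if fmt.startswith("FP") else "TOPS"
    print(f"{fmt}: {rates[fmt]:.0f} {unit}")
```

These are theoretical peaks; a real inference workload also has to keep those lanes fed, which brings us back to the memory and cache story above.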

This is also significant because it could allow Microsoft not to have to release a mid-generation refresh at the same time as Sony and still stay competitive. Even if Sony were to introduce this feature in a PS5 Pro, for example, they would need at least a year to ship games utilising it – by which time Microsoft could make its own move. If Microsoft invests money here and plays its cards right, this could be an ace up their sleeve.

Of course, the whole mid-generation refresh is very much up in the air for this generation of consoles, while also not nearly as necessary, but that is for another article.

AI/ML can be used to enable a whole host of other applications from better enemy AI, reduction of texture data both on disk and in memory, better de-noising algorithms for ray tracing to increase its quality and many other applications we haven’t even thought of. The sky is really the limit here. The question is how much Microsoft is willing to invest into Research & Development for AI/ML.


So is the Xbox Series X the world's most powerful console? In theory yes; in practice, not right now. Sony's clever hardware design means the PS5 is ever so slightly more performant in current titles, with less developer optimisation needed than on the Xbox Series X. However, no two consoles have ever been so close in performance terms.

While the Xbox Series X has the potential to become more powerful due to its untapped headroom – and its lower relative IPC utilisation – it is going to depend on how much Microsoft is willing to invest, not only in optimising the full software stack but in going beyond that to enable developers to easily and fully utilise the hardware under the hood.

As I said, context is important. In a world where the Xbox Series X is the more performant console, the GDK with its cross PC / Xbox development made a lot of sense. It made sense because the Xbox Series X had a lot of performance margin to play with: up to 20% more, if we look at it theoretically.

However, in a world where the Xbox Series X is pretty much on par with the PS5 – and in fact has work to do to achieve on-par performance consistently – a GDK that is generalised rather than specialised can become a liability if not handled correctly.

For the GDK not to become a liability, Microsoft might need to go beyond traditional optimisations, and also ask whether contributing first-party optimisations to game engines like UE5 is worthwhile, to help third-party developers reach their performance targets more easily.

It also raises another question: now that Microsoft has so many first-party studios, is it worth re-developing the same code and optimisations again and again, studio by studio, for each game engine and title? Or should Microsoft start its own unified game engine development, just like EA did with Frostbite? This would allow their first-party studios to focus on creating content as opposed to re-inventing the wheel again and again, especially as the wheel is becoming so crazily complex. After all, they now own some of the best game engines in the world and could easily compete with UE5 and Frostbite: id Tech, ForzaTech, and Halo's Slipspace Engine, developed by some of the same engineers who wrote the DirectX graphics API (the Xbox name comes from this API).

One strategy could be to have the Coalition continue to work on Unreal Engine, but heavily contributing code back to the code base for third party developers to use, while the other studios start contributing code to a common game engine runtime. This could give Microsoft a leg up on Sony and could allow content to take centre-stage, as opposed to the (continuous re-invention of) technology.

Although Sony's hardware is nowhere near fully tapped out – especially with regard to next-gen features – its IPC is definitely a lot closer to its ceiling, thanks to a more performant software stack and very clever hardware optimisations. There is only so much Sony can do from here with regard to IPC, but I very much doubt Sony is going to stop here either.

Whatever the case may be, this will be one hell of a generation to watch and both consoles will surely impress as the generation heats up.

Dead or Alive 5 on Xbox One X is a Pleasant Surprise

Team Ninja has a history of great games under its belt, but they are most famous for the Dead or Alive and Ninja Gaiden series. With Itagaki in the driver's seat, they pushed console hardware to its very limits, achieving graphics that few other companies could.

Unfortunately, since Itagaki left the company, the team has been making one sub-par game after another. We surely all want to forget the disgrace that Ninja Gaiden 3 was.

Unfortunately, Dead or Alive 5 was, in my mind, another example of a team having lost its way. The team made some questionable graphical choices, to the point that the game sometimes looked worse than the Xbox 360 launch title Dead or Alive 4. There were a few reasons for this:

  • Lack of anti-aliasing
  • Lack of texture filtering – DOA 3 and 4 had better!
  • What looked like atrocious texture work all round
  • The buttery-smooth 60fps from previous games was replaced by glitchy framerates and freezes.

Well, imagine my surprise when I started Dead or Alive 5 Last Round on the Xbox One X. The console's 4K upscaling, forced 16x texture filtering and higher compute power let the game run smoothly and the texture work finally shine through (although it still leaves something to be desired at times), and the game holds 1080p continuously without the dynamic scaler kicking in, with what looks like proper AA in most of the levels. It does look like the game applies AA based on GPU load, and only when there is GPU overhead left. The transformation is massive in my eyes: the game now looks worthy of the Dead or Alive name. It's worth checking out on the Xbox One X. The previous low-quality presentation gives way to a sharp, smooth look. Awesome!

Now if only Team Ninja could be fixed so easily. Their recent showings don’t give us much hope and it’s unfortunate, because unless Tecmo takes some radical action, they will waste and possibly ruin two of the best gaming IPs: Dead or Alive and Ninja Gaiden. My advice to Tecmo:

  1. Allow Microsoft to make all previous games available on Xbox One X in 4K – and get some more life out of those games.
  2. Time to sack whoever was responsible for Ninja Gaiden 3 and DOA 5!
  3. Consider building a new more capable team around these two IPs.

I would like to see Team Ninja pushing console hardware to its limits once again, with the crazy attention to detail they were once renowned for. It's not enough to merely match what other developers do! We want to be amazed once again by you doing what other developers only dream of…

Redout Lightspeed Edition – Xbox One X 4K Patch Review

The 4K patch for Redout Lightspeed Edition, a futuristic anti-gravity racer, dropped just a few days ago. The patch notes promised native 4K rendering, 300+ bug fixes, major performance improvements and a 60fps lock thanks to the dynamic scaler. Has the Italian studio, 34BigThings, achieved its goal?

Upon playing just the first level of the game, it became obvious that something is not quite right with this patch. While the visuals are absolutely gorgeous and look to be rendering at native 4K, something seems to be broken with the dynamic scaler: it doesn't actually look like it's kicking in at all! There are major performance drops in certain sections, to what looks like below 30fps! The rest of the first 4 levels all seem to have similar slowdowns in places.

We have had patch issues with games on the Xbox One X before – Titanfall 2, for example, where the developers miscalculated and then fixed the problem quickly. Let’s hope 34BigThings can do the same with this one. 

Interestingly, the slowdowns don’t really affect the gameplay as you can still steer the ship easily and response is good. But of course screen refresh isn’t great in places. 

However, only around 10% of a level is affected; the rest plays out at an almost locked 60fps. Interestingly, even when slowing down and rotating the camera around a problem area, the slowdown still happens, and it doesn’t look like the dynamic scaler kicks in at all – or its lower bounds are way too high. It also mostly happens when a lot of the track is in view at once, so clearly the game is overloading the GPU by trying to draw too many things at the same time without either properly scaling back Level of Detail settings or dropping the resolution.

Anyway, I have contacted 34BigThings to have a look at the issue. Let’s hope for a fix soon. 

Fake 4K – When 4K is not Really 4K

UHD Blu Ray is oftentimes made from a source only 6% higher in resolution than a normal Blu Ray. Be aware of this when buying…

How Movies are Prepared for the Cinema and the Home

Movies are normally shot on film or digital cameras. In either case, the usable resolution is between 4K and 10K depending on the medium. However, even movies shot on film need to be converted to something called the Digital Intermediate (DI) for digital effects, colour grading and other digital manipulation before being released to cinemas – including being printed back onto film for older cinemas – as well as released onto home video. The Digital Intermediate therefore limits the final resolution and picture quality attainable from the source.

Unfortunately, even now most movies are finished using a 2K DI. Did you know that 2K has a resolution of 2048×1080? Compare that to Full HD (1080p) with a resolution of 1920×1080 and you are only getting an extra 128 pixels of horizontal resolution, which amounts to roughly a 6% boost over 1080p. It is in fact so small that you are unlikely to notice it.
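For the number-crunchers, the percentages above can be verified with a few lines of Python – a quick sanity-check sketch, nothing more:

```python
# Pixel counts for the formats discussed above
fhd = 1920 * 1080     # Full HD (1080p)
dci_2k = 2048 * 1080  # 2K Digital Intermediate
uhd = 3840 * 2160     # UHD, what a "4K" Blu Ray actually holds

print(round((dci_2k - fhd) / fhd * 100, 1))  # → 6.7 (% more pixels than 1080p)
print(round(uhd / dci_2k, 2))                # → 3.75 ("almost 4x" the 2K DI)
```

So a 2K DI carries under 7% more picture information than the Blu Ray you already own, while the UHD container it gets stretched into has 3.75 times the pixels of the DI itself.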

Enter UHD Blu Ray

For any movie finished with a 2K DI, what the movie studios do is upscale the 2K DI to a 4K image – almost 4x the resolution of the original – creating extra pixels in the process. They cannot add any more detail; they simply use algorithms to guess the extra information. This can work to some degree, just like the upscaling in your TV can work well with 1080p material. But then why not save the money, buy the Blu Ray and let your TV upscale – or something super-capable like one of Panasonic’s UHD players?

To add insult to injury, they then put it on a disc that’s almost the same size as your standard Blu Ray: 66GB versus 50GB, as the 100GB discs are only really used for longer movies. They then compress it using an algorithm that produces a slightly softer picture if not given enough bandwidth, so you really end up with an image that’s not much better than 1080p for a price much higher than the standard Blu Ray. This is why reviewers are sometimes hard pressed to find much difference in detail between the Blu Ray and UHD versions. This is atrocious in my opinion.

Now granted, there are UHD discs that have been produced from a higher quality DI such as 3.2K, 4K and very rarely 8K. Those are the discs worthy of your attention. Always check before buying: 4K is not always 4K. Unfortunately, the format isn’t going to shine until they start using the 100GB discs as standard to give the bitrate room to breathe and start using 6K and 8K DIs. Then we will see what the format is really capable of. The current state of the format is sub-standard compared to what it could be.

To understand if a movie is real 4K or fake 4K, check one of the sites that keep tabs on the resolution of the DI for each release, and even whether the release was put on a 100GB disc or not.

Now of course, there is more to UHD than just resolution, there is (the messy world of) HDR and also Wide Color Gamut (WCG). But that’s for another article.

Ninja Gaiden Black on Xbox One X is Tremendously Handsome

The First Wave of Original Xbox Titles

I consider Ninja Gaiden on the original Xbox to still be one of the best action games of all time. So when Microsoft announced that they would be bringing original Xbox games to the Xbox One and that Ninja Gaiden would be in the first wave of titles, I was excited!

Upscaling – Not Just Any Upscaling

Emulating the original Xbox is no small engineering achievement, but little did we know that Microsoft wasn’t done there. Microsoft’s backward compatibility team figured out a way to upscale these titles to 1080p on Xbox One and 4K on Xbox One X at the emulator level. It isn’t just simple video output upscaling: the game actually renders inside the emulator at 4x (on Xbox One) or 16x (on Xbox One X) the resolution of the original, with antialiasing thrown in for a super smooth image that is awe-inspiring to behold – especially on the Xbox One X.

Xbox Graphics

People like to throw figures out: these days it’s teraflops, back in the day we were talking about vertices or triangles a second. However, I think those numbers are pretty meaningless with regards to the kind of image quality a machine is able to produce.

I was never drawn into the Playstation world of games because I found their graphical quality lacking in major areas: objects were too angular, textures were blurry and lacked detail, and jaggies were all over the image. This is because Sony made some rather poor design choices when it came to their machines, with the exception of the PS4. With PS1 and PS2, they didn’t have much of a choice given the technology available to them; with PS3, however, they had only themselves to blame. The machine may have had enough power to compete with the Xbox 360, but the truth is that it was not even capable of surpassing the original Xbox with regards to graphics, and here is why:

From the original Xbox, Microsoft built features into the GPUs to allow for shortcuts in rendering. One of these techniques in the original Xbox was called bump mapping (or DOT3 mapping). Bump mapping allowed the machine to simulate more detail in its textures and more details in the character models than would have been possible otherwise. The Xbox GPU also had pretty sophisticated pixel and vertex shading for the time.

Ninja Gaiden on Xbox One X

So how does Ninja Gaiden hold up after so many years, upscaled to 4K with the original assets? Quite simply, it’s stunning! I have the PS3 copy of the game as well, and running through the same areas, my PS3 copy looks flat and outright horrible in places. The Xbox One X image is jaggy-free, has better lighting (except for the first chapter, which was re-mastered on the PS3) and looks beautiful running at 60fps. The original assets hold up remarkably well and the original art really shines through. It is clear the original Xbox was capable of some awesome graphics; it was just the low resolution holding the image quality back.

Tremendous job! The Wizards at Microsoft have done it. Now I can’t wait to play some of my favourite games on the platform such as:

  • Outrun 2 – although there may be licensing issues with this game because it had to be pulled off retail years ago – Microsoft, sort out a way to replace music in games so this isn’t an issue!!
  • House of the Dead III – the original Xbox was an arcade powerhouse! This was so much fun.
  • Dead or Alive 3 and Dead or Alive Ultimate (DOA 1 and DOA 2) – come on! Make Xbox One the first platform to have all Dead or Alive games, please. Also bring DOA 4 from Xbox 360! While you are at it, bring DOA Beach Volleyball 1, 2 and 3 so the spin-offs are also on one console. The only thing holding this back might be Tecmo; however, now that they are finished milking DOA 5, they might allow it. Come on, guys!
  • Rallisport Challenge 1 and 2 – they are classics!
  • Top Spin – I still love the original. Awesome!
  • MotoGP 1, 2 and 3 – the original games were some of the best motorcycle racing games on any platform. Still haven’t found any that compares – MotoGP 07 on Xbox 360 comes close.
  • Dino Crisis 3 – this game got some beating from the reviewers, but it is one of the best Sci Fi puzzle shooters on the platform with beautiful graphics and cut-scenes to die for. It is a personal favourite of mine!
  • Brute Force – why not!
  • Unreal Championship 2 – oh yes!

The Future of Xbox Backwards Compatibility

Now that Microsoft has figured out how to upscale original Xbox and Xbox 360 games to 1080p and 4K, I can see a future where they enable this for all backward compatible games (400+ and counting). Enabling and testing it game by game might be tedious, so it would be great if the feature could be switched on and off for titles that have not been tested yet. That way the full library could benefit straight away, and we could turn it off per game if it causes glitches until Microsoft gets around to manually updating that title. Maybe add some options for upscaling, such as 1440p / 1800p / 4K, so we can configure it and see what works. I know it’s a lot to ask for, but dare to dream!

What’s more, this opens up the possibility of Xbox One games that were never patched with 4K support for the Xbox One X being upscaled to 4K using an emulation layer / swapping out render targets at run-time. Some of those games currently run at 900p with a lack of AA. Even if 4K isn’t possible, 1800p (note I didn’t write 1080p!) should be easily achievable for older Xbox One titles.

One more thing: if they could enable this for all OG Xbox and Xbox 360 games WITH at least 2x AA (preferably 4x where possible), then Xbox One X would be THE PLATFORM to play 3 generations of Xbox games at the best image quality possible. Please make it happen!



Building Your Own Home Cinema Business: Interview With Ben Hobbs

Ben Hobbs, Managing Director of H3 Digital, kindly offered to do an interview for SimpleHomeCinema. I welcomed the idea as I believe in supporting each other in this line of business, and readers who are passionate about starting their own business might well take encouragement from it. The interview is printed below. Should you have any other questions for Ben, please use the comments area below the post and I’ll see if we can get Ben to answer them.

Ben Hobbs, Managing Director of H3 Digital

Roland: Ben, please tell us a bit about yourself so our readers can get to know you. Where are you from originally? Where did you grow up? 

Ben: I’m from the UK originally, I was born in Brighton and then moved to Milton Keynes when I was young.  I moved to Thailand when I was 26 years old. 

H3 Digital Logo
H3 Digital Company Logo

Roland: What made you move to Thailand at 26? Is there much of an English community where you are?

Ben: I’m into technology, and after the dotcom crash it just felt like there wasn’t going to be much growth in that sector in the UK for a while. Yes, there is quite a big expat community of people from all over the world here. Thailand is a great place to live or holiday.

Roland: It does sound like a great place to live and work. Was audio-visual science and home cinema a passion for you from a young age or did you fall into it – so to speak – later on?

Ben: Yes, very much so. I keenly remember applying for a Student Loan when I was at University and spending more time, and being more excited, planning what HiFi gear I was going to buy than on my college work. (Sherwood CD player, Sony amp and Mission 732 speakers, if anyone is curious.) It was then that I knew I had the bug.

Roland: How and when did the idea of making a business out of it come to you?

Ben: It was always a hobby of mine. I had always had a very special interest in Home Cinema and Music – it wasn’t so much that I listened to music a lot or even saw a lot of movies; instead it was piecing it together that I enjoyed, planning it and hearing and seeing what amazing setups I could build. It never occurred to me that I could do this for a living.

Roland: That’s really awesome, Ben. It sounds like you share the same passion as me and some of our readers. How did you find your first paying client?

Ben: After the dot com bust in the UK I had a choice to make, either stay in the IT industry – I was in recruitment, pays well but not particularly fun – Or come to Asia, Thailand in particular.  My Father lived in HK and some of his friends were building holiday homes in Phuket, the problem was there wasn’t any Technology expertise.  My Brother and I came over and helped design intelligent cabling and systems into those holiday homes.


Roland: So it sounds like you kind of fell into it through connections that you had?

Ben: Going into business for yourself is a big life decision. We saw an opportunity and grabbed it with both hands, the safe thing would have been to stay back in the UK.  Moving abroad and starting a company at the same time is fairly risky for anyone – it was useful that we had some insight into prospective work.

Star Wars Home Cinema Built by H3 Digital

Roland: Were you successful straight away or did the business grow slowly?

Ben: We were always busy, that doesn’t necessarily mean success. We have had our ups and downs.  So we grew slowly, then quickly, then shrank after the Global Financial Crisis and then grew again in a slower more measured manner.

Iron Man Cinema by H3 Digital

Roland: What were some of the challenges you faced as you were growing the business? 

Ben: At first when you start a company there is so much to do to get the ball rolling, then once you come up with the processes, products and have staff it’s all too easy to become complacent. You have to make sure you keep busy. Constantly find new ways to make yourself useful and relevant in such a fast moving industry.

Nuvo In-Ceiling Speakers
In-Ceiling Speaker Installation for Nuvo Music System by H3 Digital

Roland: Did you have to have much of a capital investment initially?

Ben: There was some outlay to get the company up and running, mainly involving getting the company started, work permits, vehicle, etc. We’ve always tried to keep it as organic as possible though.  My advice here: be as sparing as possible with startup money – if you start with a big lump of cash it’s more than likely going to get wasted.

Home Cinema Installation by H3 Digital

Roland: What were the critical success factors in getting the business where it is today?

Ben: Not giving up.  Running your own business can be really tough at times, sometimes you just want to roll over and give up – You can’t, so you make sure to fight through the hard times and you learn constantly through the whole process.  We’ve tried to keep it as much fun as we can, I didn’t get into this industry to get rich, I do this job because I love it.  That helps.

Roland: That is a great attitude to have, Ben. What is your business model? Do you charge clients for the man hours / consultancy or can you also make money on the equipment by getting wholesale prices? Did the business model change over the years?

Ben: We make some money on equipment and some on the installation; it’s probably around 50/50. Initially we started out by billing labour as a percentage of the equipment cost, but later we moved over to a per-unit install cost, where we charge a set integration price per piece of equipment installed. That way our customers are charged fairly according to our time rather than according to how expensive their equipment is.

Roland: How do you make sure the business can be sustained? How do you get new clients coming on board?

Ben: Easier to keep your current and past clients coming back than advertise and market for new ones constantly.  We’ve done two or more properties for more than half of our clients now, sometimes the same property twice! When we first started iPhones weren’t even around so many customers have used us many times to keep their properties up to date.

To do that you must give good service, never cheat people and do your best to make your customers happy.  As a company there has been quite a few times that we’ve ended up losing money on jobs, through no fault of our own – perhaps a supplier let us down or raised their prices.  The customer though is our client, he is dealing with us and we have always been fair.

Roland: What are your plans for the business in the near and mid-term? How do you intend to grow it?

Ben: We are currently building a brand new office which will feature better demo facilities, a coffee shop, better staff facilities and more room for us to stock products.  In addition to designing and installing home cinema, audio and lighting systems ourselves we also distribute some audio and cinema products to other companies.  We hope to include great training facilities and warehousing for those products.

Going forward we are making sure we keep it fun, make sure our clients are happy and look forward to all the great new technology that will be coming out in the future.

Roland: Finally, is there any advice you’d like to give to our readers who would like to get into the business?

Ben: Do it!  If you like home cinema and audio as a hobby, you will more than likely enjoy setting up big systems for others.  Look into if there is a CEDIA in your country and try to get some certified training. Even if you don’t have experience, I think if you know your stuff and have a CEDIA qualification you could probably walk into a junior position in the industry.

I’d like to thank Ben for his time to answer my questions and I’d like to wish him – on behalf of all our readers – good luck with his business.

If you’d like your Home Cinema business to be featured on our site, please contact us.

Why I love the Xbox – and Looking Forward to the Xbox One X


This isn’t going to be an unbiased view or review – as if there were such a thing short of conducting a double-blind study, and even that is questionable. This is my story of why I love the Xbox brand and why I have never been swayed by the competition.

Let me start by saying I am certainly not a Microsoft “fan-boy”. I feel, in the area of personal computing, they have not innovated since Windows 95 and actually are holding the industry back. Don’t even get me started on Google: we’ll sell you out so you can get it for free and Apple: let’s put lipstick on a pig! Well, I guess if I have to use a pig, it might as well be pretty, lipstick and all, so I am writing this from a Mac.

So with that out of the way, why the Xbox?


I remember getting my hands on an Xbox back in the early 2000s in London and hooking it up to a flat widescreen CRT and 5.1 surround sound system at home. I bought two games initially: Dead or Alive 3 and Halo, and I was floored by the graphics. This was the first piece of hardware that could simulate high-resolution textures using bump mapping, pixel shading and the like and it looked super good.

Remember, the console was designed by the guys who designed DirectX (hence Xbox) and they knew just how to do it. The PC would be getting these features approx a year after the Xbox launched.

This was a far cry from the likes of Sony’s first 2 consoles: PS1 and PS2 which could only push out low-resolution textures and lacked hardware for pixel shading or bump mapping. In fact, even the PS3 was weak in those areas.

I must admit, Microsoft lost the plot with the Xbox One and was overtaken by Sony in the graphics department – but not by much. Most cross-platform games initially ran at a lower resolution or a lower framerate on the Xbox One. Microsoft however did correct course – freeing up system resources by removing Kinect and helping developers utilise the ESRAM better – and brought games up to better performance on the system.

The Xbox One X is a message from Microsoft to say: we are sorry, we messed up. We have engineered a box that is kick ass and will push the graphical boundaries in this generation. In fact, due to heavy customisation of the silicon inside the Xbox One X, its 6TFLOPs of power will go a long way to make it competitive with not just the current Playstation line-up but also well into the next generation.


The Xbox over 3 generations has built up a massive library of amazing games. It would take more time to play through them than I have as an adult, so I’m not going to run out of games to play anytime soon. Whoever is complaining that Xbox doesn’t have enough exclusives or good games should look at their shelves: I bet you haven’t played through the games you bought last month or even last year.

I think Microsoft’s new approach to backwards compatibility – making 3 generations of Xbox games playable on the Xbox One family of consoles and beyond – is a massive value proposition for both developers and gamers. There are over 1300 games developed specifically for the Xbox One, while 400+ Xbox 360 games are also playable, with more becoming compatible every week. The Xbox 360 had a library of over 1200 games over its lifetime. Even if Microsoft only made half of those available, we would have more games playable on Xbox One than on any other console in the history of gaming.

Then there is backwards compatibility for the original Xbox. There were over 1000 titles developed for that system. Even if Microsoft only brought forward a quarter of those, it would show that they honour the investment gamers have made into their platform and that is massive!

The value proposition and brand loyalty therefore make a lot more sense to me than buying a Playstation. Even IF Sony were to bring forward PS1 and PS2 games, I would lose interest because of the lack of good graphics or audio. I even detest most PS3 games because the lack of texture quality stands out for me almost immediately. Granted, PS4 games look and sound great. But that isn’t the whole story…


I don’t really know where Sony got the idea for the Playstation controllers, but I have always found their analogue sticks way too flimsy, without enough resistance to allow them to be accurate – rather frustrating! The PS4 controller is slightly better, but it’s neither an ergonomic nor a smart design. Maybe it’s designed for Japanese hands, but I need something a bit meatier to grab onto.

The Xbox controllers from the first generation Xbox were very well designed, for accurate aiming and great button / trigger positioning. The controller sits in the hand very comfortably even after hours of play.

The dual rumble motors of the Xbox controller also just feel much much better. It literally feels like they can simulate different textures: as you’re riding on the asphalt, as your motorbike speeds up, as you fly through the air, etc. It’s a sensational piece of kit. Yes, the Playstation controller used to have rumble motors, but since it didn’t sit in your palm properly, all you felt was this light piece of plastic vibrating in your hand. Let’s just say I was not impressed.

The final blow Microsoft managed to land on Sony’s flimsy controllers was the Xbox Elite Controller and the superb customisability of even the stock controller button layouts – per game no less!

Xbox Live and Services

It is no secret that Microsoft was the first company to successfully launch online gaming for consoles. Sure, they weren’t the pioneers (Sega were), but they were the first to get it right. Xbox Live is a very well-designed network, as is the way it integrates with games. Microsoft’s push to enable cross-platform play between consoles should also be praised. I think it’s the right thing to do for gamers, and Nintendo agrees. However, if I were playing on a Sony Playstation with a PS controller, I would surely have my behind kicked sooner than I could say “hey!”. Maybe I can see why they are scared… 😉

I would like to see Microsoft re-enable Xbox Live for original Xbox games – and keep it alive for Xbox 360 games. In fact, some of the most popular titles could get a rotation on Xbox Live with certain weekends dedicated to oldies where the service is enabled for that game for everyone to join in. That way, the smaller but faithful community for those older games could get together and have a blast.

Why the Xbox One X

The Xbox One X makes a lot of sense to me. Finally, Microsoft is back on the horse. They have engineered an Xbox that will be a tough act to follow for Sony. They have the best controllers, the best online services, an amazing catalogue of games over 3 console generations that will be playable on the machine over time – at great performance and improved graphical fidelity.

On that note, I would actually like to see Microsoft applying higher anti-aliasing to Xbox 360 and original Xbox games than they were originally released with. I know it’s not as simple as applying texture filtering (which will be done), but I think it would improve visual fidelity greatly if jaggies were gone from older-generation games. Microsoft, please make it happen!

As you might have guessed, jaggies are the last bug-bear I have with video games of today, and the Xbox One X will be the first console that promises to eliminate them by either a 4K resolution or downsampling for 1080p screens. I have to say that even the Xbox One S made great strides to reduce them. Have you tried the 4K output of the Xbox One S? It reduces jaggies considerably, so for me it was worth the price of admission – especially with a UHD Blu Ray drive.

As I said earlier, I think the Xbox One X could also become a machine that is kept alive beyond the Xbox One S and play next-generation games – albeit at a lower resolution or graphical fidelity. If Microsoft are smart, they will ensure games are forwards compatible (playable on older consoles), and the plumbing they have been doing under the hood of the Xbox One family of consoles will enable them to do just that – much more easily than the challenge of porting games back and forth between generations. This is something Sony has not even begun to work on properly, and I think over time it could cost them their console business, or at least a considerable amount of it. However, we want Microsoft on their toes, not complacent, so long live Sony!

In any case, that is only my view of why I love the Xbox brand. Whichever console you play, enjoy! Let me know in the comments below why you love your Playstation, Xbox, Nintendo or whatever else you’re playing. Happy Gaming!

How To Create Seating Platforms for Your Home Cinema

I will show you how to elevate your sofa to create proper cinema seating in your living area.


When I moved into my new place, I had the opportunity to create a separate Home Cinema and seating area in my living room. I decided that I wanted every seat in the house to be a good seat. This meant that I didn’t want my recliner sofas in an L-shape like in the previous place, but one behind the other, proper cinema style. However, to do this, I needed to elevate the back sofa enough that people in the back could see the screen comfortably, but also in a way that fits the room, looks stylish and is sturdy enough.

You can see the result of this work below:



What you will need to get the job done:

  1. Strong-enough MDF boards that can hold the weight of the seating. I simply went to our local hardware store (Bunnings), pulled MDF boards off the shelves, tested how much they bent and asked the hardware guys about their weight-holding capacity. I decided I was going to have two MDF boards – one next to the other – to hold one sofa up. (Approx. AU$40)
  2. Sturdy-enough legs that can hold the weight and won’t buckle. They also need to elevate the platforms enough that people sitting in the back can see 100% of the screen, so pick legs that are tall enough. I picked 23cm legs that could be extended by another 5cm. I decided to attach 5 legs per board initially, but settled on 6 legs in the end. (2 × 6 × AU$10.50 = AU$126)
  3. A cheap carpet by the metre that will look good when stapled to the MDF. Again, you should be able to find this in your local hardware store. (AU$16.90 a metre × 3 metres = AU$50.70)
  4. A pair of scissors.
  5. A stapler gun. (AU$16.90 with 1800 staples)

Altogether, the bill of materials was AU$233.60.
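If you want to double-check the totals before heading to the store, the sums above work out like this (prices are what I paid; yours will differ):

```python
# Bill of materials for the two platforms, in AU$ (prices as listed above)
mdf_boards = 40.00        # two MDF boards, approx.
legs = 2 * 6 * 10.50      # 6 legs per board, two boards
carpet = 3 * 16.90        # 3 metres at AU$16.90/metre
stapler_gun = 16.90       # stapler gun with 1800 staples

total = mdf_boards + legs + carpet + stapler_gun
print(f"AU${total:.2f}")  # → AU$233.60
```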

You will need to measure the space out so that the platforms sit against a wall or, preferably, in a corner, so they don’t move. As an option, you could also put rubber or carpet underneath the feet of the platforms to make sure they don’t slide. Get your local hardware store to cut the MDF to the correct size.

Putting it Together

First, fix the legs onto the MDF platforms: one on each corner and one in the centre. After initial testing, I decided to add a third leg along the front edge so the platforms would hold larger people as they step up onto the platform.

Next, I put the carpet onto the platforms and cut it to size so it hangs over the bottom edges of the platforms.

Lastly, I stapled the carpet to the platform from the bottom. When stapling, you need to pay attention to the corners so the sides fold properly onto each other. To achieve this, you will need to cut them diagonally, making sure you don’t cut into the corner hanging onto the top side of the MDF.

Please see below for a picture of the result from the bottom and the top side.


Putting Them into Place

The sofas had to come forward, then the platforms were put into place next to each other. The back sofa was then put onto the platforms and rests solidly on them. The platforms are sturdy enough to bear the weight of two people and the sofa comfortably without any instability.

I also saved materials for a third platform that goes in between these two, should I decide to elevate my bigger sofa instead of the smaller one.

If you do embark on this journey, good luck and enjoy!

Loudness Correction and Yamaha YPAO Volume

Here you were thinking you had it all sorted once you applied EQ to your system, hey? Not so fast… A flat EQ is not going to give you correct tonal balance after all. Meet Loudness Correction.

What is Loudness Correction?

Human hearing is not as sensitive in the higher and lower frequency ranges as it is in the mid-range. What this means in practice is that when you are not listening to a recorded program – be it music, movies or TV shows – at the volume it was recorded at or intended to be played at, the treble and bass seem to drop off quicker than the midrange as the volume is decreased. This actually upsets the tonal characteristics of the program material – even though you may have a completely flat frequency response from your speakers.

Here you were thinking you had it all sorted once you applied EQ to your system, hey? Not so fast… Now you actually know why a flat frequency response is only perceived as flat when the system is playing at the right volume – unless, of course, we apply loudness correction.

I won’t go into the biological reasons for this or the estimations of how human hearing responds to changes in sound pressure. If you want to know more about the technical ins and outs of Loudness Correction, do a search for any of the following terms on the Internet:

  • Fletcher-Munson equal loudness contours
  • Robinson-Dadson curves
  • Normal Equal-Loudness Level Contours, ISO 226:1987

Loudness correction is not something new; it has been around in stereo systems for the last 30+ years. In fact, I have a loudness control on the 12-year-old stereo system in my car. However, automatic loudness correction is relatively new in the field of Home Cinema – first introduced about 8 years ago by Denon, Marantz and other receiver manufacturers through the inclusion of Audyssey’s Dynamic EQ, which is still the most sophisticated to date. Since then, THX, Dolby and more recently Yamaha have come up with their own versions of it.

When it comes to movies, reference level for all channels – except the subwoofer channel – is calibrated by adjusting the playback system so that a pink noise signal recorded at -20dB relative to full scale (0dBFS) produces an 85dB sound pressure level, measured with a C-weighted SPL meter at the seating positions. Levels are adjusted for each channel individually until each reaches 85dB.
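As a quick illustration of the arithmetic involved (my own sketch, not part of any receiver’s firmware), the trim for each channel is simply the difference between the 85dB target and what the meter reads:

```python
REFERENCE_SPL = 85.0  # dB SPL target for a -20dBFS pink-noise signal (C-weighted)

def channel_trim(measured_spl: float) -> float:
    """Return the level trim (dB) to apply to one channel so its
    pink-noise test tone reads 85dB SPL at the seating position."""
    return REFERENCE_SPL - measured_spl

# A channel measuring 82.5dB needs a +2.5dB trim;
# one measuring 87dB needs a -2dB trim.
```

This is exactly what an auto-setup routine does with its measurement mic, just without you holding the SPL meter.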


The premise is valid: when listening to movies below the reference level (85dB), the tonal characteristic of the movie is changed. However, since loudness correction is not an exact science – human hearing is difficult to measure precisely – the different systems implement it slightly differently.

Yamaha YPAO Volume

I took some measurements of Yamaha’s YPAO Volume to see how much loudness correction is applied to the high end (above 6.5kHz) and the low end (between 20Hz and 400Hz). The results are below:

  • volume at -40dB: 4dB added to both the high and low end
  • volume at -35dB: 3dB
  • volume at -30dB: 2dB
  • volume at -25dB: 1dB
  • volume at -20dB: 0dB (I originally noted this at -22dB, but looking at the graphs again, it should be -20dB)
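The pattern in these measurements is neat: roughly 1dB of correction for every 5dB the master volume sits below -20dB, topping out at the 4dB I measured at -40dB. A sketch of that relationship (my own curve fit to the numbers above, not Yamaha’s actual algorithm):

```python
def ypao_volume_boost(master_volume: float) -> float:
    """Approximate shelf boost (dB) applied to both the low end
    (20Hz-400Hz) and the high end (above 6.5kHz), fitted to my
    measurements: 1dB per 5dB below -20dB, capped at 4dB."""
    boost = (-20.0 - master_volume) / 5.0
    return max(0.0, min(boost, 4.0))

# ypao_volume_boost(-40.0) -> 4.0, matching the table above;
# at or above -20dB no correction is applied.
```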

[Graph: YPAO Volume correction at each master volume level]

Since I don’t own a unit with YPAO Volume – Yamaha sent me a unit (RX-A3060) to test for this – I wanted to know whether I could re-create the effect using just the bass and treble controls on my own unit. Below are the results:

[Graph: YPAO Volume vs. bass/treble matching at -40dB]

As you can see from the graph, at -40dB, adding 4dB to both treble and bass matches the low end very closely, but the high end is not as well aligned between 3kHz and 8kHz. Let’s have a look at higher volume levels:

[Graph: YPAO Volume vs. bass/treble matching at -30dB]

At -30dB, adding 2dB to both the high and low end, the effect is smaller – but so are the errors. Since I normally listen between -30dB and -25dB, I have permanently added 1.5dB to both bass and treble on my older, roughly equivalent RX-A3020.

How does it sound in practice? I have to be honest: Yamaha’s new RX-A3060 sounds noticeably clearer than my RX-A3020, even with the loudness curve matched as closely as possible using the bass and treble controls and both units set up the same. (Please note that for the graphs above, the subwoofer EQ was turned off and the Yamaha was left to its own devices when it came to EQ.)

Was it due to YPAO Volume – which works really well when listening to both movies and music at lower volumes – or was it the difference in DACs, the new 64-bit YPAO, or other component changes between the units? It is hard to know, but likely a combination of all of the above. In the same room, with the same speakers, the difference in clarity was noticeable.



Yamaha YPAO Configuration – The Right Way


I recently visited a friend who had just bought the latest and greatest Aventage receiver from Yamaha and AU$5000 worth of speakers, but didn’t bother to read the manual on how to set the receiver up correctly.

Good sound depends just as much on doing the setup correctly as it does on buying great gear. I was shocked and horrified to learn that after spending almost AU$10K on his setup, he didn’t bother to go through the expert setup procedure: he simply placed the mic on the couch (a big no-no), measured one microphone position and off he went. Then he complained to me that the system sounded like $h1t, pardon my French! Of course it did…

After fixing his setup, I realised many people do the same, so it’s time to show you how to do a Yamaha YPAO – and, in fact, most other home cinema receiver – setup correctly. Some of the advice below applies generally to all receivers; I will highlight YPAO-only advice as we go.

General Setup

Before anything else, go into the Manual Setup section under Speaker Setup and set the amp assignment correctly, depending on which speaker terminals you used for which speakers and whether you used external amplification. You will need to refer to your receiver’s user manual, as this differs depending on the make and model – even among Yamaha receivers alone.

Microphone Placement

Firstly, auto setup should be configured as follows:

  • Multi-position: select this even if you only listen from one chair in one position. Although manufacturers include a single-placement setup procedure, it is nearly impossible to get great sound from just one sample – it will either over-correct or under-correct for variations in frequency response. ALWAYS select multi-position measurement.
  • Angle Measurement (YPAO only): this ensures that CinemaDSP is configured correctly and, if you have an Atmos or DTS:X enabled receiver, that the height information is used to steer sounds correctly to the different speakers. I recommend always selecting this option as well.

Ok, now with regards to Microphone placement:

  • DO NOT place the microphone on a hard surface like a table, or directly on your couch (either the seat or the headrest). It will pick up reflections from the couch or hard surface in a way that is not natural. Unless you plan on listening to your system by putting your ear where your @$$ is, or by placing your head on the coffee table and resting your foot on the couch (sounds rather uncomfortable if you ask me!), do not do this.
  • DO place the microphone on a tripod. The microphone has a threaded socket that fits a standard camera tripod. If you don’t have one of those, elevate the microphone by taping it to an upside-down glass (the kind you drink out of). I usually use a plastic cup with a slim bottom so it doesn’t act as a hard surface for reflections; make sure you only tape the bottom of the mic. When I do this, I raise the glass or cup to ear height using soft pillows – not another hard surface like a box or, as my friend did, a hollow metal side table.
  • DO make sure the microphone has a “line of sight” view of all the speakers in the room – at every microphone position – if at all possible. The only exception is the subwoofer, which does not need a line of sight to the mic. I make the occasional exception when it is totally unavoidable. For example, my back surrounds sit on a bookshelf that the back seats don’t have a line of sight to, so in my calibration that includes all seating positions, some of the rear mic positions lacked line of sight – though all the front seats had it. Not a huge deal, but I don’t use that calibration when we only sit in the front row, as it’s not ideal.

Have a look at the diagram below, variations of which you can find in Yamaha’s documentation. The version below is the correct one, with position number 1 in the centre of the couch (or chair). Some versions of this diagram incorrectly put number 1 at the side of the couch.


However, this is where it ends for me. I usually do the following, which seems to work with YPAO and other receivers as well:

  • Position 1: in the centre of the couch with the microphone at ear height, away from the headrest. Basically, this is right in the centre of where your bum will be, but at ear height. This first position needs to be in the centre because the delay/distance measurements are taken from it: place it off-centre and some seats would get better distance measurements while others get much worse. We want to even this out so every seat experiences the best sound the room allows.
  • Position 2: same as position 1, but on the seat to the right of the centre seat. On a 3-seater couch, this is the right-most seat.
  • Position 3: same as positions 1 and 2, but to the left. In case you’re wondering, it doesn’t matter whether you go left or right first.
  • Positions 4-6: now I repeat the measurements from positions 1 to 3, but with the microphone much closer to the headrest, where your ears will actually be – just high enough to clear the back of the couch by a centimetre while still picking up some of the reflections off it. Why? Because this is where your ears will be AND reflections originating from the headrest will muffle the sound.
  • Positions 7 and 8: I normally place these between positions 1-2 and 1-3 (just a little off-centre), at the same height and distance from the back of the couch as positions 1-3. This gives the receiver enough variation to capture the room acoustics without overwhelming it with corrections for that particular couch.

If you are configuring a single seating position – a chair or armchair – do the above exactly, but divide the chair up equally, or take some positions to the left and right of the chair at ear height, as well as slightly in front of and above ear height.

If you have multiple couches, I recommend taking more positions on the couch that will be your primary listening position and spreading the rest across the other couch(es), following the principles above.

Angle Measurement (YPAO only)

YPAO in higher-end Yamaha receivers allows you to do something called an angle measurement. Do this at Position 1 above, using the little boomerang that came with your receiver, like the one below.

YPAO mic

The positions are marked. Position 1 on the boomerang must face towards your front speakers, with 2 and 3 towards the back. The boomerang can be fixed to a tripod as well – I really recommend this – otherwise the taping method works here too.

Measure each position by following the on-screen guidance. Please note that the height measurement is only available on Atmos and DTS:X receivers.

Manual Configuration

Cabling and Crossover

Once YPAO (or other Receiver config) has finished, check the following:

  1. Speaker cabling is reported as normal (not reversed) for all speakers.
  2. What crossover frequency was set for each of your speakers.

Even though your speakers may be able to reproduce frequencies below 80Hz, it is not recommended to go below this crossover for movies, as those frequencies are hard to control even from the subwoofer alone. Your mileage may vary, however: if you don’t have a separate subwoofer EQ and are relying only on the Yamaha, you may want to leave crossovers below 80Hz, or leave speakers set to “large” so no crossover is applied to them. As a rule of thumb, if Yamaha configured a speaker with a high crossover, you should not lower it (e.g. from 100Hz to 80Hz); you can, however, raise it (e.g. from 60Hz to 80Hz).

The reason you should not lower crossover frequencies is that the setup routine determined that anything below the chosen crossover doesn’t reach the seating positions without major dips in frequency response (-3dB, which is half the acoustic power). If you lower the crossover, you may still get an uneven response, or may not hear certain frequencies at all, as the subwoofer is no longer playing them back either.
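The “raise only, never lower” rule of thumb is easy to encode. A minimal sketch (hypothetical function name, not anything in the receiver):

```python
def safe_crossover(auto_set_hz: int, requested_hz: int) -> int:
    """Never go below the crossover the auto-setup chose – it found
    dips below that frequency – but raising it is fine."""
    return max(auto_set_hz, requested_hz)

# Yamaha set 100Hz and you want 80Hz -> stay at 100Hz.
# Yamaha set 60Hz and you want 80Hz  -> 80Hz is fine.
```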

If the crossover is set much higher than you know the speakers can reproduce, repeat the setup routine, paying close attention to mic placement. If they still come up short, consider aiming the speakers so the woofers point at the main listening positions.

Equalisation Curve

The equaliser on Yamaha receivers can be set to the following:

  • FLAT: the default. Set this if you watch a lot of TV (not just movies), or if you listen below reference level (around -17 on the Yamaha volume scale, 0 on THX and Audyssey-enabled receivers) and your room has lots of soft furnishings.
  • NATURAL: the old CinemaEQ curve. It tames the high end so that movie soundtracks don’t sound too bright in a normal living room with lots of hard surfaces – floors and walls without carpet or other soft coverings. Also use this if you listen to movies LOUD: even without hard surfaces, simply sitting much closer to the speakers than you would in a cinema elevates the high frequencies beyond what was intended. Please note that some DVDs and Blu-rays had EQ applied during mastering to lower the high frequencies for home playback. If a movie sounds too muffled with this setting, you have two options: raise the treble by 2dB, which may work, or switch to the flat curve.
  • FRONT: leaves your front left and right speakers alone and timbre-matches all the other speakers to them. Unless you love how your front speakers sound because they are some ultra-expensive supersonic beasts, do not set this. Leave it alone – it exists to please the special few who spent $50K+ on their front two speakers.
  • THROUGH: no EQ is applied. Why would you do this unless you sit inside an anechoic chamber? Seriously, don’t!

Other settings

  1. Adaptive DSP should be switched on. It varies the DSP strength based on volume.
  2. Adaptive DRC (Dynamic Range Control) should be switched on if you listen at lower volume levels (below -25). Above -25 I recommend switching it off, as it can introduce harshness to certain sounds on any receiver without YPAO Volume.
  3. YPAO Volume should be switched on. I am going to review it in June 2017, at which point I’ll make further recommendations and check whether it needs to be switched off for reference listening (anything above -25, really).

Surround and Surround Back Speakers Volume Levels

The only major issue I see with Yamaha receivers – or most receivers without Audyssey – is that they don’t vary surround speaker levels with volume. When listening to surround programmes at lower volumes, the surround and surround back levels may need to be increased in certain setups to maintain the same surround envelopment. I have heard different views on this, but it seems that people with surround speakers at ear height have less trouble here than those whose surrounds sit somewhat above ear height or further away than the front speakers.

I believe both Audyssey and Dolby have researched this and found it to be true, however. I find the same in my home cinema, where the surround speakers sit somewhat higher than my fronts, while my presence speakers are on a third plane altogether, near the ceiling. I normally increase the surround and surround back channels by 1-2dB, depending on how loud we’re listening in general. You could even create a reference setting and a low-volume setting, turning this and other features on and off – on Yamaha you can do this using something called “Scenes”.
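If you wanted to formalise that two-Scene idea, it might look something like this – all names and values here are hypothetical, simply mirroring the thresholds mentioned above (Adaptive DRC below -25, a 1-2dB surround bump for quiet listening); adjust to taste:

```python
# Two hypothetical presets mirroring the advice in this article:
# Adaptive DRC on and surrounds boosted for quiet listening,
# everything neutral at reference-ish levels.
SCENES = {
    "reference":  {"adaptive_drc": False, "surround_trim_db": 0.0},
    "low_volume": {"adaptive_drc": True,  "surround_trim_db": 2.0},
}

def pick_scene(master_volume: float) -> str:
    """Choose a preset based on master volume, using the -25
    threshold from the Adaptive DRC recommendation."""
    return "low_volume" if master_volume < -25.0 else "reference"
```

On an actual Yamaha receiver you would store these as two Scene buttons rather than code, but the decision logic is the same.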

Give me feedback in the comments below and let me know how you go! Happy Listening!