Wii U Uses a Memory-Heavy Hardware Design, Nintendo Developers Overcoming Adjustment Hurdles

When the Wii U’s CPU clock speed was revealed, there was some initial anxiety about whether the console was actually more powerful than PlayStation 3 or Xbox 360. Of course, we know enough to say that not all CPUs can be compared directly – and in the case of Wii U, we now officially know why: Wii U uses a memory-intensive processor design, devoting much of its silicon to cache memory rather than relying on a fast, power-hungry CPU clock. This has been suspected for some time, but it’s nice to see it spelled out once and for all by one of the hardware leads, Mr. Genyo Takeda.

The memory-heavy approach also seems to be the route that the other next-gen platforms have selected, with PS4 rumored to pack 4GB of RAM – twice the 2GB included in Wii U – so it looks like this is going to be the standard for the next several years at least.

Mr. Miyamoto and Mr. Iwata also talked about overcoming the hurdles of developing for next-gen hardware. They report that Nintendo really started making progress in this area at the end of last year and is now approaching further significant leaps in development. Read on for their full comments.

On overcoming development hurdles

Q: Is the current development structure suitable for the new architecture?

Miyamoto: We have not specifically changed it. We have just put the right development staff members in the right places to raise the level of each development phase. The other point is that many of our third-party software developers have dedicated themselves to technologies like shaders. As Wii U is designed to bring out their real strengths, there have recently been more cases where we develop something with their help. It has been more convenient for us to work together with them because they have been able to apply their know-how more smoothly to development for Wii U.

Iwata: I may add that each game console has its own unique qualities, and developers must go through a trial-and-error phase to acquire the knack of taking full advantage of them. This stage does not come until the final version of the hardware and its development tools have been made available and a base for software development has been established. For Wii U, that time finally came in the latter half of last year. In this sense, we could not avoid the trial-and-error stage of creating games which take full advantage of the hardware. I think this is true for third-party software developers as well as Nintendo’s.

The home consoles of other companies are six or seven years old, and software developers have studied them thoroughly and know how to take full advantage of them. As Wii U is new to them, some developers have already acquired the knack and made good use of its features, while others have not. You might see this gap among the games that are currently available. However, we are not much concerned about this problem because time will eventually solve it. In fact, we believe that our in-house development teams have almost reached the next stage. It is not the case that our development is deadlocked with serious trouble; otherwise, we could not aim for 100 billion yen or more in operating profit for the next fiscal year.

On Wii U hardware power

Genyo Takeda (Senior Managing Director, General Manager of Integrated Research and Development Division): I don’t want to get too technical, but in my view, Wii U is a console with low power consumption and fairly high performance. Regarding your comment that we focused on the GPU and that the CPU is a little weak, we have a different view. It depends on how you evaluate a processing unit.

In terms of die size (the area a chip occupies), the GPU certainly occupies a much larger space than the CPU. As you can see in CPUs used for the latest PCs and servers, however, it is now usual for the logic that performs actual calculations to be quite small, with the surrounding cache memory, called SRAM, covering a large area. From this angle, we don’t think the performance of the Wii U’s CPU is worse than that of the GPU. In other words, we have taken a so-called “memory-intensified” design approach for the Wii U hardware. It is no use saying much about hardware which should remain in the background of our entertainment offerings, but at least we think that Wii U performs pretty well.

As for GPUs, the technology is mature enough that the other companies in the video game market appear to be on the same path, and developers have grown accustomed to creating games with programmable shaders. In this sense, we think the entire industry, including Nintendo, has had less trouble in this field than it did when shaders were first emerging.

Source: Nintendo Third Quarter Financial Results Briefing for the 73rd Fiscal Term Ending March 2013
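
To put Takeda’s “memory-intensified” point in concrete terms: a CPU whose die is dominated by SRAM cache rewards code that keeps its working set in that cache, and stalls when it has to wander out to main memory. Below is a minimal, generic C sketch – not Wii U code, and the array size, build command, and timings are illustrative assumptions only – that sums the same values twice, once in cache-friendly order and once in cache-hostile order.

    /* locality.c – generic demo of cache-friendly vs. cache-hostile access.
       Illustrative only; not Wii U SDK code. Build: cc -O2 locality.c */
    #include <stdio.h>
    #include <time.h>

    #define N 4096  /* 4096 x 4096 ints = 64 MiB, far larger than a typical cache */

    static int grid[N][N];

    /* Row-major walk: consecutive reads fall on the same cache line,
       so almost every access is served from on-die SRAM. */
    static long long sum_rows(void) {
        long long s = 0;
        for (int i = 0; i < N; i++)
            for (int j = 0; j < N; j++)
                s += grid[i][j];
        return s;
    }

    /* Column-major walk: successive reads are 16 KiB apart (N * sizeof(int)),
       so nearly every access misses the cache and stalls on main memory. */
    static long long sum_cols(void) {
        long long s = 0;
        for (int j = 0; j < N; j++)
            for (int i = 0; i < N; i++)
                s += grid[i][j];
        return s;
    }

    int main(void) {
        for (int i = 0; i < N; i++)
            for (int j = 0; j < N; j++)
                grid[i][j] = i ^ j;  /* arbitrary non-zero fill */

        clock_t t0 = clock();
        long long r = sum_rows();
        clock_t t1 = clock();
        long long c = sum_cols();
        clock_t t2 = clock();

        printf("row-major: sum=%lld  %.2fs\n", r, (double)(t1 - t0) / CLOCKS_PER_SEC);
        printf("col-major: sum=%lld  %.2fs\n", c, (double)(t2 - t1) / CLOCKS_PER_SEC);
        return 0;
    }

Both loops do identical arithmetic, yet on a typical desktop the column-major version runs several times slower because it is bounded by DRAM rather than cache. Spending die area on SRAM, as Takeda describes, is a bet that workloads can be shaped to stay in that fast memory instead of chasing raw clock speed.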

  • d

    the thing is, something new will always come out. So now they’ve got shaders, cool, and next sony/xbox will have luminous engine

    • Skyward Schlong

      Nintendo banks on development for affordable, established technologies. It’s a strategy that doesn’t appeal to everyone, but their business has survived this long. I play Nintendo. I hardly know what shaders are, and a “luminous engine” sounds like something out of science-fiction.

      • RestlessPoon

        Your name is beautiful

    • Erimgard

      I think Luminous Engine will be pretty smalltime compared to Unreal 4 and FOX Engine.

  • Unimportant Bystander

    What’s right isn’t always popular.

  • tipoo2

    It’s just one excuse after another with these guys. We know the main memory has a maximum bandwidth of 12.8GB/s, and the eDRAM is now thought to be 32GB/s from the leaks on neogaf from someone in the know, or if that’s fake a maximum of 70GB/s. You can’t really call that “memory intensified”.