Ubisoft caught in a little Assassin’s Creed scandal

User avatar
Blín D'ñero
Site Admin
Posts: 9969
Joined: 17 Feb 2008, 02:05
Location: Netherlands
Contact:

Ubisoft caught in a little Assassin’s Creed scandal

Post by Blín D'ñero »

Assassin's Creed: DX10.1 support will be removed.
Hi guys,

I’m glad to see some familiar faces around here and it’s cool to be back! I used to work in Ubisoft’s UK office but I’m now working in the Montreal studio, where they developed Assassin’s Creed, and for the foreseeable future I’ll be here as your community developer.

Having me here basically means that I’ll be giving you updates on what’s going on with Assassin’s Creed and the first bit of news I have is that we’re planning to release a patch for the PC version of Assassin’s Creed that addresses the majority of issues reported by fans. In addition to addressing reported glitches, the patch will remove support for DX10.1, since we need to rework its implementation. The performance gains seen by players who are currently playing Assassin’s Creed with a DX10.1 graphics card are in large part due to the fact that our implementation removes a render pass during post-effect which is costly.

We don’t have an exact date for the release of this patch but we’re hard at work and hope to have further updates soon.

Thanks guys

UR
Source
Main PC: Asus TUF Gaming 570-Pro (wi-fi) * AMD Ryzen 7 5800X * Noctua NH-D15 * Corsair Vengeance LPX 32GB * Asus TUF Radeon 6800XT * Creative AE-9PE * 2 x Samsung 980 Pro * 7 x WD Gold HDD * Corsair HX 1000 * 1 x Asus DRW-24D5MT * Dell U3010 * Windows 10 x64 *

Office PC: Asus ROG Strix X570-E * AMD Ryzen 7 3800X * Noctua NH-D15 * Corsair Vengeance LPX 32GB * MSI Radeon 5700XT * Creative Soundblaster ZxR * 2 x Corsair Force MP600 * 7 x WD Gold HDD * Corsair AX 1200W * 1 x Asus DRW-24D5MT * Dell P4317Q * Windows 10 x64 *

Old workhorse PC: * Intel i7 4790K * Noctua NH-D15S * Asus Maximus VII Hero * Corsair Force MP510 480GB M.2 SSD * 32 GB Corsair Dominator Platinum CMD32GX3M4A2133C9 * Sapphire Radeon R9 290 * 3 x Dell U2410 @ Eyefinity 5760 x 1200 * Corsair HX 1000i * 7 x WD Black / Gold HDDs * Creative Soundblaster ZxR * Asus DRW F1ST * Corsair K95 RGB * Corsair M65 PRO RGB * Steelseries 9HD * Coolermaster STC T01 * Edifier S530 * Sennheiser HD598 * Windows 10 x64 *

Ubisoft comments on Assassin's Creed DX10.1 controversy

Post by Blín D'ñero »

by Scott Wasson — 5:01 PM on May 8, 2008

We have been following a brewing controversy over the PC version of Assassin's Creed and its support for AMD Radeon graphics cards with DirectX 10.1 for some time now. The folks at Rage3D first broke this story by noting some major performance gains in the game on a Radeon HD 3870 X2 with antialiasing enabled after Vista Service Pack 1 is installed—gains of up to 20%. Vista SP1, of course, adds support for DirectX version 10.1, among other things. Rage3D's Alex Voicu also demonstrated some instances of higher quality antialiasing—some edges were touched that otherwise would not be—with DX10.1. Currently, only Radeon HD 3000-series GPUs are DX10.1-capable, and given AMD's struggles of late, the positive news about DX10.1 support in a major game seemed like a much-needed ray of hope for the company and for Radeon owners.


After that article, things began to snowball, first with confirmation that Assassin's Creed did indeed ship with DX10.1 support, and then with Ubisoft's announcement about a forthcoming patch for the game. The announcement included a rather cryptic explanation of why the DX10.1 code improved performance, but strangely, it also said Ubisoft would be stripping out DX10.1 in the upcoming patch.

In addition to addressing reported glitches, the patch will remove support for DX10.1, since we need to rework its implementation. The performance gains seen by players who are currently playing Assassin’s Creed with a DX10.1 graphics card are in large part due to the fact that our implementation removes a render pass during post-effect which is costly.

This statement raised a whole new set of questions: What exactly is the "costly" render pass that's being removed in DX10.1? Does it impact image quality or just improve performance? And what are Ubisoft's motives for removing the DX10.1 code path?

Rage3D posted a follow-up article noting some very slight image quality anomalies with DX10.1, but nothing major. Other sites, including PC Games Hardware in Germany and HardOCP, reproduced Rage3D's findings about performance increases and minor image quality changes in DX10.1.

Perhaps the DirectX 10.1 code path in Assassin's Creed needed some work, as Ubisoft claimed, but why remove DX10.1 support rather than fix it? The rumor mill creaked to life, with folks insinuating Ubisoft decided to nix DX10.1 support in response to pressure from Nvidia after the GPU maker sponsored Assassin's Creed via its The Way It's Meant To Be Played program. Our conversations with multiple credible sources in the industry gave some credence to this scenario, suggesting the marketing partnership with Nvidia may have been a disincentive for Ubisoft to complete its DirectX 10.1 development efforts.

Our next step was to ask Ubisoft some specific questions about DX10.1 support in Assassin's Creed, in order to better understand what's happening. Fortunately, Charles Beauchemin, the tech lead for the Assassin's Creed development team, was kind enough to answer our questions. Those questions, and his answers, follow.

TR: First, what is the nature of the "costly" "post-effect" removed in Assassin's Creed's DX10.1 implementation? Is it related to antialiasing? Tone mapping?

Beauchemin: The post-effects are used to generate a special look to the game. This means some color correction, glow, and other visual effects that give the unique graphical ambiance to the game. They are also used for game play, like character selection, eagle-eye vision coloring, etc.

TR: Does the removal of this "render pass during post-effect" in the DX10.1 have an impact on image quality in the game?

Beauchemin: With DirectX 10.1, we are able to re-use an existing buffer to render the post-effects instead of having to render it again with different attributes. However, with the implementation of the retail version, we found a problem that caused the post-effects to fail to render properly.

TR: Is this "render pass during post-effect" somehow made unnecessary by DirectX 10.1?

Beauchemin: The DirectX 10.1 API enables us to re-use one of our depth buffers without having to render it twice, once with AA and once without.

TR: What other image quality and/or performance enhancements does the DX10.1 code path in the game offer?

Beauchemin: There is no visual difference for the gamer. Only the performance is affected.

TR: What specific factors led to DX10.1 support's removal in patch 1?

Beauchemin: Our DX10.1 implementation was not properly done and we didn't want the users with Vista SP1 and DX10.1-enabled cards to have a bad gaming experience.

TR: Finally, what is the future of DX10.1 support in Assassin's Creed? Will it be restored in a future patch for the game?

Beauchemin: We are currently investigating this situation.


So we have confirmation that the performance gains on Radeons in DirectX 10.1 are indeed legitimate. The removal of the rendering pass is made possible by DX10.1's antialiasing improvements and should not affect image quality. Ubisoft claims it's pulling DX10.1 support in the patch because of a bug, but is non-committal on whether DX10.1 capability will be restored in a future patch for the game.
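Beauchemin's answers boil down to pass counting: with DX10.1 the existing multisampled depth buffer can be reused during the post-effects, so a depth-only re-render disappears from the frame. A toy Python cost model makes the arithmetic concrete. The pass names and millisecond costs here are invented for illustration (this is not Ubisoft's actual pipeline), chosen so the saved pass is roughly 20% of the frame, in line with the gains Rage3D reported:

```python
# Toy cost model (illustrative only, not Ubisoft's renderer): compare a
# DX10.0-style frame that must render depth a second time when MSAA is on
# against a DX10.1-style frame that reads the existing MSAA depth buffer
# directly during post-processing.

# Hypothetical per-pass costs in milliseconds; only the structure
# (one pass saved) mirrors the explanation given in the article.
PASS_COST_MS = {
    "geometry": 10.0,
    "depth_only": 4.0,   # the extra pass DX10.1 makes unnecessary
    "post_effects": 6.0,
}

def frame_time_ms(supports_dx10_1: bool, msaa: bool) -> float:
    """Sum the per-pass costs for one frame of the toy pipeline."""
    total = PASS_COST_MS["geometry"] + PASS_COST_MS["post_effects"]
    if msaa and not supports_dx10_1:
        # DX10.0 cannot bind the multisampled depth buffer as a shader
        # resource, so the depth information is rendered a second time.
        total += PASS_COST_MS["depth_only"]
    return total

dx10_0 = frame_time_ms(supports_dx10_1=False, msaa=True)
dx10_1 = frame_time_ms(supports_dx10_1=True, msaa=True)
saved = (dx10_0 - dx10_1) / dx10_0  # fraction of frame time saved
print(f"DX10.0: {dx10_0} ms, DX10.1: {dx10_1} ms, saved {saved:.0%}")
```

Note that the saving only materializes with antialiasing enabled, which matches the benchmarks: without AA the two code paths cost the same in this model, just as Ubisoft says both APIs use the same pipeline without AA.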

The big question now is what happens next. It's not hard to surmise that AMD's developer relations team stands ready to assist Ubisoft with fixing Assassin's Creed's DX10.1 code path as quickly as possible, and that doing so ought to be relatively straightforward, since the game's developers have said DX10.1 simply allows them to reuse a depth buffer without re-rendering it.

One would hope that all parties involved, including Ubisoft and Nvidia, would encourage the Assassin's Creed development team to complete its DX10.1 development work in a timely fashion—not to abandon it or to delay its completion until Nvidia also has a DX10.1-capable GPU on the market.

After all, Nvidia recently signed on to the PC Gaming Alliance, whose charter involves pushing common standards like DX10.1 and increasing "the number of PCs that can run games really well." Assassin's Creed is nothing if not a perfect candidate for assistance on this front: a high-profile console port that's gaining a reputation for steep hardware requirements and iffy performance on the PC. How can such an alliance succeed if one of its members is working at cross-purposes with it in a case like this one? And what would the owner of an nForce-based system with a Radeon graphics card think upon learning that Nvidia's marketing dollars had served to weaken his gaming experience?

We'll be watching to see what happens next. For our part, the outcome will affect whether and how we use Assassin's Creed and other Ubisoft and Nvidia "TWIMTBP" titles in our future GPU evaluations.
Source

Ubisoft caught in a little Assassin’s Creed scandal

Post by Blín D'ñero »

By Theo Valich
Thursday, May 08, 2008 00:01


TG Daily investigates – Amid talk that the PC’s days as a major gaming platform may be numbered, blockbuster titles such as Assassin’s Creed are welcome signs that show just the opposite. Sadly, it is very likely that this game will be remembered for a controversy surrounding a strange decision to remove support for DirectX 10.1, which handed an initial performance advantage for ATI’s Radeon cards over to Nvidia. Did Nvidia have its hands in this one? We looked a bit closer to find out.

In the beginning, everything looked perfect. The DX10.1 support included in Assassin’s Creed enabled anti-aliasing in a single pass, which allowed ATI Radeon HD 3000 hardware (which supports DX10.1) to flaunt a competitive advantage over Nvidia hardware (which supports only DX10.0). But Assassin's Creed had problems. We noticed various reports citing stability issues such as widescreen scaling problems, camera loops and crashes - mostly on Nvidia hardware.

Ubisoft became aware of these complaints, which ultimately led to the announcement of a patch. According to Ubisoft Montreal, this patch will remove support for DX10.1 - and it was exactly this that caused Internet forums to catch fire.

So, what is it that convinced Ubisoft to drop the DirectX 10.1 code path? Here is the official explanation:

“We’re planning to release a patch for the PC version of Assassin’s Creed that addresses the majority of issues reported by fans. In addition to addressing reported glitches, the patch will remove support for DX10.1, since we need to rework its implementation. The performance gains seen by players who are currently playing Assassin’s Creed with a DX10.1 graphics card are in large part due to the fact that our implementation removes a render pass during post-effect which is costly.”


We could certainly take this statement with a grain of salt, but the game developer did not address a statement made by ATI's Developer Relations team, released when the game was introduced:

“Ubisoft [is] at the forefront of technology adoption, as showcased with the fantastic Assassin’s Creed title. In this instance our developer relations team worked directly with the developer and found an area of code that could be executed more optimally under DX10.1 operation, thus benefiting the ATI Radeon HD 3000 Series.”

Let’s get this straight: the game was released, worked better on ATI hardware, supported an API that Nvidia and Intel didn't - and still don't - support, and then the developer is releasing a patch that will kill that advantage. This brought back memories of Tomb Raider: The Angel of Darkness, a TWIMTBP-supported game that worked better on ATI hardware at the time of release, because Nvidia was plagued with performance issues with its GeForce FX series.

Assassin's Creed is an Nvidia-branded “The Way It's Meant To Be Played” title, and it didn't take long until rumors of possible foul play by Nvidia surfaced. Some voices on Internet forums allege that Nvidia threatened Ubisoft and asked the developer to remove DirectX 10.1. We kept our eyes on this development, but when we started to receive e-mails from graphics card manufacturers (both ATI and Nvidia customers), adding to the already heated discussion of what may have happened in the background, we decided to shift our investigation into higher gear and talk to all parties involved.


DirectX 10.0 vs. DirectX 10.1 in Assassin's Creed effects

The difference the developers failed to explain lies in how anti-aliasing is handled in DirectX 10.0 versus 10.1. In DX10.0, it was impossible to access the information for each sample in a depth buffer, which led to a costly slowdown in anti-aliasing operations. DX10.1 allows the shader units to access all anti-aliasing buffers. All of this was brought to light by an article over at Rage3D (https://www.rage3d.com/articles/assassinscreed/).
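To make that API difference concrete, here is a minimal sketch in plain Python (not Direct3D code; the class, buffer dimensions, and depth values are invented for illustration). DX10.1-style access lets a shader read each AA sample of the depth buffer individually, while a DX10.0-style shader only sees one value per pixel, which is why engines had to re-render depth in a separate pass to get per-sample data for post-effects:

```python
# Illustrative model of a multisampled depth buffer and the two access
# styles described in the article. Values are made up for the example.

class MultisampleDepthBuffer:
    """A width x height depth buffer storing several samples per pixel."""

    def __init__(self, width, height, samples):
        self.samples = samples
        # One list of per-sample depths per pixel, initialized to the
        # far plane (depth 1.0).
        self.data = [[[1.0] * samples for _ in range(width)]
                     for _ in range(height)]

    def load_sample(self, x, y, sample_index):
        # DX10.1-style: shaders may read each AA sample individually,
        # so the post-effect pass can consume this buffer directly.
        return self.data[y][x][sample_index]

    def resolve(self, x, y):
        # DX10.0-style: only a single resolved value per pixel reaches
        # the shader, so per-sample depth had to be regenerated in a
        # separate, costly render pass.
        per_sample = self.data[y][x]
        return sum(per_sample) / len(per_sample)

buf = MultisampleDepthBuffer(width=2, height=2, samples=4)
buf.data[0][0] = [0.2, 0.4, 0.6, 0.8]   # pretend geometry wrote these
print(buf.load_sample(0, 0, 2))          # one individual sample: 0.6
print(buf.resolve(0, 0))                 # single averaged value: 0.5
```

The averaged value hides where each sample's edge actually lies, which is also why the DX10.1 path could antialias some edges that the DX10.0 path missed.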

The following three quotes come from software developers; this effect applies to all DirectX 10 titles, and there is a good chance that you have already played their games. A (DX10.0) game developer close to Ubisoft, who requested to remain anonymous, told us that Ubisoft’s explanation walks on thin ice. Here is his response to our inquiry and his take on Ubisoft’s statement:

“Felt you might want to hear this out. Read the explanation and laughed hard … the way how DX10.1 works is to remove excessive passes and kill overhead that happened there. That overhead wasn't supposed to happen - we all know that DX10.0 screwed AA in the process, and that 10.1 would solve that [issue]. Yet, even with DX10.0, our stuff runs faster on GeForce than on Radeon, but SP1 resolves scaling issues on [Radeon HD 3800] X2.”

We received a second reply from another game developer, who is currently working on a DirectX 10.1 title that is fully compatible with DX10.0 hardware:

“Of course it removes the render pass! That's what 10.1 does! Why is no one pointing this out, that's the correct way to implement it and is why we will implement 10.1. The same effects in 10.1 take 1 pass whereas in 10 it takes 2 passes.”

A third email reply reached us from a developer at a multiplatform development studio:

“Our port to DX10.1 code does not differ from DX10.0, but if you own DX10.1-class hardware from either Nvidia or ATI, FSAA equals performance jump. Remember "Free FSAA"?”


Michael Beadle, senior PR representative at Ubisoft, and Jade Raymond, producer of the game, told us in a phone call that the decision to remove DX10.1 support was made by the game developers. Both told us that there was no external influence, which would mean that Nvidia did not participate in this decision. It was explained to us that the features were implemented and tested on a platform with DirectX 10.1 graphics during the development process, which led to their inclusion in the final code. However, that code was untested on a large number of DX10 systems, and that ultimately led to crashes or system instability.

Ubisoft's explanation indicates a classic case of QA failure, one that already happened with EA's Crysis as well: unfinished code was released as the final version. The changes the developers made caused instabilities with GeForce hardware, but owners of older ATI products (the Radeon 2900, for instance) should also be affected and can expect crashes or camera lockups.


Money? What Money? Oh, that money.

There is no information on whether DX10.1 will be re-implemented, and that fact makes the story look fishy. There are rumors that Nvidia may have threatened to pull out of co-advertising deals with Ubisoft, which are said to be worth less than $2 million. As a sane businessman, you don't jeopardize a cooperation over one title - and those $2 million are just one component of the cooperation between these two companies. Of course, we asked both companies for comment and received two different answers.

Derek Perez, director of public relations at Nvidia, said that “Nvidia never paid for and will not pay for anything with Ubi. That is a completely false claim.” Michael Beadle from Ubisoft stated that “there was a [co-marketing] money amount, but that [transaction] was already done. That had nothing to do with development team or with Assassin's Creed.”

So, Nvidia appears to have some sort of financial relationship with Ubisoft, just as it does with EA, Activision and other top publishers. Yes, AMD has a similar cooperation in place, but it is not as extensive as Nvidia’s program.


Conclusion

We leave it up to you to draw your own conclusions. Our take is that the Ubisoft team could have done a better job of bringing the game to the PC platform. The proprietary Scimitar engine showed a lot of flexibility when it comes to advanced graphics, but the team developed the DirectX 10.1 path without checking the stability of the DirectX 10.0 parts, causing numerous issues on the most popular DX10 hardware out there - the GeForce 8 series. The new patch will kill DX10.1 support, and it is unclear when DX10.1 will see another implementation. The first "victim" of this battle was Nvidia (on a title it supports); the second victim was AMD. Who really loses here, however, are the gamers, who are expected to pony up $50 or $75 (depending on where you live) for a title that was not finished.

Sadly, this is the way PC gaming is: there is a great game, but it is burdened with technical issues and ends up getting caught in marketing wars. We have no doubt that the development team made a mistake in the QA process and integrated a feature that caused instabilities with Nvidia cards. Given Nvidia’s market share, Ubisoft had to patch it. What we do not understand is the reason for an explanation that left more questions than answers.

But then again, only the developers at Ubisoft know what the Scimitar engine can and cannot do.

The solution is simple: if you run ATI hardware, just don't patch the game (once the patch comes out) and enjoy all the advantages of DirectX 10.1 - i.e. one render pass less.

Source

Re: Ubisoft caught in a little Assassin’s Creed scandal

Post by Blín D'ñero »

Ubisoft caught in Assassin's Creed marketing war
Author: Tim Smalley
Published: 12th May 2008


Ever since the release of Assassin's Creed on the PC, there has been a controversy brewing over the game's support for DirectX 10.1 and it looks as if things aren't going to calm down.

You see, owners of ATI Radeon HD 3000-series graphics cards benefitted from the inclusion of DirectX 10.1, as it enabled the game to run anti-aliasing in a single pass, delivering higher performance than the GeForce 9600 GT - an improvement to the tune of 20 percent.

However, Assassin's Creed has had several reports of stability problems—mostly from users with Nvidia hardware, according to a report on TG Daily—and this led to the announcement of a patch that would remove DirectX 10.1 support from the game.

Assassin's Creed is a part of Nvidia's The Way It's Meant To Be Played program and so it didn't take long for the conspiracy theorists to suggest possible foul play by Nvidia because the company doesn't have any DirectX 10.1 supporting hardware.

DirectX 10.1 gives the shader units access to all anti-aliasing buffers in a single pass – something that developers have been unable to do with DirectX 10.0. "DX10.0 screwed AA [performance]. . . . 10.1 would solve that [issue]," said one developer reportedly close to Ubisoft.

"Of course it removes the render pass! That's what 10.1 does! Why is no one pointing this out, that's the correct way to implement it and is why we will implement 10.1. The same effects in 10.1 take 1 pass whereas in 10 it takes 2 passes," added another anonymous developer, said to be working on a title that implements DirectX 10.1 support – in addition to DirectX 10.0.


Ubisoft confirmed that the decision to remove DirectX 10.1 support was made by the game developers and expressly denied any external influence. Michael Beadle, a senior PR manager at Ubisoft, admitted that there was some co-marketing between Nvidia and Ubisoft, but he said that "had nothing to do with the development team or with Assassin's Creed."

Nvidia, on the other hand, denied any financial agreement: "Nvidia never paid for and will not pay for anything with Ubi. That is a completely false claim," said Derek Perez, Nvidia's director of public relations. In the past, during our talks with Nvidia's Developer Relations Team, the company has pointed out that it spends a lot of money sending its own engineers to development studios to help support them – whether or not that is what Beadle was referring to is unclear.

I spoke to Richard Huddy, AMD's head of developer relations, on Friday in an attempt to find out when we can expect to see the path implemented again, because I'm sure that owners of Radeon HD 3000-series graphics cards aren't too happy that they're missing out on a 20 percent performance increase (when AA is enabled). Huddy said that he is working hard with his team to get DirectX 10.1 support back into the title for Radeon HD 3000 graphics card owners.

I pressed this point further on Saturday during a call with Nvidia spokesperson Ken Brown, and asked him whether Nvidia had requested that DirectX 10.1 content be removed from the game. "We aren't in the business of stifling innovation - it's ludicrous to assume otherwise. Remember that we were the first to bring DirectX 10 hardware to the market and we invested hundreds of millions of dollars on tools, engineers and support for developers in order to get DirectX 10 games out as quickly as possible," said Brown.

That response was to the point, but I felt it was worth pushing from another angle. I asked him if Nvidia ever signs exclusive deals with developers. "Every developer we've worked with on TWIMTBP has not been part of an exclusive arrangement - we do not prevent any developer from working with other hardware vendors," responded Brown. "Assassin's Creed is a great example of this because both Nvidia and ATI developer relations teams worked with Ubisoft to help during the development phase."

The remaining question of course is whether or not DirectX 10.1 support would be re-implemented into the game in a future patch. Brown said he didn't know the answer to that question - he explained that Nvidia has no influence in that decision and it would be up to the developer and publisher to decide whether it would return to the game.

When questions were put to Ubisoft, nobody there seemed to know either and that seems to be the problem – I'm sure somebody knows the state of play when it comes to DirectX 10.1 support in Assassin's Creed; it's just that nobody wants to say anything at the moment. You'd think one of the lead developers would know, but when Charles Beauchemin, the tech lead for the title, was asked about the future of DirectX 10.1 support in AC, he responded by saying "We are currently investigating this situation."

I hope that DirectX 10.1 support comes back to Assassin's Creed, but I'm honestly not very confident that it will at the moment. Whatever happens as we go forwards, stifling technology progression is not good for consumers and this is probably the first of many arguments revolving around DirectX 10.1. I hope it stops, because an industry that doesn't move forwards is not a healthy one – and I think many consumers would agree that right now is a time when PC gaming really needs to move forwards. Let's hope the PC Gaming Alliance does its thing and puts the consumers' interests first in instances like this.
Source

I am not interested in the patch 1.02. The game ran fine without it.

Re: Ubisoft caught in a little Assassin’s Creed scandal

Post by Blín D'ñero »

Ubisoft: No Direct 3D 10.1 support for Assassin's Creed planned
PCGH interview with Ubisoft

28.05.2008 02:45


Ubisoft told PCGH that there is no plan to reintegrate DirectX 10.1 with a future patch.

You might remember: the enormously successful Assassin's Creed was the first game to feature DX10.1 support. But patch 1.02 removed this feature.

PCGH was able to get some more information about this affair. Below you will find an email interview with Ubisoft.

PCGH: D3D 10.1 support in Assassin's Creed was a hidden feature. Why did you choose not to announce this groundbreaking technology?

Ubisoft: The support for DX10.1 was minimal. When investigating DX10 performance, we found that we could optimize a pass by reusing an existing buffer, which was only possible with the DX10.1 API.


PCGH: What features from Direct3D 10.1 do you use in the release version? Why do they make Assassin's Creed faster? And why does FSAA work better on D3D 10.1 hardware?

Ubisoft: The re-use of the depth buffer makes the game faster. However, the performance gains that were observed in the retail version are inaccurate, since the implementation was wrong and a part of the rendering pipeline was broken.
This optimization is only visible when anti-aliasing is selected; otherwise, both DX10 and DX10.1 use the same rendering pipeline.


PCGH: Why do you plan to remove the D3D 10.1 support?

Ubisoft: Unfortunately, our original implementation on DX10.1 cards was buggy and we had to remove it.


PCGH: Are there plans to implement D3D 10.1 again?

Ubisoft: There is currently no plan to re-implement support for DX10.1.
Source