Due to limitations intrinsic to video broadcasting—from capture to encoding to transmission—the final signal may not always fall neatly into the specified range. For a variety of technical reasons, data intended for grayscale level 2, for example, may slip to level 0 or be clipped altogether by the time it reaches your display.
This is much less of a concern in the all-digital distribution pipelines of today, but engineers at the time settled on the 16–235 range, building headroom and footroom around those upper and lower values so that signals slipping outside the nominal range could still be reproduced by the end user if desired.
In the land of video, 16 and 235 are reference levels, not hard limits. Any data below level 16 is referred to as blacker-than-black (BTB), and any data above level 235 is referred to as whiter-than-white (WTW). The two ranges simply possess a different number of tonal gradations between reference black and reference white. As long as the display is set to receive the range the source device is sending, reference black and reference white are the same intensity to the end user in both cases.
We can now complete the circle we began in the section on color space above (Section 6). YCbCr is limited range by design. Because of this, it is unaffected by the Reference Level setting on the Xbox; you can test this for yourself. It would be less confusing for consumers if the option were simply grayed out when in YCbCr mode.
As with video-based material above, we can now complete the circle. Using full-range RGB end-to-end can also help reduce banding in-game due to the additional levels of gradation available. If instead you set your Xbox to Standard, the GPU will render at 0–255, which will then be requantized to 16–235 upon output. Configuring the console for Expanded avoids this unnecessary step.
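As a sketch of what that requantization costs, here is how a full-range 8-bit code might map into video levels. The function name and the simple linear-rounding model are illustrative assumptions, not the console's actual implementation:

```cpp
#include <cmath>
#include <cstdint>

// Illustrative only: mapping a full-range (0-255) code into video levels
// (16-235). 256 input steps get squeezed into 220 output steps, so some
// neighboring gradations collapse into one, which is one source of banding.
uint8_t FullToLimited(uint8_t full) {
  return static_cast<uint8_t>(
      std::lround(16.0 + full * (235.0 - 16.0) / 255.0));
}
```

With this model, 0 maps to reference black (16) and 255 to reference white (235), while some adjacent input codes land on the same output code.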
The ultimate objective here is to ensure that source and sink, and everything in between, are in agreement. This keeps reference black and white output from the source in sync with the reference tones the display is expecting.
Which Reference Level to use becomes an easy answer if the display you are using can only reproduce one or the other. For example, most PC monitors have only one display mode, full range, so using Expanded here is the obvious choice. Keep in mind that most HDTVs and other home theater equipment are configured for video levels by default.
You may need to dig into your menus and toggle it over if you want to use the Expanded Reference Level for games. Remember, while full-range RGB is the best way to output games from the Xbox, it only works if the display expects full range. Make sure they match. If you send Standard range to a display expecting Expanded range, the image will appear washed out and your contrast ratio will suffer: see Quadrant 3 below.
Low-end detail gets the shaft because your source is placing black level at 16, while your display is placing black level at 0. Quadrants 1 and 4 are examples of Expanded-Expanded and Standard-Standard, respectively. The image below is used here courtesy of NeoGAF forums user Raist in this post.
Note 1: Many games offer an in-game brightness tool with instructions for how to pass the pattern. This should be the last thing you touch, if at all, and only on a case-by-case basis. You should never adjust TV controls to pass an in-game brightness pattern, because those settings may then be wrong for every other game, movie, etc. Instead, the preferred approach is to calibrate your display first, then bring up the brightness pattern included with the game and tweak the in-game brightness control, if necessary, to bring the game into alignment with your display calibration, not the other way around.
Note 2: Some models are known to truncate whatever signal they receive such that they only pass 16–235. Note 3: While most video mastering engineers strictly adhere to the 16–235 range, a few may intentionally place detail above 235. For example, white detail in clouds or sky, or in white clothing, may exceed reference white. If your display is calibrated to show a bit of WTW, you will see this extra detail.
Trunchisholm: Not yet BC. I really hope it's not just FPS Boost because, on my 4K TV, I'd prefer a resolution boost to an FPS boost.

Steverulez: If it was unlocked, maybe it was fine on those consoles, but 4K would've been nice. I don't think the PC version of UE was that great (it ran pretty poorly for me personally), so having an upgraded version on Xbox would've been nice.

Dead Man Typing: It was 60 fps only in multiplayer on XB1. The campaign was locked at 30, which is far from ideal. The thing is, they already showed the game running at 4K on Series X a year and a half ago. I've no idea why they're waiting so long to release a patch, hopefully with an FPS Boost option.

Tailzo: Hey, why not both? Yeah, for both! But lately we've only gotten FPS Boost.

Gatsbits: This list could be FPS Boost related, since all these games (Dead Space 2, etc.) run at 30 fps.

Another user: I have a lot of BC games installed. I just went to my Xbox a little while ago to see if any updated that haven't been mentioned here; I'll go back and check.

SteviaRelievingYa: Blacklist at 60 fps, and Double Agent boosted as well, would be amazing.

DarknessTear: Blue Dragon and Lost Odyssey need to be enhanced.

Another user: Watch these just be a new emulator build with general stability tweaks being pushed out.

Kapryov: Where are they pulling these titles from?

Reply: I think Eternal Sonata was a game that was being tested back before the BC program was stopped, so it could be a possible future addition. Not feeling confident about JSRF though, due to the music licensing. It's just them and me wishing they get BC.

Shane M: That's what my understanding was.
Timings of a 12 FPS frame (roughly 83 milliseconds) with the old render target cache: many long, badly parallelized operations can be seen, as render target copy and compute operations are preceded and followed by barriers (Halo 3, by Bungie). Before we start, compare the previous picture with the timing graph from the update that we have just released:
Timings of a frame with the new render target cache: 24 milliseconds (41 FPS), roughly 3.5 times faster. So, copying (ownership transfer) is done only when a range is actually used with a different format, width, or MSAA sample count. There are still issues with draws performed with clipping to a viewport — again, the range touched by those draws has to be hugely overestimated — but once CPU vertex shader execution is added to the emulator later, it will be possible to determine the exact eDRAM extents for common cases of such draws, completely eliminating spurious overlaps and ownership transfers that have no effect.
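The rule above can be sketched as a simple comparison: ownership of an eDRAM range only needs to be transferred when the range is reused with different surface parameters. The struct and function names here are hypothetical, not Xenia's actual code:

```cpp
#include <cstdint>

// Hypothetical sketch: the parameters that describe how an eDRAM range
// is currently interpreted. A mismatch in any of them means the data must
// be copied (ownership transferred) before the new usage.
struct RangeUsage {
  uint32_t format;         // color/depth format of the surface
  uint32_t pitch;          // surface width
  uint32_t msaa_samples;   // MSAA sample count
};

bool NeedsOwnershipTransfer(const RangeUsage& current,
                            const RangeUsage& next) {
  return current.format != next.format ||
         current.pitch != next.pitch ||
         current.msaa_samples != next.msaa_samples;
}
```

When the parameters match, the range can simply be reused in place, which is where the reduction in copies and barriers comes from.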
Other improvements come from the way copying between render targets itself is performed now. Previously, copying went through an intermediate buffer, using CopyTextureRegion-style copies between the render target and the buffer. Now, copying is done directly from one render target to another, by drawing quads with a pixel shader reading directly from the previous render target. For 16-bits-per-component and float32 formats, the actual bit representation is copied through an integer format view, so NaNs, denormal flushing, and encodings with multiple representations of the same value are preserved exactly; the same care is taken for the floating-point depth format, whose encoding can store values above 1.0.
Nonetheless, this still pales in comparison with all the issues of a CopyTextureRegion-based approach. Here are some frame time comparisons between the old render target cache, the new one, and the fully custom ROV-based output-merger, tested on an Nvidia GeForce GTX card.

Human perception of light is nonlinear: if we turn on two light bulbs, the room will not look twice as bright to us. Monitors and TV screens take that into account, and also display the image in a nonlinear way.
This can be illustrated by the following test of perceptual vs. linear brightness. The left and right parts consist of alternating black (0, 0, 0) and white (255, 255, 255) lines — half of the pixels physically being black, half being white.
This means that, on a CRT monitor, half the amount of photons (the amount of photons being a linear quantity) will be emitted compared to a solid white (255, 255, 255) rectangle. The solid rectangle on the top is colored (128, 128, 128) — numerically half the (255, 255, 255) white. However, it is the bottom rectangle that is visually much closer to the dithered pattern — and it has the (188, 188, 188) color instead. But why would a monitor, to display 128 — half of 255 — not actually emit half the light it emits to display 255?
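The numbers above can be reproduced with the standard sRGB transfer function, which closely models CRT response (a sketch; the helper name is ours): a 50/50 black-and-white dither emits the light of linear 0.5, and encoding linear 0.5 into 8 bits gives code 188, not 128.

```cpp
#include <cmath>
#include <cstdint>

// Sketch: encode a linear-light value in [0, 1] into an 8-bit code using
// the standard sRGB transfer function (linear toe below 0.0031308,
// 2.4-power segment above it).
uint8_t LinearToSrgb8(double linear) {
  double srgb = (linear <= 0.0031308)
                    ? 12.92 * linear
                    : 1.055 * std::pow(linear, 1.0 / 2.4) - 0.055;
  return static_cast<uint8_t>(std::lround(srgb * 255.0));
}
```

Calling `LinearToSrgb8(0.5)` yields 188, matching the rectangle that visually matches the dither.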
A gradient of colors in perceptual vs. linear space: see how the top gradient looks much more uniform than the bottom one. The top gradient is in gamma color space, where the brightness values we perceive are distributed uniformly, rather than the lightness values actually emitted, unlike in the bottom gradient. Framebuffers after tone mapping go directly (or almost directly) to the screen.
Color textures — such as diffuse or albedo maps — are authored on monitors that display gamma-spaced images, and in normal in-game lighting conditions they also end up on the screen with brightness close to that in the original texture. However, when shading a 3D scene, lighting values inside the game world are calculated using formulas based on the real-life behavior of light.
Thus, lighting has to be calculated using linear values. Just as the sum of light emitted by equal amounts of black and white pixels is halfway between the black level and the white level in linear space in real life (as in the test above), lighting amounts are measured in linear space in the game world too.
This means that before the result is written to a fixed-point framebuffer, it needs to be converted from the linear space. Values from color textures participating in lighting need the reverse conversion — from gamma to linear. Gamma-space vs. linear-space gradients illustrate the difference: here you can see bright blue being mixed with bright red. In the upper gradient, there is a relatively dark purple area in between, while the lower gradient stays much more uniformly bright along its range.
This is because the upper gradient is interpolated in gamma space. This is in part why blending has to be done in linear space. In a 3D scene, blending also corresponds to actual light calculations. Alpha blending is also used for transparent effects such as smoke, where the amount of light reflected by smoke particles is mixed with the lighting behind the smoke, with the factor being the concentration of smoke particles in the volume of the game world that corresponds to the pixel.
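The dark-purple band can be quantified. A 50/50 gamma-space mix of pure red and pure blue gives each channel code 0.5, which decodes to only about 0.21 in linear light — less than half the light a linear-space mix would produce. A sketch using the standard sRGB decode (the console's PWL curve differs slightly):

```cpp
#include <cmath>

// Sketch: decode an sRGB-encoded value in [0, 1] back to linear light.
double SrgbToLinear(double srgb) {
  return (srgb <= 0.04045)
             ? srgb / 12.92
             : std::pow((srgb + 0.055) / 1.055, 2.4);
}
```

`SrgbToLinear(0.5)` is roughly 0.214, so the gamma-space midpoint emits well under half the intended light, which is exactly the dark band seen in the upper gradient.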
Screen-space ambient occlusion (SSAO) scales the amount of ambient light by an estimate of how much of it can reach a region around the point on the surface. But blending with the current value in the framebuffer is done by the output-merger functionality of the GPU.
This means that the fixed-function GPU hardware — not the game code — has to be responsible for the conversion from linear to gamma, because blending must happen first: blending of linear values is done by the output-merger, and only after that is the resulting value converted to gamma. On PC GPUs, the color space used to represent gamma values is sRGB, which is also the color space directly used on CRT monitors, and is the standard color space on Windows, assumed the default for images on the Internet unless a different one is specified in the image file.
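For reference, the sRGB encoding just mentioned is defined (in IEC 61966-2-1) piecewise, with a linear toe near zero and a 2.4-power segment elsewhere:

```latex
C_{\mathrm{srgb}} =
\begin{cases}
12.92\, C_{\mathrm{lin}}, & C_{\mathrm{lin}} \le 0.0031308 \\[4pt]
1.055\, C_{\mathrm{lin}}^{1/2.4} - 0.055, & C_{\mathrm{lin}} > 0.0031308
\end{cases}
```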
This is a pretty difficult formula to implement in hardware, and according to the OpenGL sRGB extension specification, it is expected to be implemented via lookup tables. So, large hardwired lookup tables would likely take a lot of die space.
Instead, the Xbox 360 uses a piecewise linear (PWL) approximation of the gamma curve, consisting of four linear parts, that is easy to implement in hardware using simple integer math. This difference is one of the difficulties of emulation, as this curve, as we said previously, is used in fixed-function parts of the GPU outside our control: the output-merger and the texture sampling functionality.
The possible alternatives are: doing all the texture filtering in shaders manually (massive pixel shading performance overhead, especially for anisotropic filtering); pre-converting to linear while significantly increasing bit depth (big memory overhead, especially for compressed textures); or actually sampling as sRGB and converting from sRGB to PWL in the shader (not exact, but much closer to true PWL filtering).
With render targets , the effects of inaccurate gamma emulation are much more pronounced. So, the PWL gamma curve needs to be actually taken into account in render target logic as well. But we still need a solution for the conventional host render target output path. Previously, Xenia was simply converting the color value returned by the shader to PWL directly in shader code. However, this resulted in the issue that we began with — blending happens in gamma space in this case.
This actually worked fine in many cases — generally gamma-space blending still looks believable enough, just different. And actually, early GPUs supporting sRGB conversion — the ones targeting Direct3D 9 — were also not doing blending in linear space, and were getting away with that mostly fine!
Compare the translucent parts in the two screenshots of a Half-Life 2: Episode Two scene drawn with gamma-space and proper linear-space blending. However, there are cases where blending in the incorrect color space produces noticeably wrong results. One example is impact and footstep decals in Halo 3.
These decals are drawn with multiplicative blending, where the framebuffer value is multiplied by twice the source value. This blending setup can be used to let the decal both lighten and darken the underlying surface: when the source is smaller than 0.5, the result gets darker, and when it is larger, it gets brighter. Source value 0.5 is neutral, since multiplying by 2 × 0.5 = 1 leaves the destination unchanged. But if the linear-to-gamma conversion is performed in the shader before blending, the neutral 0.5 becomes roughly 0.74 in gamma space. This results in transparent areas being brightened instead — the equation becomes approximately 1.47 × destination instead of 1 × destination.
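That arithmetic can be checked directly, under the assumption that the decal blend multiplies the destination by twice the source, and using standard sRGB in place of the console's PWL curve (helper names are ours):

```cpp
#include <cmath>

// Sketch: standard sRGB encode of a linear value in [0, 1].
double LinearToSrgb(double linear) {
  return (linear <= 0.0031308)
             ? 12.92 * linear
             : 1.055 * std::pow(linear, 1.0 / 2.4) - 0.055;
}

// Multiply-by-2x decal blending: the destination is scaled by this factor.
// A factor of exactly 1.0 means the decal is neutral (invisible).
double Modulate2xFactor(double src) { return 2.0 * src; }
```

In linear space the neutral source 0.5 gives a factor of exactly 1.0, but converting 0.5 to gamma first gives roughly 0.735, so the factor becomes about 1.47 and "transparent" areas brighten the surface.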
This was one of the issues that were considered inherently unfixable in the old render target implementation. But the new update adds a workaround that works in certain games, such as Halo 3: emulation of piecewise linear gamma with real sRGB, but only for render targets and render-to-texture; regular textures are still fetched as piecewise linear.
For regular world textures, conversion is done from the usual PWL gamma, but render-to-texture data is converted from sRGB instead. Emulation of piecewise linear gamma as sRGB in framebuffer writing and render-to-texture (Halo 3, by Bungie).
This new approach, however, only works when the game relies solely on hardware gamma conversion for framebuffer writing and texture fetching. Games may also mix hardware and software conversion — for example, drawing the world with fixed-function gamma writing, but doing post-processing with manual conversion. Since the game knows nothing about us using sRGB instead of the expected piecewise linear gamma, using the new option may sometimes produce incorrect results.
Correct piecewise linear gamma throughout the entire frame rendering pipeline of the game, without the new option (Portal, part of The Orange Box, by Valve Corporation). Because of this, the new method of using sRGB for piecewise linear gamma is switchable with a configuration option and is disabled by default: the impact of mixed gamma spaces, where the entire image may have incorrect colors, is much more pronounced than that of incorrect blending, which is usually limited to translucent surfaces.
One of the possibilities provided by emulators is making games playable not just in their original form on a different system, but also with an improved experience. Resolution scaling — the ability to render the game at a graphics resolution larger than the original — is a common feature provided by emulators. Xenia is among the emulators supporting increased rendering resolution.