|
So for the last couple of years I've been a little bit dissatisfied with GSL's internet streams. They look washed-out. By that I mean that the blacks look kind of gray and the colors look dull.
Today, as I was watching the GSL finals stream on Twitch.tv, which also had this problem, I realized what they're doing wrong.
To explain it requires some knowledge about how analog video encoded the tonal ranges of broadcast images, and how that legacy is still poking its head into modern video production.
The original analog video broadcast standard transmitted each line of video as a continuous, varying signal. You could think of the brightness of each point on that line as a voltage. So, while there were 525 discrete lines of video in the frame, within a line there was just a continuously varying voltage representing brightness.
Because this analog signal was designed to be transmitted via radio, over the air, and picked up by TV owners' antennas, noise was a problem. This noise came from many places, including leakage from other electronic equipment, weather effects, and astronomical effects like the interaction between the sun and the ionosphere.
So, the designers of analog video defined "black" to be a signal level somewhere above the 0 point, so that noise would have less effect on the darkest darks in the scene. "White" was defined to be a particular intensity of signal above that, but of course there were no limitations to how strong the signal could be. The TV's automatic gain control was expected to scale the highest signals to be white, but there was no assurance that a stronger signal wouldn't get through.
When a color signal was added to video in the late 50s, it was added in a backward-compatible way, so that it could be broadcast alongside the luminance signal that had already made up black-and-white TV. This color signal decomposed into two components -- a blue/yellow channel and a red/green channel. The color space for this was referred to as YUV, with Y representing luminance, U representing blue/yellow, and V representing red/green.
In the early days of digital video in broadcasting, those lines of analog information in each broadcast frame were divided up into pixels, and each pixel represented by a triplet of values, representing Y, U, and V. At first, these channels were (at best) 8 bits each. However, because analog video allowed "sub-black" values and "super-white" values, the black to white range was encoded between 16 and 235, leaving 0-15 for sub-blacks and 236-255 for super-whites.
So, plenty of systems were developed that mapped black to 16 and white to 235, with the intention that they'd be correctly mapped into analog broadcast levels when converted back into viewable (and at that point, analog) video.
This standard has carried forward to today. Most broadcast video systems use a 16-235 range for their black-white range, and retain capability for sub-blacks and super-whites. Even newer systems with a lot more than 8 bits per channel retain proportional chunks of their ranges for sub-blacks and super-whites.
The problem comes when converting video encoded in this way for viewing on a computer. The sRGB color space, which is currently the standard color space for almost all display monitors, modern Macs, and Windows PCs, encodes black at 0 and white at 255 with no allowance for sub-black or super-white.
Thus, it's necessary for video engineers converting signals from broadcast systems for viewing on computer systems to stretch that 16-235 range to 0-255 at some point in the process. Many video editing packages know about this conversion and offer some control over it, but when preparing images for a stream that goes from broadcast equipment to, say, a Flash encoder, someone has to manually intervene to set it up correctly.
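To make that stretch concrete, here's a minimal sketch in Python (the function name is mine; real converters operate per-channel on Y'CbCr and treat chroma's 16-240 range separately):

```python
def limited_to_full(y):
    """Stretch a broadcast-range ("studio") luma value, where 16 is
    black and 235 is white, to the 0-255 full range computers expect.
    Sub-blacks and super-whites are clamped away in the process."""
    stretched = (y - 16) * 255 / (235 - 16)
    return max(0, min(255, round(stretched)))

print(limited_to_full(16))   # 0   (broadcast black -> full black)
print(limited_to_full(235))  # 255 (broadcast white -> full white)
print(limited_to_full(8))    # 0   (sub-black clamps to black)
```

Skip this step and every value passes through unchanged: blacks sit at 16 (dark gray on an sRGB display) and whites at 235, which is exactly the washed-out look described above.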
As DJWheat pointed out on the most recent Live on Three, GOMTV's production for the GSL at IPL 5 has been excellent. However, because of their emphasis on TV broadcasting, it's quite likely that they have overlooked this issue, particularly because the impact on the look of their images is disappointing, but not dramatic. It's possible that nobody with a broadcast engineering background has ever looked at their converted Internet stream.
I'm contemplating how to communicate this to the GOMTV folks, but I'm concerned that the language barrier and the subtlety of the technical discussion involved will lead to nothing. As a professional with many years of experience in digital imaging who has had to deal with these issues for work, I can say that these matters are not well-understood in many cases even by the people who need to know about them to produce the best-quality output.
So, any thoughts are welcome.
Edit: I have examined a range of GOMTV VODs as well as the Twitch.tv recording of tonight's broadcast, and none of them exhibit this problem. However, I have a screenshot from the live broadcast of tonight's award ceremony that does exhibit it, and I have seen the problem consistently in their live streams.
And here's this same image corrected assuming that 16 is a saturated black and 235 is a pure white:
Here, by the way, is a similar image off the Twitch.tv replay. It's so far in the other direction that I kind of think someone manually applied an adjustment to fix it:
In the histogram of the image, the darkest values start around 17 and the brightest appear at about 250, with most of the whites occurring at 229 and below but some getting a little brighter. This tells me that the problem is probably NOT doubled-up, but is likely present.
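For anyone who wants to run this check on their own screenshots, the diagnostic logic is simple enough to sketch. This toy example builds a histogram from a synthetic scanline confined to the broadcast range; a real check would read pixel data out of a screenshot (Photoshop's histogram panel does the same thing visually):

```python
# A stand-in for one scanline of an unconverted frame: luma values
# confined to the broadcast 16-235 range.
scanline = list(range(16, 236))

# Build a 256-bin histogram, one bin per possible 8-bit value.
hist = [0] * 256
for v in scanline:
    hist[v] += 1

darkest = next(i for i, n in enumerate(hist) if n > 0)
brightest = max(i for i, n in enumerate(hist) if n > 0)
print(darkest, brightest)  # 16 235: the signature of a missed stretch
```

If a frame with known deep shadows and bright highlights bottoms out near 16 and tops out near 235, the 16-235 to 0-255 stretch was probably never applied.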
The fact that the VODs do not illustrate the problem suggests that there may be some difference between their live-streaming conversion workflow and their VOD recording workflow.
Edit 2: I did mail the GOMTV account on TL asking them to take a look at this blog and to pass it along to their technicians. Maybe it will help!
|
I'll throw out the additional observation that it's precisely GOMTV's emphasis on broadcast television and their use of professional equipment that makes them susceptible to this problem!
Edit: Wow, thanks for featuring this blog post within minutes! Hopefully this helps get these thoughts to the GOMTV folks.
I want to point out one additional fact: I took a few screenshots of the GSL stream today and opened them up in Photoshop to look at the histogram, which shows how values in the image were distributed between 0 and 255. It was very clear that the darkest blacks were elevated and the highest whites were lowered a bit. I only had a few minutes to look at this, so I couldn't check the exact values, but it suggests that something like what I describe is going on. Most typical Twitch.tv streams broadcast using Xsplit or whatever do not have this problem.
Edit 2: I have edited the OP to include a screen shot along with its corrected version.
|
Ha. As someone who knows about this stuff, you might be spot on here. I, too, have the "something's odd" feeling on GOM's stream. However, it never occurred to me that it might be the colour range. As a Free Quality viewer, I simply assumed the washed-out look came from the poor resolution.
Great job finding out, and I think you've done the explanation quite well. As to the problem of communicating with GOM, no idea.
|
I came to this blog expecting jokes about how tasteless has lost his passion. I am disappointed.
|
I'm working on including some pictures to show what I mean. IBringUFire: Thanks for the confirmation. I think the thing to do is link this blog to Mr. Chae in a PM and hope he can pass it along to someone who might find it useful. Kerotan: Tasteless may be susceptible to jet lag, but I don't think he's lost his passion! :D
|
After some thought, I think your best bet would be to go through Blizzard. It's in their best interest to make SC2 better so they would get the issue solved the quickest.
Edit: nVidia says it's called Dynamic Range:
|
iTzSnypah: I have some back-channel options to pass this info along through Blizzard, but I guarantee that they're inundated with a lot more customer feedback by far than GOMTV is, so starting with GOMTV might be a better bet.
|
On December 02 2012 20:39 iTzSnypah wrote: Edit: nVidia says it's called Dynamic Range:
That's one term for it, but I've seen it called various things. I've seen "YUV" used for the narrower range and "YUV-full" used for the 0-255 range. I'm not sure there's consistent terminology.
However, I am certain that it's an issue on their end. In fact, looking at the values in the file I'm examining, it looks like this problem might be doubled-up, that is, that the image values may be compressed into a narrower range like 32-223 or something like that. I'm not sure about this -- I'm installing Photoshop on a different computer now to have a look.
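As a sanity check on that "doubled-up" theory, applying the 0-255-into-16-235 squeeze twice lands in roughly that neighborhood (a rough sketch; the function name is mine):

```python
def full_to_limited(y):
    """One pass of the broadcast squeeze: 0-255 mapped into 16-235."""
    return round(16 + y * (235 - 16) / 255)

# If already-limited video were mistakenly treated as full-range and
# squeezed again, black and white would end up at roughly:
print(full_to_limited(full_to_limited(0)))    # 30
print(full_to_limited(full_to_limited(255)))  # 218
```

So a histogram spanning about 30-218 would point to two passes of the conversion, while 16-235 would point to one.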
|
Annoyingly, I seem to have accidentally deleted or otherwise lost my screenshot from the award ceremony that illustrated this. Also, I notice that the Twitch.tv VODs do not have this problem, while the live broadcast on Twitch.tv did! How strange!
Edit: Located it, it's now in the OP
|
This YUV range is not the cause of the problem you are talking about. Every single video you have seen on your TV was in the 16-235 range, and didn't look "washed-out". You might have a better monitor than usual, but for most monitors, it will not change anything. The washed-out aspect is more likely to originate in the encoding.
|
Note: I originally said that blacks were adjusted by varying the "gain" on channels in a TV set. The actual parameter is "bias." "Gain" affects the white levels. Sorry, it's been a while since I've had to deal with these issues myself.
On December 02 2012 21:49 Denar wrote: This YUV range is not the cause of problem you are talking about. Every single video you have seen on your TV was in the 16-235 range, and didn't look "washed-out". You might have a better monitor than usual, but for most monitors, it will not change anything. The washed-out aspect is more likely to originate in the encoding.
Thanks for the comment, but it's unfortunately not completely correct. You are right that it's an encoding issue, but encoding is all we're discussing here, specifically the encoding specification for the video.
Televisions are specifically set such that the black level (which is 16 in the usual broadcast encoding) is the darkest value that can be displayed. Broadcast color bars include both sub-black (0) and black (16) bars, and the usual procedure is to adjust the TV's channel bias so that in each channel the sub-black and black are visually indistinguishable. There's a white bar that's handled somewhat differently, in that its luminosity is measured by the person adjusting the monitor.
Here's an example of what the SMPTE color bars look like. In the lower right you can see the sub-black, black, and slightly above black bars. On a properly adjusted TV broadcast reference monitor, the black and sub-black are not visually distinguishable, but because you're looking at this image on a computer monitor with software that assumes a 0-255 black to white range, you can see the difference.
Images that are displayed on computers that use the sRGB standard peg black at 0, so if you take an image that's encoded in the broadcast (16-235) range and display it using software that assumes an input source that's sRGB, you're going to get washed-out blacks and muted whites. You're also going to get minor color shifts that are a result of sRGB not matching the HD standard (called Rec. 709) but unless you're using a color chart to compare you're unlikely to notice those differences.
In my work, we use high-end broadcast equipment to generate our image output, and the movies that are generated for looking at our work in progress are in the 16-235 range because they're intended for display on video equipment that assumes that. When we send out to a client, we have to manually convert these images to 0-255 so that they appear similar on our client's computers to what they look like on our video monitors. The reason is because the client's computers assume a 0-255 encoding for black-white while our video equipment assumes 16-235.
This effect is a real thing. On occasions when we've forgotten this step, clients have complained that the blacks do not look completely black.
|
On December 02 2012 22:00 Lysenko wrote: [quoted in full above]
Ok, I see. Nice to learn something today
|
Yeah, this stuff can be very hard to get one's head around.
Here, by the way, is what those color bars should look like (or close to it) on a properly adjusted broadcast monitor:
Edit: If you ever wondered what those color bars are for, hey, that's one major thing!
Another thing they've been used for in the past: the color channel for analog TV required manual balancing of the U and V channels, which would be done by hitting a special button on the display monitor to show only the blue channel and adjusting the balance between U and V until alternating bars all looked the same brightness. I would expect that this wouldn't be necessary for digital TV, because Y, U, and V are all maintained as separate channels in transmission.
Edit 2: While I was viewing the GSL stream on a TV set, it was being sent to the TV via HDMI from a Mac laptop's Mini DisplayPort output and run through a converter. In that connection, it's pretty clear that all these issues are worked out by the various standards that define how these systems work. Images or movies that looked good on the laptop's display looked fine on the TV, but something that looked washed-out on the display looked washed-out on the TV as well. It may well be that the relevant standards are taking the 0-255 on the Mac monitor and converting them to 16-235 for transmission over the HDMI cable, but if so it's transparent and the TV handles it appropriately.
Edit 3: WOW look at how the midtone gray on the left matches the value of the blue in the TL forum theme!
|
Amazing how you came up with this, hope GOM checks it out and fixes their broadcasts.
|
I love learning new things. Thanks so much for this really interesting post!
|
Wow, I'm blown away by your knowledge bombs! I'll look in the AMD graphics driver later this evening and see if there is a setting that can work around this until you've reached out to GOM! Edit: AMD driver setting: + Show Spoiler +
|
I've never really noticed anything (always assumed any quality issues were due to the free "sq" stream).
On December 02 2012 23:26 AmericanUmlaut wrote: I love learning new things. Thanks so much for this really interesting post!
|
On December 03 2012 00:25 y0su wrote: I've never really noticed anything (always assumed any quality issues were due to the free "sq" stream).
I might have assumed the same thing if I had been watching the free streams, but I've been a paying subscriber to GOMTV and while the resolution is great for the better streams, this issue is still there.
GOMTV is improving their streams in a number of ways for 2013, so I hope they can perhaps address this as well, particularly if it's just a software configuration fix!
|
Great post. Just learned a lot about TV.
|
*head meets desk*
I have always wondered, from a photographer's point of view, where that problem could come from, and assumed that if the issue arises with video cameras it should show up with photo cameras as well, but the only places I noticed it were when stuff was badly edited.
Thanks for teaching all of us a little bit about old standards wrecking us in today's world. =D
|
The color level issues also apply to MLG and DH and a lot of other organizations that go beyond simple screen / capture card setups. As soon as you start using analogue gear in your video chain, problems like this appear, and even all-digital equipment can cause this for "compatibility" with TV. Interlacing is a big problem too, as shown during IPL 5 and any EG events (IPL applied de-interlacing, but the resolution loss is still there). Finally, you have sync issues where you can sometimes see frame tearing due to different sync timings on capture cards / outputs.
|
thanks for this post. i don't even really watch SC2 streams that much nowadays, but i learned a lot of cool stuff here.
|
This was not what I was expecting from the blog name at all. Pleasantly surprised, although I can't say I've ever noticed it. Maybe it's more noticeable in HD.
|
Something about this is incredibly entertaining to me :D thanks for figuring this out/sharing.
|
Hey R1ch, I've watched a lot of the MLG streams and don't remember seeing much if any of this problem, but I'll sure be looking for it now.
What's funny is that it's probably all coming from digital systems operating in compatibility modes to support outdated signal standards. I doubt that too much analog equipment remains in GOMTV's image stream but this stuff is very hard to get away from.
Like you mention, interlacing is another great example. Interlacing was invented to reduce bandwidth when broadcasting using analog systems for display on CRTs, whose phosphors glow for a bit after being illuminated. Nobody uses CRT displays anymore except in certain very narrow applications (for viewing HD, that is; of course there are millions of SD sets still in service), but most countries' HD broadcast standards require that the 1920x1080 mode be broadcast interlaced, DESPITE the fact that the digital broadcasting standard permits adjusting the amount of video compression anyway, independently of that. Then, on the other end, the TVs, which are generally not devices that support actual interlaced display modes, have to do all kinds of tricky signal processing to produce a nice-looking image.
I think TV viewers would see a lot more objectionable artifacts, except for the fact that almost all pre-recorded content on mainstream TV is shot at 24 fps 1080p and the round-trip conversion to 1080i/60 fields per second for broadcast and back again is something the TVs have gotten pretty good at doing automatically.
Of course that 24 FPS standard too is a throwback to the early part of the century when motion picture film stock didn't have the tensile strength to be run much faster through a camera than that. Even current film stock can handle higher frame rates without damage, and of course all modern digital cinematic cameras support frame rates to 48 and above.
In a world with no analog broadcasting and no CRTs, we should not have 16-235 video color spaces or interlacing at all. 24 fps I am more ambivalent about because it has a characteristic look that I have learned to love as a filmmaker, and using a higher frame rate would have a direct negative impact on the cost and feasibility of visual effects work. Basically, 24 fps makes digital visual effects a lot easier and cheaper, since a lot of work is still manual frame-by-frame labor.
|
According to the meta info found in GOM's VODs, GOM uses this capture card: a Blackmagic DeckLink, which operates only with digital signals. The 1080i stream of GOM's GSL shown on Korean TV "AniBox HD" is also wrong: it's zoomed out and has a small letterbox around it.
The people at GOM are just incompetent.
@Lysenko: 24fps is cheaper... 48/60fps is still the way to go, imo. 24fps will slowly die out in the next 10 years.
|
On December 03 2012 12:55 KMM wrote: @Lysenko 24fps is cheaper... 48/60fps is still the way to go, imo 24fps will slowly die out in the next 10years
I think the jury is totally still out. A lot of people who have seen footage from The Hobbit in 48 fps are unhappy with the look, including some non-experts.
That said, I think the extra costs in post production will be the thing that prevents a wholesale switch to higher frame rates.
Storage costs are an issue, and while storage costs are always dropping, visual effects people have been pretty good at expanding to fill all available space with things like massive caches for scene data and higher image and geometry resolutions. Furthermore, techniques like painstaking manual frame-by-frame painting and rotoscoping are essential tools that (almost) directly double in cost going to 48 fps. Render CPU resources also double, and that's another area where artists and software developers fill all available time no matter how fast the systems get even at 24 fps.
Production is an issue too. With most productions shooting at 4k these days, on-set data storage requirements (which because of the need to be ruggedized, fast, and portable can be extremely expensive) would double as well.
I think it's a very real possibility that a studio looking at 48 fps anytime in the next 20 years will find that visual effects costs go up 30% by making that decision, and that right there will kill it for a visual-effects-heavy film that doesn't have a James Cameron or Peter Jackson to push the issue. We'll see though, there are obviously people who love the look and want to make it happen.
Then there's the question of how to convert a 48 fps film to 1080i/60, lol...
Edit: Worth noting that the studios are already incurring a similar hit today for 3D, though there are some economies in 3D that might not apply in a higher frame rate. However, 3D can provide a much more different experience than a frame rate bump can, and the theaters can charge more as a result, so there's a better business case for it. Going 24-48 is comparatively subtle.
Edit 2: I expect The Hobbit in the vast majority of theaters and on Blu-Ray to be downsampled to 24 for both bandwidth and compatibility reasons. In fact, the reason they're choosing 48 rather than 60 is specifically for that.
|
Why would you still use 1080i in the future? As you already stated, interlacing is just a cheap method to save bandwidth, but as 48p becomes "standard," TV broadcasts will be in 1080p.
Btw, 48p -> 60i is done the same way as 24p -> 60i, by repeating frames and making the video choppy. All you say against 48p is costs and data storage, and both factors get smaller in the future...
edit: 3D is imo still overrated, and afaik applying visual effects to a (real-footage) 3D movie costs more than 2D 48p...
Why not just shoot in 60fps? That's way smoother than 48p, and most devices that can handle 1080p24 can also handle 1080p60. 60fps can be made into 30fps without problems, and the Blu-ray specification (AVCHD) allows 60fps.
As for interlacing and TVs: all TV shows are 60fps, ads are mostly 60fps (or 30fps), movies/series are 24fps.
I live in Germany so it's a whole lot different: all TV shows are 50fps, ads are 50fps (or 25fps), movies/series are 25fps (PAL speedup).
You say TVs got good at triple-conversion deinterlacing; that's wrong. Motion-adaptive deinterlacing is what TVs and PC GPUs got good at: upscaling the interlaced content to progressive while maintaining frame rate. Most TVs will not even recognize that they're playing back 24fps; they will just deinterlace the 60i to 60p even if there are dupe frames.
|
Edit: 720p in the U.S. is broadcast at 60 frames per second, not 30. My mistake. I've corrected this in this post.
On December 03 2012 14:24 KMM wrote: [quoted in full above]
The reason 1080i isn't going away is that 720p/60 and 1080i/60 are the two licensed HD broadcast formats in the US and certain other countries, mostly in Asia. They're a lowest-common-denominator that everyone has to support (in terms of signal format. You can legally sell a 720p TV that downconverts 1080i).
There are many TVs out there incapable of handling a 1080p signal, so the standards will require supporting 1080i for the foreseeable future.
The dynamic of storage and CPU is a lot more complicated than you suggest. The upward pressures on each, without regard to frame rate, keep pace with the technology in vfx. In fact, I see render times getting LONGER even as computers get faster, in order to achieve more accurate realism. Disk space is the same -- there are powerful techniques that require massive amounts of cached data, and as storage gets cheaper those techniques become more common and fill it up.
Waiting 20 years will not help this because we're going to be disk and CPU bound for that entire time even if formats stay the same.
24p to 60i conversion is actually a pretty smooth process. Each 24p frame is displayed for either 1/30 or 1/20 second, occupying 2 or 3 fields respectively, and all the lines in the frame always wind up being displayed. (A field, in interlaced video, consists of either the even or the odd lines in a frame.) Each frame alternates between those lengths so there's no sensation of choppiness. 48p-60i is something else again. You have to throw away half the image information for four fields and then show a full frame on the 5th. This will look choppy, so it's more likely that the procedure with The Hobbit will be to master to video by using a 24 fps source.
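The 2-3 cadence described above can be sketched as follows (ignoring the even/odd field parity that real telecine alternates between):

```python
def pulldown_2_3(frames):
    """Telecine 24p frames into 60i fields: frames alternately occupy
    2 then 3 fields, so four film frames fill ten fields (1/6 s at
    60 fields per second)."""
    fields = []
    for i, frame in enumerate(frames):
        fields.extend([frame] * (2 if i % 2 == 0 else 3))
    return fields

# Four film frames become ten fields:
print(pulldown_2_3(["A", "B", "C", "D"]))
# ['A', 'A', 'B', 'B', 'B', 'C', 'C', 'D', 'D', 'D']
```

Note how every frame's content reaches the screen; 48p has no such clean cadence into 60 fields, which is the choppiness problem described above.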
Yes, TVs are decent at motion-adaptive deinterlacing, but speaking as a former television engineer for Mitsubishi Electric, which pioneered a lot of this digital HD technology, I can tell you that many TVs do specifically go looking to identify 24fps source material and show full frames at 1080p/24 or 60 rather than just deinterlacing the 1080i input. I know this because I was in the room in some instances where it was put in. However, whether that feature is there is irrelevant to this discussion.
Finally, 3d is probably somewhat more costly in visual effects than doubling the frame rate, but in many ways they're similar. Additional costs in 3d pretty much come down to extra labor in layout and animation departments plus possibly some up-front software development costs that can be amortized over multiple shows.
Anyway, I've been working in these and related fields for pushing 18 years now, so I'm pretty sure of myself on the way the technology and costs scale over time. As for what fads or bandwagons the studios will jump on, that's harder to predict. I just think on frame rates the judgment will be that it's too costly for too small a benefit.
|
On December 03 2012 16:18 Lysenko wrote:Show nested quote +On December 03 2012 14:24 KMM wrote: why would you still use 1080i in the future. As you already stated interlace is just a cheap method to save bandwidth, but as 48p gets "standard" tv broadcasts will be in 1080p
btw 48p -> 60i is done the same way as 24p ->60i by repeating frames and making the video choppy all you say against 48p are costs and data storage, but both factors get smaller in the future...
edit: 3d is imo still overrated and in afaik applying visual effects on a (realfootage) 3d movie, costs more than 2d 48p...
why not just shoot in 60fps, thats way smoother than 48p and most devices that can handle 1080p24 can also handle 1080p60 60fps can be made into 30fps without problems, bluray specifications (avchd) allows 60fps
as for interlacing and tvs: All Tv-shows are 60fps Ads are mostly 60fps (or 30fps) movies/series are 24fps
i live in germany so its a whole lot different: all tv-shows are 50fps ads are 50fps (or 25fps) movies/series are 25fps (pal-speedup)
You're saying TVs got good at triple conversion deinterlacing; that's wrong. Motion-adaptive deinterlacing is what TVs and PC GPUs got good at: upscaling the interlaced content to progressive while maintaining the frame rate. Most TVs will not even recognize that they're playing back 24fps material; they'll just deinterlace the 60i to 60p even if there are dupe frames.
|
The reason 1080i isn't going away is that 720p/30 and 1080i/60 are the two licensed HD broadcast formats in the US and certain other countries, mostly in Asia. They're a lowest common denominator that everyone has to support (in terms of signal format; you can legally sell a 720p TV that downconverts 1080i). There are many TVs out there incapable of handling a 1080p signal, so the standards will require supporting 1080i for the foreseeable future.

The dynamic of storage and CPU is a lot more complicated than you suggest. The upward pressures on each, without regard to frame rate, keep pace with the technology in VFX. In fact, I see render times getting LONGER even as computers get faster, to achieve more accurate realism. Disk space is the same -- there are powerful techniques that require massive amounts of cached data, and as storage gets cheaper those techniques become more common and fill it up. Waiting 20 years will not help this, because we're going to be disk- and CPU-bound for that entire time even if formats stay the same.

24p to 60i conversion is actually a pretty smooth process. Each 24p frame is displayed for either 1/30 or 1/20 of a second, occupying 2 or 3 fields respectively, and all the lines in the frame always wind up being displayed. (A field, in interlaced video, consists of either the even or the odd lines in a frame.) Each frame alternates between those lengths, so there's no sensation of choppiness. 48p to 60i is something else again: you have to throw away half the image information for four fields and then show a full frame on the fifth. This will look choppy, so it's more likely that the procedure with The Hobbit will be to master to video using a 24 fps source.

Yes, TVs are decent at motion-adaptive deinterlacing, but speaking as a former television engineer for Mitsubishi Electric, which pioneered a lot of this digital HD technology, I can tell you that many TVs do specifically go looking to identify 24fps source material and show full frames at 1080p/24 or 60 rather than just deinterlacing the 1080i input. I know this because I was in the room in some instances where it was put in. However, whether that feature is there is irrelevant to this discussion.

Finally, 3D is probably somewhat more costly in visual effects than doubling the frame rate, but in many ways they're similar. Additional costs in 3D pretty much come down to extra labor in the layout and animation departments, plus possibly some up-front software development costs that can be amortized over multiple shows.

Anyway, I've been working in these and related fields for pushing 18 years now, so I'm pretty sure of myself on the way the technology and costs scale over time. As for what fads or bandwagons the studios will jump on, that's harder to predict. I just think on frame rates the judgment will be that it's too costly for too small a benefit.
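The 2:3 pulldown cadence Lysenko describes (each 24p frame filling alternately 2 or 3 of the 60 fields per second) can be sketched in a few lines of Python. This is purely an illustrative toy, ignoring which of a frame's two source fields lands in each slot; `pulldown_2_3` is a made-up name, not any real library's API:

```python
def pulldown_2_3(frames):
    """Map 24p frames to 60i fields: each film frame fills alternately
    2 or 3 fields, so 4 frames -> 10 fields -> 1/6 second at 60 fields/s,
    which keeps 24 fps material in sync with 60i video."""
    fields = []
    for i, frame in enumerate(frames):
        n = 2 if i % 2 == 0 else 3          # the alternating 2:3 cadence
        for _ in range(n):
            parity = 'top' if len(fields) % 2 == 0 else 'bottom'
            fields.append((frame, parity))
    return fields

fields = pulldown_2_3(['A', 'B', 'C', 'D'])
assert len(fields) == 10                    # 4 film frames fill exactly 10 fields
assert [f for f, _ in fields].count('B') == 3
```

Every line of every frame shows up in at least one field, which is why this cadence looks smooth, while a 48p-to-60i conversion would have to drop half the image information in four of every five fields.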
|
On December 03 2012 17:12 EtherealDeath wrote: 18 years? Dang how old are you o.o
I'm 41, turning 42 in June. I graduated college young, so I started my first real job at age 21, but it was a couple years later that I joined Mitsubishi, and I went to Disney Animation when I was 24, in April of 1996. (So "pushing 18 years" means a little less than 17 and a half. My 18th anniversary of joining Mitsubishi will be around June of 2013.)
Lots more detail here, if you are interested in the whole story:
http://www.teamliquid.net/blogs/viewblog.php?topic_id=230005
|
He's 41, I believe. Like a boss :D
Great read and replies... I don't understand a lot of it (not even close!), but I thought it was really interesting to go through
|
On December 03 2012 17:23 Aerisky wrote:Great read and replies... I don't understand a lot of it (not even close!), but I thought it was really interesting to go through
Thanks man, I love this technology. Keeps me going at work when the specifics are getting me down.
Edit: I was around 26 when all my friends were telling me OH MAN YOU HAVE TO TRY STARCRAFT! Too bad I never did, or I'd probably have been a lot better at the game and have a 10+ year TL account. :D
|
Yeah fuck rotoscoping 48 fps is all i have to say on that issue. Interesting blog this. Made me learn a thing or two for work actually.
|
This is a pretty nice knowledge bomb, thanks for this
|
On December 03 2012 18:30 unkkz wrote: Yeah fuck rotoscoping 48 fps is all i have to say on that issue. Interesting blog this. Made me learn a thing or two for work actually.
All you do looking at a huge long roto is sigh and start clicking. :D
|
Wow, they actually broadcast 720p30 in the US?!? In Germany every HD channel is either 720p50 or 1080i50.
I think high fps is still way too underrated; native 60fps HD content looks soooooo good to my eyes. 24fps is just wrong, a mediocre frame rate dating back to the time when you had to physically cut movies frame by frame...
You're right about 48fps conversion, but let's be honest, what converts to 60i better than 60p does, haha
|
KMM: I agree that there are advantages to a higher frame rate. And, producers might accept increases in cost to get there. However, there will be increases in cost.
Also, I was in error. The U.S. broadcast standard specifies 720p to be broadcast at 60 fps, not 30. Sorry!!
|
24 fps gives you more of a "movie feeling," whereas the HFR formats look "too real" -- that is the issue with them. I feel that watching a movie at high frame rates just feels wrong and looks wrong. A bit hard to explain, really.
|
Of course there will be increases in costs, but as these formats standardize, costs decrease, till the next new format comes to increase them again xD
|
On December 03 2012 23:46 KMM wrote: Of course there will be increases in costs, but as these formats standardize, costs decrease, till the next new format comes to increase them again xD
My comments about costs are specific to the production costs to the studios making the content. Unlike in manufacturing, there are not economies of scale that make it substantially cheaper to do the 100th time than the first. Also, as I pointed out, there are specific technological dynamics in visual effects that tend to undo the decreases in storage and CPU cost over time*. There may be a point where the work doesn't expand to fill available resources, but we're a long way from it.
If we're talking about the consumer electronics world, that's a very different question and the barriers are more about standardization and regulators not wanting to render people's existing products obsolete.
* For example, in 1999 I worked briefly on a major summer effects-driven movie project I won't name (because for reasons beyond our team's control our company at the time walked away from the project and it went elsewhere.) That film had 271 visual effects shots in the entire movie. Today, a similar movie would contain over 1000. The rendering technologies we use today (which include lots of raytracing and using things like high dynamic range environment maps to light the scene) were unthinkable then, because the computers were 100x slower. Also, clients demand progressively increased complexity and doing more and more of the work in post to save production costs.
My colleagues and I sometimes stop to wonder if, in our careers, we'll hit a point where rendering technology is no longer CPU limited. This would be characterized by fully photorealistic lighting in real-time. I do not know, but I suspect not -- some images today take 48 hours or more to render, so even with judicious application of preview modes and so on we need another factor of maybe 1000 in performance, and that assumes that someone doesn't come up with the much better alternative technique that's 1000x slower.
|
Or someone finds a way/method/algorithm to make it less CPU-demanding (e.g. something like CUDA)... You can also cluster the whole network together to do the processing; that would speed it up too.
Real-time rendering... hmm... I guess we'll see in the future. Some computer games do a really good job with HDR, for example, but for now it's like you say: massive rendering time involved.
|
Sorry for asking here, but... does anyone know/suspect the technology behind the stream for BlizzCon 2011 (the main SC2 event and the NA and EU regionals)? To this day I don't think I've seen a similar level of stream quality. In retrospect, I believe it might have had a higher fps than 24/25, maybe 29/30 fps. It was 1080p and it was damn smooth and sharp: no blocky artifacts, real black. And it had a review/scrollback option too. And it was free.
|
I wondered that too. Adjusting for broadcast levels shouldn't give you this washed-out look. If it's not from the camera or the crew, I'd say this looks like typical h264 over-compression (for streaming, not VOD). It literally destroys contrast by compressing the luminance channel if you're not very careful with the settings... and SC2 is a nightmare for encoders: details in motion everywhere, fast pans from observers, zooms...
Since it's impossible to specifically fine-tune the encoder for certain scenes (you wouldn't encode a baneling bust and a TvT positional war with the same settings), I believe they chose the least bad settings they could find that would guarantee that a 200 vs 200 battle doesn't look like 4 rectangles moving around.
This also explains why the problem is less visible in their VODs.
I personally don't mind having a less contrasty image in the stream (which will be most visible with non-SC2 footage) if it makes the in-game visuals more pleasant.
Random ideas: I don't know what the workflow is at GOM TV, but since they are working with different kinds of footage (in-game + standard camera), they should treat them differently. For example, capturing in-game SC2 footage at 60fps and non-in-game footage at the regular 25fps. That way, the game looks much more alive, and you don't get the sitcom look with the interviews. Also, since they are able to control what the game looks like before capturing, they could maybe tweak the graphics driver to optimize the footage for maximum detail, so they have more freedom in color correction later on.
|
On December 05 2012 03:52 renkin wrote: Adjusting for broadcast levels shouldn't give you this washed out look.
I agree, it is a worse problem than you would get if that were the only issue.
Failing to adjust for broadcast levels will give you a washed-out look, however that doesn't mean that there isn't more going on. Like I mentioned, the histogram for the image did start at 17, which is a pretty good smoking gun pointing at broadcast levels as at least some of the problem, but most of the darks in the image were well above that level, so there might be multiple stages of mis-encoding going on.
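For anyone curious what the missing correction looks like, expanding studio-swing levels to full range is just a linear remap. Here is a minimal per-sample sketch in Python, using the standard 8-bit limits (black = 16, white = 235); `studio_to_full` is a name I made up for illustration:

```python
def studio_to_full(y):
    """Expand one 8-bit studio-swing luma sample (black = 16, white = 235)
    to full PC range (black = 0, white = 255), clipping anything outside."""
    return min(255, max(0, round((y - 16) * 255 / 219)))

assert studio_to_full(16) == 0      # broadcast black becomes true black
assert studio_to_full(235) == 255   # broadcast white becomes full white
```

Skip this step on playback and broadcast black sits at 16/255 gray, which is exactly the washed-out look described in the original post; a histogram bottoming out around 16-17 is the telltale sign.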
|
On December 05 2012 05:26 Lysenko wrote:
Failing to adjust for broadcast levels will give you a washed-out look, however that doesn't mean that there isn't more going on.
I agree there is definitely something else. It might look fine on a Korean TV network, but if we, internet users, get this version and not one optimized for PC monitors, then there is a workflow problem somewhere.
Stop making me think about work on TL !
|
Thank you for shedding light on the issue. I too noted the odd look of live streams. I'm glad someone with knowledge on the situation was able to figure it out.
|
Wow, can't wait to see this.
But it sounds like my probably-shitty cinema might not be equipped to display it at the full frame rate...
and you say we're already watching a lot of things in high frame rate anyway?
I wonder what Planet Earth (DVD) and the Final Fantasy: Advent Children movie (various) were produced in (some of the scenes in that were ridiculous)
|
On December 05 2012 13:43 FFGenerations wrote: and you say we're already watching a lot of things in high frame rate anyway?
We are not. Unless it was live television, just about all content you see out there today was shot at 24 fps and is being played back either at 24 fps or 25 fps depending on country.
Live television is often shot at 60 frames per second progressive or 60 fields per second, interlaced. IMAX movies can be presented at 24, 30, or 60 fps.
|
720p channels use 60 progressive frames... all live shows and sports events are high frame rate.
The only problem with HFR is that people are so used to stuttery, shitty 24fps that they consider anything else bad...
|
On December 05 2012 15:53 KMM wrote: 720p channels use 60 progressive frames... all live shows and sports events are high frame rate.
The only problem with HFR is that people are so used to stuttery, shitty 24fps that they consider anything else bad...
I don't know why I keep screwing that 720p frame rate up.
As for whether 24 is better, I love it, but hey, I'm a motion picture person.
|
Let's be honest: whatever you may think about it, HFR is better (objectively). It allows for more detail, fluid motion, etc. And if the end user is not satisfied, he could just activate the "cripple back to 24fps" feature haha. @gom I'm trying to get hold of one of those GSL finals in 1080i again to add some screenshots
|
Damn, reading your comments in this blog, Lysenko, made me remember reading another of your blogs quite some time ago where you talked about your history. Good times.
I would just like to point out that GomTV is internet TV. They don't broadcast in a form that you would get on your TV, unless of course you go through your PC. So it's not like GOM needs to worry about broadcasting in a specific way to keep it looking good on TVs, because no matter what, there's going to be a PC/Mac that it's running through first.
|
On December 05 2012 19:09 KMM wrote: Let's be honest: whatever you may think about it, HFR is better (objectively).
Any judgment on this is subjective. That's why that article I just posted is full of people who dislike it.
Anyway, we've gone way off-topic on this thread. Maybe it would be best if you start a new blog of your own for arguing about high frame rates and we can move the conversation there?
|
Yeah, I'm glad someone finally diagnosed the issue. I was fairly sure it was either an encoder or a color-space issue; turns out it's a little bit of both.
|
Amazingly, this is still a problem despite their introduction of much higher quality streams!
|
I would not be surprised if YouTube had put in place an encoding solution that could automatically detect and correct this.
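One could imagine such a detector as a simple histogram check: if essentially no luma samples fall outside 16-235, the clip was probably encoded with studio-swing levels and can safely be expanded. This is purely a hypothetical sketch with made-up names, not anything YouTube actually does (a real pipeline would also consult container metadata):

```python
def looks_limited_range(luma_samples, tolerance=0.001):
    """Guess whether a non-empty list of 8-bit luma samples is
    studio-swing (16-235): if at most `tolerance` of the samples fall
    outside that range, assume limited range. Only a toy heuristic."""
    outside = sum(1 for y in luma_samples if y < 16 or y > 235)
    return outside <= tolerance * len(luma_samples)

assert looks_limited_range([16, 64, 128, 235] * 100)    # all within 16-235
assert not looks_limited_range(list(range(256)) * 10)   # full-range ramp
```

When the check fires, the video could be run through a levels expansion before re-encoding; when it doesn't, the content is left alone so correctly mastered full-range video isn't damaged.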
|
The Heart of the Swarm launch events are both streamed from the GOM studio in Korea, and all the streams are on Twitch. The English stream has the bad image quality (too bright, etc.); the Korean stream is perfect.
|
Watching this GSL season, it appears that wherever this issue came from, it's been resolved, although the gamma looks a little off.
|