All,
I’ve been deliberating over this ever since I got Usenet access and could easily download 70-80 GB 4K remuxes. At first I believed I was getting fake uploads or something, but it seems to be a pattern. The remuxes have extreme noise that’s really distracting and actually obscures some of the depth information in the scene.
On the other hand, the smaller x265/x264 files appear to smooth out the noise, but also some of the texture detail. You can see it in the comparison above.
Note that the noisy version is HDR10 and the non-noisy version doesn’t have HDR.
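(In case anyone wants to check this on their own copies: here’s a rough sketch in Python that shells out to ffprobe and prints the transfer/primaries, which is how you can tell HDR10 from SDR. The filenames are placeholders and it assumes FFmpeg/ffprobe is installed; it’s just an illustration, not how any group tags their releases.)

```python
import json
import subprocess

# Placeholder filenames; ffprobe (part of FFmpeg) is assumed to be installed.
for path in ["grainy_remux.mkv", "smaller_x265.mkv"]:
    probe = subprocess.run(
        ["ffprobe", "-v", "error", "-select_streams", "v:0",
         "-show_entries", "stream=codec_name,pix_fmt,color_transfer,color_primaries",
         "-of", "json", path],
        capture_output=True, text=True, check=True,
    )
    stream = json.loads(probe.stdout)["streams"][0]
    # HDR10 video reports transfer "smpte2084" and primaries "bt2020";
    # SDR discs are typically "bt709" for both.
    print(path, stream.get("color_transfer"), stream.get("color_primaries"), stream.get("pix_fmt"))
```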
My question is: is that grain supposed to be so prominent, or is the scene group releasing these remuxes taking shortcuts just to get a remux out?
Is this noisy version how it’s supposed to look, or did they fuck up the encoding? Don’t flame me for not knowing. I’m hoping this can be a touchpoint for constructive conversation, and help other confused mateys on the journey to visual bliss. Thank you
Edit:
This resource helped me understand. Long story short, this is the director’s intent, potentially to enhance the period-piece aesthetic. I suppose if it looked as crisp and sharp as a movie shot on a state-of-the-art digital camera, some of the Cold War aesthetic would be lost. I get it, and I will watch it and enjoy the graininess as the director intended 😂
A couple of things that confused me:
- I know sometimes scene groups take shortcuts and this can really affect quality, for example transcoding x264 to x265. Yes, some groups do this, and it’s of no use to anyone. My concern was that I might be downloading one of these releases.
- There are movies from 40-50 years ago with little to no noticeable film grain; many remuxes don’t have that grain. I’m not big in the filmmaker scene, so I guess it was somewhat of a surprise to learn that some filmmakers actually want that grain in there…
Thank you all for your input and I hope this thread can be of use to folks that have this same question. Cheers mateys, happy sharing :)
I think you are mixing two things up. Some 4K remasters have incredibly distracting film grain; Blade Runner comes to mind. It’s a matter of personal taste and of how accurately you want the release to reflect the film source.
The more compressed the file is, the less accurate it is, and film grain is the first thing to get crushed by compression algorithms.
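As a rough illustration of what fighting that looks like (a sketch only, assuming FFmpeg built with libx265; the filenames and CRF are placeholder values, not anyone’s actual release settings): libx265 ships a “grain” tune that backs off the behaviour that otherwise smears grain away, at the cost of a much larger file.

```python
import subprocess

# A sketch only, not anyone's actual release settings. libx265's "grain" tune
# backs off the adaptive-quant/psy behaviour that otherwise smooths grain away,
# so more of it survives the encode (at the cost of a much larger file).
# Filenames are placeholders; FFmpeg built with libx265 is assumed installed.
subprocess.run([
    "ffmpeg", "-i", "remux_source.mkv",
    "-c:v", "libx265", "-preset", "slow", "-crf", "16", "-tune", "grain",
    "-c:a", "copy",
    "grain_preserving.mkv",
], check=True)
```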
I guess I’m just trying to figure out if the grain is supposed to be there or if it’s possibly due to a poor scene release. The grainy version is a release from 4k4U. The non-grainy one is from LAMA.
And I want it to reflect the film source as closely as possible, or at least the intention of the artist. I don’t want to enhance colors or smooth things out if that’s not how it was intended to be viewed. I just want to make sure the releases I’m grabbing are actually good quality and not somehow transcoded from a lower-quality source, or using some other weird trick that introduces this layer of noise.
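One rough sanity check (just a sketch; the filename is a placeholder, ffprobe from FFmpeg is assumed installed, and the bitrate ballpark is my own assumption rather than a hard rule): a true UHD remux keeps the disc’s video stream untouched, so the overall bitrate usually lands in the tens of Mbit/s, far above a typical re-encode.

```python
import json
import subprocess

# Rough sanity check only; the "tens of Mbit/s" ballpark is an assumption, not a rule.
# Placeholder filename; ffprobe (part of FFmpeg) is assumed to be installed.
path = "movie.2160p.remux.mkv"
probe = subprocess.run(
    ["ffprobe", "-v", "error",
     "-show_entries", "format=duration,size,bit_rate",
     "-of", "json", path],
    capture_output=True, text=True, check=True,
)
fmt = json.loads(probe.stdout)["format"]
print(f"{path}: {int(fmt['bit_rate']) / 1_000_000:.1f} Mbit/s overall")
```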
Fuck grain. Why don’t you go generate 100% noise files if it’s so important?
lol, controversial! but the noise is so warm and analog bro
Fuck analog it’s 2023!
Film grain isn’t inherently good, but if the source has imperfections (a poor transfer, a blurry scene, etc.), the grain creates a random discontinuity pattern in each frame that helps the brain unconsciously imagine fine details.
This could just come down to the remux groups using differently sourced Blu-rays. Not all Blu-rays are created equal; sometimes a US release is the best, other times it might be an ITA release, it really depends. A good remux group looks at all the sources and chooses the best, and sometimes they’ll combine parts from multiple sources as well.
In my opinion, the release with the noise in the comparison shots is the better release, based on the detail retained.
Thanks for the informative input. I could even be dealing with the same comparison between official Blu-rays. Wild.
Remux all the way. If the movie is grainy, that’s just how it is. I understand some people like smoothing, but I’d rather have the full quality.
OK, that seems to be the general consensus. It’s weird, though: this particular movie is incredibly grainy, while remuxes of movies that came out 40 years before it have very little grain.
It’s just confusing to me because I do want the original full quality, and if it’s grainy, so be it. I just want to make sure it’s not some scene group pumping shit out with a poor encode.
Yeah, that’s a bigger discussion around 4K masters. Usually I’ve found that if the Blu-ray has a lot of grain, it’s likely a newer scan that’s more true to the film print. Some older ones were just upscaled from an older scan, and on some they do cleanup; some of that is fine, like removing defects that aren’t in the print. It’s definitely a balancing act. If it’s real film grain, I’m fine with it even if it’s excessive, as long as it isn’t added in for the sake of making an upscaled copy look more legit.
There used to be a site that tracked which copies were real, but it seems to have been abandoned.
Yeah, it’s interesting. Some grain kind of feels unnatural, like it was added after the fact. Idk, maybe my understanding of physics isn’t good enough, but it sometimes feels like the grain, in motion, obscures what I’m looking at. And I imagine grain isn’t a thing in digital.
If there is grain in the original source, then a “cleaned” version is not an archival copy, in my opinion. Grain is difficult to compress because it is random noise, so the bitrates required to maintain it are high.
If you see grain in a movie then it is meant to be there.
Film CAN look crystal clear and modern, but it also has flaws that show depending on how it is used and what film stock is used in the first place. If you start with high-ISO film and push it in development, it will have serious grain, but low-ISO film, fully exposed and well developed, can be as smooth as glass, even on a 4K scan. The problem is you need a huge amount of light to get film to look that clear, and sometimes that isn’t possible. You can get around some of this with day-for-night methods (filming fully lit, then darkening it to look like night), but that is a choice the director/DOP make in a production.
Digital cameras produce noise too, and it can be much uglier than film grain, though modern image sensors handle low light better than film (in some ways; it’s still a debate). But even clean, well-lit, well-exposed, low-ISO digital productions will have grain added to tie scenes together visually when lighting changes slightly, to mask flaws, and to tie composites together (mixing CG and video, for example). Small amounts of grain can actually enhance apparent detail as your brain fills in what it perceives as detail; Sony TVs can do this as part of their upscaling if you enable it.
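If you’re curious what “grain added after the fact” looks like in practice, here’s a minimal sketch using FFmpeg’s noise filter from Python. The filenames and the noise strength are made-up illustration values, not anything a studio actually uses.

```python
import subprocess

# Minimal sketch of "grain added after the fact": FFmpeg's noise filter lays
# temporal noise over clean footage. Strength (alls=10) and flags
# (t = temporal, u = uniform) are made-up illustration values; filenames are
# placeholders and FFmpeg is assumed to be installed.
subprocess.run([
    "ffmpeg", "-i", "clean_digital_master.mp4",
    "-vf", "noise=alls=10:allf=t+u",
    "-c:a", "copy",
    "with_synthetic_grain.mp4",
], check=True)
```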
That sounds about right. I was dealing with trading off high ISO against a slow aperture for a theater show I was shooting just tonight. It was difficult to dial in a setting where I got enough exposure but the noise didn’t obliterate the detail in the faces. It definitely requires good light to get crystal-clear results.