#### Encoder(s): Jensen
---
Release notes: *This time the CloverWorks studio used the same type of upscale as in the recent films from the A1 studio: Lanczos to 878. For this film, however, a minimal number of taps was used and sharpening was also applied after the upscale, which caused strong halo and moiré artifacts. The frequency filtering I used to clean up these problems also increased the compressibility of the material, which was already quite high, which is why the video bitrate is so low. Otherwise everything is standard: a slight de-noise and de-band with a detail-preservation mask to prevent damage in dark scenes. The video was encoded in HEVC 10-bit with safe compression settings to produce transparent quality. Also, because of the small video bitrate, the DTS-HD MA audio bit depth was reduced to 20 bits (16 for the LFE channel) using Dolby algorithms, and the tracks were then re-encoded to DTS-HD MA with the core bitrate reduced to accommodate the video size.*
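As an illustration of the "slight de-noise and de-band with a detail-preservation mask" approach mentioned above, here is a minimal VapourSynth sketch of that general idea. The plugins (dfttest, neo_f3kdb), the source path, and every parameter are illustrative assumptions, not the actual script used for this release.

```py
# Minimal sketch of "light denoise + deband, with a detail mask protecting dark,
# detailed areas". Plugin choices and all values are assumptions, not the
# release's real script.
import vapoursynth as vs
core = vs.core

src = core.lsmas.LWLibavSource('movie.m2ts')           # hypothetical source path
src16 = core.resize.Point(src, format=vs.YUV420P16)    # work in 16 bit

den = core.dfttest.DFTTest(src16, sigma=1.5)           # very light denoise
deb = core.neo_f3kdb.Deband(den, range=16, y=32, cb=24, cr=24,
                            grainy=16, grainc=0, output_depth=16)

# Detail mask: where edges/detail are strong, keep the original pixels so dark,
# detailed scenes are not smoothed away.
mask = core.std.Sobel(core.std.ShufflePlanes(src16, 0, vs.GRAY), scale=2)
mask = core.std.Maximum(mask).std.Inflate()
out = core.std.MaskedMerge(deb, src16, mask, first_plane=True)

out.set_output()
```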
#### Subtitles:
| Author | Language | Format |
| ------- |:--------:| ------:|
| JPN BD | JPN | PGS |
#### Audio:
| Author | Language | Format |
| ------- |:--------:| ------:|
| BD_Audio 2.0 | JPN | DTS-HD Master Audio |
| BD_Audio 5.1 | JPN | DTS-HD Master Audio |
| BD_Audio 2.0 (Audio Guide) | JPN | DTS |
| BD_Audio 2.0 (Audio Commentary) | JPN | DTS |
Telegram: https://t.me/BeatriceRaws | Discord: https://discord.gg/Hry7EkU
> This time the CloverWorks studio used the same type of upscale as in the recent films from the A1 studio: Lanczos to 878. For this film, however, a minimal number of taps was used, which caused strong halo and moiré artifacts.
Descaling 878p (or any of the other spiked heights: 877.78, 877.9, 878.2, 878.3) with Lanczos does *not* clean up any of the introduced upscale artifacts, e.g. ringing.
https://slow.pics/c/hwr6gRqq
Your descale is wrong, as usual
No, everything is correct. After FFT analysis and various tests I am more than sure that the native height here is 878. In addition, the descale does not remove ringing and the like, only the main artifacts: we are working with compressed video after the DCT stage, so it is mathematically impossible to reverse every error with 100% accuracy, and post-processing done in the studio usually interferes with this as well. The remaining artifacts are additionally cleaned up with other filters, masks, etc.
Yapping hard. You can see none of the artifacts are removed because it was post-sharpened after the upscale to 1080p. You are incorrect and I posted evidence.
Look up and read what I wrote about post-processing after upscaling in the studio and, in general, about the accuracy of undoing such things. But yes, I agree with you that some artifacts are caused by sharpening in post-production.
I did. Mathematically, you cannot undo sharpening. The original data is permanently lost and it is not a reversible operation.
Quick edit:
> the descale does not remove ringing and the like, only the main artifacts
The ringing suggests it was upscaled with Lanczos 4 taps. If the descale is correct, then the ringing should be removed *entirely*. I can provide examples if you really care.
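For what it's worth, a claim like this is easy to sanity-check with a round trip. The sketch below (VapourSynth, descale plugin assumed; the file name, the 1560x878 target and judging by PlaneStatsAverage are placeholders on my part) descales with Lanczos at 4 taps, re-upscales with the same kernel, and exposes the residual; if the kernel and height were right, both the residual and the ringing in the descaled clip should be close to nil.

```py
# Round-trip check: Lanczos(taps=4) descale to 878, re-upscale with the same
# kernel, inspect the absolute difference. Paths and resolutions are
# placeholders/assumptions.
import vapoursynth as vs
core = vs.core

src = core.lsmas.LWLibavSource('source_1080p.m2ts')      # placeholder path
y = core.resize.Point(core.std.ShufflePlanes(src, 0, vs.GRAY), format=vs.GRAYS)

down = core.descale.Delanczos(y, 1560, 878, taps=4)      # candidate native resolution
up = core.resize.Lanczos(down, 1920, 1080, filter_param_a=4)

diff = core.std.Expr([y, up], 'x y - abs')
diff = core.std.PlaneStats(diff)                         # inspect PlaneStatsAverage per frame
diff.set_output()
```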
> The original data is permanently lost and it is not a reversible operation.
Yes. Just like upscale artifacts from any kernel. If you look closely at your comparison, you will notice that in some places the artifacts are not completely smoothed out, or are smoothed only in one direction (only vertical or only horizontal lines). Therefore, the problems of this BD are a mixture of upscale artifacts and post-processing with sharpening.
> If the descale is correct, then the ringing should be removed entirely.
I repeat, this is mathematically impossible. It would only work immediately after the upscale in the studio, on lossless video; compression will not let you do everything with sufficient accuracy. Do you think all rippers use filters to remove halos and ringing for no reason? After all, by your logic you could simply descale and everything would be fine.
But you're right, the description is not accurate enough, I'll fix that, thanks.
As far as I know, Jensen is a Russian Nazi who supported the execution of Ukrainians. If that's true, I'm surprised Beatrice keeps him in this group. Also smol is right: all of Jensen's rips are shit. The only normal rips from this group are DJATOM's and DeadNews'.
> Do you think all rippers use filters to remove halos and ringing for no reason? After all, by your logic you could simply descale and everything would be fine.
If you need to dehalo, then your descale is incorrect in 99% of the cases. The 1% being sharpening done on the source prior to the descale.
Here's an example from WATAMOTE: https://slow.pics/c/djTtTyox
The descale does not have any of the artifacts from the source. Upscaling back to 1080p with the same kernel results in a visually identical picture, except for the dithering of course.
The third node is the descale run through Waifu2x and then downscaled to 1080p with Hermite in linear light. No ringing to be found, free sharpness with no detail loss, and the line art is corrected as well. I haven't even used an edge mask.
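Roughly, that chain could look like the VapourSynth sketch below. To be clear about what is assumed rather than taken from the comment: the source path is a placeholder, bilinear/720p is inferred from the later remark about it being a 720 source, znedi3 doubling stands in for the Waifu2x step (wrappers for which vary), and the half-pixel correction is the usual one from nnedi3 doubling recipes.

```py
# Sketch of: descale -> verify by re-upscaling with the same kernel ->
# double -> Hermite downscale to 1080p in linear light.
import vapoursynth as vs
core = vs.core

src = core.lsmas.LWLibavSource('watamote.m2ts')                  # placeholder path
y = core.resize.Point(core.std.ShufflePlanes(src, 0, vs.GRAY), format=vs.GRAYS)

# 1. Descale to the assumed native resolution.
descaled = core.descale.Debilinear(y, 1280, 720)

# 2. Verification: re-upscale with the same kernel; if kernel/height are right,
#    this should be visually identical to the source apart from dithering.
verify = core.resize.Bilinear(descaled, 1920, 1080)

# 3. Double (znedi3 here as a stand-in for Waifu2x), then downscale to 1080p
#    with Hermite (bicubic b=0, c=0) in linear light.
dbl = core.znedi3.nnedi3(descaled, field=1, dh=True)
dbl = core.std.Transpose(dbl)
dbl = core.znedi3.nnedi3(dbl, field=1, dh=True)
dbl = core.std.Transpose(dbl)

lin = core.resize.Point(dbl, transfer_in_s='709', transfer_s='linear')
down = core.resize.Bicubic(lin, 1920, 1080,
                           filter_param_a=0, filter_param_b=0,   # Hermite
                           src_top=-0.5, src_left=-0.5)          # nnedi3 half-pixel fix
rescaled_y = core.resize.Point(down, transfer_in_s='linear', transfer_s='709',
                               format=vs.GRAY16, dither_type='error_diffusion')

src16 = core.resize.Point(src, format=vs.YUV420P16)
out = core.std.ShufflePlanes([rescaled_y, src16, src16], [0, 1, 2], vs.YUV)

out.set_output()
verify.set_output(1)
```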
@Kirakishou
Lol, and this is being told to me by a person who was kicked out of our chat for spamming calls for the destruction of the Russian people? At least learn the terms well enough to understand the difference between Nazism and fascism, and don't embarrass yourself.
Each of us, like any reasonable person, has long since formed his own view of the causes and consequences of this war, and we are smart enough to keep those views separate from our relationships with each other, without quarrels and the like. Besides, our relationships within the group are our business alone; it is not for you to decide.
In the future, I intend to ignore such attacks; there is no need to drag politics everywhere.
> If you need to dehalo, then your descale is incorrect in 99% of the cases. The 1% being sharpening done on the source prior to the descale.
You have the right to think so, but you are wrong. Both DJATOM and I have already explained a thousand times how it works and why we do it this way and not otherwise.
In your case you are losing a lot of fine details and distorting the original Line Art.
> In your case you are losing a lot of fine details and distorting the original Line Art.
Do show me where any "fine detail" is lost in the comparisons I sent. The original line art is in the descale. The upscale used there added ringing and aliasing. Both were solved without compromising detail surrounding edges with unnecessarily destructive dehalo/antialiasing calls.
I still don't know why people look for encodes. If you are going to look at a piece of media from a safe distance away, you will NOT notice any difference. If you are complaining about artifacts, BACK OFF FROM YOUR COMPUTER.
> Do show me where any “fine detail” is lost in the comparisons I sent. The original line art is in the descale. The upscale used there added ringing and aliasing. Both were solved without compromising detail surrounding edges with unnecessarily destructive dehalo/antialiasing calls.
You have a 720p source from 2013 with a simple upscale and simple line art; everything is relatively normal there. But we are talking about higher resolutions and more complex drawings. LightArrowsEXE has already run into similar problems with this approach to descaling and rewrote his code to minimize them.
https://slow.pics/c/gNGv1E20 Thickening of the lines is also a problem (https://slow.pics/c/upwGnVil) if you apply this method of descaling to sources with post-processing and a native resolution of 800+. In addition, problems arise with complex or dotted line art like Horimiya or any Wit Studio anime, and also with sources like this: https://slow.pics/c/jGS5KpUn
In any case, a combination of methods is used, chosen to suit each specific source and the ripper personally. So you shouldn't force your approach on me; I understand your reasons, but let's just say I don't agree with everything.
Descaling is one of the only operations in filtering that has to be objectively and mathematically correct. However, if there is post-processing like you claim with sharpening, then no attempt to descale should be made as it is just WRONG. Usually in these cases you will see some form of visible error affirming this, whether it be additional/strengthened haloing or ringing in the rescaled clip. Cleaning this up afterwards does NOT make the rescale ok. While compression can indeed affect descalability, usually it doesn't impact error rates to the point of a show not being descaleable or adding visible error. There might be a few problematic scenes, but you can scene those out.
Smol is actually correct that in most cases, assuming you have the correct descale, it will remove the majority of ringing and haloing. That being said, it is true that you can't 100% reverse an upscale unless dealing with a lossless source, but using proper tooling should be close enough.
Another topic that has to be explored further is the downscaler you use back to 1080p during the rescale process. The kernel you use, and whether to use it in gamma, sigmoid, or linear light, can have a massive effect on the result. In addition, it's honestly baffling to me that InsaneAA is still being used to this day. It's a clusterfuck of merging with a spline36 downscale and doubling with eedi3+nnedi3 with aggressive settings. Just a horrible idea all around that results in blurred output, and many encoders have agreed that it's essentially trash.
By your logic we shouldn't use descaling 99% of the time, lol. Various post-processing steps, along with lossy compression, ALWAYS add errors. That's why, back in 2012, our method of descaling any source was devised and implemented: masks and mixing with a neutral kernel to minimize the errors. The fact is that lossy compression does not have that much of an impact and allows descaling to be used almost 100% of the time; you just shouldn't rely on it alone to provide results. It is always a combination of different methods.
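One common way to realize the "masks and mixing" idea is sketched below: build an error mask from the descale round trip and fall back to the untouched source wherever the descale clearly fails (credits, native 1080p overlays, heavily post-processed scenes). This is a sketch of the general technique only, not Beatrice-Raws' actual code; the kernel, resolution, threshold and path are assumptions.

```py
# Error-mask merge: trust the rescale only where the descale round trip is clean.
import vapoursynth as vs
core = vs.core

src = core.lsmas.LWLibavSource('source.m2ts')            # placeholder path
y = core.resize.Point(core.std.ShufflePlanes(src, 0, vs.GRAY), format=vs.GRAYS)

descaled = core.descale.Debilinear(y, 1280, 720)         # assumed kernel/resolution
reup = core.resize.Bilinear(descaled, 1920, 1080)

# Mark pixels where the round-trip error is large: those areas were never really
# 720p (or were post-processed), so the rescale is not trusted there.
err = core.std.Expr([y, reup], 'x y - abs 0.02 > 1 0 ?') # threshold is an assumption
err = core.std.Maximum(err).std.Maximum().std.Inflate()

rescaled = core.resize.Lanczos(descaled, 1920, 1080)     # stand-in for the real chain
merged = core.std.MaskedMerge(rescaled, y, err)
merged.set_output()
```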
Rescaling tooling has vastly improved over the past couple of years alone. It is basically acknowledged that many older rescales are incorrect; one example is fractional sources (quite common), which simply means they were scaled past 1080p and cropped. When this is not accounted for, it introduces error that wouldn't be present when rescaled correctly.
> By your logic we shouldn't use descaling 99% of the time, lol.
Also no, I said in my post that in most cases the result is fine and compression alone, while adding some mathematical error, usually doesn't add noticeable visible error unless the scene is heavily compressed.
I don't think this is the right comments section to "educate" anyone on descaling... save your breath for when there is a show that is actually descalable and you can provide objective comps showing less dark ringing compared to Beatrice-Raws or whatever the case may be.
@oZanderr
In general, studying key-frame scan settings found online explained the fractional resolutions as a digitization process tied to the scan resolution and cropping to the active area of the frame: https://slow.pics/Gd065hvN/ https://slow.pics/DQuoKv54/ https://slow.pics/vVHyEFoG/ (A4 paper scanned at 1754x1240 (150 dpi), scan frame 1708x964 (grey line), active area 1552x872).
And people agreed that resampling without sub-pixel offsets, or with a value rounded from the fractional one (871 instead of 871.7), could produce a significant error. But not always, since the scaling factor is often as close as possible to 1.24x and the error estimate does not exceed 0.002. They also agreed that cropping is usually not necessary.
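For reference, fractional heights like the 871.7 mentioned here can be handled directly through the descale plugin's src_* parameters (assuming a build recent enough to expose src_width/src_height). The kernel and source in the sketch below are placeholders; only the fractional-window handling is the point.

```py
# Fractional descale: descale to an integer canvas while telling the kernel the
# true (fractional) source window, centred on that canvas.
import vapoursynth as vs
core = vs.core

src_h = 871.7
base_h = 872                                   # integer canvas holding the fractional height
src_w = src_h * 16 / 9                         # ~1549.69
base_w = 1550

y = core.resize.Point(
    core.std.ShufflePlanes(core.lsmas.LWLibavSource('source.m2ts'), 0, vs.GRAY),
    format=vs.GRAYS)

descaled = core.descale.Debicubic(
    y, base_w, base_h, b=0, c=0.5,             # placeholder kernel
    src_width=src_w, src_height=src_h,
    src_left=(base_w - src_w) / 2,             # centre the fractional window
    src_top=(base_h - src_h) / 2)
descaled.set_output()
```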
@motbob
Such a thing has literally existed for a very long time: Kara no Kyoukai Movies 1-2. They use bicubic interpolation without post-processing plus a huge video bitrate, and the descaling error is 0.00000000000001. I still can't finish this series; I've done 3 out of 6 and I'm tired of fixing stuff like this: https://slow.pics/c/zq2mapq6 That's a case where descaling was the easiest part, lol.
Seems nowhere near an error of 0.0000000000001 to me.
![](https://i.imgur.com/vj5RPBd.png)
And even getfscaler doesn't agree that it's bicubic (assuming 855 like most of these ufotable productions, and based off the width results).
Results for frame 0 (resolution: 1520/855, AR: 1.778, field-based: Progressive):

| Scaler | Error% | Abs. Error |
| ------- |:------:| ------:|
| Bilinear | 100.0% | 0.0000421115840 |
| Catrom (Bicubic b=0.00, c=0.50) | 102.5% | 0.0000431793291 |
| Mitchell (Bicubic b=0.33, c=0.33) | 105.9% | 0.0000445916754 |
| FFmpegBicubic (Bicubic b=0.00, c=0.60) | 108.3% | 0.0000456033227 |
| AdobeBicubic (Bicubic b=0.00, c=0.75) | 121.6% | 0.0000512225375 |
| Hermite (Bicubic b=0.00, c=0.00) | 140.6% | 0.0000591964485 |
| -snip- | | |

Smallest error achieved by "Bilinear" (0.0000421116)
We can also tell this is very, *very* unlikely to be Bicubic because of the dirty borders. There appear to be only 1px dirty borders (at least on the width; the height may have been shifted during resampling?), and Bicubic will not cause that (unless you're upsampling with Hermite, which this very clearly is not). It's much more likely to be a Bilinear upscale.
Checking for frac resolution (though because of the dirty edges, we can relatively safely assume it's an integer resolution already):
![](https://i.imgur.com/T5mG3F3.png)
.99 is well within compression error, so it's safe to assume the width is 1520 (and the height is likely 855 because of that, with the poorer height results likely stemming from a vertical shift during resampling).
I'm not interested in arguing how good/bad insaneAA or the descale for this movie is. Just want to show my quick 1-minute research. I'm always up for explaining new descaling methods in PMs if you'd prefer that.
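For anyone who wants to reproduce this kind of comparison by hand rather than through getfscaler, the sketch below does the naive version: descale to 1520x855 with a few candidate kernels, re-upscale with the same kernel, and print the mean absolute round-trip error. The file name and frame number are placeholders, and real tools normalize and aggregate the error more carefully than this.

```py
# Naive per-kernel round-trip error check (run directly with Python, not vspipe).
import vapoursynth as vs
core = vs.core

src = core.lsmas.LWLibavSource('kara_no_kyoukai.m2ts')     # placeholder path
y = core.resize.Point(core.std.ShufflePlanes(src, 0, vs.GRAY), format=vs.GRAYS)

W, H = 1520, 855
frame_no = 0                                               # pick a bright, detailed frame

candidates = {
    'Bilinear':                ('bilinear', {}),
    'Catrom (b=0, c=0.5)':     ('bicubic', {'b': 0.0, 'c': 0.5}),
    'Mitchell (b=1/3, c=1/3)': ('bicubic', {'b': 1 / 3, 'c': 1 / 3}),
    'Hermite (b=0, c=0)':      ('bicubic', {'b': 0.0, 'c': 0.0}),
}

for name, (kind, p) in candidates.items():
    if kind == 'bilinear':
        down = core.descale.Debilinear(y, W, H)
        up = core.resize.Bilinear(down, 1920, 1080)
    else:
        down = core.descale.Debicubic(y, W, H, b=p['b'], c=p['c'])
        up = core.resize.Bicubic(down, 1920, 1080,
                                 filter_param_a=p['b'], filter_param_b=p['c'])
    stats = core.std.PlaneStats(core.std.Expr([y, up], 'x y - abs'))
    err = stats.get_frame(frame_no).props['PlaneStatsAverage']
    print(f'{name:<26} {err:.13f}')
```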
[@motbob](https://nyaa.iss.one/view/1752443#com-24)
> save your breath for when there is a show that is actually descalable and you can provide objective comps showing less dark ringing compared to Beatrice-Raws or whatever the case may be.
If you wish. I recently worked on Fragtime, which Jensen also encoded for Beatrice-Raws. Granted, that was 3 years ago, but it seems like they've made zero advancements in the methods used since, so I might as well use it as a recent example.
https://slow.pics/c/AuJhS9kO
Jensen's encode adds halos that did not exist in the first place: the movie was upscaled with bilinear rather than some sort of bicubic (I'm going to assume Jensen went with Mitchell, if the scripts in the Beatrice-Raws GitHub repository are anything to go by), and bilinear does not ring. The same artifact is visible across the entire movie.
I have plenty of other hands-on examples from other encoders in Beatrice-Raws but they weren't done by Jensen, so I'd rather not include them. Although it seems like the standard for Beatrice-Raws: Incorrectly descale and add artifacts in the process, attempt to fix said artifacts with extremely aggressive dehalo, then destroy the line art with aggressive AA. Rinse and repeat, for years.
> Smallest error achieved by "Bilinear" (0.0000421116)
Yes, I was wrong; I had forgotten, since it was 3 years ago, and it really was bilinear interpolation. But the point is not the kernel; the point is that in the case of this film it can be descaled with quite high accuracy.
855 Bicubic - 0.0000001241 - 936.16
855 bilinear - 0.0000000328 - 4106.73
As I said, the error tends toward zero. Just in case you didn't know, you shouldn't select dark frames or frames with blur effects for resolution analysis. I used this frame: https://slow.pics/eJRfUyPM/
@smol
First, I don't see any halo in my encode; and second, how do you know what I used if this encoding script is not in our repo, lol? Even I don't know what I used, because the script is lost. In the case of Fragtime, if my memory serves me correctly, halo suppression was not used at all, because halos were not present in the source in a form worth removing. A barely noticeable 1-pixel halo does not count; it is a consequence of the upscale, and you can only notice it when examining screenshots with a magnifying glass. But we are watching a movie, not screenshots, aren't we?
Regarding the rest: you have your truth, we have ours; I prefer not to react to it. If you can do better, then do it and beat us in quality. Everyone makes mistakes and that's normal, but, at the same time, we don't go around the comments telling everyone how to do their encoding.
> you shouldn't select dark frames or frames with blur effects for resolution analysis
I know, but that's the only screenshot you provided and I didn't want to download the BDMV just to check lol. I asked for a sample on your server, but you didn't respond.
No popcorn needed. Jensen is not interested in using factually better methods and they clearly told me to not push it onto them. Might as well move on.
I just love that Jensen's first defense against Kirakishou's accusation is to imply he is _merely_ a fascist, not a Nazi. That's the kind of subtle gem that brings me back to this site.
Don't you understand the subtle hint about the state's disregard for human lives and the militarization of society? Oh well, maybe that's down to my level of English... This man doesn't even know the correct term for what he's accusing me of, lol.
All, making personal attacks is not cool.
Jensen and the other encoders in the comments section are doing a great job. Personally, I find this type of technical discussion informative :D