Most common aspect ratio besides 16:9? (4:3 doesn't count) - Off-topic

TL;DR at the end.
I was wondering what the ideal aspect ratio for a screen is. It would, of course, be reflected by what the majority of media uses. While pondering the benefits of ultra-wide (21:9), I came across the 2.39:1 aspect ratio. It's the true aspect ratio cinemas use, and the one ultra-wide was modeled on, which makes ultra-wide something of a gimmick in comparison. "21:9" ultra-wide is not actually 21:9 but 64:27, which equals 2.370370...:1 (repeating), whereas the cinematic ratio doesn't really have a tidy way to be written: since it can't be simplified to whole numbers below 21, the two closest ways to write it are either huge but whole, 1024:429, or small but a little broken, 25.6:10.725, both equaling 2.386946...:1 (so not exactly 2.39, but still closer to that than to 2.38).
The problem is that what I've been calling "2.39:1" has had many interpretations. Some people round it to 2.4:1 (or, even worse, 2.40, with that unnecessary zero), which is incorrect rounding, while others call it 2.35:1, even though that's an aspect ratio from the 1970s. The worst part is that this correct 2.39:1 value may be the DCP (Digital Cinema Package, the universal standard for making and distributing cinema films) standard, but that hasn't stopped RED and other camera manufacturers from calling something different 2.39:1. To put it into perspective, 4K DCP 2.39:1 is 4096 × 1716, yet many "4K 2.39:1" videos end up at 4096 × 1714, which is a different aspect ratio. To complicate things further, the Blu-ray standard for non-16:9 aspect ratios is exactly 2.4:1 (a.k.a. 12:5).
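For anyone who wants to double-check those numbers, here is a small Python sketch (my own addition, not part of any standard) that reduces each resolution mentioned above to its simplest integer ratio and prints the decimal value:

```python
from math import gcd

def simplify(w, h):
    """Reduce a width:height pair to its simplest integer ratio."""
    g = gcd(w, h)
    return w // g, h // g

# Resolutions mentioned above.
for w, h in [(4096, 1716), (4096, 1714), (6144, 2571), (5120, 2142), (7680, 3213)]:
    a, b = simplify(w, h)
    print(f"{w}x{h} -> {a}:{b} = {w / h:.6f}:1")

# Expected output (rounded):
# 4096x1716 -> 1024:429  = 2.386946:1
# 4096x1714 -> 2048:857  = 2.389732:1
# 6144x2571 -> 2048:857  = 2.389732:1
# 5120x2142 -> 2560:1071 = 2.390290:1
# 7680x3213 -> 2560:1071 = 2.390290:1
```

As the output shows, the 1024:429 family and the 2048:857 family land on slightly different decimals, which is exactly the 4096 × 1716 versus 4096 × 1714 discrepancy described above.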
From a quick search and a look at aspect-ratio cheat sheets, the most likely contenders would be:
2.4:1 (12:5, the Blu-ray standard); there are already screens with this aspect ratio
25.6:10.725 (1024:429, the true DCP 2.39:1 standard)
32:13.390625 (2048:857, non-standard, with three known resolutions: "4K" 4096 × 1714, "8K" 8192 × 3428, "6K" 6144 × 2571)
20:8.3671875 (2560:1071, non-standard, with two known resolutions: "5K" 5120 × 2142, "8K UHD" 7680 × 3213)
TL;DR So my question, in the end, is: which aspect ratio, not counting 16:9 and 4:3, is the most used across modern media, and therefore the most sensible choice for a screen?

Related

[Q] Possible to mod the camera to record in 1080p?

Are any devs looking at the possibility of recording movies in full HD (1080p)? I seem to remember reading somewhere that the hardware should be capable of it.
Well, if it is capable of recording in full HD, then why wouldn't Samsung themselves implement it to make more sales?
leoon said:
Well, if it is capable of recording in full HD, then why wouldn't Samsung themselves implement it to make more sales?
Are we talking about the same company that decided to use the RFS filesystem and to reserve memory, thus limiting the available RAM... not to mention the weak Wi-Fi reception and GPS issues?
INeedYourHelp said:
Are we talking about the same company that decided to use the RFS filesystem and to reserve memory, thus limiting the available RAM... not to mention the weak Wi-Fi reception and GPS issues?
Exactly my point; there could be a thousand different reasons. But maybe our devs in here are a bit sharper than Samsung themselves...
People have made mods that claim to free an extra 20-30 megabytes of RAM. When these are applied, problems show up with 720p recording. Imagine the RAM usage for 1080p. I don't think it's worth the hassle.
Do you think 1080p on a mobile phone will really be that much better?
Come on, I don't think so...
Especially since the audio is still bollixed... I'd rather they fixed that first.
I don't think it needs it.
First, even if the hardware permits recording a 1080p stream, the 5-megapixel chip won't manage to provide 1080p frames at a decent framerate.
And even if it could, the optics won't be able to resolve the resolution gain. Compared with the Nokia N8 or iPhone 4 720p output, you can see there is room for improvement in that direction (sharper optics and better sensitivity).
But maybe our devs can work on the compression level to keep more fine detail, on sensitivity management, or on faster autofocus without a resolution change.
I think that's the only reasonable improvement we could expect from a software mod.
Well, I have problems with 1080p playback, let alone recording.
Anyway, the hardware is 100% capable of 1080p recording, and it would be really cool if someone could mod it in.
medimel said:
I don't think it needs it. Even if the hardware permits recording a 1080p stream, the 5-megapixel chip won't manage to provide 1080p frames at a decent framerate, and even if it could, the optics won't be able to resolve the resolution gain...
The Hummingbird is capable of 1080p hardware decoding and encoding; it's equipped with hardware encoders/decoders. Both of them require a decent amount of RAM to be reserved, and I think that was the issue.
A 5 MP sensor is perfectly capable of delivering a decent framerate at 720p, so why wouldn't it be capable of 1080p?
The resolution is enough; there might be bandwidth-limiting factors between the sensor and the CPU.
The optics are perfectly capable of taking quite sharp 5 MP photos, so why wouldn't they be capable of shooting just 1920x1080?
There will be no software mod enabling 1080p recording without hacking into the hardware codecs/drivers.
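To put the bandwidth point in rough numbers, here is a back-of-the-envelope Python sketch (my own illustration; the 1.5 bytes-per-pixel figure assumes a YUV 4:2:0-style readout and is not a published spec for this sensor):

```python
# Back-of-the-envelope sensor/ISP throughput comparison for 720p vs 1080p capture.
# Assumes 1.5 bytes per pixel (YUV 4:2:0 style) purely for illustration; the real
# readout format of the SGS sensor is not documented here.

BYTES_PER_PIXEL = 1.5
FPS = 30

def throughput_mb_s(width: int, height: int) -> float:
    """Uncompressed data rate in megabytes per second before any encoding."""
    return width * height * BYTES_PER_PIXEL * FPS / 1e6

for name, (w, h) in {"720p": (1280, 720), "1080p": (1920, 1080)}.items():
    print(f"{name}: {throughput_mb_s(w, h):.1f} MB/s before encoding")

# 720p:  ~41.5 MB/s
# 1080p: ~93.3 MB/s (about 2.25x the data the encoder and memory bus must handle)
```

Whatever the exact readout format, the ratio is the point: 1080p more than doubles the data that has to move between the sensor, RAM and the hardware encoder every second.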
Even if the framerate went down to 15-20 fps, I would personally really like this feature. Some moments are best captured at the highest resolution possible. One idea for the memory could be to allocate the needed amount on demand and release it again afterwards.
Thanks for confirming that our Galaxy S is indeed hardware-wise capable of recording in 1920x1080.
Actually, why 1080p? It doesn't NEED to be 1080p. Why can't we add support for 800p (800 lines of vertical resolution) or even 960p?
We keep thinking about making the jump to 1080p, but is there any reason we couldn't ramp the camera's resolution up even higher? Just because your TV expects 720p doesn't mean computers do when playing it back...
andrewluecke said:
Actually, why 1080p? It doesn't NEED to be 1080p. Why can't we add support for 800p (800 lines of vertical resolution) or even 960p? Just because your TV expects 720p doesn't mean computers do when playing it back...
800p and 960p are not common, so they would make things awkward: you can't play them natively on a 720p screen, and they don't scale cleanly to a 1080p screen either.
BTW, although noticeable, I don't think the difference between 1080p and 720p is that big, so I doubt anyone would really notice the difference between 720p and 960p; if they did, it would probably be more placebo than a real difference.
Mycorrhiza said:
800p and 960p are not common, so they would make things awkward: you can't play them natively on a 720p screen, and they don't scale cleanly to a 1080p screen either...
I agree on the odd formats. However, going from 720p to 1080p is a significant improvement, especially if you have a large (46"+) flat panel to view things on.
I would be very interested in this. And for everyone saying it's not needed: this is a development forum. Many, many things that get done are "not needed" but still pretty cool. He asked if it could be done, so let's stick to whether it can, not whether it should.
xan said:
A 5 MP sensor is perfectly capable of delivering a decent framerate at 720p, so why wouldn't it be capable of 1080p?
720p from a 5 MP camera is already seriously pushing it, almost hack-wise; normally only 8 MP cameras support it. And I'm not even talking about 1080p...
The sensors usually can't deliver 30 fps at 1080p even if the hardware can encode it (and I've seen no tech specs of that, just various "web claims", i.e. moot stuff).
It's not about it being a 5 MP sensor; it's about how much data can come off the sensor after capture (that's before the CPU/DSP!). There are very good 5 MP 1080p cameras, because those sensors can handle it; they also cost more. I highly doubt the one in the SGS can handle much more than 720p at 30 fps.
I'd rather have the image processing improved than get 1080p, since 1080p (if it could be done at all) would be roughly the same quality as 720p while using twice the space and needing twice the power to decode on other systems.
In fact, even the encoder could probably be optimized. I'm not familiar with the Hummingbird, but the OMAPs have TI's own hardware codecs, and while they're proprietary, you can implement your own codec accelerated by the DSP.
The Hummingbird's codec produces "very average" 720p H.264 Main profile (I believe?) at 10-12 Mbit/s (!).
Compare that with x264 producing 4 Mbit/s 720p H.264 High profile from the same source: it blows it away quality-wise and is 2-2.5x smaller in file size. Besides, it has a zillion options depending on whether you want quality, low latency, etc.
Bottom line: if some genius could accelerate x264 via the DSP, it would be awesome.
I know the x264 team worked on the OMAP DSP with little success, mostly due to rather cryptic documentation.
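For a sense of what those bitrates mean in storage terms, here is a quick Python calculation (my own arithmetic, using the roughly 10-12 Mbit/s and 4 Mbit/s figures mentioned above):

```python
# Rough storage cost per minute of video at the bitrates discussed above.
# Bitrates are in megabits per second; output is megabytes per minute.

def mb_per_minute(mbit_per_s: float) -> float:
    return mbit_per_s * 60 / 8  # 60 seconds, 8 bits per byte

for label, rate in [("Hardware encoder, upper end (~12 Mbit/s)", 12.0),
                    ("Hardware encoder, lower end (~10 Mbit/s)", 10.0),
                    ("x264 High profile (~4 Mbit/s)", 4.0)]:
    print(f"{label}: {mb_per_minute(rate):.0f} MB/min")

# ~12 Mbit/s -> 90 MB/min
# ~10 Mbit/s -> 75 MB/min
# ~ 4 Mbit/s -> 30 MB/min
```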
There are plenty of PC displays which AREN'T 1080p (only the cheap ones are). 1080p and 720p are optimal for TVs, but not for computer displays; there are plenty of computer displays with 1200 lines of vertical resolution.
And I have found a difference between 720p and 1080p, but it's more obvious on larger displays that support higher resolutions.
I'd rather have slow motion and a proper app that enables video editing, cutting and sound mixing, just like the iPhone 4 has.
I'm inclined to agree; there's room for improvement at 720p. It's the same logic as with low-end cameras and camera phones: ramping up the pixel count doesn't directly mean better quality.
Plus, although it supposedly should be able to, the phone currently doesn't like playing back 1080p videos...
I'm not saying everyone's going to want to watch 1080p on an 800 x 480 panel, just that you might want to play back what you've just recorded to see how it came out.

[Q] Why is text not crisp on the NC?

I read somewhere that although the LCD screen of the Nook Color is 1024x600, the graphics chip is actually outputting at a much lower resolution which is then scaled/interpolated to fit the 1024x600 screen. Is this why small text is hard to read and not as crisp as on my EVO? It's especially noticeable on widgets and icons like the SwitchPro battery indicator; it's near impossible to read the battery percentage.
If this is something I can disable (font smoothing or something), I'd definitely do it.
I've never seen this problem on any of the nooks I've used.
Mine is crisp and clear.
Really? It looks like Microsoft ClearType cranked way up. All the letters are fuzzy instead of having crisp, clean edges like on a PC or an EVO. I've noticed it on every nook I've picked up.
This is the first complaint I've seen of fuzzy text. Did they have some kind of matte screen protector or something over the display?
I can't imagine a dedicated bookstore making a (supposedly) dedicated ereader without ensuring it had crisp text.
Are you sure the app isn't upscaling, and designed for a small screen?
If it isn't using vector images, then it would blur when upscaled.
Otherwise, I haven't experienced anything at all like what you describe.
Found where I read about the video output:
GPU: PowerVR SGX530
Graphics rendering: OpenGL ES 1.1/2.0
Hardware scaling: 854x480 scaled to 1024x600
Video formats: .3GP, .MP4, .3G2
Video codecs: H.263, H.264, MPEG-4, On2 VP7
Image formats: JPEG, GIF, PNG, BMP
(same GPU as the Droid 2 and Droid X)
from: http://www.androidtablets.net/forum/nook-color-technical/3483-nookcolor-full-specifications.html
I agree about the text; depending on what you are reading, I do see a fuzz around the letters.
Sometimes it's poor PDF quality.
I also think the video quality is kind of washed out and not as sharp as it seems it should be.
Glad I'm not the only one who is bothered by this. I certainly never noticed it on my wife's iPad, and the nook should be crisper considering the DPI, unless we really are seeing an 854x480 output interpolated to 1024x600 instead of native output like on other devices.
wy1d said:
Found where I read about the video output:
from: http://www.androidtablets.net/forum/nook-color-technical/3483-nookcolor-full-specifications.html
From what I've read, this applies purely to video decoding. Anything the OS renders, from apps to text, does not have this problem. That being said, mine is incredibly crisp.
For those who say theirs is crisp, what are you comparing it to? For example, the text in the XDA app on my EVO is much, much easier to read than in the XDA app on the nook.
wy1d said:
For those who say theirs is crisp, what are you comparing it to? For example, the text in the XDA app on my EVO is much, much easier to read than in the XDA app on the nook.
AMOLED HTC Incredible with CM7.
I just made a close-up side-by-side comparison of it with the NC. The NC's text is actually smoother around the edges of the letters than the Inc's, while the interior of the letters looks more "solid" on the Inc, probably due to the physically larger pixel-grid on the NC's display. Note that this was from a viewing distance of about two inches.
To me, it's a wash. At a normal viewing distance, they appear about equal and both look great.
This isn't in any particular app, though. I have some of the same widgets and apps on my home screens, so I was comparing the widget text and icon labels.
wy1d said:
For those who say theirs is crisp, what are you comparing it to? For example, the text in the XDA app on my EVO is much, much easier to read than in the XDA app on the nook.
Not really comparing it to anything. My iPhone 4 is crisper, but it has a much higher PPI. The nook is just good overall; I mean, your EVO probably has a higher PPI (I don't know the EVO specs), so I doubt the text would appear as crisp as on that. The text isn't blurry at all, so I'd say it's just as good as a book and better than a newspaper.
wy1d said:
For those who say theirs is crisp, what are you comparing it to? For example, the text in the XDA app on my EVO is much, much easier to read than in the XDA app on the nook.
That is not the nook's fault. What you are referring to is pixel density.
If you have a phone with a small screen at 800x480 versus a screen more than double the size showing only 1024x600, the pixel density will be lower on the larger device.
The pixel density on the nook is about 169 PPI.
The iPad is about 133.
The iPhone 4 is over 300, which is why that screen looks so sharp.
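For reference, pixel density can be computed directly from the resolution and diagonal size. Here is a small Python sketch (my own illustration; the diagonal sizes used are the commonly quoted ones, 7" for the Nook Color, 9.7" for the iPad and 3.5" for the iPhone 4):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch from screen resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

devices = {
    "Nook Color (1024x600, 7.0 in)": (1024, 600, 7.0),
    "iPad (1024x768, 9.7 in)":       (1024, 768, 9.7),
    "iPhone 4 (960x640, 3.5 in)":    (960, 640, 3.5),
}
for name, (w, h, d) in devices.items():
    print(f"{name}: {ppi(w, h, d):.0f} PPI")

# Nook Color: ~170 PPI
# iPad:       ~132 PPI
# iPhone 4:   ~330 PPI
```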
Anybody who says that the display is being generated at 854x480 and upscaled to 1024x600 is, well, wrong. First of all, 854x480 is not the same aspect ratio as 1024x600 (the equivalent would be 820x480, which nothing renders in), so those claims are completely made up.
More conclusively, even a smidgen of playing about with any pixel-related app (Multitouch Visualizer shows the distance between touches) will plainly show that the screen is, in fact, 1024x600. You can also look up the LCD panel type (see the teardown thread), or ask ANYBODY doing hardware dev on the thing.
"Blurriness" can result from poorly coded apps doing a bad upscale on their graphics, or from you needing to buy glasses. But the device itself is 1024x600, and it looks just fine to me.
What you have posted there are video-upscaling stats. The nook hardware cannot decode video above 854 pixels wide, so it upscales to 1024. That said, the Nook Color has been reviewed as having a higher pixel density than the iPad, and I have never seen anything less than sharp text.
MattJ951 said:
From what I've read, this applies purely to video decoding. Anything the OS renders, from apps to text, does not have this problem. That being said, mine is incredibly crisp.
I think this is correct. I looked at a list of 16:9 resolutions and this is what I see:
WVGA: 854 × 480, ~16:9 (≈1.78), 410,880 total pixels
versus
1024 × 600 (used in many netbooks): 128:75 (≈1.707), 614,400 total pixels
I think the 848 (close to 854) by 480 is their attempt to render 16:9, or close to it, for video. But that is just my guess.
Source: http://en.wikipedia.org/wiki/List_of_common_resolutions
Also, if you are not aware of it, there is a way to change font size when reading... I have never noticed any fuzzy text on the NC screen. Try a few different things and find your happy place.
migrax
I edited my build.prop and changed the LCD density to 150. Everything looks much crisper. I snapped some macro photos of it before; will post tomorrow.
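For anyone wanting to try the same tweak, here is a hedged Python sketch of the general process (my own illustration, not the poster's exact steps; it assumes a rooted device, adb on your PATH, a remountable /system partition, and the standard ro.sf.lcd_density property, so back up build.prop first and treat this as a sketch rather than a recipe):

```python
import re
import subprocess

# Pull build.prop, change ro.sf.lcd_density to 150, push it back, reboot.
# Assumes a rooted device with adb access and a writable /system partition.
subprocess.run(["adb", "root"], check=True)
subprocess.run(["adb", "remount"], check=True)
subprocess.run(["adb", "pull", "/system/build.prop", "build.prop"], check=True)

with open("build.prop", "r") as f:
    contents = f.read()
contents = re.sub(r"^ro\.sf\.lcd_density=\d+", "ro.sf.lcd_density=150",
                  contents, flags=re.MULTILINE)
with open("build.prop", "w") as f:
    f.write(contents)

subprocess.run(["adb", "push", "build.prop", "/system/build.prop"], check=True)
# On some ROMs you may also need to restore permissions (644) on the file.
subprocess.run(["adb", "reboot"], check=True)
```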

Pixel Aspect Ratio

MP4s encoded using Handbrake aren't having their pixel aspect ratio respected on the Galaxy S.
Meanwhile, MKVs on the Galaxy Tab 10.1 don't respect the pixel aspect ratio either.
In other words, the players only look at the size of the actual stored video frame and not the intended display frame.
That seems a little naff.
Thoughts?
I'm using Strict anamorphic mode in Handbrake to keep the vertical resolution.
I guess I'll have to change it to None.
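For context on what anamorphic encoding stores versus what should be displayed, here is a small Python sketch (my own illustrative numbers; the 720x576 PAL frame with a 64:45 PAR is a textbook example, not taken from the posts above):

```python
from fractions import Fraction

def display_size(stored_w: int, stored_h: int, par: Fraction):
    """Display width/height implied by a stored frame plus its pixel aspect ratio."""
    display_w = round(stored_w * par)
    return display_w, stored_h, Fraction(display_w, stored_h)

# Classic example: anamorphic PAL DVD, 720x576 stored with a 64:45 PAR.
w, h, dar = display_size(720, 576, Fraction(64, 45))
print(f"stored 720x576, PAR 64:45 -> display {w}x{h} (DAR {dar}, i.e. 16:9)")

# A player that ignores the PAR flag shows the raw 720x576 frame (5:4),
# which is the squeezed-looking behaviour described above.
print("ignoring PAR -> shown as", Fraction(720, 576), "(i.e. 5:4)")
```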

How can the Redmi Note 7 have a 48MP camera when the Snapdragon 660 only supports up to 25MP?

I want to ask: how can the Redmi Note 7 have a 48 MP camera when the Snapdragon 660 only supports up to 25 MP?
It's a fake 48 MP generated by an AI algorithm; you need to wait for the Pro version if you want real 48 MP.
But the Redmi Note 7 Pro will probably use the Snapdragon 710, and even the 710 can't handle 48 MP.
Kent Nathanael said:
I want to ask: how can the Redmi Note 7 have a 48 MP camera when the Snapdragon 660 only supports up to 25 MP?
It probably uses a custom ISP from Sony (the IMX586) rather than the one integrated in the SoC.
Kent Nathanael said:
I want to ask: how can the Redmi Note 7 have a 48 MP camera when the Snapdragon 660 only supports up to 25 MP?
What the Samsung sensor does is stick four tiny pixels together into one big pixel, for brightness in the images. So it's actually a 12 MP camera, but the result comes from a 48 MP resolution. You can look into it.
It's NOT a fake 48 MP camera. Let me explain.
The camera physically has 48 million pixels, the same as the Sony IMX586 (used in the Redmi Note 7 Pro).
But here's the catch with the 12 MP thing.
As I said, the Samsung GM1 sensor (used in the Redmi Note 7) really does have 48 million physical pixels.
What it does is treat every group of 4 pixels as one bigger one, so the 48 (million) pixels are binned four-to-one.
The result is a 12 (million) pixel image.
This is done to get brighter images, so that each pixel of the image can gather more light.
https://www.youtube.com/watch?v=prJnWBNFQnY
The 48MP camera on the Redmi Note 7, explained by C4ETech
It's the same trick used with Xiaomi's recent 20 MP and 24 MP front-facing sensors: it combines 4 pixels into 1 bigger pixel (this is called pixel binning). For example, if you install a custom ROM on the Poco (which has a 20 MP front-facing camera), the camera registers as only 5 MP, but in reality you capture 20 MP combined into a 5 MP picture. This tech helps in low light by creating brighter images.
So basically, the SoC registers the sensor as 12 MP, but it's truly 48 MP.
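To illustrate what 4-to-1 binning means in terms of the data, here is a tiny NumPy sketch (my own illustration; real sensors bin in the analog domain and on a Bayer pattern, which this simplified grayscale example ignores):

```python
import numpy as np

def bin_2x2(frame: np.ndarray) -> np.ndarray:
    """Combine each 2x2 block of pixels into one by summing, halving each dimension."""
    h, w = frame.shape
    return frame.reshape(h // 2, 2, w // 2, 2).sum(axis=(1, 3))

# A toy 4x4 "sensor readout"; binning yields a 2x2 image whose pixels
# each collect the light of four original photosites.
frame = np.arange(16, dtype=np.uint16).reshape(4, 4)
binned = bin_2x2(frame)
print(frame)
print(binned)  # shape (2, 2); e.g. the top-left output pixel is 0 + 1 + 4 + 5 = 10

# Scaled up, the same idea turns an 8000x6000 (48 MP) readout into a
# 4000x3000 (12 MP) image with brighter, lower-noise pixels.
```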
Kent Nathanael said:
I want to ask: how can the Redmi Note 7 have a 48 MP camera when the Snapdragon 660 only supports up to 25 MP?
1) 25 MP is the limit for a single camera, not for dual cameras.
2) It can't; the camera output is just an interpolated version of a 12 MP sensor, that's all. It's just an edit.
sssaini007 said:
It's NOT a fake 48 MP camera. The Samsung GM1 sensor (used in the Redmi Note 7) really does have 48 million physical pixels; it treats every group of 4 pixels as one bigger one, so the result is a 12 (million) pixel image with brighter pixels...
Yeah, I think the same as you, but some YouTubers in my country explained that the Redmi Note 7 has a small cache to process images.
As long as I can take great low-light photos, I am OK with it.
cwr250 said:
It's a fake 48 MP generated by an AI algorithm; you need to wait for the Pro version if you want real 48 MP.
That sounds questionable.
Processor: the Redmi Note 7 Pro will be released with the Snapdragon 675 SoC.
majidamiri15300 said:
It's the same trick used with Xiaomi's recent 20 MP and 24 MP front-facing sensors: it combines 4 pixels into 1 bigger pixel (pixel binning)... So basically, the SoC registers the sensor as 12 MP, but it's truly 48 MP.
I don't think this is fully true.
The Poco's selfie video is just so dark, as if there's no native binning at all. This is different from the big-pixel approach on the Mi 5.
support
harysviewty said:
I don't think this is fully true. The Poco's selfie video is just so dark, as if there's no binning at all. This is different from the big-pixel approach on the Mi 5.
I don't think there is a phone currently that does pixel binning in video; right now it's only done for photos.
Binning adds the brightness of 4 pixels and merges them into one bigger number.
(Example: if 4 pixels have brightness values of 3, 4, 5 and 3, the binned value is 15 (3+4+5+3), meaning that pixel is brighter. If they were averaged instead, the value would be 3.75 ((3+4+5+3)/4), which is no bigger than the original pixel values, so there's no brightness improvement.)
(The real process of binning pixels is actually much more complicated, and simple addition may not be what's done.)
In video, the brightness of individual pixels is averaged (not binned) with neighbouring pixels to make 30 1080p (2.1 MP) frames per second.
If someone understands the process of real pixel binning, correct me if I am wrong.
JoraForever said:
I don't think there is a phone currently that does pixel binning in video; right now it's only done for photos. Binning adds the brightness of 4 pixels and merges them into one bigger number... In video, the brightness of individual pixels is averaged (not binned) with neighbouring pixels to make 30 1080p (2.1 MP) frames per second. If someone understands the process of real pixel binning, correct me if I am wrong.
The LG V30, V35, G7 and V40 have a super-bright video mode with 2x2 (4-in-1) pixel binning in video (4K becoming full HD), plus adaptive fps for low light.
HTC has multi-frame subsampling processing for noiseless low-light video.
Sony has dual-camera sensor fusion (normal & B/W) for super-high-ISO low-light video.
No, you're wrong.
Binning isn't always 1+2+3+4 = 10; it can also be something like (1+2+3+4)/4 = 2.5 (PureView), or 1+4+4+5 = 4, or 1+2+3+4 = 1/2/3/4 (real-time HDR).
And there's no averaging in lower-resolution normal video processing; it doesn't even use all the pixels of the full sensor. That's why most flagships use 12 MP 4:3 sensors (video is a cropped 16:9 8 MP 4K), so no resolution is wasted.
harysviewty said:
The LG V30, V35, G7 and V40 have a super-bright video mode with 2x2 (4-in-1) pixel binning in video (4K becoming full HD)... Binning isn't always 1+2+3+4 = 10; it can also be (1+2+3+4)/4 = 2.5 (PureView)... And there's no averaging in lower-resolution normal video processing; it doesn't even use all the pixels of the full sensor...
I did say "the real process of binning pixels is actually much more complicated, and simple addition may not be done"; I was simplifying the technical stuff. Pixel binning is also done by averaging values, though that benefits noise reduction rather than brightness.
In video, the camera actually sends full-frame raw data to the ISP, which manipulates the raw sensor data by cropping and subsampling (technically the same as binning by averaging) and then dumps that data to flash memory in a video format.
Most modern phones use subsampling by averaging because it reduces noise.
Many phones have noise issues when filming 4K in low light because the noise filtering applied to video has to be fast and efficient, whereas photo noise filtering gets much more processing time.
The LG super-bright video mode is most likely some kind of software trickery that forces 1080p resolution (because subsampling reduces noise) and then either increases the ISO or boosts brightness/contrast in post-processing.
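To make the summing-versus-averaging distinction concrete, here is a short NumPy sketch (my own simplification; real ISPs work on Bayer data with weighted filters, not plain block sums or averages):

```python
import numpy as np

def block_reduce(frame: np.ndarray, k: int, mode: str) -> np.ndarray:
    """Reduce resolution by k x k blocks, either summing (binning) or averaging (subsampling)."""
    h, w = frame.shape
    blocks = frame[: h - h % k, : w - w % k].reshape(h // k, k, w // k, k)
    return blocks.sum(axis=(1, 3)) if mode == "sum" else blocks.mean(axis=(1, 3))

rng = np.random.default_rng(0)
# Dim scene: low signal plus random noise, standing in for a low-light readout.
frame = rng.normal(loc=4.0, scale=2.0, size=(8, 8))

binned = block_reduce(frame, 2, "sum")     # brighter values: the signal is added up
averaged = block_reduce(frame, 2, "mean")  # same brightness, but per-pixel noise drops

print("mean pixel value  binned:", binned.mean().round(2), " averaged:", averaged.mean().round(2))
print("pixel noise (std) binned:", binned.std().round(2),  " averaged:", averaged.std().round(2))
```

Run on this toy data, summing roughly quadruples the mean value while averaging keeps it the same but with lower spread, which matches the brightness-versus-noise trade-off being discussed above.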
JoraForever said:
I did say "the real process of binning pixels is actually much more complicated, and simple addition may not be done"; I was simplifying. Pixel binning is also done by averaging values, though that benefits noise reduction rather than brightness... The LG super-bright video mode is most likely software trickery that forces 1080p (because subsampling reduces noise) and then either increases the ISO or boosts brightness/contrast in post-processing.
https://en.ids-imaging.com/techtipps-detail/en_techtip-binning-subsampling-or-scaler.html
I believe subsampling is not the same as averaging.
That's why low-light video from the 40 MP Huawei Mate 20 Pro is so bad.
LG's bright mode is real-time processing: 15-24 fps, EV +1 stop (higher than the max ISO in auto/manual mode), at 1/4 of the maximum resolution.
I've been wanting to ask this question for a while; Xiaomi lies a lot when it comes to phone specs. Well, it's cheap, so we can't complain.

1080p 60 FPS is too noisy

I tried to record a video in indoor light in 4K and in 1080p at 60 FPS; is there a reason why 4K is much less noisy? I didn't think resolution decided how much noise there is.
4K gives you almost four times the resolution of 1080p, so you will most certainly see a difference in clarity.
Running a quick test, at 1080p the default camera app captures video at a bitrate of 20.0 Mbit/s.
In contrast, at 4K @ 30 FPS it captures video at 41.9 Mbit/s.
So the capture size versus the bitrate is certainly going to be a deciding factor.
But there are other ways to achieve an optimal capture at 1080p.
If Xiaomi gave us the ability to use High Efficiency Video Coding (H.265) and dictate our own bitrate, you'd see better captures at 1080p.
My only other suggestion would be to seek out a third-party camera application (GCam perhaps?) and test further.
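As a rough way to compare how much encoded data each mode gets to spend per pixel, here is a quick Python calculation (my own arithmetic, using the 20.0 and 41.9 Mbit/s figures measured above; the 60 fps row assumes the same 20 Mbit/s bitrate, which may not match what the actual camera app uses):

```python
# Bits of encoded data available per pixel per frame, from the bitrates quoted above.
def bits_per_pixel(bitrate_mbit: float, width: int, height: int, fps: int) -> float:
    return bitrate_mbit * 1_000_000 / (width * height * fps)

modes = [
    ("1080p @ 30 fps, 20.0 Mbit/s", 20.0, 1920, 1080, 30),
    ("1080p @ 60 fps, 20.0 Mbit/s", 20.0, 1920, 1080, 60),
    ("4K    @ 30 fps, 41.9 Mbit/s", 41.9, 3840, 2160, 30),
]
for label, rate, w, h, fps in modes:
    print(f"{label}: {bits_per_pixel(rate, w, h, fps):.3f} bits/pixel")

# 1080p30: ~0.322 bits/pixel
# 1080p60: ~0.161 bits/pixel (half the data per pixel at 60 fps)
# 4K30:    ~0.168 bits/pixel
```

Encoding budget is only part of the noise story (sensor readout and noise reduction differ between modes too), but it shows why 1080p60 footage can look worse than its resolution alone would suggest.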
A_H_E said:
4K gives you almost four times the resolution of 1080p, so you will most certainly see a difference in clarity... At 1080p the default camera app captures at about 20.0 Mbit/s, whereas at 4K @ 30 FPS it captures at 41.9 Mbit/s... My only other suggestion would be to seek out a third-party camera application (GCam perhaps?) and test further.
Is there an app that supports 1080p@60fps? There is no point in recording at 1080p@30fps when there is 4K, but I like the idea of 60 FPS.
