Hello there,
I am just curious if there's any way to disable VSync in the SGX 540 driver. 3D applications are limited to 56 FPS, so the Samsung Galaxy S can't win some benchmarks because of this ridiculous restriction. My SGS always scores about 55.7 FPS or very close to 56, while the Droid X has a 60 or 62 Hz refresh rate, so even though it has a weaker GPU (SGX 530) it can still beat the SGS in some less demanding benchmarks. The SGS screen refresh rate is 56 Hz... I wish we could unleash the real power of the SGX 540, which is currently *STRONGLY* limited.
Maybe there are cfg files with options:
VSync=0/1
Or:
MaxRefreshRate=56 etc.
Or something like that... Can somebody look at it?
Thanks in advance for any answers!
Best regards,
John
While I'm sure that if the framerate were unlocked it would beat every other current Android device in GPU benchmarks, in real-world use this wouldn't mean a thing, since the screen itself has a refresh rate of 56 Hz... so visually it would be impossible to see any difference other than screen tearing.
It's a limit to save battery. I would like to be able to turn it off/on (for benchmarks), but for real life anything above 56 FPS is just a waste of power because of the 56 Hz screen.
(But I prefer 200 FPS over 60 FPS on my PC with its 60 Hz monitor; somehow I DO notice the difference.)
jaapschaap said:
It's a limit to save battery. I would like to be able to turn it off/on (for benchmarks), but for real life anything above 56 FPS is just a waste of power because of the 56 Hz screen.
(But I prefer 200 FPS over 60 FPS on my PC with its 60 Hz monitor; somehow I DO notice the difference.)
That's called the placebo effect. It's literally impossible for you to see it, you just 'notice' it because you're aware that the framerate is higher, so your mind convinces you that you can see the difference.
AXIS of Reality said:
That's called the placebo effect. It's literally impossible for you to see it, you just 'notice' it because you're aware that the framerate is higher, so your mind convinces you that you can see the difference.
Let's not argue about whether the human eye can see the difference between 56 and 200 FPS. Let's try to find out if there's a way to turn VSync off and see what the SGX 540 is capable of. Since it's a dedicated GPU, there should be some external panel control or cfg file with options, like NVIDIA and ATI graphics cards have.
Damn, I wish I could see 80-100 FPS in Neocore and a GL Benchmark score that kicks the Droid X's ass (which currently beats the SGS because of the damn screen refresh rate)... It's really possible!
FPS perception is discussed a lot elsewhere... don't bring it here.
The tweak would be nice, but just for benchmarks (showing them mine is bigger) or for playing with the charger connected.
This is a mobile device; we have a limited power source.
On PC we have competitive games with matchups against players around the world, where winning or losing means something, so people will do anything to do better. But in single-player... on a 'phone'? Doesn't matter to me.
xan said:
FPS perception is discussed a lot elsewhere... don't bring it here.
The tweak would be nice, but just for benchmarks (showing them mine is bigger) or for playing with the charger connected.
This is a mobile device; we have a limited power source.
On PC we have competitive games with matchups against players around the world, where winning or losing means something, so people will do anything to do better. But in single-player... on a 'phone'? Doesn't matter to me.
Mate! It doesn't matter for power usage whether it's 56 FPS or 200 FPS! The GPU is doing the same job with the same power draw but is capped by VSync. It's a software (driver-level) limitation, not a hardware one.
Why don't you just find a more graphics-intensive benchmark instead? It's pointless to turn off VSync even if we could.
Sent from my GT-I9000 using XDA App
+1 for disabling VSync...
It's the reason for these GLBenchmark results (see the attached screenshot):
So what will you guys accomplish by getting higher scores in a synthetic benchmark that doesn't actually represent real speed?
I wish people would stop putting such an emphasis on synthetic benchmarks...
ed10000 said:
Why don't you just find a more graphics-intensive benchmark instead? It's pointless to turn off VSync even if we could.
Sent from my GT-I9000 using XDA App
Testing purposes, to see how fast the GPU really is, and to kick the Droid X's ass. ;-) We can't say anything about the real performance of the SGX 540... all we can see is that every 3D test we run on the SGS hits ~56 FPS...
Mate, I could ask why people overclock their CPUs. Why do people overclock their Galaxy S CPU to 1.2 GHz? It's pointless in some ways (battery usage, it's hard to see the difference without benchmark scores, etc.). But for power users - people just like me - it's about testing the limits of the hardware.
I am going to check the possibility on the PowerVR forum and keep you updated.
The worst thing is that I can cool my PC hardware with liquid nitrogen, but I don't know mobile software well enough to pull off the trick.
+1 to try and disable VSync! I'd really like to know what the GPU is really capable of...
amir_rafie said:
+1 for disabling VSync...
It's the reason for these GLBenchmark results (see the attached screenshot):
With tables like this one, uninformed people may get the impression that the Galaxy S is the worst of the current high-end devices.
In the /system/build.prop file, there is a line called:
windowsmgr.max_events_per_sec=55
Change it to 60 and see if the framerates go up? Quadrant reports the Galaxy S screen refresh rate as 68 Hz.
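If you want to try this over adb instead of a root file manager, something like this should work (a rough sketch: it assumes root with a remountable /system, and you should back up build.prop first):
Code:
adb remount                        # make /system writable (needs root/insecure adbd)
adb pull /system/build.prop
# edit the pulled file locally so the line reads:
#   windowsmgr.max_events_per_sec=60
adb push build.prop /system/build.prop
adb reboot                         # the property is only read at boot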
Changed it. The Quadrant GFX test is the same, 56 FPS max.
The config file should be somewhere in the OpenGL driver folder I think. I can't check it out because I am at work at the moment...
fua said:
Changed it. The Quadrant GFX test is the same, 56 FPS max.
I changed it to 68, and although the Quadrant test didn't go above 57 FPS, I got higher scores. On stock JPC I got around 950 or so; with this I got about 1000.
The UI is definitely smoother though.
hardcore said:
I changed it to 68, and although the Quadrant test didn't go above 57 FPS, I got higher scores. On stock JPC I got around 950 or so; with this I got about 1000.
The UI is definitely smoother though.
Interesting! ;-) So we can adjust the UI FPS right now. Looks like it's possible to make scrolling ultra smooth.
Nice, please add all the findings to post 1 so we can find them more easily later.
amir_rafie said:
Nice, please add all the findings to post 1 so we can find them more easily later.
Later. But there's still no idea for working around the VSync problem.
Related
Hey guys,
I've recently flashed this kernel: http://forum.xda-developers.com/showthread.php?t=975349
and I'm getting Quadrant scores between 1400 and 1600.
Is there any way to increase them?
I read somewhere they can be increased using a file named 'build.prop'.
I am on Darky's 10.1 using DarkCore, and I'm getting about 1200 with EXT4 lagfixes enabled.
I used to get 1900-2100 back on stock Froyo with OCLF 2.1.
If it's only about increasing the Quadrant score, you could download the Tegrak overclock tool from the Market and increase the CPU to 1.25 GHz.
Or, before that, you can disable the lagfix, switch to CF-Root, enable all tweaks (including the media/Stagefright one), and then apply the overclock; it'll hit around 2350...
Use Damian's latest kernel; Quadrant scores of more than 3000 are reported (use JVP as your firmware). Hope this helps.
I just can't understand why people are so obsessed with Quadrant scores. Frankly, on a stock ROM with ext4 the normal score is around 1500, and with a custom ROM you can get around 1700-1800 - that's it (and you can gain around 100 more by removing the journaling of the ext4 filesystem). Quadrant isn't a reliable benchmark, and people should stop striving for higher Quadrant scores.
Anyway, if you want high Quadrant scores, just enable Stagefright, if you don't mind not being able to play certain video and audio formats on your phone - at least you'll be able to brag about your scores.
Quadrant scores shouldn't be an obsession... merely a reference once in a while.
ROM: Miui MCGv 6.13.1
Kernel: Neo7
Quadrant's Score: 1737
JunaidZaka said:
Hey guys,
I've recently flashed this kernel: http://forum.xda-developers.com/showthread.php?t=975349
and I'm getting Quadrant scores between 1400 and 1600.
Is there any way to increase them?
I read somewhere they can be increased using a file named 'build.prop'.
Try the F1 JVP ROM as a base and experiment with SpeedMod kernels; I got up to a stable 1.5 GHz with my phone (but very short battery life):
http://forum.xda-developers.com/showthread.php?t=1058814
Also make sure all apps and processes are closed using the task manager. Then start Quadrant; I got around 2900+ using JVP.
ROM: Miui MCGv 6.13.1
Kernel: Neo7
Quadrant's Score: 1957
No OC, but UV between -50 and -125
ROM: XXJVH
Kernel: GTO
Benchmark: 3268
Mods need to start closing some of these Quadrant threads, because there are way too many.
Quadrant scores are great if you are simply chasing a placebo (or want the opportunity to grow your ePenis). If you want a phone which is productive, though, throw the benchmark in the bin and run a blind test.
You are only testing whether Quadrant is faster, not whether other applications are. It is possible that optimising for Quadrant may lead to other applications being dramatically slower (because Quadrant isn't detailed enough). You will notice that the Wikipedia article is almost exclusively about the inaccuracies of benchmarking, and that developers generally don't use standard benchmarks (almost every developer will use a test suite that pops out pages of "user unfriendly garbage").
Quadrant simply doesn't provide enough information to tell you which operations are faster.
An optimisation might yield a 6x speedup in directory creation at the cost of a 2x slowdown in file writing. If Quadrant is coded poorly and weighs the tests equally, it might tell you the optimisation was worth it (naively averaging the 6x gain against the 2x loss into an overall 'speedup'). However, in normal usage you might write to files 10,000x more often than you create directories, so in the real world it might actually be slower.
It's even less worth it when you consider that the time most people waste running benchmarks and optimising their phone (sometimes an hour or so) is never regained in productivity.
That is just an opinion though
Auzy said:
Mods need to start closing some of these Quadrant threads, because there are way too many.
Quadrant scores are great if you are simply chasing a placebo (or want the opportunity to grow your ePenis). If you want a phone which is productive, though, throw the benchmark in the bin and run a blind test.
You are only testing whether Quadrant is faster, not whether other applications are. It is possible that optimising for Quadrant may lead to other applications being dramatically slower (because Quadrant isn't detailed enough). You will notice that the Wikipedia article is almost exclusively about the inaccuracies of benchmarking, and that developers generally don't use standard benchmarks (almost every developer will use a test suite that pops out pages of "user unfriendly garbage").
Quadrant simply doesn't provide enough information to tell you which operations are faster.
An optimisation might yield a 6x speedup in directory creation at the cost of a 2x slowdown in file writing. If Quadrant is coded poorly and weighs the tests equally, it might tell you the optimisation was worth it (naively averaging the 6x gain against the 2x loss into an overall 'speedup'). However, in normal usage you might write to files 10,000x more often than you create directories, so in the real world it might actually be slower.
It's even less worth it when you consider that the time most people waste running benchmarks and optimising their phone (sometimes an hour or so) is never regained in productivity.
That is just an opinion though
Yes, you're right and I know it, but it's still something that can show some of the phone's power. (Sorry for the bad English.)
Now, with MIUI MCGv6.15 and the Neo7 kernel, I get 1939 on Quadrant.
Quadrant doesn't tell you anything about speed, and you can easily cheat on that test.
galaxysdev said:
Quadrant doesn't tell you anything about speed, and you can easily cheat on that test.
Right. But if the devs or SGS owners don't cheat the test, it could be credible.
You're all right, but anyway...
Quadrant : 2843
Antutu Benchmark : 2543
JVP Deodex , Alpha Damian 1 2.2
alasth said:
You're all right, but anyway...
Quadrant : 2843
Antutu Benchmark : 2543
JVP Deodex , Alpha Damian 1 2.2
Which kernel are you on?
[EDIT] I see it now. Sorry ^^
JunaidZaka said:
Hey guys,
I've recently flashed this kernel: http://forum.xda-developers.com/showthread.php?t=975349
and I'm getting Quadrant scores between 1400 and 1600.
Is there any way to increase them?
I read somewhere they can be increased using a file named 'build.prop'.
Okay, wouldn't a better question be: what's the fastest ROM?
I'm still on GingerReal because it's a combination of a stable and a fast ROM. GingerReal's Quadrant score isn't that high, somewhere around 1500-1600, but I never get force closes, and loading files etc. goes like crazy.
I tried CyanogenMod and MIUI, which are both great ROMs, but they aren't as stable and still have some minor known issues. I'll probably switch once one of them has a stable release.
What is working best for you guys (and girls, of course)?
Sent from my GT-I9000 using XDA App
Thanks for all your replies, guys.
I think the better question would've been: what's the fastest ROM in terms of performance?
Because I do experience a little bit of lag while playing some games.
nrbuitenhius said:
Okay, wouldn't a better question be: what's the fastest ROM?
I'm still on GingerReal because it's a combination of a stable and a fast ROM. GingerReal's Quadrant score isn't that high, somewhere around 1500-1600, but I never get force closes, and loading files etc. goes like crazy.
I tried CyanogenMod and MIUI, which are both great ROMs, but they aren't as stable and still have some minor known issues. I'll probably switch once one of them has a stable release.
What is working best for you guys (and girls, of course)?
Sent from my GT-I9000 using XDA App
I will try that ROM
Antutu
GL Benchmark
CNET UK said:
To see how it stacks up against the competition, I booted up my benchmark test and hit go. On the Geekbench test, it returned a frankly astonishing score of 1,975, putting it just below the powerhouse Galaxy Note 2 and far above the S3. It did similarly well on the CF-Bench test, where it managed to achieve 13,207, again well clear of the S3.
CNET UK said:
Processor and battery: Though the Nexus 4's data speeds might not be blazingly fast, the 1.5GHz Qualcomm Snapdragon S4 Pro quad-core CPU makes its internal speed swift and smooth. Graphics-intense games like Riptide GP and Asphalt 7 played extremely well, launching and running with no stalls or hiccups. The games both displayed high frame rates with high-resolution graphics.
Because of the phone's ultrafast CPU, gameplay was crisp, smooth, and fast.
(Credit: Josh Miller/CNET)
In addition, average start time for the handset was about 23 seconds, and it took about 1.82 seconds to launch the camera. Browsing on Chrome was a lot smoother on this device than on the Optimus G for some reason. For instance, scrolling down Web pages was executed much more swiftly.
The GLBenchmark scores look promising, as you would expect. Qualcomm also has a record of drastically improving 3D performance over time with driver updates; let's hope they do the same with the Adreno 320.
AnTuTu, however, no longer seems like a good way to measure the performance of high-end smartphones.
The 3D and 2D scores are totally useless, as the tests are completely VSync-limited, meaning all new smartphones score the same.
It also doesn't seem right that the Tegra 3 gets a higher CPU score at the same clock speed.
It's about time someone developed a decent benchmark suite for Android.
This shouldn't be a surprise, since the Adreno 320's power sits mainly between the SGX 543MP2 and SGX 543MP4. So it has half the (GPU) power of the iPad 3, though it's sorta close to the iPhone 5 (SGX 543MP3). I'm actually surprised that Android phones haven't been using the SGX solution--it would've been epic if the Nexus 4 had the SGX Rogue.
Ace42 said:
This shouldn't be a surprise, since the Adreno 320's power sits mainly between the SGX 543MP2 and SGX 543MP4. So it has half the (GPU) power of the iPad 3, though it's sorta close to the iPhone 5 (SGX 543MP3). I'm actually surprised that Android phones haven't been using the SGX solution--it would've been epic if the Nexus 4 had the SGX Rogue.
Hmm, the Nexus 4 beat the iPad 3 in most of the tests,
but in the low-level benchmarks the iPad wins; not sure why.
Any Quadrant scores, please?
Ace42 said:
This shouldn't be a surprise, since the Adreno 320's power sits mainly between the SGX 543MP2 and SGX 543MP4. So it has half the (GPU) power of the iPad 3, though it's sorta close to the iPhone 5 (SGX 543MP3). I'm actually surprised that Android phones haven't been using the SGX solution--it would've been epic if the Nexus 4 had the SGX Rogue.
I'm pretty sure the GNex and Nexus S had SGX GPUs.
Anyone have the SunSpider scores compared to the iPhone 5? Thanks!
yahyoh said:
Hmm, the Nexus 4 beat the iPad 3 in most of the tests,
but in the low-level benchmarks the iPad wins; not sure why.
You have to take into account that the iPhone has a higher-resolution screen than the Nexus 4...
rkantos said:
You have to take into account that the iPhone has a higher-resolution screen than the Nexus 4...
No it doesn't?
EDIT: Oh, you probably meant iPad 3
BatteryCro said:
No it doesn't?
I'm certain he was referring to the iPad 3 that was referenced for comparison.
yahyoh said:
Hmm, the Nexus 4 beat the iPad 3 in most of the tests,
but in the low-level benchmarks the iPad wins; not sure why.
If we could normalize for the iPad 3's ridiculous resolution, it would win every test--AnandTech showed that the Nexus 4 sits between the iPad 2 and iPad 3 in terms of (GPU) power.
BennyJr said:
I'm pretty sure the GNex and Nexus S had SGX GPUs.
Yes, that's correct (including every Galaxy S1 phone, aka the Hummingbird chip). Though I meant to say that no Android phone has a *modern* SGX model like the 544/543.
rkantos said:
You have to take into account that the iPhone has a higher-resolution screen than the Nexus 4...
Yeah, if the iPad had a logical resolution then the scores would be different.
Damn, those are totally disappointing benchmarks compared to the Exynos 4412 in the Note 2.
My old S2 gets 1350 ms in SunSpider at stock CPU speed + CM10.
Edit: WTF, my S2 gets 1500 points in Vellamo.
I guess I'm slow. Can someone explain to me how the Nexus 4 and Optimus G are getting such different scores? The only thing I can figure is that the kernel needs refining on the Nexus.
estallings15 said:
I guess I'm slow. Can someone explain to me how the Nexus 4 and Optimus G are getting such different scores? The only thing I can figure is that the kernel needs refining on the Nexus.
Only thing that can explain it is the hardware is throttled back.
yahyoh said:
Damn, those are totally disappointing benchmarks compared to the Exynos 4412 in the Note 2.
My old S2 gets 1350 ms in SunSpider at stock CPU speed + CM10.
Edit: WTF, my S2 gets 1500 points in Vellamo.
My Galaxy S2 got 1960 in Vellamo
and 1200 ms in SunSpider running CM10 and the Siyah kernel,
but I wouldn't be too concerned; we already know that the S4 Pro is a beast, and what's more important is real-world performance.
I guess Android 4.2.1 will be out by the time the device arrives, which should improve JavaScript performance.
Venekor said:
Only thing that can explain it is the hardware is throttled back.
Or maybe the stock kernel is a piece of crap :silly:
yahyoh said:
Or maybe the stock kernel is a piece of crap :silly:
This
yahyoh said:
Or maybe the stock kernel is a piece of crap :silly:
Yeah, that's actually what I was thinking. Bring on that source code!
I remember when I first saw the GNex benchmarks, I said 'WTF is this crap'. Then after I bought it I flashed CM9, then CM10 or AOKP with the Franco or Trinity kernel, and damn, you could feel the phone working at full power.
Most likely overheating (thermal throttling). Google should have stayed with dual core.
http://www.engadget.com/2014/06/18/korean-samsung-galaxy-s5-has-qhd-snapdragon-805
Only for the Korean market.
Oh dear... soon there will be an influx of Samsung haters saying the usual things, like Samsung can go to hell, Samsung betrayed us, etc etc etc.
Me, I will continue to enjoy my bog-standard S5 until I decide next year what to replace it with.
Too bad it's a Korean exclusive for now.
I still don't see how anyone could be surprised about that.
Or has everyone forgotten about the S4 Advanced LTE-A already?
Honestly people, business as usual...
ShadowLea said:
I still don't see how anyone could be surprised about that.
Or has everyone forgotten about the S4 Advanced LTE-A already?
Honestly people, business as usual...
Did the S4 Advanced have a better screen than the S4?
Out of interest, will it ACTUALLY be that much faster, if any faster at all?
I'd be interested if anyone with the tech knowledge could chime in... because it seems to me that the slightly faster Snapdragon 805 is probably 'cancelled out' in effectiveness by the fact that the handset now has to power a quad-HD screen instead of a standard 1080p one.
Thoughts?
I can tell you for a fact that my gf's older MacBook Pro is snappier in day-to-day UI movements than my newer Retina MBP, which I'm guessing must be due to the added strain of those extra pixels.
jodvova said:
Did the S4 Advanced have a better screen than the S4?
So?? I don't get your logic.
paddylaz said:
Out of interest, will it ACTUALLY be that much faster, if any faster at all?
I'd be interested if anyone with the tech knowledge could chime in... because it seems to me that the slightly faster Snapdragon 805 is probably 'cancelled out' in effectiveness by the fact that the handset now has to power a quad-HD screen instead of a standard 1080p one.
Thoughts?
I can tell you for a fact that my gf's older MacBook Pro is snappier in day-to-day UI movements than my newer Retina MBP, which I'm guessing must be due to the added strain of those extra pixels.
There's been quite a bit of discussion in the M8 comparison thread about the impact of QHD on the LG G3. Cliff notes:
- Performance on the S-801 took a pretty big hit
- Battery life took a hit
- Display contrast, black levels, and reflectivity all took hits
The S-805 isn't a minor upgrade. It and the Adreno 420 should allow QHD to perform as well as, and most likely better than, S-801/1080p. So battery life and the quality of Samsung's QHD display are the questions left to answer. The M8 thread also has discussion on the value of going from 1080p to QHD, which is really pretty limited.
paddylaz said:
Out of interest, will it ACTUALLY be that much faster, if any faster at all?
I'd be interested if anyone with the tech knowledge could chime in... because it seems to me that the slightly faster Snapdragon 805 is probably 'cancelled out' in effectiveness by the fact that the handset now has to power a quad-HD screen instead of a standard 1080p one.
Thoughts?
I can tell you for a fact that my gf's older MacBook Pro is snappier in day-to-day UI movements than my newer Retina MBP, which I'm guessing must be due to the added strain of those extra pixels.
Performance isn't 'canceled out', since the 805 can handle Ultra HD (4K) screens. The following screenshot from Qualcomm's site summarizes the specs of both the 801 and 805: Snapdragon 805 breakdown
However, QHD content is currently limited to whatever bloat comes pre-installed from Samsung and wallpapers you can find online. As a result, people can't yet take full advantage of that high-res screen until app developers update their apps, and there are very few YouTube videos above 1080p.
3GB RAM vs 2GB in the S5.
Great Samsung. Great.
That phone is a beast. But so is the regular Galaxy S5. I don't think people should be upset, really; it's business. If you enjoy your phone, then just enjoy it.
Sent from my SM-G900T using Tapatalk
There will be little or no native QHD (2K) content. Commercial content will be 1080p or UHD (4K). That means it'll be up- and down-scaled, which impacts image quality. QHD is a marketing ploy of questionable value. Lots of reasons supporting this in the M8 comparison thread.
This is indeed a good read regarding the Snapdragon 805's CPU and GPU power:
http://www.anandtech.com/show/8035/qualcomm-snapdragon-805-performance-preview
From there I got a couple of important points:
- The 805 can drive a QuadHD device at the same frame rate and performance as an 801 driving a 1080p device.
- It is said to use 20% less power and provide 40% more performance compared to the 800 SoC.
- The 805 has a small (comparatively) CPU boost and a significant GPU and video engine boost.
- The GPU tests are in there; it nailed pretty much everything.
- The 805 has an HEVC HW decoder, but full HEVC HW acceleration won't arrive until the 810 SoC comes out in H1 2015.
I personally did not know what HEVC was. It is actually the H.265 codec, which provides magnificent video output at a much lower bitrate than x264/H.264-encoded video.
Out of curiosity I downloaded Big Buck Bunny in 1080p encoded with HEVC, which was only 130 MB. The x264/H.264-encoded video available for download on Big Buck Bunny's official website is roughly 700 MB. The 130 MB file indeed looked great compared to its regular 700 MB variant!
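If you want to try the comparison yourself, re-encoding is a one-liner with ffmpeg (a rough sketch: it assumes an ffmpeg build with libx265, and the file names are just placeholders):
Code:
# re-encode an H.264 source to HEVC/H.265; a higher -crf means a smaller file at lower quality
ffmpeg -i bbb_1080p_h264.mp4 -c:v libx265 -crf 28 -preset medium -c:a copy bbb_1080p_hevc.mp4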
The only thing I'd like in my S5 is 3 GB of RAM. I do not want a QHD display, which will only drain extra battery without me even being able to tell the difference.
Apoxx said:
The only thing I'd like in my S5 is 3 GB of RAM. I do not want a QHD display, which will only drain extra battery without me even being able to tell the difference.
Why would you like 3 GB of RAM? Have you ever run out of it on your S5?
Sent from my SAMSUNG-SM-G900A using XDA Premium 4 mobile app
Yeah, a lot of times; many apps have to reload when multitasking. It's clearly not as good at multitasking as the Note 3, for instance.
WizeGuyDezignz said:
Why would you like 3 GB of RAM? Have you ever run out of it on your S5?
So true. Someone above wants the S-805 because it's "faster." Faster at what? 85% of apps don't use more than two cores, according to Qualcomm. Screen transitions and app openings certainly don't need an S-805. In every upgrade cycle (720p/S-600 <> 1080p/S-800 <> QHD/S-805) the potentially huge gains in performance and battery life ended up minor because of the resources consumed by the display. Let's see some benchmarks from the SGS5 LTE-A before everyone wets themselves over it. Display quality took a big hit on the LG G3. Let's see how Samsung does.
BarryH_GEG said:
So true. Someone above wants the S-805 because it's "faster." Faster at what? 85% of apps don't use more than two cores, according to Qualcomm. Screen transitions and app openings certainly don't need an S-805. In every upgrade cycle (720p/S-600 <> 1080p/S-800 <> QHD/S-805) the potentially huge gains in performance and battery life ended up minor because of the resources consumed by the display. Let's see some benchmarks from the SGS5 LTE-A before everyone wets themselves over it. Display quality took a big hit on the LG G3. Let's see how Samsung does.
Yeah, that's exactly why I asked. I don't think people understand how RAM works; they just want more because it sounds good.
Unused RAM is exactly that: unused RAM. No matter how many apps I've had open at once, I've never come near 2 GB of usage.
Sent from my SAMSUNG-SM-G900A using XDA Premium 4 mobile app
WizeGuyDezignz said:
Yeah, that's exactly why I asked. I don't think people understand how RAM works; they just want more because it sounds good.
Unused RAM is exactly that: unused RAM. No matter how many apps I've had open at once, I've never come near 2 GB of usage.
Sent from my SAMSUNG-SM-G900A using XDA Premium 4 mobile app
Same here. And it's a Korea exclusive; I don't know why people are whining.
Sent from my Galaxy S5
WizeGuyDezignz said:
Yeah, that's exactly why I asked. I don't think people understand how RAM works; they just want more because it sounds good.
Unused RAM is exactly that: unused RAM. No matter how many apps I've had open at once, I've never come near 2 GB of usage.
Sent from my SAMSUNG-SM-G900A using XDA Premium 4 mobile app
That's because the phone hibernates apps before it ever reaches full RAM usage. More RAM = more open apps, which allows for faster multitasking. You'd think this was basic knowledge by now.
I'm sure you've all switched to an open app only to have it reload like it was freshly opened.
Coming from an iPhone with 1 GB of RAM, I am overly aware of this issue.
And no, I do not want the new S5; I just wish Samsung had put 3 GB in the S5 in the first place, like they did in the Note 3.
I'm going to downscale the device resolution to 1080p for a few days to see if the 2K screen really affects the battery that badly. If you want to do this, download Resolution Changer from Google Play, set the screen resolution to 1080x1920 at 480 dpi, apply, and reboot (important: if you don't reboot, the UI will be messed up).
Going to edit this in 3 days. Cheers!
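If you'd rather not install an app, the same change can be made over adb with the stock window manager commands (a sketch: presumably this is what resolution-changer apps do under the hood, and it needs USB debugging enabled):
Code:
adb shell wm size 1080x1920    # render at 1080p instead of native 1440p
adb shell wm density 480       # match the DPI so UI elements keep their size
adb reboot                     # reboot so the UI redraws cleanly
# to go back to stock:
adb shell wm size reset
adb shell wm density reset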
Unless you only light up exactly 1920x1080 pixels on the screen, battery life is not going to change much.
Right now that resolution just causes the screen to upscale, so the only battery savings would come from the GPU working less to drive a lower resolution. The screen, which is the main battery hog, will still draw power as normal.
Unless you are 3D gaming at 2K, there will be little savings, if any. 2D is pretty easy to render at 2K resolution. And the GPU is probably optimized for 2K and NOT 1080p, so you might actually see a performance decrease running lower resolutions, like the old days of the Voodoo2 GPU on PC. I don't know... benchmark it and see.
Sublation said:
Unless you are 3D gaming at 2K, there will be little savings, if any. 2D is pretty easy to render at 2K resolution. And the GPU is probably optimized for 2K and NOT 1080p, so you might actually see a performance decrease running lower resolutions, like the old days of the Voodoo2 GPU on PC. I don't know... benchmark it and see.
There is no such thing as optimization for a specific resolution.
Lodix said:
There is no such thing as optimization for a specific resolution.
Good job you work for Samsung, are privy to their device drivers etc., and cleared that up for us.
Jonathan-H said:
Good job you work for Samsung, are privy to their device drivers etc., and cleared that up for us.
Exactly that.
To the moderators: just close this thread before it becomes a cesspool.
scoopdreams said:
Unless you only light up exactly 1920x1080 pixels on the screen, battery life is not going to change much.
Right now that resolution just causes the screen to upscale, so the only battery savings would come from the GPU working less to drive a lower resolution. The screen, which is the main battery hog, will still draw power as normal.
Unfortunately, this is the correct response. Even though the human eye cannot spot the difference between 2K and 1080p unless the screen is right up against your eyes (there have been several articles about it on the XDA portal... interesting reading), lowering the resolution on this device is useless: your phone will still power the extra pixels (they are still there), the GPU just won't process them.
2K was and will be a marketing gimmick to sell more phones, just like 64-bit.
Hopefully we'll see a 5.5" flagship this year with 1080p - the same view as 2K on a 5.5" screen - and 6+ hours of SOT easily.
scoopdreams said:
Unless you only light up exactly 1920x1080 pixels on the screen, battery life is not going to change much.
Right now that resolution just causes the screen to upscale, so the only battery savings would come from the GPU working less to drive a lower resolution. The screen, which is the main battery hog, will still draw power as normal.
That is correct, but there's one more thing:
the GPU won't work less. Whatever power is saved by not having to draw the full size of the screen will be spent upscaling the 1920x1080 image to 2560x1440. I bet this upscaling can draw even more power from the battery than simply letting it work at its native resolution.
So, if your intention is to save power, reducing the resolution is a bad idea, and it is totally useless.
The only reason to lower the resolution is if you have eyesight problems.
tal123 said:
Unfortunately, this is the correct response. Even though the human eye cannot spot the difference between 2K and 1080p unless the screen is right up against your eyes (there have been several articles about it on the XDA portal... interesting reading), lowering the resolution on this device is useless: your phone will still power the extra pixels (they are still there), the GPU just won't process them.
2K was and will be a marketing gimmick to sell more phones, just like 64-bit.
Hopefully we'll see a 5.5" flagship this year with 1080p - the same view as 2K on a 5.5" screen - and 6+ hours of SOT easily.
With the Snapdragon 810, forget about it xD
The LG G Flex 2 has poor battery life considering its Full HD screen, 3000 mAh battery, and the supposedly lower-power Snapdragon 810.
Lodix said:
With the Snapdragon 810, forget about it xD
The LG G Flex 2 has poor battery life considering its Full HD screen, 3000 mAh battery, and the supposedly lower-power Snapdragon 810.
''Thankfully, the battery isn't as persnickety as the rest of the phone: It typically managed about 13 hours on a single charge, all while I was futzing around in HipChat and Hangouts, firing off emails from CloudMagic and playing the occasional documentary in the background on YouTube. When it came time for our standard video-rundown test (all together now: looping a 720p video with screen brightness set to 50 percent), the new G Flex held out for ten hours and 13 minutes before needing a top-up. As it turns out, the charger's no slouch either; it takes the phone from 0 to 50 percent in about an hour, and it'll top off the battery completely in less than an hour after that.''
http://www.engadget.com/2015/02/18/lg-g-flex-2-review/
I'm reading mixed reviews about the LG G Flex 2. However, all in all, I don't think 10 hours of SOT at 50% brightness is anything less than amazing (they didn't include a chart comparing it to other phones, but personally I don't get more than 5.5 hours of SOT on the Note 4 no matter what).
tal123 said:
''Thankfully, the battery isn't as persnickety as the rest of the phone: It typically managed about 13 hours on a single charge, all while I was futzing around in HipChat and Hangouts, firing off emails from CloudMagic and playing the occasional documentary in the background on YouTube. When it came time for our standard video-rundown test (all together now: looping a 720p video with screen brightness set to 50 percent), the new G Flex held out for ten hours and 13 minutes before needing a top-up. As it turns out, the charger's no slouch either; it takes the phone from 0 to 50 percent in about an hour, and it'll top off the battery completely in less than an hour after that.''
http://www.engadget.com/2015/02/18/lg-g-flex-2-review/
I'm reading mixed reviews about the LG G Flex 2. However, all in all, I don't think 10 hours of SOT at 50% brightness is anything less than amazing (they didn't include a chart comparing it to other phones, but personally I don't get more than 5.5 hours of SOT on the Note 4 no matter what).
Personal usage is subjective, and I don't care about it.
It's not 10 hours of SOT in normal usage; it was a test.
GSMArena, which does pretty good battery test comparisons, shows mediocre scores compared to what you'd expect.
On the same standard test, Engadget says the Note 4 gets 13 hours.
But 50% brightness on the Note 4 is twice as bright (291 nits) as on the G Flex 2 (152 nits).
So the results are pretty poor on the G Flex 2, considering both have the same max brightness.
tal123 said:
.......
2K was and will be a marketing gimmick to sell more phones, just like 64-bit.
Hopefully we'll see a 5.5" flagship this year with 1080p - the same view as 2K on a 5.5" screen - and 6+ hours of SOT easily.
Once you use a phone with Gear VR, you will know that even 2K is not enough.
Sent from my SM-N910T using XDA Free mobile app
And you also have to keep in mind the PenTile matrix. An RGB panel with the same resolution looks sharper.
galaxynote2 said:
I'm going to downscale the device resolution to 1080p for a few days to see if the 2K screen really affects the battery that badly. If you want to do this, download Resolution Changer from Google Play, set the screen resolution to 1080x1920 at 480 dpi, apply, and reboot (important: if you don't reboot, the UI will be messed up).
Going to edit this in 3 days. Cheers!
AnTuTu results:
48715 at 2K resolution
52553 at 1080p
Interesting... but the Samsung keyboard size is bad. How can I fix it?
Thanks
nacholo said:
AnTuTu results:
48715 at 2K resolution
52553 at 1080p
Interesting... but the Samsung keyboard size is bad. How can I fix it?
Thanks
I get 52526 at 2K:
Amazing score! It's possible to get more than 56k on your phone at 1080p!!
Sent from my SM-N910F via Tapatalk
My N910F scored lower at 1080p than it did at 2K.
The highest score is at the default resolution.
Hi!
So, I found out how the 8 cores in the P8 Lite work!
First of all, based on the Cortex-A53, the octa-core 64-bit Kirin 620 processor supports HDR photography and 1080p HD video encoding and decoding. The chipset also supports ultra-fast LTE Cat4 with downlink speeds of up to 150 Mbps. Its performance is comparable to Qualcomm's Snapdragon 410.
So, how does it work in the phone?
I installed the System Panel 2 app and watched the CPUs for a while. It works like this: the first block of 4 cores is tied to foreground apps (the ones running up front), while the second block of 4 cores handles background apps and tasks, for example installing an app. In the image below I was updating from Google Play; you can see the first block of 4 cores almost sleeping and the second block almost at 100%.
You can confirm this by the graph colors.
So in a benchmark the scores come from the first block of 4 cores; the phone offloads the less important apps onto the second block of 4 cores.
In other words, the first block of 4 cores are masters and the second block of 4 cores are slaves!
The kernel decides what each core does.
Will this help performance? I think so, but not a huge amount.
This explains why benchmark apps give the same scores as a 4-core CPU: they are only using the first block of 4 cores...
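You don't even need a monitoring app to check this; the kernel exposes it through sysfs. A quick sketch from a root shell (these are the standard Linux cpufreq paths, assuming the Kirin 620 kernel exposes them like most ARM SoCs do):
Code:
# which cores are currently online (1 = online, 0 = hotplugged off)
cat /sys/devices/system/cpu/cpu*/online
# current frequency of each online core, in kHz
cat /sys/devices/system/cpu/cpu*/cpufreq/scaling_cur_freq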
ExtremeTech said:
Applications like Geekbench have become popular as a way to demonstrate the theoretical performance of smartphone SoCs, but real-world application testing shines a different light on things (Moor Insights also tested camera and chat applications to round out their multi-core evaluation). As things stand, there are some benefits to quad-core devices and virtually no gains from octa-core. In a few cases, moving to more cores actually makes things worse. The overall situation will depend on which applications you use, but the relentless push to add cores is doing end-users no favors.
We don't have big.LITTLE...
dominj97 said:
We don't have big.LITTLE...
But it works the same way...
persona78 said:
But it works the same way...
I don't get how it saves power, or does anything else good, for this device's processor to work only 4 cores at a time when all 8 cores are similar to each other. Maybe I read a bit hastily, but I only found stuff about how it helps power saving when the cores aren't all the same.
keikari said:
I don't get how it saves power, or does anything else good, for this device's processor to work only 4 cores at a time when all 8 cores are similar to each other. Maybe I read a bit hastily, but I only found stuff about how it helps power saving when the cores aren't all the same.
You're right, what is the plan here? Same cores, but only 4 working, for power saving? It's something else, I guess... It's not big.LITTLE... And how can we activate all of them...?
That was stated a lot of times on this forum. The phone uses them all. Use an app like System Panel 2; it will show all of them.
I found this too...
ExtremeTech said:
Applications like Geekbench have become popular as a way to demonstrate the theoretical performance of smartphone SoCs, but real-world application testing shines a different light on things (Moor Insights also tested camera and chat applications to round out their multi-core evaluation). As things stand, there are some benefits to quad-core devices and virtually no gains from octa-core. In a few cases, moving to more cores actually makes things worse. The overall situation will depend on which applications you use, but the relentless push to add cores is doing end-users no favors.
https://www.extremetech.com/extreme...re-smartphone-isnt-as-fast-as-you-think-it-is
Vinnipinni said:
That was stated a lot of times on this forum. The phone uses them all. Use an app like System Panel 2; it will show all of them.
Do you need to do something to make it show all cores? It's only showing 4 cores online for me.
Just my 2p:
Set battery to smart (hope that's the name in the English version), then run Asphalt 8 (or another heavy app).
Set battery to performance, then run Asphalt 8 (or another heavy app).
You will see a lot of difference.
Set battery to smart, check the battery percentage, then use your phone as you always do; after 1 hour, check the battery percentage again.
Set battery to performance, check the battery percentage, then use your phone as you always do; after 1 hour, check the battery percentage again.
Use AnTuTu, and while it's testing the CPU, look at System Panel 2 and you will see all the cores working at 100%. In this capture it's not at 100%, but I have managed to get them all to 100%.
It has to be a stock ROM, or one that hasn't changed the kernel; most custom ROMs have a bug in the kernel and the CPU doesn't work well.
juanche007 said:
Use AnTuTu, and while it's testing the CPU, look at System Panel 2 and you will see all the cores working at 100%. In this capture it's not at 100%, but I have managed to get them all to 100%.
It has to be a stock ROM, or one that hasn't changed the kernel; most custom ROMs have a bug in the kernel and the CPU doesn't work well.
I tested it with CPU Monitor; I ran AnTuTu and nothing happened on the last 4 cores...
Check the image below: the high curve is when I was testing with AnTuTu... The 4 cores below are sleeping.
THEY ARE NOT SLEEPING. Most apps are not able to display all the cores. Run AnTuTu and try System Panel 2. PACPerformance is also able to display all cores. There are some others, but most aren't able to.
persona78 said:
I tested it with CPU Monitor; I ran AnTuTu and nothing happened on the last 4 cores...
Check the image below: the high curve is when I was testing with AnTuTu... The 4 cores below are sleeping.
Plus you can "unlock" the MHz: you can get a constant 1200 MHz by deleting/renaming something... Kernel Adiutor doesn't really help... I tested this too, and it worked. The only thing is that somehow I get a stable 22-25 FPS on some runs and a stable 40 on the next in 3DMark; I don't know why I sometimes get those low FPS numbers...
Vinnipinni said:
THEY ARE NOT SLEEPING. Most apps are not able to display all the cores. Run AnTuTu and try System Panel 2. PACPerformance is also able to display all cores. There are some others, but most aren't able to.
You are right, and so am I!
I installed the app that you use and watched the CPUs for a while, and it's like this: the first 4 cores are linked and handle the foreground apps, while the second block handles background apps and functions, like installing an app. In the image below I was updating from Google Play; you can see the first 4 cores almost sleeping and the second block of cores almost at 100%.
You can confirm this by the graph colors.
So in a benchmark the scores come from the first 4 cores, and the phone offloads the less important apps onto the second block of 4 cores.
In other words, the first 4 are masters and the second 4 are slaves.
persona78 said:
You are right, and so am I!
I installed the app that you use and watched the CPUs for a while, and it's like this: the first 4 cores are linked and handle the foreground apps, while the second block handles background apps and functions, like installing an app. In the image below I was updating from Google Play; you can see the first 4 cores almost sleeping and the second block of cores almost at 100%.
You can confirm this by the graph colors.
So in a benchmark the scores come from the first 4 cores, and the phone offloads the less important apps onto the second block of 4 cores.
In other words, the first 4 are masters and the second 4 are slaves.
Are games using all the cores, for example Minecraft PE, GTA SA, Mobile Legends, RR3, etc.?
Vinnipinni said:
THEY ARE NOT SLEEPING. Most apps are not able to display all the cores. Run AnTuTu and try System Panel 2. PACPerformance is also able to display all cores. There are some others, but most aren't able to.
PACPerformance shows wrong info. Or at least shows wrong info for offline cores. Can be tested by turning cores 1-3 off.
D1stRU3T0R said:
Are games using all the cores, for example Minecraft PE, GTA SA, Mobile Legends, RR3, etc.?
I'm sure they do; heavy games use many background functions like graphics, and as I read on some Huawei site, this CPU was built with media functions in mind, like filming in 1080p, pictures, etc.
So, I rewrote the main post.
Correct me if I'm wrong!
You should uninstall HWPowerGenie, or however it's called. Then all cores will be used the same, IIRC.
Vinnipinni said:
You should uninstall HWPowerGenie, or however it's called. Then all cores will be used the same, IIRC.
I have edited my HwPowerGenieEngine3 app, so I know what's inside it, but the app only controls the frequencies on idle, on performance, etc.; it doesn't control how the CPUs work.
Code:
<?xml version="1.0" encoding="utf-8"?>
<cpu_load version="1">
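<!-- Field meanings here are my best guess from the names: "upload" looks like the
     CPU load % threshold before ramping frequency up, "upchecktimes" the number of
     consecutive samples that must exceed it, "upcheckspace" the sampling interval
     in ms, and "maxchecktimes" a cap on checks; one <para> block per power mode
     ("modid"). -->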
<para modid="0">
<upload>80</upload>
<upchecktimes>2</upchecktimes>
<upcheckspace>1000</upcheckspace>
<upoffset>200000</upoffset>
<maxchecktimes>120</maxchecktimes>
</para>
<para modid="1">
<upload>90</upload>
<upchecktimes>2</upchecktimes>
<upcheckspace>2000</upcheckspace>
<upoffset>200000</upoffset>
<maxchecktimes>250</maxchecktimes>
</para>
<para modid="2">
<upload>90</upload>
<upchecktimes>2</upchecktimes>
<upcheckspace>2000</upcheckspace>
<upoffset>200000</upoffset>
<maxchecktimes>60</maxchecktimes>
</para>
<para modid="3">
<upload>95</upload>
<upchecktimes>3</upchecktimes>
<upcheckspace>2000</upcheckspace>
<upoffset>200000</upoffset>
<maxchecktimes>60</maxchecktimes>
</para>
</cpu_load>
And in the pictures you can see the files inside the app.