possible graphics performance enhancements - Nook Color General

Hi,
I spent the last couple of weekends trying to improve graphics performance on the nook color. I had two approaches and got stuck on both, but I thought I'd write them down here in case anybody else can figure out where I was going wrong.
16BPP mode:
The panel supports native 16bpp as far as I can tell (it seems that u-boot drives it in 16bpp). I tried disabling the CONFIG_FB_OMAP2_32_BPP kernel option and changing the numbers in panel-boxer.c and board-3621_boxer.c to request 16bpp mode. The kernel compiles and boots fine and does draw output on the LCD (and surfaceflinger on boot reports 565 16bpp mode); however, the LCD was turned off any time there was no animation running, with an error in the kernel log about a dispc GFX FIFO underflow. I spent some time trying to figure out how the FIFO high/low watermarks are calculated but wasn't able to get reliable output. There shouldn't be any underflow in 16bpp mode, as it ought to use less memory bandwidth than 32bpp mode to scan out to the LCD...
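The headroom rule I was trying to reproduce looks roughly like this in C (the constants and names are my assumptions for illustration, not the real omapdss values):

```c
#include <stdint.h>

/* Hypothetical sketch of a DISPC pipeline FIFO threshold scheme.
 * FIFO_SIZE_ENTRIES and BURST_ENTRIES are assumed values, not the
 * actual OMAP3 figures.                                             */
#define FIFO_SIZE_ENTRIES 1024 /* GFX pipeline FIFO depth (assumed)   */
#define BURST_ENTRIES       16 /* entries fetched per burst (assumed) */

/* HIGH threshold: stop refilling one burst before the FIFO is full. */
static uint32_t fifo_high_threshold(void)
{
    return FIFO_SIZE_ENTRIES - BURST_ENTRIES;
}

/* LOW threshold: request a refill while two bursts of headroom remain.
 * If this is set too low for the scan-out rate, the FIFO drains to
 * empty and dispc reports exactly the GFX FIFO underflow seen here.  */
static uint32_t fifo_low_threshold(void)
{
    return FIFO_SIZE_ENTRIES - 2 * BURST_ENTRIES;
}
```

If the thresholds end up being computed for one pixel depth while the pipeline is fed another, the refill request can come too late, which would be consistent with an underflow despite 16bpp's lower bandwidth.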
USE_COMPOSITION_BYPASS:
Last year it looks like some work was done for the Samsung C110 SoC (as used in the Nexus S, etc.) to allow fullscreen OpenGL apps to draw directly to the framebuffer, instead of drawing to a surface and having surfaceflinger copy that surface to the framebuffer. This could be a huge graphics performance win, as it avoids a fullscreen copy every frame. I tried enabling it in a CM7 build (and fixed the couple of compile errors relating to the changed surface locking semantics -- basically just remove unlockClients() and add hw.compositionComplete()); however, as soon as SurfaceFlinger determined that a Surface could be given the framebuffer, the client of that surface died. I found that the GraphicBuffer allocated in Layer.cpp:381 had an fd of -1, which binder was refusing to send over to the client because -1 isn't a valid fd. This fd comes back out of gralloc.omap3.so, which is closed source.
I'm not sure why the fd is -1, since it seems the buffer was allocated successfully -- I guess there's no reason to have a valid fd for a framebuffer-backed GraphicBuffer normally because they always stay in surfaceflinger. I thought of maybe just opening /dev/graphics/fb0 and using that fd, but didn't have time to try it.
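The fallback I had in mind would look something like this (`open_fb` is a hypothetical helper; whether gralloc and binder would then accept this fd is exactly the part I didn't get to test):

```c
#include <fcntl.h>
#include <unistd.h>
#include <sys/ioctl.h>
#include <linux/fb.h>

/* Open the framebuffer device directly to get a real, binder-passable
 * fd for a framebuffer-backed GraphicBuffer. Returns the fd, or -1.  */
static int open_fb(const char *path)
{
    int fd = open(path, O_RDWR);
    if (fd < 0)
        return -1;

    /* sanity check that this really is a framebuffer device */
    struct fb_var_screeninfo vinfo;
    if (ioctl(fd, FBIOGET_VSCREENINFO, &vinfo) < 0) {
        close(fd);
        return -1;
    }
    return fd;
}
```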
Anyway, that's as far as I got on those two ideas before I ran out of time. Hopefully somebody else can pick up where I left off. The composition bypass is probably the best optimization if it works, because it's essentially free and would cause no decrease in graphics quality (like 16bpp mode would). It would only benefit fullscreen (no system bar) OpenGL apps (like most games).
Also, I would have posted in Android Development, but I haven't posted 10 times yet...

Related

Delay by touchscreen, not only by missing video drivers

Provoking headline, no? Well, first things first - I also agree to the position that there should be suitable video drivers for the Kaiser; I signed the petition and all. Nevertheless, I may have found another culprit:
I found that the frame rate drops noticeably as long as you tap on the screen (when running "Quake2", for instance). This tells me that the touchscreen is only fully active while a touch is being sensed - at least I wouldn't design it much differently. Sensing the mere presence of a touch is a relatively straightforward, fast and power-saving process, as long as you can ignore the position. But determining the X & Y positions requires a certain sequence of events. Depending on the type of touchscreen and the supporting hardware, there may be certain processing delays.
Now my suspicion is that the processor has to handle the touchscreen all by itself, just with the help of an A/D converter and a few port lines. Which means it may be subjected to a significant processing load whenever the exact position of a touch has to be determined.
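To sketch what I mean: sensing presence is one cheap comparison, but each axis needs its own bias-settle-sample pass through the A/D converter before the raw reading can even be scaled to pixels. The calibration numbers below are made up for illustration:

```c
/* Map a raw touchscreen ADC reading onto screen pixels, given the
 * calibrated raw range for that axis. On a 4-wire resistive panel this
 * runs once per axis, after biasing the panel and waiting for the
 * voltage to settle - that sequence is where the CPU time goes.      */
static int adc_to_screen(int raw, int raw_min, int raw_max, int screen_px)
{
    if (raw <= raw_min)
        return 0;
    if (raw >= raw_max)
        return screen_px - 1;
    return (raw - raw_min) * (screen_px - 1) / (raw_max - raw_min);
}
```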
Awaiting your comments...
Buster
2 theories about that.
first: if i disable the TouchFlo software, there should be no issue after that, just like the thousands of other PDAs with "stylus only" touchscreens.
second: if the drivers were available, the processor would have more capacity for calculating the stylus/finger position and everything might run smoother...
also i have noticed that the issue differs between windows - the program screen is laggier than the settings screens, for instance
joolsthebear said:
1st: TouchFlo is off in my case. 2nd: That's right. I should have added that the drivers may be responsible as well (do you have good ones ;-) ??). But even with the best possible drivers there will be a certain maximum detection speed, depending on the touchscreen architecture (well, a resistive one shouldn't represent much of a problem). I only wondered whether the overall detection speed is so low that it can impact video rendering. Last paragraph: that's another factor (the window handler, I'd say), apart from touchscreen processing - but only as long as those handlers are running. Certain games take over the system almost completely, so there won't be many resources available for normal windows processing.
yes, this issue has been covered many times over, even at htcclassaction.org
i think the slowness of scrolling the programs menu is not due to missing graphics drivers, as the settings menu exhibits no such lag. i suspect the programs menu is jerky because the system must reload icons/data as items come into view - i believe this is a caching (or lack thereof) issue.
the touchscreen takes up a lot of cpu and as far as i can tell it doesn't matter what program has focus. touching the screen and moving my finger around produces around 50% cpu usage with both touchflo turned on and off (via the settings icon).
fusi said:
That's what I meant.
Buster

A clarification on missing drivers - my thoughts - you won't like them.

Well, i was here from the beginning. I didn't start the petition, but i posted it and asked for a sticky. I was one of the first few that posted about the lack of 3D. I'm very angry about my slow crappy Kaiser. But...
A BIG BUT:
- The "drivers" would NOT solve the slow video we see in TCPMP, since TCPMP decompresses in software. And any release of CorePlayer has the same performance. Maybe one released after the "drivers" are installed would be better.
- The "drivers" would NOT make the device score higher in the SPB benchmark.
We don't actually need "drivers". We need a build of WM6 (WM6.1) which is using a better SDK (libraries) from Qualcomm.
This is what i think needs fixing:
- Accurate VSync - so we don't get tearing anymore - this may also solve some slow programs.
- Better implemented sound library - i can't believe nobody complained about the sound - which is the worst, ever, in the whole world, in the universe. It sounds like an old radio. A broken old radio. A broken old radio in a Faraday cage tuned to the wrong frequency...
BTW: the 4.3k graphics score that the Kaiser got, and any other graphics benchmark, is VERY VERY FAKE - they say 40+ FPS in some tests, but i see 5 FPS on my screen (trust me, i know, i'm a render programmer in the games industry, i have an eye for these). What is actually happening is that the program says "draw this on the screen", and the hardware says "done" about 400 times a second without actually rendering anything - this is why i say we need accurate VSync.
I say this again - if we had 100% working drivers on our Kaiser right now, and you try them with TCPMP -> no difference, trust me. If you run Quake for PPC -> no difference, again. You would probably never know you had HW acceleration. The 2D HW part would be noticed in the Windows GUI (maybe) and in some programs (very few - those that use DirectDraw, and use it correctly)
Maybe we should try asking for what i said - no tearing, better sound, and maybe we'd get it.
Please, please don't post if you don't understand what i'm saying. This is for the big boys
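To make the VSync point concrete: an honest benchmark has to time the frames that actually reached the panel, not the draw calls the hardware "accepted". A sketch (hypothetical helper, not taken from any existing benchmark):

```c
#include <stdint.h>
#include <stddef.h>

/* Compute the displayed frame rate from timestamps (in nanoseconds) of
 * frames that really hit the panel - e.g. captured at vsync. A counter
 * of accepted draw calls could tick 400 times a second while this
 * function reports single digits.                                     */
static double displayed_fps(const uint64_t *frame_ns, size_t n)
{
    if (n < 2)
        return 0.0;
    double elapsed_s = (double)(frame_ns[n - 1] - frame_ns[0]) / 1e9;
    return (double)(n - 1) / elapsed_s;
}
```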
agreed - many of the ports of old 3d games like duke3d, doom1-2, quake1-3...
would most likely never benefit from a real driver, as they are old games which
didn't even benefit from 3d cards in pcs
so thinking that a driver will make everything faster is likely to cause tears
plus the ati part in the kaiser and others is really not much of a 3d powerhouse like its geforce counterparts - it does add some 2d speed-up but that's about it
which is probably why a money-aware company like qualcomm would only bother to pay ati
the pretty small fee for their license
but at the end of the day htc are cheapskates
I understand your points and believe that everything you said is true. (Especially about the video drivers not solving the slow video problem.)
However I don't see any problems with the audio on the Kaiser (at least on mine).
But yes, I also say that we need a better OS implementation for the Qualcomm chip.
Should we start a new (and proven ineffective) petition to HTC?
Cheers!
RayanMX
what bothers me is that htc seem to be happy enough being the biggest wm device maker rather than really making an effort to compete with the iphone - they seem to sleep on a bed of roses, what with the iphone not being that available outside the us, not being 3g and not having an sd interface
kinda sad imho :S
Oh, and the funny part is that if they would have fixed the tearing and the DirectDraw bug before releasing it, nobody would have cared about the 3D HW not being used. Well, maybe me, omikr0n, and ten other guys would have cared, but it would have been forgotten in a few weeks.
The 2D DirectDraw was noticed only because TCPMP and CorePlayer have the option to use DDraw. And it could have been fixed - CorePlayer devs found a way around it.
In the end this is just a rushed device, using a cheap (and slow) platform - 400MHz means nothing, it behaves like a 300MHz Intel or a 240MHz TI OMAP - not a fact, just an analogy.
Just have one comment about the sound...
did you try the sound using any stereo speakers other than the mono speaker on the device itself? it is actually one of the finest and cleanest audio outputs i've heard in a while - i compared it to an ipod and the tilt's sound was MUCH better
RPG0 said:
This is what i think needs fixing:
- Accurate VSync - so we don't get tearing anymore - this may also solve some slow programs.
- Better implemented sound library - i can't believe nobody complained about the sound - which is the worst, ever, in the whole world, in the universe. It sounds like an old radio. A broken old radio. A broken old radio in a Faraday cage tuned to the wrong frequency...
BTW: the 4.3k graphics score that the Kaiser got, and any other graphics benchmark, is VERY VERY FAKE - they say 40+ FPS in some tests, but i see 5 FPS on my screen (trust me, i know, i'm a render programmer in the games industry, i have an eye for these). What is actually happening is that the program says "draw this on the screen", and the hardware says "done" about 400 times a second without actually rendering anything - this is why i say we need accurate VSync.
Being a developer for such a long time, I feel you.
We should release our own benchmark program that we know won't get nop'ed out by some other timing code if we can't get to the vsync register.
The WM default midi soundbank is still crap, as is the playback. My SE T610 and all my Palm Pilots play back midi with much more clarity. I know it's not the hardware, because the alternative GSPlayer+ MIDI player plays the same samples just fine.
What can be learnt from the Apple II days is to just vote with the wallet. Fighting the hardware maker is wasted time and effort. The only complaint that gets their attention is a walking wallet.
edit: you know, we used to raw-buffer most of our graphics at the same screen resolution (320x240) on a vastly inferior CPU. No reason we can't try to do the same on the Kaiser with a super ARMv6 asm-optimized library. We just need to stop the OS from stealing so many cycles.
Of course I'm only thinking of 2D gaming without video decoding, and only MIDI music since those are the least CPU intensive.
ahussam said:
Just have one comment about the sound...
did you try the sound using any stereo speakers other than the mono speaker on the device itself? it is actually one of the finest and cleanest audio outputs i've heard in a while - i compared it to an ipod and the tilt's sound was MUCH better
+1
I forgot I had an iPod since I got my Tilt. With an 8gb SD who needs one?
How do you come to the conclusion that drivers wouldn't help DDraw applications like CorePlayer and such?
Devices with proper drivers seem to work just fine with DirectDraw and they are able to create a proper HW overlay etc.
Granted it would not solve decoding of video, that's given. But it could/would/should surely speed up the actual rendering.
Writing to the dedicated video RAM instead of creating a framebuffer in normal system RAM should be faster. Hardware overlay should be faster than just using the standard rendering paths etc.
As for games: of course they won't be sped up unless the actual game supports D3D or OGL. Most games don't (even "fancy" 3d ones) but some do, such as COD.
Also, a proper DDI driver can and will speed up 2D rendering in general. Doing simple stuff like rendering just a menu is WAY too slow right now, and that's unrelated to vsync. (There's certainly a world of difference between lacking the smoothness and tearing-free quality of proper vsync and just performance in general; it shouldn't take a full second to draw a simple screen in WM if no other operations are active.)
As for sound I did have those problems but as of the latest 1.65.14.06 radio my audio is pretty much top notch. Sounds just as good as my ipod or my creative zen players and a whole lot better than the integrated soundcard of my laptop (realtek hd audio).
That's both during calls but especially when listening to music.
Of course with the bundled headphones, or even with HTC's "high end" headphones, it all sounds like crap. But with a proper set of in-ear plugs it's a different story (I've got a "cheap" Shure set and a more expensive Sony one, and they both sound simply awesome).
Actually I really like using it as an audio player, as it supports WMA Lossless (an undocumented feature of Windows Mobile 6) - few devices do.
For whatever reason, though, poorly compressed songs sound much worse on this device compared to MP3 players in general.
Whether that's because the codec is poor, or because it's so good that it makes smaller variances noticeable, is hard to say (I'd go for poor codec) - but it's certainly not due to "poor sound".
That's my $0.02
RPG0 said:
- The "drivers" would NOT solve the slow video we see in TCPMP, since TCPMP decompresses in software. And any release of CorePlayer has the same performance. Maybe one released after the "drivers" are installed would be better.
I have to disagree here...
The current routines being used for DirectDraw are slow and inefficient - probably very slow CPU calls. It's for sure a software CPU routine drawing the pixels.
Anyone who has done raw hardware video programming can attest that if you popped an interrupt to draw one pixel on the screen vs using DMA access, the difference is night and day. The interrupt method chews up CPU cycles drawing the pixels on the screen, vs the DMA method which doesn't waste CPU cycles and is far more efficient.
Pointing directdraw at hooks into the GPU and/or DMA access to a hardware frame buffer would improve things significantly (night and day). You can easily see this by playing a video on an old PPC6700, which has a proper direct draw routine implemented. The difference is HUGE - a factor of 10 I'd say.
The slow video rendering has nothing to do with the CPU doing the video decompression. It has to do with the directdraw routines not being implemented efficiently and the CPU wasting its cycles drawing pixels on the screen. You can see "tearing" because you can SEE the video refreshing the screen outside the vsync due to the slow directdraw routines. The CPU should be concentrating on decoding video, not wasting cycles drawing it. The directdraw routine should be writing to a framebuffer using DMA or some similar method.
This is what's going on (high-level lame explanation). Pretend the below are mappings.
Method #1
DirectDraw -> IntXX (video interrupt)
or
Method #2
DirectDraw -> DMA (direct memory to the video card)
Using method #1, it will cost you CPU cycles just to draw the pixels. So not only does your device need to concentrate on decoding the video, it also needs to waste CPU cycles drawing pixels on the screen one by one. And the larger the screen area you're painting, the more CPU it costs you. You can even experience this on a device like the Mogul (6800); you get faster frame rates when you draw fewer pixels.
Using method #2, drawing video costs your CPU basically no overhead and it can spend its cycles decoding the video. It will use DMA to write to a video frame buffer instead of making CPU calls to do it.
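In code, the difference between the two mappings looks roughly like this (pure illustration, not actual driver code):

```c
#include <stdint.h>
#include <string.h>

#define W 320
#define H 240

/* Method #1 stand-in: one call per pixel, the way an interrupt-driven
 * path burns CPU cycles for every pixel it puts on the screen.       */
static void put_pixel(uint16_t *fb, int i, uint16_t px) { fb[i] = px; }

static void blit_per_pixel(uint16_t *fb, const uint16_t *src)
{
    for (int i = 0; i < W * H; i++)
        put_pixel(fb, i, src[i]);   /* 76,800 "interrupts" per frame */
}

/* Method #2 stand-in: hand the whole frame over in one block transfer,
 * the way DMA to a hardware frame buffer would - the result on screen
 * is identical, but the CPU is free to keep decoding.                */
static void blit_block(uint16_t *fb, const uint16_t *src)
{
    memcpy(fb, src, (size_t)W * H * sizeof *src);
}
```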
These are just examples, but this is how it's broken, based on my hardware programming experience.
The post above mine is also a good reference.
I stand by my statement, that having a good DDraw implementation will not help. In theory, you're right, but on other devices (Htc Prophet aka Qtek S200 and HTC Touch) there was no difference between Raw framebuffer or GDI and DDraw. So real-life scenarios tend to prove me right.
The only reason things could speed up is the hardware conversion between YUV and RGB, but for a 320x240 frame, that takes very little time to do in software, and for some codecs, the conversion is not necessary. I can explain what YUV means, and why it's used in video/image compression if anyone is interested, but you can google it yourself.
About the sound part, maybe i have an old radio ROM, maybe i have a defective device or maybe it's just my configuration.
EDIT: Don't forget that even when using HW overlay, you STILL have to fill the surface with pixels (the pixels you just decoded), so you have to write 320x240 somewhere (with DDraw you write in the memory area you get with Lock(), in Raw framebuffer you write directly in an area that is drawn afterwards). If you ignore the YUV->RGB conversion, you gain NOTHING with DDraw.
As i said, i'm a game render programmer, and i did some image/video compression/decompression in my time, so you can get technical, I'll understand.
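For reference, the conversion in question is just this per pixel, using the usual fixed-point coefficients (illustrative code, not any player's actual routine); at 320x240 it runs 76,800 times a frame, which even a ~400MHz ARM can absorb in software:

```c
#include <stdint.h>

static uint8_t clamp_u8(int v)
{
    return v < 0 ? 0 : v > 255 ? 255 : (uint8_t)v;
}

/* Full-range YUV -> RGB with 16.16 fixed-point BT.601-style constants:
 * R = Y + 1.402 (V-128), G = Y - 0.344 (U-128) - 0.714 (V-128),
 * B = Y + 1.772 (U-128).                                             */
static void yuv_to_rgb(uint8_t y, uint8_t u, uint8_t v,
                       uint8_t *r, uint8_t *g, uint8_t *b)
{
    int d = u - 128, e = v - 128;
    *r = clamp_u8(y + ((91881 * e) >> 16));
    *g = clamp_u8(y - ((22554 * d + 46802 * e) >> 16));
    *b = clamp_u8(y + ((116130 * d) >> 16));
}
```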
so !!
hi RPG0, so how are sony/ericsson getting round the video problem on the X1 Xperia? I guess they're not going to market with a half-crippled device, unlike htc. from what I can make out the x1 is only a tweaked kaiser, so could we not just reverse-engineer their solution?
greatly disappointed with the kaiser all round. if i'd known the se version was due, i'd have waited to upgrade. still prefer my p910i, just no 3g.
tleaf100 said:
If you watch the CNET video from Bonnie Cha on the X1, you'll see it's actually pretty slow in the rendering as well, so forget that.
Now back to the experts.
X1
sorry, not seen the video, will go and have a look.
it was only an idea...
will leave it to you "experts" ....
RPG0 said:
Well, YUV overlay support would be very nice, but I doubt we will see it working on the kaiser (does the Imageon even support it?). But working DDraw acceleration would still help - for example, when doing soft yuv->rgb conversion and double-buffering the result, you would definitely get better results if the Kaiser had hw-accelerated bitblt (or at least less tearing).
*** Massive Brain Overload *** "Apparently these people are speaking a strange dialect I've never heard before" -Harold & Kumar Escape from Guantanamo Bay
RPG0 said:
...
- Better implemented sound library - i can't believe nobody complained about the sound - which is the worst, ever, in the whole world, in the universe. It sounds like an old radio. A broken old radio. A broken old radio in a Faraday cage tuned to the wrong frequency...
LOL! And that will be the funniest thing I hear all day. And I just woke up.
Oh, and since HTC said they are releasing a new ROM which fixes the speed of the device but does not bring HW acceleration (drivers), i think my problems will be over.
new rom
ahh, and when is this magical fix for the kaiser meant to be released to the public?
my kaiser has become a 3g/hsdpa usb modem and i've gone back to the p910i for everyday use - the kaiser is too fussy/slow/clunky for a busy gardener. keeping my fingers crossed about the s/e "paris".
Just a shameless bump

CorePlayer lagging? - Solution without disabling manilla

Ok, so Manila is a huge RAM monster. I found that a few registry tweaks do the trick for the dreaded CorePlayer lagging XD
First: ResProxy
HKLM\Software\HTC\ResProxy
"ShareMemSize"
Change this value to zero
http://forum.ppcgeeks.com/showthread.php?t=85716&page=5
“ResProxy is HTC's method of "pre-caching" your applications on the phone, so loading times are faster when opening them, etc. It just allocates a chunk of your memory for this feature. Problem is, after about a day of not soft-resetting your phone, you're idling at 90% or so.
Changing it to 0 will set this as DYNAMIC. So instead of a bunch of pre-set apps being precached IMMEDIATELY, it does it dynamically. It will only raise the memory for the apps that you are currently using, etc.
Keep in mind, changing this REG does not affect speed in terms of opening applications, etc. Performance stays identical.
You will not notice IMMEDIATELY that this tweak has worked. You will notice later on, when your device won't idle higher than 65-70% usage anymore, and won't even get as high as 90% unless you're running mad software on it lol.
Enjoy!”
Second: PushInternet
http://forum.xda-developers.com/showthread.php?t=532948
HKLM\Software\HTC\PushInternet\Enable => change to 0
Note: with both of them turned on, the CorePlayer benchmark was 48.8%.
With just ResProxy off it was at 58.8%.
And with both off it got up to 115%.
Scratch that, it didn’t help
It only worked just once and then went back to lagging
I'm a long-time user of CorePlayer on several devices with Manila & it's never been a problem for video playback for me. If video playback is your issue, rip better video files. I use .avi files sized from 550mb to 800mb.
If I turn off Sense UI, I can play videos fine using QTv display on high quality with zoom and dither off
No ghosting or anything...
With Sense UI on, I have to go down to medium quality videos to avoid ghosting. So I usually turn off Sense if I'm going to watch something long...
Good effort. Since it didn't really work out, I'm going to put this thread out of its young misery.

Wondering if burn-in might be caused by elevated temperature from added case.

https://drive.google.com/file/d/1UO84SCeykrD6Ecb9p3yWfoUe4KN2pJs2Eg/view
Above you can see detectable (under close scrutiny, ideal conditions) burn-in of my Nexus 6 screen after the following usage:
Only 8 weeks since I bought the phone
I'm reasonably sure that this was not present when I got the device because:
1 - it was "new" from Amazon in factory sealed box (although at a bargain price $299 for 64gb which makes me wonder slightly);
2 - I inspected it pretty carefully when I was reading about burn-in shortly after I got my device;
My screen on time on a typical day is probably 2-5 hours per day over that 8 weeks. (although probably 5-6 hours per day during the very first week!)
Brightness is always in auto with the slider in the middle. And I don't spend a lot of time outside with my screen on.
You can see that's pretty mild usage for a relatively short time. And yet many others report they absolutely can't detect anything whatsoever even after a year of use. I'm led to believe I'm an outlier.
So what is it that might make me more susceptible than the next guy?
The "easy" answer is device quality. And it might be the right answer, but what if there's something else.
I thought about what it is in my usage (which seems relatively mild) that might possibly cause this.
Then I realized the one thing that struck me as odd about this phone: I noticed the screen is warm to my fingers if I use it while charging. I didn't measure that temperature, but I did download GSAM about a month in and set a battery temperature alarm at 110F. If I do mild surfing while charging, battery temperature gets up to 110F within a few minutes. I don't let it go above that now, and the screen feels relatively cool compared to how hot I used to let it get. So I'm going to guess that my battery temperature used to routinely reach 115 or 120F while I was using my phone while charging. And although they're not equal, when the battery gets hot the screen gets hot (again, I could feel the heat).
Would heat affect amoled deterioration? We wouldn't think that at first, because we normally associate the deterioration with simply being energized brightly over time. But what about being energized causes that deterioration? What if it's the localized heat at the brightly energized pixel that causes the damage? In that case, anything else which makes the phone/screen run hotter overall will tend to enhance that local damage mechanism too.
Does my phone run hotter than average? I'm not sure, but since day 1 I've been using this funky case, the Uniform Supcase Beetle Hybrid Pro. It's built like an otterbox... thick and rugged. Up to 1/4" thick, it surrounds the phone on the back and all sides, and even wraps around to cover the front bezel. Imagine that wrapping as an insulator (like a blanket): it helps keep the heat inside, so the phone underneath the blanket is hotter than it otherwise would be.
Is that case why I'm different? I dunno, all I've got is one phone as a data point and a scenario that seems plausible to me.
I'm interested to know if any others have associated elevated temperatures or a thick case with enhanced susceptibility to burn-in, or if you believe it may be related.
By the way, I have to add my perspective that this is a minor and manageable thing. I still love this phone. I'm just curious about the why...
There are of course many threads on the subject of N6 burn-in. The biggest one is here:
http://forum.xda-developers.com/nexus-6/general/burn-t2955765
electricpete1 said:
https://drive.google.com/file/d/1UO84SCeykrD6Ecb9p3yWfoUe4KN2pJs2Eg/view
Above you can see detectable (under close scrutiny, ideal conditions) burn-in of my Nexus 6 screen after the following usage:
Only 8 weeks since I bought the phone
I'm reasonably sure that this was not present when I got the device because:
1 - it was "new" from Amazon in factory sealed box (although at a bargain price $299 for 64gb which makes me wonder slightly);
2 - inspected it pretty careully when I was reading about burn in shortly after I got my device;​
My screen on time on a typical day is probably 2-5 hours per day over that 8 weeks. (although probably 5-6 hours per day during the very first week!)
Brightness is always in auto with the slider in the middle. And I don't spend a lot of time outside with my screen on.
You can see that's pretty mild usage for a relatively short time. And yet many others are reporting they absolutely can't detect anything whatsoever even after a year of use. . I'm led to believe I'm an outlier.
So what is it that might make me more susceptible than the next guy?
The "easy" answer is device quality. And it might be the right answer, but what if there's something else.
I thought about what it is in my usage (which seems relatively mild) that might possibly cause this.
Then I realized the one thing that struck me odd about this phone. I noticed the screen is warm to my fingers if I use it while charging. I didn't meausre that temperature, but I did download GSAM about a month in and and set a battery temperature alarm at 110F. If I do mild surfing while charging, battery temperature gets up to 110F within a few minutes. I don't let it go above that now, and the screen feels relatively cool compared to how hot I used to let it get. So I'm going to guess that my battery temperature used to routinely get to 115 or 120F while I was using my phone while charging. And although they're not equal, when battery gets hot screen gets hot (again I could feel the heat).
Would heat affect amoled deterioration? We wouldn't think that at first because we normally associate the deterioration with simply being energized brightly over time. But what about being energized causes that deterioration. What if it's the localized heat at the pixel from that brightly energized pixel that causes the damage. In that case, anything else which makes the phone/screen run hotter overall will tend to enhance that local damage mechanism also.
Does my phone run hotter than average? I'm not sure, but since day 1 I've been using this funky case Uniform Supcase Beetle Hybrid Pro.. It's built like an otterbox.... thick and rugged. Up to 1/4" inch thick and surrounds the phone on back, all sides, and even wraps around to cover the front bezel. Imagine that wrapping is an insulator (like a blanket), it helps to keep the heat inside so the phone underneath the blankets is hotter than it otherwise would be.
Is that case why I'm different? I dunno, all I've got is one phone as a data point and a scenario that seems plausible to me.
I'm interested to know if any others have associated elevated temperatures or a thick case with enhanced susceptibility to burn-in, or if you believe it may be related.
By the way, I have to add that from my perspective this is a minor and manageable thing. I still love this phone. I'm just curious about the why...
There are of course many threads on the subject of N6 burn-in. The biggest one is here:
http://forum.xda-developers.com/nexus-6/general/burn-t2955765
I can notice burn-in on mine (I use no case at all), but it's very mild, and a "burn-in fixer" like the black-and-white rolling lines from "Display Tester" makes it go away if I'm ever bored enough to run it. I also only ever notice it on a grey background, which honestly rarely appears where the status and button bars are. In terms of battery temp, I really have to get something going to hit >105F, for instance downloading videos using multiple streams to make it go faster, which requires stitching the video back together at the end and basically pegs the CPU. General browsing, reading some news stories, etc. stays at ~85-90F. Temperatures can also swing wildly between phones for this one in particular, because there are upwards of 17 different CPU "bins" (compared to something like 3-4 on the Nexus 5), each bin being a 10mV shift in the voltage table: one CPU can run 300MHz at 810mV (bin 0) and another at 650mV (bin 16). For reference, mine is a bin 12, or 690mV at 300MHz and 1110mV at 2.7GHz.
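As a sanity check, the bin arithmetic is internally consistent: each bin shifts the whole voltage table down by 10mV. A tiny illustrative Python sketch (the 810mV bin-0 figure is taken from the post itself):

```python
# Each CPU "bin" lowers every point in the voltage table by 10 mV.
BIN0_MV_AT_300MHZ = 810  # bin 0 runs 300 MHz at 810 mV (figure from the post)
STEP_MV = 10

def bin_voltage_mv(bin_number: int, bin0_mv: int = BIN0_MV_AT_300MHZ) -> int:
    """Voltage at a given frequency point for the given bin."""
    return bin0_mv - bin_number * STEP_MV

print(bin_voltage_mv(16))  # 650 mV, matching the bin-16 figure in the post
print(bin_voltage_mv(12))  # 690 mV, matching the poster's own chip
```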
Both of my devices (I had the old one for about 2 months, this one for about 1) already show burn-in similar to yours. The first was caseless; this one I run with a case. I think burn-in is just a very widespread issue with this panel, and the people claiming not to have any simply don't notice it. Maybe I got 2 bad devices fresh from Amazon, but I think that's just how it is. Good news is it really doesn't bother me, and I can't see it without a grey background.
electricpete1 said:
https://drive.google.com/file/d/1UO84SCeykrD6Ecb9p3yWfoUe4KN2pJs2Eg/view
Above you can see detectable (under close scrutiny, ideal conditions) burn-in of my Nexus 6 screen after the following usage:
Only 8 weeks since I bought the phone
I'm reasonably sure that this was not present when I got the device because:
1 - it was "new" from Amazon in factory sealed box (although at a bargain price $299 for 64gb which makes me wonder slightly);
2 - I inspected it pretty carefully when I was reading about burn-in shortly after I got my device;
My screen on time on a typical day is probably 2-5 hours per day over that 8 weeks. (although probably 5-6 hours per day during the very first week!)
Brightness is always in auto with the slider in the middle. And I don't spend a lot of time outside with my screen on.
That's an interesting theory.
I'm one of those who have absolutely NO detectable burn-in, either permanent or latent. I also run a thick case, a Ballistic Maxx... but I used to (though no longer) run it very hot. In particular, Google Maps navigation used to generate a lot of heat, enough that you could REALLY feel it on the screen, even through the case. Somewhere along the line, something changed (firmware? OS? GMaps?) enough that this no longer heats it up appreciably. Aside from that, I've *never* used it while charging; that's kind of difficult to do with wireless charging. Auto-brightness with the slider at about 1/3. I've owned it for a year now.
StykerB said:
I can notice burn in on mine (I use no case at all) but its very mild and a "burn-in fixer" like the black and white rolling lines from "Display Tester" makes it go away if ever I'm that bored to do so. Also I only ever notice it on a grey background which honestly is rare to appear where the status and button bars are. On the terms of battery temp I really have to get something going to hit >105F for instance downloading videos using multiple streams to make it go faster which requires stitching the video back together at the end which basically pegs the cpu. General browsing reading some news stories etc stays ~85-90F. Differences in temps between phone for this one in particular can swing wildly due to there being upwards of 17 different CPU "bins" (compared to something like 3-4 on the Nexus 5) with each bin being a 10mV shift in the voltage table for the CPU meaning on CPU can have a 300mhz at 810mV (bin0) and another could have it at 650mV (bin16). For reference mine is a Bin 12 or 690mV on the 300mhz and 1110mV on 2.7ghz.
That operation *should not* be especially hard on the CPU. It's actually a fairly trivial task. It might be that your software is doing something stupid, like keeping all the different pieces separate until the end, when it copies them all in sequence into a target file.
You might want to look into better downloader software. The *correct* way to perform this kind of job is to allocate the file in advance and have each stream write to its correct offset in parallel. No final copy.
You might also save a heap of work if you get rid of userdata crypto. The big problem with the crypto on these when doing that kind of glue job is that it will be running decrypt on all the different pieces and simultaneously running encrypt on the target file.
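The preallocate-and-write-at-offset approach can be sketched in a few lines of Python (a simplified illustration; `fetch_range` is a stand-in for a real HTTP range request):

```python
import os
from concurrent.futures import ThreadPoolExecutor

def fetch_range(start: int, size: int) -> bytes:
    # Stand-in for an HTTP range request (bytes=start..start+size-1).
    # Fills each 4 KB chunk with its chunk index so the result is checkable.
    return bytes([(start >> 12) & 0xFF]) * size

def parallel_download(path: str, total: int, chunk: int = 4096) -> None:
    # Allocate the full file up front, then let each worker write its
    # chunk at its own offset with pwrite -- no seeks, no final copy pass.
    fd = os.open(path, os.O_CREAT | os.O_RDWR | os.O_TRUNC)
    try:
        os.ftruncate(fd, total)  # preallocate the target file
        def worker(offset: int) -> None:
            size = min(chunk, total - offset)
            os.pwrite(fd, fetch_range(offset, size), offset)  # positional write
        with ThreadPoolExecutor(max_workers=4) as pool:
            pool.map(worker, range(0, total, chunk))
    finally:
        os.close(fd)
```

Because every stream writes directly into its final position, the file is complete the moment the last chunk lands.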
Well, for 360p, 144p, and 720p, the resolutions you can natively download from YouTube as an MP4, it does work this way: the software just takes ~5 seconds to finalize the file after downloading. However, YouTube isn't exactly keen on people downloading videos without Red (which I do have, but I like 1080p and/or 60fps for gaming-style videos, which Red unfortunately doesn't allow). So those non-standard resolutions have to be downloaded in little 4096 KB chunks, with separate video and audio streams, as if it were being streamed from the website itself, and then stitched together using an actual encoder. A 1080p60 video can put a sizable load (~50-100%, depending on the complexity of the scene) on an i5-5200U laptop processor while realtime streaming, and that chip arguably has more raw power than an N6 (part of the reason YouTube still streams the AVC codec to Android). Granted, the phone doesn't have to actually render the video (VP9 isn't fully supported on either device's GPU, so a chunk of the laptop's load would be rendering). The phone being able to finalize a 30-minute 1080p60 video in ~1 minute while the CPU sits at 100% tells me it's using all the power it can, working in parallel across all 4 cores.
As for encryption, I've disabled it in the past with varied results (mostly kernel differences), but since I'm sticking with the stock ROM and the monthly OTAs, I've just left it on to avoid any accidental re-encryption, which could theoretically lead to data loss and/or the hassle of unencrypting again. On top of that, I see absolutely no gain in speed in the video stitching process: it runs at ~15-20 MB/s, while the limits of the encrypted storage are 200 MB/s sequential read and 90 MB/s write, which is what the encoder is using. Since Android 5.1, when the kernel was updated to take advantage of Qualcomm's encryption instructions (which would otherwise go unused by the OS anyway), encryption doesn't affect this type of workload.
I know this is kind of outside the scope of this thread, but I have done some reading on this stuff while trying to learn why the app needed an external helper when downloading those other resolutions, which most apps won't even do outside of PC software.
I can't even bother to read that since (a) it is out of scope, (b) is from an end-user point of view, and (c) doesn't actually address anything related to what I've explained to you.

[RESEARCH] External touch screen instead of AA head unit

Seeing videos of people integrating Raspberry Pis into their cars with OpenAuto got me interested in Android Auto. There is one thing I don't like about it, however: you're just adding another device to pass audio/video/touch from the phone on to an external display. Why not cut out the middleman?
Back in the good old days of my Xperia S, phones had a dedicated micro-HDMI output. Delicious 60fps 720p (and even 1080p; the UI was drawn at 720p, but video playback, for instance, ran at full 1080p), and a free charging/OTG port. Nowadays, however, we're stuck with MHL, and unless you have a Samsung phone with its proprietary connector, you can't use MHL and OTG at the same time; nor does MHL properly support touch functionality (only in theory), so we need OTG.
DisplayLink to the rescue. We're going to sacrifice a lot of display smoothness/responsiveness here unless you have a modern phone with a USB 3.0 Type-C port (so you can use a newer high performance DisplayLink adapter), but we can use an OTG hub to get both HID touch functionality and HDMI through the DisplayLink adapter at the same time. More on this later. Also, I originally bought a $6 USB to HDMI adapter off eBay thinking it would be DisplayLink (or a compatible off-brand clone), but it turns out to be the really dodgy Fresco Logic FL2000, which is so cheap because it does none of the clever things that DisplayLink adapters do, and instead just spits out full resolution frames as fast as it can, which is completely incompatible with USB 2.0 or low power devices like phones. I found an affordable HP DisplayLink DVI adapter second hand from a Chinese seller that works.
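The bandwidth math shows why the FL2000's spit-out-raw-frames approach can't work over USB 2.0. A quick back-of-the-envelope check (1080p60 at 24 bits per pixel assumed for illustration):

```python
# Raw, uncompressed video bandwidth vs. the USB 2.0 signalling rate.
width, height, fps, bits_per_pixel = 1920, 1080, 60, 24
raw_bps = width * height * fps * bits_per_pixel
usb2_bps = 480_000_000  # USB 2.0 high-speed signalling rate (real payload is lower)

print(f"raw 1080p60: {raw_bps / 1e9:.2f} Gbit/s")      # ~2.99 Gbit/s
print(f"ratio vs USB 2.0: {raw_bps / usb2_bps:.1f}x")  # ~6.2x over the bus limit
```

That roughly 6x shortfall (worse in practice, since USB 2.0 never delivers its full signalling rate as payload) is exactly the gap that DisplayLink's compression and dirty-region updates exist to close.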
On to the next hurdle: charging while using OTG. This is an interesting one, as it's not something I really gave a lot of thought initially. I mean surely just using a powered hub and giving the phone 5V over its micro-USB port would work, right? Well, it's a bit more complex than that, but Sony used to have an OTG dock that could charge, so I'm confident once I get the right OTG hub, it will work fine on my Z5. The one I got off eBay wouldn't do anything but charge, and when I opened it up, I saw it doesn't even use the 5th OTG pin, which would explain why it didn't work. I soldered a regular Type-A plug onto it and used my Sony OTG adapter to test, and I can get either charging or OTG, so something a bit less hacky is required. I ordered the Acasis H027, so when it arrives, we'll see if that works.
Touch: I haven't tested it yet. I have an HDMI touch screen in storage at my brother's house, so next time I visit I'll see if I can get touches to register on the phone. Based on my Googling most people on the internet seem to have gotten this to work fine, although there is no touch calibration on Android as far as I'm aware, like there is on Windows.
Portrait mode: this is my personal pet project and what I've been struggling with the most so far. It would be by far the easiest to just use a touch screen in landscape mode, but hear me out. Because HDMI (touch) monitors are cheap anyway, and in the 7 to 24 inch size range bigger usually means cheaper, I want Tesla-style portrait mode. In portrait mode we can fit more screen real estate into the centre console of the car than in landscape mode, and a bigger screen = bigger text = quicker glancing = safer driving. It also happens to look extremely cool. If you have a big car you can probably fit a 20 or 24 inch screen, but I think I'm going for 13 inch. However, using the DisplayLink Presenter app or the DisplayLink Desktop demo app (which as far as I can tell does exactly the same thing), I cannot for the life of me get native portrait mode to work. Android insists on pillarboxing portrait mode, and no amount of forcing rotation or setting build.props like ro.sf.hwrotation or persist.demo.hdmirotation changes its mind. More worryingly, it seems that for the HDMI rotation prop to work you also need to set persist.demo.singledisplay, which prevents apps from accessing secondary displays. That means DisplayLink Presenter, which is itself an app, cannot mirror the screen output any more, and you get a sad single stripe of garbage pixels on an otherwise completely black screen. I probably need help here from an XDA developer: to hack the DisplayLink app, or to develop an Xposed module that causes all apps to draw themselves in portrait mode while the system is actually in landscape mode, or something along those lines. I've tried most things in my power that I could think of, and since my phone is unlocked, rooted, Magisk'd and Xposed, that is actually quite a lot, but nothing helped. So, uh, help?
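For reference, the props in question go into build.prop roughly like this (a sketch of what I tried; exact value formats may vary by Android version, and as described above, the singledisplay prop is what breaks Presenter):

```
ro.sf.hwrotation=90
persist.demo.hdmirotation=portrait
persist.demo.singledisplay=true
```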
Lastly there is DPI, which is one of the easier hurdles that I did solve. Android Auto (in phone mode, so not connected to any head unit) has a ridiculously huge UI on my phone's native dpi, and while I understand the reasons for it, with a larger external touch screen attached it just becomes unreasonably huge. With Tasker set to run "wm density 240" the entire Android UI becomes a lot more suited to a large screen, and even though Android Auto is still pretty huge compared to other apps, it's what I would consider reasonable.
More to come!
So, quick update:
While the screen portrait mode issues were "simple" to fix (although root was definitely required), the touch orientation issues as well as charging-during-OTG require kernel modifications to be fixed. I managed to compile a modified kernel with charging-during-OTG support thanks to @nlra 's work on that front, but I couldn't get the new image to boot.
A few things happened in the mean time:
- I discovered scrcpy
- I got an Xperia XZ3 (which I haven't rooted yet)
Scrcpy seems to do basically what Android Auto does, but for the whole Android desktop instead of only one app. I kind of don't like it because it again involves adding a computer between the display and the phone (probably a Raspberry Pi), but the advantages are so huge it's basically the only realistic option right now.
Scrcpy:
- Basically always runs at 60fps, even on USB 2.0
- Handles portrait/landscape gracefully
- Integrates display, touch, (audio in a future version), and charging in a single connection
- Doesn't require root (although automatically setting the Android resolution to 1920x1080 and keeping the display on at 0% brightness are things that can probably only be accomplished with Tasker, which requires root)
So basically this simplifies the project and moves it forward immensely; however, there are still some blocking issues right now. Touch screens only work on Windows because, in addition to generating touch events, Windows also generates fallback mouse click events for touches, something Linux doesn't do; and because there is no formal touch screen support in scrcpy, multi-touch doesn't work at all. Audio support also seems to be in an experimental state currently, and is not enabled in regular builds.
I hope ROM1V will eventually implement touch screen support (it's been in his GitHub issue tracker since March) as I have enough work to do as it is. I will focus on the hardware part (Raspi, cabling, VESA mount etc.) first and if by that time touch support is still missing I'll take a crack at it myself. Thankfully scrcpy is built on SDL which I'm fairly familiar with, although I've never worked with the touch input API before.
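For whoever ends up implementing it: at its core, touch support is just mapping a normalized SDL finger coordinate into device pixels while accounting for the current rotation. A hypothetical Python sketch of that mapping (function and parameter names are mine, not scrcpy's):

```python
def touch_to_device(nx: float, ny: float, dev_w: int, dev_h: int, rotation: int):
    """Map a normalized touch position (0..1, window space) to device pixels.

    rotation is the device rotation in degrees (0/90/180/270).
    """
    if rotation == 0:
        x, y = nx, ny
    elif rotation == 90:
        x, y = ny, 1.0 - nx          # rotate the point into device space
    elif rotation == 180:
        x, y = 1.0 - nx, 1.0 - ny
    elif rotation == 270:
        x, y = 1.0 - ny, nx
    else:
        raise ValueError("rotation must be 0, 90, 180 or 270")
    return round(x * (dev_w - 1)), round(y * (dev_h - 1))
```

Multi-touch then becomes tracking one mapped point per finger ID instead of a single mouse cursor.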
For charging you need a hub with USB-C Power Delivery passthrough and HDMI. I am testing ones with lower power consumption right now. I am not sure why you weren't able to use yours.
I think people would be better off using the SecondScreen app for changing resolution and the like (I think root is required to run HDMI in portrait mode, because that option is greyed out for me).
I am doing the same project, but I keep the screen at 1600x1200 horizontal and use apps in split-view mode. I don't want root. I was thinking of using Samsung DeX for it, but the menus are too small.
The good thing, though, is that on Samsung it is possible to create a two-app split-screen pair (e.g. Google Maps + music) as a launcher shortcut (using Good Lock's MultiStar plugin from the Galaxy Store). Unfortunately, I don't think it is possible to automate launching two apps in split screen, nor to create a split-pair shortcut, on other launchers.
It would be good to have some multiwindow manager, since I also wanted my BMW tuning/logging gauges app to run in a floating window on top, or minimized to a floating icon. It is possible natively, but it takes a lot of manual clicking :/
One more thing that I don't think will be possible is to turn the phone screen off completely. With the screen on, the Note8 doesn't do fast charging. Can the screen be off with scrcpy somehow? I don't think so.
So is this dead?
