Mali 400 MP 16-bit only (reason for high performance)? - LG Optimus 2x

Hello,
http://forum.xda-developers.com/showthread.php?t=1075364&page=21
It seems the Mali 400 MP renders colors at only 16-bit (remember the old days). This boosts performance a lot at the cost of image quality.
My GF has the S2 and it is very noticeable.
It shows up in browsers and in games in general.
Hence the Galaxy S2 has huge banding issues.
You can Google it and find plenty of results.
I am wondering if it is possible to compile a kernel for the O2X that forces 16-bit rendering (it could be toggled on or off so we can trade image quality for better FPS). That way we could get a super-fast browsing experience.
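For anyone wondering where both the banding and the speed gain come from, here is a minimal sketch (my own illustration, assuming plain RGB565 vs RGB888 framebuffers; it is not taken from the linked threads):
Code:
# RGB565 has far fewer levels per channel (hence the banding), but each
# pixel is only 2 bytes instead of 3, so there is much less framebuffer
# bandwidth to push, which is where the speed advantage comes from.
formats = {"RGB565 (16-bit)": (5, 6, 5), "RGB888 (24-bit)": (8, 8, 8)}
for name, bits in formats.items():
    levels = [2 ** b for b in bits]
    total_colours = levels[0] * levels[1] * levels[2]
    bytes_per_pixel = sum(bits) // 8
    print(f"{name}: {levels} levels per channel, "
          f"{total_colours:,} colours, {bytes_per_pixel} bytes/pixel")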
http://forums.arm.com/index.php?/topic/15028-bug-in-driver-android-sgs-2-i9100/
Read this thread on the ARM forums.
Thanks and Regards,
Gana

I do not believe that LCDs or AMOLEDs are capable of reproducing an accurate picture for a 24+ bit color space... it is just a waste of memory...
So it seems quite reasonable that it works at 16-bit only...
And... wrong section...

ganaboy said:
Hello,
http://forum.xda-developers.com/showthread.php?t=1075364&page=21
It seems the Mali 400 MP renders colors at only 16-bit (remember the old days). This boosts performance a lot at the cost of image quality.
My GF has the S2 and it is very noticeable.
It shows up in browsers and in games in general.
Hence the Galaxy S2 has huge banding issues.
You can Google it and find plenty of results.
I am wondering if it is possible to compile a kernel for the O2X that forces 16-bit rendering (it could be toggled on or off so we can trade image quality for better FPS). That way we could get a super-fast browsing experience.
http://forums.arm.com/index.php?/topic/15028-bug-in-driver-android-sgs-2-i9100/
Read this thread on the ARM forums.
Thanks and Regards,
Gana
Not possible; the color depth is set in the userspace drivers.

Ferrum Master said:
I do not believe that LCDs or AMOLEDs are capable of reproducing an accurate picture for a 24+ bit color space... it is just a waste of memory...
So it seems quite reasonable that it works at 16-bit only...
And... wrong section...
IPS screens have at least 8 bits per channel,
so 24-bit is really possible...
some professional ones have even more.

Well, I finally found the reason why the Mali 400 is so much faster:
http://www.anandtech.com/show/4686/samsung-galaxy-s-2-international-review-the-best-redefined/16
If you notice, the GeForce ULP has much higher vertex (triangle) throughput.
This is because the Mali has 1 vertex core (VLIW-2) vs the GeForce's 4 cores.
But in the mobile space there is not as much geometric complexity as on desktop PCs; the screens are so small.
Mali and GeForce ULP both have 4 fragment processors/cores each,
but the Mali fragment core is VLIW-4, like AMD's stream processors.
Therefore it can perform 4 MADDs per core per clock, whereas the GeForce can do 1 MADD per core per clock.
MADD - multiply-add.
Per clock - one cycle.
GeForce ULP - 8 MADDs/clock.
Mali 400 MP4 - 18 MADDs/clock (4 fragment cores x 4, plus the vertex core's 2).
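As a back-of-the-envelope sketch of the arithmetic above (the core counts follow this post; the clock speeds are my assumptions for illustration only, not confirmed figures for either SoC):
Code:
# Peak MADD issue rate and theoretical GFLOPS from the post's core counts.
# Clocks are assumed for illustration; one MADD = 2 floating-point ops.
def madds_per_clock(vertex_cores, vertex_width, fragment_cores, fragment_width):
    return vertex_cores * vertex_width + fragment_cores * fragment_width

def peak_gflops(madds, clock_mhz):
    return madds * 2 * clock_mhz / 1000.0

geforce = madds_per_clock(4, 1, 4, 1)   # 4 vertex + 4 fragment, 1 MADD each -> 8
mali = madds_per_clock(1, 2, 4, 4)      # 1 VLIW-2 vertex + 4 VLIW-4 fragment -> 18

print("GeForce ULP:", geforce, "MADDs/clock,",
      peak_gflops(geforce, 333), "GFLOPS at an assumed 333 MHz")
print("Mali-400 MP4:", mali, "MADDs/clock,",
      peak_gflops(mali, 266), "GFLOPS at an assumed 266 MHz")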
Nvidia designs for fewer MADDs per clock but at higher frequencies; this way the theoretical FLOPS limit can be reached more easily.
VLIW-4 requires you to pack 4 MADDs per instruction, like on AMD, so the theoretical limit is much more difficult to achieve.
Nvidia does it this way on desktop PCs, but I think they missed a trick here.
1> On desktop PCs the cores can be used for general-purpose computing (like CUDA). In the mobile space, people are not interested in such things at the moment.
2> The thought process is that geometric complexity will increase in games (I think they assumed most people would connect their phones to big monitors and then play).
Which brings me to Tegra 3:
12 cores - 12 MADDs/clock.
I really hope they do not split the cores 6/6 vertex/fragment.
Maybe 3/9 or 4/8 at most.
Small tip:
Tegra 2 works at peak efficiency when the load is split between the vertex shader and the pixel shader separately, so having separate shader files for different hardware is a good idea.
Gana

Related

Xperia X1 CPU vs Omnia i900 CPU

Hello,
The Xperia X1 has a Qualcomm MSM7200 528 MHz processor and the Omnia i900 has a 624 MHz Marvell PXA312 processor. Now the question is:
Does the Xperia CPU or the Omnia CPU have the higher speed?
Please answer with full detail and full info!
Thanks
Of course the Omnia has a faster CPU.
The Xperia, in my opinion, is suffering from non-optimal firmware.
However, I have seen a video online of the Xperia vs the Touch Pro switching from landscape to portrait, and the Xperia wins.
What do you intend to use the phone for most? That's the important thing.
CPU speed alone has nothing to do with actual performance; there are many other variables, like RAM, the video processor and many other things.
So if you want to know which of the two is better by comparing only the CPU, I think you are heading the wrong way.
samy.3660 said:
Of course the Omnia has a faster CPU.
Thanks, but I need full details!
May I know why Quake 3 runs at only 1 fps on my Xperia?
mcbyte_it said:
CPU speed alone has nothing to do with actual performance; there are many other variables, like RAM, the video processor and many other things.
So if you want to know which of the two is better by comparing only the CPU, I think you are heading the wrong way.
You're right!
I want to know whether the Xperia X1 or the Omnia i900 has the top speed and top performance!
The Xperia has 256 MB of RAM and the Omnia has 128 MB of RAM!
I also have a Fujitsu Siemens n560 with a 624 MHz XScale (PXA27x, I think) and it is twice as fast as the Xperia at playing VGA AVIs.
If I compare the time they need to open the \Windows folder, for example, the Xperia is faster.
The Qualcomm is made to run with an extra graphics chip, I think, and is slow if it has to draw something itself.
Der_Immitanz_konverter said:
I also have a Fujitsu Siemens n560 with a 624 MHz XScale (PXA27x, I think) and it is twice as fast as the Xperia at playing VGA AVIs.
If I compare the time they need to open the \Windows folder, for example, the Xperia is faster.
The Qualcomm is made to run with an extra graphics chip, I think, and is slow if it has to draw something itself.
Thanks! Nice!
Come on, more info, you can do it!
CPU topic only...
No-brainer! 624 MHz > 528 MHz... Omnia wins!
Thread closed...
Frankly, you can't compare CPUs by MHz; it depends on the internal architecture. For example, take a 3 GHz Pentium 3 CPU vs a 3 GHz Intel Core 2: the Core 2 will be WAY faster.
leobox1 said:
Frankly, you can't compare CPUs by MHz; it depends on the internal architecture. For example, take a 3 GHz Pentium 3 CPU vs a 3 GHz Intel Core 2: the Core 2 will be WAY faster.
Yes, you're right! We can't compare CPUs by MHz!
xxl2005 said:
Yes, you're right! We can't compare CPUs by MHz!
I have to disagree on this one.
Of course there are hundreds of important factors: how the CPU processes the data, which architecture it uses, what work can be offloaded, how data is transferred, and so on.
But those are concerns of the processor type and its environment.
The MHz value says how many operations the CPU can do per second. That is the only indication of its raw speed.
Just think about the car industry. If you want to compare the power of one engine to another, all that counts is the power. There are thousands of reasons why the car with the weaker engine could be faster than the other (maybe the stronger engine is built into a truck, or whatever).
But you wanted to compare the engines, so the most powerful one wins.
I'm sure the opener of the thread really wanted to hear about the overall phone speed, but then he asked the wrong question.
I remember the PowerPC CPUs used to be clocked lower than the P4 but used to run a lot faster. I think the RAM is the bigger factor between these two handsets. I know the X1 has a dual-core CPU; not sure about the other handset.
Correct me if I'm wrong, but these are both based on the same ARM core, right?
damskie said:
CPU topic only...
No-brainer! 624 MHz > 528 MHz... Omnia wins!
Thread closed...
Just like the old AMD Athlon 64 series CPUs beat Intel Pentium 4 chips that ran at a way higher clock speed?
Get the facts straight before posting ignorant posts like that. Pure clock speed says nothing about the overall performance of a CPU.
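To put a rough number on why raw MHz is misleading, here is a tiny sketch; both IPC values are made-up placeholders for illustration, not measured figures for the PXA312 or the MSM7200.
Code:
# Effective throughput depends on instructions per clock (IPC) as well as
# frequency. The IPC numbers below are invented purely for illustration.
def mips(ipc, clock_mhz):
    return ipc * clock_mhz

print("Higher clock, lower IPC:", mips(0.8, 624), "MIPS")
print("Lower clock, higher IPC:", mips(1.1, 528), "MIPS")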
Non-scientific answer here, but I've got both phones.
The Omnia on the stock Vodafone UK ROM was as slow as you like, to the point where the phone was almost useless.
Both phones with cooked ROMs on them perform about the same.
Personal winner... can't choose. The Omnia is better as a phone, the X1 is better as a PDA; not amazed by either.
lol not amazed by either
Simple test: one is DivX certified and the other isn't. That's a true performance test.
Hi,
I used to have both phones and I have used and tested them a lot.
For processors, MHz hasn't meant anything for a long time now. That's why the industry created MIPS instead of frequency. Why? For instance, take a Pentium 1 at 266 MHz and a Pentium 1 MMX at 266 MHz: same frequency, but the MMX was way faster because of optimised instructions inside the CPU itself. Don't forget that frequency is about CPU cycles. Before, 1 CPU cycle = one simple operation; since MMX, 1 CPU cycle can do several operations. So a very high frequency processor can do worse than a lower frequency one. Compare a Pentium 2 400 MHz with a G3 266 MHz: the G3 had the same performance. Why? Not the same technology inside. A Core 2 Duo 1.8 GHz will outperform a P4 4 GHz even with more than 2 GHz of frequency difference!
So, to come back to the thread: the Omnia has a single ARM CPU; the Xperia has the Qualcomm MSM7200A.
The MSM7200A is a dual-core processor with an integrated GPU: one core for the PDA side, one core for the phone side, plus the 3D graphics hardware (the Omnia doesn't have an accelerated graphics chip). So with the current firmware we can say the Omnia has reached the maximum of its capabilities, which is not true of the Xperia.
In CPU benchmarks the two are about equal, but the Xperia is faster in all the other areas like memory access, and multitasking is better on the Xperia.
For example, on the Omnia, when using its touch-scroll music player with a lot of music, the music will stop during scrolling because iPhone-like scrolling takes a lot of RAM.
The truth is that it's better to have more RAM and a less powerful processor than a powerful processor and no RAM.
After a lot of testing I sold my Omnia and kept my Xperia.
It's only my opinion, based on the tests I made.
Cheers.
NuShrike said:
Simple test: one is DivX certified and the other isn't. That's a true performance test.
DivX playback with the integrated player was shocking: blocks of pixels everywhere... thanks to CorePlayer for playing DivX so well! And the resolution is a bit too small at 240x400.
Only Samsung phones are DivX certified... maybe because the other manufacturers didn't ask for the certification.

Custom Llano based HTPC... opinions please!

Hey all,
I am putting a HTPC together that will primarily be used with XBMC, but also be used to browse the internet and download films via lovefilm.com. Here is what I am considering...
AMD Llano A8-3800
http://www.xbitlabs.com/articles/cpu/display/amd-a8-3800.html
Gigabyte Motherboard - AMD A75, Socket FM1, DDR3 (GA-A75M-UD2H)
http://www.overclockers.co.uk/showproduct.php?prodid=MB-358-GI&groupid=701&catid=1903&subcat=2058
Corsair Vengeance 4GB (2x2GB) DDR3 PC3-12800C8 1600MHz Dual Channel
http://www.overclockers.co.uk/showproduct.php?prodid=MY-298-CS&groupid=701&catid=8&subcat=1517
Western Digital Caviar Black 2TB SATA 6Gb/s 64MB
http://www.overclockers.co.uk/showproduct.php?prodid=HD-368-WD&groupid=701&catid=14&subcat=1953
OCZ ModXStream Pro 500w Silent
http://www.overclockers.co.uk/showproduct.php?prodid=CA-037-OC&tool=3
Lian Li Case (PC-C37B)
http://www.kustompcs.co.uk/acatalog/info_1194.html
For these simple tasks I am under the impression Llano will suffice. Should I be worried about the lack of a discrete GPU?
Also, this will cost about £500, which is kind of pricey for an HTPC. Does anyone have any suggestions to reduce the price of the build?
Thanks for any feedback!
The PSU and RAM are a bit overkill for an HTPC. Also, run Linux if you want to keep it low-powered. From what I hear, Llano has a great GPU but a weak CPU; it should suffice as an HTPC processor. I'd go for a lower-end PSU and about 1 GB of RAM if Linux, 2 GB if Windows.
Thanks for the good advice about the PSU and RAM.
I have heard on other sites too that the Llano CPU is a little weak. I was instead considering an Athlon II with dedicated graphics. It would cost a similar amount to this system.
I can even get the ASRock Vision 3D for the same price...
http://www.asrock.com/microsite/Vision3D/index.asp?c=Main
There are just too many options...
edcoppen said:
Thanks for the good advice about the PSU and RAM.
I have heard on other sites too that the Llano CPU is a little weak. I was instead considering an Athlon II with dedicated graphics. It would cost a similar amount to this system.
I can even get the ASRock Vision 3D for the same price...
http://www.asrock.com/microsite/Vision3D/index.asp?c=Main
There are just too many options...
3D is overrated. I'm assuming that you:
1. Have a 3D HDTV.
2. Have the 3D glasses
3. Have a desire for headaches.
Also, a lot will depend on usage pattern/behaviour. If you are only using it for some browsing (assuming social networks, YouTube, reading forums like XDA, some degree of Flash playback), the Llano should be more than sufficient. It will also serve well for light gaming (we're talking COD:MW2, probably). And if you're running Linux, I'd say that bumping to 2 GB will make it a behemoth when it comes to webapps.
That said, what I suggested (a Linux build bumped to 2 GB) will be more than sufficient for watching movies and some light browsing with webapps. Llano is not great as a CPU, but it is a real kicker when it comes to making a no-fuss dedicated system (although it is poor for building a good gaming PC). I believe that many sites actually view it as a high-potential processor for HTPCs. Just remember to cool your rig properly (silent cooling FTW) when building your HTPC (my brother's build suffered because he used a 9800GT).
So... building your own (if you have the expertise or can seduce/befriend someone with the expertise) will definitely yield savings, benefits and earn an essential geek badge.
Linux is out of the question, as my Dad (who will be using the HTPC) has used Windows all his life and will not learn another OS.
I get your point about the 3D and I have no intention of using it for now... but it will be there for the future.
I believe that both a Llano-based system and the ASRock Vision 3D will fit the needs of an HTPC. As they cost a similar amount and I am comfortable building my own system, I have both options open to me.
I guess what it comes down to is which system is better... Llano with the A75 chipset or an i3 with the HM55 chipset? Any opinions?
edcoppen said:
Linux is out of the question, as my Dad (who will be using the HTPC) has used Windows all his life and will not learn another OS.
I get your point about the 3D and I have no intention of using it for now... but it will be there for the future.
I believe that both a Llano-based system and the ASRock Vision 3D will fit the needs of an HTPC. As they cost a similar amount and I am comfortable building my own system, I have both options open to me.
I guess what it comes down to is which system is better... Llano with the A75 chipset or an i3 with the HM55 chipset? Any opinions?
Llano.
It has similar processing power to an i3, but trumps even an i7 when it comes to GPU power. As for 3D, when glasses-free 3D TVs come out the specs will be different. I get most of my home movies off the internet, and from what I understand a Blu-ray disc holds about 20+ GB on average, so go figure.
Thank you for the good advice. I am nearly ready to make my purchase. I have decided to go for a custom Llano-based system pretty similar to the one outlined in the OP. I will follow the advice to downgrade the PSU and RAM, though. Just a few more questions please...
I was hoping to avoid using a dedicated GPU, but I just realised I'm not sure if the motherboard supports lossless bitstreaming. I have looked but couldn't find out. Here's the motherboard I have in mind...
http://uk.asus.com/Motherboards/AMD_Socket_FM1/F1A75M/#specifications/#specifications
If this board doesn't support it I will probably get this GPU but I want to avoid it if possible...
http://www.overclockers.co.uk/showproduct.php?prodid=GX-263-SP
Thanks again for the help so far!
This situation just got a whole load more confusing
It turns out that the only way to get lossless audio bitstreaming with a Llano-based system is to use a dedicated GPU. This kind of defeats the whole point of going down the Llano route, as its integrated graphics was one of its key benefits. Seeing as everyone says the CPU performance of a Llano system is underwhelming, I am seriously reconsidering the whole build.
Instead I could base the build around the H55 chipset, as that does support lossless bitstreaming. I could then use the superior CPU performance of an i3, but would still require dedicated graphics to escape the crappy Intel HD 2000.
Bearing in mind that bitstreaming is an essential part of the build, what would you do?
Edit: the H55 path really limits things like SATA 6 Gb/s and USB 3.0.
edcoppen said:
This situation just got a whole load more confusing
It turns out that the only way to get lossless audio bitstreaming with a Llano-based system is to use a dedicated GPU. This kind of defeats the whole point of going down the Llano route, as its integrated graphics was one of its key benefits. Seeing as everyone says the CPU performance of a Llano system is underwhelming, I am seriously reconsidering the whole build.
Instead I could base the build around the H55 chipset, as that does support lossless bitstreaming. I could then use the superior CPU performance of an i3, but would still require dedicated graphics to escape the crappy Intel HD 2000.
Bearing in mind that bitstreaming is an essential part of the build, what would you do?
Edit: the H55 path really limits things like SATA 6 Gb/s and USB 3.0.
Hmmm... I'll need to do a little homework first... I'll get back to you regarding the lossless streams
edcoppen said:
This situation just got a whole load more confusing
It turns out that the only way to get lossless audio bitstreaming with a Llano-based system is to use a dedicated GPU. This kind of defeats the whole point of going down the Llano route, as its integrated graphics was one of its key benefits. Seeing as everyone says the CPU performance of a Llano system is underwhelming, I am seriously reconsidering the whole build.
Instead I could base the build around the H55 chipset, as that does support lossless bitstreaming. I could then use the superior CPU performance of an i3, but would still require dedicated graphics to escape the crappy Intel HD 2000.
Bearing in mind that bitstreaming is an essential part of the build, what would you do?
Edit: the H55 path really limits things like SATA 6 Gb/s and USB 3.0.
It seems to me that using an AMD Phenom/Athlon with a dedicated GPU will be slightly cheaper, although the whole rig will never fit in that case...
I have decided to rule out the Llano system due to the complications with lossless audio. This now leaves me with an i3 system or an Athlon like you suggested.
For an Athlon system I saw these parts:
AMD Athlon II X2 Dual Core 250 3.00GHz
Asus M4A88TD-M EVO/USB3 AMD 880G (Socket AM3)
These are cheaper than an i3 system for sure... as far as performance goes, I am confident both the Athlon and the i3 route are enough for an HTPC. I wonder how their power consumption compares, though?
edcoppen said:
I have decided to rule out the Llano system due to the complications with lossless audio. This now leaves me with an i3 system or an Athlon like you suggested.
For an Athlon system I saw these parts:
AMD Athlon II X2 Dual Core 250 3.00GHz
Asus M4A88TD-M EVO/USB3 AMD 880G (Socket AM3)
These are cheaper than an i3 system for sure... as far as performance goes, I am confident both the Athlon and the i3 route are enough for an HTPC. I wonder how their power consumption compares, though?
AMD usually has a lower power profile than Intel, although if you underpower your PC the processor will have to work REALLY hard to keep up... it depends a lot.
Currently, an AMD-AMD CPU and GPU combo is more efficient than an Intel-Nvidia setup, although for mid-range PCs it might be different. A key component of power draw and efficiency is actually your PSU. Most of the time the PC will be at idle/low usage, so having an 80 Plus Gold or Platinum rated unit goes a long way towards saving power.
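To illustrate the PSU point with rough numbers (the idle load and efficiency figures below are assumed placeholders, not measurements of this build):
Code:
# Wall draw and yearly energy for an always-on HTPC at an assumed idle load.
def wall_watts(dc_load_w, efficiency):
    return dc_load_w / efficiency

idle_dc = 35.0  # assumed DC load at idle, in watts
for label, eff in [("~82% efficient PSU at low load", 0.82),
                   ("~88% efficient PSU at low load", 0.88)]:
    watts = wall_watts(idle_dc, eff)
    kwh_per_year = watts * 24 * 365 / 1000
    print(f"{label}: {watts:.1f} W at the wall, ~{kwh_per_year:.0f} kWh/year")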
In terms of performance, the i3 does not have much benefit over AMD, because the good tech is limited to the i5s and i7s; AMD only differentiates by core count and superficial unlocks.
DISCLAIMER: A little late on this, but: I AM A HUGE AMD FAN. Not that I blow, but I really like AMD, and have been using AMD rigs for as long as I can remember.
Well I think I have come to a decision... again. Almost every component is different now. Here's my new selection of components:
Intel Core i3-2100T 2.5Ghz
MSI H67MA-E35 Intel H67
OCZ Platinum 4GB (2x2GB) DDR3 PC3-10666
Sapphire ATI Radeon HD 6670 1024MB
Western Digital Caviar Black 1TB
SilverStone Grandia GD04
OCZ StealthXStream2 400w Silent
I can get all of these for around £500. Any last-minute feedback from anyone before I buy it all would be much appreciated.
One thing I didn't clarify with you: is the service movie streaming or downloading? Because 1 TB is mighty little for heavy downloading (trust me).
From your setup, though, the parts look mighty fine to me. Just upgrade the CPU and GPU down the road and you'll have a mainstream gaming rig.

[Q] GPU comparison questions - GLBenchmark / SGS3 - HTC One X - Note - iPad 2

hi,
I am kind of obsessed with the GPU performance of mobile devices, because I believe CPU tech is already way ahead of today's needs in the mobile world... and I need your opinions and even, if you wouldn't mind spending some time, your tests to decide.
Nowadays I am planning to buy a Galaxy Note; it caught my eye with its beautiful (not just huge, but gorgeous) screen, even though it has black rendering issues.
Anyway, the thing is that the most beautiful apps and games come from the Apple side, and the Apple devices have good GPUs with average CPUs... this is also exactly what a good movie in MP4 format needs... so this is why I believe the GPU plays a bigger role than the CPU in today's devices.
In the raw power/resolution comparison the iPad 2 is first with its PowerVR SGX543MP2 / (1024x768).
I guess the SGS3 is second with its overclocked Mali-400 / (1280x720).
So let's create a chart to compare them: take the GLBenchmark Egypt offscreen results, divide them by the screen resolution, and multiply the result by 100,000 (since it would otherwise be very small). I guess this should tell us about the game-playing capabilities of a device (correct me if I am wrong).
According to this:
the iPad 2 scores 90 x 100,000 / (1024x768) = 11.444 points
the new iPad scores 136 x 100,000 / (2048x1536) = 4.323
the SGS3 scores 102 x 100,000 / (1280x720) = 11.067
the Galaxy Note scores 50 x 100,000 / (1280x800) = 4.882 (50 is what I get from the GLBenchmark web site, I did not run the test myself)
So my questions are: will we get 74.9 FPS (simply 50 FPS x 400 MHz / 267 MHz) on GLBenchmark Egypt offscreen by overclocking the GPU from 267 MHz (I guess this is the stock clock) to 400 MHz, just like the SGS3, and why are we not getting 102 like it? Is it the process node making the difference, or the drivers? And can someone with an overclocked GPU run the test and report the results?
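Here is a small sketch of that normalisation plus the naive clock-scaling estimate (scores and resolutions are the ones quoted above; the linear scaling with clock is only an assumption, since drivers, memory bandwidth and thermals all get in the way):
Code:
# GLBenchmark Egypt offscreen score normalised per 100,000 screen pixels,
# plus a naive linear clock-scaling projection for the Note's Mali-400.
def per_100k_pixels(score, width, height):
    return score * 100_000 / (width * height)

devices = {
    "iPad 2":      (90, 1024, 768),
    "new iPad":    (136, 2048, 1536),
    "SGS3":        (102, 1280, 720),
    "Galaxy Note": (50, 1280, 800),
}
for name, (score, w, h) in devices.items():
    print(f"{name}: {per_100k_pixels(score, w, h):.3f}")

stock_mhz, oc_mhz = 267, 400
print("Note projected fps at 400 MHz:", round(50 * oc_mhz / stock_mhz, 1))  # ~74.9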
thanks..
It's all about:
- quality of the SoC (some can handle heavy stress better)
- drivers!
- kernel implementation
...and to a much lesser extent, any optimizations/problems at a higher level, for example the ROM.
Unfortunately, GPUs don't work the way you think.
Benchmark scores usually make it look as if GPU performance is improving along an exponential slope, whereas it is actually increasing along more of a plateau.
In fact, almost all improvements are software-based.
An Exynos 3110 (1st-gen SGX540) can in practice perform as well as the GPU in the SGS3, or EVEN BETTER, depending on its setup.
This is a similar case to the PS3 vs the 360. The PS3 has a much, much more powerful GPU, yet Xbox games look just as good as PS3 games, and even better, because they are written against DirectX rather than Sony's proprietary SDK. Then again, there is the RAM issue in both consoles.
60 frames per second cap
Some things to consider in your graphics comparisons:
http://forum.xda-developers.com/showthread.php?t=1169188
http://answers.unity3d.com/questions/32841/is-it-possible-to-get-above-30-fps-on-an-ios-devic.html
This quote explains it best:
Frankly, your opinion is uneducated. The screen of the Galaxy S II has a refresh rate of 60 Hertz, meaning the screen physically cannot display any material higher than 60 frames per second. If you uncap the software frame rate, then the CPU and GPU of the phone will work harder to render as much material as possible - let's say in this case, we have something that has 80 frames to display in a single second. Yet since the screen cannot display 80 frames per second, 20 of those frames will never be shown, and the resulting movement could even suffer from tearing because of the mismatched refresh rate and frame rate. In order to fix tearing, a technique called vertical sync is employed, which would cut frame rates to 60fps in order to eliminate the extra frames which cause tearing.
So, if we remove the frame rate cap on Samsung's version of Android, then what do we accomplish? We increase the workload on the phone's processors, increasing heat output and decreasing battery life. Rendering above 60fps will generate frames which are never shown, and will introduce visual glitches if vertical sync is not used; vertical sync, in turn, would cap the frame rate to 60fps once again. I hope this post has been helpful.
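A toy illustration of the quoted point (my own sketch, not taken from the linked threads): frames rendered beyond the panel's 60 Hz refresh are never shown; they only add GPU work and heat.
Code:
# With a 60 Hz panel, anything rendered above 60 fps is wasted effort.
refresh_hz = 60
for rendered_fps in (45, 60, 80, 120):
    displayed = min(rendered_fps, refresh_hz)
    wasted = max(0, rendered_fps - refresh_hz)
    print(f"rendered {rendered_fps} fps -> displayed {displayed} fps, "
          f"{wasted} frames/s never shown")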

Intel Chip

Hello everyone,
I've been reading this forum for a few days about the Motorola Razr i.
I certainly found interesting articles, but strangely I have found that very few refer to the Intel chip.
While looking into the chip of the "Motorola Razr i" I found a curious comment:
"Medfield z2460 was meant to test the waters and is the reason why it was launched in india and not the US. Just a precursor to the medfield z2580. The z2580 and Clovertrail will be offered in dual core variants (not to mention quad core for clover trail) and will ditch the imagination technologies sgx 540 for an sgx 544 mp2 which runs 34 [email protected] mhz.
The sgx 540 gets 6.4 gfllops @400mhz . The adreno 225 runs at 24.5 [email protected] 400mhz and the tegra 3 (t30l or t33) gpu runs 13 [email protected] mhz. So being that 6.4gflops vs 24.5gflops is relative to 202% what do you think happens with 34 gflops vs 24.5 gflops? Plus the s4 is 103 mflops single threaded while medfield z2460 is 90 mflops single threaded on the cpu side. That's pretty close. Dual core comparison with sgx544 might actually be superior and at a higher process node (32nm vs 28nm), and that's with an in order instruction set vs ARM's out of order. I don't see how you get "x86 Atom has very slim chances when taking on Qualcomm’s ARM processors or any other new generation ARM mobile CPU from Samsung or Nvidia" with that info. Your talking a gpu and a core.
Come spring they go out of order, not to mention ditching 5 year old architecture for silvermont, 22nm process and inclusion of intel hd gpu with 40 to 80 gflops (depending on eu count) and you think there will be no competition? Even the apq8064 adreno 320 only has approx 40-45 gflops but that doesn't include the modem so higher tdp .
Maybe the exynos 5250 with mali [email protected] 68 gflops will be a threat given release schedule but still, nearly matching single threaded performance with the best chip on the market (and with 5 year old architecture), and beating every ARM chip to date in java script for a first try/test the waters offering? Swap a gpu and add a core and its game on. And adding new architecture, 22nm, out of order instruction and hd graphics and ARM might have a problem until 64 bit ARM v8."
My question is: how true is this?
I did not publish the link to the site because I do not know if that is allowed in this forum.
I apologize for possible flaws in my English.
I'm having a super-smooth experience, so yeah, the hyper-threaded single-core chip is doing a very fine job compared to the ARM competitors.
But is it true that Intel will go out-of-order for their next architecture? Because the whole point behind Atom processors was to take Intel's advanced lithography and well-thought-out architecture, then simplify it to make it consume much less energy (and going from out-of-order to in-order was one of those simplifications).
Well, I always thought Intel smartphone chips were more powerful CPU-wise, but GPU-wise they're behind.
And Christ, you can quote all the figures you like, but it doesn't mean it'll actually reach them; that's just what the individual parts can achieve in isolation.
Put them into a phone and reduce the power consumption to an acceptable level = a lot less than the quoted figures.
Sent from my HTC Desire using xda app-developers app

I think we should have opened a discussion on some important points,

I think we should open a discussion on some important points:
1. Adoptable storage: I will point to some specific models, since ours does not have it active while others do, such as the Moto One, G6 and G7, Moto C and Moto E. We have to see if there is any hardware limitation or if it is only a software thing, because the claim that Google does not support it, or that SD cards are not of good enough quality, is not real, since every month we see models released or updated with this support.
2. dp vs dpi: 480 dpi is the default and is high for this screen size; the Android baseline is 160, so that means a 3x scale factor. This is good in one way and worse in another: you get more quality but less information displayed. You gain large, borderless icons but lose detail. A likely solution is to set the screen resolution lower, to 540p, to get a balance between size and quantity, or at 1080p to use the full amount of pixels the screen has; but if you are on stock you have to activate a third-party launcher, because the Motorola launcher does not support this setting (see the dp-to-px sketch at the end of this post).
3. The GPU clock is different from the specified one: it was supposed to have a maximum clock of 850 MHz, but in the last version I read about, on Android with the stock ROM, the maximum clock was set to 700 MHz. The main problem is that even with the max at 700 MHz, in a good part of games and apps it stays at 450 MHz with low FPS, and even when it is not at the FPS limit of the game or application it does not raise the clock to its maximum. A probable solution for those who have root: change the GPU governor using Kernel Adiutor, to the ondemand or performance governor.
4. This is less important, just one more doubt mixed with indignation: why can Samsung and some other privileged manufacturers use 32-bit Android on their handsets, which is far lighter and uses less RAM, versus the 64-bit Android that Google has been forcing developers to use since Android...
It seems like a joke that a Samsung with a Snapdragon 425, 2 GB of RAM and an HD+ screen running 32-bit Android has better usability and multitasking than our device, which has a Snapdragon 630 and 3 GB of RAM but runs 64-bit Android.
Man, for 5 years I have seen Google saying that there will be no more 32-bit Android, yet every year there are entry-level phones using 32-bit Android fighting on equal terms with mid-range phones running 64-bit Android.
And I'm sorry if I said some nonsense; I do not understand much about development, but I have had several devices, always made modifications, and always followed the topics and the developers.
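On point 2, here is a minimal sketch of the dp-to-px relationship I mean (the 48 dp icon size is just an example value, not something from this device):
Code:
# Android defines: px = dp * (density_dpi / 160). At 480 dpi the scale is 3x.
def dp_to_px(dp, density_dpi):
    return dp * density_dpi / 160

print(dp_to_px(48, 480))  # 144.0 px at the 480 dpi default mentioned above
print(dp_to_px(48, 320))  # 96.0 px if the reported density is lowered to 320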
Ermes.mt/brasil said:
I think we should open a discussion on some important points:
And I'm sorry if I said some nonsense; I do not understand much about development, but I have had several devices, always made modifications, and always followed the topics and the developers.
So.... What are you trying to accomplish with this "discussion" that is probably already addressed in another thread??
jbaumert said:
So.... What are you trying to accomplish with this "discussion" that is probably already addressed in another thread??
Yes and no. Yes, because all the topics in general seek 3 things (support, features or performance), and I'm approaching those three points. And yes, it is a discussion, so everyone is welcome to pull up a chair, have a coffee and give their point of view... There is not much noise around this device, and if we do not talk and work together on it, it will not survive.
