How many cores (not CPUs) does an HTC One have? It has 4 CPU cores, I know that, but what about the GPU? And the HTC One has an image processor, so does that make it 6 cores? The Moto X, for example, has 8 cores: 2 CPU cores, 4 GPU cores, 1 natural language processor and 1 contextual computing processor. So, from this explanation, can you tell me how many cores an HTC One actually has?
Sent from my HTC One
ShaheenXE said:
How many cores (not CPUs) does an HTC One have? It has 4 CPU cores, I know that, but what about the GPU? And the HTC One has an image processor, so does that make it 6 cores? The Moto X, for example, has 8 cores: 2 CPU cores, 4 GPU cores, 1 natural language processor and 1 contextual computing processor. So, from this explanation, can you tell me how many cores an HTC One actually has?
Sent from my HTC One
The HTC One has 1 processor and 4 cores
How many CPU and GPU cores? And the image processor?
Sent from my HTC One
ShaheenXE said:
How many CPU and GPU cores? And the image processor?
Sent from my HTC One
One CPU and one GPU, both (afaik) consisting of four cores, making it 8 total. Add the image processor and you've got 9.
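If it helps, here is the tally written out as a tiny sketch; the per-block counts are simply the ones assumed in this thread, not an official HTC breakdown:

```python
# Core tally for the HTC One as counted in this thread (assumed, not official)
blocks = {"CPU": 4, "GPU": 4, "image processor": 1}
print("Total:", sum(blocks.values()))  # -> 9
```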
Now I get it thanks
Sent from my HTC One
Related
The Droid 2/X use the same graphics processor as the Droid 1, the PowerVR SGX 530. According to the datasheet, this core is designed to run at 200MHz, rendering up to 14M triangles/sec. But our Droid/Milestone runs it underclocked at 110MHz (~7M tri/s), while the D2/X run it at 200MHz. That leads to a major UI responsiveness and gaming difference between the D2 and D1.
I wonder if there's any possibility to overclock the GPU as well?
Thanks in advance.
Sent from my Milestone using XDA App
TeroZ said:
The Droid 2/X use the same graphics processor as the Droid 1, the PowerVR SGX 530. According to the datasheet, this core is designed to run at 200MHz, rendering up to 14M triangles/sec. But our Droid/Milestone runs it underclocked at 110MHz (~7M tri/s), while the D2/X run it at 200MHz. That leads to a major UI responsiveness and gaming difference between the D2 and D1.
I wonder if there's any possibility to overclock the GPU as well?
Thanks in advance.
Sent from my Milestone using XDA App
As far as I know, overclocking the GPU has been tried, but with no results (constant reboots).
Imagination Technologies (PowerVR) defines the GPU internals and sells the "plans" for the part, to be included in SOCs like TI's OMAP.
PowerVR does not, however, define the exact clocks at which the part should run, nor other things like the number of memory channels, memory speed, etc.
Texas Instruments is the one who defined the GPU clocks. The OMAP 34xx chips (Droid 1, Milestone, XT720, Flipout, etc.) are made on a 65nm process, which dictates a certain power consumption at given clocks, which is why they settled on ~100MHz for the GPU and ~600-800MHz for the CPU.
The OMAP 36xx (Droid X, Droid 2, Defy, etc) are made using a newer, smaller 45nm process, which allows them to run at higher speeds while spending approx. the same power, which is why Texas Instruments decided to clock the GPU at ~200MHz and the CPU at ~1-1.2GHz.
So it's not like the Milestones and Droids have their GPUs underclocked, those are just their factory clocks.
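To put rough numbers on that, here's a back-of-the-envelope sketch using the datasheet figures quoted earlier in the thread, assuming triangle throughput scales roughly linearly with GPU clock (a simplification; memory bandwidth and drivers matter too):

```python
# SGX530 throughput vs clock, using the figures quoted in this thread
DATASHEET_CLOCK_MHZ = 200     # reference clock from the datasheet
DATASHEET_MTRIS = 14.0        # ~14M triangles/sec at that clock

def est_mtris(clock_mhz: float) -> float:
    """Estimate Mtri/s assuming throughput scales linearly with clock."""
    return DATASHEET_MTRIS * clock_mhz / DATASHEET_CLOCK_MHZ

print(f"OMAP34xx @ 110 MHz: ~{est_mtris(110):.1f}M tri/s")  # ~7.7M, close to the ~7M quoted
print(f"OMAP36xx @ 200 MHz: ~{est_mtris(200):.1f}M tri/s")  # 14M
```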
Of course, overclocking the GPU would be nice and it could be possible. If someone found out how to change the GPU's voltage and clocks, I'm sure it could come in handy in future games.
However, right now, the 1st gen Milestones/Droids are running every high-end HD game from gameloft at full speed, and I bet it'll even do Infinity Blade and other UE3 games when they're out for Android.
Every "HD" Android game has to be compatible with the 1st-gen Snapdragon's GPU, the Adreno 200, which is a lot slower than the SGX530 @ 100MHz, so we're sitting confortably above the base spec for now. And with all the Windows Mobile 7 phones coming with a 1st-gen Snapdragon (mandatory requirement), it'll be like this for a while.
So there's really not a big need for overclocking the GPU right now, except for getting higher scores in mobile benchmarks (some of them terribly unoptimized, like GLBenchmark 1.1 and 2.0).
Furthermore, it seems the first factor limiting the 1st-gen Droids in games will be the amount of RAM.
The first UE3-based game for Android is already out, and it requires 512MB of RAM.
So the game runs on the Nexus One but not on a Droid/Milestone, even though the Droid/Milestone has far superior graphics performance.
(I'm pretty sure this has something to do with the fact that Android doesn't allow graphics buffering in the main memory, though, which could be resolved in future firmware revisions).
Then again, overclocking the GPU would be cool, and I'm pretty sure getting our SGX530 to work @ ~200MHz would significantly increase the gaming longevity of our phones for quite a while.
Thanks for your useful and important reply.
"The Manhattan Project" on Galaxy S Series just made me curious about Droid's gpu oc, because SGS also use a PowerVR gpu. But things isn't easy due to a fact that one is made by TI while another is made by samsung, the structure inside both SoCs may be completely different.
But I still hope someone capable would try something on this.
That's really cool and significantly lengthen the lifetime of our Droid and Milestone.
Thx again for your reply!
PS: I also find it strange that the UI (not games) on the N1 is faster than on an OCed Droid. Could it be an optimization issue?
Sent from my Milestone using XDA App
TeroZ said:
PS: I also find it strange that the UI (not games) on the N1 is faster than on an OCed Droid. Could it be an optimization issue?
Sent from my Milestone using XDA App
Definitely partly optimization: a fast ROM with a good theme, like the Droid X theme on the GOT 2.2.1 ROM, has as fast a GUI as I've encountered on Android, even without an overclock.
Also take into consideration that all the current 2.1 and 2.2 ROMs have a 30fps cap in 2D; perhaps when the final 2.2 update arrives there will be some performance gain.
Sent from my Milestone using Tapatalk
So,
My friend (Simon) recently acquired a Dell Precision series computer/micro-server tower.
Inside, it has 2 Intel® Xeon® X5472 quad-core processors running at 3.00GHz.
My other friend (Josh) said to Simon, in his apparent jealousy, that even though Simon essentially has 8 cores, they are %&$# because they are VIRTUAL cores and not physical cores.
I scoured the Intel website, found this processor model, and it clearly states that each processor has "Number of cores: 4. Number of threads: 4".
I am not familiar with threads but I am aware that hyper-threading is essentially creating virtual cores.
So the question is:
Is he using 8 virtual cores or merely 8 physical cores?
If you can answer this, could you please tell me your reasoning if you have one?
Thanks =)
Sent from my GT-S5830 using XDA App
[CORRECTION]
8 physical cores or merely 8 virtual cores*****
This is a thread
assuming he is running Windows 7...
click Start; right there above where you clicked there is a search box... type "dxdiag" and hit Enter.
this will bring up a window listing all your hardware in detail.
i have an i5, a dual core with Hyper-Threading (so 4 logical processors), so under my "processor" description it reads "M540 @ 2.53 GHz (4 CPUs) ~2.5 GHz."
running this program will tell you exactly what hardware he has. sounds like a dual quad-core setup to me, though.
hope that helps.
you can also type "msconfig" into that same search box... then click the "boot" tab, then "advanced options", and at the top right you will see a box listing how many processors are available to you.
hope that helps
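If you'd rather check programmatically than through dxdiag, here's a minimal sketch in Python (os.cpu_count() reports logical processors; the physical-core count comes from the third-party psutil package, which you'd have to install separately):

```python
import os

# Logical processors: what Windows and dxdiag report as "CPUs"
print("Logical processors:", os.cpu_count())

# Physical cores need a helper library; psutil is one common option
try:
    import psutil
    print("Physical cores:", psutil.cpu_count(logical=False))
except ImportError:
    print("Install psutil (pip install psutil) to see the physical-core count")
```

On the X5472 box in question, both numbers should come out to 8, since that Xeon has four cores per socket, two sockets, and no Hyper-Threading.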
His says:
Intel(R) Xeon(R)
CPU X5472 @ 3.00GHz (8 CPUs), ~3GHz
8 physical cores or only virtual?
Sorry =\
And thanks for the responses...
Sent from my GT-S5830 using XDA App
jimbo.levy said:
His says:
Intel(R) Xeon(R)
CPU X5472 @ 3.00GHz (8 CPUs), ~3GHz
8 physical cores or only virtual?
Sorry =\
And thanks for the responses...
Sent from my GT-S5830 using XDA App
I'm pretty sure that's a 4 physical core processor.
Sure is. That is a dual-processor, quad-core-per-processor setup, my friend.
http://ark.intel.com/products/34447/Intel-Xeon-Processor-X5472-(12M-Cache-3_00-GHz-1600-MHz-FSB)
Says here it's a quad core.
Each Xeon has 4 physical cores inside. A dual processor setup in this case means 8 actual cores. A better question will be whether your friend has applications that can make use of said number of cores.
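To make that last point concrete, here's a minimal sketch of the sort of code that actually benefits from 8 cores; a single-threaded program will use only one of them no matter how many are installed (purely illustrative, not tied to anything Simon actually runs):

```python
from multiprocessing import Pool, cpu_count

def heavy(n: int) -> int:
    # stand-in for CPU-bound work (encoding, compiling, simulation, ...)
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    jobs = [2_000_000] * 16
    # spread the jobs across every available core (8 on the dual X5472 box)
    with Pool(processes=cpu_count()) as pool:
        results = pool.map(heavy, jobs)
    print(f"Ran {len(results)} jobs across {cpu_count()} cores")
```

Most desktop software won't scale like this out of the box, which is why having a dual-socket Xeon doesn't automatically make everyday use faster.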
The highest core count is 6, on the Intel Core i7 Extreme Edition Q----- (something), which has 6 cores / 12 threads (according to the Intel site).
Forever living in my Galaxy Ace using XDA App
http://ark.intel.com/products/34447/Intel-Xeon-Processor-X5472-(12M-Cache-3_00-GHz-1600-MHz-FSB)
Four physical cores per processor. No Hyperthreading.
Two processors..
4x2=8
Thanks everyone =D
Sent from my GT-S5830 using XDA App
Hello everyone
I've been reading this forum about the Motorola Razr i for a few days.
I've certainly found interesting articles, but strangely very few of them refer to the Intel chip.
While looking into the Razr i's chip I found a curious comment:
"Medfield z2460 was meant to test the waters and is the reason why it was launched in india and not the US. Just a precursor to the medfield z2580. The z2580 and Clovertrail will be offered in dual core variants (not to mention quad core for clover trail) and will ditch the imagination technologies sgx 540 for an sgx 544 mp2 which runs 34 [email protected] mhz.
The sgx 540 gets 6.4 gfllops @400mhz . The adreno 225 runs at 24.5 [email protected] 400mhz and the tegra 3 (t30l or t33) gpu runs 13 [email protected] mhz. So being that 6.4gflops vs 24.5gflops is relative to 202% what do you think happens with 34 gflops vs 24.5 gflops? Plus the s4 is 103 mflops single threaded while medfield z2460 is 90 mflops single threaded on the cpu side. That's pretty close. Dual core comparison with sgx544 might actually be superior and at a higher process node (32nm vs 28nm), and that's with an in order instruction set vs ARM's out of order. I don't see how you get "x86 Atom has very slim chances when taking on Qualcomm’s ARM processors or any other new generation ARM mobile CPU from Samsung or Nvidia" with that info. Your talking a gpu and a core.
Come spring they go out of order, not to mention ditching 5 year old architecture for silvermont, 22nm process and inclusion of intel hd gpu with 40 to 80 gflops (depending on eu count) and you think there will be no competition? Even the apq8064 adreno 320 only has approx 40-45 gflops but that doesn't include the modem so higher tdp .
Maybe the exynos 5250 with mali [email protected] 68 gflops will be a threat given release schedule but still, nearly matching single threaded performance with the best chip on the market (and with 5 year old architecture), and beating every ARM chip to date in java script for a first try/test the waters offering? Swap a gpu and add a core and its game on. And adding new architecture, 22nm, out of order instruction and hd graphics and ARM might have a problem until 64 bit ARM v8."
My question is: how true is this?
I'm not posting the link to the site because I don't know whether that's allowed in this forum.
I apologize for possible flaws in my English.
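For anyone who wants to sanity-check the throughput comparisons in that quote, here's a quick sketch; the GFLOPS numbers are the quote's own claims, not verified specs:

```python
# GFLOPS figures exactly as claimed in the quoted comment (unverified)
sgx540 = 6.4       # SGX 540 (Medfield Z2460)
adreno225 = 24.5   # Adreno 225 (Snapdragon S4)
sgx544mp2 = 34.0   # SGX 544 MP2 (Medfield Z2580, per the quote)

print(f"Adreno 225 vs SGX 540:     {adreno225 / sgx540:.2f}x")
print(f"SGX 544 MP2 vs Adreno 225: {sgx544mp2 / adreno225:.2f}x")
```

Taken at face value, the quote's own numbers put the Z2580's GPU roughly 1.4x ahead of the Adreno 225, which is the core of its argument; whether shipping phones actually hit those figures is another matter, as the replies below point out.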
I'm having a super-smooth experience, so yeah, the hyper-threaded single-core chip is doing a very fine job compared to the ARM competitors.
But is it true that Intel will go out-of-order with their next architecture? Because the whole point behind Atom processors was to take advantage of Intel's advanced lithography and well-thought-out architecture, then simplify it to make it consume much less energy (and going from out-of-order to in-order was one of those simplifications).
Well, I always thought Intel's smartphone chips were more powerful CPU-wise but behind GPU-wise.
And Christ, you can quote all the figures you like, but it doesn't mean it'll actually reach them. Those are what the individual parts can achieve in isolation.
Put them into a phone and cut the power consumption to an acceptable level, and you get a lot less than the quoted figures.
Sent from my HTC Desire using xda app-developers app
Is there any way to have Google Now always listening for the hotword, like the way it works on the Nexus?
Search the forum. All discussed. "Voice" isn't a bad search key
Sent from my C6903 using xda app-developers app
It's a bad idea.
On the Motorola Moto X there are 8 cores available:
2 for the processor + 2 for sensors (Smart Camera launch...) + 2 for interactive speech recognition.
What the last 2 are for, I forget.
On our Z1 we have a Snapdragon 800 with 4 cores.
I'd say that when these features get ported to our Z1, one core will have to stay on just for listening, and that will drain a lot of battery.
Sent from awesome Z1
Nemeziz 56th said:
It's a bad idea.
On the Motorola Moto X there are 8 cores available:
2 for the processor + 2 for sensors (Smart Camera launch...) + 2 for interactive speech recognition.
What the last 2 are for, I forget.
On our Z1 we have a Snapdragon 800 with 4 cores.
I'd say that when these features get ported to our Z1, one core will have to stay on just for listening, and that will drain a lot of battery.
The Moto X has how many cores? It's dual-core, no matter how Motorola's marketing tries to dress it up. How can they count the Adreno 320 GPU as additional cores? Or worse, the TI C55x voice DSP as a core? And last but not least, what does any of this have to do with battery drain? It might sound rude, but you have some important information mixed up...
Motorola has:
the Motorola X8 Mobile Computing System:
a 2x 1.7 GHz processor / a quad-core Adreno 320 GPU / 2 cores for sensors and speech.
The last two cores, for sensors and speech, are clocked so low and used only for the sensors and the Google Now ("Okay, Google") command that they drain very, very little power.
We don't have such low-power cores on our Z1; we have a quad-core CPU with a minimum clock of 300 MHz.
If this feature were ported to our Z1, the processor would have to stay active at 300 MHz or higher to keep it smooth.
Our Z1 then couldn't go into deep sleep.
Sent from awesome Z1
Sorry for this question, but I'm very confused about the GPU of the I9070. Some people say the S Advance has a single-core GPU, but the official NovaThor U8500 page says "multi-core GPU processes 2D and 3D graphics". So does the S Advance have a single-core or dual-core GPU? Thanks. http://developer.sonymobile.com/knowledge-base/technologies/novethor-u8500/
The S Advance has a single-core Mali-400 MP GPU. And as far as I know, the Galaxy S2 also has a Mali-400 MP GPU, but it's dual-core (instead of just one core). If you read about the Mali-400 MP on ARM's website (link), this is what you'll see:
Scalable from 1 to 4 cores the Mali-400 MP enables a wide range of different use cases, from mobile user interfaces up to smartphones, tablets and DTVs, to be addressed with a single IP. One single driver stack for all multi-core configurations simplifies application porting, system integration and maintenance. Multicore scheduling and performance scaling is fully handled within the graphics system, with no special considerations required from the application developer
So this shows that different phones can use the same GPU but with different number of cores.
PS: Anyone is free to correct me if I'm wrong.
Sami Kabir said:
The S Advance has a single-core Mali-400 MP GPU. And as far as I know, the Galaxy S2 also has a Mali-400 MP GPU, but it's dual-core (instead of just one core). If you read about the Mali-400 MP on ARM's website (link), this is what you'll see:
So this shows that different phones can use the same GPU but with different number of cores.
PS: Anyone is free to correct me if I'm wrong.
Thanks for the answer. It's very strange, because my Galaxy S Advance runs N.O.V.A. 3 smooth and fast (Mali-400), while on my Allwinner A13-based tablet (also Mali-400) I sometimes get hard lags (yes, I always free up my RAM).