Yesterday I read in the XDA archives that the Samsung Galaxy Ace S5830i will not get full GPU support. But the Samsung Galaxy J has the same GPU, so you can use the same ROMs, though with bugs: Wi-Fi does not work and the graphics are terrible.
It also says the CPU is 832 MHz because the Galaxy Ace "i" uses too much of the CPU for graphics.
What do you think about this?
Can you share the source?
Sent from my GT-S5830i using xda premium
I believe this isn't supposed to be in the Dev section.
"Friends forever." - Ventus
CallMeVentus said:
I believe this isn't supposed to be in the Dev section.
"Friends forever." - Ventus
True, but you're forgetting one fact: the S5830i has no section other than the Dev section...
zeelie said:
True, but you're forgetting one fact: the S5830i has no section other than the Dev section...
You can use the thread title to state that it's for the S5830i.
And the Galaxy Ace sections cover almost all variants; it's only the Dev threads that are different.
"Friends forever." - Ventus
The Ace i has the BCM2763 GPU with 128 MB of memory. This is a VideoCore IV GPU from Broadcom; according to Broadcom, it is capable of 1080p video playback.
Check the link below:
http://www.unwiredview.com/2009/12/...rocessor-for-mobiles-20mp-hd-video-recording/
akashshinde said:
The Ace i has the BCM2763 GPU with 128 MB of memory. This is a VideoCore IV GPU from Broadcom; according to Broadcom, it is capable of 1080p video playback.
Check the link below:
http://www.unwiredview.com/2009/12/...rocessor-for-mobiles-20mp-hd-video-recording/
I think our hardware is the BCM21553, not the BCM2763. I saw our hardware info in the SetCPU app, in the Info section, under Hardware.
onedirection_1995 said:
I think our hardware is the BCM21553, not the BCM2763. I saw our hardware info in the SetCPU app, in the Info section, under Hardware.
We have the BCM21553 CPU, and the GPU is the BCM2763 with 128 MB of memory.
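For anyone who wants to verify this on their own device rather than trust an app, the SoC name is usually reported in the `Hardware` line of /proc/cpuinfo (e.g. via `adb shell cat /proc/cpuinfo` or a terminal emulator). A minimal parsing sketch; the sample text below is illustrative, not captured from a real S5830i:

```python
def soc_from_cpuinfo(cpuinfo_text):
    """Return the 'Hardware' field from /proc/cpuinfo-style text, or None."""
    for line in cpuinfo_text.splitlines():
        if line.lower().startswith("hardware"):
            return line.split(":", 1)[1].strip()
    return None

# Illustrative sample of the kind of output an ARMv6 Broadcom device reports:
sample = """Processor\t: ARMv6-compatible processor rev 5 (v6l)
BogoMIPS\t: 532.48
Hardware\t: bcm21553
Revision\t: 0000"""
print(soc_from_cpuinfo(sample))  # bcm21553
```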
Harley--TSI said:
Yesterday I read in the XDA archives that the Samsung Galaxy Ace S5830i will not get full GPU support. But the Samsung Galaxy J has the same GPU, so you can use the same ROMs, though with bugs: Wi-Fi does not work and the graphics are terrible.
It also says the CPU is 832 MHz because the Galaxy Ace "i" uses too much of the CPU for graphics.
What do you think about this?
wtf is this Samsung Galaxy J? As far as I know, there is no Samsung Galaxy J!
onedirection_1995 said:
wtf is this Samsung Galaxy J? As far as I know, there is no Samsung Galaxy J!
I think he meant Galaxy Y.
Yes, the Galaxy Y has the same hardware as the S5830i.
Damn, am I the only one who thinks the S5830i sucks? I'm glad I bought a non-i model lol
Sent from my GT-I9100 using Tapatalk 2
KcLKcL said:
I think he meant Galaxy Y.
Yes, the Galaxy Y has the same hardware as the S5830i.
Damn, am I the only one who thinks the S5830i sucks? I'm glad I bought a non-i model lol
Sent from my GT-I9100 using Tapatalk 2
I agree with you! Unfortunately I bought an Ace i. And by the way, we don't have the same hardware as the Galaxy Y; there are some similarities, but ours is far better than the Galaxy Y's. Haven't you seen a Galaxy Y? With its cheap display, I don't think we have the exact hardware of the Galaxy Y.
onedirection_1995 said:
I agree with you! Unfortunately I bought an Ace i. And by the way, we don't have the same hardware as the Galaxy Y; there are some similarities, but ours is far better than the Galaxy Y's. Haven't you seen a Galaxy Y? With its cheap display, I don't think we have the exact hardware of the Galaxy Y.
Well, not the whole hardware; I meant the chipset lol
Sent from my GT-I9100 using Tapatalk 2
Broadcom® BCM2763 VideoCore® IV Processor Features 1080p Video, 20 Megapixel Photos and 1 Gigapixel Graphics in an Ultra-Low Power 40 Nanometer Design
IRVINE, Calif., Dec 15, 2009 -- Broadcom Corporation (Nasdaq: BRCM), a global leader in semiconductors for wired and wireless communications, today announced its next generation multimedia processor that delivers industry leading performance and lower power in the top multimedia categories for mobile devices. Using 40 nanometer (40nm) CMOS process technology, the new Broadcom® BCM2763 VideoCore® IV multimedia processor provides even higher integration, smaller footprint size and lower power consumption than 65nm designs.
With the higher integration and significant power savings from 40nm CMOS process technology, the BCM2763 multimedia processor features the most advanced mobile high definition (HD) camcorder and video playback, up to 20 megapixel digital camera and photo image processing, and 1 gigapixel 2D/3D graphics rendering for a world-class gaming experience. HD video, 3D games and high resolution 20 megapixel pictures can be displayed at top quality on full-sized HD televisions and monitors using an on-chip industry standard HDMI interface. Additionally, the BCM2763's highly integrated architecture reduces bill-of-materials (BOM) cost to help drive sophisticated multimedia features into more affordable handsets.
Highlights/Key Facts:
-- The breadth and quality of Internet multimedia content is rapidly
improving, with sites such as YouTube now supporting full HD 1080p video
sharing. Consumers are also increasingly using cell phones as their
primary digital camera and camcorder, which is driving demand for higher
resolution and more sophisticated image processing which is currently
only available on advanced standalone camcorders and cameras.
Additionally, newer graphics-oriented user interfaces and mobile games
now require enhanced graphics capabilities.
-- The new Broadcom BCM2763 VideoCore IV multimedia processor enables
best-in-class performance in the following areas:
-- Full HD 1080p camcorder capabilities in a cell phone with
significantly improved quality over current generation handsets
(which generally have VGA or lower resolution camcorders).
-- Up to 20 megapixel digital camera with advanced features such as
multiple shots per second, image stabilization, face and smile
detection and panorama mode.
-- The ability to render mobile games natively at up to 1080p
resolution, which in combination with an on-board HDMI output,
allows a console-quality gaming experience on large screen HDTVs.
-- In addition to providing these capabilities on new handsets, the BCM2763
has improved power savings using a 40nm process without draining the
battery or significantly reducing talk time. Additional ultra-low power
consumption features include:
-- 20% to 50% power reduction in comparison to the prior generation
Videocore III multimedia processor.
-- 4 to 6 hours of 1080p video recording and 8 to 10 hours of mobile
playback, with up to 16 hours of full HD playback over HDMI given
sufficient handset storage.
-- Only 490 mW of chip power is required for 1080p camcorder H.264 High
Profile encoding and only 160 mW for 1080p playback.
-- Only 160 mW of power is required for mobile game graphics
processing, supporting up to 1 gigapixel per second fill rates and
improves graphics performance by a factor of 4x to 6x in comparison
to the prior generation Videocore III multimedia processor.
-- The BCM2763 processor integrates the key functionality and components
needed to drive advanced multimedia capabilities in new handsets. As a
result of this high integration, the BCM2763 enables a lower overall BOM
cost, enabling manufacturers to pass these lower costs on and introduce
advanced features to lower tier phones than previously possible.
-- The BCM2763 integrates the functions of eight chips including GPU
and graphics memory, image signal processing (ISP) and ISP memory,
video processing and video memory, HDMI and USB 2.0. 128MB of LPDDR2
graphics memory is stacked in a single package.
-- The 40nm process enables reduced power, improved performance and
reduced handset board space.
-- Benefiting from an existing VideoCore software code base and legacy
architecture, manufacturers of phones and other consumer electronics
devices can easily add these new VideoCore IV multimedia features to
their products, allowing faster time-to-market.
-- The BCM2763 is currently sampling to early access customers (pricing
available upon request). Handsets utilizing this new 40nm VideoCore IV
multimedia processor technology are expected to reach the market in
2011.
Supporting Quotes:
Mark Casey, Vice President & General Manager, Broadcom's Mobile Multimedia line of business.
"VideoCore IV is setting new benchmarks for performance, power consumption and affordability and is poised to drive advanced multimedia capabilities into new tiers of handsets. Supported by our comprehensive line of complementary cellular and connectivity solutions, our multimedia processor technology is the right choice for next generation mobile designs."
Subscribe to RSS Feed: Broadcom Mobile Platforms Group
About Broadcom
Broadcom Corporation is a major technology innovator and global leader in semiconductors for wired and wireless communications. Broadcom products enable the delivery of voice, video, data and multimedia to and throughout the home, the office and the mobile environment. We provide the industry's broadest portfolio of state-of-the-art system-on-a-chip and software solutions to manufacturers of computing and networking equipment, digital entertainment and broadband access products, and mobile devices. These solutions support our core mission: Connecting everything®.
Broadcom is one of the world's largest fabless semiconductor companies, with 2008 revenue of $4.66 billion, and holds over 3,650 U.S. and over 1,450 foreign patents, more than 7,750 additional pending patent applications, and one of the broadest intellectual property portfolios addressing both wired and wireless transmission of voice, video, data and multimedia.
A FORTUNE 500® company, Broadcom is headquartered in Irvine, Calif., and has offices and research facilities in North America, Asia and Europe. Broadcom may be contacted at +1.949.926.5000 or at www.broadcom.com.
Cautions regarding Forward Looking Statements:
All statements included or incorporated by reference in this release, other than statements or characterizations of historical fact, are forward-looking statements. These forward-looking statements are based on our current expectations, estimates and projections about our industry and business, management's beliefs, and certain assumptions made by us, all of which are subject to change. Forward-looking statements can often be identified by words such as "anticipates," "expects," "intends," "plans," "predicts," "believes," "seeks," "estimates," "may," "will," "should," "would," "could," "potential," "continue," "ongoing," similar expressions, and variations or negatives of these words. Examples of such forward-looking statements include, but are not limited to, the timing that handsets utilizing the Broadcom BCM2763 VideoCore IV multimedia processor are expected to reach the market, and the effect of the VideoCore IV on advanced multimedia capabilities in new handset devices. These forward-looking statements are not guarantees of future results and are subject to risks, uncertainties and assumptions that could cause our actual results to differ materially and adversely from those expressed in any forward-looking statement.
Important factors that may cause such a difference for Broadcom in connection with the BCM2763 VideoCore IV multimedia processor include, but are not limited to:
-- the rate at which our present and future customers and end-users adopt
Broadcom's mobile multimedia technologies for mobile applications;
-- trends in the multimedia processor markets in various geographic
regions, including seasonality in sales of consumer products into which
our products are incorporated;
-- the gain or loss of a key customer, design win or order;
-- the volume of our product sales and pricing concessions on volume sales;
-- our ability to timely and accurately predict market requirements and
evolving industry standards and to identify opportunities in new
markets; and
-- competitive pressures and other factors such as the qualification,
availability and pricing of competing products and technologies and the
resulting effects on sales and pricing of our products.
Additional factors that may cause Broadcom's actual results to differ materially from those expressed in forward-looking statements include, but are not limited to the list that can be found at http://www.broadcom.com/press/additional_risk_factors/Q42009.php.
Our Annual Report on Form 10-K, subsequent Quarterly Reports on Form 10-Q, recent Current Reports on Form 8-K, and other Securities and Exchange Commission filings discuss the foregoing risks as well as other important risk factors that could contribute to such differences or otherwise affect our business, results of operations and financial condition. The forward-looking statements in this release speak only as of this date. We undertake no obligation to revise or update publicly any forward-looking statement, except as required by law.
Broadcom, the pulse logo, Connecting everything, the Connecting everything logo and VideoCore® are among the trademarks of Broadcom Corporation and/or its affiliates in the United States, certain other countries and/or the EU. Any other trademarks or trade names mentioned are the property of their respective owners.
Contacts
Trade Press: Henry Rael, Public Relations Manager, 949-926-5734, [email protected]
Investor Relations: T. Peter Andrew, Vice President, Corporate Communications, 949-926-5663, [email protected]
SOURCE Broadcom Corporation; BRCM Mobile & Wireless; Broadcom Mobile Platforms Group
http://www.broadcom.com
Copyright (C) 2009 PR Newswire. All rights reserved
http://www.broadcom.com/press/release.php?id=s430181
Related
Found the datasheet for the MSM7200A; I'd be pleased if any of you guys could check it and compare it to the iPhone's graphics chip.
Xperia processor: MSM7200A, with integrated graphics chip
iPhone graphics chip: PowerVR MBX-Lite (sorry, no datasheet)
Since the most popular platform (because of its app store and games quantity) is the iPhone, and I must say iPhone games are really superb from a graphics standpoint (imo the best graphics in the mobile phone/PDA/smartphone field, e.g. The Sims 3 and NFS Undercover), I wonder if the Xperia could do the same.
On GSMArena, I discovered that the iPhone's graphics chip is the PowerVR MBX-Lite (on the iPhone and iPhone 3G), featuring OpenGL ES 1.1, OpenVG 1.0, Direct3D and of course full 2D/3D support. Compared to the MSM7200A's integrated ATI acceleration... well, I'm really not a pro here, so I've just found a datasheet on this.
Not to mention the WM gaming industry, which is, erm, "....", but there's the hope that we could have this...
Now we have almost everything: itje, who has dedicated a great deal to XDA (his ROMs and other deeds are priceless), of course his Touch-IT testing team, who not only answer Touch-IT-related questions but are active pretty much everywhere, other ROM and app contributors like gtrab, smaberg, jackleung, Tnyynt, the fingerkeyboard makers, the HTC encoder makers, and the other guys fixing our phone's bugs, creating software for us, the moderators... I could go on and on naming the great people who raised our Xperia from stock to near perfection.
So the only issue (for me) is gaming.
I've heard that EA and Gameloft are porting iPhone games to newer WM devices, but I don't think those sources are... well... realistic.
Tautvydas said:
Been looking in forums, on the web, to be precise everywhere, even the Xperia X1 White Paper, but I still haven't got a clear answer on which graphics chip the Xperia uses. So I've made several assumptions by checking the information a bit.
The common answer is that the Xperia uses the ATI Imageon 2700, but on Wikipedia there's no such chip number as "2700"; also, among the products listed with the Imageon, well, there's no Xperia!
Imo, the candidates for the Xperia's graphics chip are the Imageon 2388/2380 or the Imageon 2300 (if it's even an Imageon); I just need you guys to clarify this.
Just to explain why I'm making this thread: since the Xperia is very versatile (music/internet/messaging/videos with QWERTY, etc.), the only problem (for me), well, not so much a problem as a shortcoming, is gaming.
We have ScummVM, SNES, Sega, even PSX emulators, and maybe one game that shows what the Xperia is capable of: Xtrakt.
Since the most popular platform (because of its app store and games quantity) is the iPhone, and I must say iPhone games are really superb from a graphics standpoint (imo the best graphics in the mobile phone/PDA/smartphone field, e.g. The Sims 3 and NFS Undercover), I wonder if the Xperia could do the same.
On GSMArena, I discovered that the iPhone's graphics chip is the PowerVR MBX-Lite (on the iPhone and iPhone 3G), featuring OpenGL ES 1.1, OpenVG 1.0, Direct3D and of course full 2D/3D support, which, compared to the "Imageon 2388/2380", is "almost" equal (dunno about Direct3D, though I think it's not supported by the Imageon). So if any of you could tell me what the Xperia's graphics chip is, we could clarify whether we "could" enjoy iPhone-level graphics (^^)
Not to mention the WM gaming industry, which is, erm, "....", but there's the hope that we could have this...
Now we have almost everything: itje, who has dedicated a great deal to XDA (his ROMs and other deeds are priceless), of course his Touch-IT testing team, who not only answer Touch-IT-related questions but are active pretty much everywhere, other ROM and app contributors like gtrab, smaberg, jackleung, Tnyynt, the fingerkeyboard makers, the HTC encoder makers, and the other guys fixing our phone's bugs, creating software for us, the moderators... I could go on and on naming the great people who raised our Xperia from stock to near perfection.
So the only issue (for me) is gaming.
I've heard that EA and Gameloft are porting iPhone games to newer WM devices, but I don't think those sources are... well... realistic.
My goal is to clarify the Xperia's graphics potential and get full information about its graphics chip.
P.S. Yes, I've searched XDA/Google for this like crazy. Sorry this thread reads like a poem.
P.P.S. Yes, the Xperia is a business-class phone, but hey, why not dream? It still has some of the best hardware among WM devices.
Could you maybe find out by taking the X1 apart and looking at all of the chips? I looked at some videos online but couldn't see the names on any chips. I'm not sure it would even say, but it seems like it could work. I looked around a bit too, and I keep hearing about the Imageon 2300, but I can't confirm it, sorry.
The ATI part is not a separate chip; it's integrated into the Qualcomm CPU.
http://www.google.dk/search?source=ig&hl=da&rlz=&q=MSM7200a+ati&btnG=Google-søgning&aq=f&oq=
More info:
Well, it's not the 2300. Here's a quote from ATI: "Imageon 2300 integrates an advanced 2D and 3D graphics engine, MPEG-4 video decoder, JPEG encoding/decoding, and a 2 Mega pixel camera sub-system processing engine. With support for up to 2MB of ultra low-power SDRAM, it "
Link: http://ati.amd.com/products/imageon2300/
Since we (so it's said) have 128 MB of shared RAM, and I've also read somewhere in some datasheet that we have a tuned-up GPU (this is all speculative, though).
And why does the Touch Pro have 288 MB of RAM while we have 256 MB? Do we have a better GPU that needs more RAM?
Updated the post, please check; now we just need someone to compare both phones' graphics capabilities.
Chaosstorm said:
Well, it's not the 2300. Here's a quote from ATI: "Imageon 2300 integrates an advanced 2D and 3D graphics engine, MPEG-4 video decoder, JPEG encoding/decoding, and a 2 Mega pixel camera sub-system processing engine. With support for up to 2MB of ultra low-power SDRAM, it "
Link: http://ati.amd.com/products/imageon2300/
Since we (so it's said) have 128 MB of shared RAM, and I've also read somewhere in some datasheet that we have a tuned-up GPU (this is all speculative, though).
And why does the Touch Pro have 288 MB of RAM while we have 256 MB? Do we have a better GPU that needs more RAM?
It seems like SE/HTC made some kind of adjustment to the GPU on the Xperia's MSM7200A, wondering what; so your guess about the Xperia's better GPU might be right, or it's just that the Touch Pro's interface/OEM/OS modifications are more RAM-hungry.
Just out of curiosity: the newly released iPhone 3GS has PowerVR SGX535 graphics, and the SE Idou has a chip from the same SGX family (the 530), so we can expect awesome graphics. The chip's technical capabilities are impressive:
# next-generation fully programmable universal scalable shader architecture
# exceeds the requirements of OpenGL 2.0 and supports up to DirectX 10.1 Shader Model 4.1
It's just that the 3GS's chip will be better than the Idou's (the 535 does 28 Mpolys/s vs. 14 Mpolys/s for the 530).
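To put headline figures like a "1 gigapixel/s fill rate" (Broadcom's claim for the BCM2763 above) or "28 Mpolys/s" in perspective, here is a quick back-of-the-envelope calculation of how many times per frame such a GPU could repaint a typical WVGA screen:

```python
def overdraw_budget(fill_rate_pps, width, height, fps):
    """Times per frame the GPU can repaint every pixel at the given fill rate."""
    return fill_rate_pps / (width * height * fps)

# 1 Gpixel/s of fill rate on an 800x480 (WVGA) screen at 60 fps:
print(round(overdraw_budget(1e9, 800, 480, 60), 1))  # ~43.4x overdraw headroom
```

Fill rate is only one axis of GPU performance, of course; shader throughput and triangle rate matter just as much for games.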
So... where can we find D3D drivers for the X1i?
I've been reading a lot of discussion on this and would love to hear some opinions and see some benchmarks.
I currently own a Nexus One, and where I live a Nexus is priced about $150 more than a Galaxy S (it's my understanding the Nexus is regarded as the cheaper phone in America?). So basically I can sell my 4-month-old Nexus One and buy a brand-new 16 GB Galaxy S at no extra cost. Here is what I am wondering...
I know the Galaxy S has an amazing GPU; it facerolls the Nexus One and even seems to stomp the Droid X with its improved GPU, so that is great.
The CPU, however, seems to underperform in every benchmark I can find versus the Nexus, Droid 2 and many other current high-end Androids.
I realise these devices are running Android 2.2 with JIT. I've seen Linpack results from Galaxy S devices running 2.2 and JIT-enabled ROMs that still don't compare with these other devices.
Question 1
What I'm wondering is: will the difference we see in CPU benchmarks be closed by a proper 2.2 JIT ROM on our devices, or are the Snapdragons and other Qualcomm CPUs simply better than our Hummingbird?
Question 2
My Nexus One scores 30 MFLOPS in Linpack at the moment, and I think I can get it higher with overclocking etc. Does anyone have evidence of a Galaxy S (running 2.2, JIT, a lag fix or anything) that competes, or even comes close to competing, with this? I have been unable to find anything.
Question 3
Are the Quadrant scores people are reporting in the lag-fix threads (2000+) actually representative of speed, or are they (as Cyanogen and others seem to be claiming) distorted?
(I realise a lot of people are reporting lag fixes; what I'm asking is whether the number reported there (twice the N1's Froyo score) is actually accurate. I don't understand the mechanics behind the I/O benchmark, so I don't understand whether the lag fix is distorting its reported results.)
1. The Hummingbird is apparently faster.
2. We don't have JIT yet. Compare the Nexus One on 2.1/Eclair with the Galaxy S on 2.1, and I remember seeing that we are faster. JIT has a massive impact on MFLOPS (because the benchmark runs bytecode, not compiled code).
3. No benchmark is really representative of speed (no matter what people tell you), because different apps have different workloads. You might get 50 MFLOPS in a CPU test, but for 3D games the number of triangles matters more. It has recently been shown that the I/O test in Quadrant can be tricked too.
Benchmarks aren't really comprehensive enough for anything more than getting an idea of the performance, so don't rely on them.
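Since Linpack MFLOPS figures come up a lot in this thread: the score is just floating-point operations divided by elapsed time. Here is a toy sketch of the idea; real Linpack solves a dense linear system, this only illustrates what the unit measures:

```python
import time

def toy_mflops(n=200_000):
    """Time a fixed number of float multiply-adds and report MFLOPS.
    Toy illustration only; not comparable to real Linpack scores."""
    x = 1.0000001
    acc = 0.0
    start = time.perf_counter()
    for _ in range(n):
        acc = acc * x + 1.0  # one multiply + one add = 2 FLOPs
    elapsed = time.perf_counter() - start
    return (2 * n) / elapsed / 1e6

print(f"{toy_mflops():.1f} MFLOPS (interpreted Python, so expect a low number)")
```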
The reason we get crappy benchmarks is the ****ty filesystem (RFS), which doesn't allow multiple concurrent writes. That's what the lag fixes help with. CPU-wise, we eat Snapdragons for breakfast, lunch and tea.
Sent from my GT-I9000 using Tapatalk
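Whatever the truth of the filesystem claim, it is at least easy to demonstrate why an I/O benchmark's numbers depend heavily on whether writes actually reach storage or merely land in a cache, which is exactly the mechanism by which a caching "lag fix" can inflate Quadrant's I/O score. A rough sketch (absolute numbers vary wildly by device and OS):

```python
import os
import tempfile
import time

def time_writes(n=200, size=4096, sync=False):
    """Write n blocks of `size` bytes and return elapsed seconds.
    sync=True forces an fsync after each write (what a strict I/O test does);
    sync=False lets writes sit in the page cache, which is usually far faster."""
    data = b"x" * size
    fd, path = tempfile.mkstemp()
    try:
        start = time.perf_counter()
        for _ in range(n):
            os.write(fd, data)
            if sync:
                os.fsync(fd)
        return time.perf_counter() - start
    finally:
        os.close(fd)
        os.remove(path)

buffered = time_writes(sync=False)
synced = time_writes(sync=True)
print(f"buffered: {buffered:.4f}s, fsync'd: {synced:.4f}s")
```

The gap between those two numbers is the window a write-caching layer exploits: the benchmark sees buffered speed while the data has not actually hit flash yet.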
andrewluecke said:
1. The Hummingbird is apparently faster.
2. We don't have JIT yet. Compare the Nexus One on 2.1/Eclair with the Galaxy S on 2.1, and I remember seeing that we are faster. JIT has a massive impact on MFLOPS (because the benchmark runs bytecode, not compiled code).
3. No benchmark is really representative of speed (no matter what people tell you), because different apps have different workloads. You might get 50 MFLOPS in a CPU test, but for 3D games the number of triangles matters more. It has recently been shown that the I/O test in Quadrant can be tricked too.
Benchmarks aren't really comprehensive enough for anything more than getting an idea of the performance, so don't rely on them.
what he said ^^^
regards
ickyboo said:
The reason we get crappy benchmarks is the ****ty filesystem (RFS), which doesn't allow multiple concurrent writes.
Sent from my GT-I9000 using Tapatalk
Source, please. I've never actually seen anyone prove this here, but I hear it being thrown around increasingly. How was it proven? I'm becoming increasingly concerned that this conclusion was reached by playing Chinese whispers.
andrewluecke said:
Source, please. I've never actually seen anyone prove this here, but I hear it being thrown around increasingly. How was it proven? I'm becoming increasingly concerned that this conclusion was reached by playing Chinese whispers.
Well, if you look at pre-Froyo benchmarks of Snapdragon devices, they generally get around 6.1 in Linpack, vs. ~8.4 for a Galaxy S. That's a pretty big delta, and it carries through most other synthetic and real-world benchmarks: roughly 20% faster at the same clock speed. The same thing can be seen with the TI processors in the Droid line; at 1 GHz, they score in the 8's on 2.1.
Froyo benchmarks are suspect for a number of reasons, mainly because most of the benchmarks were designed with 1.6-2.1 in mind, and partly because Google spent a lot of time optimizing the base Froyo build for a Snapdragon processor. HTC, Sony, Dell, etc can piggyback off this work with their version, whereas Samsung and Motorola have to start much closer to scratch. Which is also why the HTC devices got Froyo sooner.
Believe it or not (and despite the marketing hype) the Snapdragon chipset is a budget solution, with less complex/expensive memory subsystem, and a far less costly integrated graphics solution than what is found on the Galaxy S.
It's cheap to produce, it has almost everything in a nice tidy package that makes it cheaper to engineer handsets (when I say everything, I mean CPU/GPU/Radio/WiFi/GPS/USB).
It's a pretty good package for companies like HTC, who don't do any real hardware engineering and try to keep costs low. They do software (very, very well, I should add), industrial design, and mass manufacturing, but they've NEVER designed a chipset (or display); they always source those from a third party, in this case Qualcomm for the chipset, Samsung/Sony for the displays, etc.
However, they were the first to market with 1Ghz speed and it's a solid and stable hardware setup. Just keep in mind that clock speeds don't tell the whole tale.
The Galaxy S, (and to a lesser extent the Droid series) use a better stand-alone CPU solution and a far superior non-integrated (has its own chip) GPU. Samsung does do their own in-house chipset engineering, and they didn't cut corners on the CPU design, and they learned a lot about how to squeeze a lot of performance out of the ARM instruction set from their own products and the work they did for the iPhone processors. In brute-force, they smack the Snapdragon chipset around like a *****, but they get slapped around in turn by HTC's superior software engineering.
HTC has a real advantage in lots and lots of PDA/Smartphone software experience. They know how to make the most of the hardware they purchase, and seem to spend a great deal of time optimizing the software, be it Windows Mobile or Android, and lessons learned from a decade of making PDAs, under their name and for others.
If HTC used a Hummingbird or TI OMAP chipset with PowerVR GPU, I have no doubt they'd be able to more quickly wring more performance and stability out of it than Samsung or Motorola can.
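As a side note, the Linpack scores quoted above actually imply a larger per-clock gap than 20%, so the ~20% figure presumably averages in the other benchmarks:

```python
def pct_faster(a, b):
    """Percent by which score a exceeds score b."""
    return (a / b - 1) * 100

# Galaxy S (~8.4) vs pre-Froyo Snapdragon (~6.1) Linpack, per the post above:
print(round(pct_faster(8.4, 6.1)))  # ~38
```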
Croak said:
Well, if you look at pre-Froyo benchmarks of Snapdragon devices, they generally get around 6.1 in Linpack, vs. ~8.4 for a Galaxy S. That's a pretty big delta, and it carries through most other synthetic and real-world benchmarks: roughly 20% faster at the same clock speed. The same thing can be seen with the TI processors in the Droid line; at 1 GHz, they score in the 8's on 2.1.
Froyo benchmarks are suspect for a number of reasons, mainly because most of the benchmarks were designed with 1.6-2.1 in mind, and partly because Google spent a lot of time optimizing the base Froyo build for a Snapdragon processor. HTC, Sony, Dell, etc can piggyback off this work with their version, whereas Samsung and Motorola have to start much closer to scratch. Which is also why the HTC devices got Froyo sooner.
Believe it or not (and despite the marketing hype) the Snapdragon chipset is a budget solution, with less complex/expensive memory subsystem, and a far less costly integrated graphics solution than what is found on the Galaxy S.
It's cheap to produce, it has almost everything in a nice tidy package that makes it cheaper to engineer handsets (when I say everything, I mean CPU/GPU/Radio/WiFi/GPS/USB).
It's a pretty good package for companies like HTC, who don't do any real hardware engineering and try to keep costs low. They do software (very, very well, I should add), industrial design, and mass manufacturing, but they've NEVER designed a chipset (or display); they always source those from a third party, in this case Qualcomm for the chipset, Samsung/Sony for the displays, etc.
However, they were the first to market with 1Ghz speed and it's a solid and stable hardware setup. Just keep in mind that clock speeds don't tell the whole tale.
The Galaxy S, (and to a lesser extent the Droid series) use a better stand-alone CPU solution and a far superior non-integrated (has its own chip) GPU. Samsung does do their own in-house chipset engineering, and they didn't cut corners on the CPU design, and they learned a lot about how to squeeze a lot of performance out of the ARM instruction set from their own products and the work they did for the iPhone processors. In brute-force, they smack the Snapdragon chipset around like a *****, but they get slapped around in turn by HTC's superior software engineering.
HTC has a real advantage in lots and lots of PDA/Smartphone software experience. They know how to make the most of the hardware they purchase, and seem to spend a great deal of time optimizing the software, be it Windows Mobile or Android, and lessons learned from a decade of making PDAs, under their name and for others.
If HTC used a Hummingbird or TI OMAP chipset with PowerVR GPU, I have no doubt they'd be able to more quickly wring more performance and stability out of it than Samsung or Motorola can.
Thanks, that was a really insightful post.
So basically, even though our processor should outperform or AT LEAST match the Snapdragons, due to the mass optimization of 2.2's JIT for Snapdragon devices it's likely we'll never see the same performance, unless Samsung gets really keen and does some optimization themselves.
I searched all over the internet to see why the CPU scores in Quadrant and other benchmarks are way lower than the Nexus One's, but I still can't find anything.
Does Samsung disable the JIT in their Froyo ROMs? Because both the Snapdragon and the Hummingbird are still based on Cortex-A8-class cores.
"It's clear that FroYo's JIT compiler currently only delivers significant performance gains for Snapdragon CPUs with the Scorpion core. This in turn explains why, so far, only a beta version of Android 2.2 is available for the Cortex-A8-based Samsung Galaxy S — the JIT compiler is the outstanding feature of FroYo. For the widespread Cortex-A8 cores, used in many high-end Android smartphones, the JIT compiler needs to be optimised. A Cortex-A8 core will still be slower than a Scorpion core at the same clock speed, but the Scorpion's advantage may not be as much as 260 percent."
http://androidforums.com/samsung-ca...ant-scores-why-humming-bird-doing-so-bad.html
There are multiple reasons: an unoptimised JIT, slow memory for caching, and more. Most of them are solved in the CM ROMs (it performs on par with the N1), and I can tell you that when Gingerbread comes it will blow the Snapdragons away.
Which custom ROM provides CPU performance close to Snapdragon?
Still, the 1 GHz Hummingbird outperforms the 1 GHz Snapdragon in real-world performance.
Even the LG Optimus One, with an ARM11 600 MHz core, scores better than the Galaxy S. I still believe it's a software problem.
http://lgoptimusonep500.blogspot.com/2011/01/custom-rom-for-lg-optimus-one-p500.html#more
Another benchmark:
http://www.anandtech.com/show/4126/nokia-n8-review-/7
...where the Nexus S proves that the Hummingbird can do more than it currently does in the Galaxy S.
I currently have an HTC Desire, which I've had since it first came out, and I'm in line for a new handset... I always get mine SIM-free or pay-as-you-go.
I've been looking a lot lately at the Optimus 2X and the Galaxy S2 with regard to gaming. Now, I know there are threads about both GPUs, but... which one is more powerful and will be best "future-proofed"?
I've heard a lot of things on both handset forums saying that the Tegra 2 is a year old and has 8 cores, while the Galaxy S2 is newer and has only 4 cores.
So, as a potential buyer of either handset, I'm looking for the best gaming platform based on games.
At the moment it's a hard choice, because I want to purchase the best phone I can right now...
Any thoughts on which platform will be better, or which will get the support from developers? At the moment NVIDIA have got the marketing right imho, but could the Galaxy S2 overtake that and make "it" the most optimized platform for games on an Android device?
A lot of questions, and I'm unsure of the answers.
Any thoughts?
While I can't tell you which is more future-proof, I think it's worth remembering that NVidia is very old-school in gaming; I'm sure they're doing what they can to promote Tegra as the ultimate mobile gaming platform, and I'm sure they know a few people in the business.
Can non-Tegra phones play Tegra games?
Sent from my LG-P990 using XDA App
@iceman92
Yes.
But still, I'd say Tegra is more future-proof, because developers will focus on the more mainstream processor, which will be Tegra... only a suggestion.
On the future-proof part, I would say Tegra is the best to go with. NVidia has a lot of plans for releasing smartphone CPUs in the future; I mean, they are due to release a quad-core CPU this summer. I've had my O2X for about 2 weeks now and I've had no problems gaming with it, smooth as silk. As long as you use Launcher Pro, you're fine.
The Mali-400 in the SGS2 SoC is older than the Geforce ULP in the Tegra 2, I believe, but the Mali should outperform the Tegra 2 on paper. Currently, gaming on "superphones" is still murky. There are different approaches to how the chips work (e.g. Adreno and PowerVR parts are "tile-based"), so some games will play better on some chips because they are optimized for a kind of graphics design that suits certain GPU hardware.
So here's what you do: focus on good "families" of GPU. First we have Adreno, found in Qualcomm chips (the Adreno 220 in the HTC Sensation slaps the Geforce ULP hard). The Adreno 200 is in the Nexus One and several other Android phones. It's a well-known and widely used GPU in Android.
Next you have PowerVR by Imagination, a very proven family. The PowerVR SGX540 is found in the Nexus S and the Galaxy S i9000 class of phones (a very popular phone), so expect a lot of market share there. PowerVR is also used in iPhones and iPads, so expect some advantages when an iPhone-released game reaches an Android platform.
Next you have the Geforce ULP in the Tegra 2 by NVidia. The Geforce ULP has not had much time to shine, HOWEVER Tegra Zone demonstrates that NVidia has been encouraging developers on the platform. NVidia has a good history of developer support for their desktop chips, and it is quite evident they are doing the same with their smartphones. However, the Tegra 2 is only in two (three if you count the G2x as separate from the O2x) smartphones on the market so far.
From what I can see so far, Adreno, PowerVR, and Geforce ULP are all very relevant to the future of mobile gaming and will be for a long time. There's no chance in hell you can future-proof with any phone you buy now. On average, smartphone GPU performance appears to be outpacing Moore's Law, more than doubling year over year with no sign of slowing down. What you want is something on the market now that you will be satisfied with. That's all you can count on.
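To put a rough number on the doubling claim above, here's a back-of-the-envelope projection (illustrative only; it assumes exactly 2x per year, the conservative end of the claim):

```python
# Rough projection of "more than doubling year over year" in GPU
# performance, taking exactly 2x per year as a conservative lower bound.
baseline = 1.0
for year in range(1, 4):
    baseline *= 2  # one doubling per year
    print(f"Year {year}: {baseline:.0f}x today's GPU performance")
```

Even at the conservative rate, a phone bought today is an order of magnitude behind within three to four years, which is why "future-proofing" a handset purchase is a losing game.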
Thanks guys. In the end I went for the 2X, as I paid £278 for the handset with a trade-in for my Desire.
I'm very happy with the phone at the moment, but I'm having an issue with the free Shrek Kart voucher; it seems the voucher may have been used by someone else. Not to worry.
Haven't had a chance to game on it yet, but I just hope we get games that are optimised for Tegra 2, rather than ports from another, more powerful GPU!
Sent from my LG-P990 using XDA Premium App
During the keynote for the new iPhone on the 7th, Phil Schiller, Apple's VP of marketing, showed a mysterious chart claiming that the tablet's new A5X processor offers up to four times the graphics performance of the quad-core NVidia Tegra 3.
Nvidia did not like this chart.
Ken Brown, a spokesman for the company, told ZDNet that he was "flattered" by the comparison from Apple, but that performance tests require more information.
"We have no information on this benchmark," said Brown. "We need to understand what application was used. Was it just one application, or several? What drivers did they use? There are many variables in benchmark tests."
Ken is right to argue that Apple simply hid that information. Nowhere does the Cupertino company show how it got those numbers, and it probably won't ever explain.
NVidia promised to do their own benchmark tests once the new iPhone is released on March 16. Of course, in those new tests the Tegra 3 will do better than it did in Apple's, as has happened for many years (and still does) in the disputes between NVidia and AMD, where each company showed different benchmarks in which its own chipsets fared better.
At least we know that once the new iPhone is released, numerous comparative tests will start to emerge on the Internet, and we'll have more solid information about who gets the better of the fray.
jeiih said:
Ken is right to argue that Apple simply hid that information. Nowhere does the Cupertino company show how it got those numbers, and it probably won't ever explain.
NVidia promised to do their own benchmark tests once the new iPhone is released on March 16. Of course, in those new tests the Tegra 3 will do better than it did in Apple's, as has happened for many years (and still does) in the disputes between NVidia and AMD, where each company showed different benchmarks in which its own chipsets fared better.
At least we know that once the new iPhone is released, numerous comparative tests will start to emerge on the Internet, and we'll have more solid information about who gets the better of the fray.
Should be iPad I think.
lamborg said:
Should be iPad I think.
Indeed.
Sent from my MB860 using XDA
I can't imagine the A5X being anywhere close to the speed of the Tegra 3.
It's basically an incremental upgrade to the A5, and the Tegra 3 is in a league of its own.
This seems more like BS than actual truth. I agree with what emjlr3 said; the Tegra 3 by far sets the standard for high-end tablet hardware, in my opinion. The A5X is merely an incrementally improved A5. Not to mention the Tegra 3 is a quad core while the A5X is only a dual core at most, and the Tegra 3 has 12 graphics processing cores, while the A5X has 4.
I've seen the "RESOLUTIONARY" iPad video; it's complete bull****. Everything in there is definitely untrue, with no evidence. For example, they claimed that their so-awesome s(*it)Pad has a better display than any HDTV. Seriously, my 30" Sony monitor, which I am typing on right now, manages 2048x1536 with ease, and it's much sharper than what I've seen on the iPad. And NVidia has been miles ahead with its GPUs for years, and the A5X (which isn't even their own creation, or as they would call it, a "groundbreaking innovation"; it's made by Samsung) isn't even near the performance of the Tegra 3, or even the Adreno 225.
Hmm..
Well, I'd like to see proof instead of just bold claims. I've seen the Tegra 3... and it's pretty darn impressive. Let's see how the A5X stacks up, eh?
realfelix said:
I've seen the "RESOLUTIONARY" iPad video; it's complete bull****. Everything in there is definitely untrue, with no evidence. For example, they claimed that their so-awesome s(*it)Pad has a better display than any HDTV. Seriously, my 30" Sony monitor, which I am typing on right now, manages 2048x1536 with ease, and it's much sharper than what I've seen on the iPad. And NVidia has been miles ahead with its GPUs for years, and the A5X (which isn't even their own creation, or as they would call it, a "groundbreaking innovation"; it's made by Samsung) isn't even near the performance of the Tegra 3, or even the Adreno 225.
Dude, you yourself said that you are using a monitor for a hi-res display. Apple never claimed that the new iPad's display is better than any "monitor's" display. It only claimed it has a higher resolution than any HDTV, because the highest resolution of any HDTV right now is 1920x1080. So on that point, Apple did not lie.
In my opinion, if the new iPad can display graphics that are crisper but at the same speed as the current iPad, then it has at least doubled its own speed compared to the last iPad. If it renders graphics faster, then the 4x-faster claim needs to be proven by benchmarking.
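For a sense of scale, here's some quick pixel arithmetic on the resolutions mentioned in this thread (the 1024x768 figure for the previous iPads is my assumption, not something stated above):

```python
# Pixel counts for the displays discussed in this thread.
# 1024x768 for the earlier iPads is an assumption on my part.
new_ipad = 2048 * 1536   # the new iPad's "Retina" panel
old_ipad = 1024 * 768    # assumed iPad 1/2 resolution
hdtv = 1920 * 1080       # full-HD television

print(new_ipad // old_ipad)  # the new panel pushes 4x the pixels
print(new_ipad > hdtv)       # and more total pixels than any 1080p HDTV
```

Doubling each dimension quadruples the pixel count, which is why rendering the same scene at the same frame rate on the new panel already implies a substantial GPU speedup, separate from any 4x-faster marketing claim.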
I've had both an iPad and a Tegra tablet. Depending on configuration, the Android tablets can match the iPad when it comes to graphics, but I noticed some programs are not written well and the graphics stutter. This happens more on Android; I guess it could be because Apple has strict coding guidelines, whereas Android is more open.
Either way, I think it's more what you prefer to use. I have a Galaxy Nexus for my phone but I have the 1st gen iPad for my tablet.
Let's do the logic here: iPad 2 with SGX543MP2 > Tegra 3. Therefore the iPad 3, with an SGX544 that is twice the SGX543, is also greater than the Tegra 3.
Just search for off-screen 720p benchmarks and you'll see the proof.
Now, CPU-wise, the Tegra 3 is more than likely much more powerful.
$1 gets you a reply
emjlr3 said:
I can't imagine the A5X being anywhere close to the speed of the Tegra 3.
It's basically an incremental upgrade to the A5, and the Tegra 3 is in a league of its own.
Uh... what? They were talking about graphics performance, and even the A5 is faster than the Tegra 3 in that respect. The A5X being 4 times faster is quite plausible.
red12355 said:
Uh... what? They were talking about graphics performance, and even the A5 is faster than the Tegra 3 in that respect. The A5X being 4 times faster is quite plausible.
I thought the point of the Tegra 3 was to bring desktop graphics to a tablet/phone?
The iPad 2 surely did not have desktop-like graphics.
AnandTech benchmarked the ASUS Eee Pad Transformer Prime against the iPad 3. Although the gap doesn't reach 4x, the iPad 3 still comes out with somewhat better graphics performance. The CPU on the Tegra 3, though, is clearly better.
ARM Sets New Standard for the Premium Mobile Experience (ARM Cortex A72 & Mali-T880)
Here we go again! | WOOT WOOT! |
ARM Sets New Standard for the Premium Mobile Experience
03 February 2015
News Highlights:
New ARMv8-A-based ARM® Cortex®-A72 processor, 50X increase in CPU performance compared to just five years ago
New ARM CoreLink™ CCI-500 Cache Coherent Interconnect, allowing higher system bandwidth, and increasing system efficiency
New ARM Mali™-T880 GPU delivers console-quality gaming and stunning visuals in a mobile power envelope
Optimization for the leading-edge TSMC 16nm FinFET+ process with ARM POP™ IP
Cortex-A72 licensees include HiSilicon, MediaTek and Rockchip.
Cambridge, UK, Feb. 3, 2015 - ARM today announced a suite of IP that will enable a new standard for premium experiences on 2016 mobile devices. At the heart of this suite is the ARM Cortex-A72 processor, which is the highest performing CPU technology available for developing mobile SoCs today. In target configurations, the Cortex-A72 processor will deliver CPU performance that is 50X greater than the leading smartphones from just five years ago. The ARM premium mobile experience IP suite also offers a significant graphics upgrade generating a stunning visual experience for users at up to 4K120fps resolution. Devices with this new industry-leading technology suite are expected to enter the market in 2016.
The ARM premium mobile experience IP suite offers the most compelling mobile technology available today. Alongside the Cortex-A72 processor is the new CoreLink CCI-500 interconnect and the new Mali-T880 GPU, ARM's highest performing and most energy-efficient mobile GPU, along with Mali-V550 video and Mali-DP550 display processors. To further ease chip implementation, the suite also includes ARM POP IP for the leading-edge TSMC 16nm FinFET+ process.
"Our new premium mobile experience IP suite with the Cortex-A72 processor delivers a decisive step forward from the compelling user experiences provided by this year's Cortex-A57 based devices," said Pete Hutton, executive vice president and president, products groups, ARM. "For multiple generations, together with our partners, we have delivered the leading-edge of the premium mobile experience. Building on this, in 2016 the ARM ecosystem will deliver even slimmer, lighter, more immersive mobile devices that serve as your primary and only compute platform."
The 2016 Premium Mobile Experience
The premium mobile experience IP suite addresses the ever-increasing demands of end-users for their primary, always-connected mobile devices capable of creating, enhancing and consuming any content. For 2016 devices, ARM and its partners will boost the mobile experience associated with use cases such as:
Immersive and sophisticated image and video capture, including 4K120fps video content
Console-class gaming performance and graphics
Productivity suites requiring fluid handling of documents and office applications
Natural language user interfaces capable of running natively on a smartphone.
Introducing Cortex-A72, the Highest Performance ARM Cortex Processor
More than ten partners, including HiSilicon, MediaTek and Rockchip, have already licensed the Cortex-A72 processor, which is based on the ARMv8-A architecture that delivers energy-efficient 64-bit processing while providing full backward compatibility to existing 32-bit software. The Cortex-A72 processor will deliver substantial new benefits:
Sustained operation within the constrained mobile power envelope at frequencies of 2.5 GHz in a 16nm FinFET process and scalable to higher frequencies for deployment in larger form factor devices
3.5X the performance of 2014 devices based on the Cortex-A15 processor
Improved energy efficiency that delivers a 75 percent reduction in energy consumption when matching performance of 2014 devices
Extended performance and efficiency when the Cortex-A72 CPU is combined with a Cortex-A53 CPU in ARM big.LITTLE™ processor configurations.
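Taking the headline numbers in the press release at face value, a little arithmetic (illustrative only, using just the figures quoted above) shows what they imply:

```python
# Back-of-the-envelope arithmetic on the claims quoted above.

# "50X increase in CPU performance compared to just five years ago"
# implies a compound annual growth rate of roughly:
cagr = 50 ** (1 / 5)
print(f"Implied annual CPU performance growth: {cagr:.2f}x per year")

# "75 percent reduction in energy consumption when matching performance
# of 2014 devices" means the same work done on one quarter of the
# energy, i.e. a 4x gain in performance per joule at iso-performance:
efficiency_gain = 1 / (1 - 0.75)
print(f"Iso-performance efficiency gain: {efficiency_gain:.0f}x")
```

So the 50x figure works out to CPU performance a bit more than doubling every year over the five-year window, and the energy claim is equivalent to a 4x perf-per-joule improvement at matched workloads. These are marketing numbers under ARM's own "target configurations" caveat, not independent measurements.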
CoreLink CCI-500, Extending Efficiency Across the SoC
The CoreLink CCI-500 Cache Coherent Interconnect enables big.LITTLE processing and delivers system power savings thanks to an integrated snoop filter. CoreLink CCI-500 delivers double the peak memory system bandwidth and offers a 30 percent increase in processor memory performance compared to the previous generation CoreLink CCI-400. This enables more responsive user interfaces and accelerates memory intensive workloads such as productivity applications, video editing and multi-tasking. CoreLink CCI-500 fully supports ARM TrustZone® technology for a secure media path enabling protection of multimedia content when used with the Mali product family.
Mali-T880, A Ground-Breaking Mobile Graphics and Visual Experience
The new Mali-T880 GPU delivers 1.8X the graphics performance of today's Mali-T760 based devices and a 40 percent reduction in energy consumption across identical workloads. The Mali-T880 enables high-end, complex use cases to be enjoyed on power-constrained mobile and consumer platforms with its advances in energy efficiency, additional arithmetic capabilities and scalability. For mobile gamers, the result is a more advanced gaming and console-like experience. Native support for 10-bit YUV provides stunning fidelity for premium 4K content, complementing the Mali-V550 video processor and Mali-DP550 display processors.
Energy-efficiency continues to be the guiding design principle across the spectrum of the Mali product family, including a diverse set of proven, system-wide bandwidth reduction technologies. In premium device configurations, the Mali-V550 video processor fully supports HEVC decode and encode on a single core. In addition, it offers scalability up to 4K120fps with its full eight cores. The Mali-DP550 display processor offers enhanced capabilities for offloading tasks such as composition, scaling, rotation and image post-processing from the GPU to maximize battery life.
Improved Time-to-Market with FinFET Technology
The new POP IP for advanced TSMC 16nm FinFET+ enables any silicon vendor to migrate from 32/28nm process nodes with predictable performance, power results and time-to-market. ARM POP IP will enable Cortex-A72 processors to sustain 2.5 GHz in smartphones and scale to higher frequencies for larger form-factor devices in typical conditions. POP IP also supports implementations of the Mali-T880 for TSMC 16nm FinFET+.
Advancing Mobile Ecosystem Innovation with ARMv8-A Leadership
The mobile ecosystem is already taking advantage of the benefits of ARMv8-A with the move to Google Android™ 5.0 Lollipop. The new IP suite builds on this, enabling partners to deliver even more performance in the slimmest of form factors without compromising battery life. Over the course of 2015 and 2016, ARM expects significant adoption of Google Android 5.0 Lollipop in the premium mobile device market, further unleashing the capabilities of 64-bit ARMv8-A based CPUs. This opens the door for more application developers to take advantage of the doubling of SIMD multimedia (ARM NEON™ technology) and floating point performance, crypto instructions to protect consumers data and 4GB or higher memory support; delivering the next-generation premium mobile experience.
Partner Quotes
"The need to be always connected with access to data, premium content and storage is dramatically changing as more devices and people use the connected cloud for digital experiences," said George Yao, general manager of Turing Processor BU, HiSilicon. "HiSilicon is committed to offering platforms that support these demands by delivering more performance and outstanding energy-efficiency. We are pleased to support ARM as it introduces the first full suite of IP targeted at premium mobile platforms. The game-changing Cortex-A72 delivers more than 3X the performance of 2014 devices and Mali-T880 takes state-of-the-art graphics to the next level. Our partnership will usher in a new era for mobile and networking solutions."
"The pace of innovation in mobile is accelerating at an unprecedented rate, which means we need to deliver the latest technology to our customers as fast as possible," said Joe Chen, senior vice president of MediaTek. "We are pleased to partner with ARM for the launch of Cortex-A72, bringing the ARMv8-A architecture to market with leading performance and energy-efficiency benchmarks. Ultimately it is all about providing a better experience for end users as the complexity of applications, content and devices increases."
"Increasingly, consumers are adopting mobile devices including smartphones, tablets, phablets and other large screen devices as their primary compute platforms. These larger form factor devices demand higher performance and energy-efficiency that scales across a variety of processor configurations," said Mr. Feng Chen, chief marketing officer, Rockchip. "Rockchip is pleased to partner with ARM to introduce the Cortex-A72 to enable a wide range of premium mobile platforms that consumers can take advantage of. Rockchip is dedicated to bringing cutting-edge CPU and GPU technologies and solutions to our customers and end users to enable a compelling user experiences for personal and enterprise use."
"TSMC's 16FinFET+ process is already delivering exceptional results with SoCs based on Cortex-A57 thanks to rapid progress in yield and performance," said Suk Lee, TSMC Senior Director, Design Infrastructure Marketing Division. "The combination of TSMC 16FF+ process technology and the implementation advantages of ARM POP IP gives our customers the opportunity to rapidly bring highly optimized mobile SoCs based on Cortex-A72 to market in early 2016."
"Cadence and ARM have a long history of close collaboration that has resulted in some noteworthy achievements enabling mutual customer success," said Dr. Chi-Ping Hsu, senior vice president and chief strategy officer for EDA, Cadence. "In support of the new ARM Premium mobile experience IP suite, Cadence has created a complete system-on-chip (SoC) environment to optimize its implementation, verify the system during OS boot-up and analyze interconnect subsystem performance. ARM used the Cadence digital and system-to-silicon verification tools and IP during the ARM Cortex-A72 processor development to ensure that the flow met complex mobile design requirements. This will help designers to shorten time to market while achieving optimal results at advanced process nodes."
"Through more than 20 years of collaboration with ARM, we've enabled our customers to use Synopsys tools to quickly get their innovative products to market while meeting power, performance and area design targets," said Deirdre Hanford, executive vice president, customer engagement, Synopsys. "Early adopters of ARM's new suite of IP, including the Cortex-A72 processor, CoreLink CCI-500 interconnect, Mali-T880 GPU, Mali-V550 video processor and Mali-DP550 display processor, are already using Synopsys tools, methodology and professional services to design and verify their products aimed at delivering a premium mobile experience. For the ARM Cortex-A72, a member of this new suite of IP, our Reference Implementation (RI) builds on our successful RIs for ARM Cortex-A57 and Cortex-A53 and will enable designers to take advantage of the 10X increase in design throughput that Synopsys' IC Compiler™ II product offers, while achieving the performance target in a mobile power envelope."
Ends
arjun90 said:
Here we go again! | WOOT WOOT! |
Just in time for my update to the Note 6