Friday, March 26, 2010

Hummingbird vs Snapdragon: the 1 GHz smartphone showdown

NOTE: This article has been officially hosted at AlienBabelTech.com. For a better reading experience, you may want to view the article here!

If you follow smartphone technology at all, you're sure to have heard of the Qualcomm Snapdragon processor. It's the reigning smartphone CPU heavyweight: a 1 GHz processor packed with a multitude of features, built on the same ARMv7 instruction set as the chips inside modern smartphones such as the Droid, Palm Pre, Nokia N900 and iPhone 3GS. Unlike those chips, however, the Snapdragon runs at 1 GHz while the others run at 600 MHz or below, and it has thus become the chip of choice for premium smartphones.

The Snapdragon SoC (System on a Chip) has appeared in several devices recently. The most well-known example is probably the Google Nexus One, though it had already appeared in an earlier device, the HTC HD2. The HD2, released November 11th, 2009, had a Snapdragon processor as well as a massive 4.3-inch display (measured diagonally), and received rave reviews that almost unanimously ended with one major complaint: the Windows Mobile 6.5 operating system, which is largely unchanged from its predecessors and prone to software problems. In addition, to really exploit the phone's processing power, applications needed to take advantage of it, and the majority of applications written for Microsoft's mobile OS simply didn't. The industry begged for an HD2 with Google's Android mobile operating system, and HTC responded that it wasn't going to happen.

But then Sprint announced the HTC EVO 4G at the CTIA 2010 trade show, and the mobile industry collectively went wild. Here was the phone everyone had been dreaming of: a 4.3-inch display and 1 GHz Snapdragon like the HD2, plus a deployable kickstand, 8MP rear camera, 1.3MP front camera, HDMI port, and 4G WiMAX connectivity. The HD2 had essentially been reborn, new and improved, for the Android OS. Judging by the limelight cast upon it by the mobile enthusiast community, the EVO 4G is positioned to become one of the best-selling smartphones of the year.

However, another device debuted at CTIA 2010 that was largely overshadowed by the launch of the EVO 4G: the Samsung GT-i9000 Galaxy S. This new phone, in contrast, has a 4-inch Super AMOLED display (more on that later), 5MP rear camera, 0.3MP front camera, and GSM/HSDPA (3G/3.5G) connectivity… and, almost as an afterthought, was mentioned to contain Samsung's own 1 GHz processor. Samsung spent a lot of time at CTIA 2010 talking about the Super AMOLED display, and in contrast only a few moments disclosing details on the new SoC, stating that it has over 3x the graphics performance of the leading competition and bests all other smartphone processors on the market today. Only later was it confirmed that the SoC was Samsung's new 45 nm "Hummingbird" platform, the only production 1 GHz ARM processor thus far to challenge Qualcomm's Snapdragon.

When the news of these phones hit the tech blogs, nearly all of the attention went to the HTC EVO 4G. The EVO 4G was what many had been waiting for, and the Samsung was typically given hardly a second glance. But let's take a moment to really compare the hardware of these two Android 2.1 smartphones, and then we'll go a bit deeper into how the SoCs actually stack up against one another when it comes to CPU and GPU processing power.


               HTC EVO 4G and the Samsung i9000 Galaxy S
                              Physical Properties

HTC EVO 4G  |  Samsung GT-i9000 Galaxy S
OS: Android 2.1  |  OS: Android 2.1
Carrier: Sprint  |  Carrier: AT&T (likely)
Cell Data: EV-DO Rev. A (3G), WiMAX (4G)  |  Cell Data: EDGE (3G), HSDPA (3.5G)
Thickness: 13 mm  |  Thickness: 9.9 mm
Weight: 170 grams (with battery)  |  Weight: 118 grams (with battery)
Processor: Snapdragon 1 GHz  |  Processor: Hummingbird 1 GHz
RAM: 512 MB  |  RAM: Unknown
Storage: 1 GB internal + microSD port (up to 32 GB)  |  Storage: 8 or 16 GB internal + microSD port (up to 32 GB)
UI: HTC Sense  |  UI: Samsung Smart Life
Touchscreen: Capacitive multi-touch  |  Touchscreen: Capacitive multi-touch
Display Type: Transflective TFT  |  Display Type: Super AMOLED
Display Diagonal Size: 4.3”  |  Display Diagonal Size: 4.0”
Display Resolution: 480 x 800 pixels  |  Display Resolution: 480 x 800 pixels
Pixel Density: 217.4 pixels per inch  |  Pixel Density: 233.2 pixels per inch
Video Out: HDMI jack (uncovered)  |  Video Out: Wireless only, via DLNA
Audio Out: 3.5mm headphone jack  |  Audio Out: 3.5mm headphone jack
USB: Micro-USB 2.0 jack (uncovered)  |  USB: Micro-USB 2.0 jack (sliding cover)
Bluetooth: Version 2.1 + EDR  |  Bluetooth: Version 3.0
WiFi: 802.11b/g  |  WiFi: 802.11b/g/n
Radio: Analog FM  |  Radio: Analog FM
GPS: Internal GPS, A-GPS, Digital compass  |  GPS: Internal GPS, A-GPS, Digital compass
Rear Camera: 8MP Autofocus, Digital zoom  |  Rear Camera: 5MP Autofocus, Digital zoom
Flash: Dual LED  |  Flash: None
Video Capture: 720p  |  Video Capture: 720p
Front Camera: 1.3MP  |  Front Camera: 0.3MP (VGA)
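
For the curious, the pixel density figures above follow directly from the resolution and diagonal size. A minimal Python sketch of my own, using only the table values as inputs:

    import math

    def pixels_per_inch(width_px, height_px, diagonal_in):
        # Pixel density: diagonal resolution divided by diagonal size
        diagonal_px = math.sqrt(width_px ** 2 + height_px ** 2)
        return diagonal_px / diagonal_in

    print(pixels_per_inch(480, 800, 4.3))  # EVO 4G: ~217.0 ppi
    print(pixels_per_inch(480, 800, 4.0))  # Galaxy S: ~233.2 ppi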


When it comes to built-in storage, the Galaxy S pulls ahead of the EVO 4G with 8 or 16 GB of internal flash storage (there are indications that different models will be available for each size), while the EVO 4G comes with only 1 GB on board. It may be a moot point, however, as both phones include microSD ports for removable memory, and the EVO 4G is likely to ship with a 16 or 32 GB microSD memory card included. Due to current limitations within the Android OS you can't install apps onto removable memory (unless you root the phone), but 1 GB should still be plenty of room for your mobile applications.

Regarding video capture, both phones can record 720p HD video without any trouble, thanks to integrated video encoders in both SoCs. The Galaxy S has a 5MP rear camera while the EVO 4G has an 8MP unit, but pixel count alone shouldn't be treated as an indicator of camera performance. Image noise, focus, low-light performance, color accuracy and so on are all variables that won't be settled until the cameras in both phones can really be used side-by-side. Notably, the Galaxy S at CTIA 2010 did not have an LED flash of any sort, which is surprising considering flashes are fairly standard fare on most smartphones now. However, the Samsung press kit is noticeably devoid of any photos or renders of the rear of the phone, which could indicate that this might change. Interestingly, both phones include a second camera on the front of the device: the EVO 4G has a 1.3MP front-facing camera, while the Galaxy S manages a VGA (0.3MP) unit. These cameras could be used for video conferencing (among other uses), a task the EVO 4G's WiMAX connection would probably handle better than 3G whenever a wireless network isn't available.

Speaking of wireless connectivity, while we've already covered the 3G / 4G differences between the phones, a major feature of the EVO 4G is its ability to function as a mobile 4G wireless hotspot for up to 8 devices. The Galaxy S can be set up to act as a hotspot as well, but only after rooting the Android OS, an administrative-access hack many users won't want to bother with. It's also interesting to note that the Samsung Galaxy S has Bluetooth 3.0, while the EVO 4G has the standard Bluetooth 2.1 + EDR. Additionally, the Samsung apparently supports 802.11n, allowing it to join wireless-n-only networks, while the EVO 4G can only connect to wireless b/g networks (or wireless n networks that also support b/g devices).

The displays of both phones are key selling points for each. The Samsung Galaxy S possesses a 4-inch Super AMOLED display, which Samsung claims is 20% brighter than a regular AMOLED display, consumes 20% less power, and is 80% easier to see in direct sunlight. In addition, Samsung states that the Super AMOLED manages to integrate the touchscreen directly into the display instead of laying it over the top as in conventional LCD and AMOLED displays, which allows for a slimmer display, and thus a slimmer device.

In contrast, the HTC EVO 4G has a larger 4.3-inch display, but it isn't AMOLED. This is not necessarily a bad thing; high-quality LCDs like the 3.7-inch display on the Motorola Droid may not be quite as vibrant or energy-efficient as AMOLED, but are still very impressive. The EVO 4G's screen is essentially the same, if not exactly the same, as the screen on the aforementioned HTC HD2. That screen received a very good response from mobile reviewers, and it's likely that the EVO 4G's screen will be no different. Compared to an AMOLED display, colors may not look as bright, and since the LCD is backlit, blacks will not look quite as dark. Viewing angle is also more of a concern for LCDs than for AMOLED displays, which is one reason (aside from general convenience) why the EVO 4G includes a kickstand for viewing media. For viewing on a separate display, the EVO 4G features an HDMI-out port, furthering its usefulness as a media device. Samsung counters with a wireless solution on the Galaxy S, streaming video via DLNA to compatible displays (and it's no coincidence that this includes higher-end Samsung ones!). Ultimately, it's a toss-up for the consumer between the larger display of the EVO 4G and the brighter, sunlight-friendly, more energy-efficient display of the Galaxy S.

To most smartphone users, the display is one of the most important factors in deciding upon a phone, hence the enthusiasm over the EVO 4G and its display, one of the largest on the smartphone market. But underneath a beautiful display there needs to be a processor that can handle the complex processing needs of the application-intensive Android market, encode and decode high-definition audio and video, and fluidly handle 2D and 3D graphics for software and gaming, all while keeping power consumption to an absolute minimum. The modern-day smartphone is a marvel of engineering, essentially packing a miniature computer into a device we've come to rely upon for information, communication, and entertainment.


It took a bit of digging to find the numbers needed to really break down and compare the hardware in these modern-day smartphones. This chart (linked because it doesn't fit here) simplifies these findings, and includes the iPhone 3GS and the Droid for reference.


CPU Performance

Before I go into details on the Cortex-A8, Snapdragon, Hummingbird, and Cortex-A9, I should briefly explain how ARM SoC manufacturers take different paths when developing their products. ARM is the company that owns and licenses the technology behind all of these SoCs. It offers manufacturers a license to an ARM instruction set that a processor can implement, and it also offers a license to a specific CPU core architecture.

Most manufacturers purchase the CPU architecture license, design an SoC around it, and modify it to fit their own needs or goals. T.I. and Samsung are examples; the S5PC100 (in the iPhone 3GS), the OMAP3430 (in the Droid), and even the Hummingbird S5PC110 in the Samsung Galaxy S are all SoCs with Cortex-A8 cores that have been tweaked (or "hardened") for performance gains to stay competitive in one way or another. Companies like Qualcomm, however, build their own custom processor architecture around a license to an instruction set purchased from ARM. This is what the Snapdragon's Scorpion core is: a completely custom implementation that shares some similarities with the Cortex-A8 and uses the same ARMv7 instruction set, but breaks away from some of the limitations that the Cortex-A8 may impose.

Qualcomm's approach is significantly more costly and time consuming, but it has the potential to create a processor that outperforms the competition. Through its own custom architecture configuration (which Qualcomm understandably does not go into much detail about), the Scorpion CPU inside the Snapdragon SoC gains an approximate 5% improvement in instructions per clock cycle over an ARM Cortex-A8. Qualcomm also appeals to manufacturers by integrating features such as GPS and cell network support into the SoC, reducing the additional hardware a phone maker has to squeeze onto the board. This allows for a more compact phone design, or room for additional features, which is always an attractive option. Upcoming Snapdragon SoCs such as the QSD8672 will allow for dual-core processors (not supported by the Cortex-A8 architecture) to boost processing power, and will provide further ability to scale performance to meet power needs. Qualcomm claims that we'll see these chips in the latter half of 2010, and rumor has it that we'll begin seeing them show up first in Windows Phone 7 Series phones in the fall. Before then, we may see a 45 nm version of the QSD8650 dubbed "QSD8650A" released in the summer, running at 1.3 GHz.

You might think that the Hummingbird doesn't stand a chance against Qualcomm's custom-built monster, but Samsung isn't prepared to throw in the towel. In response to Snapdragon, they hired Intrinsity, a semiconductor company specializing in tweaking processor logic design, to customize the Cortex-A8 in the Hummingbird to perform certain binary functions using significantly fewer instructions than normal. Samsung estimates that 20% of the Hummingbird's functions are affected, and of those, on average 25-50% fewer instructions are needed to complete each task. Overall, the processor can perform tasks 5-10% more quickly while handling the same 2 instructions per clock cycle as an unmodified ARM Cortex-A8 processor, and Samsung states it outperforms all other processors on the market (a statement seemingly aimed at Qualcomm). Many speculate that the Hummingbird's S5PC110 will be the CPU in the iPhone HD, and that its sister chip, the S5PV210, is inside the Apple A4 that powers the iPad. (UPDATE: Indications are that the model # of the SoC in the Apple iPad's A4 is "S5L8930", a Samsung part # that is very likely closely related to the S5PV210 and Hummingbird. I report and speculate upon this here.)
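
Coming back to those percentages, it's worth showing how they combine; optimizing only a fraction of the workload limits the overall gain, just as Amdahl's law describes. A quick sketch of my own, using only the figures Samsung quoted and assuming runtime scales directly with instruction count (a simplification):

    def overall_speedup(fraction_affected, instruction_reduction):
        # Time after optimization, relative to 1.0 before it
        new_time = (1 - fraction_affected) + fraction_affected * (1 - instruction_reduction)
        return 1 / new_time

    # 20% of functions affected, 25-50% fewer instructions on those
    print(overall_speedup(0.20, 0.25))  # ~1.05x, i.e. roughly 5% faster
    print(overall_speedup(0.20, 0.50))  # ~1.11x, i.e. roughly 10% faster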

Lastly, we really should touch upon the Cortex-A9, ARM's next-generation processor architecture, which continues to build on the tried-and-true ARMv7 instruction set. Cortex-A9 targets production at the 45 nm scale and supports multiple processing cores for both processing power and efficiency. Changes in core architecture also allow a 25% improvement in instructions handled per clock cycle, meaning a 1 GHz Cortex-A9 will perform considerably quicker than a 1 GHz Cortex-A8 (or even Snapdragon) equivalent. Other architectural improvements, such as support for out-of-order instruction handling (which, it should be pointed out, the Snapdragon partially supports), allow significant gains in performance per clock cycle by letting the processor prioritize calculations based upon the availability of data. T.I. has predicted its Cortex-A9 OMAP4440 will hit the market in late 2010 or early 2011, and promises that the OMAP4 series will offer dramatic improvements over any Cortex-A8-based design available today.


GPU performance

There are a couple of problems with comparing GPU performance that certain recent popular articles have neglected to address. (Yes, that's you, AndroidAndMe.com, and I won't even go into a rant about the bad data.) The drivers running the GPU, the OS platform it runs on, memory bandwidth limitations, and the software itself all play into how well a GPU performs on a device. In short: you could take identical GPUs, place them in different phones, clock them at the same speeds, and see significantly different performance between them.

For example, let's take a look at the iPhone 3GS. It's commonly rumored to contain a PowerVR SGX 535, which is capable of processing 28 million triangles per second (Mt/s). There's a driver file on the phone that contains "SGX535" in the filename, but that shouldn't be taken as proof of what the phone actually contains. In fact, GLBenchmark.com shows the iPhone 3GS putting out approximately 7 Mt/s in its graphics benchmarks. This initially led me to believe that the iPhone 3GS actually contained a PowerVR SGX 520 @ 200 MHz (which incidentally can output 7 Mt/s), or alternatively a PowerVR SGX 530 @ 100 MHz, since the SGX 530 has 2 rendering pipelines to the SGX 520's 1 and tends to perform about twice as well. Interestingly enough, Samsung's S5PC100 documentation shows the 3D engine as being able to put out 10 Mt/s, which seemed to support my theory that the device does not contain an SGX 535.

However, the GPU model and clock speed aren't the only factors limiting GPU performance. The SGX 535, for example, can only put out its 28 Mt/s when paired with a device that supplies the full 4.2 GB per second of memory bandwidth it needs to operate at that speed. Assume that the iPhone 3GS uses single-channel LPDDR1 memory operating at 200 MHz on a 32-bit bus (which is fairly likely). This allows for 1.6 GB/s of memory bandwidth, approximately 38% of what the SGX 535 needs to operate at its peak speed. Interestingly enough, 38% of 28 Mt/s equals just over 10 Mt/s, supporting Samsung's claim (with real-world performance of 7 Mt/s being quite reasonable). While this still isn't proof that the iPhone 3GS uses an SGX 535, it does demonstrate just how limiting single-channel memory (particularly slower memory like LPDDR1) can be, and it suggests the GPU in the iPhone 3GS is a powerful part that cannot be used to its full potential. The GPU in the Droid likely has the same memory bandwidth issues, and the SGX 530 in the OMAP3430 appears to be down-clocked to stay within those limitations.
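
To make the scaling argument explicit, here's a quick Python sketch (my own simplification, which assumes triangle throughput scales linearly with the fraction of required memory bandwidth actually available):

    def bandwidth_limited_rate(peak_mtps, needed_gbps, available_gbps):
        # Peak triangle rate scaled by the fraction of bandwidth available
        return peak_mtps * (available_gbps / needed_gbps)

    # SGX 535: 28 Mt/s peak, needs 4.2 GB/s; iPhone 3GS supplies ~1.6 GB/s
    # (200 MHz LPDDR1, double data rate, on a 32-bit bus)
    print(bandwidth_limited_rate(28, 4.2, 1.6))  # ~10.7 Mt/s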

But let's move on to what's really important: the graphics processing power of the Hummingbird in the Samsung Galaxy S versus the Snapdragon in the EVO 4G. It's quickly apparent that Samsung is claiming performance approximately 4x greater than the 22 Mt/s the Snapdragon QSD8650 can manage. It's been rumored that the Hummingbird contains a PowerVR SGX 540, but at 200 MHz the SGX 540 puts out 28 Mt/s, approximately 1/3 of the 90 Mt/s that Samsung is claiming. Either Samsung has decided to clock an SGX 540 at 600 MHz, which seems rather high given reports that the chip is capable of speeds of "400 MHz+", or they've chosen to include a multi-core PowerVR SGX XT solution. The latter would allow 3 PowerVR cores (or 2 up-clocked ones) to hit the 90 Mt/s mark without pushing the GPU past 400 MHz.

Unfortunately, this brings us right back to the memory bandwidth argument, because while the Hummingbird likely uses LPDDR2 memory, it still appears to support only a single-channel memory controller (capping memory bandwidth at 4.2 GB/s), which raises the question of how the PowerVR GPU obtains the large amount of bandwidth it needs to draw and texture polygons at those high speeds. If the PowerVR SGX 540 (which, like the SGX 535, performs at 28 Mt/s at 200 MHz) requires 4.2 GB/s of memory bandwidth, drawing 90 Mt/s would require over 12.6 GB/s, 3 times what is available. Samsung may be citing purely theoretical numbers, or using another solution such as increased GPU cache sizes. That would allow for higher peak speeds, but it's questionable whether it could sustain 90 Mt/s.
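
Running the same linear-scaling assumption in reverse shows the size of the gap (again, just a sketch; real GPUs don't scale perfectly linearly with bandwidth):

    def bandwidth_needed(target_mtps, peak_mtps, peak_needs_gbps):
        # Bandwidth required to sustain a target rate, scaling linearly
        return peak_needs_gbps * (target_mtps / peak_mtps)

    # 90 Mt/s claimed, vs. an SGX 540 that does 28 Mt/s on 4.2 GB/s
    print(bandwidth_needed(90, 28, 4.2))  # ~13.5 GB/s, with ~4.2 GB/s available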

Qualcomm differentiates itself from most of the competition (once again) by using its own graphics processing solution. The company bought AMD's Imageon mobile-graphics division in 2008, and used AMD's Imageon Z430 (since rebranded Adreno 200) to power the graphics in the 65 nm Snapdragons. The 45 nm QSD8650A will include an Adreno 205, which will bring some performance enhancements to 2D graphics processing as well as hardware support for Adobe Flash. It is speculated that the dual-core Snapdragons will utilize the significantly more powerful Imageon Z460 (or Adreno 220), which apparently rivals the graphics processing performance of high-end mobile gaming systems such as the Sony PlayStation Portable. Qualcomm is claiming nearly the same performance (80 Mt/s) as the Samsung Hummingbird for its upcoming 45 nm dual-core QSD8672, and while LPDDR2 support and a dual-channel memory controller are likely, it seems apparent that, as with Samsung, something else must be at play for those claims to hold.

UPDATE - As of Christmas Eve 2010, I believe I've sorted out how Samsung pulls this off. I've put up a blog post explaining: http://sean-the-electrofreak.blogspot.com/2010/12/mystery-solved-how-samsung-pulls-off.html

While Samsung and Qualcomm tend to stay relatively quiet about how they achieve their graphics performance, T.I. has come out and specifically stated that its upcoming OMAP4440 SoC supports both LPDDR2 and a dual-channel memory controller, paired with a PowerVR SGX 540, to provide "up to 2x" the performance of its OMAP3 line. This is a reasonable claim assuming the SGX 540 is clocked at 400 MHz and requires 8.5 GB/s of bandwidth, which can be achieved using LPDDR2 at 533 MHz in conjunction with the dual-channel controller. This comparatively docile graphics performance may be due to T.I.'s rather straightforward approach to the ARM Cortex-A9 configuration.
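
That 8.5 GB/s figure checks out against the usual peak-bandwidth formula. A sketch of my own; the double-data-rate factor and 32-bit channel width are my assumptions about the configuration:

    def memory_bandwidth_gbps(clock_mhz, bus_width_bits, channels, data_rate=2):
        # Peak bandwidth: clock x data rate x bytes per transfer x channels
        return clock_mhz * 1e6 * data_rate * (bus_width_bits / 8) * channels / 1e9

    print(memory_bandwidth_gbps(533, 32, 2))  # dual-channel LPDDR2-533: ~8.5 GB/s
    print(memory_bandwidth_gbps(200, 32, 1))  # iPhone 3GS-style LPDDR1: ~1.6 GB/s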


Power Efficiency

Moving onward, it's also easy to see that the next-generation chipsets on the 45 nm scale are going to deliver a significant improvement in both performance and power efficiency. The Hummingbird in the Samsung Galaxy S demonstrates this potential, but unfortunately we still lack the power consumption numbers we really need to judge how well it stacks up against the 65 nm Snapdragon in the EVO 4G. It can safely be assumed that the Galaxy S will have better overall battery life than the EVO 4G, given the lower power requirements of the 45 nm chip, the more power-efficient Super AMOLED display, and the fact that both phones sport equal-capacity 1500 mAh batteries. However, the upcoming 45 nm dual-core Snapdragon is claimed to come with a 30% decrease in power needs, which would allow the 1.5 GHz SoC to run at nearly the same power draw as the current 1 GHz Snapdragon. Cortex-A9 also boasts numerous efficiency improvements, claiming power consumption nearly half that of the Cortex-A8, as well as the ability to use multiple cores to scale processing power in accordance with energy limitations.
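
On that dual-core claim, here's the simple scaling I'm assuming when I say the numbers work out (power taken as roughly proportional to clock speed at a given process, a crude approximation that ignores voltage changes):

    def scaled_power(base_power, base_clock_ghz, new_clock_ghz, efficiency_gain):
        # Scale with clock, then apply the claimed per-clock efficiency gain
        return base_power * (new_clock_ghz / base_clock_ghz) * (1 - efficiency_gain)

    # Relative to the 65 nm 1 GHz Snapdragon's draw (normalized to 1.0)
    print(scaled_power(1.0, 1.0, 1.5, 0.30))  # ~1.05: nearly the same power draw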

While it's almost universally agreed that power efficiency is a priority for these processors, many criticize the amount of processing power these new chips bring to mobile devices and ask why so much performance is necessary. Whether or not mobile applications actually need this much power isn't really the concern, however; improved processing and graphics performance with little to no increase in energy needs will allow future phones to actually be much more power efficient. This is because power efficiency relies in large part on the ability of the phone's hardware to complete a task quickly and return to an idle state where it consumes very little power. This "burst" processing, while consuming fairly high amounts of power for very short periods of time, tends to be more economical than prolonged, slower processing. So as long as ARM chipset manufacturers can continue to crank up performance while keeping power requirements low, there's nothing but gains to be had.
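
A toy model makes the burst argument concrete. All of the numbers below are invented purely for illustration; the point is only that finishing quickly and idling can use less total energy than running slowly for longer:

    def energy_mj(active_mw, active_s, idle_mw, idle_s):
        # Total energy in millijoules over an active + idle window
        return active_mw * active_s + idle_mw * idle_s

    WINDOW_S = 10.0  # time available to finish the same task
    IDLE_MW = 5.0    # hypothetical idle draw

    # Fast chip bursts at 600 mW for 1 s, then idles for 9 s
    print(energy_mj(600, 1.0, IDLE_MW, WINDOW_S - 1.0))  # 645 mJ
    # Slow chip grinds at 300 mW for 4 s, then idles for 6 s
    print(energy_mj(300, 4.0, IDLE_MW, WINDOW_S - 4.0))  # 1230 mJ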


Conclusion

The Samsung Galaxy S, with a 45 nm Hummingbird beating at its heart, looks to have the most powerful ARM SoC available in the near future; however, the 45 nm Snapdragons and the Cortex-A9 TI OMAP4 series will almost certainly claim that title later this year and on into 2011. The EVO 4G, while boasting an impressive set of specifications, still runs a 65 nm chip that is less power efficient and pales in graphics performance next to the Galaxy S. If you happen to live under Sprint's WiMAX umbrella and can't resist a 4.3-inch screen, the EVO 4G may be hard to pass up. But if you're looking for the better performer and a display of unsurpassed quality, the Samsung Galaxy S looks like the one to pick, assuming it lands on a carrier that works for you: AT&T is likely, signs point to T-Mobile as well, and it's speculated that it'll eventually end up on all 4 major US carriers. Both phones are headed our way in "Summer 2010". One thing is certain: it's not going to be an easy wait!

56 comments:

  1. Sean, how can I contact you? I need to talk to you about hosting your blog. Please have a look at our website and contact Karan from the about page.

    http://alienbabeltech.com

    About Page
    http://alienbabeltech.com/main/?page_id=9136

    Hoping to talk to you soon !

  2. Thank you for your interest. I'll be in touch!

  3. Interesting! Where did you find how much RAM the Samsung Galaxy S has?

  4. gazab, I'll be honest with you, I published that because it was corroborated in several places across the web, but it hasn't been verified 100%. As always with the internet, sometimes they all use the same bad source for info. I'm waiting on an email back from a Samsung contact regarding the RAM as well as a couple other details I'd like to verify, such as the lack of the LED flash on the Galaxy S (but noticeable lack of press shots showing the back of the device).

    I'll post back here when I get more info!

  5. The more research I do, the more conflicting info I find. Pulling the paragraph about the RAM until I get a response back from Samsung or an infallible source. Thank you gazab for keepin' me honest. :)

  6. Haha, no problem! I'm just as curious as you! If they confirm 512MB and a LED flash I don't think I can resist buying it...

    I'm very glad that manufacturers are a bit more open when it comes to what's under the hood nowadays. But we're not all the way there yet :)

  7. I'm working on an expanded version of this article for AlienBabelTech.com. I'll be sure to post a link when I'm done. I too am really hoping for those two features and a decent carrier ::crosses fingers::

  8. Hi Sean,

    This was an excellent review of these four phones' technology and their merits; also very much enjoyed the in-depth information regarding the various 'breeds' of processors ('dragon, 'bird...!). It IS difficult to find solid information on the Web - I generally have to search far and wide, and still come up with nothing reliable. Thanks also for indicating your sources for these specs, as well as refraining from describing speculation as 'fact.'
    I've bookmarked your site, and I'll be following your posts in the future. Personally, I'm very interested in the evolution of processors, battery technology, power-usage, and software with regard to consumer tech; but I like to evaluate technical information with a more granular lens, as you have done in your article. Unfortunately, I have the interest, but not the skills to interpret the information on my own.
    Thanks! and please keep writing. By the way, you may enjoy reading a review and contrast of the screen quality of various of the major smartphones on the market (iphone, nexus one, and soon - droid). I found this site while searching for hard data on these phones - and loved what I read (I'm not affiliated with this group! just thought you might enjoy):

    http://www.displaymate.com/Nexus_One_ShootOut.htm

    *my blog page is just a placeholder...*

    Best Regards -
    Jon

  9. Jon, thank you very much, I appreciate your feedback!

    That's an excellent article regarding the displays on the Nexus One, Droid, and iPhone 3GS, I found it very interesting. I hope they do a review of the Super-AMOLED on the Samsung Galaxy S to see if it's all it's cracked up to be.

    I did read an article the other week regarding the performance of the Nexus One screen. It's more concerned with the loss of image quality due to the sub-pixel arrangements.

    http://arstechnica.com/gadgets/news/2010/03/secrets-of-the-nexus-ones-screen-science-color-and-hacks.ars/

    I'm unsure if they picked that way of doing it due to some sort of limitation in the AMOLED technology, or if it was simply to reduce manufacturing costs, but it seems to have some pretty dramatic consequences for a display that purports itself to be cutting-edge.

    I hope to do a proper review of the EVO 4G and the Galaxy S when they come out this summer. Of course, I'll probably have something else to focus on by then... the industry never stops moving!

  10. Just something that occurred to me... Samsung owns the PenTile technology and manufactures the AMOLED screen on the Nexus One. The article in my post above establishes that PenTile works fine for displaying images and photos but doesn't work so well for text, where having pixels in the standard arrangement works better.

    If the Super-AMOLED on the Samsung Galaxy S also uses a PenTile arrangement, text quality may be affected. Though, I'm wondering if they're going to use an RGBW PenTile arrangement (http://en.wikipedia.org/wiki/PenTile) instead of the RGBG arrangement on the Nexus One, as they did describe a brighter display with less power consumption (a feature of RGBW according to that Wiki article) on the Super-AMOLED.

    I also wonder, assuming that's what they do, if it will result in the same kind of text quality concerns that the Nexus One has been subject to.

    Maybe if I find a high-enough resolution video of a Samsung Galaxy S in action and zoom in reaaaaaaaalll close... :-p

  11. My only concern has nothing to do with the actual phone. My concern is with Sprint's 4G map. Their 4G coverage isn't that vast and nowhere near that of Verizon's 3G, which I heard they were in the process of upgrading to 4G.

    You should also be happy that I got around to reading about half of one of your blogs. I had some spare time on my hands. :)

  12. Good to see you Kyonye :)

    You're right, Sprint has a long way before they really provide enough coverage for the majority of subscribers. In addition, most other providers are planning on using LTE for their 4G networks, so Sprint is introducing further mobile industry fragmentation when the rest of the American public is just finally hoping to see phones that will work on multiple 4G networks.

    You do have to give Sprint credit for one thing though; they're apparently pushing the expansion of their 4G network pretty aggressively:

    http://www.engadget.com/2010/03/23/sprint-announces-seven-new-wimax-markets-says-let-atandt-and-ver/

  13. I just remembered something else regarding hardware specifications worth investigating: the touch screen technology.

    The Nexus One took some heat a couple of weeks ago when it was discovered it had some multitouch flaws when compared to the Motorola Droid. Here you can see the issue: http://www.youtube.com/watch?v=qzhUzq6bTPg.

    This test: http://www.youtube.com/watch?v=np_znhgcAg4 also shows some interesting differences in smartphone touch screen accuracy.

    It seems the Nexus One uses Synaptics ClearPad 2000 (http://www.synaptics.com/solutions/products/clearpad) for its touch screen and that's why it only has limited multitouch support. ClearPad 3000 would've been a much better choice.

    So, what other kind and brands of screens exist? What does the iPhone, Droid, EVO 4G and Galaxy S use? Does it even matter?

  14. The OMAP3 is publicly documented, so there is no need for speculation as to the Droid processor. The SGX530 is run at 110 MHz (check the Droid kernel source) and is, according to TI marketing and technical documentation, capable of 10 Mtriangles/sec. Also, the OMAP3 has 256kB of L2 cache as well (which is omitted from your table).

    There is also quite a lot of semi-technical details on the Qualcomm developer sites (free registration and click-through-NDA required) about the GPU in the Snapdragon. It is now known as an Adreno 200, but apparently is identical to the AMD Imageon z430. In addition, I would like to point out that the Scorpion core in the Snapdragon is not an ARM Cortex A8 - it's very similar, but is a different, original implementation of the ARMv7 ISA. You can see some more details here: http://www.insidedsp.com/tabid/64/articleType/ArticleView/articleId/238/Qualcomm-Reveals-Details-on-Scorpion-Core.aspx

    Regarding the Samsung Cortex SoCs, you might want to check the terms of your confidentiality agreements with Samsung... there are statements in this article about the SoCs that are not publicly published, and are stated authoritatively, implying you've got access to the Samsung-confidential datasheets and manuals.

    Comparing GPUs based on benchmarks is disingenuous at best; it is highly likely that Apple's PowerVR/SGX drivers are different from the Linux ones used on the Droid, and certainly different from the ATI Linux drivers for the z430 in the Snapdragon. And of course, many of these benchmarks are dominated by memory bandwidth (the constant challenge in mobile graphics) and thus depend on how external memory is configured a lot more than which SGX core is used and what it's clocked at.

    Finally, it'll be interesting to see what the Apple A4 chip turns out to be. I wouldn't be surprised if it turned out to be a Samsung S5PC110.

  15. Correction: by "comparing GPUs" I meant inferring GPU details from comparing benchmarks.

  16. Gazab, good call. I didn't really go into the actual touchscreen technology at all in my article, and you're right, the Nexus One came out with some real issues.

    I should be able to determine who manufactures the touchscreen for the EVO 4G, assuming the display is the same as the one used on the HTC HD2 (which I believe it is).

    Samsung says the Super-AMOLED integrates the touchscreen directly as part of the display technology, but I don't know much more than that. Time to do some more research!

  17. Hugo, I'd reviewed some of the public T.I. documentation for the OMAP3 and found it extremely vague and unhelpful. From there I'd pursued public sources, but honestly it was supposed to be merely a point of reference to be used against the rest of my argument comparing Hummingbird versus Snapdragon (though admittedly, a point of reference needs to be accurate to be effective). Thank you for the updated information. If this information is publicly available, I was not looking hard enough.

    I knew that 128 KB of L2 is the minimum required in the OMAP3 series of processors, but nowhere had I found that the 3430 actually contained 256 KB. This was however explicitly outlined in the documentation for the S5PC100 in the iPhone 3GS, so I assumed (key word there) that the OMAP3430 in the Droid used the minimum amount (though, not being sure, I did not specify it outright.)

    I'd read that the Snapdragon was derived from ARMv7, but the article I'd found it mentioned in had only described it in passing and all other sources had described it as an ARM Cortex-A8 SoC. Thank you for the link to the article, that website seems an excellent resource, and I'm surprised to find it's not in the list of valuable sites I've compiled (even when I did research on DSP implementation on Cortex-A8!) My Search-Fu needs more wisdom, I suppose.

    I've not really made any effort to gain developer-level access to some of the information primarily due to the NDAs involved; I would be in a dangerous position determining what I'm allowed to disclose as a tech blogger / columnist versus what I must hold in confidence.

    For this reason, the Samsung statistics I use are also from publicly available sources; primarily from Samsung's own marketing statements and press releases. Obviously that information must be taken with a grain of salt, but I've little other choice at this point...

    You make a good point as well regarding comparing GPU benchmarks. But until we have a cross-platform solution that will reliably produce real-world results, we have to muddle our way through as best we can using simple numbers like polys per sec and fill rates. I'm trying to bring some transparency to what most of us consider "mystery chips" in these devices, but I'm rapidly learning how difficult that really is.

    Hugo, given your strong knowledge of this topic, I'd be interested in picking your brain further. If you're willing, please contact me at electrofreak (at) gmail [dot] com

    Thanks again!

  18. I'd be happy to answer any questions here. Cheers

  19. Sure!

    Did a lot of extra research last night on the OMAP processors, some good stuff if you dig deep enough. I've decided to expand my article to include the upcoming OMAP4400, because TI is definitely still a valid contender.

    Documentation is pretty good, ARM is throwing numbers all over the place on the Cortex-A9 (only requires registration on their website, no NDA needed).

    I don't want to dig too deep into how the architecture actually works; I'm not a pro on ARM chips by any stretch of the imagination and my article is intended to just gloss over the kind of numbers we should expect to see from these devices.

    I'll be adding due warning about comparing specs for the GPU, as I've already done with CPU numbers. I'm also doing some research into ARMv7 so I can attempt to highlight the basic differences between the instruction sets of the processors.

    This article has been a great learning experience for me, and I hope to produce an article that will provide accurate yet simple information. AnandTech will be releasing a detailed article comparing the details of the architectures of the various ARM processors pretty soon which is really their area of expertise; I don't intend on competing with that kind of material.

    I will be posting an updated version of my article tomorrow, and I hope that it will be free of any errors. What started as a basic spec comparo between the Galaxy S and EVO 4G has turned into a crash course in ARM architecture for me. I love it, I really do, I just don't want to give my readers the false impression that I'm an expert on the topic; I'd rather give them the numbers, give them the "compare them at your own risk" disclaimer, and let them draw their own conclusions.

    I'm rapidly starting to believe that the Hummingbird is using a multi-core PowerVR (or "SGX XT") solution. Either that or they've clocked the SGX 540 everyone seems to believe it contains to 600 MHz, which seems like an energy drain; a multi-core solution would be much more economical.

    Documentation on the OMAP4400 is pretty clear however that the chip contains an SGX 540. But again, the 540 itself doesn't seem to have the competitive punch needed to rival the graphics power claimed in the 45 nm Snapdragon and Hummingbird. So, it seems like something else is going on here. Either that SGX 540 can take some pretty hefty clock speeds (I've found documentation that indicates it can handle 400 MHz and higher) or they're leaving some information out.

    Any ideas?

    Last but not least, I doubt the A4 contains a Samsung S5PC110 simply because Apple acquired P.A. Semi in 2008, and it's known that they were working on ARM SoCs at the time. It is known that Samsung manufactures the chip, but it stands to reason that Apple would be using the chip that they designed through P.A. Semi. Who knows though, maybe Samsung collaborated on it. Too bad Apple isn't in the habit of talking about such things...

  20. This page:
    http://www.samsung.com/global/business/semiconductor/productInfo.do?fmly_id=229&partnum=S5PC100&xFmly_id=229
    claims the SGX535 implementation in the iPhone 3GS is only capable of 10 Mtriangles per second.

    The total available system RAM bandwidth (shared between the apps processor, the GPU, the LCDC, etc.) is only going to be about 200 MHz * 32-bit bus * 2 (double data rate) = about 1.6 GB/sec. The maximum possible polygon rate on the SGX535 assumes maximum supported bandwidth (4.2 GB/sec) is available.

    It's starting to look like the biggest performance improvement we will see over the next year or so is the switch to LPDDR2 and dual-channel memory configurations. OMAP4 supports this. One of the two Hummingbird configurations supports dual channel, and both LPDDR2. The new Qualcomm SoCs will invariably support LPDDR2 and no doubt have a dual-channel memory controller too.

    Also, for some more information on the Qualcomm/ATI GPU roadmap, see http://www.qdevnet.com/dev/gpu/processors

    The SGX core clock speeds can be assumed to scale up nicely with a die-shrink. If they were running at 200 MHz on 65 nm, there is no reason to assume a clockspeed less than ~350-400 MHz on 45 nm.

    Regarding the A4, the latest information reveals that: (a) it's fabbed by Samsung, (b) Intrinsity - the guys who did the Hummingbird A8 core - was recently acquired by Apple, (c) the A4 also has a 1 GHz A8 core, (d) the A4 appears to use a dual-channel memory configuration like the S5PV210, (e) the package and ball pitch appears similar if not identical to the S5PC110, (f) the A4 still is using a SGX535 with similar 3D benchmark results. So in summary, the A4 is likely to be a simplified S5PV210 with an older GPU, in the package of a S5PC110...

    Finally, I assume you've seen the Anandtech articles today:
    http://anandtech.com/show/3633/apples-a4-soc-faster-than-snapdragon
    http://anandtech.com/show/3632/anands-google-nexus-one-review

    Interestingly, the Scorpion core is described as being somewhere in between an A8 and an A9... some out-of-order execution is supported, the VFP (floating point unit) is pipelined, and the NEON SIMD unit is twice as wide (twice the throughput of vectorized math assuming no memory bandwidth bottleneck). He describes the Adreno GPU as dated, but after doing more research myself I'm not sure I agree that it is as inferior to the SGX535 as he implies...
    I think the drivers probably have more to do with it.

  21. Hugo:
    A) Who ARE you?!
    B) Can I be your friend??

    In all seriousness, what you say makes sense yeah, because 1.6 / 4.2 = 38%, 38% of 28 million triangles per second for the SGX 535 = 10.66 million triangles per second.

    Wow. I get what you're saying. These phones really are slaves to their memory bandwidth!

    I knew something was very wrong with the numbers that were originally posted on the article done by AndroidAndMe (which the entire internet has seen in its god-awful glory) and I knew that the iPhone did not have the kind of GPU performance that was being claimed.

    Man, I need to stop asking you questions because it keeps making me want to rework my article all over again, lol.

    Where did you find the PowerVR SGX memory bandwidth numbers? I'm not seeing those anywhere in my notes, and I've looked through most of the documentation on Imagination's website.

    You've really given me some great documentation already to work with, I appreciate it!

  22. NVM, I see where you found it. Now I'm just trying to determine the proper fill rates for the iPhone and Droid. In particular, knowing the fill rate for the iPhone would allow me to pretty much definitively state if it actually contains an SGX 535 or not.

    Also, I didn't see where any TI documentation claims 10 Mt/s for the OMAP3430, and an SGX 530 at 110 MHz should be 7.7 Mt/s. The fill rate should be 190 Mpx/s as it should be subject to the same memory bandwidth limitations as the iPhone.

    The thing concerning me right now is that even with LPDDR2 and dual-channel, I don't see how Samsung and Qualcomm are going to pull off the numbers they're claiming. I mean, figure LPDDR2 at 533 MHz * 32 bits * 2 (double data rate) * 2 (dual channel) = 8.5 GB/s. If the SGX 540 needs 4.2 GB/s to process 28 Mt/s then it needs at least 12.6 GB/s to process 90 Mt/s.

    Not to mention I really have a hard time believing that they're going to clock the SGX 540 at 600 MHz to get that kind of performance, 45nm die or no. 400 MHz would be fine, TI's documentation lists that as supported, and it would fall within the 8.5 GB/s memory bandwidth needed.

    But hey, the claims by Samsung and Qualcomm still stand... and I'm not quite sure how they're accomplishing that (assuming they're not doing some serious number-fudging!)

  23. ::Wondering aloud:: I'm wondering if maybe they're using Rambus' Mobile XDR. If their claims are true, they could provide the memory bandwidth that Samsung would need for that kind of crazy GPU performance.

    Details on the RAM in the Samsung Galaxy S are noticeably missing from every spec sheet, and the email I sent Samsung last week asking about it has gone unanswered. Maybe this is the ace up their sleeve?

    The only problem is that Mobile XDR was only publicly announced last month, though Rambus says the architecture is available for licensing.

    Hmmm!

  24. I doubt they're using Mobile XDR... it sounds like it only became available for licensing in February, and given that it requires both memory and SoC support, I would guess it'll be 12-18 months minimum before we see it in products.

    The other thing to consider (yay, yet another wild card!) is that the Tile-based Deferred Rendering architecture that these devices use can get away with a relatively small but high bandwidth local memory on chip (called "GMEM" in Adreno, "system-level cache" in SGX). You can read more about the advantages of fast local storage in the context of the Xenos graphics chip in the Xbox 360 (which incidentally uses a similar unified shader architecture to the Adreno 200):
    http://en.wikipedia.org/wiki/Xenos_%28graphics_chip%29
    (cf. the Adreno docs also say the GMEM is used to support dedicated color, Z and stencil buffers, but not the whole framebuffer obviously.)

    One way the newer 45nm SoCs can improve graphics performance drastically is to increase the size of the on chip graphics memory. In fact, increasing on chip memories & caches is one of the more important ways that smaller feature sizes improve performance. It is also my understanding (but don't quote me on this) that fill rates are going to be limited by off-chip memory bandwidth more so than the polygon rate because it necessarily involves the framebuffer (which is necessarily too large for on-chip memory, which is fabricated in a static RAM process).

    I look forward to seeing your revised article!

  25. Oh and as to who I am: just another "electronics freak" :-) I designed a board around the OMAP3530 last year for a low-volume industrial product (and did the Linux bring-up for it), and am thus quite familiar with that particular SoC, and I've done some iPhone OpenGL programming so have an interest in mobile GPUs as well.

  26. Very interesting, Hugo. And yes, I just received a response from Rambus; they've not licensed the technology out to anyone yet. Wasn't aware SoC support was needed, but frankly, it makes sense; I shouldn't expect it to be a plug-n-play alternative to LPDDR2.

    I think I'm going to keep my article from straying too far past polygon counts. Even fill rate is just thrown into my article's chart for reference. Maybe in a future article when I can really research the GPU thoroughly I can look into it.

    I guess I'll have to leave some questions hanging in my article. I hate to do it, but some of this stuff is just buried in too much of a cloud of secrecy by Samsung and Qualcomm. TI seems pretty forthcoming, but probably because they're essentially using a Cortex-A9 hard macro with their OMAP4 (they may be making some changes but it looks like they're going to be sticking pretty closely to the stock A9 config).

    I learn so much every day I put into this, and it only makes me want to write more and more... I'm getting to the point where I'm going to have to cut myself short and just publish.

    Anyhow, again, I appreciate your feedback; it's helped me turn my article into a much wiser piece of work!

  27. Aaannnd, the part number for the A4 escapes:
    http://iphonejtag.blogspot.com/
    S5L8930 - sounds a lot like a Samsung SoC part number to me :-)

  28. ...and they're all too busy freaking out over the prospect of iPad jailbreak to even notice / care! Good eye!

    You were pretty close to hitting that nail right on the head!

  29. Article republished. This is pretty much its final form.

  30. Some minor corrections:

    "If you follow the smartphone industry at all, you're sure to have heard of the Qualcomm Snapdragon processor. It's the new smartphone CPU heavyweight, a 1 GHz processor packed with a multitude of features, based upon the ARM Cortex-A8 that many modern smartphones such as the Droid, Palm Pre, Nokia N900 and iPhone 3GS use."
    -- Wrong, it's not based upon a Cortex A8: "similar" yes, "based on" no.


    "For instance, Qualcomm chose to increase the width of the internal bus for the chip to 128 bits, while allowing it to shut down half of that bandwidth if necessary to conserve power."
    -- I believe this is wrong too. I think you may have confused the doubling of the SIMD/vector engine (from 64 in the A8 to 128 bits in the Scorpion). Shutting down half a bus would have a negligible, almost unmeasurable effect on power consumption.

    "Changes in core architecture also allow a 25% improvement instructions that can be handled per clock cycle, meaning a 1 GHz Cortex-A8 will perform considerably quicker than a 1 GHz Cortex-A8 (or even Snapdragon) equivalent."
    -- Ignoring the obvious typo (first A8 should read A9), I think you're not giving the A9 enough credit. Read Anand's A9 article again (http://www.anandtech.com/show/2911/4).

    "Rumors indicate that Qualcomm and Samsung are also working on 45 nm Cortex-A9 SoCs, though no timetable has been set."
    -- Qualcomm are not likely to be working on an A9 SoC, but rather dual core and out-of-order-execution extensions to their Scorpion core.

    "Qualcomm claims that we'll see their next generation of Snapdragons mid-2010"
    -- True, but this is the QSD8x50A dieshrink (1.3 GHz, slightly improved, faster GPU), not the upcoming dual-core version, or the inevitable A9-competing out-of-order-execution extensions to the Scorpion core. I believe we can expect dual core Snapdragons in smartphones late 2010 to early 2011. And, you may have seen, Nvidia recently announced that they intend to bring the Tegra-2 (dual A9s) to the smartphone market by late this year as well.

  31. Hugo, I've already addressed a few of your concerns when I proofed my draft last night (after posting it, not a great idea TBH!)

    Calling a Snapdragon a Cortex-A8: Already fixed

    128-bit Snapdragon bus: I'd heard the SIMD bus had been expanded, but I'd also read that the CPU bus itself was expanded to 128 bits, improving data throughput, and that a significant power-saving feature was the ability of the processor to shut down half of that bus. Looks like I need to find where that was stated and double-check it.

    Cortex-A9 improvements: Yeah, A9 brings a lot of new things to the table like out-of-order instruction handling etc, but I wasn't sure I wanted to wade too deep into the details. If I can provide my readers with a simple explanation of A9's merits, I may expand upon it further.

    Qualcomm and Samsung working on Cortex A9: I already removed that reference entirely. It was a rumor that I couldn't corroborate, and you're right, Qualcomm has invested too much into Snapdragon to let it go so soon.

    Next-gen Snapdragons: Well, I've seen some pretty good rumors stating that a Windows Phone 7 Series phone will hit the market in the fall sporting a dual-core Snapdragon (call it a flagship device if you will). We'll see if that actually pans out. But I haven't heard a whole lot about the single-core 45 nm "A" variants, have you heard anything interesting?

    The article has been republished with the changes I made last night, but tonight when I put the final touches on it, I'll be taking a look at some of the points you made above. As usual, thank you for your feedback Hugo, it's much appreciated.

  32. Re: the QSD8x50A - see http://www.qdevnet.com/dev/gpu/processors
    They have a 1.3 GHz processor, and the Adreno 205 with "significant improvements in shader performance over Adreno 200 GPU", as well as hardware 2D (SVG/flash) acceleration.

    The same page says that the Adreno 220 ("our next GPU") will come out in 2011 - I believe this is the GPU the QSD8x72 uses, because it seems they would have made more of a fuss about the 205 if it was ~4x more performant than the Adreno 200 (as claimed by http://www.qctconnect.com/products/snapdragon.html).

  33. thank you both electrofreak and hugo for this very intriguing article and dialog.

  34. Thank you; I hope Hugo will return in the future to comment on other articles of mine as well, his contributions greatly improved the accuracy and detail of my article.

    Trying to decide what I should write about next currently; this article ended up being a huge investment in time (one that I don't regret by any measure) but in doing so has sort of pulled me off track. I'd love to go into further detail, but AnandTech will be doing an architecture breakdown in the near future that I'd rather not try to compete with (and judging by the quality of their articles in general I doubt I could) and I'm not sure what I should try to take a look at next when I don't even own a smartphone myself! (I'm one of those "If I wait 2 more months, 'X' phone will be out...")

    I probably should put ARM stuff on hold for a while and get my Cisco training done, finally!

  35. Electrofreak, Great article. I am surprised you didn't mention much about the OMAP3630 and Tegra 2 in your article. Most devices in 2010 will be based on these SoCs. My bet is Tegra 2 with A9 and LPDDR2 will rock. Any insight on this will be nice.

  36. I tried to stick with SoCs already on the market to compare with the EVO 4G and Galaxy S for this article.

    However, I definitely do plan on covering new SoCs as more information becomes available. Tegra 2 does indeed look very promising, and TI's 45 nm Cortex A8s will likely be no slouch either.

    I would also like to officially announce that my article has been published by AlienBabeltech.com!

    It can be seen here: http://alienbabeltech.com/main/?p=17125

  37. AFAIK it's the upcoming 45 nm 1.3 GHz single core from Qualcomm which is 30% more power efficient compared to the 65 nm QSD8250.
    What I heard about the dual-core version is that it consumes "not that much more" power in idle mode than the QSD8250 but is more efficient while under load.

  38. I think both are 30% more power efficient by virtue of the 45 nm design, which is what allows the 1.5 GHz dual-core to run at only slightly higher power consumption levels than the current QSD8x50 (which apparently eats about 500 mW on average).

    At least, my math assumed that by 30% more power efficient, they meant consuming only 70% as much power per MHz as the QSD8x50, which calculates out to 525 mW at 1.5 GHz.

    The 1.3 GHz QSD8650A would run at 455 mW, a reasonable improvement over current power consumption rates, even with the hike in clock speed. Pretty impressive!

  39. Thanks man, really appreciate this article!

  40. "Samsung may be citing purely theoretical numbers ..."

    Ok so did you hold both phones side-by-side and test them to actually verify this long piece of crap? If not, what a waste. I almost took you seriously ... looks like a lot of creative positioning of already written rhetoric. Do some real research before you right about it.

  41. Ah. I could delete the previous post, but I'm in the mood to put some smackdown on this moron.

    Your comment is pretty amusing, my idiotic little friend, because this article was written around 3 months ago. This was shortly after CTIA 2010, where the Samsung Galaxy S was announced alongside only a handful of specs.

    The phone was demoed briefly at a show where reviewers were allowed to handle it for only a couple of minutes at a time, with pre-loaded software (and no benchmarking tools) installed.

    So this article was written based upon virtually no available information, and with absolutely zero opportunity to test the device alongside the EVO 4G, which had also just been announced and was likewise unavailable for testing.

    I hear the Windows Phone 7 devices are coming out in a few months. Why don't you "do a little research" and write an article on a few of them? What's that? There isn't any information on them aside from some rumored specs, a corporate press statement, and some shaky camera shots?

    Welcome to my world :)

    PS - Next time you feel the need to open your mouth and make yourself sound like a fool, check the date on the article first. ;)

  42. Oh and maybe it's just my inner typist talking, but I'm surprised that you spelled the word "rhetoric" properly, but couldn't manage to spell the word "write" right. (Actually, you did spell it "right", har har.)

    Seriously, put down the thesaurus, you're not fooling anyone.

  43. LOL@ "right" - Great article btw. I'm seriously considering holding off on the Evo in favor of the Epic just announced from Sprint (Galaxy S Pro).

  44. Thanks Brian. In retrospect my response was perhaps a little too heated, but I felt the need to defend my work. :)

    I'm also looking at the Galaxy S Pro as my likely next phone, but the fact that it's on Sprint (I get teased mercilessly when I mention I might get a Sprint phone) and the rather surprising benchmark results of the Droid X are making me hesitate.

    I'm really looking forward to seeing some more hard numbers. I've been assured by a couple of my contacts that they're forthcoming, and we might even see some show up in the next few days.

  45. @Electrofreak - "*affected. Grammar fail."
    It is actually supposed to be effected. Affected is derived from affection, which has to do with feelings. Effect is used when something effects something else, as in cause and effect. However, I do know that when looking in the dictionary today, affect is listed as interchangeable, which is wrong. No clue why the English language went this way. However, it is a big pet peeve of mine.

  46. I like this article. I'm about 6 days out from picking up my Vibrant. It will be a great phone to get, and it will be my first one. :)

  47. Swehes:

    Thanks for the attempt at correction, but you should probably review the definitions of both words, because you're incorrect.

    Affect means to impact or cause a change to something.

    Effect means to produce or bring something about.

    Copied directly from dictionary.com, with the noun definitions removed (as the word was being used as a verb):

    af·fect [v. uh-fekt; n. af-ekt]
    –verb (used with object)
    1. to act on; produce an effect or change in: Cold weather affected the crops.

    ef·fect [ih-fekt]
    –verb (used with object)
    10. to produce as an effect; bring about; accomplish; make happen: The new machines finally effected the transition to computerized accounting last spring.

    I wrote: "Samsung estimates that 20% of the Hummingbird's functions are affected, and of those, on average 25-50% less instructions are needed to complete each task."

    Since the functions were acted upon (an effect or change was made in them), the word affected is the appropriate usage. Proper use of the word effected would be in a sentence such as "The modifications Samsung effected are estimated to impact 20% of the Hummingbird's functions, and of those, on average 25-50% less instructions are needed to complete each task."

    To further explain the difference, read Dictionary.com's usage note for affect:
    Usage Note: Affect and effect have no senses in common. As a verb affect is most commonly used in the sense of "to influence" (how smoking affects health). Effect means "to bring about or execute": layoffs designed to effect savings. Thus the sentence These measures may affect savings could imply that the measures may reduce savings that have already been realized, whereas These measures may effect savings implies that the measures will cause new savings to come about.

    Grammar is a pet peeve of mine too ;)

  48. And H-E, thank you. I wish I had the time to expand upon it now that much more information is flooding in now that people have had the chance to benchmark both phones. The Galaxy S definitely outperforms the EVO 4G; however, there have been some surprising performance results from TI's own 45 nm OMAP processors. Take a look at this post if you haven't already:
    http://sean-the-electrofreak.blogspot.com/2010/06/ruminations-on-various-benchmarks-for.html

  49. And yes, to those of you subscribed to my blog's comment updates: I'm sure you know how obsessively I tend to take down and repost my comments after screening them repeatedly for grammar and spelling issues. I'll take the opportunity to apologize to anyone spammed by numerous notifications of my comment postings. It's probably why I have relatively few followers! :-p

    So, I'll forgo the usual delete-and-repost I'm tempted to do to correct the redundant phrasing in the second sentence of my previous post. But I promise you, it's killing me!

  50. Man, you completely forgot to mention TEGRA2!!!

  51. Tegra 2 is Cortex-A9, like the OMAP 4440 I touched upon. Since there is very little documentation describing any differences between the OMAP 4440 and the Tegra 250 (for example), I did not cover the topic.

    However, as more information on specific manufacturers' implementations of Cortex-A9 comes through, I look forward to writing about them. I assume that early on in Cortex-A9 production, the primary difference between the chips will be the graphics solution, and I'm sure that nVidia will not disappoint.

  52. Sean, great article. I am generally not involved in hardware design issues, but I like it when people spend their spare time doing good for the public by researching things. I came to your site because I wanted to do some reading before buying the Galaxy S. Of course, I am not the average mobile phone user; in the real world I develop enterprise Java apps, but I started developing a small Android app recently... Again, thanks for your efforts. Keep on!

  53. Glad you enjoyed the article, Marc! If you haven't seen it already, check out my most recent post, "Android phones benchmarked; it's official, the Galaxy S is the fastest." I only wish I'd had the opportunity to benchmark the phones in person myself!

    I wish I could write more, but my new job has kept me busy. Perhaps when things settle down a bit I can get back into researching and writing.

  54. Thanks for giving me this wonderful information; it can be very helpful for tracking a cell phone.
