Latest Pipeline Posts
Checking Their Pulse: Hisense's Google TV Box at CES
by Jason Inofuentes 14 hours ago

So, Google TV is still happening. Indeed, more players are getting into the game than ever. Hisense is a Chinese OEM/ODM that's seen steady growth in the television market internationally, and hopes to build a big presence in the US this year. Their Google TV box, Pulse, was announced as among the first to be built around the Marvell Armada 1500 chipset, and we've been waiting for it patiently ever since. It's available on Amazon right now, and we'll hopefully have it in for review soon. For now, we got a chance to take a peek at Hisense's interpretation of Google TV while on the show floor at CES. 

To recap, Google TV is Mountain View's stab at altering the television viewing paradigm. It has gone through some pretty immense transformations since it was first introduced, and while all implementations share a basic UI paradigm, Google has allowed OEMs to skin parts of the experience. The latest software iteration (V3, in Google's parlance) has three key pillars: search, voice and a recommendation engine. Search, understandably, is Google's strong suit, and is leveraged to great success. Voice's execution is good, though its value is limited. Primetime is the recommendation engine, and while it's no doubt quite good, it feels little different from the similar features provided by Netflix and the like. 

Hisense isn't shipping V3 software just yet, but a few things stand out about its implementation. We'll start with the Home screen. Lightly skinned and functional, the screen is fairly satisfying. The dock and the three featured apps across the top are static, but the "Frequently Used" field is populated automatically based on your usage. The area below the video field would make a great place for a social feed widget, or perhaps some other useful data, but, as usual, it is instead devoted to ad space. Just off the Home button is a new button that maps to an old function. Previously, hitting the Home button from the Home screen brought you to a field where you could configure widgets. Here that "double tap" is moved to a separate button, but the result looks largely the same. 

The remote control is a many-buttoned affair, with a large touchpad (complete with scroll regions) on one side and a QWERTY keyboard on the back. The touchpad is quite large, though responsiveness was a bit hit or miss; it's hard to blame the BT/WiFi-powered hardware, though, in such a spectrum-crowded environment. The button layout is oddly cramped for such a large remote, thanks to that touchpad and a similarly large set of directional keys. The QWERTY keyboard on the back, though, benefits from the acreage and has a good layout. No motion controls are on offer here; this is a tactile interface all the way. And truly, I'm not going to miss waving a wand around. 

There are three hardware things a Google TV needs to get right, and so far no device has hit all three. Video decode needs to be flawless and extensive; if local file playback is available, it shouldn't be limited to just a handful of codecs and containers, and it shouldn't ever falter. 3D rendering should at least be passable; as Android devices, these boxes ought to be able to play some games, and so far that's been ignored. More important than 3D, though, is 2D composition, which must be fast no matter how many effects you throw at the screen. In many past devices, the UI was generally sluggish, but it slowed to an absolute crawl when you asked it to overlay a UI component over video. Imagine our surprise, then, when Hisense pulled it off without a hiccup. 

Hitting the Social button while watching a video brings up this lovely widget, which shows your Twitter and Facebook feeds and even offers sharing and filtering options. The filtering options are most intriguing, since they'd allow you to follow a content-based hashtag (say #TheBigGame) and participate in the conversation related to the content you're watching, all on the same screen. For terrestrial content, the widget shifts the video into the upper-left region so that none of it is obscured. 

But as nifty as the widget may be, what really set it apart was how quickly its components were drawn and updated. From the button press to a fully composited and updated widget couldn't have been more than a second. Jumping from there to the Home screen was quicker still, and opening Chrome and navigating to our home page all happened without noticeable stutter. 

Chatting with Marvell later, we discussed how they used their own IP to develop their composition engine and targeted just this sort of use case for it. Based on our time with their solution on the show floor, they and Hisense have done some good work. We can't wait to get our hands on the hardware ourselves and see just how good it gets. 

The Tegra 4 GPU, NVIDIA Claims Better Performance Than iPad 4
by Anand Lal Shimpi 17 hours ago

At CES last week, NVIDIA announced its Tegra 4 SoC featuring four ARM Cortex A15s running at up to 1.9GHz and a fifth Cortex A15 running at 700 - 800MHz for lighter workloads. Although much of CEO Jen-Hsun Huang's presentation focused on the improvements in CPU and camera performance, GPU performance should see a significant boost over Tegra 3.

The big disappointment for many was that NVIDIA maintained the non-unified architecture of Tegra 3, and won't fully support OpenGL ES 3.0 with the T4's GPU. NVIDIA claims the architecture is better suited for the type of content that will be available on devices during the Tegra 4's reign.
 
Despite the similarities to Tegra 3, components of the Tegra 4 GPU have been improved. While we're still a bit away from a good GPU deep-dive on the architecture, we do have more details than were originally announced at the press event.

Tegra 4 features 72 GPU "cores", which are really the individual lanes of Vec4 ALUs that can work on both scalar and vector operations. Tegra 2 featured a single Vec4 vertex shader unit (4 cores) and a single Vec4 pixel shader unit (4 cores). Tegra 3 doubled up on the pixel shader units (4 + 8 cores). Tegra 4 features six Vec4 vertex units (FP32, 24 cores) and four 3-deep Vec4 pixel units (FP20, 48 cores). The result is six times the ALU count of Tegra 3, all running at a max clock speed that's higher than the 520MHz NVIDIA ran the T3 GPU at. NVIDIA did hint that the pixel shader design is somehow more efficient than what was used in Tegra 3. 
 
If we assume a 520MHz max frequency (where Tegra 3 topped out), a fully featured Tegra 4 GPU can offer more theoretical compute than the PowerVR SGX 554MP4 in Apple's A6X. The advantage comes as a result of a higher clock speed rather than larger die area. This won't necessarily translate into better performance, particularly given Tegra 4's non-unified architecture. NVIDIA claims that at final clocks, it will be faster than the A6X both in 3D games and in GLBenchmark. The leaked GLBenchmark results are apparently from a much older silicon revision running nowhere near final GPU clocks.
 
Mobile SoC GPU Comparison

|  | GeForce ULP (2012) | PowerVR SGX 543MP2 | PowerVR SGX 543MP4 | PowerVR SGX 544MP3 | PowerVR SGX 554MP4 | GeForce ULP (2013) |
| Used In | Tegra 3 | A5 | A5X | Exynos 5 Octa | A6X | Tegra 4 |
| SIMD Name | core | USSE2 | USSE2 | USSE2 | USSE2 | core |
| # of SIMDs | 3 | 8 | 16 | 12 | 32 | 18 |
| MADs per SIMD | 4 | 4 | 4 | 4 | 4 | 4 |
| Total MADs | 12 | 32 | 64 | 48 | 128 | 72 |
| GFLOPS @ Shipping Frequency | 12.4 | 16.0 | 32.0 | 51.1 | 71.6 | 74.8 |
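The GFLOPS row is just arithmetic: each MAD counts as two floating point operations per clock. A quick sketch reproduces it; the PowerVR clocks below are back-calculated from the table's GFLOPS figures rather than confirmed shipping frequencies, and the Tegra 4 entry assumes the 520MHz discussed above, so expect small rounding differences:

```python
# Peak theoretical GFLOPS = total MADs x 2 FLOPs (multiply + add) x clock in GHz.
# PowerVR clocks are inferred from the table, not confirmed shipping numbers;
# the Tegra 4 clock assumes Tegra 3's 520MHz (final clocks will be higher).
gpus = [
    ("GeForce ULP (Tegra 3)",               12, 0.520),
    ("PowerVR SGX 543MP2 (A5)",             32, 0.250),
    ("PowerVR SGX 543MP4 (A5X)",            64, 0.250),
    ("PowerVR SGX 544MP3 (Exynos 5 Octa)",  48, 0.533),
    ("PowerVR SGX 554MP4 (A6X)",           128, 0.280),
    ("GeForce ULP (Tegra 4)",               72, 0.520),
]
for name, mads, ghz in gpus:
    print(f"{name:38s} {mads * 2 * ghz:5.1f} GFLOPS")
```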
 
Tegra 4 does offer some additional enhancements over Tegra 3 in the GPU department. Real multisampling AA is finally supported as well as frame buffer compression (color and z). There's now support for 24-bit z and stencil (up from 16 bits per pixel). Max texture resolution is now 4K x 4K, up from 2K x 2K in Tegra 3. Percentage-closer filtering is supported for shadows. Finally, FP16 filter and blend is supported in hardware. ASTC isn't supported.
 
If you're missing details on Tegra 4's CPU, be sure to check out our initial coverage. 

Intel's Quick Sync: Coming Soon to Your Favorite Open Source Transcoding Applications
by Anand Lal Shimpi 20 hours ago

Intel's hardware accelerated video transcode engine, Quick Sync, was introduced two years ago with Sandy Bridge. When it was introduced, I was immediately sold. With proper software support you could transcode content at frame rates that were multiple times faster than even the best GPU-based solutions. And you could do so without taxing the CPU cores. 
 
While Quick Sync wasn't meant for high quality video encoding for professional production, it produced output that was more than good enough for use on a smartphone or tablet. Given the incredible rise in popularity of those devices over recent history and given that an increasing number of consumers moved to notebooks as primary PCs, a fast way of transcoding content without needing tons of CPU cores was exactly what the market needed.
 
There was just one problem with Quick Sync: it had zero support in the open source community. The open source x264 codec didn't support Quick Sync, and by extension applications like Handbrake didn't either. You had to rely on Cyberlink's Media Espresso or ArcSoft's Media Converter. Last week, Intel put the ball in motion to change all of this. 
 
With the release of the Intel Media SDK 2013, Intel open sourced its dispatcher code. The dispatcher simply detects what driver is loaded on the machine and returns whether the platform supports hardware- or software-based transcoding. The dispatcher is the final step before handing off a video stream to the graphics driver for transcoding, but previously it was a proprietary, closed source piece of code. For open source applications whose license requires that all components contained within the package be open source as well, the Media SDK 2013 should finally enable Quick Sync support. I believe this was the last step in enabling Quick Sync support in applications like Handbrake.
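For illustration, the dispatch logic amounts to a probe-and-fall-back decision. Here's a toy Python sketch of that flow; the names are hypothetical stand-ins, not the actual Media SDK API (which is a C library):

```python
# Toy sketch of dispatcher-style behavior: detect the loaded graphics driver,
# then choose hardware or software transcoding. Driver, query_loaded_driver
# and pick_transcode_impl are all hypothetical stand-ins.
from dataclasses import dataclass

@dataclass
class Driver:
    name: str
    supports_quick_sync: bool

def query_loaded_driver() -> Driver:
    # A real dispatcher would inspect the system's graphics driver here.
    return Driver(name="igfx", supports_quick_sync=True)

def pick_transcode_impl() -> str:
    """Return 'hardware' when the driver exposes Quick Sync, else 'software'."""
    driver = query_loaded_driver()
    return "hardware" if driver.supports_quick_sync else "software"

print(pick_transcode_impl())  # 'hardware' on a Quick Sync capable platform
```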
 
I'm not happy with how long it took Intel to make this move, but I hope to see the results of it very soon. 

Vizio's New Touch Notebook and AIO PCs at CES
by Vivek Gowri 20 hours ago

Vizio used CES as the platform to debut the third revision of its PC lineup, which currently consists mostly of ultrabooks and all-in-ones. The first revision was the initial launch last summer, while the second revision brought touchpad updates (replacing the godawful Sentelic pads with better Synaptics units) and Windows 8. This third revision brings touchscreens and quad-core CPUs across the board to all Vizio systems, notebook and all-in-one alike. 

Vizio’s notebook lineup is presently structured with a Thin+Light and a Notebook; the former is available in two form factors (14” 900p and 15.6” 1080p) with Intel’s ULV processors, solid state storage, and integrated graphics, while the Notebook is 15.6” 1080p with quad-core IVB processors, Nvidia’s GT 640M LE graphics, and a 1TB hard drive paired with a 32GB caching drive. Across the board, we see IPS display panels, fully aluminum chassis, and uniform industrial design. 

The new Thin+Light Touch again comes in 14” and 15” models, now exclusively with quad-core AMD A10 or Ivy Bridge i7 processors, and with AMD dedicated graphics available on the AMD model. The dual-core and ULV parts are gone, and with nary a mention of the CN15 Notebook, it would appear that it has been killed off because of too much overlap with the Thin+Light Touch. Both quad-core CPUs and dedicated GPUs are available in the latter, so you're not losing much, though that means there is no longer an Intel quad + dGPU config on offer.

As can probably be surmised from the name, the Thin+Light Touch is available exclusively with a capacitive multitouch display. This adds a bit of thickness and weight to the chassis, but the 15.6” model is still 4 pounds (up from 3.89 lbs before), so it's not a huge penalty. Other improvements include a much more structurally sound palm rest and interior, which results in significantly less flex in both the body and the keyboard. This is likely the most significant of the chassis-level upgrades, and it fixes the last major flaw of the second revision notebooks. Battery capacity has been “nearly doubled,” which suggests something close to 100Wh (the previous Thin+Light was 57.5Wh), with the hope of substantially improving battery life.

Gallery: Vizio Laptops

It seems like a pretty targeted generational update, with all of the pain points from the first two notebooks fixed. I think I’d still like to see some improvements in terms of ports on offer (2xUSB and no SD slot just isn’t enough), but the gorgeous IPS display and nice industrial design make up for any remaining flaws. Price points are expected to be similar to the previous Thin+Light, and availability is expected to be in the early spring timeframe. 

Vizio also had its All-in-One Touch series desktops at their suite in the Wynn, though these are not new products. Vizio updated the AIO series with touchscreen displays and Synaptics touchpads at the Windows 8 launch, and simply brought those to Las Vegas to complement their new notebook, tablet, and HDTV products on the show floor.

Razer Edge: Impressions and Thoughts
by Vivek Gowri 21 hours ago

I spent a fair amount of time at CES playing with the Razer Edge, mostly because it was one of the more intriguing new products on the show floor. (Shield was another one, but Nvidia sadly kept it in a glass cage.) As recapped in our announcement post, it’s a 10.1” tablet that packs an ultra-low voltage Ivy Bridge CPU and an Nvidia GT 640M dGPU and comes with a gamepad accessory that turns it into the world’s largest GameBoy Advance. This, for a tablet, is a ridiculous amount of power. I’ve always been someone who appreciates insanity in mobile technology design, and the insanity of a 45W power envelope in a 10” form factor is something that I respect. 

The Edge on its own is pretty intense - 0.8” is really thick for a tablet, with a general sense of chunkiness that starkly contrasts with the extremely svelte Blade. The intake and exhaust vents are put to the test in any extended gaming, and one of the units on the show floor that had been continuously running Dirt 3 for the previous few hours was... warm. It'll be hard to tell how close to thermal equilibrium the Edge gets until we get one in our labs, but I expect it to throttle significantly at some point. 

17W Core i5 and i7 parts were chosen instead of the new 7W Y-series CPUs due to the higher clock speeds and better turbo capabilities of the U-series processors. The SSD has not yet been finalized, with different drives in all the prototypes that I played with. The display panel is in fact an IPS panel, which my announcement post was mistaken about (I was misinformed initially during the CES pre-briefing, but Razer’s engineering team corrected me). It looks pretty decent, and the capacitive touch panel was pretty responsive. The 1366x768 resolution matches up with most of the other 10.1” Windows tablets we’ve seen, and was likely chosen in lieu of 1080p so that the GT 640M LE could comfortably game at native resolution. 

There’s a 40Wh battery on board, with an extended 40Wh extended battery that fits in the gamepad and notebook docks. (It’s a 14.8V 2800 mAh battery, for an exact capacity of 41.44 Wh). 80Wh is a ton of battery for a device this small, but when stressed, it’ll go quickly. A rough estimate of the internal components gives us a basic estimate of 40W power draw (17W CPU, 22W dGPU, in most gaming situations figure a 50% load on CPU and 100% load on GPU, add about 10W for display, wireless and other miscellaneous stuff) and we’re sitting at an hour of gaming on the internal battery and 2 hours with the extended pack. Obviously, turning down settings to reduce system load, brightness, and the intensiveness of the game being played will affect these figures - Razer quotes a range of 2-4 hours of mobile gameplay. Normal battery life should be in the 3-5 hour range on the internal battery and about double that with the extended pack. 

The gamepad controller essentially works like an Xbox controller, with intuitive controls and built-in force feedback. It's pretty cool; it's something I would have absolutely killed for when I rode the bus to school every day back in my early undergrad days. The tablet clips into the gamepad, which essentially envelops the tablet like a case, and then you're off. I spent my fair share of time playing Dirt on it, and it was just great. The control layout is identical to the 360's, and the analogs and triggers are responsive. Razer definitely knows how to put a good 360-style controller together, as evidenced by the Sabertooth, so this came as no real surprise. The setup adds a bit of heft to the tablet, to the tune of roughly 3.25 pounds total, but for the amount of mobile gaming potential it brings, I'd say it's a relatively small tradeoff. The only downer was the $249 price point on the accessory.

The keyboard dock, on the other hand, was kind of a disappointment. It's definitely a work in progress and isn't expected to ship until Q3 (the gamepad and docking station will ship alongside the Edge in Q1), but it's a clunky piece of kit with a currently not-very-good keyboard and a pretty unrefined hinge/latch design. I'll chalk the flex up to the hand-built state of the prototypes, but the key sizing is way too small - instead of going edge to edge like most netbooks, there's a border left around the keyboard that results in tiny keys. The keys absolutely need to be bigger for any semblance of a decent typing experience. There's a lot of improvement that can be done here; I suggest the design team pick up a Transformer laptop dock or a late-model Eee PC and borrow liberally from that keyboard design. ASUS has absolutely perfected the 10.1” keyboard, so it's not a bad idea. I'm not going to rake Razer over the coals for a product that clearly isn't anywhere near finished yet though, so let's move on.

The docking station was set up with an LCD TV and a pair of Sabertooth controllers in multiple places in the Razer CES booth, as well as their meeting suite. In all cases, the display was set to be mirrored, presumably to ensure that the games were played at the internal panel’s native 768p and not 1080p (where performance would understandably struggle). I’m still really interested in tossing an Edge + dock on my desk with a 24” display and a Bluetooth keyboard/mouse, it seems like one of the more viable 2 pound desktop replacements around. 

Pricing slots in at $999 for the base i5/4GB/64GB model, $1299 for the i7/8GB/128GB Pro model, and $1499 for the Pro plus Gamepad bundle. Doubling capacity to 256GB will run you an extra $150 for the Pro models. If you don’t want anything other than the tablet, the base model is a pretty good deal, but once you start adding accessories you might as well spring for the Pro bundle and resign yourself to paying Razer’s typically expensive peripheral costs. They don’t even try to deny that the Blade, the Edge, and all of their keyboards and mice are pricey - Razer has cultivated a premium brand ethos, and it’s done pretty well for them thus far.

G.hn and HomePlug Head for Showdown
by Ganesh T S 22 hours ago

It has been a while since we covered PLC (powerline communication) technology here, but we took the opportunity to check up on the latest and greatest in the area at CES. G.hn has been championed by the HomeGrid Forum, and the companies promoting it in early 2011 included Sigma Designs, Lantiq and Marvell. In fact, at CES 2011, we visited the Sigma Designs suite to see G.hn silicon in action for the first time. Lantiq had also demonstrated a G.hn chipset at the same show. Much water has flowed under the bridge since then: Lantiq seems to have quietly stopped advertising its XWAY HNX solutions on its website, Sigma Designs isn't doing too well financially, and Michael Weissman, one of its most vocal G.hn proponents, has moved on. These factors, however, didn't prevent Sigma Designs from introducing its 2nd generation G.hn chipset (PDF). There has been a change of PR hands at Sigma Designs, and we were unfortunately not invited to see it in action. However, Marvell was gracious enough to invite us to check out their G.hn system in action.

Meanwhile, HomePlug invited us to check out a compatibility test using commercially available HPAV (HomePlug AV) equipment. Qualcomm Atheros is no longer the sole vendor, with Broadcom and MStar also pitching in with their own solutions. The Broadcom solution with the integrated AFE (Analog Front End) has been well received by the vendors. The HomeGrid Forum regularly organizes plugfests too, but they are of little relevance if one can't purchase the involved equipment in stores.

It is good to see G.hn silicon in what appears to be ready-to-ship casing, but the bigger question is one of compatibility with existing equipment. Marvell indicated that service providers are lining up to supply G.hn equipment to customers (particularly in the growing Asian markets). However, with HPAV equipment already widespread throughout the world (particularly through consumer channels), it remains to be seen whether service providers can take the risk of their equipment's performance degrading in an MDU (multiple dwelling unit) scenario where the adjoining units have HPAV equipment. Marvell does promise good network isolation in the MDU case, and it will be interesting to see how HPAV and G.hn networks co-exist.

Progress with G.hn has been very slow. It is a pity that silicon demonstrated as early as January 2011 is yet to ship to customers two years down the line. Under conditions of anonymity, some networking vendors told us that they have given up on G.hn and are looking forward to HPAV2 silicon coming out towards the end of the year. The HomeGrid Forum and its members have been quick to publicize any service provider / supplier agreements, and so far we have seen reports of Comtrend, Suttle, Chunghwa Telecom Labs and Motorola Mobility showing interest in G.hn. As long as Sigma Designs and Marvell remain in the fray, G.hn lives to fight another day. We will be keeping close tabs to find out when the first G.hn products start shipping to the customers of the service providers who have opted for it.


Buffalo Technology Updates NAS and DAS Lineup at CES 2013
by Ganesh T S yesterday

Brian already updated readers on the new products from Buffalo in the networking and Thunderbolt space. There were updates on the NAS front too. The primary announcement was the launch of the LinkStation 400 series of NAS devices. The available models include single and dual bay configurations with the option of going diskless (410D / 420D / 421E). These NAS devices also incorporate support for the BuffaloLink remote service and new mobile apps. The chassis has a black matte finish. Buffalo claims 80+ MBps throughput. Pricing ranges from $149 for the 421E to $719 for an 8TB 420D. Availability is slated for the end of Q1 2013.

The BuffaloLink service enables secure cloud access to the NAS. It consolidates various NAS devices under one account and provides easy remote access. The service works via a relay mechanism and doesn't require any port forwarding. Buffalo maintains servers in the US, Europe and Asia for this purpose. The service is available free for the life of the product. Plans are also underway to expand BuffaloLink to other products such as routers.

The DriveStation DDR is a USB 3.0 DAS (Direct Attached Storage) unit with a 1 GB DDR3 cache. This is in addition to the 32 - 64 MB cache already present in the hard disk. The DRAM allows writes to be cached before hitting the hard disk, which makes writes to the DAS appear quite fast to the user (as much as a 350% improvement in some cases). Of course, there is no protection against power loss; users should make sure the DriveStation DDR is connected to a UPS when critical data is being transferred to it. Pricing ranges from $119 for a 1TB version to $189 for a 3TB version. Availability is scheduled for the end of Q1 2013.
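For the curious, the behavior (and the risk) of a write-back cache like this is easy to see in miniature. A toy Python sketch, not Buffalo's actual firmware:

```python
# Toy write-back cache: writes are acknowledged as soon as they hit RAM and
# are flushed to the (much slower) disk later. That's the perceived speedup;
# anything still in the buffer is lost if power dies before flush() runs.
class WriteBackCache:
    def __init__(self, disk):
        self.disk = disk    # backing store; a dict stands in for the HDD
        self.dirty = {}     # acknowledged to the user, not yet on disk

    def write(self, name, data):
        self.dirty[name] = data       # returns immediately: "fast" writes

    def flush(self):
        self.disk.update(self.dirty)  # the slow part, done later
        self.dirty.clear()

disk = {}
cache = WriteBackCache(disk)
cache.write("report.doc", b"...")
# a power cut at this point means "report.doc" never reaches `disk`
cache.flush()
```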

Head on over to the source link for more specifics on the products launched.

Acer Shows Off 2880x1620 Panel
by Jarred Walton yesterday

We visited with Acer at this CES, and while we weren't specifically told we could discuss anything we saw, after seeing another publication post pictures of Acer's pre-release 2880x1620 IPS display laptop, it appears that's fair game. So, let me tell you what we know.

The panel as noted is 2880x1620, with a diagonal of around 15.6" (give or take 0.1" I'd guess). This is basically the non-Apple version of the QWXGA+ display, only in 16:9 attire rather than 16:10. The display is clearly IPS or some other wide viewing angle design, and when we walked into Acer's suite to look at the laptops and tablets, from an oblique angle it stood out as far and away the best display of the bunch. I also took some time to show the same image (wallpaper) on the 2880 panel alongside adjacent 1366x768 and 1080p panels (both TN), and the difference in color was astounding.

My best guess for when we'll see this LCD show up in an Acer laptop (and potentially in laptops from other vendors) is around late Q2 2013, when the Haswell launch occurs. That should give the OEMs plenty of time to figure out how they're going to deal with an ultra-high-DPI panel in Windows, and that's where Apple's control over both the hardware and the OS is going to be difficult to beat. Hopefully when the display shows up, manufacturers will also remember to spend the extra time and money to pre-calibrate for accurate colors, and it sounds like that's at least in the cards.

Interacting with HTPCs: Adesso, IOGear & Rapoo Demonstrate Options at CES 2013
by Ganesh T S yesterday

Media Center remotes are a dime a dozen, but, judging by the threads that frequently pop up on AVSForum, it appears that a number of users prefer full-sized keyboards. Some of the popular options for controlling HTPCs include the diminutive Logitech diNovo Mini and the Lenovo N5902 keyboard / 'trackball' combo. The Logitech K400 with an integrated touchpad is also quite good and economical (and my personal HTPC solution for now), but it isn't really ideal as an extended alternative to a mouse. At CES, we went around the show floor looking for HTPC control solutions, paying particular attention to the full size offerings. A separate mouse is out of the question in a HTPC setup, and most units bundled either a touchpad or a trackball. Wireless communication happened in either the 5 GHz or 2.4 GHz band with a specialized USB receiver on the PC side; in some cases, the protocol of choice was Bluetooth, which has the advantage of also interfacing with Bluetooth-capable tablets.

Adesso:

Adesso had the yet-to-be-launched WKB-4150DW Bluetooth 3.0 aluminum touchpad keyboard on display. It is mainly intended to interface with tablets, but the build and features make it an ideal HTPC companion. The differentiating feature of this product is the option to use either 2.4 GHz (with a dedicated USB receiver) or Bluetooth for communication using a switch at the rear of the unit.

Older keyboard / trackpad / trackball combo models were also on display.

IOGear:

IOGear wasn't introducing any new keyboard / mouse combos, but they had their full lineup on display. The GKM571R appeared to be quite interesting given its minimalist design. The unit even turns off completely when the upper lid is closed. The on-lap keyboard with optical trackball and scroll wheel, the GKM581R, in addition to being an ergonomic alternative for HTPCs, is also compatible with multiple game consoles (including the PS3). The GKM681R retains the same compatibility as the GKM581R, but in a compact form factor without the on-lap ergonomic design. The GKM561R has a laser trackball with 400, 800 or 1200 dpi settings. The unit is MCE-ready with appropriate shortcuts and also retains the game console compatibility of the previous two models.

Rapoo:

Rapoo had a variety of Windows 8 peripherals on display. Of interest to the HTPC crowd were the wireless multimedia touchpad keyboard (E9180P) and the wireless illuminated keyboard with touchpad (E9090P). Both of these communicate in the 5 GHz spectrum, avoiding interference with Wi-Fi, Bluetooth and other 2.4 GHz devices. There is support for customizable touch gestures for personalizing the navigation experience. The latter features inductive wireless charging and the backlight is adjustable.

We look forward to getting some of these models in for review towards the end of this quarter.

Seagate and LaCie Demonstrate Complementary Product Lineups at CES 2013
by Ganesh T S yesterday

Seagate is well on its way to completing the acquisition of LaCie, and the two companies had a joint presence at CES 2013. For the most part, the companies have complementary lineups. There are two areas of overlap: external hard drives, and entry-level business NAS systems / network attached hard disks. In the former space, LaCie differentiates by providing different aesthetics for the case itself. In the latter space, the differentiation is almost non-existent. In particular, both the LaCie 2big NAS and the Seagate BlackArmor NAS 220 serve the same market segment and have similar performance. It will be interesting to observe how LaCie and Seagate consolidate their budget business NAS offerings.

Seagate Wireless Plus:

The most important announcement from Seagate was the Wireless Plus portable hard drive. This is a follow-up to the Seagate GoFlex Satellite that we reviewed in late 2011. Seagate claims to have increased battery life by better optimizing how long the drive stays spun up depending on the content being streamed. The included battery is good for up to 10 hours of video playback, according to Seagate.

iOS and Android apps are available to interface with the wireless drive and access the content. In our hands-on testing, we found the Android app to perform way worse than the iOS app with respect to speed and ease of use. The STCK1000100 1 TB version is available for pre-order at a price point of $200. LaCie doesn't have any similar product in their line-up.

Seagate Central:

The Seagate Central is a network attached hard disk with a very pleasing industrial design. The unit is based on a Cavium chipset (ARM-based) and has a single GbE port as well as a USB port in a recessed nook. We voiced our concerns about the placement of the USB port (too close to the network jack, and also lacking clearance against the recessed wall). In terms of products in the same category, Seagate is pitting this against the Western Digital MyBook Live and the Iomega single bay network attached hard disk. The plus points of the Seagate Central include a Samsung SmartTV app to access the content, as well as Android and iOS apps which replicate the functionality seen in the Wireless Plus's apps. The issues we pointed out with the Android app in the Wireless Plus remain in the Seagate Central too.

The product will ship in March with a MSRP of $190, $220 and $260 for the 2TB, 3TB and 4TB versions respectively. LaCie has a network attached hard disk in the LaCie CloudBox as well as the LaCie d2 Network 2, though they aim at a different segment of the network attached hard disk market.

LaCie 5big NAS Pro:

This is a 5-bay NAS based on the Intel Atom D2700 platform, meant as a performance offering in the SMB NAS market. We were able to present some thoughts on a beta unit just prior to CES. Do head over to the first part of our review for more information about the 5big NAS Pro.

LaCie 5big Thunderbolt:

Unless hard disks are placed in a RAID configuration, they are unable to saturate Thunderbolt links. LaCie introduced the 5big Thunderbolt, which can deliver up to 785 MBps of throughput. There are two Thunderbolt ports for daisy chaining support.

The pricing of the diskless version starts at $1200.

LaCie Blade Runner:

We have had Neil Poulton-designed external HDDs from LaCie before, and now, they have introduced the Blade Runner, designed by Philippe Starck. The design of the enclosure is hard to describe, so we will let the gallery below do the talking.

The 4 TB Blade Runner has a USB 3.0 interface. It is a limited edition run of 10K units and has an MSRP of $300.

Seagate also briefed us under NDA on some exciting announcements scheduled for the next two quarters, along with some demonstrations. Stay tuned for more Seagate / LaCie coverage in the near future.

IOGear Demonstrates HDMI Switching Solutions at CES 2013
by Ganesh T S yesterday

We visited IOGear's booth at CES and saw a variety of devices including HDMI switching solutions, I/O devices and other A/V gear. This post covers the HDMI switching solutions alone. Two products stood out in the demo. The first is targeted towards home consumers. IOGear touts it as the first wireless streaming matrix for home use. It has 5 HDMI inputs and 2 HDMI outputs. One of the HDMI outputs is hardwired, while the second is wireless. A wireless receiver is bundled with the unit and can be placed up to 100 ft away (across walls). The wireless link is based on WHDI (a 5 GHz technology). The input for the wireless HDMI output can be configured from the second room. The device can also be used to clone HDMI outputs across two different locations, and it supports 3D over HDMI up to 1080p24.

The device does blank out the relevant output while switching, but that shouldn't be a factor in home usage scenarios. The Wireless 5x2 HD matrix (GWHDMS52) will ship in Spring for $400.

On the other hand, IOGear also has a rackmount 4x4 switcher meant for custom installers and the professional crowd. There is zero-delay switching without output blanking for this model. It can be controlled using the front panel, IR remote or RS-232 for professional applications. The AVIOR GHMS8044 is priced online around $700 (MSRP is $820).

IOGear also had some high capacity mobile battery chargers (up to 11000 mAh) and a Realtek-based WiDi / Miracast sink on display. The GWAVR WiDi / Miracast sink will debut at an MSRP of $80.

Swiftech and Steiger Dynamics: German Engineering Comes Home
by Dustin Sklavos yesterday

I don't know about you, but for me, the word "engineering" gets a lot more enticing when it's preceded by the word "German" (or the phrase "Commander LaForge, please report to.") Nanoxia, Swiftech, and Steiger Dynamics were all sharing a suite at CES this year, and while I've seen most of what Nanoxia has to offer with the Deep Silence 1, Swiftech and Steiger Dynamics were another thing entirely.

My meeting with Swiftech was brief and focused predominantly on their new H220 closed-loop cooler. While they've offered watercooling kits of all types for a long time now, the H220 is targeted squarely at the market being served by Corsair's H100i, Thermaltake's Big Water 2.0 Extreme, and NZXT's Kraken X60. The H220 really screams quality, though, and you can tell Swiftech has been in the watercooling game for some time when you start examining the details.

Unlike most competing solutions, the H220's reservoir can be refilled, and the radiator uses brass tubing surrounded by copper fins (most closed-loop coolers sourced from Asetek or CoolIT rely on aluminum fins in the radiator). If you open the cooling loop (and Swiftech has designed the H220 for exactly that), the pump on the CPU waterblock is actually capable of handling the thermal load from an overclocked processor and two GeForce GTX 680s.

Swiftech had four comparison systems on display to show just how much better the H220 is than Thermaltake and Corsair's solutions, with three of the systems employing a 240mm cooler from each company and the fourth running the H220 in a cooling loop that included two GTX 680s. The H220 performed roughly 5C better than the competitors at comparable or lower noise levels. While I'm skeptical about the comparison systems (no two i7-3770Ks overclock exactly the same), I'm still pretty confident the H220 will be a force to be reckoned with.

Swiftech's H220 will be retailing with an MSRP of $139.

Meanwhile, Ganesh was gracious enough to coordinate a meeting between me and freshly minted system integrator Steiger Dynamics. Steiger Dynamics has one product, but it's a doozy: the LEET. While the name may not excite you, the product ought to at least pique your interest.

The LEET is essentially a custom desktop designed to be a media center, and specifically, a gaming machine. While comparable products like the DigitalStorm Bolt, the iBuyPower Revolt, and Alienware's X51 all look more like gaming consoles and were designed to see just how much power could be crammed within a specific envelope, the LEET goes in the opposite direction. Steiger Dynamics has produced something that looks like a home theater appliance, and within the enclosure is a custom liquid cooling loop (produced with Swiftech's aid, naturally) capable of supporting an Intel Sandy Bridge-E hexa-core Core i7 along with dual GeForce GTX 690s.

Where this product proves itself, though, is in how silently it runs. While pushing the CPU and graphics hardware at full bore (we're talking Prime95 plus FurMark) will produce an audible increase in noise, in gaming the LEET is essentially silent. I tried Crysis 2 and Far Cry 3, both at their maximum 1080p settings, and I didn't hear a peep from the system on display.

Steiger is still small, but they have a very attractive product. The price is going to put it out of reach for a lot of users (it starts at a not inconsiderable $1,798), but for those who can afford it, it's going to be a very impressive machine. This isn't something that can be easily built off the shelf, and it shows. Expect to hear more from Steiger Dynamics in the future.

CES 2013: Cases and Cooling in the New Year
by Dustin Sklavos yesterday

I'm pleased to report that this year's visit to CES bore promising fruit for desktop PC cases, cooling, and even desktop machines in general. While the way notebooks and tablets will shake out over the next couple of years is at least somewhat difficult to pin down, the chassis and PC cooling industries produced very clear trends.

Much as Jarred and I lauded the notebook industry for largely dispensing with glossy plastic (a practice HP has backslid on horribly with their G series and Pavilion notebooks), gaudy and ostentatious "gamer" cases are on the way out in favor of more staid and streamlined designs. Excepting Cougar's questionable Challenger case reviewed last year, most of the disconnect seems to stem from Taiwanese manufacturers and designers having a hard time pinning down western markets. That problem was largely absent from Rosewill, Thermaltake, and CoolerMaster's lines this year, though Enermax seems to be lagging behind.

When I visited with Enermax, they were showing off a few cases, but a consistent problem nagged them: two USB 2.0 ports, one USB 3.0. I asked why they were doing this, and they said dual USB 3.0 ports actually drove the cost up a couple of dollars. This is reminiscent of the attitude that's burying the non-Apple notebook industry (especially in the face of tablets): a failure to understand that while western consumers are tight with money, they're not that tight.

Thankfully most of the rest of the case industry seems to have caught up with the march of progress. Thermaltake, once one of the biggest offenders, showed off their surprisingly elegant "Urban" series of silent enclosures destined to compete with NZXT's H2. Meanwhile, NZXT's new Phantom 630 is still on the ostentatious side, but only just.

The biggest news for cases is that case design as a whole has progressed. Space behind the motherboard tray, and specifically channels for cabling, is pretty much standard now. What I was happy to see was USB 3.0 (using internal headers) proliferating down to the sub-$70 market, and fan controllers everywhere. 140mm fan mounts are also becoming increasingly common, and the majority of manufacturers are trying to produce cases that can support 240mm radiators like Corsair's H100i.

Speaking of closed loop cooling, this is pretty much the year for that technology to really take off. While NZXT and Corsair are still working off of designs that involve copper waterblocks and aluminum fins in the radiators, CoolerMaster's Eisberg line and Swiftech's new 240mm cooler both use copper fins in the radiators in addition to having beefier waterblocks and pumps. NZXT remains the only purveyor (extending from Asetek) of 140mm-derived radiators for the time being, but I don't expect that to last. Meanwhile, Zalman's liquid cooler doesn't use a conventional radiator at all, instead opting for a custom design reminiscent of their CPU heatsink designs. There were plenty of air coolers on display, too, but it's clear this is the direction things are going.

Finally, a brief word about boutiques. The last two years have suggested the boutiques and system integrators are beginning to seriously diverge, diversifying from each other primarily through offering custom chassis and notebook modifications. This year it was made plain by iBuyPower's aggressive retail push with their wholly custom Revolt and DigitalStorm's revised Bolt and Aventum II.

Every year someone proclaims the death of the desktop, and every year physics tells them what to go do with themselves. Powerful desktops and enthusiast machines are definitely getting physically smaller and more niche as a market, but desktops continue to offer the best longevity and bang for the buck of any personal computing platform. The PC gaming industry in particular has been tremendously revitalized, and while NVIDIA's GRID suggests a future of cloud gaming, it's still a ways off. In the meantime, 2013 should remain fairly bright for enthusiasts and do-it-yourselfers.

Intel Brings Core Down to 7W, Introduces a New Power Rating to Get There: Y-Series SKUs Demystified
by Anand Lal Shimpi yesterday

For all of modern Intel history, the company has specified a TDP rating for all of its silicon. The TDP rating is given at a specific max core temperature (Tj_MAX) so that OEM chassis designers know how big to make their cases and what sort of cooling is necessary. Generally speaking, anything above 50W ends up in some form of a desktop (or all-in-one), while TDPs below 50W can go into notebooks. Below ~5W you can go into a tablet (think iPad/Nexus 10), and below 2W you can go into a smartphone. These are rough guidelines, and there are obviously exceptions.
 
With Haswell, Intel promised to deliver SKUs as low as 10W. That's not quite low enough to end up in an iPad, but it's clear where Intel is headed. In a brief statement at the end of last year, Intel announced that it would bring a small amount of 10W Ivy Bridge CPUs to market in advance of the Haswell launch. At IDF we got a teaser that Intel could hit 8W with Haswell, and given that both Haswell and Ivy Bridge are built at 22nm with relatively similar architectures it's not too far of a stretch to assume that Ivy Bridge could also hit a similar power target. Then came the CES announcement: Intel will deliver 7W Ivy Bridge SKUs starting this week. Then came the fine print: the 7W SKUs are rated at a 10W or 13W TDP, but 7W using Intel's Scenario Design Power (SDP) spec. Uh oh.
 
Let's first look at the new lineup. The table below includes both the new Y-series SKUs as well as the best 17W U-series SKUs:
 
Low TDP Intel Core Processor Comparison

|  | Pentium 2129Y | Core i3-3229Y | Core i5-3339Y | Core i5-3439Y | Core i5-3317U | Core i7-3689Y | Core i7-3517UE |
| Nominal TDP | 10W | 13W | 13W | 13W | 17W | 13W | 17W |
| cTDP Down | - | 10W | 10W | 10W | 13W | 10W | 13W |
| SDP | 7W | 7W | 7W | 7W | - | 7W | - |
| Cores/Threads | 2/2 | 2/4 | 2/4 | 2/4 | 2/4 | 2/4 | 2/4 |
| Base CPU Clock | 1.1GHz | 1.4GHz | 1.5GHz | 1.5GHz | 1.7GHz | 1.5GHz | 1.7GHz |
| 1C Turbo | - | - | 2.0GHz | 2.3GHz | 2.6GHz | 2.6GHz | 2.8GHz |
| 2C Turbo | - | - | 1.8GHz | 2.1GHz | 2.4GHz | 2.4GHz | 2.6GHz |
| L3 Cache Size | 2MB | 3MB | 3MB | 3MB | 3MB | 4MB | 4MB |
| GPU | HD | HD 4000 | HD 4000 | HD 4000 | HD 4000 | HD 4000 | HD 4000 |
| Base GPU Clock | 350MHz | 350MHz | 350MHz | 350MHz | 350MHz | 350MHz | 350MHz |
| Max GPU Clock | 850MHz | 850MHz | 850MHz | 850MHz | 1.05GHz | 850MHz | 1.1GHz |
| Quick Sync | No | Yes | Yes | Yes | Yes | Yes | Yes |
| AES-NI | No | No | Yes | Yes | Yes | Yes | Yes |
| VT-d | No | No | Yes | Yes | Yes | Yes | Yes |
| VT-x | Yes | Yes | Yes | Yes | Yes | Yes | Yes |
| Socket | FCBGA-1023 | FCBGA-1023 | FCBGA-1023 | FCBGA-1023 | FCBGA-1023 | FCBGA-1023 | FCBGA-1023 |
| Price | $150 | $250 | $250 | $250 | $225 | $362 | $330 |
 
Compared to a similarly configured U-series part, moving to a Y-series/7W part usually costs you 200MHz in base clock, ~250MHz in max GPU clock, and 200 - 300MHz in max turbo frequency. Cache sizes, feature support and Hyper-Threading don't change when going between U and Y. The lower clocks are likely the result of lower operating voltages and a side effect of the very low leakage binning. The cost of all of this? Around an extra $30 over a similar U-SKU. That doesn't sound like much, but when you keep in mind that most competing ARM-based SoCs sell for around $30 in their entirety, it is a costly adder from an OEM's perspective.
 
Now the debate.
 
Intel should have undoubtedly been very specific about 7W being an SDP distinction, especially when the launch slide compared it to TDPs of other Intel parts. Of course Intel failed to do this, which brought on a lot of criticism. To understand how much of the criticism was warranted we need to first understand how Intel comes up with a processor's TDP and SDP ratings.
 
Intel determines a processor's TDP by running a few dozen workloads on the product and measuring thermal dissipation/power consumption. These workloads include individual applications, multitasking workloads (CPU + GPU for example) and synthetic measures that are more closely related to power viruses (e.g. specifically try to switch as many transistors in parallel as possible). The processor's thermal behavior in all of these workloads ends up determining its TDP at a given clock speed.
 
Scenario Design Power (SDP), on the other hand, is specific to Intel's Y-series SKUs. Here Intel takes a portion of a benchmark that stresses both the CPU and GPU (Intel wouldn't specify which one; my guess would be something 3DMark Vantage-like) and measures average power over a thermally significant period of time (like TDP, you're allowed to exceed SDP so long as the average stays within spec). Intel then compares its SDP rating to other, typical touch-based workloads (think web browsing, email, gaming, video playback, multitasking, etc.) and makes sure that average power in those workloads is still below SDP. That's how a processor's SDP rating is born.
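Put differently, TDP polices the worst sustained draw while SDP is a windowed average over a friendlier workload. A minimal sketch of the distinction, using a synthetic power trace with made-up numbers:

```python
# Synthetic 1Hz power trace (watts): brief spikes to the 13W TDP ceiling,
# while the windowed average hovers around the 7W SDP rating.
# All numbers are invented for illustration.
trace = [4, 5, 13, 12, 5, 4, 6, 13, 5, 4, 5, 6] * 5

def worst_window_avg(samples, window):
    return max(sum(samples[i:i + window]) / window
               for i in range(len(samples) - window + 1))

WINDOW = 10  # stand-in for a "thermally significant" averaging period
print("peak draw (TDP-style ceiling):", max(trace), "W")       # 13 W
print("worst windowed average (SDP-style):",
      round(worst_window_avg(trace, WINDOW), 1), "W")          # ~7 W
```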
 
If you run a power virus or any of the more stressful TDP workloads on a Y-series part, it will dissipate 10W/13W. However, a well designed tablet will thermally manage the CPU down to a 7W average; otherwise you'd likely end up with a device that's too hot to hold.
 
Intel's SDP ratings will only apply to Y-series parts; the rest of the product stack remains SDP-less. Although it debuted with Ivy Bridge, the same SDP ratings will apply to Haswell Y-series SKUs as well. And although Y-series parts will be used in tablets, there are going to be some ultra-thin Ultrabooks that use them too. In a full blown notebook there's a much greater chance of a 7W SDP Ivy Bridge hitting 10W/13W, but once again the burden falls upon the OEM to properly manage thermals to deliver a good experience.
 
The best comparison I can make is to the data we saw in our last power comparison article. Samsung's Exynos 5 Dual (5250) generally saw power consumption below 4W, but during an unusually heavy workload we saw it jump up to nearly 8W. While Samsung (and the rest of the ARM partners) don't publicly specify a TDP, going by Intel's definition 4W would be the SoC's SDP while 8W would be its TDP if our benchmarks were the only ones used to determine those values.
 
Ultimately that's what matters most: how far away Intel is from being able to fit Core into an iPad or Nexus 10 style device. Assuming Intel will be able to get there with Ivy Bridge is a bit premature, and I'd probably say the same thing about Haswell. The move to 14nm should be good for up to a 30% reduction in power consumption, which could be what it takes. That's a fairly long time from now (Broadwell is looking like 2H 2014), and time during which ARM will continue to strengthen its position.
 
Acer's W700 refresh, with 7W SDP Ivy Bridge in tow
 
As for whether or not 7W SDP parts will actually be any cooler running than conventional 10W/13W SKUs, they should be. They will run at lower voltages and are binned to be the lowest leakage parts at their target clock speeds. Acer has already announced a successor to its W700 tablet based on 7W SDP Ivy Bridge with a 20% thinner and 20% lighter chassis. The cooler running CPU likely has a lot to do with that. 
 
Then there's the question of whether or not a 7W SDP (or a future 5W SDP Haswell/Broadwell) Core processor would still outperform ARM's Cortex A15. If Intel can keep clocks up, I don't see why not. Intel promised 5x the performance of Tegra 3 with a 7W SDP Ivy Bridge CPU. Cortex A15 should be good for around 50% better performance than Cortex A9 at similar frequencies, so there's still a decent gap to make up.
 
At the end of the day, 7W SDP Ivy Bridge (and future parts) are good for the industry. Intel should have simply done a better (more transparent) job of introducing them.

Samsung's Exynos 5 Octa: Powered by PowerVR SGX 544MP3, not ARM's Mali

At CES, Samsung announced its Exynos 5 Octa SoC featuring four ARM Cortex A7s and four ARM Cortex A15s. Unusually absent from the announcement was any mention of the Exynos 5 Octa's GPU configuration. Given that the Exynos 5 Dual featured an ARM Mali-T604 GPU, we naturally assumed that the 4+4-core version would do the same. Based on multiple sources, we're now fairly confident in reporting that with the Exynos 5 Octa, Samsung included a PowerVR SGX 544MP3 GPU running at up to 533MHz.

The PowerVR SGX 544 is a lot like the 543 used in Apple's A5/A5X, but with the addition of DirectX 10-class texturing hardware and 2x faster triangle setup. There are no changes to the unified shader ALU count. Taking into account the very aggressive max GPU frequency, peak graphics performance of the Exynos 5 Octa should fall between Apple's A5X and A6X (assuming Samsung's memory interface is just as efficient as Apple's):

Mobile SoC GPU Comparison

|  | PowerVR SGX 543MP2 | PowerVR SGX 543MP4 | PowerVR SGX 544MP3 | PowerVR SGX 554MP4 |
| Used In | A5 | A5X | Exynos 5 Octa | A6X |
| SIMD Name | USSE2 | USSE2 | USSE2 | USSE2 |
| # of SIMDs | 8 | 16 | 12 | 32 |
| MADs per SIMD | 4 | 4 | 4 | 4 |
| Total MADs | 32 | 64 | 48 | 128 |
| GFLOPS @ Shipping Frequency | 16.0 | 32.0 | 51.1 | 71.6 |

It's good to see continued focus on GPU performance by the major SoC vendors, although I'd like to see a device ship with something faster than Apple's highest end iPad. At the show we heard that we might see this happen in the form of an announcement in 2013, with a shipping device in 2014.

ASUS $149 7-inch MeMO Pad Headed to the US, Powered by VIA SoC & Android 4.1
by Anand Lal Shimpi yesterday

Just after we got back from CES, ASUS wrote to tell us that a cost reduced version of the 7-inch MeMO Pad is coming to the US. Starting at $149 for an 8GB model, the MeMO Pad features a VIA WM8950 SoC. We haven't seen the VIA name in a while, but inside the SoC is a single ARM Cortex A9 CPU running at up to 1GHz and an ARM Mali-400 GPU of unknown core configuration. The total DRAM size hasn't gone down compared to the Nexus 7, but display resolution has (1024 x 600 vs. 1280 x 800). ASUS is promising up to a 140-degree max viewing angle and 350 nits max brightness for the MeMO Pad's 7-inch display.

Unlike the Nexus 7, the MeMO Pad does come with a microSD card slot to expand storage beyond its default 8/16GB configuration. The chassis looks very similar to the Nexus 7, but it is slightly thicker and does weigh a little more as well. Battery capacity hasn't been touched though. There's a 1MP front facing camera with f/2.0 lens. The MeMO Pad will ship with Android 4.1. ASUS was quick to point out that the device will ship with Google's Play Store and the US version will support Hulu Plus, Netflix and HBO Go.

Tablet Specification Comparison

|  | Apple iPad mini | ASUS MeMO Pad (ME172V) | Google Nexus 7 |
| Dimensions | 200 x 134.7 x 7.2mm | 196.2 x 119.2 x 11.2mm | 198.5 x 120 x 10.45mm |
| Display | 7.85-inch 1024 x 768 IPS | 7-inch 1024 x 600 | 7-inch 1280 x 800 IPS |
| Weight | 308g (WiFi) | 370g | 340g (WiFi) |
| Processor | 1GHz Apple A5 (2 x Cortex A9, PowerVR SGX543MP2) | VIA WM8950 (1GHz Cortex A9 + Mali-400) | 1.3GHz NVIDIA Tegra 3 (T30L - 4 x Cortex A9) |
| Connectivity | WiFi, Optional 4G LTE | WiFi | WiFi, Optional 3G |
| Memory | 512MB | 1GB | 1GB |
| Storage | 16GB - 64GB | 8GB, 16GB + microSD slot | 16GB, 32GB |
| Battery | 16.3Wh | 16Wh | 16Wh |
| Starting Price | $329 | $149 | $199 |

The cost reduction in the bill of materials translates to $50 savings in retail; the MeMO Pad will start at $149 compared to $199 for the Nexus 7. While I bet the latter will still be the 7-inch Android tablet of choice for discerning consumers, the MeMO Pad will be ASUS' attempt to gain a foothold in the ultra competitive, low cost tablet market. Availability is expected in the US starting in April.

Ambarella Announces A9 Camera SoC - Successor to the A7 in GoPro Hero 3 Black
by Brian Klug yesterday

I've been playing around with and trying to review the GoPro Hero 3 Black since the holidays, a small sports-oriented portable camera which can record up to 4K15 video or 1080p60 video with impressive quality. Inside the GoPro Hero 3 Black is an Ambarella A7 camera system on chip, while the Hero 3 White and Silver contain the previous generation Ambarella A5S. Both the A5S and A7 are built on Samsung's 45nm CMOS process.

During CES, Ambarella announced the successor to the A7: the A9 (neither of which should be confused with ARM's Cortex A7 or A9 CPUs). The new camera SoC moves to Samsung's 32nm HK-MG process and brings both lower power consumption for the same workloads and the ability to record 4K30 video as well as 1080p120 or 720p240, double the framerate of the previous generation thanks to higher performance. The A9 is the direct successor to the A7 and enables 4K video capture with enough framerate (30FPS) for playback without judder; the previous generation's 4K15 capture pretty much limited it to recording high resolution timelapses or other scenes meant to be played back at increased speed. 
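Those capture modes are less arbitrary than they look: the headline modes all fall out of roughly the same encoder pixel-rate budget. Quick illustrative arithmetic, assuming standard UHD/1080p/720p frame geometries rather than Ambarella's published spec:

```python
# Rough encoder throughput for the A9's headline modes (illustrative
# arithmetic only; assumes 3840x2160 "4K" and standard 1080p/720p frames).
modes = {
    "4K30":     (3840, 2160, 30),
    "1080p120": (1920, 1080, 120),
    "720p240":  (1280, 720, 240),
}
for name, (w, h, fps) in modes.items():
    print(f"{name:9s} {w * h * fps / 1e6:6.1f} Mpixel/s")
# 4K30 and 1080p120 both land at ~248.8 Mpixel/s; 720p240 at ~221.2 Mpixel/s.
```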

The A9 SoC also includes two ARM Cortex A9s onboard at up to 1.0 GHz, up from the previous generation's single ARM11 at 528 MHz. Ambarella claims under 1 watt of power consumption for encoding 1080p60 video on the A9 and under 2 watts for 4K30 capture. The A9 also has built-in support for HDR video capture by combining two frames - thus 1080p120 becomes 1080p60 with HDR enabled. I got a chance to look at both HDR video capture on the A9 and 4K30 video encoded on a development board and decoded using the same board on a Sony 4K TV. I came away very impressed with the resulting video quality; clearly the age of 4K/UHD is upon us, with inexpensive encoders like these set to make their way into small form factor cameras. 

I wouldn't be surprised to see a GoPro Hero 3 Black successor with the A9 SoC inside and those higher framerate capture modes fully enabled at some point in the very near future since the A9 SoC is now available to customers.

Source: Ambarella

Brian's Concluding Thoughts on CES 2013 - The Pre-MWC Show
by Brian Klug yesterday

Anand asked each of us to write up some final thoughts on CES 2013, something which is honestly a daunting task at best, and a potentially controversial or rage-inducing one at worst. Having now attended three CESes, I still have a relatively small frame of reference within which to gauge this one, but some things, like CES, need almost no context.


Press Conferences at CES are often largely fruitless, but something to behold

First, CES is and hopefully always will be a spectacle. I actually disagree with many who say that Las Vegas shouldn't be the venue for CES. That's because there's something appropriate about Las Vegas being the home to CES, since it's a city and environment I approach with as much skepticism and trepidation as the products and announcements made during the show. Almost everything you see isn't what it seems, and I've recounted a few analogies in person that I think bear repeating here.

Just like the showiest buildings and people usually have the least to offer (and thus rely entirely on show and presentation to draw you in), so too do exhibitors and companies and everyone giving you their exactingly rehearsed pitch. That is to say good products and announcements draw their own crowds and don't need overselling with dramatic entrances and expensive demos. Some of the most engaging and busy companies I met with had almost no presence on the show floor, and instead had only a tiny meeting room with a single table and a few chairs. Similarly just like hotels on the strip appear close and within walking distance (signs can be read almost two miles away), so too should one approach the release dates for things announced at CES — they're almost always further away than they appear. Finally Vegas itself is a carefully engineered, computationally optimized environment designed to extract maximum dollar from anyone it entwines, and so too the CEA crafts (and engineers) a show that will be crowded regardless of the number of attendees, and record breaking in scale regardless of whether there's actually any real growth going on.

I guess what I'm saying is that there is a certain kind of skepticism one has to approach everything CES related with, and in that sense the only appropriate context for gauging CES is itself, as a subset of Las Vegas. I don't think you could ever have a CES detached from that environment, and especially not without the pervasive, eye-stinging smell of the casino floors, which is a world-unique combination of cigar smoke, cigarette smoke, spilled drinks, sweat, hooker spit, and the despair of a thousand souls and crushed dreams. No, a CES detached from Vegas wouldn't be a CES at all.

So what are my concluding thoughts about this CES? I've heard lots of grumbling from a few of my peers that CES is increasingly irrelevant as a place to look for mobile news, but I think that's false. They're talking about handset announcements and superficial things one can get a hands-on with and take pictures of, rather than actual mobile news. From a superficial mobile handset perspective, sure, CES is increasingly not the place to go; it's too close to MWC, and too many OEMs now want to throw their own events to announce specific handsets, or just do it at the mobile show rather than get lost in the noise that is CES. Unless you're a case manufacturer, there really isn't any reason to launch your major handset at CES unless it's going to ship very early in the new year's product cycle. There were only a few major mobile handset announcements — Intel, ZTE, Huawei, and Sony announced new phones. Samsung dropped a token handset announcement or two in at the end, but nothing flagship at all.

On the other hand, there was actually plenty of mobile related news that wasn't handset related. You just have to know what to look for. Heck, the keynote was by Qualcomm CEO Paul Jacobs instead of Microsoft's finest. What clearer indication does one need that we're entering a mobile era where desktop takes a back seat?


Snapdragon MSM8974, err, we mean S4 Extreme, err we mean Snapdragon 800, err, we give up...

There was plenty of mobile news as far as I'm concerned. Qualcomm finally announced MSM8974, essentially the MSM8960 successor on Qualcomm's roadmap, alongside APQ8064T, a revamped version of APQ8064 with Krait 300 CPUs inside. They also rolled out new Snapdragon branding which completely casts away Qualcomm's previous S-series and has me yet again fielding questions from everyone about what part now fits under what arbitrarily drawn umbrella in the 200 / 400 / 600 / 800 series (spoiler: even I don't know how the existing lineup maps to the new numbering, just that the part I will keep calling MSM8974 is an 800 series part and that APQ8064T is a 600 series part).

NVIDIA formally announced its Tegra 4 SoC the night ahead of the CES rush, and alongside it Project Shield, its mobile gaming platform slash PC gaming accessory. In addition NVIDIA announced its Icera i500 LTE platform and made some interesting promises about future upgradeability. Similarly, Samsung announced Exynos 5 Octa (an absolutely, unforgivably atrocious name for an SoC that will expose at most 4 cores at once to the OS, and with 5410, not 5810, as its part number). Samsung also showed some curved displays and example designs I got a chance to look at, before telling me they regretted ever letting me into their showroom and meeting area.

Broadcom also teased, but somehow didn't actually formally announce, the LTE baseband that has been rumored and rumbled about for months. We saw the first PowerVR Series6 GPU, a two-cluster implementation, running on the LG silicon that's inside some of its smart TVs. I met with Allwinner, who talked about their A31 and A20 SoCs that are getting a lot of attention, in perhaps the event's smallest meeting room. Audience announced their eS515 noise cancelation and sound processor, the first to implement a true three-mic algorithm (others with three mics switch between pairs). Amazingly, T-Mobile also lit up AMR-WB (Adaptive Multi-Rate Wideband) for higher fidelity voice calls across its whole network right during the show. In short, there was a ton of mobile news; it just wasn't mobile handsets or something necessarily tangible.


I found some headphones, iPhone cases, and notebook sleeves that Anand will love... (He has said numerous times he's tired of seeing basic games like Cut The Rope be the flagship titles.)

I guess my takeaway is that CES' niche in the mobile space isn't to be the place for handsets you can put your hands on, but rather the place to lay the groundwork for that to happen at MWC. Essentially, companies announce or tease hardware here that will show up in specific devices at the next show; I get that the devices themselves are what most people are after, but if you're looking for CES to be that venue, you're doing things very, very wrong. Anyhow, that's really what CES seems like to me: lots of announcements about things we will shortly see in actual devices at the next show.

Wireless charging was also a huge theme for my CES — somehow I met with every wireless charging standards body and a few charge transmission and receive chip suppliers in the same day. The differences between resonant and inductive (tightly coupled) wireless charging systems are now burned into my head. The operators and device makers are testing the waters with just a few devices to see how consumers react, and the future of wireless charging hinges on their adoption.

But the real winners of CES in my mind are two things — the Samsung Galaxy Camera, and Oculus Rift.

Samsung's Galaxy Camera seemed to be everywhere. Even though the device had nothing to do with CES, I couldn't help but notice that everyone seemed to have one — journalists, analysts, attendees, exhibitors, you name it. There were people who had clearly been sampled a review unit, and others who simply went out and bought the device to try it out. I found myself pulling out my DSLR less and less and using the Galaxy Camera more and more for video, simple photos, and whenever I had it handy. Obviously the Galaxy Camera can't match even the most entry-level DSLR, but there's something immediately more powerful about a camera that uploads photos and videos to Dropbox so you'll have them ready, or that can immediately tweet and post something interesting. The connected camera is going to become even more of a critical topic in the coming year, and I've never had a device elicit so many questions and curious glances as the Galaxy Camera in my time reviewing devices. Say what you want about the imaging experience or battery life, but Samsung did an excellent job nailing most of the core fundamentals and the UX right out of the gate with the Galaxy Camera. That's something the Nikon S800c and Polaroid are still a ways away from.


The author using Oculus

Oculus Rift I'm going to write a lot about in another CES related pipeline post, but their demos and pre-development platform hardware are already so much better than any shipping VR kit out there that it's embarrassing. The tracking speed and fluidity are scarily good, and the experience is already so immersive that everyone describes it as "going in" and "coming out" rather than "putting the goggles on" or "taking them off." That isn't to say that things are perfect, but the development kit that's coming will be nothing short of impressive.

Finally, I have to hand it to the wireless operators, who managed to keep their LTE networks up and working during the entire show. I fully expected at least one of the major operators' networks to fall over entirely, and this is perhaps the first year that I haven't had massive issues doing simple things like sending SMSes or placing calls anywhere near the LVCC. The numerous DASes (Distributed Antenna Systems) in the LVCC and major hotels did their job and worked well. Things weren't flawless, or necessarily the fastest I've seen them, but connectivity never died entirely.

CES 2013 drew to a close almost as quickly as it started from my perspective, and I can't help but admit that I already miss the insane pace of it all.

CES 2013 LTE Throughput Face-Off - AT&T versus Verizon Wireless
by Brian Klug 2 days ago

Last year I ran a series of speedtests on both AT&T and Verizon Wireless LTE during CES, and the results were pretty interesting. My friends continue to poke fun at how obsessive I am about running speed tests, and this year I decided to repeat the experiment during CES 2013 by carrying around an iPhone 5 on AT&T LTE and an HTC Droid DNA on Verizon Wireless LTE, periodically running speedtests on the two handsets side by side at the same time and place.

The places I tested are pretty much indicative of your average tech journalist's itinerary in Las Vegas for CES — I ran a few in my hotel (LVH), inside and throughout the LVCC North and South halls' show floors and concourses, the Venetian, Mandalay Bay, the Palms, and a smattering of the Las Vegas Strip itself. Most of the major hotels have DASes (Distributed Antenna Systems, sometimes called Digital Antenna Systems) inside them. I know for a fact that at least the Palms and Venetian have DASes, as does the Strip itself. Inside the LVCC there are multiple DASes, with at least three different systems that I managed to spy.

Spotting these indoor systems is almost always a challenge, as they're intentionally camouflaged both to dissuade vandalism and theft and for aesthetic reasons. Even if you know what you're looking for, they're at best difficult to spot, and often out of sight or completely disguised. That said, I snapped photos of the ones I was pretty confident were in fact DAS antennas and put them in a gallery.


An example set of DAS antennas in LVCC North Hall

So how did AT&T and Verizon fare at CES 2013? First, I should note that, best I can tell, both Verizon and AT&T were only using their 700 MHz spectrum assets; that means Band 13 for VZW and Band 17 for AT&T, each with 10 MHz of bandwidth. I didn't see Band 4 on the iPhone 5, and while the DNA (like all new HTC devices) lacks Field Test or similar engineering menus that would let me check, it lacks Band 4 anyway.
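For context, a 10 MHz FDD LTE carrier with 2x2 MIMO tops out around 73 Mbps downstream on paper. A rough back-of-envelope of where that ceiling comes from (simplified arithmetic of my own that ignores control channel details and assumes 64QAM throughout):

```python
# Rough theoretical peak for a 10 MHz LTE downlink (2x2 MIMO, 64QAM).
# Simplified: ignores control region sizing, CP variants, and coding detail.
resource_blocks    = 50      # a 10 MHz LTE carrier carries 50 resource blocks
subcarriers_per_rb = 12
symbols_per_sec    = 14_000  # 14 OFDM symbols per 1 ms subframe
bits_per_symbol    = 6       # 64QAM
mimo_layers        = 2

raw = (resource_blocks * subcarriers_per_rb * symbols_per_sec
       * bits_per_symbol * mimo_layers) / 1e6
usable = raw * 0.75          # ~25% lost to reference signals and control
print(f"raw: {raw:.0f} Mbps, usable peak: ~{usable:.0f} Mbps")

# ~75 Mbps on an empty cell -- so the 44.9 Mbps max I measured below is a
# healthy fraction of the ceiling for a carrier shared among thousands of
# attendees.
```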

I ran 100 speedtests on both devices at the same places during my day (and night) at and around the show, all against the Switch Communications server in Las Vegas for consistency of host. I then used the same script as last year to create histograms of the results along with summary stats.
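For the curious, something like the following (a minimal sketch of what such a script might look like, with placeholder data standing in for the real measurements) is all it takes to produce summaries like the ones below:

```python
import numpy as np
import matplotlib.pyplot as plt

def summarize(results, label, unit="Mbps"):
    """Print summary stats in the format below and save a histogram."""
    arr = np.asarray(results, dtype=float)
    print(f"{label} ({unit}) -- Avg: {arr.mean():.3f}; Max: {arr.max():.3f}; "
          f"Min: {arr.min():.3f}; StDev: {arr.std(ddof=1):.3f}")
    plt.hist(arr, bins=20)
    plt.xlabel(unit)
    plt.ylabel("Number of tests")
    plt.title(label)
    plt.savefig(label.replace(" ", "_") + ".png")
    plt.clf()

# Placeholder data -- the real inputs were 100 speedtest results per metric,
# all run against the Switch Communications server in Las Vegas.
att_downstream = np.random.lognormal(mean=2.5, sigma=0.8, size=100)
summarize(att_downstream, "ATT LTE downstream")
```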

 

AT&T LTE Results (100 tests)
Downstream (Mbps): Avg 16.677; Max 44.928; Min 0.632; StDev 11.323
Upstream (Mbps): Avg 4.370; Max 16.015; Min 0.081; StDev 3.696
Latency (ms): Avg 174.95; Max 256; Min 144; StDev 24.2187

Verizon Wireless LTE Results (100 tests)
Downstream (Mbps): Avg 10.585; Max 39.65; Min 0.702; StDev 7.854
Upstream (Mbps): Avg 4.680; Max 11.402; Min 0.041; StDev 3.165
Latency (ms): Avg 130.13; Max 192; Min 99; StDev 17.153

When it comes to downstream throughput, AT&T edged out Verizon Wireless for CES 2013, bringing in an average of 16.7 Mbps compared to Verizon's 10.6 Mbps. On upstream, AT&T hit 4.4 Mbps while Verizon hit a slightly higher 4.7 Mbps. Latency is a bit more interesting, with Verizon having a lower average at 130.1 ms and AT&T coming in at 175 ms, but I suspect this is more a story about the different routing each takes through the ePC (Evolved Packet Core) and onto the internet and then to the test server. Both operators did pretty well during the show, all things considered. Upstream took a huge beating and varied wildly depending on how close you were to a base station and how many people were around you, but tweeting, checking emails, and downloading the latest schedule at least worked.

I will say that although AT&T had better downstream throughput at the end of the day, on the whole Verizon's coverage profile continues to be much better. In the Mandalay Bay I lost AT&T connectivity quite a bit and suspect the building either doesn't have a DAS or AT&T isn't hooked into it, whereas Verizon was solid. Inside the LVCC both operators understandably had great coverage. 

I have to hand it to both major operators for keeping their LTE networks up on the show floor. I remember having limited to no connectivity in Las Vegas before Verizon or AT&T deployed LTE, much less inside the LVCC during CES, and that now seems largely to be a thing of the past.

Wilocity Bringing WiGig To Your Desk, Lap, Home and Office
by Jason Inofuentes 2 days ago

We had a chance to meet with Wilocity to take a look at their progress in bringing WiGig to market. Let's start with a primer. WiGig (802.11ad) is an air interface that operates in the 60 GHz range, providing massive bandwidth and some keen tech to eliminate the crowding issues seen in 2.4 GHz protocols. At such a high frequency, though, propagation is rather limited. So, though some provisions are made for bouncing signals around a corner, this is meant primarily as a line-of-sight interface. So what can you accomplish with WiGig? Let's look at the demos.
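That propagation penalty is easy to quantify: free-space path loss grows with the square of frequency, so all else being equal a 60 GHz link gives up roughly 28 dB to a 2.4 GHz one. A quick sketch of the Friis arithmetic (the distance and frequencies here are just illustrative):

```python
import math

def fspl_db(distance_m, freq_hz):
    """Free-space path loss (Friis): 20*log10(d) + 20*log10(f) - 147.55 dB."""
    return 20 * math.log10(distance_m) + 20 * math.log10(freq_hz) - 147.55

for ghz in (2.4, 5.0, 60.0):
    print(f"{ghz:>4} GHz at 5 m: {fspl_db(5, ghz * 1e9):5.1f} dB")

# 60 GHz gives up 20*log10(60/2.4) = ~28 dB to 2.4 GHz at any fixed range,
# before walls and bodies get involved -- hence the line-of-sight focus and
# the beam-steering antenna arrays used to claw some of that margin back.
```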
 
Wilocity has fleshed out two routes for using WiGig: a networking protocol and a bus replacement. With a WiGig module installed in your device (tablet/PC at present) you can utilize multiple peripherals connected to a WiGig enabled dock. Your computer detects these peripherals as if they were connected through a PCI Express bus, and with so much bandwidth available, no penalty is paid versus having the peripheral on a wired PCI Express lane. The low latency lends itself to video and gaming applications. This is truly a wire replacement solution, and though it's limited to an in-room experience, it's still better than dragging cables around.
 
 
The networking solution Wilocity has developed is impressive, though in its current state it's better suited to the office than the home. Businesses that handle large files are often at the mercy of wired connections to move files rapidly between users. Though WiFi speeds are improving, and 802.11ac is finally starting to trickle out, maximum throughput is still only achieved on GigE connections. Configuring a WiGig network over a bank of cubicles, though, would allow each client to receive greater than GigE speeds with just a few access points, and without interfering with other wireless products and devices. Wilocity's solution falls back to 802.11ac when operating beyond line of sight, and does so seamlessly.
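Conceptually, that seamless fallback amounts to steering traffic over whichever radio currently has a usable link, preferring 60 GHz whenever line of sight holds. A toy sketch of the selection logic (my own illustration with made-up thresholds and nominal rates; Wilocity hasn't published its actual handoff algorithm):

```python
from dataclasses import dataclass

@dataclass
class Link:
    name: str
    snr_db: float         # current measured signal-to-noise ratio
    peak_mbps: float      # nominal peak rate for the radio

def pick_link(wigig: Link, wifi_ac: Link, min_wigig_snr: float = 15.0) -> Link:
    """Prefer the 60 GHz link while its SNR suggests usable line of sight;
    otherwise fall back to 802.11ac. Threshold and rates are made up."""
    return wigig if wigig.snr_db >= min_wigig_snr else wifi_ac

wigig = Link("WiGig 60 GHz", snr_db=22.0, peak_mbps=4600.0)
ac = Link("802.11ac 5 GHz", snr_db=30.0, peak_mbps=866.0)

print("Using:", pick_link(wigig, ac).name)  # WiGig while line of sight holds
wigig.snr_db = 4.0                          # someone steps through the beam
print("Using:", pick_link(wigig, ac).name)  # quietly drops to 802.11ac
```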

Display mirroring using a WiGig enabled dock and tablet.

That need for line of sight limits WiGig's appeal in closed environments such as the home, but as a value add (an 802.11ac access point that also provides WiGig throughput within the same room as the router), there could be applications for media centers and home offices.
 

In addition to the demos, Wilocity showed off its recently updated module (built in partnership with Qualcomm Atheros), now upgraded from 802.11n to 802.11ac and featuring a new, smaller RF module with 32 individual antenna elements arranged in an almost omnidirectional fashion to improve performance. Marvell also has a new access point module with 4x4 802.11n. Products featuring these modules should be released this year, and the roadmap for the next few years continues this trend of improving performance while shrinking the module further. Ultimately Wilocity would like to see a component small enough to be embedded in a handset; they expect the first WiGig equipped handsets to premiere in 2014, with mass production in 2015. For now, we'll be on the lookout for any WiGig devices that are ready to be run through their paces.
