
June 25, 2013

Logic Supply's logical approach to engineering their own systems

When it comes to rugged computing gear, most people interested in this industry know the big players that dominate the market and get all the media coverage. But that's not everything there is. Unbeknownst to many outside of the circle of customers and prospects, a surprising number of smaller companies are designing and manufacturing rugged computing systems of one type or another. At times we come across them by chance. Other times they find us.

And so it was with Logic Supply, located in South Burlington, a small town in the northwestern part of Vermont. They call themselves "a leading provider of specialized rugged systems for industrial applications," and asked if we could include them in our resource page for rugged system vendors. We could. And that led to some back and forth where I learned that while Logic Supply distributes a variety of rugged/embedded systems and components, they have also begun developing their own high-end chassis under their own LGX brand. That was interesting, and so we arranged an interview with Rodney Hill, Logic Supply's lead engineer, about the company and how it goes about creating its own, home-developed solutions in addition to being a distributor of products engineered elsewhere.

RuggedPCReview: If you had to describe Logic Supply’s approach to case and system engineering in a minute or less, what would you say?

Rodney Hill (Logic Supply): So, LGX is Logic Supply’s engineering arm. The design approach for LGX systems and cases can be boiled down to three ideas. First is our “designed to be redesigned” philosophy. Seed designs that are scalable and modular. From a seed idea we can create a product line through swappable front- and back-plates or resized geometry. Second is mass customization — by using standardized screws, paints, sheet metal folds, and design concepts, we leverage mass produced hardware whenever possible to keep the cost low. And through our modular designs we customize the off-the-shelf by using “upgrade kits,” which are quick to source and are cost-effective. Finally, innovation, not invention! There is a difference. Add value to things that work well, but do not re-invent the wheel.

Was that under a minute?

RuggedPCReview: Almost. But now let’s expand. You said scalable, modular, and “designed to be redesigned.” What do you mean by this?

Rodney Hill (Logic Supply): Designing a new chassis is four to five times the price of redesigning a seed design. Much of the time wasted in projects is spent selecting paints, screws, boxing, foam, metal fold designs, etc. By using standardized design methods and seed concepts, our team can immediately start adding value. Ultimately the customer is only paying for the design, and not for the time the engineers spent trying to get their act together. We will be faster and more focused on quality and on containing costs and risk.

So, to your question: designed-to-be-redesigned systems from LGX already incorporate flexible features that accommodate 80% of the customizations customers request with off-the-shelf hardware. The last 20% are resolved with "upgrade" kits that are included with the off-the-shelf chassis kit. But you're also using the proven benefits of the rest of the chassis (EMI and RFI shielding, for instance), and only adding risk in small portions. Meaning the rest of the chassis still meets all the same design criteria it was originally intended to support. So you can easily customize it without the risk of any negative effects on any of those features.

In terms of scalability versus modularity — there are design themes seen in our cases. If you look carefully enough, you can begin to see connections between designs. The NC200 and the MK150 are two totally different designs – however they share about 80% of the same DNA, from vent holes to metal folds, etc.

RuggedPCReview: How does cooling play into ruggedization?

Rodney Hill (Logic Supply): Nature always wins. Meaning dust and water will destroy everything if given the chance. You need to decide how long the computer needs to live, and how much you’re willing to pay for it. Heat will shorten the life of components.

So in terms of chassis design concepts: Keep the chassis as cool as possible and as quiet as possible. Intelligent design is required to incorporate standardized cooling methods and proven airflow paths to cool many types of devices. Fan diameter, placement, and vent design all affect the acoustics as well. Logic Supply will engineer noise out of systems with fan-muffling technologies, maximizing air throughput with smaller, simpler fans by identifying inefficiencies in orifice design. In short, putting a fan right against a grille will kill 50-60% of the airflow and multiply the noise by two or three times.

Vent holes equal dust. Dust causes fans to break, which in turn results in hot computers. Eliminate vents and go fanless, and the operational temperature range and ruggedness greatly increase. Logic Supply defines "fanless" differently than the IMP mass market. Our definition is not simply "no fans." It is more than that: no fans, no vents for dust and dirt, and the ability to cool the computer system at 100% duty load for hours and days at a time. We want these systems to be heavy duty, and also to be able to last a long time. They're rated for high performance!

RuggedPCReview: Can you talk about the design process? How long does it take from start to finish?

Rodney Hill (Logic Supply): It happens pretty fast. This year we’ve done a fanless Mini-ITX case, a 1U rackmount case, a NUC [Next Unit of Computing] case, and we’re finalizing a fanless NUC case right now. We’ve also finished a number of customer-specific designs. These design concepts typically originate in sales — you know, this customer wants to do X and none of our existing solutions do it. But because we use seed designs, we don’t start from scratch. It really all depends, but usually designs take under three weeks, and prototypes are ready a few weeks after that. We review, test, and modify, then we’re typically getting production units in-house around five or six weeks after that.

These core platforms can then be sold off-the-shelf, or customers can either go the semi-custom route or more radically modify the design. For simple modifications (like back-plates, front-plates, and other simple changes), figure one to five days in design and a three to four day lead on parts. For a customized chassis design with samples, five to six weeks, and four to six weeks after that to mass production.

RuggedPCReview: Alright, finally, can you give us an example of a successful customer product development?

Rodney Hill (Logic Supply): Sure. Last year we worked with a company called StreamOn to make a custom appliance with off-the-shelf components. StreamOn offers streaming audio solutions for the radio broadcast industry. The hardware they were using at the time was going End-of-Life, and they also needed a more specialized embedded system because their business was growing and they wanted to offer more features to their customers. They needed a variety of other things — outsourced fulfillment and things like that — but from an engineering perspective it was mostly that — the EOL and specialization. And all while remaining affordable for their customers. We worked from an existing system design — the ML250 — and customized it toward what they needed. We added an SSD, LCD screen and multifunction buttons, and on-case branding.

Ultimately, the system we created was something like 30% smaller, and it was fanless, so it was more efficient and had a longer life expectancy. It also had a built-in watchdog timer and auto-restart BIOS so it could avoid complications related to sudden power outages and the like. And it actually ended up being even less expensive for their customers than what they were previously offering. So that all worked out quite well. In fact, they recently won the [Radio World] "Cool Stuff Award," which was pretty, well, cool!

This whole process was consistent with our typical design timeline, by the way. From the initial conversations to mass production — with samples and prototyping — we took about three months.

Posted by conradb212 at 03:16 PM | Comments (0)

June 24, 2013

Why the JTG Daugherty NASCAR racing team chose rugged Dells

The Christmas tree began its countdown. Yellow, yellow, yellow, GREEN! For an anxious moment, the racing slicks of my supercharged Acura fought for traction, then bit. 8,000, 8,500, 8,800 rpm, shift. Shift. Shift. Shift, and the 1/4-mile at Sacramento Raceway was over. The car slowed and I reached over to stop data logging on the laptop sitting securely on its mount, having just recorded tens of thousands of data points as the car shot down the track. The laptop was a Dell Latitude ATG D630, connected via USB to the Hondata ECU under the dash of the car. Minutes later I would analyze the run on the Dell (temperatures, shift points, slippage, air/fuel ratio, knocks, timing, etc.) and then make changes on the fly. The next heat was in less than 15 minutes.

At the time I didn't know that a few years later I'd be talking with the JTG Daugherty NASCAR racing team about how they used rugged Dell laptops on their #47 Sprint Cup car, driven by NASCAR legend Bobby Labonte. Labonte won the Cup in 2000, during an era in which he was a perennial contender, and he won IROC in 2001, following in the footsteps of his brother Terry Labonte, also an IROC and Cup champion. Now a senior amongst NASCAR drivers at age 49, Labonte is piloting car #47 for the team owned by Jodi and Tad Geschickter and NBA Hall of Famer Brad Daugherty. Lady Luck hasn't been too kind to them this season, but that's certainly not due to this talented group, and also not due to the technology they're using. Most recently, while Martin Truex Jr. won at Sonoma Raceway in his Toyota Camry, a blown oil cooler ended Labonte's race in essentially the same Camry before it even began. Those are the breaks.

So I felt almost guilty when I got on the phone Monday morning after that race with Matt Corey, the IT administrator at JTG Daugherty Racing, and Dell's Umang Patel and Alan Auyeung to discuss JTG Daugherty's use of Dell technology. Corey, in particular, probably didn't feel too good after the frustrating weekend and had plenty of other things to do at their shop, but he took the call. Much appreciated.

So how is JTG Daugherty Racing using Dell computers? And what made them decide to go with Dell from the driver to the garage and the pit crew to the data center, with a complete suite of Dell technology and solutions that also includes rugged Dell ATG and XFR laptops? The choice of Dell for the data center and office isn't much of a surprise, given that Dell has consistently been among the top three PC vendors worldwide and in the US. What's more interesting is that JTG Daugherty also chose Dell for their rugged laptops, a field dominated by Panasonic, Getac and a number of other vendors specializing in rugged equipment.

Corey began by explaining a NASCAR racing team's inherent need for rugged technology. No surprises here. There's rain, dust, vibration, extreme temperatures, the whole gamut of hazards rugged mobile computing gear is designed and built to survive. Add to that the extreme time crunch as a race car is tested and prepared, and the need for absolute reliability in a sport where fractions of a second matter and a car is checked, refueled, and has all of its tires changed in something like 13 seconds. Things simply must not go wrong, ever, in such an environment, and racing teams certainly cannot put up with finicky computing technology that may or may not be up to the job. As an example, Corey told of an incident where a consumer laptop simply wasn't able to handle vibration, causing the team a lot of grief.



So as a result, JTG Daugherty now uses rugged gear. Their race engineering team has Dell Latitude E6430 ATG laptops. The ultra-rugged Dell Latitude E6420 XFR is used on the truck and trailer. They also use Windows-based Dell Latitude 10 tablets in Griffin Survivor cases supplied by Dell. All of this means that the team can collect performance stats, analyze them, and make changes quickly and reliably. "We have connectivity everywhere," said Corey. "As the car chief makes a decision about a change to the car, for example, he now notes this on his Latitude 10 and the information is instantly communicated to everyone across the organization. All decisions from the car chief trickle down to updates to the racecar, and with everyone synced together with tablets and other Dell technology, that information flow is now much faster, more reliable and more efficient."

But still, why Dell for the rugged gear? Here I expected Corey to point to the advantage of dealing with a one-stop vendor. Instead he said that they had used Toughbooks in the past and liked them, but that "they really didn't change much over the years, same always," and that Dell updates more frequently. "We don't want plain vanilla," he said, "we want to be on the cutting edge of technology," and he listed the memory and processing speed required to power through race simulations, high resolution imaging, and massive data sets.

Staying at, or near, the leading edge in technology while still adhering to the longer purchase cycles and life spans of rugged equipment, and guarding against obsolescence of docks, peripherals, accessories and software, has always been a challenge for the rugged computing industry. While Intel tends to unveil a new processor generation and ancillary technology every 12 to 18 months, the rugged industry cannot possibly update at the same pace. Even Dell is not immune in that regard; as of now, the rugged XFR laptop is still based on the E6420 platform.

Yet, Dell does have the advantage of very high production volume and with that comes access to the very latest technology. Combine that with the convenience and peace of mind of dealing with a large one-stop shop, and it's no surprise that even a NASCAR racing team chose Dell.

See NASCAR Team Selects Dell to Speed Past the Competition, Dell's Latitude for Rugged Mobility page, and RuggedPCReview.com's most recent reviews of the Dell Latitude ATG and Dell Latitude XFR.

Posted by conradb212 at 07:47 PM | Comments (0)

May 28, 2013

Rugged notebooks: challenges and opportunities

I've been working on setting up our new rugged notebook comparison tool over the past few days. So far the tool, where users can compare the full specs of up to three rugged notebooks side-by-side and quickly link to our analysis of each machine, has far fewer entries than our comparison tools for rugged handhelds and rugged tablets. Asking myself why there are relatively few products out there got me thinking about the overall rugged notebook situation.

A little while ago I came across a news brief by DigiTimes, the Taipei-based tech news service that's always interesting to read (albeit not always totally accurate). The news item was about Getac gunning for an increased market share in rugged notebooks. DigiTimes said the current worldwide rugged notebook market share situation was something like Panasonic having 60%, and Getac and General Dynamics Itronix each about 12.5%. They didn't specify the remaining 15%, but it's obviously split among a number of smaller players.

That news came just a short while after General Dynamics officially pulled the plug on Itronix, so those 12.5% that used to be GD-Itronix rugged notebooks such as the GD6000, GD8000 and GD8200, are now gone and up for grabs. Who will step up to bat? Will Getac take over what GD-Itronix used to have? Or will Panasonic's Toughbooks get even more dominant? Or will perhaps someone else emerge?

There's no easy answer, and the market is a rather fragmented one. First, it's not totally clear what makes a notebook "rugged." In the business we generally differentiate between "rugged" and "semi-rugged," where the more expensive fully rugged devices carry better sealing and are built to handle more abuse than semi-rugged models, which offer somewhat less protection but usually cost and weigh less in return. But rugged and semi-rugged are not the only designations you see in the market. Some manufacturers also use terms such as "business-rugged," "vehicle-rugged," "durable," or even "enterprise-rugged." There's also "fully-rugged" and "ultra-rugged."

Of machines on the market, we'd consider products such as the Panasonic Toughbook CF31, Getac B300 or GD-Itronix GD8200 as rugged, and the Panasonic Toughbook 53, the Getac S400 and the GD-Itronix GD6000 as semi-rugged. But then there are also notebooks specifically made for enterprise and business that are better made than run-of-the-mill consumer notebooks, but somehow defy definition. Examples are the very light magnesium notebooks by Panasonic that cost a lot more than any regular laptop and can take much more abuse, but do not look tough and rugged.

Then there's yet another category of laptops that are almost exclusively used for business and vertical market applications: convertible notebooks. These had their origin when the industry was intrigued by tablets in the early 1990s and then again in the early 2000s, but wasn't quite sure if customers would accept them, so it made something that could be used both as a tablet and as a laptop. These usually cost more than notebooks and were heavier than tablets, but somehow the concept is still around, and there are many models to choose from. Some are fully rugged, such as the Getac V100/V200 and the Panasonic Toughbook 19; others are semi-rugged, like the Panasonic Toughbook C2, or business-rugged, such as the Lenovo ThinkPad X230t or the HP EliteBook 2760p.

Yet another category is rugged notebooks that are based on large volume consumer notebooks. Examples are the semi-rugged Dell Latitude ATG and the fully rugged Dell Latitude XFR. With Dell having quick and easy access to all the latest technology, the ATG at least is almost always at, or close to, the state of the art in processors and other technology.

And there are further twists. While the likes of Panasonic and Getac make their own notebooks, a good number of others are made by a small handful of OEMs under exclusive or (more often) non-exclusive agreements with resellers that put their own brand names and model numbers on the devices. Taiwanese Twinhead, for example, had a longstanding relationship with the now defunct General Dynamics Itronix, with some models exclusive to Itronix and others marketed by various vendors through different channels. That can make for interesting situations. While Twinhead was and is an important OEM, it also sold its mostly semi-rugged lineup under its own name and the Durabook brand, and also through its US subsidiary GammaTech.

But there's more. A number of smaller players, or small parts of larger industries, provide highly specialized rugged notebooks that are often so unique as to only target very narrow markets. Some machines are built specifically to the requirements of military and other government contracts. Their names and brands are usually unknown to anyone outside of the small circle of targeted customers.

Why are there so few rugged and semi-rugged notebooks? One reason is that the market for them isn't all that large. They are popular in police cars and similar applications, and wherever notebooks simply must be much better built than fragile consumer models. Another reason is price. Even relatively high-volume semi-rugged laptops cost two to three times as much as a similarly configured consumer model. Rugged notebooks run three to five times as much, and specialized models may be ten times as much.

By and large, the rugged computing industry has been doing a good job educating their customers to consider total cost of ownership as opposed to looking only at the initial purchase price, but it's not always an easy sell. And with handy, inexpensive tablets flooding the market, it isn't getting any easier. Add to that the fact that makers of rugged notebooks always had a special millstone hanging around their necks, that of having to make sure that products stay compatible with existing docks, peripherals and software. That often prevents them from adapting to new trends and switching to newer technologies and form factors (like, for example, wider screens) as quickly as some customers demand. While it's certainly nice to see Intel coming out with a new generation of ever-better processors every year or two, it's not making it easier for rugged manufacturers to stay current in technology and features either.

As is, if Itronix really had a roughly 12.5% market share, that slice of the pie is now up for grabs and it should be interesting to see who ends up with it.

Posted by conradb212 at 03:24 AM | Comments (0)

May 17, 2013

How Motorola Solutions made two mobile computers condensation- and freezer-proof

Good phone conversation today with the PR folks from Motorola Solutions. The occasion was the introduction of two interesting new products, the Omnii XT15f industrial handheld and the Psion VH10f vehicle-mount computer. The key here is the "f" in both names. It stands for "freezer," and that's what the two new devices are all about. Big deal?

Actually, yes. At least for workers who use their computers in and around freezers. That includes storage of perishable foods, the strictly temperature-controlled environments where medications are stored, and numerous other places for goods that need to be or stay frozen. So what's the issue? You just get devices that can handle the cold and that's it, right?

Yes and no. While the environmental specs of most rugged computing devices include an operating temperature range, that range only tells you the temperatures within which the device can be used. In the real world, and particularly when working around freezers, temperature alone isn't the whole issue. What matters is how a device handles frequent, rapid changes in temperature. The real enemy then becomes condensation, not temperature as such. Extreme temperatures remain an issue, of course, but one that must be addressed as part of the larger issue of rapidly changing temperatures.

So what exactly happens? Well, if you go from a hot and humid loading dock into a freezer, the rapidly cooling air in and around a device loses its ability to carry moisture, which then becomes condensation. That condensation then freezes, which can cause frost on displays, rendering them illegible, frozen keys on the keypad, and possibly internal shorts. When the worker leaves the freezer environment, the frost quickly melts, again affecting legibility of the display and possibly causing electrical shorts. It's quite obvious that extended cycling between those two environments not only makes the device difficult to use, but it's almost certainly going to cause damage over time.
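To put a rough number on the physics: the dew point is the temperature below which moisture condenses out of the air, and the standard Magnus approximation is accurate enough to illustrate the problem. Here's a minimal Python sketch (the formula is the textbook approximation; the loading-dock scenario is my illustration, not Motorola's data):

```python
import math

def dew_point_c(temp_c, rel_humidity_pct):
    """Approximate dew point via the Magnus formula."""
    a, b = 17.62, 243.12  # Magnus constants for water vapor
    gamma = a * temp_c / (b + temp_c) + math.log(rel_humidity_pct / 100.0)
    return b * gamma / (a - gamma)

# Air on a warm, humid loading dock: 30 C (86 F) at 60% relative humidity
print(round(dew_point_c(30, 60), 1))  # ~21.4 C

# Any surface colder than that fogs up instantly. A device coming out of
# a -20 C freezer is some 40 degrees below the dew point, so condensation
# (and then frost) is guaranteed on every trip through the freezer door.
```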

Now add to that the slowing down of displays in extreme cold and the general loss of battery capacity, and it becomes obvious why this is an issue for anyone using mobile computers in those environments. And hence the new "freezer" versions of those two Motorola Solutions products, the Omnii XT15f and the Psion VH10f.

So what did Motorola do? Weber Shandwick's ever-helpful Anne Norburg suggested I talk to the source and arranged the call, and so I had a chance to ask Amanda Honig, Product and Solutions Marketing Manager for Enterprise Mobile Computing, and Bill Abelson of Motorola's media team. The overall challenge, they said, was to provide reliable "frost- and condensation-free" scanning. In order to do that, they had to address a number of issues:

Since the scan window can fog up, they used internal heaters to automatically defog the window, thus facilitating accurate scans under any condition.

Since external condensation can quickly freeze around keys and make the keypad difficult or impossible to operate, they designed special freeze-resistant keyboard layouts with larger and more widely spaced keys.

Since the airspace between the LCD display and the touchscreen overlay can fog up from condensation and make the display unreadable and imprecise to operate, they optically bonded layers to eliminate air spaces and used a heater to eliminate internal display fogging.

Since battery capacity tanks in very low temperatures and standard batteries can get damaged, they used special low temperature batteries with higher capacities and minimized performance loss at low temperatures.

And to make sure this all works transparently and without needing any worker involvement, they included environmental sensors and heater logic circuitry so that the device automatically handles the rapidly changing temperatures and humidity (there are, however, also ways to do it manually). Conceptually, that logic amounts to a thermostat guarding the dew point; see the sketch after the next paragraph.

Finally, since it makes no sense to overbuild, they offer two versions. One is called "Chiller" and is considered "condensation-resistant," with an operating temperature range of -4 to 122 degrees Fahrenheit (-20 to 50 Celsius). The other is called "Arctic" and is considered "condensation-free." That one can handle -22 to 122 degrees (-30 to 50 Celsius). The Chiller and Arctic versions add US$700 and US$1,100, respectively, to the cost of the basic Omnii XT15 handheld computer. If it means fewer equipment hassles when getting in and out of freezers, that's a small price to pay.
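Motorola didn't describe the actual firmware, so here is only a minimal, hypothetical sketch of how such sensor-driven heater logic typically works: hysteresis around the dew point. All thresholds and names are my own placeholders, not Motorola's actual values:

```python
def heater_needed(surface_temp_c, dew_point_c, heater_on, margin_c=3.0):
    """Hysteresis control: run the heater whenever the scan window or
    display gets close to the dew point of the surrounding air, so
    condensation never gets a chance to form. Illustrative values only."""
    if surface_temp_c <= dew_point_c + margin_c:
        return True                      # too close to fogging: heat
    if heater_on and surface_temp_c < dew_point_c + 2 * margin_c:
        return True                      # keep heating a bit past the trigger
    return False                         # safely above dew point: save power

# Leaving a -20 C freezer for a humid dock whose air has a ~21 C dew point:
heater = False
for temp in (-20, 0, 15, 24, 28):        # window temperature warming up
    heater = heater_needed(temp, 21.0, heater)
    print(f"window at {temp:>4} C -> heater {'ON' if heater else 'off'}")
```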

There's another interesting aspect to all this. Changing and upgrading existing equipment is never easy, but in this case it was made easier because Psion, even prior to its acquisition by Motorola Solutions, had given much thought to modular design as a means of quickly and easily adapting to special requirements, easing maintenance, and future-proofing. At the very least this means much of the repair and maintenance work can be done in the field. And I wouldn't be surprised if it also made it easier to come up with these special versions.

Posted by conradb212 at 11:19 PM | Comments (0)

May 14, 2013

Handheld: Pursuit of a vision

I had a chance yesterday to meet over dinner with Sofia Löfblad, Marketing Director at Handheld Group AB, and Amy Urban, Director of Marketing at Handheld US. I hadn't seen them since I presented at the Handheld Business Partner Conference in Stockholm three years ago, and it was a pleasure catching up in person.

The Handheld Group (not to be confused with Hand Held Products, which is now part of Honeywell) is a remarkable rugged mobile computing success story. Having its origins as a distributor of vertical market mobile computers from the likes of Husky, TDS and others, Handheld went on to establish its own identity with its own distinct product lines. In fact, they call themselves a "virtual manufacturer."

What does that mean? Well, while it is not unusual for larger distributors to resell OEM products under their own name, Handheld went one step farther. They not only have their own brands (Nautiz for rugged handhelds, Algiz for rugged tablets), but also their own design language and color scheme (Sofia even knew the precise Pantone color numbers), and they often have exclusive arrangements with their OEMs. So in addition to having a very cohesive brand identity and consistent look, Handheld products are less likely to immediately be identified by industry followers as rebranded versions of a common design.

How has that worked out for the Handheld Group? Apparently quite well. They now have ten facilities all over the world, as well as several hundred authorized partners. And they've been able to score impressive wins, like a contract for 10,000 rugged handhelds with Netherlands Railways against much larger competition.

They also proved their knack for coming out with the right product at the right time with devices such as the Algiz 10X (a rugged but light and handy 10-inch tablet), the Algiz XRW (a super-compact rugged notebook), and the Nautiz X1, which they call the toughest smartphone in the world. On the surface, that doesn't sound all that terribly exciting, but it really is, and here's why:

I am on record as bemoaning the demise of the netbook, those small and handy notebooks that used to sell by the tens of millions. Then they disappeared due to a combination of being replaced by consumer tablets and, even more so, an unfortunate industry tendency to keep netbooks so stunted in their capabilities as to render them virtually useless for anything but the most basic tasks. Well, now that they are gone, many wish they could still get a small, competent Windows notebook that's tough and rugged, but isn't as big, expensive and bulky as a full-size rugged notebook. And that is the Algiz XRW. I've liked it ever since I took an early version on a marine expedition to the Socorro islands a couple of years ago (see Case Study: Computers in Diving and Marine Exploration). And the latest is the best one yet (see here).

The Algiz 10X likewise is a Q-ship (i.e. an innocuous looking object that packs an unexpected punch). On the surface, it's just a rugged legacy tablet, albeit a remarkably compact and lightweight version. And while that is mostly what it is, the 10X hits a sweet spot between old-style rugged tablet and new-style media tablet. One that will likely resonate with quite a few buyers who still need full Windows 7 and full ruggedness on a tablet and also some legacy ports, all the while also wanting a bright wide-format hi-res screen and a nice contemporary look.

Then there's the Nautiz X1 rugged smartphone, and that's a real mindblower. By now there are quite a few attempts at providing consumer smartphone functionality in a tougher package, but none as small, sleek and elegant as the Nautiz X1. It measures 4.9 x 2.6 inches, which is exactly the size of the Samsung Galaxy S2 (the one before Samsung decided to make the displays almost as big as a tablet). At 0.6 inches it's thicker, and it weighs 6.3 ounces, but for that you get IP67 sealing (yes, totally waterproof), a ridiculously wide -4 to 140 degree operating temperature range, and all the MIL-STD-810G ruggedness specs you'd usually only get from something big and bulky. Which the Nautiz X1 definitely is not.

In fact, with its gorgeous 4-inch 800 x 480 pixel procap screen, Android 4.x, and fully contemporary smartphone guts, this is the tough smartphone Lowe's should have looked at before they bought all those tens of thousands of iPhones (see here). Don't get me wrong—I adore the iPhone, but it's devices like the Handheld Nautiz X1 that belong in the hands of folks who use smartphones on the job all day long, and on jobs where they get dropped and rained on and so on.

I don't know if Handheld is large enough to take full advantage of the remarkable products they have. They've done it before with that big contract in the Netherlands. But whatever may happen, it's hard not to be impressed with their fresh, competent products, the great people behind them, and a well-executed business plan.

Posted by conradb212 at 03:59 PM | Comments (0)

April 24, 2013

Itronix RIP

Last week, as I came to a stop at a red light, a police car stopped in the lane next to me. What immediately caught my eye was an expertly mounted rugged notebook computer, angled towards the driver. It was a GD-Itronix rugged notebook, probably a GD6000 or GD8200, with an elegant matte-silver powder-coated insert on top of the magnesium alloy computer case that prominently featured the "General Dynamics" brand name. The officer perused the screen, then looked up, and briefly our eyes met. He had no idea how well I knew that computer in his car, and the one that came before it, and the one before that.

I began following Itronix in the mid-1990s when their rugged notebooks still carried the X-C designation that stood for "Cross Country." Around that time, Itronix purchased British Husky and with that came the FEX21, and since Windows CE was starting to come on strong in smaller rugged devices, Itronix also introduced the tough little T5200 clamshell. I remember a call with Itronix in 1996 or so when I was watching my infant son in the office for an hour or two while his mom was shopping. The little guy was not happy and screamed his head off the entire time I was on the phone with Matt Gerber who told me not to worry as he had a couple of young kids himself. I remember hoping he didn't think we were running a monkey operation.

Around the turn of the millennium, Itronix, in a clear challenge to Panasonic's rugged yet stylish Toughbooks, launched the GoBook. It was a clean, elegant, impressive machine with such cool features as a waterproof "NiteVue" keyboard with phosphorescent keys, and seamless, interference-shielded integration of a variety of radio options. I was impressed.

That first GoBook would quickly evolve into larger, more powerful versions and then spawn a whole line of GoBook branded rugged notebooks, tablets and interesting new devices such as the GoBook MR-1 that measured just 6 x 4.5 inches, yet brought full Windows in a super-rugged 2.5-pound package to anyone who needed the whole Windows experience in such a small device. On the big boy side came the impressive GoBook II, then III, and then "Project Titan," the incomparable GoBook XR-1. At the time we said that it had "raised the bar for high performance rugged notebooks by a considerable margin. It has done so by offering a near perfect balance of performance, versatility, ruggedness and good industrial design." High praise indeed, and totally deserved.

Itronix also branched out into the vehicle market with the semi-rugged GoBook VR-1 and into tablets with first the GoBook Tablet PC and then the GoBook Duo-Touch that combined both a touchscreen and an active digitizer into one small but rugged package. But even that wasn't all. With the introduction of the GoBook VR-2 came DynaVue, a truly superior new display technology that just blew my mind. Tim Hill and Marie Hartis had flown down from Spokane to demonstrate DynaVue on the new VR-2, and both could hardly contain their excitement. DynaVue ended up revolutionizing rugged systems display technology with a very clever combination of layering of filters and polarizers, and its approach became the basis of outdoor-viewable display technology still in use today.

I'll never forget a factory tour of the Itronix main facility in Spokane, meeting and speaking with some of the most dedicated engineers, designers, product planners and marketing people in the industry. I visited their ruggedness testing (I always called it "torture testing") lab which rivaled what I had seen at Intermec and at Panasonic in Japan. I spoke with their service people, the folks on the shop floor and with management. What a talented and enthusiastic group of people. The sky seemed the limit. (See report of the 2006 Spokane factory tour)

But change was brewing. Itronix's stellar performance had attracted suitors, and giant defense contractor General Dynamics, then a US$20 billion company with some 70,000 staff, felt Itronix would nicely complement and enhance its already massive roster of logistics, computing and military hardware. The sale had come as no surprise. Everyone knew it was eventually going to happen. Equity investment firm Golden Gate Capital had purchased Itronix in 2003 from former parent Acterna with the intent of prepping Itronix for a sale. Within just two years, Itronix prospered enough to make it a lucrative proposition for General Dynamics. Within Itronix, the hope was that the sheer mention of the name "General Dynamics" would open doors.

In our GoBook VR-1 review we cautiously offered that "the change in ownership will be both a challenge and a tremendous opportunity for Itronix."

Turns out we were right about the challenge part. The "GoBook" was quickly dropped in favor of a GDxxxx nomenclature, and with it the laboriously earned GoBook brand equity. There were attempts to somehow merge another GD acquisition, Tadpole, into Itronix, and that turned out to be difficult. No one seemed to know what to expect. And then the hammer fell.

In early 2009, General Dynamics announced it would phase out the Itronix computer manufacturing and service facility in Spokane, Washington, and instead operate the business out of Sunrise, Florida, where the company's C4 Systems division had an engineering facility. It was a terrible blow for Spokane, where losing Itronix meant the loss of almost 400 jobs. And the cross-country move meant Itronix lost most of what had made it the vibrant company it had been.

It was never the same after that. On the surface things continued to look good for a while. There seemed to be a cohesive product line with the GD2000, GD3000, GD4000, GD6000 and GD8000 rugged computing families. But from an editorial perspective, we were now dealing with people who didn't seem to know very much about the rugged computing business at all. And there no longer seemed to be a direction. Some of the final products were simply rebadged products from other companies. Eventually, there was mostly silence.

In January 2013, I was told that "after in-depth market research and analysis, we have determined that it is in the best interests of our company, customers and partners to phase out a number of our General Dynamics Itronix rugged computing products." In April 2013 came the end: "Itronix has phased out all products."

That's very sad. A once great company gone. Could it have been different? Perhaps. But Itronix was often fighting against the odds. Even in its heydays, Itronix primarily worked with Taiwanese OEMs whereas its major competitors at Panasonic and Getac controlled their entire production process. In addition, while its location in Spokane was a calming constant, Itronix ownership was forever in flux. Itronix was started in 1989 as a unit of meter-reading company Itron to make rugged handheld computers. It was spun off from Itron in 1992, then sold to rugged computer maker Telxon in 1993. In 1997, telecom testing gear company Dynatech Corp. bought Itronix from Telxon for about $65 million. Dynatech changed its name to Acterna in 2000, but fell on hard times and sold Itronix to private equity firm Golden Gate Capital in 2003 for just US$40 million in cash. Golden Gate held on to it for a couple of years before General Dynamics came along. -- The band Jefferson Starship comes to mind here, with Grace Slick charging "Someone always playing corporation games; Who cares they're always changing corporation names..."

Perhaps there could have been a management buyout. Perhaps the City of Spokane could have helped. But that didn't happen, and though in hindsight it seems like a natural, there are always reasons why things happen the way they happen.

As is, there once was a superbly innovative company called Itronix, and they did good. I will miss them, and so probably will everyone interested in rugged computing equipment. I bet the police officer I saw with his Itronix laptop will, too.

Posted by conradb212 at 06:52 PM | Comments (0)

March 27, 2013

Xplore adds Common Access Card reader-equipped rugged tablet for military and government

This morning, March 27, 2013, Xplore Technologies introduced a new version of its ultra-rugged tablet computer, the iX104C5-M2. In essence, this is a specialized model for military and government personnel who require additional hardware security on top of the various hardware, software and firmware security measures already inherent in modern computing technology. What the new M2 model adds is an integrated Common Access Card (CAC) reader. With the reader, a U.S. government issued ISO 7816 smart card must be inserted in order to get access to critical data.

Why is the ability to read such cards, and to provide data access only with such a card, important? Because it's mandated in directives and policies such as Homeland Security Presidential Directive 12 (HSPD-12), which requires that all federal executive departments and agencies use secure and reliable forms of identification for employees and contractors. A chip in the CAC contains personal data, such as fingerprint images, special IDs and digital certificates that allow access to certain federally controlled systems and locations. As a result, both Federal agencies and private enterprise are now implementing FIPS 201-compliant ID programs.

But what exactly do all those card terms mean? What's, for example, the difference between a CAC and a PIV card, and how do they relate to Smart Cards in the first place? Well, the term "smart card" is generic. It's simply a card with a chip in it. The chip can then be used for data storage, access, or even application processing. A CAC is a specific type of smart card used by the US Department of Defense. A PIV (Personal Identification Verification) card is also a FIPS 201-compliant smart card used by the Federal government, but it's for civilian users. Then there's also a PIV-I smart card where the "I" stands for "Interoperable," and that one is for non-Federal users to access government systems.

The way a CAC works, specifically, is that once it's been inserted into the CAC reader, a PIN must be entered and the reader then checks via network connection with a government certificate authority server, and then either grants or denies access. The CAC stays in the reader for the entire session. Once it's removed, the session (and access) ends.
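In software terms, the flow just described is a small state machine: authenticate with card plus PIN, validate the certificate against the authority, and keep access alive only while the card stays in the reader. Here's a minimal Python sketch of that logic; all class, method and server names are hypothetical placeholders, not a real smart card or government API:

```python
# Hypothetical sketch of the CAC session flow described above.
# None of these names correspond to a real middleware API.

class CACSession:
    def __init__(self, reader, ca_server):
        self.reader = reader        # the tablet's built-in smart card reader
        self.ca_server = ca_server  # government certificate authority service
        self.active = False

    def begin(self, pin):
        """Card inserted + correct PIN + valid certificate = access granted."""
        card = self.reader.current_card()
        if card is None or not card.verify_pin(pin):
            return False
        self.active = self.ca_server.validate(card.certificate())
        return self.active

    def poll(self):
        """The card must stay in the reader; pulling it ends the session."""
        if self.active and self.reader.current_card() is None:
            self.active = False
        return self.active
```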

What this means is that only computers that have a CAC reader can be used for certain military and other governmental work. And the new Xplore iX104C5-M2 provides that reader. It's built directly into the chassis where it is secured and protected.

I had a chance to talk with Xplore Technologies representatives prior to the release of the new model. They said they created this new version specifically to meet the requirements of the Department of Defense, the Air Force, and Homeland Security. According to them, the potential market for CAC-equipped ruggedized tablets is 50,000 to 100,000 units. Considering that a rugged Xplore tablet lists for over US$5k, that would value that market at a quarter to half a billion dollars or more. Xplore's competition, of course, will step up to bat as well, and not all CAC-equipped computers will require the superior ruggedness and portability of an Xplore tablet. But it's easy to see why Xplore, a company with roughly US$30 million in annual sales, would throw its hat in the ring. Even winning a small percentage of that market could have a sizable impact on Xplore.
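For the record, here's the arithmetic behind that estimate (the unit range is Xplore's; the per-unit price is simply the list-price floor mentioned above, so fully configured systems would push the total higher):

```python
units_low, units_high = 50_000, 100_000   # Xplore's market estimate
list_price = 5_000                         # list-price floor, in US$

low = units_low * list_price               # $250,000,000
high = units_high * list_price             # $500,000,000
print(f"${low:,} to ${high:,}")            # a quarter to half a billion
```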

While I'm at it, let me recap what Xplore Technologies is all about and what they have to offer. Unlike the flood of Johnny-come-latelies attempting to grab a slice of the booming tablet market, Xplore has been making tablets for the better part of two decades. Starting in the mid-1990s, the company began offering some of the finest and most innovative rugged tablet computers available. They were at the forefront with integrating multiple wireless options into a rugged tablet, offering truly outdoor-readable displays, and providing dual mode digitizers that automatically switched between active pen and touch. We reviewed their current iX104C5 tablet platform in detail a couple of years ago (see here) and declared it "one of the best rugged tablet designs available today." In the meantime, Xplore has broadened the appeal of the platform with a number of versions targeted at specific industries and clienteles, and this latest M2 version continues that tradition with a very timely product.

See the Xplore iX104C5-M2 product page.

Posted by conradb212 at 06:58 PM | Comments (0)

March 06, 2013

When the fire chief wants iPads instead of rugged gear

The other day I was engaged in a conversation at a party. Turns out my conversation partner was the fire chief of an affluent community of about 120,000. We talked about our respective jobs and soon found we had something in common: rugged computing equipment. His department uses Panasonic Toughbooks, but the fire chief said something that has been on my mind for a while now. He said they liked the Toughbooks just fine, but he considered them much too expensive, and they'd just buy iPads instead. He said he doesn't care if the iPads break; they'll just replace them with new ones because consumer tablets cost so little.

I can see that rationale. It's one thing if a professional tool costs 50% more than a consumer grade tool. But another if the professional tool costs five to ten times as much. Over the past few years I've seen large chains buy massive numbers of consumer smartphones and tablets instead of the rugged industrial-grade handhelds and tablets they used to buy. Sometimes it seems like the rugged computing industry is missing out on a great opportunity to benefit from the boom in smartphones and tablets by staying with older technologies and very high-end pricing instead of offering ruggedized versions of what today's consumers want.

Posted by conradb212 at 03:57 PM | Comments (0)

February 07, 2013

Not your father's Celeron

In my last blog article I wrote about the needless demise of netbooks: people loved the rock-bottom price of netbooks but found them too small and lacking in performance, so they asked for more size and performance. The industry complied with larger, more powerful netbooks, but those cost more, and netbooks weren't netbooks anymore. So people stopped buying them. I also wrote how, in my opinion, Intel's inexpensive Atom processors made the netbook possible in the first place by enabling the low price, but then contributed to its demise with their often unacceptable performance. Unfortunately, that performance level also hobbled a lot of other industrial and vertical market devices based on Atoms.

So we have this unfortunate situation: Atom processors (of which there are by now about 50 different models) don't cost a lot, usually well under US$100, with some in the US$20 range. But they are also very marginal performers, so much so that not only netbook vendors abandoned them, but so did a good number of vertical market manufacturers, which quietly switched to "real" Intel Core processors. Unfortunately, even the low-end mobile Core i3 chips cost in the low US$200 range, and mobile Core i7 chips usually closer to US$400. Those are huge price differences with major impact on low-cost consumer electronics (though one would think far less impact on much higher priced vertical market computers, where the processor accounts for a much lower percentage of the overall purchase price).

What that all means is that there's an unfortunate gap between the inexpensive but rather underpowered Atom chips, and the beefy but much more expensive Core processors. (Oh, and while I'm at it, here's basically the difference between the by now three generations of Intel Core chips: First gen: good performance, but power hogs with insufficient graphics. Second gen: good performers with much better gas mileage but still sluggish graphics. Third gen: good performance, economical, and much better graphics.) But now to get back to the big gap between Atoms and Core processors: there's actually something in-between: Celerons and Pentiums.

Celerons and Pentiums? But weren't Pentiums old chips going back to the early 1990s, since replaced by Core processors? And weren't Celerons bargain-basement versions of those old Pentiums? Yes, that's so, but there are still Celerons and Pentiums in Intel's lineup, and they are NOT your father's Celerons and Pentiums; they are now slightly detuned versions of Core processors. They should really call them Core i1 or some such.

But let me explain, because those new-gen Celerons and Pentiums may well be one of the best-kept secrets in the processor world. If you go to the Intel website and look up their mobile processor lineups, you'll find them neatly organized by generation and then by Core Duo, Core 2 Duo, i3, i5, and i7. Celerons are still listed as either Celeron M Processors or Celeron Mobile Processors. The Celeron Ms are old hat and many go back a decade or more. The Celeron Mobile processors, however, include many models that are much newer, with the Celeron 10xx low and ultra-low voltage models launched in Q1 of 2013, i.e. as of this writing. I would have never noticed this, and probably would have continued thinking of Celerons as obsolete bargain processors, had it not been for an Acer mini notebook I just bought as a replacement for my vintage (2008) Acer Aspire One netbook.

You see, my new Aspire One has an 11.6-inch 1366 x 768 pixel screen and is really still a netbook, with netbook looks and a netbook price (I bought it as a refurb for US$250), but it has a Celeron instead of an Atom processor. The 1.4GHz Celeron 877, to be exact: introduced in Q2 of 2012, an ultra-low voltage design with a thermal design power of 17 watts. It uses the Sandy Bridge architecture of the second-gen Core processors and reportedly costs about US$70, no more than a higher end Atom chip, and only about US$25 more than the Atom N2600. Now how would that work, a real Sandy Bridge, non-Atom chip in a netbook?

Turns out very well.

The Celeron-powered little Acer posted a PassMark CPU score of 1,261, compared to Atom-powered devices in our rather comprehensive benchmark database ranging from a low of 163 (Atom N270) to a high of 512 (D510). The Celeron ran CrystalMark memory benchmarks between two and five times faster than the Atoms, and CrystalMark GDI benchmarks between three and five times faster. The Celeron 877 netbook also powered through most other benchmarks much faster than any Atom-based device. As a result, by netbook standards this new son-of-netbook absolutely flies. And it handles HD video, a big sore spot with early netbooks, without a problem.
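To make the gap concrete, here's the arithmetic on those CPU scores (the numbers are the ones from our benchmark database cited above; higher is better):

```python
# PassMark CPU scores cited above (higher is better)
passmark = {"Atom N270": 163, "Atom D510": 512, "Celeron 877": 1261}

celeron = passmark["Celeron 877"]
for chip in ("Atom N270", "Atom D510"):
    ratio = celeron / passmark[chip]
    print(f"Celeron 877 vs {chip}: {ratio:.1f}x the CPU score")
# Celeron 877 vs Atom N270: 7.7x the CPU score
# Celeron 877 vs Atom D510: 2.5x the CPU score
```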

But what about battery life and heat? After all, most of those mobile Atom chips have minuscule thermal design powers of between two and five watts (with the exception of the D510, which is at 13 watts), whereas, though designated an "ultra-low power" chip, the Celeron's TDP is 17 watts. Reviews on the web complain about insufficient battery life of this particular Acer Aspire One (the AO756-2888). Well, that's because to keep the price low, Acer gave this netbook only a wimpy 4-cell 37 watt-hour battery. Most earlier netbooks had beefier 6-cell batteries.

In real life, our benchmark testing has always suggested that Atom power management was relatively poor, whereas ever since Sandy Bridge (second gen), Core processor power management has been excellent. So the difference between Atom and Core-based power consumption can be a lot less than one would assume based on TDP alone. And that was exactly what I found with the Celeron in this new Acer netbook. BatteryMon power draw (with WiFi on) was as little as 6 watts. That's actually LESS than what we have observed in a good number of Atom-powered devices (and also less than my old 2008 Atom N270-powered Acer netbook). Sure, the top end of the Celeron-based machine is so much higher that it can draw down the battery quicker than an Atom device, but under normal use, the Sandy Bridge guts of the Celeron handle power management very, very well. As for heat, my new little Acer has a quiet fan, but it actually stays cooler, and the fan comes on less often, than my 2008 Atom-based netbook.
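A quick back-of-the-envelope calculation shows why the measured 6-watt draw matters more than the 17-watt TDP (capacity and draw figures are the ones from this article; real-world runtimes vary with load, so treat these as rough bounds):

```python
def runtime_hours(battery_wh, avg_draw_w):
    """Rough battery life: capacity divided by average system draw."""
    return battery_wh / avg_draw_w

battery_wh = 37.0  # the Aspire One's 4-cell pack

print(round(runtime_hours(battery_wh, 6.0), 1))   # ~6.2 h at the measured 6 W
print(round(runtime_hours(battery_wh, 17.0), 1))  # ~2.2 h if the CPU alone
# pulled its full 17 W TDP nonstop (worst case, ignoring the rest of the system)
```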

I am not an electrical engineer, so my conclusions about relative processor performance come from benchmarking, real life experience, and perusing Intel's tech specs. Based on that, I'd have to say the Pentium and Celeron versions of Intel's Core processors deserve a whole lot more attention in the mobile space. I don't know what it actually looks like at the chip level, but it feels like Intel starts with essentially one design, then adds features here and there (like all the extra Intel technologies in the Core i7 chips) and omits or perhaps disables them in lower level chips. As a result, the inherent goodness and competence of an Intel Core chip generation may be available in those little-known Celeron and Pentium versions, if not all of the features of the more expensive SKUs.

What does that all mean? Obviously, for those who need to run the latest and most 3D-intensive video game at insane frame rates, only the very best and most expensive will do. And the same goes for those who absolutely need one or more of those extra features and technologies baked in or enabled in i5 and i7 chips. If that is not an issue, the Celeron versions may just be a very well kept secret and a terrific bargain. The Celeron 877 sitting in my lowly new netbook absolutely runs rings around any Atom-based device, and it does so without even trying hard and while treating battery power as the precious commodity it is.

So... if I were a designer and manufacturer of vertical market industrial and rugged devices, I'd think long and hard before committing to yet another underpowered Atom chip that'll leave customers underwhelmed before long, and instead check what else Intel may have in its parts bin. There are some real bargains there, good ones.

Posted by conradb212 at 04:02 PM | Comments (0)

February 04, 2013

The needless demise of the netbook

Three or so years ago, netbooks sold by the millions. Today, they're gone, replaced by tablets and larger, more powerful notebooks. What happened? I mean, it's not as if tens of millions of people wanted a netbook a few years ago, and today no one wants one.

What's not to like about a small and handy notebook computer that runs full Windows and costs a whole lot less than even inexpensive larger notebooks? So much less that the purchase price of a netbook was close to making it an impulse buy.

The problem was, of course, that while the price was right, netbooks themselves weren't. Windows running slowly on a very small display with marginal resolution quickly turned the netbook experience sour. The very term "netbook" implied quick and easy access to the web, an inexpensive way to be online anytime and anywhere. Well, netbooks were so underpowered as to make that browsing and online experience painful. It didn't have to be that way, but market realities painted the netbook into a corner where it withered and died.

It's not that the technology wasn't there to make netbooks fast and satisfying enough to become a permanent fixture of what consumers buy. Nor was the technology required to make netbooks powerful enough too expensive. It's just that making such products available would have cannibalized more profitable larger notebooks. And consumers who demanded larger, more powerful netbooks at the same low price weren't thinking it through either.

There's a reason why compact technology demands a premium price. An unsubsidized 3-ounce smartphone costs as much as a 50-inch HD TV. A loaded Mini Cooper costs as much as a much larger SUV or truck. And ultra-mobile notebooks have always cost more than run-of-the-mill standard ones. It's the MacBook Air syndrome that dictates that sleek elegance and light weight costs extra. Netbooks broke that rule by promising the full Windows experience in an ultra-compact device at an ultra-low price.

You can't do that in the Wintel world. Something had to give, and that was acceptable performance. I would not go so far as to declare Intel's entire Atom project a frustrating, needless failure, as there are many Atom-based products that work just fine. But the whole approach of making processors not as good and fast as they could be, but throttled and limited enough so as not to interfere with sales of much more expensive processors, is fundamentally flawed. It's like promising people an inexpensive car, but then they find out it can't drive uphill.

So netbooks were flawed from the start in infuriating ways. The 1024 x 600 display format endlessly cut off the bottom of just about everything because just about everything is designed for at least a 1024 x 768 display. And that was the least of netbooks' annoying traits. Performance was the biggest problem. The Atom N270 processor in almost all early netbooks had painfully insufficient graphics performance, and was completely unable to play the HD video that people could generate on every cheap camera and phone. The endless wait for a netbook to complete any task beyond the very basics quickly turned people off. Yes, the small size and weight, the low cost, and the good battery life sold tens of millions of netbooks, but their inadequacy soon relegated them to the dustbin. In my case, I quickly realized that a netbook did not replace a larger notebook with standard performance; it just meant I had to take with me both the netbook AND the real computer.

So people demanded more. The original netbooks had 7-inch screens, but that quickly grew to 8.9 inches for all those Acer Aspire Ones and Asus Eee PCs. And then that wasn't large enough and so the netbook vendors switched to 10.1 inch screens. And then to whatever new Atom processors Intel introduced. Then tablets came and it was just so much easier, quicker and more pleasant to use a tablet to browse the web that the netbooks' shortcomings became even more evident.

With netbooks' fortunes waning but the iPad's tablet success turning out to be frustratingly difficult to copy, netbook vendors gave it one last shot. 11.6-inch screens with full 1366 x 768 720p resolution. AMD processors instead of Atom (short-lived and unsatisfactory). And finally ditching the Atom in favor of Intel Celeron and Pentium chips, which had little to do with the Celeron and Pentium M chips of yore but were simply wing-clipped versions of Intel's Core processors. By doing that, netbooks ceased to be netbooks. They had become smallish notebooks with decent performance, but without the endearing compactness, low weight and rock-bottom prices that once had given netbooks such allure.

And battery life suffered as well. Try as anyone might, it's just not possible to run an 11.6-inch screen and a 17-watt Celeron or Pentium for nearly as long on a battery charge as an 8.9-inch screen with a 2-watt Atom. So that quality of netbooks was gone, too.

Where does that leave all those folks who wanted a cheap and simple little notebook for when space, cost and weight mattered? Nowhere, really. Tablets are wonderful and I wouldn't want to be without mine, but they are not full-function computers. Not as long as real productivity software isn't available for them, and not as long as tablet makers act as if something as simple and necessary as being able to do or look at two things at once were the second coming. Fewer dropped calls, anyone?

So for now, if you peruse Best Buy or Costco or Fry's ads, you either get a tablet or a notebook with a 14-inch screen or larger, or you spring for an expensive MacBook Air or an Ultrabook.

That leaves a big void, and a bad taste in the mouth. For the fact is that there could be totally competent netbooks in the impulse buy price range if it weren't for the reality that Intel makes all those pricey Core processors that all by themselves can cost several times as much as a basic netbook. Which means the myth that you need a real Intel Core processor to run Windows and not just some wimpy ARM chip must be upheld. Personally, I do not believe that for a second, but the financial fortunes of two major technology companies (Microsoft and Intel) are built upon this mantra, and that won't change unless something gives.

So what did I do when my little old 8.9-inch Acer Aspire One finally gave out? First despair because I couldn't find a contemporary replacement, then grudgingly accept the reality of the netbook's demise and buy a new Aspire One, one with an 11.6-inch 720p screen and a Celeron processor. I got a refurbished one from Acer because it was cheaper and came with Windows 7 instead of Windows 8. So there.

But what if a low, low price is not the issue and you want something really rugged in the (former) netbook size and weight category? Then you get an Algiz XRW from the Handheld Group. It's small and light enough, runs forever on a charge thanks to using a little engine that for the most part can (the Atom N2600), and has a 720p screen good enough for real, contemporary work. And it's for all practical purposes indestructible.

Posted by conradb212 at 07:30 PM | Comments (0)

January 14, 2013

On the Microsoft front ...

Well, on the Microsoft side of things, a couple of areas are becoming a bit clearer. Not much, but a bit.

At the National Retail Federation (NRF) Annual Convention & Expo in New York, Microsoft issued a press release entitled "Microsoft Delivers Windows Embedded 8 Handheld for Enterprise Handheld Devices." That title is a bit misleading as those handhelds, prototypes of which were shown by Motorola Solutions, are not available yet, and Microsoft won't even release the Windows Embedded 8 Handheld SDK until later this year. However, after having stranded the vertical and industrial market with the by now very obsolete Windows Embedded Handheld (nee Windows Mobile 6.5) for a good couple of years, at least now it looks like Microsoft will offer a vertical market version of Windows Phone 8 for all those who want a handheld with a Microsoft OS on it instead of Android.

There will, of course, not be an upgrade path from Windows Mobile/Embedded Handheld to Windows Embedded 8 Handheld, just as there wasn't one from Windows Mobile to Windows Phone 7, or from Windows Phone 7/7.5 to Windows Phone 8. Still, at least having the prospect of soon getting an up-to-date mini Windows OS that's reasonably compatible with Windows 8 itself should be a huge relief to all those rugged handheld manufacturers who've been under increasing pressure to offer Android-based devices. Then again, Microsoft once again pre-announcing a product whose SDK hasn't even shipped yet will also further perpetuate the uncertain vertical market handheld OS status quo, and likely lead to more customers deciding to simply get readily available consumer smartphones instead of waiting for the vertical market smoke to clear.

On the tablet side, we have the, by most accounts, less than stellar reception of Windows 8. Microsoft will likely correct the situation with Windows 8 over time, but as far as tablets go, it's pretty easy to draw some preliminary conclusions: Like, no matter how good the Windows Surface RT tablet hardware was/is, without being able to run what most people will consider "Windows" for many years to come, Windows RT is simply not going to fly. If the Metro interface were a runaway hit and there were tons of Metro apps, perhaps. But as is, anyone who needs to use any "legacy" Windows software is out of luck with Windows RT. So it's a Windows CE situation all over again: Windows RT must not be too powerful or else it'll eat into Windows 8 market share. And there can't be a perception that ARM-based tablets are capable of running "real" Windows, or else there'd be no reason to spend a lot more for an Intel-based tablet.

Posted by conradb212 at 06:11 PM | Comments (0)

January 04, 2013

Big changes at General Dynamics Itronix

Eagle-eyed RuggedPCReview readers may have noticed something missing from the front page of our site: the General Dynamics Itronix logo in the site sponsor column. Yes, for the first time since the launch of RuggedPCReview, Itronix is not among our sponsors anymore. That's sad as Itronix was our first sponsor, and prior to that we had covered all those rugged Itronix GoBooks and other rugged mobile devices in Pen Computing Magazine since the mid-1990s.

What happened? We're not sure, but an email exchange with Doug Petteway, General Dynamics C4 Systems director of product management and marketing, revealed that the company is "restructuring its portfolio of rugged products to focus more on high value targeted solutions rather than the mass commodity market" and that while they'll continue selling the GD6000, GD8000 and GD8200 rugged notebooks through early 2013, the entire rest of the lineup of Itronix rugged mobile computing products is discontinued.

Petteway made the following statement:

"At General Dynamics C4 Systems, we have a set of core capabilities that we are leveraging aggressively to expand and grow in key markets. To maximize our potential for success, we must continually assess and refine our portfolio, investing in critical gap-filling capabilities that enable us to deliver highly relevant “must-have” solutions while also phasing out offerings that are no longer in high demand, freeing up valuable investment resources.

After in-depth market research and analysis, we have determined that it is in the best interests of our company, customers and partners to phase out a number of our General Dynamics Itronix rugged computing products. This decision may affect the solutions customers buy from us today. Please know that General Dynamics C4 Systems’ management team wants to assure you that our customer needs remain our first priority.

As always, customer satisfaction is paramount and we will continue to ensure our customers receive the service and support in full accordance with our warranty commitments.

We remain focused on being an industry leader with proven, high value communications, computing, security and mobility solutions for our customers.

Additional announcements will be made in the near future."

That doesn't sound very good, and not having all those rugged Itronix notebooks and tablets available anymore is a big loss. We wish Itronix all the best, whatever course General Dynamics has in mind for them.

Posted by conradb212 at 12:06 AM | Comments (0)

November 30, 2012

Surface with Windows 8 Pro pricing contemplations -- an opportunity for traditional vendors of rugged tablets?

On November 29, 2012, Microsoft revealed, on its Official Microsoft Blog (see here), pricing for its Surface with Windows 8 Pro tablets. The 64GB version will cost US$899 and the 128GB version runs US$999. That includes a pen but neither the touch nor the type cover. They cost extra.

So what do we make of that?

Based on my experience with the Surface with Windows RT tablet, I have no doubt that the hardware will be excellent. With a weight of two pounds and a thickness of just over half an inch, the Pro tablet is a bit heavier and thicker than the RT tablet, but still light and slim by Windows tablet standards. The display measures the same 10.6 inches diagonally, but has full 1920 x 1080 pixel resolution compared to the 1366 x 768 pixels of the RT tablet. That's the difference between 1080p and 720p in HDTV speak. There's a USB 3.0 port and a mini DisplayPort jack. Under the hood sits a 3rd Gen Intel Core i5 processor as opposed to the nVidia Tegra 3 ARM chip in the RT model. And both RAM and storage are twice what the RT tablet has. All that certainly makes for an attractive tablet.

What customers of a Surface with Windows 8 Pro get is a modern and rather high performance tablet that can be used with a pen or a mouse in desktop/legacy mode, and with touch in the new Metro mode with all the live tiles and all. You can use the pen in Metro mode, of course, but Metro wasn't designed for that. And you can use touch in legacy mode, but as 20 years of experience with Windows tablets has shown, legacy Windows does not work well with finger touch. Still, this will most likely be good hardware that makes full Windows available in a tablet, and also allows evaluating Metro in its native mode.

But let's move on to the ever important price. And here Microsoft faced an unenviable task. Microsoft tablets had to be price-competitive with the iPad, and the Surface RT tablets are. Except that so far they have not been accepted as "real" Microsoft tablets because they cannot run legacy Windows software. The Windows 8 Pro tablets are real Windows tablets, but they now cost more than iPads. Sure, they have more memory and ports and a memory card slot and an Intel Core processor, but the perception will still be that they cost more than iPads and are thus expensive. That's somewhat unfair because the i5 processor in the Microsoft tablet alone costs more than most consumer Android tablets. But this is an era where you can get an impressive, powerful and full-featured notebook for 500 bucks or so, and a sleek Ultrabook for well under a grand. That makes the tablet look expensive.

Price, in fact, has always been a weak spot with Windows-based tablets. Witness a bit of tablet history: the first pen tablets in the early 1990s cost almost $4,000. Even in an era where notebooks cost much more than what they cost today, that was too much, and it was one of several reasons why early pen tablets failed in the consumer market. Tablets did remain available in vertical markets throughout the 90s, albeit usually at around $4,000.

In 2001/2002 Microsoft tried again with their Tablet PC initiative. The goal there was to bring the tablet form factor, beloved by Bill Gates himself, to the business and consumer markets. The price was to be lower, and to make that possible, Microsoft initially mandated the use of inexpensive Transmeta processors. When they turned out to be too slow to drive the Windows XP Tablet PC Edition at an acceptable clip, everyone turned to Intel, and the average 2002-style Tablet PC ran around US$2,000. Which was still too expensive for the consumer market where customers could pick up a regular notebook for less.

Unfortunately, while two grand was too steep for consumers, the side effect was that companies like Fujitsu, Toshiba, and everyone else who had been selling tablets in the 90s now had to offer theirs for half as much as well, losing whatever little profit came from tablet sales in the process. What's happening now is that the Surface for Windows 8 Pro again halves the price people expect to pay for a tablet. And again there may be a situation where the public considers Microsoft's own Windows 8 tablets as too expensive while the verticals have to lower their prices to stay competitive with Microsoft itself.

And that won't be easy. Vertical market vendors have done a remarkable job in making business-class Windows 7 tablets available starting at around US$1,000 over the past year or so. But those tablets were almost all based on Intel Atom processors which are far less powerful than what Microsoft now offers in their own Windows 8 Pro tablets. So we have a situation where Intel pushed inexpensive Atom processors to make inexpensive tablets possible, but Microsoft itself has now upped the ante for its licensees by offering much more hardware for less.

Ouch.

It's hard to see how this could possibly leave much room for the traditional makers of business-class Windows tablets. Unless, that is, they find a way to compellingly answer the one request we've been hearing ever more loudly over the past couple of years: "we need a tablet like the iPad, but it must run Windows and be a lot more rugged than an iPad." Well, there's the niche. Tablets that match the iPad's style and Microsoft's newly established hardware standard, but a whole lot tougher than either and equipped with whatever special needs business and industrial customers have.

That ought to be possible. The traditional vertical market tablet makers and sellers already know their markets. And unlike the designers of consumer market tablets, they know how to seal and protect their hardware and make it survive in the field and on the job. What that means is that Microsoft's pricing for their Surface tablets may well be a glass half full for the rugged computing industry, and not one half empty.

Anyone for a sleek yet armored ULV Core i5 or i7-powered, IP67-sealed tablet with a 1080p dual-mode and sunlight viewable procap/active pen input display, a 6-foot drop spec, dual cameras with a 4k documentation mode, 4G LTE, and integrated or modular scanner/RFID/MSR options?


Posted by conradb212 at 08:31 PM | Comments (0)

November 21, 2012

Windows RT: how suitable is it for vertical markets? (Part II)

I had planned a quick follow-up on my first impressions of the Microsoft Surface RT tablet and Windows RT in general. But now it's almost a month later, so why the hesitation?

It's not because of Microsoft's hardware. I am as impressed with the Surface RT tablet as I was when I first took it out of its box. It's a truly terrific device. If after a month of use about the only gripe is that you still can't easily find the on-off button, you know the hardware itself is good. So no issues there. It never gets hot or even warms up. Battery life is practically a non-issue, like on the iPad. It's plenty fast enough. Honestly, the argument that for real performance and real work you need an Intel processor is pretty thin. What it really feels like is that Microsoft is in the difficult spot of having to artificially hold ARM hardware back via Windows RT so that it won't compete too much with Intel hardware, but at the same time Microsoft doesn't want to come across as being uncompetitive on ARM platforms. Tough position to be in.

And then there's the whole concept of Windows 8. I really did not want to get into a discussion of operating systems, but Microsoft makes it hard not to. Especially if you've been covering Microsoft's various mobile and pen/touch efforts over the years.

One giant problem is that Microsoft still does not want to let go of the "Windows on every device" maxim. So Windows 8 is on the desktop, on notebooks, on tablets and on phones. With Microsoft claiming it's all the same Windows, though it remains quite unclear to most whether it really is the same Windows or not. So from a practical perspective, what exactly is the advantage of the tile-based "Metro" look on all those very different computing platforms when you really can't run the same software anyway? Yes, the fairly consistent look is probably good for brand identity (as if Microsoft needs more of that), but at best it's inconvenient for users who have to deal with this one-size-fits-all approach.

And there are some other issues.

For example, what's the deal with the "flatness" of everything in Windows 8 and RT? Not long ago everything had to be 3D and layered, and now everything has to be completely flat? There is simply no good argument for that. 3D components on a screen always help make things more manageable and more obvious (and usually better looking), so complete flatness for complete flatness' sake seems weak.

Then there's the peculiarly low density of almost everything I've seen so far in Metro. Maybe that's just because Metro is only getting started, but between the Kansas-like flatness and very little on the screen, it feels strange and empty, and it means a lot of panning left and right.

And by far the biggest beef: why try to shoehorn everything into one operating system? It is abundantly clear that traditional Windows apps, the kind that hundreds of millions use every day, are simply not made for touch operation and may never be. Just because it's simple to touch here and there and use touch to consume information on small media tablets doesn't mean touch is the way to go with the much more complex interactive software most people use for work. Pretty much all of the creative work I do, for example, requires the pinpoint accuracy of a mouse: editing, image processing in Photoshop, layout in Quark Xpress, etc., etc. I cannot see how that can be replaced by just tapping on a screen.

So from that perspective, it does seem like Microsoft has simply done what the company has done every time in the past 20 years when new and disruptive technology came along -- it paid lip service by putting a fashionable layer on top of Windows. That's what happened with Windows for Pen Computing (1992), the Pen Services for Windows 95 and then 98, and the Windows XP Tablet PC Edition (2002). Only this time the disruptive technology (tablets) has found widespread enough acceptance to really get Microsoft's attention.

And a couple of personal peeves in Windows RT:

First, I find the live tiles annoying. All the constant motion on the screen is distracting, and in corporate environments it's bound to sidetrack people into consuming information. Let me make the decision what I want to do next, rather than have a screen full of tiles vying for my attention like a wall of living pictures in a Harry Potter movie.

Second, if Metro is indeed Microsoft's interface and operating environment of the future, does that mean we've come full circle: from one app per screen, to task switching, to software that finally allowed as many windows as we wanted, and now back to task-switching one thing at a time? That, given the right apps, may be good on small tablets, but it's definitely not the way I'd want to work on the desktop or even on a laptop.

Oh, and a third... if Microsoft is concerned about being so far behind with available apps in its store, it really doesn't show. If they were concerned, why would the store be as ultra-low density as it is, with no way of quickly finding what you really want? The store interface is minimal to a fault.

But on to Windows RT and its suitability for vertical markets. That actually might work, although there are several big ifs.

Windows RT for vertical markets: PRO

Economical hardware -- Judging by the initial Surface RT tablet, ARM-based Windows RT-powered tablets could be a perfect solution for numerous vertical market deployments. They are light, simple, quick, don't heat up, get superior battery life, and they cost less.

No virus/malware -- Users don't have to worry about viruses and malware because a) the main focus of the bad guys will remain Windows 8 proper, and b) all software must come from the Microsoft app store. That could be a big argument for Windows RT.

Device encryption -- There's device level encryption in Windows RT. That can be done in Windows 8 also (via BitLocker and other utilities), but in Windows RT it's in the OS itself.

Custom stores -- From what I hear, vertical market vendors will be able to have their own showrooms in the Microsoft store that only users of that vendor's hardware can see. That would/will be a great benefit for both users and vendors.

Microsoft Office -- Microsoft Office comes with Windows RT. I haven't done a feature by feature comparison with "real" Office and there are those who say Office RT is a dumbed-down version of Office. All I can say is that Office RT will meet the needs of a whole lot of users. If it's dumbed down, it's infinitely less dumbed-down than Office on Windows CE and Windows Mobile was. There are, however, some licensing issues as, at least for now, Microsoft considers Office RT not for commercial use.

Legacy and leverage -- Microsoft has always used the leverage argument ("your users and programmers already know Windows, and this will fit right in"), and Windows RT will probably benefit from that as well. It's curious how many of the age-old Windows utilities and apps actually run on Windows RT, and Windows RT will probably fit much more easily into a corporate Windows infrastructure than Android or iOS.


Windows RT for vertical markets: CON

Confusion -- You'll forever have to explain (and wonder) what exactly works and what doesn't work on Windows RT compared to Windows 8. Some may decide it's easier to just use Windows 8 instead.

Still not pure tablet software -- Unlike with Android and the iPad, Windows RT users still have to fall back into desktop mode for Office and perhaps other functionality (settings, configurations, etc.) where touch just doesn't work well and you really need a mouse. You can use any USB mouse with Windows RT, but it's frustrating to never know if you need a mouse on your new tablet or not.

Artificial limitations -- Since Windows RT is not supposed to compete too much with the Wintel side of Windows 8, there are hardware and software limitations to deal with in Windows RT, whether they make sense or not. Users are the victims here.

Vendor predicament -- How is a hardware vendor to make the call on Windows 8 versus Windows RT? Offer both? Make cheaper RT versions? That's exactly the kind of predicament vendors used to have with Windows versus Windows CE (CE lost).

So for now, as far as the suitability of Windows RT for vertical markets goes, I'll have to give an "A" for current Windows RT tablet hardware. It's really excellent, and ARM-based hardware could really be a boon for integrators and vertical market vendors; a "B-" for Windows RT itself, because for now Metro is too limited to be of much use; and a "D" for clarity of concept as it's totally unclear where Microsoft is headed with RT.

Posted by conradb212 at 05:30 PM | Comments (0)

October 27, 2012

Windows RT: how suitable is it for vertical markets? (Part I)

Though as of this writing, October 27, 2012, Windows 8 and RT were just officially unveiled a couple of days ago, reams have already been written on Windows 8 and, to a much lesser extent, Windows RT. We got our Surface RT tablet on October 26 with the intent of reporting on the Surface hardware and RT software in some detail. However, our emphasis will be on their suitability for vertical and industrial markets.

So what about Windows RT? The general word on it has been that it's a special version of Windows 8 for devices with ARM processors. A special version that will not be able to run any legacy Windows software, one that does not offer users the legacy desktop to go about their Windows business, and one where you cannot install software other than by downloading it from the official Windows store. Engadget clearly stated in its review of Windows Surface: "Windows RT can't run legacy programs written for traditional, x86-based Windows systems."

Is this all so?

Yes, and perhaps no.

So here's what we found so far on our Surface tablet.

It comes with Microsoft Office 2013, and you run those versions of Word, Excel, PowerPoint and OneNote on the Windows RT desktop. We took screen shots of Word, Excel and PowerPoint, and here's what the apps look like (click on the pics for full-size versions):

Note that Office RT isn't final yet. It'll be a free download when it is. From what I can tell (and I am not an Office expert), even what comes with Windows RT now is a full version of Office, and not some micro version like Windows CE/Mobile used to have. This is the real thing.

Anyone who expected Office to be totally touch-optimized for Windows RT will be disappointed. It's not. You can use it with touch, but it can be a frustrating experience. And the touch keyboard doesn't help. Fortunately, you can simply plug in any old mouse or keyboard or mouse/keyboard combo and it works with Windows RT right off the bat.

Below is a screen capture of a PowerPoint presentation (and yes, I picked the slide that shows Alan Kay predicting it all back in 1968, and the original IBM ThinkPad tablet from 1993, on purpose).

If you take a closer look at our Word and Excel screen captures, you'll notice that not only are they in their own windows, we also have legacy Windows apps like Paint, Notepad, Calculator, the Math Input Panel, a system shell and the old performance monitor running. Interestingly, they do run (and many others, too, like Remote Desktop, Windows PowerShell, the whole Control Panel, etc.), and you can even pin them on the old Windows task bar. In fact, there's a lot of old legacy Windows stuff down in the basement of Windows RT. And much of it seems as functional as ever.

I am not sure what to make of that. After all, Windows is not supposed to run on ARM, yet a good number of the old programs do run. The likely explanation is that Microsoft recompiled these bundled desktop utilities for ARM; what's blocked is installing third-party desktop software, not the desktop itself.

Unfortunately, that doesn't mean you can simply install and run other old software. If you do, there's a message that says you can only install software from the Windows store.

So what's our preliminary impression of Windows RT on a Surface tablet? Quite positive. The 1.3GHz quad-core Nvidia Tegra 3 CPU has plenty of power to make RT tablets perform well. The Nvidia setup doesn't need a fan and the tablet never even warms up, at all. And it seems to run almost ten hours on a charge.

Check back for more commentary on the suitability of Windows RT hardware and software for vertical markets.

Posted by conradb212 at 06:19 PM | Comments (0)

October 16, 2012

Windows Surface tablets will be here shortly

Now this should be interesting. On October 16, 2012, Microsoft announced more detail on its upcoming Windows Surface tablets. And though labeled as a "pre-order" in limited quantities, customers could actually order the Windows Surface RT tablet of their choice from the Surface page on Microsoft's online store. For delivery on or before October 26th, i.e. within ten days.

So the pricing of the Microsoft Windows RT tablets is no longer a secret. The basic 32GB tablet without a keyboard touch cover is US$499, the touch cover adds a hundred bucks, and the 64GB version with touch cover is US$699. That gets you a Microsoft-branded tablet that's as slender as the iPad, though it weighs a tiny bit more (1.5 vs 1.44 pounds). The Microsoft tablet looks wider because its 10.6-inch screen has a wide-format 16:9 aspect ratio compared to the iPad's 4:3.

There's a standard USB port (which the iPad doesn't have) and a standard microSD card slot (which the iPad also doesn't have). There's a capacitive touch screen of course, and two 720p cameras, meaning the Surface tablet is for video and not so much for taking pictures (for that you'd want higher res). The 1366 x 768 pixel resolution is more than the original iPad and the iPad 2's 1024 x 768, and it's also what's called 720p in HDTV and video speak, so it should be good for video playback.

All the expected sensors are there: ambient light, accelerometer, gyroscope and compass, meaning the Surface will be able to do the same tricks customers have come to expect from modern apps. And speaking of apps, the Surface RT tablet comes with Microsoft Office Home and Student 2013 RT (see here). It's not the final, final version, but it'll be a free update when that becomes available.

There's WiFi and Bluetooth, but no mobile broadband, so these initial versions of Microsoft's RT Surface tablets will need to be within the reach of a WiFi access point to be online. The processor is of the Nvidia Tegra variety, i.e. the type that has been powering the majority of Android tablets out there.

What's new and different is Windows RT, a version of Windows that runs on ARM processors and doesn't need the presumably more complex x86-based hardware required to run full Windows. What exactly that means remains to be seen. It's said that the Surface RT tablets are aimed at the consumer market, but the iPad was, too, and now it's used almost everywhere. How exactly will Windows RT work? How will it resonate with customers who have come to expect elegant, effortless simplicity from tablets? No one knows just yet.

And how will it all relate to Surface tablets with full Windows 8, tablets that, at least in the Microsoft Surface versions, will look very much like the Surface RT tablets, but have beefier hardware (anything from the latest Atom to third gen Core processors), higher resolution (1920 x 1080), and more storage? Will the two co-exist, with users selecting one or the other depending on their needs? The Windows 8 Pro versions will inevitably cost a good bit more, but how much more can a market bear where consumers have been spoiled with very inexpensive, very powerful notebook computers for years? Much will probably depend on how Windows 8 pans out.

Finally, what will it all mean to vertical and industrial market tablets? Will there be rugged tablets running Windows RT? Or will the ever-important leverage factor dictate that most enterprise and industrial tablets remain x86-based and compatible with legacy Windows? No one knows.

So for now I ordered a Surface RT tablet, just to see how it works and what it's all about.

Posted by conradb212 at 11:28 PM | Comments (0)

October 02, 2012

Motorola Solutions' acquisition of Psion: Good, bad, or ugly?

Well, it's done. Psion is now part of Motorola Solutions. On October 12th, 2012, Ron Caines and Frederic Bismuth of Psion and Mark Moon of Motorola Solutions sent out the following note to their customers:

Dear Psion Customer:

We are writing to let you know that today Motorola Solutions completed the acquisition of Psion PLC.

Motorola Solutions is a leading provider of mission-critical communication systems and a pioneer in enterprise mobility solutions. The company has always been focused first and foremost on how to best serve its customers and chose to acquire Psion because of its complementary enterprise mobile computing products and its talented people who understand this highly specialized business. We are excited about what this opportunity brings you as a valued Psion customer. Bringing the Psion family of products onboard allows Motorola Solutions to extend its portfolio and better serve customers by delivering solutions in expanded use cases, especially in warehousing, cold chain, ports, yards and specialized modular applications.

Integration of the two companies has only just begun today. There will be no immediate changes to your account management, the partners that serve you or the products and services you receive from Psion. Customers who previously purchased or will purchase Psion products can be assured their products will be fully serviced and supported for the full duration of the contracts. All customer support numbers also remain the same.

Furthermore, Motorola Solutions is committed to investing jointly around its and Psion's technical strengths and capabilities to deliver compelling solutions for the various applications and markets that both Motorola Solutions and Psion have served.

Once we have worked through the details of the integration, we will share those plans with you. You can be assured that throughout this process we will remain focused on building on Psion's relationship with you and serving all of our customers.

If you have any questions, please contact us or your Psion representative. Thank you for your continued loyalty and support.

With many of the smaller, independent manufacturers of rugged computing equipment being swallowed up by larger companies, this was perhaps inevitable. To many rugged computing enthusiasts and insiders, also inevitable is the question "why?" as there is rather substantial product line overlap between the two companies. In an informal conversation, a Motorola source said that the acquisition of Psion adds handheld products and vehicle-mount terminals that complement Motorola's offerings. The acquisition, the source said, also supports their international growth strategy by providing an attractive global installed base.

That's certainly true, but if the history of such acquisitions has shown anything, it's the latter reason rather than the former. As is, purchased product lines almost inevitably get absorbed. They may live on for a while, but in the longer run it makes no sense to carry duplicate lines. That's too bad as Psion was really on to something with their modular approach to rugged handheld computing platforms. What will become of the innovative ikôn, Neo, and Omnii? The tough WorkAbouts? The panels that still have the old Teklogix' DNA?

So for now, we reflect on what was. Through Pen Computing and RuggedPCReview.com we covered Psion for a very long time. First those really terrific little clamshell handhelds that were better than anything based on Windows CE at the time, then the acquisition of Teklogix in 2000 (I was at the press conference in Chicago when it was announced), the Psion netbooks way before the world bought tens of millions of "netbooks," and always the rugged handhelds. We had a close relationship with Psion most of the time; at some point we even had a "Psion PSection" in Pen Computing Magazine (with some of the columns still online at pencomputing.com/Psion/psection.html).

So here's hoping that Moto Solutions will aim for, and succeed in, creating the synergy that is always given as the reason for an acquisition. After all, Moto's own Symbol Technologies is well aware of the good (its own flourishing after being acquired by Moto), the bad (Intermec > Norand), and the ugly (Symbol > Telxon).

Posted by conradb212 at 03:11 PM | Comments (0)

August 31, 2012

"The Windows Marketplace for Mobile for windows mobile 6.x devices is closing"

"The Windows Marketplace for Mobile for windows mobile 6.x devices is closing" -- that was the title of a March 8, 2012 entry at answers.microsoft.com. In it, it said among other things, "Beginning May 9, 2012, the Windows Mobile 6.x Marketplace service will no longer be available. Starting on this date, you will no longer be able to browse, buy or download applications directly on your Windows Mobile 6.x phone using the Windows Mobile 6.x Marketplace application and service." Signed The Windows Phone Team (with "Ready for a new phone? Explore the latest Windows Phones -- now with over 60,000 applications and games available!" in their signature). I mean, the fact that the announcement was made by the Windows Phone team, whose job it is to replace Windows Mobile, and not whoever is responsible within the Windows Embedded contingent tasked with presiding over Windows Embedded Compact speaks volumes.

Good Grief.

What was Microsoft thinking? The one saving grace of what's left of Windows Mobile or Windows Embedded Compact, or whatever it's called these days, was the Windows Marketplace from which you could download apps directly into the device. Whenever I got a new Windows Mobile device for testing, the first thing I always did was download a few essentials, such as Google Maps, Bing, Facebook, Handmark's ExpressNews, a couple of utilities and converters, etc. Now you can't even do that anymore.

It's as if Microsoft (or whatever feuding faction within Microsoft presides over the demise of Windows Mobile these days) had dropped even the last ounce of pretense that they intend to maintain Windows Mobile as a viable contender to iOS and Android. Windows Mobile never was that, of course, but the nicely done Marketplace at least let long-suffering users personalize their devices to some extent. No more.

That is truly regrettable. I don't think anyone ever loved Windows Mobile, but the fact is that even today, in 2012, the vast majority of industrial and vertical market mobile hardware still runs one version of Windows Mobile or another. By ditching the Marketplace, Microsoft has now made sure that Windows Mobile devices are truly usable only via 100% custom-designed software that mostly avoids the OS interface altogether.

That is not a happy situation for all the rugged hardware vendors who have faithfully designed, manufactured and marketed innovative, reliable, high quality devices for all those years, and now are saddled with an ancient software platform that is neither supported properly by Microsoft, nor competitive against newer platforms, even those incompatible ones from Microsoft.

Posted by conradb212 at 04:56 PM | Comments (0)

August 11, 2012

Performing under pressure

As I am writing this, the London Olympic games are coming to an end. What two weeks of intense competition proved again is that winning means meticulous preparation, at times a bit of luck, and always the ability to perform under pressure. The latter made me think because rugged computers are all about the ability of a piece of equipment to perform under pressure. Pressure as in heat, cold, dust, rain, sun, and whatever else may keep a system from running at peak efficiency.

Ruggedness testing is designed to determine if systems hold up under pressure, but are the tests really meaningful? Many probably are. If, for example, a system is dropped a number of times from a certain height and still works afterwards, chances are it'll survive similar drops out there in the field. But are all tests as meaningful?

A while ago a manufacturer of rugged computers challenged us to test computing performance not just in an office environment, but also over the entire listed operating temperature range. We did, and not surprisingly, the machinery supplied by that company passed with flying colors, i.e. it ran through the benchmarks as fast at freezing and near boiling temperatures as it did at the 72F we usually have in the test lab.

But, as we subsequently found out, that seems to be the exception. We've been doing benchmark testing on some other rugged devices under thermal stress, and the results are reason for concern. If a rugged handheld, laptop or tablet is supposed to be used out in the field, it's reasonable to assume it'll be asked to perform at peak efficiency at temperatures one might likely encounter outdoors or on the job. Depending on where you are, that might easily include temperatures well over 100 degrees. Such work may well include prolonged exposure to the sun, where a device can heat up well beyond ambient temperature. If it is 105 degrees outdoors, temperatures may easily reach 115 or 120 degrees or even higher if you set the device down somewhere, or even if it's left in a car. So what happens to performance then? Can the device perform under pressure?

Turns out, not all can.

Running our standard benchmarks after leaving rugged systems out in the California summer sun showed performance drops of 50 to 80%. That's pretty serious. Is it acceptable that a piece of equipment that's supposed to be used outdoors then runs at only a fraction of its speed, or even at half speed? I'd say not. Think of the potential consequences. Tasks may take twice to several times as long, potentially affecting critical decisions.
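
For the technically curious, here's roughly how such a before-and-after comparison can be quantified. The snippet below is just a minimal Python sketch, not our actual benchmark suite: it times a fixed CPU-bound workload at room temperature and again after heat-soaking the device, then reports the relative slowdown.

    import time

    def workload():
        # Any deterministic, CPU-bound task will do as a stand-in benchmark.
        return sum(i * i for i in range(2_000_000))

    def best_time(runs=5):
        # The best of several runs filters out OS scheduling noise.
        times = []
        for _ in range(runs):
            start = time.perf_counter()
            workload()
            times.append(time.perf_counter() - start)
        return min(times)

    baseline = best_time()  # measured at room temperature
    input("Heat-soak the device in the sun, then press Enter... ")
    hot = best_time()       # measured again while the device is hot
    drop = (1.0 - baseline / hot) * 100.0
    print(f"baseline {baseline:.2f}s, hot {hot:.2f}s, performance drop {drop:.0f}%")

If the hot run takes four times as long as the baseline, that works out to a 75% performance drop, right in the range we observed.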

Is it reasonable to expect full performance under extreme conditions? Not necessarily. Extreme conditions can have an impact on electronics, and there may be justifiable, reasonable precautions to limit performance so as to safeguard the equipment and its life. But is it acceptable to see performance drop to a fraction at the limits of a listed operating temperature range? It's not. Customers should know what level of performance they can expect when the going gets tough.

Like at the Olympics, performance under pressure separates the rugged system winners from the also-rans. This really needs to be addressed.

And it's not a simple issue. Complex electronics such as processors have sophisticated internal power management. Boards have sensors that report temperatures to control mechanisms that then may throttle system performance. Firmware and the OS may also monitor environmental conditions and then engage fans or throttle performance. The hardware itself may have inherent design limitations. Variables such as glass transition temperature, or Tg, come into play; Tg is the temperature at which polymer materials go from a glassy state to a rubbery state. The types of capacitors used matter. Conformal coating can protect boards. HALT testing can predict real-life reliability better than the simple mean time between component failures. And so on.
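
Observing that throttling in action doesn't require special instrumentation, by the way. On a Linux-based test system, the kernel exposes the relevant sensors through sysfs; the sketch below logs temperature and CPU clock once per second while a benchmark runs in another window. The exact sysfs paths are assumptions here -- thermal zone and CPU numbers vary from platform to platform.

    import time

    # Typical Linux sysfs nodes; actual thermal zone and CPU numbers vary by device.
    TEMP = "/sys/class/thermal/thermal_zone0/temp"                  # millidegrees C
    FREQ = "/sys/devices/system/cpu/cpu0/cpufreq/scaling_cur_freq"  # kHz

    def read_int(path):
        with open(path) as f:
            return int(f.read().strip())

    # If the clock falls as the temperature climbs, the firmware or
    # OS is throttling the processor to protect the hardware.
    while True:
        temp_c = read_int(TEMP) / 1000.0
        freq_mhz = read_int(FREQ) / 1000.0
        print(f"{temp_c:5.1f} C   {freq_mhz:7.0f} MHz")
        time.sleep(1)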

All of this is standard practice in embedded systems design. It should be fully and universally adopted in rugged mobile system design as well.

Posted by conradb212 at 04:46 PM | Comments (0)

June 26, 2012

Microsoft's entry into tablet hardware a result of partner failure?

Ever since Microsoft provided a glimpse at a couple of "Surface" tablet hardware prototypes, some in the media are describing Microsoft's apparent entry into the hardware market as a result of Microsoft hardware partner failure. As if, somehow, the combined might of the world's computer manufacturers failed to come up with tablet hardware good enough to do Windows justice.

Nothing could be farther from the truth.

The reason why Windows-based tablets never were a major commercial success lies squarely in Microsoft's corner, and not in that of the hardware partners. To state the very obvious: Windows has never been a tablet operating system. It was designed for use with a keyboard and a mouse. It does not work well with touch, and it did not work well with pens.

If anything, hardware partners went out of their way with innovative ideas and products to make Windows work in Microsoft-mandated tablets. And let's not forget that it was Microsoft itself that, well into the lead-up to the 2002 Tablet PC introduction, began pushing convertible notebooks rather than tablets. Apparently, the company had so little faith in its own Tablet PC project that it seemed safer to introduce the Tablet PC Edition of Windows XP on a notebook with a digitizer screen rather than a true tablet. That, of course, made tablet PCs bigger and bulkier and more expensive.

Let's also not forget that Microsoft mandated an active digitizer for the 2002 Tablet PC because active pens better emulated the way a mouse (and with it, Windows) worked. Touch was definitely not part of the Tablet PC.

Microsoft's hardware partners did the absolute best they could within the great constraints of the Windows OS. In the 1990s, companies like GRiD, Fujitsu, Toshiba, NEC, IBM, Samsung, Compaq and many others came up with numerous tablet computer solutions trying to somehow make Windows work in smaller, lighter, handier platforms without physical keyboards. In the 2000s, a whole roster of hardware partners came up with tablet and tablet convertible hardware when Bill Gates proclaimed that by 2006, tablets would be the most popular form of PCs in America. They (Motion Computing, Fujitsu, Acer, Toshiba, Panasonic, etc.) invested the money and they carried the risk, not Microsoft.

Add to that the unsung heroes of the tablet computer form factors, the companies that made all those vertical market tablets for applications where it simply wasn't feasible to carry around a big laptop. They made do with what they had on the operating system side. And they did a remarkable job.

To now complain about "partner failures" is simply asinine. And given that even now, hardware partners will have to decide whether to bet on x86 Windows 8 or ARM Windows RT, will they again be blamed if one or both flavors of Windows 8 fail to make inroads against the iPad and Android tablets?

Posted by conradb212 at 10:47 PM | Comments (0)

June 21, 2012

Windows Phone 8...

Sometimes I wish I could be a fly on the wall to listen in when Microsoft's mobile folks make their decisions.

I mean, a few years ago they found themselves in a position where, against all odds, their erstwhile omnipotent foe Palm collapsed and left Windows Mobile as the heir apparent. So did Microsoft take advantage of that? Nope. Instead, they failed to improve their mobile OS in any meaningful way, all the while confusing customers by endlessly renaming the thing. And handing leadership over to the phone companies.

Then Apple comes along and shows the world how smartphones are supposed to be. Well, apart from grafting a Zune-like home screen onto it, Microsoft did virtually nothing to advance Windows CE from its mid-1990s roots. Then they come up with Windows Phone 7, which is a whole lot better, but completely incompatible with any earlier Windows CE/Windows Mobile devices and software.

While Phone 7 and the Phone 7.5 update were billed as the future, apparently they weren't, as now there will be Windows Phone 8, which is.... completely incompatible with Phone 7/7.5. And why? Because Phone 8 will supposedly share the same Windows kernel that "real" Windows has (though presumably not the ARM versions). So if Windows Phone 7/7.5 still had Windows CE underpinnings, why were those versions not compatible at all with earlier Windows CE/Windows Mobile versions? It's just all so confusing.

And about the shared Windows kernel: Wasn't the very idea of Windows everywhere why Windows failed in so many areas that were not desktop or laptop?

In this industry, one absolutely never knows what's going to happen. Palm was considered invincible, Transmeta was supposed to succeed, Linux was to be the next big thing, the iPhone and then iPad were widely derided as lacking and a fad when they were first introduced, and Android was certain to quickly challenge iOS in tablets. So perhaps Windows Phone 8 will somehow become a success, but then why baffle the public with Windows 8 for the desktop, Windows RT, which isn't quite Windows, for ARM tablets, two versions of "Surface" tablets, and then Windows Phone 8 devices that share the Windows kernel but are somehow separate anyway?

Go figure.

Posted by conradb212 at 08:19 PM | Comments (0)

May 30, 2012

Android finally getting traction in vertical and industrial markets?

Just when Windows 8 is looming ever larger as perhaps a credible competitor to iOS and the iPad, we're finally starting to see some Android action in vertical market tablets and handhelds. It's timid, exploratory action still, but nonetheless a sign that the industry may finally break out of the stunned disbelief that set in as Apple sold first millions and then tens of millions of iPads.

What has changed? Perhaps it's the fact that it's becoming increasingly hard to argue against Android as a serious platform now that Google's OS dominates the smartphone market. Though it seems more fragmented than ever, Android is now on hundreds of millions of smartphones, and all of them are little mobile computers much more than phones. The fragmentation is certainly an issue, as is the large variety of mobile hardware Android runs on, but it's also a trend and a sign of the times. Cisco recently published the results of a study which showed that 95% of the surveyed organizations allowed employee-owned devices, and more than a third provided full support for them. It's called the "Bring Your Own Device" syndrome, and for Cisco it was enough to ditch its own Cius tablet hardware. What it all means is that people will want to use what they own, know and like, and in tablets and handhelds that's iOS and Android.

There's also been movement on the legal front. Oracle had been suing Google for patent infringement over some aspects of Android, and since Oracle is a tenacious, formidable opponent in whatever they tackle, this cast a large shadow over Android. Well, Google won, for now at least, when a jury decided Google had not infringed on Oracle's patents.

So what are we seeing on the Android front?

Well, there's DRS Tactical Systems that just announced two new rugged tablets with 7-inch capacitive touch displays. They look almost identical, but they are, in fact, two very different devices. One runs Android, one Windows, and DRS made sure the hardware was fully optimized for each OS, with different processors, different storage and different controls. That's costly, and it shows that DRS sees Android as having just as much of a chance to be the platform of choice in mobile enterprise applications as does Windows.

There's Juniper Systems, which revealed that its unique 5.7-inch Mesa Rugged Notepad will soon be available in an Android version called the RAMPAGE 6, courtesy of a partnership with Pennsylvania-based SDG Systems. The Juniper Mesa is powered by the ubiquitous Marvell PXA320 processor. If the Android version uses this same chip, we'd finally have an answer to the question of whether the PXA processors that have been driving Pocket PCs and numerous industrial handhelds for a decade can run Android (we asked Marvell several times, to no avail).

The folks at ADLINK in Taiwan have been offering their TIOT handheld computer in two versions since late 2011; the TIOT 2000 runs Android, the identical-looking TIOT 9000 Windows CE. Here, though, the Android model runs on a Qualcomm processor whereas the Windows CE model has a Marvell PXA310.

General Dynamics Itronix has been playing with Android for a couple of years now, demonstrating their Android-based GD300 wearable computer to military and other customers. Panasonic introduced their Toughpad to great fanfare at Dallas Cowboys Stadium in November of 2011, but though the rather impressive tablet seemed ready back then, it actually won't start shipping until summer of 2012. Motorola Solutions also announced an Android tablet late in 2011, but I am not sure if the ET1 Enterprise Tablet is in customer hands yet.

Mobile computing industry veterans may recall that there was a similarly confusing era several technology lifetimes ago: back in the early 1990s the upstart PenPoint OS platform came on so strong that several major hardware companies, including IBM, shipped their tablets with PenPoint instead of Microsoft's unconvincing pen computing overlay for Windows. Microsoft, of course, eventually won that battle, but Microsoft's "win" also demoted tablets back into near irrelevance for another decade and a half. Will it be different this time around? No one knows. Microsoft dominates the desktop, as was the case back then. But unlike PenPoint which despite its hype was known only to a few, hundreds of millions are already familiar with Android.

The next six months will be interesting.

Posted by conradb212 at 10:10 PM | Comments (0)

May 02, 2012

The widening gulf between consumer and vertical market handhelds

Almost everyone has a smartphone these days. Smartphones are selling by the tens of millions every quarter. In Q1 of 2012, Apple and Samsung sold over 30 million smartphones each. Smartphones have become part of modern life. Everyone is tapping, pinching and zooming. Everyone except those who need a rugged smartphone. Because there isn't one.

Now to be fair, there are rugged smartphones and any number of ruggedized handhelds that add phone functionality to a handheld computer that can also scan and do all the things people who work in the field need to do on the job. Except, they really aren't smartphones. Not in the way consumers have come to expect smartphones to be. Why is that?

Because ever since 2007 when Apple introduced the iPhone, there's been a widening gulf between consumer phones and the devices people use at work. Before the iPhone, cellphones had a bit of rudimentary web functionality and a number of basic apps. Nothing was standardized and everyone rolled their own. Professional handhelds almost all ran Windows Mobile, which had had very good phone functionality as early as 2002. But Windows Mobile never really took off in the consumer market.

Why did the iPhone change everything? Because it introduced a fluid, elegant way of using and interacting with the phone that resonated with people and made total sense. Almost no one wants to first pull out a plastic stylus to then operate a clumsy mini version of a desktop OS. But lightly tapping at a screen, dragging things around, and effortlessly zooming in on what's too small on a tiny phone display, that's an entirely different story. One that Google quickly copied with Android, and one that Microsoft did not, or not until it was too late.

As a result, smartphones took off on a massive scale, one much grander than anyone had anticipated. And it was the sheer, simple elegance and functionality of just having to lightly tap, swipe, pinch and zoom that did it. Which, in turn, came from Apple's primary stroke of genius, that of using capacitive multi-touch.

The rest is history. Since 2007, Apple has sold hundreds of millions of iPhones. And there are hundreds of millions of Android smartphones, with the vendors of Android-based smartphones combined holding a larger market share than Apple.

With all of this happening and perhaps half a billion handhelds being sold in just five short years, how did the vertical market respond? How did it benefit from the riches, the opportunities, the breakthrough in acceptance of handheld technology that the vertical market had been waiting for?

It didn't.

Ruggedized handhelds still run Windows Mobile in a form virtually unchanged from the days before Android and the iPhone. There is no multi-touch. There is no effortless tapping and panning and pinching and zooming. There is no app store (there was one, but Microsoft closed it).

And worse, there is no upgrade path. Windows Mobile, which Microsoft merged into its embedded systems group a while ago, seems frozen in time. But isn't there Windows Phone 7, that's now Phone 7.5 and is currently heavily promoted with the launch of the Nokia Lumia 900 smartphone? There is, but Windows Phone is totally different from Windows Mobile. There is no upgrade path. And even if there were, it's a market where there are already half a billion iPhones and Android smartphones, and people who know how to use them and who expect nothing less. Not in their personal lives, and not on the job.

That is a definite problem for those in the market of making and selling ruggedized handhelds. And the problem is not demand. With the world now pretty much convinced that handheld computing and communication devices are tremendously useful and will only become more so, no one needs to be sold on the merits of handheld technology on the job. Everyone knows that already.

The problem is that the business market now wants smartphones that are a little (or even a lot) tougher than a consumer phone, and perhaps can do a few things consumer phones don't do so well, like scanning. But the market wants that extra toughness and those extra capabilities without giving up the elegant, effortless user interface, the bright high-res displays, and the ability to take pictures and HD movies so good that consumer smartphones are now replacing dedicated digital cameras.

And that's why it is becoming increasingly difficult to sell handhelds that offer technology and functionality that is by now very dated by consumer smartphone standards. Sure, the technology and functionality of most ruggedized handhelds are as good as or better than they were six years ago, but the world has changed. Sure, the vaunted Microsoft leverage argument ("You use Microsoft in your business, so Windows Mobile fits right in and you can leverage your existing investment") still applies. But that is no longer enough. Businesses that need to equip their workers with rugged handhelds now want more.

But isn't the mere popularization of handheld technology enough for rugged technology vendors to make a good living? Perhaps. It all depends on the type of business and its inherent profitability. But is basically standing still a good business strategy in a technology boom measured in hundreds of millions of consumer handhelds? And are the largely flat financials of rugged handheld makers not a warning sign?

There are many possible scenarios. For example, perhaps we're seeing a total separation of consumer and vertical markets, one where consumer handhelds get ever more powerful while much more rugged vertical market computers pursue a small niche where they simply won't ever be challenged by consumer technology. And perhaps Microsoft will manage to somehow leverage a successful unified Windows 8 Metro-style user interface into handhelds that can become the true successor of Windows Mobile, with whatever benefits customers see in remaining within the Microsoft fold. And perhaps there really is an insurmountable challenge in making capacitive multi-touch suitable for rugged applications (this is often voiced as a reason, though I can't quite see it).

But there are also darker scenarios that bode less well for the verticals. If consumer phones aren't tough enough or don't have certain peripherals, third parties may simply make rugged cases and enclosures to make them tough, and sleeves and caddies to add whatever functionality business customers want. Without losing the performance and capabilities of a consumer smartphone. In that case, what could and should have been a golden opportunity for vertical and industrial handheld makers might simply vanish as consumer technology eats their lunch.

As is, it's become somewhat painful to see vertical market companies struggle, companies that know so well how to make products that hold up under trying circumstances, products that don't leak, products with displays that can be read in bright sunlight, products that will last years rather than months, and products that are tailor-made so well for very specific needs. Those companies have a lot of valuable expertise and so much going for them.

But will all that be enough to mask and make up for an increasingly wider gulf between vertical market and consumer market technology? Only time can tell, and it may be running out.

Posted by conradb212 at 04:59 PM | Comments (0)

April 24, 2012

e-con Systems executive explains the reality of cameras in rugged computers

A little while ago I had an email conversation with the folks at e-con Systems. They are an embedded product development partner with significant expertise in camera solutions in the Windows CE and Windows Embedded space. The company offers a variety of lens and camera modules that can be interfaced with most of the common handheld processors from TI, Marvell, Freescale and others. My interest was, as I discussed in earlier RuggedPCReview.com blog entries, why at a time when every new smartphone includes a superb camera capable of 720p or full 1080p HD video, the cameras built into rugged devices lag so far behind.

Here is what Mr. Hari Shankkar, co-founder and VP of Business Development of e-con Systems had to say:

"We have worked with several rugged handheld manufacturers and they use our device driver development services or our camera modules. Based on this experience and our interactions with them, here are our comments:

  • There is a big difference in the way rugged computers are constructed and the way devices such as digital cameras or smartphones are built.
  • The bulk of the NRE effort goes to making the device rugged and only a very small percentage is left when it comes to the camera. In the case of a digital camera or a cell phone this is not the case as the cameras are given higher importance.
  • These devices are sold through tenders and it is mostly B2B (business-to-business) and not B2C (business-to-consumer) like cell phone cameras and digital cameras. The requested quantities are low, like a few hundred per month or per quarter. We have personally not seen these tender documents, but from what we have been told, the emphasis is placed more on ruggedness than on the camera side. The camera is needed, but customers are more concerned about the resolution of the pictures and whether they can capture 1D/2D barcodes with it.
  • Some of the cameras with ISPs (image signal processors, for backend digital processing) don't work at very low temperatures; only raw sensors work at such low temperatures. This means you have to have an external ISP on the board. But some of the manufacturers prefer to have the ISP implemented in software and not have any hardware ISP. Digital cameras and cell phone cameras have the ISP integrated externally for high resolutions. This is one of the reasons you don't see a rugged computer with an 8MP or a 14MP camera very often. Currently, the 8MP and 14MP parts are raw sensors and no one has an ISP built in.
  • The quality of the image captured from a sensor varies with the choice of lens. A glass lens will give better quality than a plastic lens. However, we see most vendors going with camera modules that have plastic lenses, which of course affects the quality of the images you are capturing.
  • As long as end customer demand for cameras is not that great, this will remain the case. We do not see integration of global shutter cameras (required for capturing crisp stills of fast-moving subjects) or of glass lenses happening in the immediate future."

So what Mr. Shankkar is saying is that a) rugged manufacturers concentrate on the basic design to the extent that the camera is usually an afterthought (and our internal examination of most rugged designs confirms that), b) there are some image signal processing issues that complicate matters for rugged applications, and c) in the absence of higher customer demand, the quality of imaging subsystems in rugged designs is going to remain as is.
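To make point b) a bit more concrete: when a device uses a raw sensor without a hardware ISP, backend stages such as demosaicing, white balance and noise reduction have to run in software on the host. Below is a minimal sketch of just the demosaicing stage, assuming an RGGB Bayer pattern; the function name and the fake 10-bit frame are made up for illustration, and real ISP pipelines do far more (black-level correction, denoising, tone mapping and so on).

    import numpy as np

    def demosaic_rggb(raw):
        # Naive demosaic of an RGGB Bayer frame (even dimensions):
        # each 2x2 Bayer cell collapses into one RGB pixel (half resolution).
        rgb = np.zeros((raw.shape[0] // 2, raw.shape[1] // 2, 3), dtype=raw.dtype)
        rgb[..., 0] = raw[0::2, 0::2]                           # red sites
        rgb[..., 1] = (raw[0::2, 1::2] + raw[1::2, 0::2]) // 2  # average the two green sites
        rgb[..., 2] = raw[1::2, 1::2]                           # blue sites
        return rgb

    frame = np.random.randint(0, 1024, (480, 640), dtype=np.uint16)  # fake 10-bit raw frame
    print(demosaic_rggb(frame).shape)  # (240, 320, 3)

Even this trivial stage touches every pixel of every frame, which hints at why high-resolution sensors need either dedicated ISP silicon or a host processor with cycles to spare.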

Those are certainly logical reasons, and as a provider of imaging solutions for handhelds and other devices, Mr. Shankkar is familiar with the thought process and priorities of rugged equipment vendors. And e-con Systems certainly has a roster of very competent camera modules (see e-con Systems camera modules).

Nonetheless, I cannot help but see a widening disconnect between rugged computer manufacturers and the digital imaging industries here. Integrating the imaging quality and functionality of, say, a US$200 GoPro Hero 1080p hybrid video camera into a high-end rugged data capture device simply ought to be doable. And if I can take superb high definition pictures and 1080p HD video with a 5-ounce iPhone 4s, the same ought to be doable in a rugged handheld or tablet. Yes, it would add cost, but these are not inexpensive devices, and the precision data capture requirements of many vertical market applications deserve no less than what any smartphone camera can do.

Posted by conradb212 at 10:50 PM | Comments (0)

April 18, 2012

The nature and potential of Windows 8 for ARM devices

Well, Microsoft announced in its Windows Blog (see here) that there will be three versions of the upcoming Windows 8. For PCs and tablets based on x86 processors, there will be plain Windows 8 and the more business-oriented Windows 8 Pro that adds features for encryption, virtualization, PC management and domain connectivity. Windows Media Center will be available as a "media pack" add-on to Windows 8 Pro. A third version, Windows RT, will be available pre-installed on ARM-based PCs and tablets. Windows RT will include touch-optimized desktop versions of Word, Excel, PowerPoint, and OneNote.

That, mercifully, cuts down the available number of Windows 8 versions from five in Windows 7 (Starter, Home Basic, Home Premium, Professional, and Ultimate) to just three, if you don't count additional embedded and compact varieties.

While Microsoft's April 16 announcement on the versions was interesting, what's even more interesting is a long entry in Microsoft's MSDN blog back on February 9. Called "Building Windows for the ARM processor architecture" (see here), it provided a fairly technical, almost 9,000-word discussion of the ARM version of Windows 8. That one shed some light on how Microsoft intends to implement and position the next version of Windows, and make sure Windows won't be irrelevant in what many now term the "post-PC" era.

As you may recall, Microsoft's initial Windows 8 announcements were a bit odd. Microsoft called Windows 8 "touch first" and made it sound as if Windows 8 were a totally multi-touch centric OS. While that certainly sounded good in a world awash in iPads, it seemed exceedingly unlikely that all those hundreds of millions of office workers would suddenly switch to touch devices. One could really only come to one conclusion: Windows 8 would most likely work pretty much like Windows 7 and Windows XP before it, but hopefully also somehow incorporate touch into the vast Microsoft software empire.

The MSDN blog goes a long way in explaining much of what we can expect. It's difficult to condense the very long post into some of the important basics, but it goes something like this:

Windows on ARM, which was originally called WOA and was then renamed Windows RT in the April announcement, should feel as much like standard Windows 8 as possible. To that extent, while the ARM version cannot run legacy Windows software, there will be a Windows desktop with the familiar look and feel, and also a lot of the familiar Windows desktop functionality.

Microsoft also emphasized that Windows RT will have a "very high degree of commonality and very significant shared code with Windows 8." So why can't it run legacy Windows software? Because, Microsoft says, "if we enabled the broad porting of existing code we would fail to deliver on our commitment to longer battery life, predictable performance, and especially a reliable experience over time."

That, however, doesn't mean there won't be Microsoft Office on the ARM version of Windows. In fact, every Windows ARM device will come with desktop versions of the new "Office 15," including Word, Excel, PowerPoint and OneNote. Will the ARM version of Office be different? Microsoft says that they "have been significantly architected for both touch and minimized power/resource consumption, while also being fully-featured for consumers and providing complete document compatibility." What that means remains to be seen. After all, the Windows CE/Mobile "Pocket" versions of the Office apps were also called Word, Excel, PowerPoint and OneNote, yet offered just a small fraction of the desktop versions' functionality.

From a cost point of view, x86 Microsoft Office runs from US$119 (Home and Student) to US$349 (Office Professional). Considering that Windows RT devices will likely have to be very price-competitive with iPads and Android tablets, including Office will put an additional cost burden on Windows ARM devices.

Now let's take a broader look at Windows RT and how it'll differ from standard x86 Windows 8. First of all, you won't be able to just buy the Windows RT OS. It only comes already installed on hardware. That's really no different from Android, and the reason is that the operating system on ARM-based devices is much more intertwined with, and optimized for, the particular hardware than x86 Windows, which pretty much runs on any x86 device.

Microsoft also stated that it has been working with just three ARM hardware platform vendors, those being NVIDIA, Qualcomm and Texas Instruments. There are, of course, many more companies that make ARM-based chips, and it remains to be seen whether other ARM vendors will remain excluded or if they, too, will have access to Windows RT. As is, while Windows has always been predominantly x86, Microsoft occasionally supported other processor platforms as well. Early Windows CE, for example, supported multiple processor architectures: back in 1997, Windows CE ran on Hitachi's SuperH architecture, two MIPS variants, x86, the PowerPC, and also ARM.

Another difference between the x86 and the ARM version of Windows 8 is that "WOA PCs will be serviced only through Windows or Microsoft Update, and consumer apps will only come from the Windows Store." So while x86 versions of Windows 8 application software will likely be available both through a Windows Store or directly from developers, Windows 8 ARM devices will follow the Apple app store model. That, of course, has significant control and security implications.

A further difference between Windows 8 x86 and ARM devices will be that while conventional x86 hardware likely continues to have the traditional standby and hibernation modes, ARM-based Windows devices will work more like smartphones and tablets that are essentially always on.

Now for the big question: How does Microsoft intend to bring Windows to such wildly different devices as a desktop PC and a tablet without falling into the same traps it fell into with earlier tablet efforts that were never more than compromises? In Microsoft's vision, by adding WinRT, a new Windows API for Metro-style apps. From what I can tell, if a Metro application (i.e. one that only exists in the tile-based Metro interface) completely adheres to the WinRT API, then it can run both on ARM devices and on x86 devices under their Metro interface.

What does that mean for existing software that developers also want to make available on ARM devices? There are two options. First, developers could build a new Metro-style front end that communicates with external data sources through a web services API. Second, they could reuse whatever runtime code they can within a Metro environment. Either way, the old Windows leverage argument ("staff and developers already know Windows, so we should stay with Windows") won't be as strong, since the WinRT API and Metro interface are new. How that will affect business customers who simply wish to stay with Windows instead of using iPads or Android tablets is anyone's guess.

I must admit that having gone through Windows for Pen Computing (1992), the Windows Pen Services (1996), and then the Windows XP Tablet PC Edition (2001), I am a bit skeptical of Microsoft's approach to Windows RT. It still feels a lot like hedging bets, cobbling yet another veneer on top of standard Windows, and claiming integration where none exists.

In fairness, the iPad has the same issues with Mac OS. The iPad is fundamentally different from a desktop iMac or even MacBook, and I am witnessing Apple's attempts at bringing the Mac OS closer to iOS with a degree of trepidation. But the situation is different, too. Microsoft's home base is the desktop and it now wants (and needs) to find ways to extend its leadership into tablets and devices, whereas Apple found a new and wildly successful paradigm that flies on its own and only loosely interfaces with the desktop (where most iPad users have Windows machines).

Bottom line? For now, while Windows 8 will undoubtedly do very well for Microsoft on the desktop and on laptops, it remains far from a certain slam dunk on the tablet and devices side. As I am writing this, Microsoft, AT&T and Nokia are on an all-out campaign to boost Windows Phone with the Nokia Lumia 900, but considering the massive head start the iPhone and Android have, nice though it is, Windows Phone remains a long shot. Windows RT will likely encounter a similar situation.

One possible outcome may be that Windows RT will lead to a resurgence of the netbook syndrome. Netbooks sold on price alone, though they were never very good. Low-cost Metro devices might pick up where earlier-gen netbooks left off, with multi-touch and lots of post-PC features, but still nominally being Microsoft machines and having Office.

Posted by conradb212 at 05:06 PM | Comments (0)

April 16, 2012

Will GPS drown in commercialism?

There are few technologies that have changed our lives and work as fundamentally as GPS. Not so very long ago, if you needed to know where to go, you used a paper map. Today we simply punch in where we want to go, then listen to directions and monitor our position on the GPS display. And industry, of course, has taken wondrous advantage of GPS, using it to optimize and manage transportation and location-based services to a degree never thought possible. GPS, by any account, is totally crucial to our modern world and society.

That's why a couple of recent observations worry me.

The first was when I left for San Francisco International Airport for a recent trip to Europe and my Garmin GPS did not find San Francisco Airport. Flat out did not find it. Not even in the transportation category. What it did find, though, was a hotel close to the airport. And so, since I was already underway and needed to concentrate on traffic, that's what I had to choose as my destination. Which promptly meant that I missed an exit. I have to believe that a Garmin GPS ought to find San Francisco International Airport, but mine didn't. All it coughed up was a hotel nearby.

After I returned from Europe, I needed to take my son to a local high school for a college orientation. I looked up the location of the college on Google Maps on my iMac and committed it to memory. In the car, I used the Maps app on my iPad, which is by Google, and the iPad drew the route from my home to the school. Except that it wasn't to the school. It was to a "sponsored location" nearby. Yes, the official Maps app on the iPad guided me to a "sponsored location" and not to where I wanted to go. Without telling me. It did place a small pin where I actually wanted to go, but the route it drew was to the sponsor location.

That is a very dangerous trend. Project it into the future, and you might see a situation where GPS is as utterly unreliable and frustrating as email is today. Just as we drown in commercial spam, what if GPS apps likewise drown us in "sponsored locations," making users sift through commercial GPS spam in order to find what they really need? That would make GPS not only useless, but potentially dangerous.

That, Google, would be evil indeed, and it's already evil that I am guided to a "sponsored location" instead of the clearly defined location I wanted to go to.

How does that relate to rugged computing? It's pretty obvious. What if commercial hooks begin hijacking routes? What if even official addresses are drowned in sponsored spam locations? Think about it.

And below you can see the routing to the sponsor location instead of the requested location marked by a pin (click on the image for a larger version).


Posted by conradb212 at 03:56 PM | Comments (0)

March 08, 2012

The new iPad -- both challenge and opportunity for rugged market manufacturers

If you want to sell tablets it's tough not to be Apple. And on March 7, 2012, it got that much tougher. For that's when Apple introduced the next version of the iPad, setting the bar even higher for anyone else.

Why do I even mention that here at RuggedPCReview.com where we concentrate on computing equipment that's tough and rugged and can get the job done where a consumer product like the iPad can't? Because, like it or not, the iPad, like the iPhone, sets consumer expectations on how computing ought to be done. It does that both by the elegance and brilliance of its execution, and by the sheer numbers of iPads and iPhones out there (Apple has sold 315 million iOS devices through 2011). That pretty much means anything that doesn't at least come close to offering the ease-of-use and functionality of the Apple devices will be considered lacking, making for a more difficult sell.

Unfortunately for anyone else out there trying to sell tablets, it's been tough. Somehow, while the iPad is simply a tablet, a way of presenting, consuming and manipulating information, it's been remarkably difficult for anyone else to convince customers to select them, and not Apple. Remarkable because Apple, despite its mystique, never managed to make even a dent in Microsoft's PC hegemony, and remarkable because of the number of vocal Apple opponents who shred whatever Apple creates seemingly on principle.

But let's take a quick look at Apple's latest version of the iPad, called not, as expected, iPad 3, but once again simply iPad.

No one ever complained about the resolution of the iPad display (1024 x 768), and everyone else stayed around that resolution as well, with lower end products perhaps offering 800 x 480, many using the old 1024 x 600 "netbook" resolution, and higher end products going as far as 1280 x 800 or the wider 1366 x 768. Well, with the new iPad Apple quadrupled resolution to 2048 x 1536, making for a superior viewing experience. Such high resolution is not necessarily needed, but if it's available for as comparatively little as Apple charges for iPads, everything else now looks lacking. And I can definitely see how the super-high resolution could come in very handy for many vertical market applications.

The new iPad also has two cameras. The new iPads we ordered will not arrive for another week and so I don't know yet just how good they are, but if the iPhone 4s is any indication, they will be very significantly better than what anyone else in the rugged arena has to offer at this point. I've long wondered why expensive, high quality rugged handhelds, tablets and notebooks come with marginally acceptable cameras, and the new iPads will only widen the chasm. The iPad cameras aren't only capable of offering fully functional video conferencing on their large screens, they can also snap rather high quality stills, and they can record 1080p full motion HD video, with image stabilization. And the iPad has the software to go with it. Few could claim this wouldn't come in handy for professionals in the field.

Advances on the technology side include a faster dual-core Apple-branded ARM processor with quad-core graphics and 4G LTE wireless broadband. Unlike some rugged hardware we've seen over the years, iPads were never underpowered, and with the new chip they'll be snappier yet. And while 4G wireless isn't ubiquitous yet by any means, having it built in certainly doesn't hurt. And then there's battery life, where the iPad, even the new improved one, wrings about ten hours out of just 25 watt-hours, an average draw of a mere 2.5 watts. And the whole thing still only weighs 1.4 pounds.

Now, of course, the iPad isn't rugged. It's durable and well built, and if you use it in one of its many available cases, it won't get scratched or dented, but it's not rugged. Its projected capacitive multi-touch screen famously cannot be used with gloves, you can't use a pen for when pin-point accuracy is required, and it's not waterproof.

None of which stopped the iPad from scoring some remarkable design wins in areas and industries that once did not look beyond rugged equipment. The FAA granted American Airlines permission to use iPads to replace inflight manuals and such, and American is deploying 11,000 iPads. Others will follow.

What does that all mean for companies that make rugged tablets? That the market is there. In fact, I believe the surface has barely been scratched. But it has to be the right product. Apple showed the way with the iPad but, with all due respect to those who've tried so far, few followed with more than a timid effort. It's been mostly wait-and-see, and now Apple has set the bar higher still. That doesn't mean it's over for anyone else, but it has gotten tougher. The new iPad will boost acceptance of the tablet form factor and functionality to yet higher levels, and that still means opportunity for everyone else.

I am convinced that there's a large and growing demand for a more rugged tablet, and that whoever comes out with a product that doesn't just approximate but match and exceed expectations will win big.


Posted by conradb212 at 04:34 PM | Comments (0)

January 26, 2012

A conversation on imaging in rugged handhelds

Recently I received an email from someone in the industry that concluded with the question: "Wouldn't a conversation on imaging in rugged handhelds be interesting to your readers?"

The answer, of course, is "definitely," and so I responded as follows:

"I recently wrote two articles on the general state of imaging in handheld/mobile systems, so you basically know where I stand. In essence, given the very rapid advance in HD still/video imaging thanks to a convergence of CMOS, tiny storage formats, and H.264 compression technology (Ambarella!), it's now possible to generate excellent high resolution stills as well as near perfect 1080p/30 and better video in very small packages, packages that are small enough to fit into handheld and mobile computers.

"Yet, while we see tiny $200 GoPros and such, and advanced still/video capability in virtually every smartphone, the imaging technology we find in almost all rugged computers, even high-end ones, is lacking. Though we review and examine numerous mobile computers every year, we have yet to find a single one that has hybrid imaging capabilities that come close to what is possible today, and most are, in fact, barely usable. It is inexplicable to me how a $4,000 ruggedized notebook computer or tablet does NOT include competent imaging subsystems. There is room, there is a need, and the costs are not prohibitive.

"What enables me to make those statements? First, I have been reviewing rugged mobile computing technology for almost 20 years. For the past ten or 15 years, imaging in mobile computers has barely advanced. Second, I co-founded Digital Camera Magazine in 1997 (as the first magazine anywhere to concentrate solely on digital cameras). I continue to follow digital imaging closely and we also do digital imaging reviews as time allows. Third, as an enthusiastic scuba diver (see my scubadiverinfo.com), I have done many underwater imaging product reviews, including a couple on the GoPros (see here). Fourth, in working with several embedded systems vendors, I know what's possible in terms of integration. What I do see is an almost total lack of communication between computer and imaging people.

"I was not familiar with your company, but I see that you are in part concentrating on camera modules. Which means that you are probably painfully aware of the situation. What must happen is much better integration of much better imaging capabilities into mobile computers. At a time where I can produce near Avatar-quality underwater 1080p 3D video with two GoPros, and where world events are routinely reported on smartphones, mobile computers are woefully out of touch with imaging. A professional who pays $4,000 for a rugged computer (or even just $1,200 for a rugged handheld) should expect no less in terms of imaging quality and ease-of-use than you can get in a cheap digital camera (i.e. sharp pictures, a decent interface, HD video, and speed). Instead, what we currently have in most mobile computers is simply not nearly good enough. You could never rely on it even for quick, reliable snapshots in the field, let alone quality imaging.

"Think about it: businesses spend a lot of money to equip their personnel with expensive mobile computing equipment. Much of that equipment is used for data capture, sight survey, recording, reporting, etc. It makes zero sense to me to have vast computing power, a great outdoor viewable display, great communication and data capture technology, .... and weak rudimentary imaging that is in no way suitable or sufficient.

Posted by conradb212 at 09:02 PM | Comments (0)

November 21, 2011

Ruggedized Android devices -- status and outlook

As far as operating system platforms go, the rugged mobile computing industry is in a bit of a holding pattern these days. Thanks to the massive success of the iPhone and iPad there is a big opportunity for more durable handhelds and tablets that can handle a drop and a bit of rain, yet are as handy and easy to use as an iPhone or an iPad-style media tablet. On the tablet side, a lot of enterprises like the iPad form factor and ease of use, but they need something a bit tougher and sturdier than an iPad or a similar consumer product. On the smartphone side, hundreds of millions use them now and expect the same elegance and functionality in the handhelds they use on the job. But again, those professional handhelds need to hold up to abuse and accidents better than your standard consumer smartphone.

So with dozens and perhaps hundreds of millions of Android smartphones sold, and tens of millions of iPads, why are the likes of Lowe's home improvement center equipping their employees with tens of thousands of iPhones instead of presumably more suitable ruggedized handhelds (see Bloomberg article)? And why do we see iPads being sold into enterprise deployments that used to be the exclusive province of rugged tablets? There isn't one easy answer.

On the tablet side, it almost looks like the enterprise wants iPads and nothing else. Which is a problem for anyone who isn't Apple, as iOS is proprietary and Android-based tablets simply haven't caught on yet. That may be due to the perception that Android is really a phone operating system, or to potential customers being befuddled over the various versions of the Android OS.

On the handheld side, where Android has successfully established itself as the primary alternative to the iPhone, it would seem easy to offer Android-based ruggedized smartphones and handhelds. But there, too, the majority of recent product introductions still use the by now ancient Windows Mobile, an OS that looked and felt old nearly a decade ago.

So what gives? A few things.

With tablets, the almost shocking lack of success of Android and other alternate-OS tablets has had a cold shower effect. If neither Motorola Mobility (Xoom) nor RIM (PlayBook) nor Hewlett Packard (TouchPad, Slate 500) can do it, who can? And then there's Microsoft's promise to finally get it right on tablets with the upcoming Windows 8. That's far from certain, but in a generally conservative industry where almost everything is Microsoft, the usual Microsoft leverage/investment/integration arguments carry weight.

With handhelds and smartphones, it's harder to understand, because non-Microsoft platforms have traditionally been far more successful there, and in the era of apps, software leverage hardly matters anymore. Perhaps it's Microsoft heavy-handedly forcing Android vendors into paying royalties to Microsoft, and not Google. Perhaps it's some sort of fear of straying too far into uncharted waters. It's hard to say. Almost everyone I talk to in the industry admits, off the record, to keeping a very close eye on Android developments.

So, that all said, where do we stand with respect to Android-based products in the vertical/industrial markets where durability, ruggedness, return-on-investment and total-cost-of-ownership matter?

In tablets, there have been two recent introductions. One is the Motorola Solutions ET1, a small 7-inch display ruggedized enterprise tablet. It's based on a TI OMAP4 processor and runs Android 2.3.4, i.e. one of the "non-tablet" versions. The ET1 was said to be available in Q4 of 2011. RuggedPCReview reported on the device here. The other notable introduction is the Panasonic Toughpad, introduced in November of 2011, but not available until the spring of 2012. The Panasonic Toughpad is a Marvell-powered device with a 10.1-inch screen and runs Android 3.2. Both devices seem to be what a lot of enterprise customers have been waiting for: more durable versions of consumer media tablets, fortified for enterprise use with beefed-up security, service and durability without sacrificing slenderness, low weight and ease-of-use.

On the handheld side, we've also come across some potentially interesting products. The first is the ADLINK TIOT2000 (see our report), a conventional resistive touch handheld with a QVGA display. What's interesting here is that ADLINK offers a visually identical version, the TIOT9000 (see here) that runs Windows CE, with the Android version using a Qualcomm 7227T processor and the Windows CE version a Marvell PXA310. Winmate just introduced its E430T, an industrial PDA with a large 4.3-inch display that uses capacitive touch. This machine uses a Texas Instruments DM3730 processor and is said to be able to run Android 2.3 or Windows Mobile 6.5. I've also seen Android listed as an alternate OS on some of Advantech's embedded modules, including the TI OMAP 3530-based PCM-C3500 Series (see here).

On the surface, it would seem to be almost a no-brainer to cash in on the great public interest in tablets/smartphones and the opportunity a new-era OS such as Android provides. But nothing is ever as easy as it seems.

For example, there's a big difference between traditional rugged tablets that usually either have very precise digitizer pens or a resistive touch screen (or often both), and iPad class devices that use capacitive touch that lets you do all that tapping and panning and pinching, but generally doesn't work in the rain or under adverse conditions. The same issue exists on the handheld side where the traditional Windows Mobile is clearly designed for use with a passive stylus and cannot easily take advantage of capacitive multi-touch. That has, however, not stopped Casio from introducing the IT-300 that has a capacitive multi-touch display, yet runs Windows Embedded Handheld 6.5 (see our report).

So it's all a bit of a mystery. The transition to new operating platforms is never easy and often traumatic, and there are good arguments for being cautious. For example, in addition to leverage, one of the big arguments for Windows CE/Windows Mobile has always been the wealth of existing software. True, but in a world of tens of thousands of often very slick and sophisticated iOS and Android apps, it's hard to believe developers wouldn't quickly come up with the appropriate versions and apps.

With tablets, the situation must be quite frustrating for manufacturers of rugged mobile devices. They undoubtedly see a great opportunity to cash in on the tablet boom, but they are to a degree caught between needing to support the existing Windows XP/Windows 7 infrastructure and deciding what to move to next. Microsoft is cleverly dangling a (for them) no-lose carrot in the form of Windows 8's Metro interface, where ARM-based devices would only run Metro and have no access to "classic" Windows, whereas for x86-compatible devices, Metro would just be the front end. So there are three potential success strategies: Android, Metro-based ARM devices, and x86 tablets that run Metro and classic Windows. No one can support all three.

So for now, as far as rugged tablets and handhelds go, it's the best of times and it's the worst of times.

Posted by conradb212 at 04:42 PM | Comments (0)

November 02, 2011

Windows 8: a bit of fear, uncertainty and doubt

In mid-September 2011, Microsoft showcased a preview of the next release of Windows at the BUILD developer conference. After reading up on it, I wrote the below in the days following the preview, but held off putting it in the RuggedPCReview blog until I had a bit more time to let it sink in and contemplate the likely impact on rugged mobile computing manufacturers and users. My thinking hasn't changed, so below are pretty much my first impressions.

Essentially, Microsoft is offering a touch-optimized front end on the next version of Windows. For ARM devices the new front end is mandatory; for x86 devices it is not. That's probably so Microsoft doesn't expose itself to charges that, even on ARM devices, classic Windows just doesn't work very well.

What's a bit puzzling is that Microsoft called Windows 8 "touch-first." I have to assume that refers to the Metro interface only, because having all of Windows touch-first would make most existing hardware essentially obsolete, as touch is neither available nor feasible on most desktops and notebooks. If all of Windows 8 were touch-first, how would people take to a user interface designed for touch while sitting in front of a desktop?

So Microsoft is basically hedging its bets in the tablet space, just as it has before when rival platforms began getting too much attention. Witness...

In 1991, Microsoft grafted pen extensions on top of Windows 3.1 and called it Windows for Pen Computing. It was a miserable flop, but created enough FUD to stall and kill rival efforts (remember that even the original ThinkPad ran PenPoint, and every major computer company had a pen tablet).

In 1995, Microsoft grafted the Pen Extensions onto Windows 95, but essentially left it up to hardware manufacturers to make them work and support them.

In 2001, Microsoft grafted pen functionality onto Windows XP and called it the XP Tablet PC Edition, forcing most hardware manufacturers to create products for it.

In 2009, Microsoft added a bit of touch functionality and made it available in Windows 7, proclaiming the OS -- successfully marketed as a rock-solid new platform when to most users it really looked like Vista done right -- as touch-enabled.

In each case, Microsoft's effort created enough FUD to either derail efforts or at least drive OEMs to support them to some extent.

Now there'll be Windows 8 and once again Microsoft is attempting to ward off a challenge and remain relevant by integrating rival technology with just enough independent thinking to declare it its own.

So what is Microsoft doing? Think about it. Would Microsoft gamble its still commanding market position on suddenly converting everything to touch? When touch really only works on tablets? When almost all work is still done sitting at desks? When billions use keyboards and mice? When even Apple is not suggesting touch is the be-all and end-all, and isn't making all of OS X and all Macs work with touch only? When Microsoft just managed to convince the public that Windows 7 is new and solid? When unpleasant memories of Vista still linger? When almost everyone still remembers New Coke? When the idea of having tiles that summarize info from other apps was tried (in WinMo) years ago? When the last thing IT wants is everyone having Facebook and Twitter built right in?

Let's be realistic here. What Microsoft is doing is nothing more than trying its Windows Everywhere approach one more time. By promising a new Windows that is so marvelous that nothing else is needed, not on tablets, not on the desktop. That hasn't worked in the past, and it will not work now. What Microsoft so far has shown is an updated version of Windows 7 with a new optional interface. The only new thing is that the interface will be mandatory on ARM-based devices. So that Microsoft won't get criticized again if the touch layer doesn't work well on tablets or just isn't enough to run Windows. This way Microsoft can always refer those who need "real" Windows to an X86 tablet and relegate or even abandon ARM devices should that not work out. If it does work out, great. If not, no big deal.

Now let's look at tablets specifically. Microsoft's primary argument for Windows on tablets is the leverage, legacy and compatibility proposition that says that corporate IT runs on Microsoft, all the software and software tools are Microsoft, developers know Microsoft, and there are trillions of Microsoft apps. Therefore, Windows based tablets will fit right in. Even if they are a little hard to operate.

Using the leverage argument, if Metro is indeed a mandatory new interface on ARM-based tablets, then out goes the legacy application argument for tablets. It'll have to be all new apps. And that transition will be as hard as or harder than what Windows Mobile users encountered when it was the end of the road with WinMo 6.5 and there was only the vague promise of an eventual move to a Phone 7-style system that was not backward compatible.

So then why not just stay with x86 and the option to run Windows Classic, where all the software is and will be? That is going to be the big question. Also, it's been suggested that since developing for both ARM and x86 requires using the Metro UI, Metro will be the preferred environment. Will that mean Windows 8 users have to go back and forth between environments? Will we see "compatibility boxes" again?

There is, of course, always the chance that Microsoft will indeed be able to put forth a credible effort, just as it did with the Windows 7 follow-up to Vista. The Metro interface may just be so compelling that it can stem and turn the tide of what by its introduction may be several hundred million iPads and perhaps Android tablets. A tall order indeed.

So for now it's Microsoft generating a degree of fear, uncertainty and doubt among hardware manufacturers and corporate customers. It's a wise move that was to be expected. And in time-honored Microsoft fashion, it's also a riskless bet where Plan B (Windows Classic) is the safe perpetuation of the status quo.

What does it all mean to makers of mobile and rugged devices? It depends on how serious Microsoft is about the Metro UI and ARM hardware. At this point, mobile hardware either runs Windows Mobile or Embedded Handheld or whatever, or it runs Windows XP or Windows 7 on Core- or Atom-powered devices. It's hard to see much of a future for Atom-powered hardware if ARM-based tablets and handhelds can run Metro faster with fewer resources. In fact, the only reason would be legacy compatibility, and that is a rather major reason.

The next issue is touch. It's hard to imagine a next gen Windows not supporting a multi touch interface that uses projected capacitive technology. And that is precisely what the vertical market mobile computing industry currently says it doesn't want because capacitive touch can't handle rain, gloves, or other adverse conditions. And then there's the pen functionality for signature capture and such, or even handwriting recognition. How will pens work in a touch interface (remember, touch has never worked well in a pen interface)?

For a bit of testing, we installed Windows 8 on an older HP 2710p convertible Tablet PC. The install was easy and pretty much everything worked. Going from a cold start to Metro takes just under a minute. The HP tablet doesn't have touch, but the installer recognized the pen just fine, so all the swiping has to be done by pen. Clicking on the Start menu brought up Metro with its flat tiles. It can all be made to work somehow, but at this point I think the real question is whether Android can establish itself on tablets before Microsoft is ready with Windows 8.

Posted by conradb212 at 04:36 PM | Comments (0)

August 08, 2011

Do you have "Grandpa Boxes" in your lineup?

Unlike Garry Trudeau, whose "Doonesbury" strips can be personal and mean-spirited (remember his relentless, unfair mocking of the Apple Newton?), Scott Adams' "Dilbert" presents a lighthearted, humorous, yet keenly insightful commentary on the corporate and technical issues of the day.

In a recent strip (August 3, 2011), Dilbert's working on his computer when a young colleague approaches and asks, "Are you getting a lot done on the Grandpa Box?" "The what?" Dilbert asks. "The people in my generation do our work on our phones and tablets," is the response. "I also have a laptop," Dilbert objects. "I'll text the nineties and let them know," the young gun says (see the strip here).

This made me think. Is this really happening? Are we really seeing a shift from the computing tools we know to a new generation of devices that we didn't really think could do the serious jobs? While it seems almost unthinkable that a smartphone could replace a "real" computer, 30 years ago almost no one thought PCs could ever challenge mainframes or minicomputers, and yet PCs went on to revolutionize the world and do things no one ever thought they could.

It also made me think of my own changing pattern of using computers. I use my own smartphone and tablet more and more, and my laptop less and less. I described the syndrome in a series of lengthy blog posts entitled "iPad on the Road". On my own latest intercontinental business trip, I didn't take along a laptop at all, just my smartphone and tablet.

I also thought of a period in my life about three years ago when texting was my preferred means of communication, and how immersed in it I became. I got to a point where the shortcuts on the tiny keypad of my phone and its T9 predictive text entry became second nature, and I could bang out messages while hardly looking at the keypad at all. I remember thinking that hundreds of millions of people, perhaps billions, text every day. To them, T9 and similar text entry is second nature. And yet, makers of rugged tablet computers hardly ever include any of those text entry methods. I even suggested it to some, but there never was any follow-up.
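For readers who never used it: T9 maps each phone key to three or four letters and looks the typed key sequence up in a dictionary, which is why one sequence can stand for several words. Here is a minimal sketch of that lookup idea, with a tiny made-up word list standing in for the real dictionary (actual T9 also ranks candidates by how often they're used):

    T9_KEYS = {'2': 'abc', '3': 'def', '4': 'ghi', '5': 'jkl',
               '6': 'mno', '7': 'pqrs', '8': 'tuv', '9': 'wxyz'}
    LETTER_TO_DIGIT = {ch: d for d, letters in T9_KEYS.items() for ch in letters}

    def digits_for(word):
        # The key sequence that types the given word.
        return ''.join(LETTER_TO_DIGIT[ch] for ch in word.lower())

    def build_index(words):
        # Map each key sequence to all dictionary words it could mean.
        index = {}
        for w in words:
            index.setdefault(digits_for(w), []).append(w)
        return index

    index = build_index(['home', 'good', 'gone', 'hood', 'hone'])
    print(index['4663'])  # ['home', 'good', 'gone', 'hood', 'hone']

All five words share the key sequence 4-6-6-3, which is exactly where T9's predictions, and its occasional mis-predictions, come from.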

Can phones and tablets really do the job of computers as we know them? And is the young generation really doing its work on phones and tablets? I can see it to some extent, as I am using Apple's Pages word processor on my iPad, and also FTP, SSL, blog and remote login programs. And that's on top of what media tablets do best, like browsing, email, entertainment, research, etc. And on my most recent trip, Skype on my tablet actually replaced even my phone.

Does all of that make conventional computers "Grandpa Boxes"? The way I see it now, yes and no. Just like PCs replaced some of the conventional computing of the day and added a huge amount of new and previously unimaginable ways of using computers in everyday life, smartphones and tablets will replace some of the things we're now doing on desktops and notebooks, and add a huge amount of new functionality that we never really thought of.

This means we may be at the threshold of a new era with both challenges and opportunities. The challenge will be to figure out what all will inevitably be replaced by these emerging computing platforms. The opportunity will be to take advantage of the new platforms.

For the mobile rugged computing industry this means thinking long and hard about which of their products are "Grandpa Boxes" and which continue to fill a real, rational need. And also about what part of the smartphone and media tablet revolution to embrace and employ for their own purposes.

So far, the industry has been timid. There are a few ruggedized smartphones and a couple of "new style" tablets, but no one has really ventured much past the cozy confines of the Wintel world. And the new realm of apps has not yet been discovered by the verticals. What this means is that a giant opportunity remains unexplored, and there's also a danger of simply missing the boat by waiting too long, with new players coming in and taking over.

That won't necessarily happen as there's much expertise in this industry, but what if suddenly there are apps that can handle business processes on inexpensive yet durable smartphones and tablets the way hundreds of millions already use their smartphones and tablets?

Do you have Grandpa Boxes in your lineup? If so, does that make sense, or is it time to move on?

Posted by conradb212 at 04:43 PM | Comments (0)

June 29, 2011

"The Cloud"

It's fashionable these days to say that something's "in the cloud." The cloud is in. Everyone's moving stuff to the cloud.

Which is really annoying.

"The Cloud," of course, isn't a cloud at all. In fact, it couldn't be farther from a cloud. It's the same old server farms somewhere in a warehouse. That's all. So why the sudden fixation with "the cloud"? Probably because centralized storage and applications can be huge business and because it presents an opportunity to regain control over users and their data, control that has largely been lost ever since the PC revolution took it away from centralized mainframes in the 1980s.

But isn't it really great not to have to worry about where stuff is stored? And that it'll all be there for you when you need it, wherever that may be? In theory, yes. In practice, not so much. Because it may, or may not be there.

I learned that lesson yet again when my Amazon account somehow got compromised a little while ago. For all practical purposes, Amazon is in "the cloud" as far as their customers are concerned. Customer data is there, wish lists, old transactions, and all the archived Kindle books. So when Amazon suddenly didn't accept my password anymore I tried to reset it three times, exhausting in the process the passwords I can easily remember.

A call to Amazon yielded that the account had indeed been compromised, and I was guided through setting it up again. I wasn't told how and why the hacking might have happened, and moving my data was a manual process that had to be done by Amazon. But even Amazon, stunningly, was unable to move my Kindle book library. Instead, they said they'd send me a gift card so that I could purchase the books again. The card eventually arrived.

Then I found that my Amazon affiliates account was also linked to my main Amazon account, and also no longer worked. Amazon once again changed my password and gave me instructions on how to regain access.

Bottom line: if even Amazon (or Sony or the government, for that matter) cannot guarantee that your data is safe, or explain what happened when it's compromised, why should I trust "the cloud"? Companies come and go, and some who are now presenting "cloud" services will undoubtedly soon be gone. Others will, in the software industry's inimitable fashion, act as if their service was the only one that matters and make users jump through hoops. And it'll all add to the rapidly growing number of logins and setups and passwords that we are pretty much forced to entrust our lives and financials with.

While experiences like what happened to my Amazon account are simply annoying and worrisome, what happens if and when it all comes crashing down? Or if you wake up one day with amnesia, or the cheat sheet with all your access data is lost? The cloud -- poof! -- will be gone, and with it all of our data. That alone is a darn good argument for local storage and backups. Having one's head in the cloud will almost inevitably turn out to be a bad thing.

Posted by conradb212 at 06:50 PM | Comments (0)

May 24, 2011

Another conversation with Paul Moore, Fujitsu's Senior Director of Product Development

I don't often do phone interviews with product managers or PR people when a new product is announced. That's because, for the most part, whatever they can tell me I already know from the press materials. And what I really want to know they usually can't tell me because PR folks, by and large, need to stick to a script and company line. Which means I might as well save the time of a PR call to examine things myself, Google this and that, and then form my own opinion.

That said, there are industry people I enjoy talking to on the phone. Paul Moore, Senior Director of Product Development at Fujitsu is one of them. Conversations with Paul are always value-added because he not only knows his stuff, but he also has opinions, answers questions, and does not shy away from a good debate over an issue. Like all professionals in his position, Paul must present and defend the party line, but with him you always get a clear and definite position and explanation, and I respect and appreciate that. I may not always agree, and at times it must be hard for someone in his position to argue a point that seems, from my perspective, rather clear. But that's what a good PR person does, and Paul is among the best.

The occasion of our conversation was the availability of Fujitsu's new Stylistic Q550 tablet, a "business class tablet" first introduced back in February (see my preview). The Q550 represents Fujitsu's initial effort to grab a slice of the tablet market popularized by the iPad and expected to grow almost exponentially. So far that's turned out to be much more difficult than anyone expected, as Apple's product and pricing are very good, main contender Android just doesn't seem quite ready yet, and Microsoft doesn't have anything specifically for tablets.

The overall situation is odd. Many millions love the iPad and its effortless elegance, but for certain markets the iPad is lacking. It's not particularly rugged. It's an Apple product in a still largely Windows world. And there's no pen for situations where a pen is needed (signatures, etc.).

So Fujitsu comes out with the Stylistic Q550 with a nice 10.1-inch screen, and running regular Windows 7 on a 1.5GHz Atom Z670 processor, one of the newest ones. It has multi-touch like the iPad, but also a pen, thanks to N-trig's DuoSense technology. It also has an SD card slot, a Smart Card slot, a fingerprint reader, higher resolution than the iPad (1280 x 800), a brighter backlight, outdoor viewability, and optional Gobi 3000. And it starts at just US$729, which isn't much for a business class machine.

Paul starts the conversation by reminding me that Fujitsu has some 20 years' worth of experience in the tablet market (true, they are the pioneers). That has taught them a thing or two. Like that removable batteries are a must; businesses can't send in a product just to replace a bad battery. Then there's all the security stuff corporations need, like biometrics, the TPM module, BitLocker encryption, and compatibility with all the other gear companies already have. And there's also an HDMI port for presentations, a handstrap, dual cams, and the Gobi 3000 module so you can use AT&T, Verizon, Sprint, or whatever you want. Business needs all that.

And that is why when Fujitsu created a next-gen tablet for commercial markets, they based it on Windows 7. That was just a given. "For us, this is a market expander," Paul said, "not just another product."

That makes sense, even though the market researchers at IHS iSuppli just predicted that iPad-style media tablets will outsell PC tablets by a factor of 10 to 1 through the next four years or so (see here). Paul doesn't debate that point. "Let's face it, Apple owns consumer," he says, "We've always been vertical. We concentrate on usability, screens, ports, security, compatibility, ..." and he adds a half dozen more items and features that separate glitzy consumer electronics from the tool-for-the-job professional stuff.

Why not Android then? There's allure, and Fujitsu is rumored to introduce a smaller Android-based tablet. Paul quickly cuts to the core of that issue: "No one likes to pay for an OS," he says, and that's certainly an Android attraction. "But Android is basically a phone OS. There are security challenges, different marketplaces, and if all my software is Windows-based, do I really want an Android device?" Good points there, and especially when a business uses custom software. And as for the iPad, it's a "want" device, Paul says. Theirs is a "need" device. All net on that one.

Then I press on an issue that I consider very relevant. While I have serious doubts that Windows, as is, is well suited for tablets, the compatibility argument is valid. I think Microsoft's leverage-across-all-platforms mantra is not as strong as it once was, but for now it still stands. However, if you make a business class machine, it really should be considerably tougher than a media tablet. Yet the Q550 is listed with a rather narrow 41 to 95 degree Fahrenheit operating temperature range and nothing more. No drop spec, no sealing spec against dust and water, no altitude or humidity specs, nada. Why? Especially when Motion introduced the CL900, which does offer a decent degree of ruggedness.

Paul says their tablet does not compete in the same class as Motion's. The Motion tablet is heavier and more expensive, and really more in the class of an Xplore tablet or such. I cannot agree here. While the Q550 is indeed a bit lighter and less expensive than the Motion tablet, both are essentially Windows-based business class media tablets starting at under US$1,000, whereas fully rugged hardware like the Xplore tablets weighs and costs a whole lot more. I definitely believe commercial markets would like to see a degree of ruggedness, but Paul won't concede the point. Besides, they do have protective cases and such. And Paul's argument that Fujitsu has a long record of building tablets that hold up well is most definitely valid. Paul also pointed out that the Q550 is indeed MIL-STD-810G tested, meeting nine military standard tests for various demanding environmental conditions including transit drop, dust, functional shock and high temperature. I hope they soon add this to the specs.

Now the conversation moves beyond the new tablet. I ask Paul why Fujitsu, the pioneer in tablets, appears to have discontinued their larger Stylistic slates, a storied line of tablets that went back, uninterrupted, a good 15 years or so. Well, they did stop the last of that line, the Stylistic ST6012, over a year ago because everyone seemed to be transitioning to convertibles, and Fujitsu has many years' worth of experience in that product category, too.

Why the switch? "Convertibles are less expensive," Moore explained. It's simple physics: having the LCD in one case and the rest of the electronics in another means less complexity, fewer thermal issues, and thus less expensive components. So convertibles turned out to be less expensive, but more powerful and more reliable. Years ago, Fujitsu sold more tablets than convertibles, then the ratio switched. Good information and reasoning. I still think that Microsoft is as much at fault as physics, but in this instance the marketplace spoke, and Fujitsu followed.

Then I get on a high horse about cameras. The Q550 does have two of them, a front-facing VGA webcam and a rear-facing 1.3-megapixel documentation camera. I haven't tried out the Q550's cameras yet, and I have no problem with a VGA webcam. But a 1.3-megapixel documentation camera is meager in an era where digital cameras with 14-megapixel sensors and 1080p HD video can be had at Walmart for less than a hundred bucks. Paul says he's had that discussion with his engineers, so no real argument there, other than that true digital camera guts can't easily be built into a slender tablet. I think they can.

I've been on the phone with Paul Moore for almost an hour and it's time to let him go so he can get ready for his next call. I had a lot of fun. I learned things, I got some good information. And I hung up with the feeling that I had talked to someone who really likes his work and the products he represents. That makes all the difference.

Thanks, Paul. And thanks, Wendy Grubow, for always keeping us informed about Fujitsu's latest.

Posted by conradb212 at 12:11 AM | Comments (0)

May 09, 2011

The problem with benchmarks

When we recently used our standard benchmark suite to test the performance of a new rugged computer, we thought it'd be just another entry into the RuggedPCReview.com benchmark performance database that we've been compiling over the past several years. We always run benchmarks on all Windows-based machines that come to our lab, and here's why:

1. Benchmarks are a good way to see where a machine fits into the overall performance spectrum. The benchmark bottomline is usually a pretty good indicator of overall performance.

2. Benchmarks show the performance of individual subsystems; that's a good indicator for the strengths and compromises in a design.

3. Benchmarks show how well a company took advantage of a particular processor, and how well they optimized the performance of all the subsystems.

That said, benchmarks are not the be-all, end-all of performance testing. In the years we've been running benchmarks, we have often found puzzling inconsistencies that seemed hard to explain. We began using multiple benchmark suites as a sort of "checks and balances" system. That often helped in pinpointing test areas where a particular benchmark simply didn't work well.
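To give a concrete idea of what that cross-checking looks like, here is a minimal Python sketch of the concept. The subsystem names, scores and the 25% tolerance are made-up illustration values, not our actual test methodology:

# A minimal sketch of the "checks and balances" idea: take each suite's
# subsystem results for the same machine, already normalized so that 1.00
# equals a reference machine, and flag subsystems where the suites disagree.
def flag_disagreements(suite_a, suite_b, tolerance=0.25):
    """Return subsystems where the two suites' relative results diverge."""
    flagged = {}
    for subsystem in suite_a:
        a, b = suite_a[subsystem], suite_b[subsystem]
        if abs(a - b) / max(a, b) > tolerance:
            flagged[subsystem] = (a, b)
    return flagged

# Hypothetical relative scores for the same test machine from two suites:
suite_a = {"cpu": 1.42, "memory": 1.10, "disk": 2.05, "graphics": 0.95}
suite_b = {"cpu": 1.38, "memory": 1.05, "disk": 1.02, "graphics": 0.90}

print(flag_disagreements(suite_a, suite_b))
# {'disk': (2.05, 1.02)} -- exactly the kind of result that warrants a closer look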

There is a phrase that says there are three kinds of lies, those being "lies, damned lies, and statistics." It supposedly goes back to a 19th century politician. At times one might be tempted to craft a similar phrase about benchmarks, but that would be unfair to the significant challenge of creating and properly using benchmarks.

It is, in fact, almost impossible to create benchmarks that fairly and accurately measure performance across processor architectures, operating systems, different memory and storage technologies, and even different software algorithms. For that reason, when we list benchmark results in our full product reviews, I always add an explanation outlining the various benchmark caveats.

Does that mean benchmarks are useless? It doesn't. Benchmarks are a good tool to determine relative performance. Even if subsystem benchmarks look a bit suspect, the bottomline benchmark number of most comprehensive suites generally provides a good indicator of overall performance. And that's why we run benchmarks whenever we can, and why we publish them as well.

Now in the instance that causes me to write this blog entry, we ran benchmarks and then, as a courtesy, ran them by the manufacturer. Most of the time, a manufacturer's numbers and ours are very close, but this time they were not. Theirs were much higher, both for CPU and storage. We ran ours again, and the results were pretty much the same as the first time.

The manufacturer then sent us their numbers, and they were indeed different, and I quickly saw why. Our test machine used its two solid state disks as two separate disks, whereas I was pretty sure the manufacturer had theirs configured as RAID 0, i.e. striping, which resulted in twice the disk subsystem performance (the CPU figures were the same). A second set of numbers was from a machine that had 64-bit Windows 7 installed, whereas our test machine had 32-bit Windows 7, which for compatibility reasons is still used by most machines that come through the lab.
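A rough back-of-the-envelope model shows why striping alone can double a sequential disk score. This is a simplification with assumed throughput numbers, not measurements from the machines in question:

# Why RAID 0 (striping) roughly doubles sequential throughput: a large
# transfer is split across both SSDs, which read in parallel. The per-disk
# rate and the 10% controller overhead are assumed numbers for illustration;
# random I/O benefits far less than this.
def sequential_read_seconds(size_mb, per_disk_mb_s, disks=1, efficiency=1.0):
    return size_mb / (per_disk_mb_s * disks * efficiency)

single = sequential_read_seconds(1000, per_disk_mb_s=200)
raid0 = sequential_read_seconds(1000, per_disk_mb_s=200, disks=2, efficiency=0.9)
print(f"single SSD: {single:.1f} s, RAID 0: {raid0:.1f} s")
# single SSD: 5.0 s, RAID 0: 2.8 s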

The manufacturer then emailed back and said they'd overnight the two machines they had used for testing, including the benchmark software they had used (same as ours, Passmark 6.1). They arrived via FedEx, we ran the benchmarks, and they confirmed the manufacturer's results, with much higher numbers than ours. And yes, they had the two SSDs in a RAID 0 configuration. Just to double-check, we installed the benchmark software from our own disk, and on the 32-bit machine it confirmed their result. Then we ran our benchmark software on the 64-bit Windows machine, and... our numbers were pretty much the same as those of the machine running 32-bit Windows.

Well, it turns out there is a version of Passmark 6.1 for 32-bit Windows and one for 64-bit Windows. The 64-bit version shows much higher CPU performance numbers, and thus higher overall performance.

Next, we installed our second benchmark suite, CrystalMark. CrystalMark pretty much ignored the RAID configuration and showed disk results no higher than the ones we had found on our initial non-RAID machine. CrystalMark also showed pretty much the same CPU numbers for both the 32-bit and the 64-bit versions of Windows.

Go figure.

This put us in a bit of a spot because we had planned on showing how the tested machine compared to its competition. We really couldn't do that now as it would have meant comparing apples and oranges, or in this case results obtained with two different versions of our benchmark software.

There was an additional twist in that the tested machine had a newer processor than some of the comparison machines that scored almost as high or higher in some CPU benchmarks. The manufacturer felt this went against common sense, and backed up the conjecture with several additional benchmarks supplied by the maker of the chips. I have seen older systems outperform newer ones in certain benchmarks before, so I think it's quite possible that older technology can be as quick or quicker in some benchmarks, though the sum-total bottom line almost always favors newer systems (as it did here).

The implications of all this are that our benchmark suites seem to properly measure performance across Windows XP, Vista and 7, but apparently things break down when it comes to 64-bit Windows. And the vast discrepancy between the two benchmark suites in dealing with RAID is also alarming.

It was good being able to use the same exact benchmark software to objectively measure hundreds of machines, but I am now rethinking our benchmarking approach. I greatly value consistency and comparability of results, and the goal remains arriving at results that give a good idea of overall perceived performance, but we can't have discrepancies like what I witnessed.

Posted by conradb212 at 08:47 PM | Comments (0)

May 06, 2011

Conversation with Ambarella's Chris Day about the state of still/video imaging in mobile computing devices

In a recent blog entry I wrote about the generally low quality of cameras built into rugged mobile computers compared to the very rapidly advancing state of the art in miniaturized imaging technology. It doesn't seem to make sense that high-quality, costly tools for important jobs should be saddled with imaging hardware that ranges from only marginally acceptable to quite useless. Still and video cameras are now in tens of millions of smartphones, and many of them can take very passable high-res still pictures as well as excellent video. I would expect no less from vertical market mobile computing hardware.

Why is that important?

Because the ability to visually document work, problems, situations or details is increasingly becoming part of the job, a part that can dramatically enhance productivity, timeliness and precision, enable quick problem solving by consulting with home offices, and help create documentation trails. Add technologies such as geo-tagging and mapping, and the presence of high-quality hybrid imaging functionality has an obvious and direct impact on return on investment as well as total cost of ownership. However, that only applies if the computer's still and video capturing capabilities are at the same high quality and performance level as the computer itself.
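As a small aside on the geo-tagging part: EXIF, the metadata format virtually all digital cameras use, stores GPS positions as degree/minute/second values plus a hemisphere letter, so the decimal coordinates a GPS receiver reports have to be converted before they can be embedded in a photo. A minimal standalone sketch (the coordinates are approximate, and real code would hand these values on to an EXIF writer):

# Convert decimal GPS coordinates to the degree/minute/second form plus
# N/S or E/W reference letter that EXIF GPS tags expect.
def to_exif_gps(decimal_degrees, is_latitude=True):
    if is_latitude:
        ref = "N" if decimal_degrees >= 0 else "S"
    else:
        ref = "E" if decimal_degrees >= 0 else "W"
    value = abs(decimal_degrees)
    degrees = int(value)
    minutes = int((value - degrees) * 60)
    seconds = round((value - degrees - minutes / 60) * 3600, 2)
    return (degrees, minutes, seconds), ref

# Approximate coordinates for South Burlington, VT:
print(to_exif_gps(44.467, is_latitude=True))     # ((44, 28, 1.2), 'N')
print(to_exif_gps(-73.171, is_latitude=False))   # ((73, 10, 15.6), 'W')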

Over the past several months I have asked several of my contacts in the mobile computing world why the cameras aren't any better, especially since many of them highlight those cameras as productivity-enhancing new features. Which they can be, but generally are not, or not yet. The cameras are slow, produce unacceptable pictures (low resolution, artifacts, poor color, heavy compression, lack of sharpness, long delays, clumsy interfaces), and video is generally almost useless (very low resolution, very low frame rate, etc.). I did not receive any compelling answers, just tacit agreement and somewhat vague references to space and cost considerations.

So I decided to seek opinions from people at the forefront of today's miniaturized image processing solutions and get their side of the story. Molly McCarthy of Valley PR was kind enough to arrange a call with Chris Day, who is Vice President, Marketing and Business Development at Ambarella and has one of those very cool British accents.

Why did I seek out Ambarella? Because when we took apart a video scuba diving mask I had been testing, I found Ambarella chips inside. The product was the Liquid Image Scuba Series HD mask that has a high definition still/video camera built right into the mask. It can shoot 5-megapixel still pictures and also 720p HD video (see our review). The mask including the camera costs less than US$250 and it records on a microSD card. We also reviewed another tiny sports camera that includes Ambarella technology (the ContourHD), and that one can do full 1080p HD video.

What is Ambarella? It is a Silicon Valley company that was formed in 2004 to be a technology leader in low power, high definition video compression and image processing semiconductors. Chris explained that their main thrust is H.264 video compression, a technology that generates very good video at file sizes much smaller than conventional formats. Their largest market is what's called consumer hybrid cameras, the rapidly expanding segment of small cameras that can do both high quality, high resolution still images as well as superb high definition video. Ambarella is probably the leader in that area, and also the first to truly merge high-res video and still imaging.
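The file-size arithmetic behind that compression advantage is simple: size equals bitrate times duration. The bitrates below are rough, assumed ballpark figures for 720p video, not Ambarella specifications, but they illustrate why better compression matters so much for small recording devices:

# The arithmetic behind video file sizes: size = bitrate x duration / 8.
# Bitrates are assumed ballpark figures for 720p video, for illustration only.
def file_size_mb(bitrate_mbps, seconds):
    return bitrate_mbps * seconds / 8  # megabits to megabytes

minutes = 10
for codec, mbps in [("Motion JPEG (older)", 30), ("H.264", 8)]:
    print(f"{codec}: {file_size_mb(mbps, minutes * 60):,.0f} MB for {minutes} minutes")
# Motion JPEG (older): 2,250 MB; H.264: 600 MB -- the same clip at a fraction of the size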

Ambarella's hottest market right now is sports cameras, the kind that generate incredible HD video of skiing, skydiving, car racing, and all sorts of extreme sports (including, of course, scuba diving). They also do cameras for security and surveillance, where the days of the grainy b&w low-res video often shown in "the world's dumbest criminals" type of TV shows are rapidly coming to an end. Ambarella also supplies other markets that rely on high compression but also high quality in their sophisticated imaging and forecasting systems.

About 400 people work for Ambarella these days, 100 of them at the Silicon Valley headquarters. For the most part, Ambarella makes chips, but they are also getting closer to providing full products, and already offer hardware/software development platforms.

I told Chris of my puzzlement over the primitive state of cameras built into most current mobile computers, especially considering that the professionals using those expensive high-quality computers could definitely use reliable, high-res cameras built into their equipment. Chris said that Ambarella did have discussions with several notebook manufacturers three to four years ago, but nothing ever came of it, primarily for cost reasons.

Now it must be understood that a good part of Ambarella's value-added consists of the chips that do very fast, very good video compression, and general-purpose processors can do some of that, so perhaps consumer notebook makers simply didn't see the need for the extra speed and quality when most notebook users don't ask for more than basic webcam functionality.

Notebooks are one thing, of course, and tablets and smartphones another. Also to be considered is the fact that there are really two types of cameras used: vidcams for video conferences (increasingly referred to as "front-facing" cameras), and the much higher resolution documentation cameras (generally called "rear-facing") used like regular digital cameras. Most better smartphones and tablets now have two cameras, one for each purpose.

To that end, Ambarella created their iOne smart camera solution that brings full HD camera and multimedia capabilities to Android-based devices. The iOne's SoC (System on Chip) supports live video streaming, WiFi upload of video clips, and full HD telepresence applications. It also has multi-format video decoding for playback of Internet-based video content up to 1080p60 resolution (i.e. better than HD TV). Chris felt that sooner or later one of the media tablet makers would truly differentiate itself with a superior built-in camera.

Ambarella also offers full development platforms for digital video/still imaging that contain the necessary tools, software, hardware and documentation to develop a hybrid DV/DSC camera functionality (see Ambarella consumer hybrid camera solutions here).

The bottom line, Chris Day said, is that "it is possible to have a mobile computing device that is also a world-class camera." We're just not seeing them yet. I am convinced that the first professional mobile computing product that offers the still/video recording capability of an inexpensive consumer camera will have a definite strategic and marketing advantage.

But what about the size and cost? As is, there are any number of imaging modules for those handy smartphones that are getting better all the time. They are tiny and inexpensive and light years ahead of what we now see in actual vertical market mobile handhelds and tablets.

A step up are the imaging modules that go into standard digital cameras. Those are larger and more complex, but judging by the tiny size of today's consumer point & shoot cameras that often offer 14 megapixel and 1080p video, those electronics should also easily fit into many mobile computing devices. They cost more, of course, but given the fact that many consumer cameras are now under US$100, it should be possible. Consider one product that uses Ambarella technology, the Sony Bloggie Touch. It can do 12.8mp stills, 1080p video, has 8GB of memory and a 3-inch touch LCD, yet it's hardly thicker than half an inch and costs under US$150. The guts of this in a rugged tablet or handheld would make an extremely attractive combination.

So the experts have spoken. It's doable. And it wouldn't even cost that much.

Video/imaging integrated into cellphones has changed the world. A lot of reporting now originates from smartphones before CNN ever gets there. And there's already talk that smartphones may essentially replace the conventional low-end camera market. The technology is there.

State-of-the-art DV/DSC video/imaging could bring great added value to rugged mobile computing hardware. Being able to document work, situations and conditions can be invaluable and truly open many new possibilities to get jobs done better and faster. But the pictures must be good, and users must be able to rely on the camera. Current camera modules cannot deliver that. HD video, likewise, could change everything. And it is truly light years ahead of the slow, grainy QVGA and VGA videos that most current computer cameras are limited to.

Posted by conradb212 at 02:16 AM | Comments (0)

April 26, 2011

Is the race for tablet supremacy already over? Many developers think so.

Who could forget Microsoft CEO Steve Ballmer stomping around the stage and yelling "developers, developers, developers!" at conferences in the mid-2000s (see Ballmer's developers on YouTube)? Well, according to the Appcelerator/IDC Mobile Developer Report, April 2011, the developers have spoken and the news isn't at all good for Microsoft, and not even that good for Android.

What Appcelerator and IDC did was survey a total of 2,760 Appcelerator developers on their perceptions regarding mobile OS platforms, feature priorities and development plans. The survey essentially showed that while Android smartphones have passed the iPhone in terms of sales and market share, developer interest in both Android smartphone and tablet apps has stalled and reversed, with both being behind interest in iPhone and iPad development. According to the report, this is due to "an increase in developer frustration with Android. Nearly two-thirds (63%) said device fragmentation in Android poses the biggest risk to Android, followed by weak initial traction in tablets (30%) and multiple Android app stores (28%)."

I think that's worth a lot of thought. Despite frustration with Microsoft, Apple's market share in computers was in the low single digits for many years, and not even the Vista debacle and Apple's great momentum in iPhones and such managed to lift the Mac OS to more than 10% or so (it's highest in Switzerland, at 17.6% according to StatCounter Global Stats, Feb. 2011). Yet, the situation is completely different with media tablets, where the Apple iPad continues to be virtually unchallenged a year after its initial release. Apple still has a commanding market share: 83.9% in 2010 according to Gartner, which predicts that Apple will still hold an almost 50% share in 2015, beating Android by several percentage points.

Such market dominance of a single company is almost unheard of, and certainly not in a market that has a good percentage of customers who are against the company on principle, as is the case with Apple. Then again, there's precedent: No one else managed to come close to Apple in MP3 players either. Even though MP3 players can be considered commodities and hardly cutting edge hardware, the iPod continues to reign supreme with a ridiculously commanding market share whereas Microsoft got absolutely nowhere with its Zune.

But can this happen again with tablets? On the surface it seems impossible. Hardware is a commodity, and there are certainly more than enough critics of Apple's very controlled approach to the whole development and sales process. But here we are, a year later and there just isn't anything else out there to challenge Apple. Why is that?

There are several reasons. First, Apple not only created a great product with superior battery life (a huge factor), but it also really aced the pricing. After having been known as a premium-price player for almost all of its history, with the iPad Apple is pretty much the low-price leader. Sure, you can pick up an Android tablet on eBay for a hundred bucks, but those tablets are so poorly made and of so little use that they have actually hurt the Android cause rather than helped it. And like it or not, the Apple app store simply guarantees a good user experience. Knowing that there won't be inappropriate content, viruses, frauds and cons is invaluable. And having so many more good apps than anyone else is invaluable as well. As is having one source, and not having to figure out which store to go to.

But back to pricing: Motorola and others learned quickly that pricing any media tablet higher than the iPad was simply out of the question. But pricing it lower is also pretty much out of the question if there is to be any profit potential at all. Remember that unlike Apple, no other hardware vendor has the built-in income from its own dominant app store.

So what can the rest of the industry do? Make better tablets, for one thing. As is, the survey says that "while 71% of developers are very interested in Android as a tablet OS, only 52% are very interested in one of the leading Android tablet devices today." No surprise here; everyone else has essentially been copying Apple's look and features: Capacitive multi-touch? Check. Slender, glossy, black slate? Check. Nice icons, zooming, pinching, panning, etc.? Check. 3G? Check. Long battery life? Check (mostly). Simply beating Apple in a spec here or there won't make a difference; that's like Hyundai claiming they beat Mercedes-Benz or BMW in this stat or that.

I am fairly sure Android will be doing well on tablets anyway, but as of now, the issues the platform faces are very real. According to the Appcelerator/IDC survey, by far the biggest concern is Android's fragmentation. Only 22% of the polled developers feel that the problem is that iOS is simply better, but almost two thirds cite fragmentation. Too many tablets, too many versions of Android, too much needed customization. In that sense, it's a bit like the difference between developing for a game console where the hardware and software are constant (iPad), and developing for the PC where there are a million processor/OS/BIOS/storage/display permutations (Android tablets).

But what of the Microsoft factor? Microsoft has got to know that a leading presence in mobile is mandatory if the company is to remain relevant the way it has historically been relevant (i.e. being #1 in every market it enters). But well into year two of the tablet era, Microsoft remains in the same deer-in-the-headlights gridlock over what to do. The issue is always the same: how to tie a non-PC platform into the PC-based Windows platform. Windows CE/Windows Mobile never really succeeded the way it could have because Microsoft intentionally dumbed down those platforms, fearing they might compete with the almighty Windows proper. In tablets, that attitude just won't do. Anything that looks like it's really just an adapted version of mouse-driven Windows is not going to work. Not now, not ever. If Microsoft does not get over that mental block, Microsoft will not be a factor in tablets.

As is, the polled developers already feel, by roughly a 2/3 majority, that no one can catch up to Apple or Android. The developers-developers-developers have spoken here, and so Microsoft finds itself in the odd position of having to hope that a hardware partner will pull a rabbit out of the hat. That has never worked too well for them in the past, with the sole exception of the original IBM PC deal. And even that meager assessment by the developers is probably based on the respectable early showing of Windows Phone 7. Microsoft's still amorphous tablet effort may be an even longer shot.

Then there's the next issue. Are we perhaps entering an era where people abandon the web as we know it and simply turn to apps? It seems unthinkable and the web certainly won't go away any time soon, but let's face it, the web has become a big pain in many respects. Websites are jam-packed full of ads and commercial messages. More websites than not are simply nearly content-free decoys to lure AdSense and other ad click traffic. There's danger waiting everywhere. Often, the web today feels like running the gauntlet in a seedy neighborhood full of panhandlers and worse.

Now compare that with the structure and safety of apps. They do exactly what you want them to do. They've been tested and certified. They are clean. And you are in command. That's why a growing number of companies now offer their own apps in addition to their websites. Yes, it's a bit ironic that we may all return to the gated communities that we had in the past (remember AOL and CompuServe?), but that's the way things seem to go. Already, developers interested in building apps outnumber those interested in the mobile web by 5:1.

Does that mean the race is run? Not at all. Momentum can change very quickly. While it's unlikely that Apple or Android-based tablets will crash and burn, you never know if Microsoft or perhaps even HP with the WebOS will come out with something so awesome that the momentum shifts. Decades ago IBM found that they could not profitably compete in the very PC market they had created. Netscape was defeated by Internet Explorer, which initially had looked like a woefully inadequate competitor. Unbeatable Palm lost its mojo and vanished. It can all happen.

As is, from the vantage point of a product reviewer and publisher, I am surprised by a number of things.

First, I wonder why everyone simply copies Apple instead of taking advantage of Apple's weak spots. Yes, the almighty iPad has some weak sides, and none worse than its ridiculously glossy, ridiculously smudge-prone display. Any major tablet vendor who comes out with a product that does not turn into a mirror outdoors has an instant, massive advantage and selling point.

Then there's the vaunted leverage. "Leverage" has been Microsoft's main argument for decades, and it goes something like this: since 90% of all computers use Windows, and everyone knows how to use Windows, and there are so many programmers who know Windows development tools and languages, it only makes sense to "leverage" that investment into other platforms. That was always the argument for Windows CE/Windows Mobile, and in the vertical markets, which are still almost 100% Windows Mobile, it worked. Now Google doesn't have any leverage, and Apple doesn't have much. In fact, it's amazing that Apple managed to build so much around arguably its weakest point, iTunes.

Point is, if Microsoft came out with some way to truly leverage its Windows position into tablets and smartphones, things could change in a hurry. It's not clear how that could happen, but there simply has got to be a compelling way to truly extend the commanding position Microsoft has on the desktop (and on the lap) to tablets and smartphones. And no matter how positively surprised I was with Windows Phone 7, we're not talking just a Zune interface and automatic updates from Facebook and Twitter.

And where does that leave HP and RIM? HP recently made noises about offering WebOS on a whole range of devices. The HP TouchPad will run WebOS, WebOS got mostly good reviews when Palm introduced it for its smartphones a couple of years ago, and HP certainly has deep enough pockets to make an impact. Palm/HP never sent us a Palm Pre or Pixi for review and I wasn't about to sign up for a 2-year telco contract just to review a Pre in detail, but from what I can tell, WebOS excels at something that is just a pain on the iPad--multitasking. The lack of useful multitasking is the one thing that keeps me from using my iPad for more than I already use it for, and the sole reason why I still take my MacBook Pro on business trips.

RIM has more of a problem. For RIM, the question really is whether lightning can strike twice. RIM rose to prominence based on one single concept, that of providing secure, totally spam-free email in a pager-sized device. That worked for many years, but RIM has struggled with adding some pizzazz to their BlackBerry devices, and going it alone on tablets seems undoable. In the Appcelerator/IDC survey, developers were somewhat excited over RIM's announcement that their PlayBook tablet would support Android apps. That's really no more exciting than Apple's claim that you can run Windows software on a Mac back when no one wanted a Mac. That said, it's almost impossible to figure out what does and does not make business sense, and so we may see some seemingly weird niche products.

As far as the situation at hand goes, the developers-developers-developers have spoken. For now. Developers go where the money is, and even massive incentives go only so far against the lure of a successful app store and tens of millions of tablets sold. An awful lot is at stake here and it's a war, one that seems pretty clear right now (Apple strong, Android gaining), but also one where things can still change in a hurry.


Posted by conradb212 at 09:03 PM | Comments (0)

April 22, 2011

Microsoft....

So I get to the next machine in the review queue, charge it, then start it up, just to get nagged by Windows to activate the OS. Would I like to do that online, right now? Huh? Huh? I didn't think that was going to be possible since the machine didn't know the password to my WiFi network yet. But Windows wanted to try anyway and so I let it. Of course, it didn't get anywhere.

So then I am in Windows 7, but there's this nasty message at the bottom right that says, "This copy of Windows is not genuine." Well, that's bad news as the machine is a prototype from a well-respected rugged computing manufacturer.

I let Windows get access to my WiFi and tried the activation again. No go. I got an ominous message that said "You may be a victim of software counterfeiting." Oh, oh. So I accepted the option to "Go online and resolve now."

Well, Windows then said that "Windows validation detected and repaired an activation exploit (used to prevent Windows Vista built-in licensing from operating properly)" and that I had to activate Windows in order to "complete the repair process and be able to use the full functionality of Windows Vista."

Dang, and there I thought I was on Windows 7 on this brand-spanking new machine.

Windows offered: "Not to worry, we can help you with that."

The help consisted of an offer to buy genuine Windows, the Professional version, for just US$149.

Now, first, I wasn't on Windows Vista. Second, I didn't have a non-genuine version of Windows. And third, I most certainly wasn't going to pay $149 to upgrade my brand-new Windows 7 to Windows 7.

So I rebooted, and then rebooted again. Now Windows decided that my software was genuine and just wanted to activate it. And this time it worked.

Go figure. And go figure how Microsoft can be in business.

Posted by conradb212 at 03:13 AM | Comments (0)

March 29, 2011

Why are cameras in mobile computers not any better?

When I founded the original Digital Camera Magazine in 1997, almost no one thought that digital photography would ever seriously challenge film. At best, digital cameras were thought to become novelties or peripherals for computers. Yet, just a decade later, digital imaging had surpassed film and, in one of the quickest major technology upheavals, made film irrelevant. As a result, digital cameras, which initially had carried a steep price premium, became more and more affordable. Today you can get a very good and incredibly compact 14-megapixel camera for less than US$100. In essence, digital imaging technology has become commoditized.

Which makes one wonder why cameras integrated into mobile computing equipment aren't any better.

It's sad but true: cameras built into mobile computers are simply not very good. Some are getting better, but virtually none are within a light year of even the most basic dedicated digital camera. And, worse for those who rely on top-quality tools for the job, cameras in consumer products such as smartphones and media tablets are generally much better than what is used in vertical market equipment. That is hard to explain.

Why is it important to have a good camera in a mobile computer? Because mobile computers are expensive tools for important jobs. Image capture is quickly becoming a must-have feature in the field. Field workers must document all sorts of things out there, like accidents, conditions, extraordinary events, repair status, etc., etc. And those images must be good enough to be of value.

As is, most cameras integrated into mobile computers cannot do that. The cameras are low res (hardly ever more than 3-megapixel), slow (often unacceptably slow), basic (few come close to the features even the cheapest dedicated camera has), and thus simply cannot do the job they're supposed to do. There are probably all sorts of explanations as to why that is, but I just can't buy them. If a cheap, tiny consumer camera can take award-winning pictures, the guts of such a camera can and should easily fit into a much larger mobile computer. Why this isn't happening is beyond me, but it just isn't.

This stunning lack of cross-fertilization between two major technologies actually goes both ways. Cameras have progressed immeasurably over the past decade, yet to this day, digital cameras come with the same tiny 30MB or so of internal memory they always have. You can buy a generic MP3 player with 8, 16, 32 or even 64GB of storage for a few bucks, but even the most advanced consumer digicams have essentially no internal storage. Which is always a REAL pain when your card gets full or you forget to put one in. And let's not even talk about compatibility. In the camera world, every company has their own standard and almost nothing is ever compatible.

That really needs to change. Customers who pay $2,000 or more for a rugged mobile computer should be able to take superb pictures with it, and shoot HD video, just as you can with a little $100 camera. There is simply no excuse, none, to put sluggish, insufficient imaging technology into expensive computer equipment. It cannot be a cost issue either; missing ONE important shot because a field computer's camera is so unwieldy and incapable can cost more than the entire device.

So let's get with it, mobile computing and camera industries! Camera guys: You need some real storage in your product, and no, going from 30 to 100MB won't do. And give some thought to compatibility. Computer guys: Demand and insist that the camera guts inside your wonderfully competent mobile computing gear are not an embarrassment. They should work at least as well as that brand-name $79 camera you can pick up at Walmart. And that includes good video and a real flash!

So there.

Posted by conradb212 at 06:21 PM | Comments (0)

February 09, 2011

How we get news

A big part of the work here at RuggedPCReview.com is getting and spreading the news on what's going on in the rugged and mobile computing world. How do we do that? And how can manufacturers help get the news out?

In the past, it was pretty simple. We went to trade shows to see what all was new, loaded up on glossy brochures, attended press conferences, and left behind a bushel of business cards so we'd be in the rolodex of everyone who mattered in the rugged computing industry. That pretty much ensured a steady supply of news via mailed press kits and such, plenty enough to fill a print magazine every other month back in the day when we published Pen Computing Magazine. For a while after that era, it was a hybrid thing, with part of the news coming the old-fashioned way, and part gleaned from websites.

It's all changed now. We still go to the occasional trade show, and they are always fun and helpful. And you get to actually see the people there. But shows are also expensive and time-consuming, what with all the traveling, cabs, airports, hotels, waiting in line, and then the rush at the show itself. So for the most part, trade shows are a (bitter)sweet memory now.

Today, news comes from numerous sources, through numerous channels, and I get it all sitting in front of the big display of my iMac27 with dozens of windows open. That, for me, is news central, and here's where it all comes from:

BusinessWire PressPass -- a daily email with headline news on the topics I subscribe to. The cool thing is that they show the company logo next to the headline. That makes it easy to very rapidly scroll down the (looong) email and stop when my eye catches a familiar logo. Seems like a little thing, but in this day and age of massive information overflow, we need all the filters and help we can get.

PR Newswire for Journalists -- these are individual emails that include a paragraph that describes the news, and also links directly to a full press release. These are quite useful.

Marketwire Newsletter -- another daily email with items of interest for me, but this one is all text, and each headline is accompanied by a paragraph. That increases the chance that I can search for keywords like "rugged" and catch things of interest. But it's also tedious to sift through a hundred paragraphs of news.

Google alerts -- yes, Google does it again. I have Google alerts set up for pretty much all the companies I follow, and also some on beats I cover. They are typically Google-minimalist, and, like Google searches, they tend to include stuff I really don't need, but they're a great way to keep track of all mentions of a topic or term. Very useful.

PR folks and agencies -- yes, they still serve a purpose. I get emails from dozens of agencies and individuals. Some are very useful and I couldn't do without them. Others seem to simply pad their mailing lists. Overall, a good PR agency contact is invaluable. And good PR people assigned to the same account for a long time? Gold.

Websites -- company websites are still the definitive, authoritative source of information. Problem is, many are falling behind the news. Some sites only seem to get updated when a web designer re-does them, and then they eventually fall into near disrepair. That's the exception, of course, but even large companies with good sites often issue press releases without having the info up on their own sites when the news breaks. That is frustrating.

Social media -- honestly, far less useful than the in-crowd wants you to think. I just don't have the time to be a "fan" of every company I need to cover, be that on Facebook or Twitter or what all.

Communities, Web 2.0, etc. -- the first company "community" site I saw was cool. I think it was the Sanda agency that did it for Trimble. It was well done, fun, informative. And the overall recipe has been copied by many others. This can be a nice way to foster a community spirit between companies and users, sort of like an ongoing user conference. But it's far too time-consuming for us media guys. We just don't have the time to stop by for a chat and a look around. So for news, not good. Overall, though, a nice and useful concept.

Pounding the street -- yes, we still do that. Not really the street, of course, but the web. That's because we inevitably miss news and things fall through the cracks. So periodically I go check websites to see if something happened that we missed. But we can't do that often enough for this approach to do anything but fill gaps.

Too much news? Not enough news? -- Overall, of course, the world's drowning in news. And sifting through all that news takes a major chunk of my time every day. That, and then converting worthwhile news into our own, very targeted news items, product pages, and, eventually, the detailed reviews RuggedPCReview is known for.

However, there seems very little consensus on how much news is right.

There are companies that announce something practically every day, and that's often too much of a good thing. I am also not fond of news that really isn't news at all, but just a way to get in the news.

On the other hand, there are companies that seem to avoid news like the plague. I look at their websites and find a news item from last summer, then one from the winter before that, and that's that. Not good enough. Every company that sells stuff has news, and that news needs to get out. It doesn't always have to be a new product announcement; updates, upgrades, partnerships, contract wins, successful deployments, tech primers, white papers -- they are all news.

Because, after all, news is about being in the news, being on top of the page, getting attention. That sort of exposure makes buyers think, "Hmmm... I just read about that company the other day. Let me look them up."

And that's what it's all about.

Posted by conradb212 at 04:12 PM | Comments (0)

January 18, 2011

Bye-bye PXA processors? Probably not just yet.

There was a time, around the year 2000, when Microsoft essentially decreed that Pocket PCs were to run Intel XScale processors. That was a big change, and a rude awakening for some of the Windows CE hardware vendors who had been promised that Windows CE was going to be a multi processor architecture platform. But Intel XScale it was, and the Intel PXA became the de-facto standard processor for virtually all vertical market handhelds for a decade.

So product specs for all those handhelds of that era weren't very exciting. They either had an Intel PXA255 or a PXA270 processor, with slight variations in clock speed. Considering the demise of the once high-flying PDA industry in favor of telco-controlled smartphones, those vertical market handhelds were a rather successful niche, with the occasional massive sale to parcel carriers, field service organizations, postal services, and so on. However, despite the virtual monopoly of the PXA processors, those industrial handhelds were not a lucrative enough market for Intel to remain interested. So in 2006, Intel sold the PXA business to Marvell for a modest US$600 million.

Marvell, a silicon solutions company intent on cracking the emerging smartphone market, initially scrambled to find someone to make the chips for them. They then quickly launched the PXA3xx series of application processors, including the high-end 806MHz PXA320. When we tested the first handheld with the new PXA320 chip (the TDS Nomad), we were blown away by its speed and responsiveness.

However, Marvell apparently did not have the reach and marketing power of Intel. Sure, Marvell PXA270 and even the older PXA255 chips continued to power numerous handhelds, but the powerful new PXA3xx chips had trouble gaining traction. There was a new design win here and there, but we also started seeing defections. And those who stayed with Marvell often chose the older PXA270 chip over the newer and more powerful PXA310 or 320.

Of recent releases, Motorola stayed with Marvell for their new MC75A0 (PXA320) and MC55A0 (PXA320), but used a Qualcomm MSM processor for their ES400 enterprise digital assistant. Psion Teklogix chose a Texas Instruments OMAP3 processor for their Omnii XT10, and GD-Itronix an ARM Cortex-A8 for their GD300. Datalogic did stay with Marvell for their new Elf (PXA310) and Falcon X3 (PXA310) handhelds, but combined them with ARM Cortex co-processors. DAP Technologies stayed with Marvell for their new M2000 (PXA270) and M4000 (PXA270) series. Getac stayed with Marvell for their PS236 (PXA310) handheld, but not for their PS535F, which uses a Samsung S3C2450. And then came the most recent blow for Marvell when Intermec based its new line of likely rather high-volume 70 Series handhelds on the TI OMAP 3530.

The situation doesn't appear to be critical for Marvell yet, as the majority of handhelds out there continue to run on its processors, and there have been some good recent design wins for the PXA310 and PXA320. But the PXA3xx series is now already over four years old, an eternity in processor terms. It's also not quite clear how Marvell's ARMADA family of application processors relates to the PXA chips. Marvell recently explained to me how ARMADA processors target various markets ranging from consumer display devices like eReaders and tablets to high-end HD TVs, but the ARMADA name never appears in vertical market handhelds, and while the PXA3xx processors are listed with the ARMADA chips, there also seems to be an ARMADA 300 Series with 300/310 chips. Confusing? I'd say so.

A little wordplay anecdote here: two or three years ago, Marvell introduced its own "Shiva" CPU technology and announced it'd be used in upcoming SoC (system-on-chip) products. The PXA processors were then considered part of the Shiva family. So where's the wordplay? Well, it turns out that a year before, Marvel Comics had released a comic book with armored Shiva robots that could not be defeated the same way twice. Apparently Marvel Shiva and Marvell Shiva was too close for comfort, and so the Shiva name is gone from Marvell.

Anyway, no, I don't think the Marvell PXA chips are going away anytime soon, but unless Marvell has some plans up its sleeve that we're not aware of, they also don't seem to be going anywhere. Which, come to think of it, is pretty much where vertical market handhelds are in general, sort of in a holding pattern until it becomes clear whether Microsoft can be counted on to provide a true next generation mobile operating platform, or not. And whether the fundamental changes in user interface expectations brought upon by the iPhone/iPad and Android smartphones will lead to pressure for similar functionality and ease-of-use in vertical market devices, or not.

Posted by conradb212 at 10:25 PM | Comments (0)

January 07, 2011

Microsoft announces.... nothing. Google follows suit.

Well, the much anticipated Las Vegas CES is shedding no light on how the industry will react to Apple's monster tablet home run. Yes, there were some tablets here and there, but really nothing that we didn't know already, and certainly nothing earth-shattering.

Microsoft, stunningly, showed nothing. Nada. No product, no strategy, no plan. The whole situation was remarkably similar to a time several years ago when erstwhile handheld champion Palm was on the ropes and Microsoft had an opening a mile wide to finally get some traction with Windows CE. What did they do then? Nothing. Well, they came out with Windows Mobile 2003 for Pocket PC 2nd Edition. But even that was better than simply nothing at all. And back then there was nowhere near as much at stake.

If there is one single saving grace in this stunning inactivity, it's that Google, too, missed a giant opportunity to pull it all together and present to the world -- voila and ta-da -- the definitive Android OS for tablets, the one that will do battle with Apple, the one that will make Microsoft irrelevant in tablets forever after. Didn't do it.

So those who stuck by Microsoft will now have tablets that really don't work very different from the old Tablet PCs. And those who meekly tried Android or something else missed a golden opportunity to put themselves on the map.

This is as close to forfeiting a game as it gets. By the time Microsoft may finally have something, Apple will have many tens of millions of iPads in the field. And after the virtual Android no-show at CES, the notion that Google seems unable to provide a cohesive tablet platform may get stronger.

So 2010 was the year of the tablet, for Apple, and 2011 will again be the year of the tablet, and no one's playing other than Apple. No one, I should say, of the big guys. There have been some nice new products. Motion Computing's new CL900 tablet is a thing of beauty and we really liked the little Samsung Galaxy Tab we had here for a few weeks.

But overall, Microsoft's apparent inability to figure out what to do in tablets, and Google's ongoing spreading itself too thin, is eerily reminiscent of the days when CE devices from the likes of HP, Compaq, IBM, LG, NEC, Casio, Philips and others combined failed to get as much as 25% handheld market share against the little Palms. Eventually, of course, Palm defeated itself, but that's not likely going to happen to Apple.

So the tablet crystal ball remains as milky as ever.

Posted by conradb212 at 11:25 PM | Comments (0)

January 02, 2011

Motorola, and the corporation names, corporation games thing

So on January 4, 2011, Motorola will complete its separation into two companies. The way it actually works is that what used to be Motorola will separate Motorola Mobility Holdings, or Motorola Mobility for short, from Motorola proper, and Motorola will then change its name to Motorola Solutions. So technically it looks more like Motorola jettisoned their phone business to concentrate on the much more stable and predictable vertical market offerings developed and sold via Motorola Solutions. From a stockholder's perspective, they'll get one share of Motorola Mobility for every eight shares of old Motorola stock. The old Motorola stock will then undergo a reverse 1-for-7 stock split so that seven shares of old Motorola stock becomes one share in Motorola with its new Motorola Solutions name (see how it works).
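For those who like the share math spelled out, here is a worked example with a hypothetical holding of 56 old shares, a number chosen because it is a multiple of both 7 and 8 so everything comes out whole (in practice, fractional shares are typically settled in cash):

# Worked example of the announced share math, using an assumed holding
# of 56 old Motorola shares.
old_shares = 56
mobility_shares = old_shares // 8    # 1 Motorola Mobility share per 8 old shares
solutions_shares = old_shares // 7   # 1-for-7 reverse split on the old stock

print(f"{old_shares} old shares -> {mobility_shares} Motorola Mobility "
      f"+ {solutions_shares} Motorola Solutions shares")
# 56 old shares -> 7 Motorola Mobility + 8 Motorola Solutions shares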

Sure reminds me of the lines in that old Grace Slick We-Built-This-City song: "Someone always playing corporation games, who cares they're always changing corporation names."

While the spun-off cellphone business will have about the same number of employees as the solutions business (both about 20,000), annual revenue of the cellphone side is expected to be US$11-12 billion, and on the solutions side about 8-9 billion. However, the cellphone business is exceedingly unpredictable compared to the much more linear solutions side. For example, who'd have thought that cellphone world leader Nokia would completely miss the smartphone wave? Who could have predicted the iPhone? And wasn't Motorola itself on top of the world with its RAZR (over 120 million sold), and then practically fell off the map when the follow-ups didn't catch on? And who could have predicted that the Droid would catch on as it did? It's a wild ride there in cellphones, feast or famine.

Things are much different on the solutions side. Everyone needs "solutions," and solutions helped IBM and HP quietly not only remain relevant, but become bigger than ever despite diminished emphasis on hardware. IBM ditched its PC business (to Lenovo) and printer business (spinning off Lexmark), and while HP is huge in hardware, buying EDS (Electronic Data Systems) also made it one of the largest solutions providers, which now contribute a third of HP's revenue. Compared to those two giants, each with revenues over US$100 billion, Motorola Solutions will be small, but the business model sure looks promising.

What is Motorola Solutions? Company officials have always struggled with communicating that clearly. In essence, they leverage established lines of two-way radios, network equipment, scanners, and handheld computers into solutions for just about any vertical market. The scanners and handheld computers, of course, come from Symbol, which Motorola acquired in 2007. In a sense, with the former Motorola now being Motorola Solutions, and Symbol being a large part of Motorola Solutions, it almost looks like Motorola merged into Symbol, though I am not sure what part of the Motorola Solutions revenue comes from Symbol and what part from the two-way radio and related equipment side. I am also not sure who packages and manages those solutions that rely on Symbol's scanners and handhelds, and the radios, and whether Symbol will be just a hardware shop or be more involved in the solutions process.

Overall, one cannot help but wonder what Motorola had in mind with Symbol and its very considerable brand equity. After the acquisition there were a good two years where some Symbol products continued to have the Symbol name and logo before gradually getting the corporate Motorola logo. In the new Motorola Solutions, Symbol remains by far the most prominent brand, and I really do wonder what it all looks like from the inside.

One almost wonders why they didn't just go back to the Symbol name. There's precedent for such a move: Motorola Solutions' biggest competitor in mobile hardware is Intermec, and Intermec once was just a subsidiary of Unova, and not a major one at that (though they swallowed Norand, just as Symbol swallowed Telxon). Yet, under the dynamic leadership of former CEO Larry Brady, Intermec established itself as a successful and driving force in the mobile/wireless industrial market, to the extent where, in 2006, Unova changed its name to Intermec.

Oh, and then there is the weird thing with the operating systems: Motorola Mobility (strange choice of name, actually, considering that the actual mobile computers went with the other side) totally depends on Google's Android now, whereas all of Symbol's handhelds use Windows Mobile. Yet, given Windows Mobile's rather tenuous position and uncertain outlook, Symbol/Motorola Solutions simply has to have much more than a passing interest in Android itself. But the Android expertise is now in the other Motorola company. Go figure. And that's before the looming possibility that the Oracle/Google lawsuit over Android may put a monkey wrench in the works, or that Samsung or HTC take over the Android phone business.

Yes, they're always changing corporation names, and at times it's hard not to see it all as corporation games. In our fast-moving world where companies grow and buy each other, those games and struggles have become the norm, and sometimes one really wonders if all the overhead was worth it.

I was reminded of that while following the course of action of another mobile computing conglomerate over the past three years or so. What happened there was that Roper Industries, a very diversified, almost US$2 billion company, added three mobile computer companies to its roster: long-established Canadian DAP Technologies, start-up Black Diamond, and JLT Mobile Computers. The three companies were put together under the "Roper Mobile Technology" name, with DAP, Black Diamond, and Duros (the former JLT models) being its brands. Roper Mobile Technology was then renamed RMT, Inc., with a nice and modern logo. That seemed to make much sense, but then the whole effort was shelved, with DAP Technologies absorbing the Duros lineup and renaming everything, retiring both their old "MicroFlex" brand name and the impressive-sounding "Duros" in the process (and also the somewhat contrived-sounding "Kinysis" name). Black Diamond is once again on its own as a subsidiary of Roper Industries. This probably all makes sense, but from the outside it looks confusing, and like a few years' worth of lost opportunity to establish a force in the mobile/rugged market.

No one has a crystal ball, and every decision is (hopefully) the result of careful consideration, but sometimes it's hard to figure out why things are being done a certain way when there appear to have been much more logical courses of action.

Posted by conradb212 at 04:47 PM | Comments (0)

December 22, 2010

"10 tablets that never quite took off"

This morning, one of my long-term PR contacts brought to my attention a feature entitled "10 tablets that never quite took off." It was published by itWorldCanada, which is part of Computerworld. Now Computerworld is one of the world's leading resources of excellent IT reporting, and has been for decades (I used to contribute to it in a former life as a corporate CIO), but the "slideshow" was disappointing and missed the point by listing some older tablets and mocking them.

Unfortunately, we're seeing a lot of this sort of stuff in the media now. Most younger editors seem to believe that Microsoft invented the Tablet PC in 2001 when, of course, tablets were around a good decade before that. Older editors who did not specialize in rugged vertical market hardware often have a distorted memory of what those pioneering efforts meant. While it's undoubtedly true that earlier efforts at commercializing tablets for the consumer market were met with little success, those tablets did succeed in many important vertical and industrial markets. Mocking older, pioneering technology for not being like the iPad is a bit like mocking the military Humvee for failing to succeed as a suburban SUV; different purpose, different time, different technology.

The slideshow presents some interesting benchmark products that were ahead of their time (such as tablets from Motion, Acer, ViewSonic, Xplore, etc.), but the commentary seems oddly uninformed and flippant for a respected entity like ITWorld. They mentioned "a firm named Wacam" and dissed it for not being multi-touch. Well, first, it's Wacom, not Wacam, and second, Wacom's electromagnetic digitizer technology has successfully been used for about 20 years (and remains part of Wacom's G6 input technology that combines capacitive multi-touch with an active digitizer).

The whole slide show seems ill-informed and condescending, sort of like a cheap potshot that disqualifies earlier, pioneering efforts as nothing but technological pratfalls. That is hardly true, as what Apple eventually came up with, and what everyone else is trying to emulate now, stands on the shoulders of those pioneering efforts. If anyone is to blame for a lack of consumer market success it's Microsoft, which, in its insistence on "Windows Everywhere!," never made more than a token effort to provide an OS suitable for tablets. In light of this, the relative success of ruggedized, special-purpose tablet computers for industrial markets is even more impressive. A publication like IT World Canada ought to know and appreciate this.

Posted by conradb212 at 06:35 PM | Comments (0)

December 16, 2010

The tablet wars: background and outlook

This whole tablet thing is really interesting.

Despite the iPad getting soundly trashed by a good number of industry experts when Steve Jobs first announced it on January 27, 2010, Apple ended up selling about ten million of them in 2010, and the same experts now predict that a lot more will be sold in the coming years. Everyone is scrambling to also have a tablet. Tablets are hot, tablets will demolish the netbook market, tablets will eat into notebook sales, Microsoft will gag and wither over having blown it with tablets, and so on and so on.

So let's take a look at what's really happening. First, tablets are not new. I often see references in the tech press on how Microsoft invented tablets back in 2001 when they introduced the Tablet PC. That's not true, of course. Tablets go back at least another decade, and more if you count such concepts as the Apple "Knowledge Navigator" that was introduced in 1987, or earlier yet, Alan Kay's DynaBook of 1968. What's also mostly forgotten is that almost 20 years ago, the computing world was all hyped up over tablet computers that you could write on, slates that were sort of like "smart paper." None other than Microsoft's own GM of their Pen Computing Group stated that "the impact of pens on computing will be far greater than the mouse," and that was in November of 1991.

See, back then, the buzz was building around pen computing as the next big thing, and pen computers were nothing other than tablets. Microsoft felt the heat because GO Corporation got a lot of press for its PenPoint OS that, unlike Windows, was totally designed for pens. GO released PenPoint in 1992, a company named Momenta released its own tablet interface, and a good number of tablet computers chose PenPoint, including the first IBM ThinkPad (yes, almost two decades before pundits made fun of the term "iPad," there was the ThinkPad, and later the WinPad). Microsoft battled back with Windows for Pen Computing, a version of Windows 3.1 that added a layer of pen functionality. An OS war took place in the early 1990s on such early tablets as the NCR NotePad, the Samsung PenMaster, the Fujitsu Point, the Toshiba DynaPad, as well as pen computers made by GRiD (courtesy of Jeff Hawkins who later founded Palm) and a gaggle of long-forgottens such as Dauphin, TelePAD, Tusk, and several others.

Microsoft won that war back in the early 90s, and they did it the way they always do it, by sheer, brute force. Windows for Pen Computing outmuscled PenPoint on the major platforms via some highly publicized sales, but it was a Pyrrhic victory as tablets went nowhere, in part because Windows just wasn't suitable for tablets and in part because the hype was about (underperforming) handwriting recognition as much as it was about tablets. One by one, the majors dropped out -- NCR, Compaq, Toshiba, IBM, NEC. Some hung in there long enough to see the complex and limited pen version of Windows 95, but tablets were done for the 90s. When Palm showed that pens could actually be quite useful, Microsoft launched Windows CE, but the small CE-based handhelds built by all the major Windows licensees were just too limited to excite anyone.

But those early tablet efforts were not entirely wasted. A small but resilient tablet computer industry survived and kept developing specialized tablet solutions for vertical market clients.

The next big tablet push came in 2001 when Microsoft, mostly on Bill Gates' belief in tablet computers, directed the world's computer makers to support its Tablet PC project. There was a widely publicized build-up with all sorts of tablet prototypes that culminated in the unveiling of the Tablet PC platform in the fall of 2002. At Pen Computing Magazine (which spawned the present RuggedPCReview.com) we reviewed all those early tablets, including the Acer TravelMate C100, the HP Compaq Tablet PC TC1000, the Fujitsu Stylistic ST4000, the Motion Computing M1200, the Toshiba Portege 3500, and the ViewSonic V1100, and we summarized it all in Pen Computing Magazine's detailed 2002 Tablet PC specification table. What's immediately noticeable is that most of those marquee tablets were actually what came to be called "convertible notebooks" or "notebook convertibles."

What had happened was that Microsoft had gotten cold feet about the mainstream appeal of tablets in mid-stream, and ordered Acer to come up with a convertible notebook design. By the time the Tablet PC was actually and officially unveiled, the emphasis was clearly on notebook convertibles. The media was only cautiously optimistic about the outlook for the by now not-so-tablet-anymore Tablet PC, and the market quickly decided it didn't make much sense to pay extra for pen functionality on convertible notebooks that made thick and clumsy tablets, if they were used as tablets at all. So that didn't go over too well.

There were plenty of parties to blame for the 2001/2002 Tablet PC concept's lack of success. Microsoft's midstream switch to convertibles pretty much killed the belief in the tablet-only versions. Tablet products cost more without offering tangible benefits. Microsoft's marketing support was lacking, to put it mildly. By far the most important problem, though, was that Microsoft once again tried to put an only slightly adapted version of Windows on tablets. That approach didn't work in 1991, it didn't work with Windows 95, and it didn't work with Windows XP in 2002.

Then nothing happened in the tablet market for a good many years. Nothing, that is, other than a few vertical market vendors eking out a living offering various vertical market tablets for special applications. After all, if you have the right software and you have to walk around on the job, it IS easier and faster to just tap on a tablet than to set up a notebook and crank up Windows.

So then the iPhone happens in 2007 and dazzles the world with a smooth, elegant, effortless user interface, one that lets you tap and pan and swipe with just the slightest touch, and where you can use two fingers to smoothly zoom in and out or rotate things. What made it all possible was Apple's use of a capacitive touch screen, a technology that neither needed a special pen like the preferred digitizer technology of the Microsoft Tablet PC, nor a stylus like most handhelds and PDAs. Capacitive touch, while hardly new, made using the iPhone fun and easy, but no one anticipated what would come next, and that was the iPad.

As stated in the opening paragraph, there was much criticism when Apple first announced the iPad. It wasn't computer enough, you couldn't run real software on it, it was just a big iPhone without the phone, and so on and so on. What those critics didn't realize was that the only reason the tablet form factor hadn't worked before was because the software hadn't worked before. Or more precisely, because Microsoft's insistence on "Windows Everywhere" was a big, colossal failure. One more time: Windows was designed to be used with a mouse. A mouse. Not a pen, and not fingers.

So what's the first thing Microsoft does when capacitive touch is starting to look like a real good thing? It adds touch to Windows 7. Which meant that the few Windows-based computers that also have a projected capacitive touch screen could be operated with touch. Sort of. Sort of, because Windows 7 is no more a touch-optimized OS than any other version of Windows before it.

The sheer predicament Microsoft was facing became evident during 2010. As millions of iPads were sold, Microsoft had nothing other than Windows 7 with the usual bit of pen support. This left the door wide open for Google, which had opportunistically positioned the Android OS they had purchased and developed as the platform of choice for iPhone rivals. Despite the flop of their own Google phone, the surprise success of the Droid helped Motorola get back on the (phone) map and quickly established the Android OS as the primary alternative for most non-Apple smartphones.

Not surprisingly, while Microsoft waited out 2010, it became apparent that Android, like iOS, could easily scale up to larger tablet form factors. This realization apparently caught Google somewhat by surprise as their Android development efforts remained firmly concentrated on smartphones. This didn't stop a growing flood of bargain-basement-priced Chinese iPad copies from using (or maybe abusing) Android in cheap hardware with resistive digitizers that made them almost impossible to operate. This certainly didn't help Android look good, but the software platform's ascension into tablets is a done deal nonetheless.

Interestingly, despite lots of tablet announcements, nine months after the iPad went on sale, there's really only one halfway credible Android tablet out there, and that's the Samsung Galaxy Tab. I say "halfway" because the Samsung tablet only has a 7-inch display, thus placing it into a different category from the iPad.

So where does that leave the booming and seemingly unstoppable (experts predict many tens of millions sold in each of the next few years) tablet market? In an interesting situation for sure. Let's look at some of the forces at work:

First, almost no one wants to truly alienate Microsoft, and so Android may well find itself getting the "PenPoint treatment," referring to the situation almost two decades ago where a better-suited OS was muscled off tablet hardware by Microsoft. However, Google is an entirely different class of opponent than the underfunded PenPoint movement was back then. But Microsoft is different, too, and though Microsoft has lost a great deal of momentum, it still controls the desktop and most of the notebook market.

Second, even if Microsoft were to somehow prevail against Android, they still need to face themselves. For decades now, Microsoft has been its own biggest enemy with their dogged determination to use the big old Windows OS everywhere, whether it was suitable or not. Sure, they deviated a bit here and there, but whatever they tried elsewhere (Windows CE, Auto PC, special versions of Windows, etc.) always was sort of half-hearted and primarily designed not to encroach on Windows proper. So I just cannot see how a version of Windows 7 or 8 retrofitted to sort of fit onto tablets would meet with much more success than Windows for Pen Computing or the Windows Tablet PC Edition.

Third, there's a digitizer predicament. From the very dawn of pen computing, starting with the earliest tablets, virtually all serious tablet computers used an "active" digitizer, i.e. the kind that lets you write smoothly and accurately onto the surface of the display as if it were a sheet of paper. Active digitizers allow for very precise drawing, writing in "digital ink," and also for handwriting recognition (which really works much better than most give it credit for). Active pens do not need actual physical touch for the digitizer to know where the pen is, and that makes them great for popping up pulldowns or explanatory balloons and such before committing to a touch that might trigger an action. Problem is, capacitive touch cannot do that. Sure, you can write with your fingers, but not in any meaningful way. For that you need a pen.

And the digitizer predicament doesn't end there. A lot of the tablets (and convertibles) sold into vertical and industrial markets are going to be used outdoors where there are pesky things like bright sunshine, all sorts of reflections, rain, snow, dust, physical impact, and people wearing gloves. Capacitive touch displays can handle some of those, but not all. Possible answers are offering a variety of optional digitizers, or a combination of them. Both approaches increase costs, and they have their limits. And the underlying OS platform determines what kinds of digitizers make sense. For example, you can operate Windows quite well with a resistive digitizer, but Android really needs capacitive touch. Anyone who needs to write or draw on a tablet needs either an active or a resistive digitizer, and won't benefit from the wonders of touch-based zooming, panning and swiping, unless touch is combined with either active or resistive technology.

The final, and greatest, problem is that the iPad has irrevocably changed what users expect from a tablet. If you give someone a tablet these days, they simply expect to be able to quickly zoom in and out in a browser, and they use two fingers to do it. If that doesn't work, or only works poorly, well, why doesn't this work like an iPad? This, then, is the danger facing everyone who makes a tablet that looks just like an iPad: it must also work as well as an iPad. Or almost as well.

We'll probably have some answers soon. We'll soon know if Microsoft's answer to the iPad will simply be putting Windows 7 on tablets, or if they've learned from past mistakes. We'll soon know how successful Android will be in making major vendors truly commit to it. And we'll soon know whether HP will seriously try to add another option with the WebOS they got when they bought Palm.

It should be interesting.

Posted by conradb212 at 08:30 PM | Comments (0)

September 03, 2010

What are discrete graphics, and why would you need them?

If you follow the mobile computing beat, you've probably come across the term "discrete graphics." What that generally means is a computer's graphics display capabilities that are a separate sub-system and not part of the motherboard or, more recently, processor. Why should you care?

Because as with almost everything else in life, one-size-fits-all only applies to a certain extent. Most computers take the one-size-fits-all approach, offering a set of features and performance that is good enough for most intended applications. Most, but not necessarily all. In graphics, that means that your standard mobile computer can handle all the usual functions such as communications, browsing, office apps and most media. However, the one-size-fits-all graphics capabilities that come with a system may struggle with more demanding applications such as advanced 3D graphics, CAD, GIS or other graphics-intensive tasks.

Discrete graphics, while uncommon in mobile systems, are a standard part of almost all desktop and many notebook computers. Starting with the earliest IBM PC, computers had separate graphics cards that handled the moving of pixels on a display. When users wanted more resolution on their IBM PCs to run Lotus 1-2-3 than standard 640 x 200 CGA offered, they popped in a Hercules graphics card that boosted resolution all the way to 720 x 348 pixels and made charting faster and more impressive.

Over time, graphics "cards" became increasingly powerful graphics subsystems that provided a very significant boost in capabilities and performance. Top of the line graphics "cards" can cost as much as or more than the rest of the computer combined, and they often have their own heat sinks and fans. Graphics subsystems in notebooks are usually less conspicuous and they can even be integrated into boards, but they are still often differentiators between low-end and high-end versions of the same computer.

Now who needs "discrete" graphics? Not everyone. In the past, separate graphics subsystems or cards often offered higher resolution than standard built-in graphics. That's because old-style CRTs were able to support multiple resolutions. LCDs are different in that they are designed as a matrix of so and so many pixels, and that is the "native" resolution that results in the crispest picture. Most integrated graphics are more than capable of running an LCD in its native resolution, and since the LCD doesn't support higher resolutions, there is no need for a graphics card that can drive more pixels.

However, resolution isn't everything. Over time, computer graphics have evolved into a science with numerous standards and technologies. That's especially true in the areas of shading, rendering and manipulating 3D objects. This goes way beyond simply making pixels appear on the screen in a certain color and brightness. Games, for example, can require huge amounts of graphics computing power to make objects and 3D action look as lifelike as possible. 3D modeling and visualization likewise can require vast amounts of graphics computing power.

How does all of this affect mobile computing? Well, mobile systems cannot possibly accommodate top-of-the-line graphics for the same reason that they cannot provide top-of-the-line desktop performance: power and heat. So just like mobile systems must have a carefully designed balance between performance, weight, size, cost and battery life in their choice of processors, the same goes for their graphics sub-systems. Up to now, for the most part, the processor handled processing and its complementing chipset handled graphics. And the graphics part of those chipsets came from third parties that specialize in graphics, such as nVidia or ATI.

Until recently, most mobile systems made do with the "integrated" graphics capabilities inherent in their chipsets. These designs share system RAM and have some limitations. Some higher end or specialized devices had more powerful graphics to speed up certain applications.

With the advent of Intel's 2010 Core processors, the game changed somewhat because Intel integrated the GPU (graphics processing unit) right into the CPU package. Intel claims this improves efficiency, speed and stability while graphics chipmakers probably view it as an Intel land grab designed to assert even greater control without, however, being able to provide the graphics performance some customers require. Both sides have their points, but one thing hasn't changed: a separate "discrete" graphics sub-system will still outperform one-size-fits-all integrated graphics, and may also provide graphics functionality not included in a standard integrated system.

But what does it all mean in the real world?

RuggedPCReview.com recently had a chance to benchmark test two mobile computers that offered discrete graphics on top of whatever came integrated into the chipsets. One of them was the General Dynamics Itronix Tadpole Topaz, a high-end "COTS" (Commercial Off The Shelf) notebook designed primarily for military applications. It came with an nVidia GeForce 8600M GT graphics sub-system. The other was a Panasonic Toughbook 31 equipped with discrete ATI Radeon HD5650 graphics. Both machines ran Intel chips at a clock speed of 2.53GHz. However, while the Topaz uses an Intel Core 2 Duo T9400 processor without integrated graphics, the Toughbook employs an Intel Core i5-540M with integrated graphics that can either be turned on or off via BIOS settings.

The two machines are not directly comparable as they address somewhat different markets. However, when comparing the GD-Itronix Topaz with a GD-Itronix GD6000 that runs the same processor but does not have discrete graphics, the Topaz substantially outperformed the 6000 both in 2D and 3D graphics benchmarks, and absolutely blew it away in an OpenGL benchmark, by about a factor of 12:1. Now, OpenGL (Open Graphics Library) refers to a cross-language, cross-platform API for 2D and 3D graphics and is widely used in CAD, simulations and visualizations. If a customer has applications that use OpenGL code, then having OpenGL-optimized graphics is absolutely mandatory.

While the Topaz used its nVidia graphics full-time, the Panasonic's discrete ATI graphics can be switched on and off. Why would one want to switch off presumably superior graphics? For the same reason that in a vehicle you wouldn't want four-wheel drive or a turbo engaged all the time when you don't really need it. Such performance boosters for special purposes can have a very negative impact on fuel mileage, and that, for now, is no different with discrete graphics. Panasonic quotes up to 11 hours of battery life with the discrete ATI graphics off, but only 5 hours with them on. That is a big difference.
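To put those runtimes in rough perspective, here's a back-of-the-envelope sketch in Python. The battery capacity is an assumed round number for illustration only (Panasonic's actual pack may differ); only the quoted 11 and 5 hour runtimes come from Panasonic.

    # Rough estimate of what the discrete GPU costs in watts.
    # The 87 Wh battery capacity is an assumption for illustration,
    # not a published Toughbook 31 spec; the runtimes are Panasonic's.
    BATTERY_WH = 87.0          # assumed battery capacity in watt-hours
    HOURS_IGP_ONLY = 11.0      # quoted runtime, discrete graphics off
    HOURS_DISCRETE = 5.0       # quoted runtime, discrete graphics on

    draw_igp = BATTERY_WH / HOURS_IGP_ONLY       # ~7.9 W average draw
    draw_discrete = BATTERY_WH / HOURS_DISCRETE  # ~17.4 W average draw
    print(f"Discrete graphics add ~{draw_discrete - draw_igp:.1f} W")

Under those assumptions, engaging the discrete GPU roughly doubles the machine's average power draw, which is exactly why a switch to turn it off is so valuable.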

So what do discrete graphics get you in a modern Core i5 machine like the Toughbook 31? Not surprisingly, in day-to-day use, you probably would hardly ever notice the difference. But as soon as you get into 3D graphics and such, the ATI boosted performance by about a third, a very noticeable difference. The real payoff, again, comes in with OpenGL, where things happen more than four times as fast. That's the difference between barely tolerable and actual, real work.

Bottom line? For now at least, if your application requires speedy 3D graphics or includes a lot of OpenGL code, discrete graphics is almost a must. It's a bit of a dilemma as Intel is clearly trying to eliminate third party separate graphics and probably doesn't pay much more than lip service to easy integration of external GPUs. This uneasy relationship may or may not contribute to the steep drop in battery life with discrete graphics engaged, but if battery life is an issue, it's certainly good to be able to engage discrete graphics only when needed, or when the machine is plugged in.

Posted by conradb212 at 05:10 PM | Comments (0)

August 19, 2010

New Intel Atoms, and how Oracle is helping Microsoft

So Intel has added two more processors to its ever-growing family of Atom processor products with all its many branches and suggested applications.

The new chips are the single-core Atom D425 and the dual-core Atom D525, both of which run at 1.8GHz, representing a small step up from the existing 1.6GHz D410 and D510. Thermal Design Power remains at 10 and 13 watts, and the stated quantity prices of US$42 and US$63 are also the same as those of the earlier chips (which, however, enjoy "embedded" status). There is one notable difference: the two new chips support DDR3 SODIMM, and Intel is promoting them for home and small business network storage devices.

To put things in perspective, unlike the Atom N270 that made the netbook explosion possible and accounts for tens of millions sold, and unlike its N450 (and N455/N475) successor, the Atom "D" processors are the ones Intel targeted for "nettops," i.e. really inexpensive desktop PCs. From what I can tell, that strategy didn't pan out as there aren't too many desktop PCs that use the D410 or even the dual-core D510. Why? Perhaps desktops and notebooks are so inexpensive these days that consumers see no reason to get a machine with anything less than a "real" Intel chip (i.e. a Core 2 Duo or one of the new Core i3/i5/i7 chips). So one explanation for the new D425/D525 is that Intel is trying to salvage the Atom "D" by giving it DDR3 support, even if it's only for SODIMM, and targeting the chips at network storage systems, whatever exactly that is in real life.

A bit more commentary about the current Atom situation: there are now no fewer than ten versions of the Atom "Z" processor, ranging from the anemic 800MHz Z500 to the considerably more powerful 2.13GHz Z560. On the market side, the vast majority of Atom "Z" processor-based products we're seeing are using the 1.6GHz Z530, which, after all this time, still seems to be deployed pretty much interchangeably with the Atom N270 (there are products that offer both N270 and Z530 versions, and there are some which switched from one to the other and vice versa). In real life, I've never actually seen a product that uses the Z550 or Z560, which is odd, as even the Z540 gave the one Z540 product we benchmarked (the Panasonic H1 medical market tablet) a small but noticeable performance edge over the competition.

But what about the new Atom "Moorestown" chips, a next iteration of the "Z" processors that should finally allow Intel to be competitive in the smartphone and such market? Well, apart from their announcement in May of 2010, we haven't heard another thing, whereas ARM et al. get all the publicity.

So it's hard to figure out what to make of Intel's Atom efforts to date. On the one hand, there are the millions and millions of netbooks sold, but while everyone loved the low, low prices of netbooks, few were ever dazzled by netbook performance, especially in the graphics area. With the iPad showing what can be done with a nominally much slower chip (the 1GHz A4), and netbooks getting ever closer to low-end notebooks, it's hard to see where that's headed.

Anyway, so what about Android? It's interesting to see what difference a week can make, and in this case the difference is the lawsuit Oracle threw at Google over Android. The suit is arcane and I am not even going to try to present details (it has to do with Oracle now owning SUN, which owns Java, and Android is supposedly using part of Java in a way Oracle does not approve of), but the mere fact that one 800-pound gorilla sues another 800-pound gorilla over a platform that up to that suit had almost unprecedented momentum is reason for concern. I mean, if you're a developer, you'd probably stop work and watch while Godzilla and Mothra duke it out. And while you're taking a breather, you may have time to cool off a bit over Android and realize that for now at least it's really little more than a smartphone OS, and that it's already awfully fragmented. And also that businesses may find it difficult to trust things that come out of the current implementation of the Android App store.

This may all blow over and the two may come to an agreement, but let's realize that Oracle is in an entirely different class than SCO, which a few years ago tried to claim exclusive ownership of all things UNIX and Linux. It's hard to see everyone all of a sudden stop making Android phones, but if this escalates, you'll see a lot of the companies that announced Android-based "iPad killers" delay their plans.

So who may be laughing all the way to the bank? Microsoft. Nothing could please Microsoft more than seeing Android derailed. Microsoft's own mobile plans are a mess at this point, at least as far as outsiders are concerned. There may at some point again be some sort of cohesive Microsoft mobile strategy and attractive product lineup, but there isn't one now. So the more time Microsoft has to communicate a clear plan and show some real products, the more likely it is to get back into the mobile game.

And HP has a dog in the race as well. HP is already the mightiest computer company in the world, and its immediate fate is not affected by what happens to Android. However, HP is also the company that fumbled away the iPAQ brand and today has essentially no presence in the mobile/smartphone market. But they bought Palm and all of Palm's cool IP, and so if there ever was a time to make a strong push for WebOS, it's now.

We'll see what happens, both with the Atoms and with Android. Who needs reality TV shows with all this stuff going on?!

Posted by conradb212 at 04:58 PM | Comments (0)

August 11, 2010

Android contemplations

Off the cuff, the way I see it is that Android has a better than even chance of becoming the OS of choice for tablets and other mobile devices. Android is really nothing more than another Linux distribution, but one backed and sort of run by Google. Microsoft, of course, will make the usual argument of leverage and security and integration into other Microsoft products, but the fact is that Linux itself can be at least as secure as anything Microsoft makes. Just look at the Mac OS, which is also Unix-based, and Linux is closely modeled on Unix.

As is, Android is still very much a smartphone-oriented OS. But since it is just a shell on top of Linux (Google might object to that simplification), it can very quickly be adapted to almost any platform. For example, I simply downloaded a stable version of Android, created a bootable version on a USB key, and then booted some of the tablets and netbooks in our lab with it. The hardware never knew the difference and almost everything worked right off the bat, including WiFi. Adapting touch drivers and a few other things would be very simple.
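For anyone curious what that entails, here's a minimal sketch of the image-writing step, assuming you already have a bootable Android-x86 live image downloaded; the file and device paths below are placeholders, and writing to the wrong device node will destroy its contents.

    # Minimal sketch: copy a bootable Android-x86 image byte-for-byte
    # onto a USB key -- the same job a tool like dd performs. Paths are
    # placeholders; run as root and triple-check the device node, since
    # whatever is on it gets overwritten.
    import shutil

    IMAGE = "android-x86.img"   # hypothetical downloaded live image
    DEVICE = "/dev/sdX"         # placeholder for the actual USB device

    with open(IMAGE, "rb") as src, open(DEVICE, "wb") as dst:
        shutil.copyfileobj(src, dst, length=4 * 1024 * 1024)  # 4 MB chunks

Once written, most x86 machines will simply boot from the key if the BIOS boot order allows it, which is all it took to get Android running on the hardware in our lab.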

The argument against Android is the same that people use against Linux: it's in the public domain. The program you need most may have been written by some guy from Leipzig or Buenos Aires, and that guy may have decided to ditch the code and move to Nepal. The reason why Microsoft has a stable platform is because they control it all, and the reason why iPhone/iPad apps are so very cool and polished is because you ONLY see what Apple examined and approved.

So Android's (and thus Google's) challenge will be to create the semblance of a strong, unified PRODUCT called Android, something people can rely on, and not something where a poorly written manual tells you that you first need to rebuild the kernel with the -fxuOie switches turned on for the app to run. That will be a challenge.

However, none other than General Dynamics Itronix has just released a handheld running Android. That would indicate that Android may be ready for prime time. And even if it isn't, and many questions remain, there's so much buzz and there's Google behind it. That alone will give anyone who offers Android or talks Android a strategic advantage.

Oh, and manufacturers offering both handhelds and tablets/notebooks would finally have the advantage of running the same OS on all their platforms, and not a mobile and a full version of an OS as has been the case with Windows and Windows CE/Mobile.

Posted by conradb212 at 06:15 PM | Comments (0)

June 17, 2010

Handheld Group Business Partner Conference 2010, Stockholm

Much to my surprise, the Handheld Group invited me to do a presentation at their annual Business Partner Conference in Stockholm, Sweden. The Handheld Group is an international supplier of rugged mobile computers, including handheld terminals and tablets, and they've carved themselves a nice niche with a lineup that includes specialty devices as well as tailored solutions for a variety of uses. The annual conference is meant to provide a venue to socialize with business partners and inform them on products, outlook and opportunities.

I've always liked these sorts of conferences as they provide a great way to talk with executives and product managers, and see the latest lineups all in one place. So I accepted Handheld's invitation and prepared a presentation on "Trends and Concepts in Mobile Computing."

Getting flights these days is a real pain. The standard fares are outrageously high and apparently geared towards business travelers with unlimited expense accounts. I have low fare alerts for most of the destinations I potentially travel to, but those can be an exercise in frustration as those low, low fares are hardly ever actually available. I ended up paying several times the teaser fares, and that was with long layovers and a schedule convenient for the airlines, but not for me. In an era where a web page in London, Tokyo or Stockholm loads as quickly as one next door, we tend to forget how very far away those places actually are.

After landing at Stockholm's nice Arlanda airport I booked a ticket on the super-fast bullet train to downtown Stockholm, then, since it was a glorious morning, walked the 5K or so to the Elite Hotel Marina Tower where the conference was held. That was fun, though I was quite addled with jet lag, and the little wheels on my carry-on probably didn't like the cobblestones of old-town Stockholm much.

Much to its credit, the hotel let this weary traveler check in at 10:30AM, and so I took a long nap in my nice hotel room that looked like it came right out of an Ikea showcase. Then it was off to registration and meeting my hosts. They were all friendly as can be, and I noticed that most Swedes indeed are blond. I met Sofia, my main contact at Handheld HQ, then Jerker Hellström, CEO and Chairman of the Handheld Group, and Thomas Löfblad, not blond, who thanks to a course of study in the US has less of an accent in his English than I do after 30+ years. The two welcomed the assembly to the conference, introduced the business partners, and kicked off the cocktail party mingling.

The Handheld folks did a great job making everyone feel at ease, and so I soon had interesting conversations with the mostly European attendees as well as some from as far as Australia. I found quite a few veterans of the old Husky Computers that was later bought by Itronix -- not surprising, as I learned, since the privately held Handheld Group had once gotten its start as the Scandinavian representatives of Husky. I got a chance to meet Daniel Magnusson and Nina Hedberg of RAM Nordic AB, which had the patented RAM Mount solutions on display; the Sacci folks with their numerous bags, harnesses and other clever ways of carrying around and using handheld computers; SIGMAX with their law enforcement and ticketing solutions; Brodit with their very clever mounting solutions; I got a demonstration of the impressive mobile device management solutions by The Institution, and spent time with all the other cool stuff there. I also had a chance to finally meet the HHCS Handheld USA team, including Mike Zelman, Dale Kyle, and my ever-helpful contact, Amy Urban.

Given the 9-hour time difference compared to California, I slept remarkably well and woke up refreshed and ready for a day of conferencing. CEO Hellström gave an overview of the company, its total and exclusive dedication to rugged computing, and its pride in being the fastest growing IT company in Sweden (Handheld actually grew during the difficult year of 2009). Hellström described Handheld's "virtual production model" where the company's engineers generate specifications and design, then have the products made by a production partner, and launched either alone or with a partner. He highlighted the company's products and special solutions, as well as the newly introduced Algiz 7 rugged tablet.

Next came an excellent presentation by David Krebs, who is the director of mobile and wireless research at VDC and a frequently quoted authority on all things mobile as well as a compatriot who grew up a few short kilometers from my original home in Zurich, Switzerland. David described the current mobile technology market as "in a state of rebound" after a serious setback in 2009. He pointed out that technology penetration in many mobile areas is still only 20, 30 or 40%, leaving plenty of potential opportunity, and predicted annual handheld revenue growth of 7.5% through 2014. David also highlighted the significant advantage of rugged versus non-rugged handhelds and tablets in terms of failure rates, resulting in substantially lower TCO (total cost of ownership), certainly a big selling point on the road to recovery.

I had decided to go out on a limb and run my presentation on my iPad via Apple's iPad dock-to-VGA adapter. This worked just fine, using Apple's US$9.95 iPad version of Keynote, which is Apple's equivalent of PowerPoint. In my presentation I discussed some of the concepts and trends in mobile computing, ranging from processors, to outdoor viewable displays, to digitizers, operating systems, and emerging new technologies. Murphy's Law struck when, stunningly, the frame of my reading glasses broke right in the middle of my presentation, forcing me to continue with one hand holding up my glasses and the other operating the iPad. Fortunately, I had a clip-on mic or else I'd have needed a third hand.

After that Thomas Löfblad discussed the Handheld Product line that by now includes over a dozen state-of-the-art handhelds and tablets, as well as printers and accessories. All of the newer products are carrying Handheld's own Algiz (tablets) and Nautiz (handhelds) brand names.

After lunch, Mr. Hellström discussed the product roadmap for the year ahead, with the full rollout of the newly introduced Algiz 7 tablet, a second generation Algiz 8, and a glimpse at an upcoming new product that will extend Handheld's line into a new class of devices. The company took the opportunity of the partner conference to get feedback and commentary on the new form factor, the proposed features, functionality and price.

After face time with the new product, we heard about Handheld's plans for moving forward. Sofia Löfblad talked about how the company can support its partners with case studies, advertising support, loaners, product reviews, a special support website, and several other programs. Service and Support Manager Max Dahlbom then did a humorous, energetic presentation on service, warranty, care levels and support, all geared towards helping and supporting customers and stressing the importance of good service as a differentiator. Thomas Löfblad then addressed issues such as the impact of the weak Euro, the company's MaxFreight service, insurance issues and also some product updates.

The final presentation came from Dean Lindsay, a motivational speaker and best-selling author (everyone attending got a copy of Dean's highly recommended book "The Progress Challenge") who, engagingly and entertainingly, talked about common sense concepts of attracting and fostering business and sales.

What followed was an absolutely delightful four-hour cruise of the Stockholm waterways aboard the M/S Riddarholmen. We were greeted onboard with champagne; GPS-equipped Algiz 7 tablets were mounted in several locations and provided mapping and navigation data; and the food and drink were delicious, as was the varied scenery passing by.

There was again ample opportunity to mix and mingle, compare notes, and talk with people from the Handheld Group as well as partners and customers. The weather played along with bright sunshine all conference long (which apparently no one expected), and then some dramatic clouds at dusk. Unusual for those of us not living in northern latitudes, it didn't really get dark until way late into the night, and daylight remained even after we got back to the dock at 11PM.

I missed out on Stockholm sightseeing the next morning as I had to grab a cab to the airport for my trip back to California. Long though the flights back were, and the 8-hour layover at Chicago O'Hare, it gave me an opportunity to reflect on a side of business we often forget or take for granted, the people side. There are lots of great products out there, all able to do amazing things. But it takes people with vision and drive and competence to form companies that can pull it all together, picking a lineup of compatible products for a well-defined purpose, then marketing, selling, supporting and servicing those products. In the end, that's what it's all about, dealing with people you know and trust, folks who've been there and will be around, and who know their business. That's the impression I got from the Handheld Group. Good company, good people.

Posted by conradb212 at 07:17 PM | Comments (0)

May 28, 2010

4G

Pretty soon everyone will be talking about 4G. Who has 4G and whose 4G is better or faster. Somehow, marketing from all wireless camps has latched onto the cool-sounding terms 3G and 4G, though they're avoiding the "3.5G" or "3.75G" you often find in tech specs. That's probably because three and a half sounds like not quite four.

Anyway, Sprint is now making noises about 4G and you can actually buy 4G smartphones using the Sprint network. Since Sprint is a bit on the ropes, being first may not mean all that much, but it's still good to know how things developed and where they are headed.

A couple of years ago, a product manager from one of the rugged computing manufacturers asked me what I thought of WiMAX. At the time WiMAX was a buzzword for really fast next generation wireless. I told him it was probably too early to worry about it and it wasn't clear what would happen. This is still true in 2010, but WiMAX is now available as "Sprint 4G," though technically it's the same network as the CLEAR-branded 4G network from a company named Clearwire.

Who's behind Clearwire? None other than Craig McCaw, the trailblazer who once built McCaw Cellular and then sold it to AT&T in the early 1990s. Where does Sprint fit in? Well, McCaw and Sprint both owned spectrum in the 2.5GHz band (in fact, they owned almost all of it) and decided to pool resources, with Sprint making additional investments until they were the majority stockholder of Clearwire.

Does this mean it'll be clear sailing for Clearwire/Sprint in the emerging 4G arena? Not really. Problem is, this time both AT&T and Verizon are backing an alternate 4G technology called LTE, which stands for Long Term Evolution (how do they come up with these terms?!). LTE uses the 700MHz band and physics dictate that waves in that spectrum can go farther at lower power levels and also have less trouble being received in buildings. This potentially means that LTE, in addition to being backed by the two largest wireless providers, also costs less to deploy.

For now, Sprint and Clearwire claim that their 4G WiMAX network allows mobile download speeds of 3 to 6 Mbps with speed bursts over 10 Mbps, and uploads up to 1 Mbps. Sprint and Clearwire sell both 3G/4G Mobile Hotspots (made by Sierra Wireless) and 3G/4G USB modems that look like standard USB memory keys. Clearwire offers unlimited usage for US$40/month plus a small lease fee for the modem (you can also buy it). That sounds like a good deal, but for now the coverage map is quite limited.
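To make those numbers concrete, here's a small Python sketch computing how long a download takes at the claimed speeds. The 100 MB file size is just a hypothetical example.

    # Download time for a hypothetical 100 MB file at the quoted speeds.
    FILE_MB = 100
    for label, mbps in [("low end", 3), ("typical", 6), ("burst", 10)]:
        seconds = FILE_MB * 8 / mbps   # megabits divided by megabits/second
        print(f"{label:>8}: {seconds / 60:.1f} minutes at {mbps} Mbps")

At 3 Mbps that 100 MB file takes about four and a half minutes; at a 10 Mbps burst, under a minute and a half.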

More speed is exciting, but at least based on my various devices that use 3G, I'd be thrilled to have reliable 3G coverage wherever I go before I jump to 4G.

Posted by conradb212 at 03:57 PM | Comments (0)

May 19, 2010

Intel vPro technology—what is it all about?

If you follow chipmaker Intel, you know that the company not only loves code names, but also special technologies that are then used to market certain chips or chip families. At some point it was "with MMX" that made Intel Pentium chips special in hilarious commercials showing Intel engineers in astronaut suits. "Hyper-threading" was big for a while, and for the latest families of Core processors, Intel stresses "Turbo Boost." Another Intel technology that gets a little less attention is vPro, but vPro is now becoming part of the marketing message of some ruggedized mobile computing products that have been upgraded to include Intel's latest Core i5 and Core i7 processors.

So what is vPro all about?

vPro is an Intel technology platform that allows remote access to a PC regardless of whether the computer is booted up or the power is even on. It is intended to allow remote management, monitoring and maintenance while maintaining strict security measures. While vPro componentry needs to be included in the processor, it's a platform rather than just a technology feature, one that requires a combination of chip, board, firmware and software. vPro also includes other Intel technologies such as Intel AMT (Active Management Technology), Intel Virtualization Technology, Intel's TXT (Trusted Execution Technology) and, of course, a network connection.

While remote access and management of PCs is commonly available through software such as VNC, VNC alone may not be capable and secure enough for all corporate purposes. With vPro, VNC can still be used, but it is now the Intel AMT part that facilitates secure communication with the PC, and in conjunction with the whole vPro platform, it is not only possible to control a remote PC, but also to start it up and—even more amazingly—log in and perform certain functions even if the OS is corrupted or missing. That's because the vPro engine/platform is available at a very low system level.

How can such vPro-based remote access be used? Well, there could be a system where dispatch sends job requests to a mobile computer in a field office or a vehicle. The request will boot the computer if it is off, and then either perform a job or prompt an operator or driver to do the job and report back. It can then turn off the computer remotely, even shut down the OS. As long as the computer still has power, it remains remotely accessible (remotely waking up a PC is usually done via a hardwire LAN connection).
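For contrast: AMT wake-ups travel over an authenticated management channel, but the conventional wired-LAN wake-up mentioned above is the old Wake-on-LAN "magic packet," which is simple enough to show in a few lines of Python. The MAC address below is a placeholder.

    # Conventional Wake-on-LAN (not vPro/AMT, which uses its own secure
    # channel): broadcast a UDP "magic packet" consisting of six 0xFF
    # bytes followed by the target's MAC address repeated 16 times.
    import socket

    def wake_on_lan(mac, broadcast="255.255.255.255", port=9):
        mac_bytes = bytes.fromhex(mac.replace(":", ""))
        packet = b"\xff" * 6 + mac_bytes * 16
        with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
            s.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
            s.sendto(packet, (broadcast, port))

    wake_on_lan("00:11:22:33:44:55")  # hypothetical target MAC

Wake-on-LAN only powers a machine up, though; it offers none of the authentication, out-of-band access or OS-independent control that the vPro/AMT platform adds on top.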

Panasonic highlights vPro in the recent introduction of its Toughbook 31 rugged notebook that uses the vPro-enabled Intel Core i5 processor. Panasonic even features a video that shows the use of vPro technology between a dispatch with a vPro console and a Toughbook-equipped service truck. It demonstrates how the remote console can wake up the Toughbook, run a job, then shut it down again.

Motion Computing, too, stresses the advantages of vPro in their announcement of the upgraded Motion C5v and F5v tablet PCs, stating that vPro technology will enable their customers to experience enhanced remote management capabilities so IT can secure and/or repair a tablet from any location, even with power off.

So that's vPro, a set of technologies to remotely access and control computers securely. While remote access and control is not new, being able to do it securely, with the power down and no OS booted, can definitely come in handy. Not everyone will need or use vPro, and setting things up for remote access and management is not entirely trivial, and so most users will simply enjoy the very significant performance increases of Intel's latest Core i5 and i7 processors.

Posted by conradb212 at 04:18 PM | Comments (0)

May 06, 2010

"Moorestown" — Intel's new Z6xx Atom platform and how it fits in

On May 4th, Intel introduced the next generation of its initial family of Z5xx Atom processors. Codenamed "Moorestown," the Z6xx family, together with a new I/O controller and signal processing chip, is meant to make Intel competitive in the booming smartphone and internet access device market. On paper at least, the new processor family looks very good and may yet help Intel establish itself in the device market (which, interestingly, they abandoned when they sold the XScale application processor business to Marvell a couple of years ago). But before we go into details of Moorestown, let's backtrack and see how Intel's whole Atom venture began and developed.

"Silverthorn" and "Diamondville"

The Intel Atom processors have been around for over two years now. Initially, Intel launched two different product lines, the Z5xx "Silverthorne" processors geared towards mobile internet devices (MIDs), and the N2xx line of "Diamondville" processors for standard low cost PCs and netbook class devices.

The Z5xx versions of the Atom processor had a 13 x 14 mm package footprint and used the new “Poulsbo” System Controller Hub. The processor had about 47 million transistors—more than the original Pentium 4. Bus frequency was 400MHz or 533MHz, and the Thermal Design Power (TDP) was between 0.85 watts for a low-end 800MHz chip, and 2.65 watts for a 1.86GHz Z540 version. The chipset used about 2.3 watts, which meant total CPU and chipset consumption wasn’t even 5 watts, far less than any of Intel's standard mobile processors. And the chipset had hardware support for H.264 and other HD decoding (but required the appropriate codecs to take advantage of it!). However, as the combo was targeted for internet devices, there was only PATA and no SATA support, though SATA could be added.
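The power arithmetic behind that "wasn't even 5 watts" claim is simple enough to check; the figures below are the TDP numbers quoted above:

    # Worst-case Z5xx platform power, per the figures quoted above.
    tdp_z540 = 2.65     # watts, 1.86GHz Z540 CPU
    tdp_poulsbo = 2.3   # watts, "Poulsbo" System Controller Hub
    print(f"Combined: {tdp_z540 + tdp_poulsbo:.2f} W")  # 4.95 W, under 5 W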

The Atom N2xx "Diamondville" family, released a bit later, was very similar to the Z5xx, so much so that to this day, I've yet to find someone who can convincingly describe why a manufacturer would pick one or the other, or what truly differentiates the two families. The N2xx was a bit larger, measuring 22 x 22 mm, and the most popular model—the 1.6GHz N270—also had a, for Intel, very low Thermal Design Power of just 2 watts. The N2xx processors did not come with a newly designed chipset, but used lower power versions of the standard Intel 945 chipset and a separate ICH7M I/O chip. There was no HD decoding or hardware acceleration, but the chip did support the SATA interface.

For cost and power conservation reasons, the initial Atom processor families did not use two cores. Instead they used Intel’s older HyperThreading technique that can process two threads, yet increases energy usage by only about 10%. Intel also developed a more power-efficient bus and a cache that could be disabled when it was not needed. The Atom Z5xx further used a new "Deep Power Down" C6 state, and similar advanced power management was available in the Atom N2xx.

What happened next was interesting. While Intel probably had high hopes for the Z5xx chips in the emerging "mobile internet device" market, it was the "Diamondville" processors, and more specifically the 1.6GHz N270, that almost singlehandedly created the new category of "netbooks" (well, the term had been used before, but never to describe a separate class of mobile computers). Despite the N270 chip's modest performance, consumers bought millions and millions of those little netbooks, most likely because of the low price that made netbooks an impulse buy as opposed to spending more for a "real" notebook computer.

The N270, however, was the sole bright spot on both sides of the Atom family, as neither the desktop-oriented N230 nor the entire mobile internet device Z5xx family did much of anything. The Z5xx chips were used in some industrial products like computers-on-modules, small form factor CPU boards, industrial tablets (such as the Handheld Algiz 8, the Mobile Demand T7000, the Logic Instrument Fieldbook, or the WinMate I980), MCAs (such as the Panasonic H1 or the Advantech MICA-101), or clamshell UMPCs (such as the Fujitsu UH900), but by and large there seemed to be no truly compelling reason to go with Silverthorne.

"Diamondville" gets a little boost

On the Diamondville netbook side, the problem with the Atom N270 was that despite being used in all those netbooks, it was barely powerful enough to drive even those small, inexpensive computers. Anyone trying to do video or games on a netbook came away sorely disappointed. As a stop-gap solution, Intel released the very slightly more powerful N280 (1.66GHz clock speed instead of 1.6GHz) for netbooks, and the dual-core N330, which was really a dual-core version of the little-used desktop N230. With Atom video performance lagging, NVIDIA came up with the NVIDIA Ion Graphics chipset that was supposed to work better with Atom N-Series chips than Intel's own chipset, but it didn't come in time to make it into any of the first generation netbooks.

"Silverthorne" gets tougher

For embedded computing, in March of 2009 Intel quietly expanded the Z5xx platform with larger form factor versions that carried a "P" in their name, and then a special "large form factor with industrial temperature options" version marked with a "PT." This added the Atom 1.1GHz Z510P and 1.6GHz Z530P as well as the 1.1GHz Z510PT and 1.33GHz Z520PT. The P and PT versions used a larger 22 x 22 mm package (which is the same size as the N2xx chips) with a different "ball pitch"—the spacing of the little balls of solder that replace pins on the underside of these tiny processor packages. That was probably done because the 0.6mm ball pitch of the original Z5xx series required high density interconnects (HDI) on the printed circuit boards, and those are more difficult to do and also more finicky, not what you'd want in the kind of rugged devices the chips were actually used in. As far as temperature range goes, 32 to 158 degrees Fahrenheit is considered "commercial," whereas -40 to 185 degrees Fahrenheit is considered "industrial." Interestingly, only the "PT" series processors support the industrial temperature range; the "P" series versions are listed with the same commercial temperature range as the initial chips.
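Those Fahrenheit figures map exactly onto the standard embedded temperature grades, as a quick conversion shows:

    # Convert the Fahrenheit ranges above to the usual Celsius grades.
    def f_to_c(f):
        return (f - 32) * 5 / 9

    for grade, (lo, hi) in {"commercial": (32, 158),
                            "industrial": (-40, 185)}.items():
        print(f"{grade}: {f_to_c(lo):.0f} to {f_to_c(hi):.0f} deg C")
    # commercial: 0 to 70 deg C
    # industrial: -40 to 85 deg C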

RuggedPCReview's assessment in 2009 was that "the moral of the Atom story is, at least for vertical market manufacturers: pick an Atom chip that Intel is likely to support for several years, and make certain the drivers are fully optimized and all the power saving features are fully implemented. Atom can deliver superior battery life and acceptable performance, but manufacturers must carefully target those products so customers won't be disappointed. We've seen Atom-based machines that use hardly less battery power than devices with much more powerful processors. That won't do. And we've seen some where non-optimized graphics drivers made the machines painful to use."

"Diamondville" begets "Pinetrail"

In December of 2009, Intel announced the next generation of Atom processors, or really the successor of "Diamondville." The new "Pinetrail" generation of Atom processors included the single core N450 (heir to the N270) and, adding yet another letter class, the single core D410 and the dual-core D510, both meant for low-end desktops. The big news here was that Intel reduced the chip count from three to two by integrating the graphics and memory controller into the CPU itself. The old ICH7M I/O controller chip was replaced with the Intel NM10 Express. That meant fewer chips to mount, somewhat lower power consumption, and—not mentioned by Intel—one less reason to seek third party chipsets such as NVIDIA's Ion. Reducing the chip count from three to two was nice, but the Z-series processors already had that. Graphics seemed somewhat improved, but not enough to make a huge difference, and there was still no HD playback hardware support. Our assessment was that we could not "help but feeling that Intel looked out for itself more than adding compelling value for consumers."

So for now, the N450 and the slightly faster 1.83GHz N470 are taking care of the netbook market, but what of the ever more important MID and smartphone market that Intel tried to address with Silverthorne? By now it was very obvious that Silverthorne had zero impact on that market and no one was going to base a smartphone or anything like it on an Atom Z5xx chip. Intel might have suspected as much, as even in the early days of Atom, their roadmap included codename "Moorestown," a system-on-a-chip platform.

Silverthorne replaced by "Moorestown"?

Well, Moorestown was officially introduced on May 4th, 2010. It includes the "Lincroft" Z6xx series of Atom chips, the "Langwell" Platform Controller Hub MP20, and the "Briertown" Mixed Signal IC (yes, Intel loves its code names). In its press release, Intel mentioned "significant power savings while increasing performance" in a design scalable across a range of devices including "high-end smartphones, tablets and other mobile handheld devices."

So what does Intel promise for the Z6xx platform? Nothing very specific as of yet. Power "breakthroughs" include much lower power consumption at idle and with audio active (i.e. music playing), and a 2-3X reduction while browsing or playing video. That's good. Intel also promised a full 1080p video experience (really already possible with the Z5xx chips, albeit perhaps not "full"), with clock speeds up to 1.5GHz and low-power LPDDR1 memory for smartphones, and 1.9GHz and faster DDR2 memory for tablets (current Z5xx series chips range from the 1.1GHz Z510 to the 2.0GHz Z550). Intel highlights that the new chips result in a greater than 40% reduction in package area and a greater than 50% reduction in board area for the Z6xx and MP20, so their combined package real estate is less than 400 mm2, and the board area required less than 333 mm2. The new "Langwell" Intel Platform Controller Hub MP20 has a package size of 14 x 14 mm (same as Apple's A4) with a 0.5mm pitch and uses 65nm technology. That's down from the 22 x 22mm Poulsbo. The Z6xx chip itself is also on a 14 x 14mm package (see below).
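The package math is easy to verify:

    # Sanity check on Intel's combined package figure: two 14 x 14 mm chips.
    z6xx_mm2 = 14 * 14    # "Lincroft" Z6xx package area
    mp20_mm2 = 14 * 14    # "Langwell" MP20 package area
    print(f"Combined: {z6xx_mm2 + mp20_mm2} mm2")  # 392 mm2, under 400 mm2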

From an architecture standpoint, the new Z6xx CPUs integrate a lot of the functionality that used to be part of the Poulsbo chipset, such as graphics, decoding/encoding, memory controller, etc., leaving the "South Complex" "Langwell" chip to concentrate on I/O. The graphics core integrated into the "Lincroft" CPU is the same as that on the older "Poulsbo" chip, but the core can now run at up to twice the frequency and has been optimized for power and performance. Video decoding remains the same, but there's now 720p H.264 and MPEG4 encoding and also H.263 videoconferencing encoding. Intel says that 3D graphics performance should double.

The "Briertown" "Mixed Signal IC" is meant to integrate components such as audio, touchscreen, voltage regulator, display and comms drivers and such. Intel stressed that it will be available from mutiple sources (such as Freescale, Maxim and Renesas).

While more performance was desirable, less power consumption was essential if the new chips were to have a chance in the device market. So Intel did some major work on power states. Instead of the older system-wide power management, much greater power savings are now possible by giving each subsystem its own power management capabilities. Whenever any part of a "Moorestown" system is not needed, it's turned down or off. Intel refers to those savings mechanisms as "power islands" on both the MP20 hub and the Z6xx chip, and it's all done with an involved combination of software, hardware and firmware features. The sum total of all this is that the three chips that make up the Moorestown platform combined use less power under load than the first gen Menlow platform did when running idle.
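To make the "power island" idea a little more concrete, here is a toy sketch in Python. The island names and wattages are made up for illustration (Intel has not published per-island figures); the point is simply that per-subsystem gating lets the platform draw only what the active subsystems need:

    # Toy model of "power islands": gate each subsystem independently instead
    # of idling the whole platform at a single system-wide power state.
    # Island names and wattages are illustrative, not Intel's figures.
    ISLANDS = {"cpu": 0.60, "graphics": 0.40, "video": 0.30, "audio": 0.10, "io": 0.20}

    def platform_power(active):
        """Only the islands that are actually needed draw power."""
        return sum(watts for name, watts in ISLANDS.items() if name in active)

    print(platform_power({"audio"}))        # music playback: ~0.1 W
    print(platform_power({"cpu", "io"}))    # light browsing: ~0.8 W
    print(platform_power(ISLANDS))          # everything on: ~1.6 W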

That's impressive, but also necessary. What Intel envisions for Moorestown-based devices is a range of form factors, from smartphones to sleek tablets, with 10 days of standby, two days of music playback, over five hours of video, multi-tasking/multi-windowing/multi-point video conferencing, 1080p playback and 720p recording, and a "PC-like" internet experience.

With Moorestown Intel is clearly taking another run at a market where it is simply not represented. Apple has set the bar for smartphones and tablets very high, and really nothing less than the kind of performance and battery life found in Apple products will do. The performance of current Atom-based systems, those assisted by NVIDIA chips not included, ranges from perfectly adequate to rather anemic, especially with video. It's Moorestown's task to potentially change that.

NVIDIA likely won't be happy. Just when the first Atom N450/N470-based netbooks with NVIDIA's Ion graphics appear on the market, Intel throws another curve by integrating graphics into the very processor of the next generation of "internet device" Atom chips.

For embedded systems and rugged/vertical market systems developers, the ongoing fragmentation of the Atom platform into two families and the rapid obsolescence of the two most frequently used chips (the N270 and the Z530) are also not very good news. While more performance is always better, if the Moorestown platform turns out to be that much quicker and more economical, then products based on the older chips will suddenly have become a lot less desirable.

Now what?

As far as the future of Moorestown for smartphones and mobile internet devices goes, Intel will not only have to overcome ongoing confusion about their two Atom families, but it will also face formidable competition from the ARM processor architecture camp. That includes Nvidia's Tegra, the Qualcomm Snapdragon, TI's OMAP and others.

And then there is Apple. The iPad's A4 chip, also ARM-based, is Apple-only and thus not direct competition, but with the iPad Apple has shown what is possible with a tiny processor running at just 1GHz. The iPad is uniformly seen as an excellent performer with very quick browsing and excellent video playback. Not only does the iPad not need a fan; it simply never warms up at all, not even after running video for hours. And with the iPad's ability to run video for ten hours or more, Intel's stated goal of "over 5 hours" of video looks modest at best.

So Moorestown has a great deal to prove, and Intel has a lot to lose. If the platform succeeds, the N-Series branch of the family will suddenly look quite obsolete, which will require another tweak. If it fails, Intel's reputation of being behind in mobile chips will be confirmed yet again. No one's ever counting Intel out, but Atom, netbooks notwithstanding, has been a struggle.

Posted by conradb212 at 10:42 PM | Comments (0)

May 04, 2010

Publishing and the iPad

This has nothing to do with rugged computing, but everything with publishing and how information is presented and distributed.

As a former print publisher, I spent some time comparing different approaches to magazine publishing on the iPad. Given the amount of hype about the iPad being the savior of publishing, I am surprised there is not an iMagazine app or some such. I mean, Apple could take the lead here yet again, creating the iTunes of the magazine world.
 
As is, everyone's doing their own thing, with Zinio, of course, having the lead with its hundreds of electronic titles. Problem is, they're not doing a thing different for the iPad. The PDF versions are faithful 1:1 equivalents of the print mags and it all works well, though a slight lag until each new page snaps into focus is annoying. And I am NOT willing to fill out long, cumbersome forms with address and credit card info to subscribe to a mag when it should all be 1-click.
 
Time Magazine rolls their own, for now at the absurdly high price of $4.95 per issue. Their approach is sort of a hybrid between PDF, web design, and totally new stuff. It's very innovative, but takes some getting used to. Then again, there really is no need to simply transform print to screen, even if it's print retrofitted with electronic extras (links, video, forms, etc.).
 
So Time is experimenting. Pictures that may be tiny in a magazine due to space constraints can be large, with text below them that you scroll through. When you zoom in to make text readable, pictures don't necessarily zoom with it; they don't need to. And how cool is it to have a full-page portrait of Lady Gaga or Bill Clinton that, when you rotate to landscape, becomes flawless high-definition video in which they speak to you?
 
The iPad brings us another step closer to electronic publishing, a big one. But for now, no one is taking the definite lead. With the iBooks app and iBook store still a million miles behind Amazon, Apple probably has its hands full with filling in the many blanks, and an iMag app and store may not come to pass anytime soon, or ever. So Zinio and others have a window of opportunity, but it'll take more than selling individual mags for US$4.95 (Time) or making people put up with lag and an antediluvian 20th century style signup (Zinio).  

Posted by conradb212 at 05:54 PM | Comments (0)

April 09, 2010

Waterproofing rugged computing equipment

During the course of testing in the RuggedPCReview.com lab, we examine ruggedness specifications and claims. For the most part, while we report and comment on those specs, we do not put them to the test. That's because ruggedness testing is pretty involved business, and checking how much punishment a device can take before it fails makes about as much sense as a car magazine running a test vehicle into a concrete wall to see if it is indeed as safe as the manufacturer says.

There are, however, exceptions. If a manufacturer claims their product can be dropped from four feet without damage, we may try that. And if a product is advertised as being waterproof, we may check that claim out as well. And this is where it gets interesting.

Most rugged products have an ingress protection rating in their specs. If the IP code system is used, as defined by international standard IEC 60529, then the second digit in the code indicates protection against water. IP67, for example, means that the product is totally protected against dust (that's the "6") and also protected against immersion in water down to one meter (3.3 feet) for up to 30 minutes (that's the "7"). IP68 means protection against continuous immersion, as specified by the manufacturer.
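For readers who deal with a lot of these ratings, the decoding is mechanical enough to script. Here is a minimal sketch in Python (the helper name and the wording of the descriptions are mine; only the levels discussed here are spelled out):

    # Decode the two digits of an IP rating per IEC 60529, as described above.
    SOLIDS = {"6": "totally protected against dust"}
    WATER = {
        "7": "immersion in water down to 1 m (3.3 ft) for up to 30 minutes",
        "8": "continuous immersion, as specified by the manufacturer",
    }

    def decode_ip(code):
        """Turn a code like 'IP67' into its solids and water protection levels."""
        solids_digit, water_digit = code[2], code[3]
        return (SOLIDS.get(solids_digit, "solids protection level " + solids_digit),
                WATER.get(water_digit, "water protection level " + water_digit))

    print(decode_ip("IP67"))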

So are there mobile computers that are waterproof? The answer is yes. There is a small, but not insignificant number of systems, primarily handhelds, that carry IP67 ratings. And the marketing for those systems often includes pictures or videos of full immersion. At trade shows you sometimes see waterproof handhelds or tablets sitting in tanks, running video to show that they are, indeed, alive and unharmed.

Now it is abundantly clear that even machines that carry IP67 ratings are not dive computers and that few will ever even be immersed in water. However, given their intended use, they MAY fall INTO water, just as they may fall off a speeding pickup truck and get stepped on. Hence our occasional testing of the stated design limits and a bit beyond.

That said, as a certified scuba diver with a good degree of experience, I've come across some pretty fascinating underwater electronics that are sealed. Diving is really interesting in that pressure plays a huge role. Each 33 feet of sea water (or 34 feet of fresh water) adds one atmosphere, or 14.7 psi, of pressure. You'd think that divers get crushed down at 100 feet, but that's not so because the human body is mostly water anyway (60-80%, depending on the individual), so all we have to worry about are the air spaces inside of us (lungs, sinuses, ears, mask mostly). We equalize pressure by breathing in pressurized air that perfectly counterbalances the water pressure. The result is that even at substantial depth, your dive mask doesn't leak at all; the flimsiest of seals will keep water out as long as there is no pressure difference and as long as there is indeed a seal that keeps air and water apart.
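The arithmetic behind that rule of thumb is simple enough to put into a few lines of Python (my sketch of the rule quoted above, ignoring small real-world corrections):

    # Absolute pressure at depth: every 33 ft of sea water (34 ft of fresh
    # water) adds one atmosphere (14.7 psi) on top of surface pressure.
    def absolute_pressure_psi(depth_ft, fresh_water=False):
        ft_per_atm = 34.0 if fresh_water else 33.0
        atmospheres = 1.0 + depth_ft / ft_per_atm   # 1 atm already at the surface
        return atmospheres * 14.7

    print(round(absolute_pressure_psi(33), 1))    # 29.4 -- double the surface pressure
    print(round(absolute_pressure_psi(100), 1))   # 59.2 -- about 4 atmospheres at 100 ft

So at 100 feet a seal is holding back roughly three atmospheres of differential pressure, which is why the dive mask's "no pressure difference" trick is so elegant.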

This means that, theoretically, if there were a way to dynamically pressurize the inside of a rugged computing device, even very delicate seals (like the very thin silicone skirt of a dive mask) would be enough to keep water out even at great depth. Now obviously, no one is about to put automated compressed air pressure equalization systems into a handheld computer; that is not what such devices are for. It's interesting, though, to examine how underwater electronics ARE sealed:

- Most underwater cameras use special housings that still allow access to the camera's controls. They usually have one big O-ring seal for the housing clamshell, and then individually sealed pushbuttons.

- Recently, an exceedingly simple waterproofing method for cameras has come on the market. It simply consists of a sealed bag of clear plastic with a lens in it. It isn't protecting against pressure, but it sure keeps the water out.

- Dive computers (the ones that compute nitrogen loading, depth, dive time, remaining time, etc.) are sometimes oil-filled. Since oil cannot be compressed, there are no pressure issues.

- There are a number of waterproof cameras now that can handle up to 33 feet of water. Examples are the Olympus Tough series, the Canon D10, the Panasonic TS2 and more. We've tested most of those down to 50 feet, and had one down to 77 feet. Those are regular cameras with LCDs, battery and I/O compartments, and numerous controls. So it might be interesting for rugged computer engineers to take one of those cameras apart and see how they do it. (Btw, LCDs sometimes get compressed so that the image is temporarily impacted, and sometimes buttons are pushed in from the water pressure).

What does all this mean for the waterproofing of rugged mobile systems? Mostly that a good understanding of pressure and sealing is required to design reliable waterproofing. Apart from the fairly complex issues of pressure, there's also a good deal of common sense. Keeping things as simple as possible is key. In a setting where ANY failure can be fatal to the equipment, it only makes sense to keep the potential points of failure as few, and as simple, as possible. It is not surprising that NASA has always been big on the concept of "fail-safe," i.e. systems designed so that if they fail, they fail in a way that does not jeopardize the larger purpose, such as the survival of astronauts. Likewise, scuba regulators are designed so that if they fail, they free-flow rather than shutting off air, thus giving the diver a chance at survival.

The conclusion is that the key to waterproofing of rugged computing systems is keeping things as simple as possible. This means keeping openings to the inside at a minimum, providing double protection whenever possible, and designing things to be as fail-safe as possible. Whatever seals there are must be totally reliable; resistant to twisting, ripping or falling out; durable; and easy to procure and replace. Seals should also be noticeable so users can see if something is amiss (we once failed to notice that a black O-ring in a black housing was missing, with nasty consequences). Rule #1 though is that the less there is to seal, the better.

Posted by conradb212 at 08:52 PM | Comments (0)

April 03, 2010

Finally: decent HD video on Atom boxes thanks to Broadcom card

The dirty little secret of millions of Atom N270-based netbooks (and pretty much all other Atom-based systems) is that they really cannot run HD video. If you try it, you get choppy video that creeps along at no more than 10 frames per second even with just 720p material, let alone 1080p. This makes HD video on Atom-based systems impossible to watch. It's a huge disappointment for anyone who thought a "netbook" would surely be able to handle today's high definition media formats, and certainly an annoyance for many customers of vertical market Atom boxes as well.

Well, a third party to the rescue. And it's not NVIDIA (though that company's Ion technology will certainly improve the dire Atom platform graphics situation so that it becomes at least bearable). No, it's Broadcom, which offers an inexpensive add-on card that can transform virtually nonexistent Atom high-def decoding into something respectable and quite useful.

The Broadcom "Crystal HD" high definition hardware decoder BCM970012, with the Broadcom AVC/MPEG-2/VC-1 video/audio BCM70010/BCM70012 decoder chipset, is a PCIe Mini Card designed to enable full high definition real-time decoding on hardware that otherwise could not manage it. The board can decode H.264 480i/480p, 720p, and 1080i/1080p at 40Mb/second.

I had read about the Broadcom solution last December, but never had a chance to check it out until an Atom N270-based Advantech ARK-DS303 digital signage player arrived at the RuggedPCReview lab. It had the Broadcom module installed as an option, since signage player customers may have a need for high definition video playback.

To test the HD playback capabilities of the Broadcom Crystal HD decoder, I installed both QuickTime and the freeware Media Player Classic Home Cinema 1.3, copied a 250MB 720p high definition QuickTime (.mov) movie recorded on a Bonica HD video camera onto the Advantech player, and then ran the movie side by side on an Apple iMac27 and a 22-inch display hooked up to the ARK-DS303, set to 1680 x 1050 pixel resolution. Amazingly, the DS303 kept up with the vastly more powerful iMac throughout the movie, with playback quality being almost identical. There was a very slight choppiness at times, but it did not materially impact playback.

I then ran a full 1080p MPEG4 movie trailer on the DS303 and it never missed a beat. In fact, it almost ran better than 720p video. That is very impressive for a low-power Atom player with just a gig of RAM and no fancy hardware. By comparison, an Acer Aspire One netbook with basically the same hardware as the DS303 sputtered along at just a few frames per second. Our findings are confirmed by this test of the Broadcom card by SilentPCReview.

On a personal level, what this means is that if you have a netbook with an empty PCIe Mini Card slot, you can get a Broadcom BCM970012 board on eBay or from a company like Logic Supply (see Logic Supply BCM970012), download the drivers from Broadcom (see Crystal HD download page) and finally have decent HD video playback on your little underachiever.

For manufacturers and resellers of rugged tablets and other mobile devices based on the Atom platform, and especially the N270, N280 and the new N450 (and probably the D510 as well), by all means make the Broadcom board available at least as an option! Due to its low cost, the Broadcom BCM970012 PCIe board is virtually a no-brainer for N270/N280 systems, and the newer N450 systems can definitely benefit from Broadcom's follow-up BCM970015.

The video below shows the same 720p (1280 x 720) clip playing on an iMac27 on the left and on the Atom 270-powered Advantech ARK-DS303 with the Broadcom module on the right.

I should mention that the situation is somewhat different for devices based on "Silverthorne" Atom chips, i.e. those with Z5xx Atom processors. Those actually have hardware support for H.264 and other HD decoding. However, in order to take advantage of that capability, OEMs must include the necessary codecs, or users must run applications that come with those codecs (such as CyberLink's PowerDVD). For example, an Atom Z530-based Fujitsu LifeBook UH900 currently in the RuggedPCReview.com lab easily plays 1080p video at full frame rates.

Posted by conradb212 at 04:38 PM | Comments (0)

March 31, 2010

Will industrial tablets benefit from the iPad?

On April 3rd, the Apple iPad tablet will be available in Apple stores. According to various reports, almost 300,000 iPads have been ordered before the device even becomes available. The hype is enormous, with experts falling all over themselves proclaiming why the iPad will succeed or fail.

Fact is, at this point no one knows how the iPad will be received. Apple apparently felt comfortable enough with the tablet form factor to create the device and stake a good part of its reputation on it. Since the iPad is really a scaled-up iPhone rather than a pared-down MacBook, the question will be whether the iPhone experience indeed scales up to offer something the little iPhone just couldn't, or whether the larger form factor actually works against it as people may be more likely to compare it to a standard PC.

One thing is for sure: the iPad will put the tablet into the harsh light of public scrutiny again. This, of course, isn't new. The original IBM ThinkPad of the early 1990s was a tablet, and then, just as now, the tablet/pad concept was sold as something millions were already familiar with: Scribble on a notepad or relax in a comfy chair with a tablet computer that feels like a book or print magazine. The major difference between then and now, apart from almost two decades of technological advancement, is that back then handwriting recognition was seen as the key to unlocking the tablet's potential.

Unfortunately, handwriting recognition never quite worked out (though, with some training and given a chance, the software actually works very well), and current tablet efforts do not push recognition at all. Instead, the emphasis is on an attractive, elegant user interface with all the effortless swiping, pinching, bouncing and tapping that made the iPhone such a hit.

Will the iPad benefit industrial tablets by bringing more attention to the tablet form factor? It's quite possible, but there are some pitfalls. The obvious one is that the iPad will set a standard of user expectations (effortless multi-touch, elegant UI, etc.) that Windows-based tablets will probably have a hard time meeting. Another is that Apple's effortless, elegant user interface requires a flawless implementation of projected capacitive touch technology, something that may work much better on a small consumer device than on a large vertical market device that users may want to operate with gloves on.

It's probably reasonable to assume that the iPad will raise expectations as to how tablets should operate. And raised expectations always mean that older technology will be viewed as lacking. So if users are driven to vertical market tablets just to find that they do not live up to expectations, the rejection and dissatisfaction will be more severe. Which means the overall impact of the iPad's publicity could be negative if vertical market tablets do not offer the same general improvements that the iPad brought to the consumer market.

What's the implication? For that we need to take a look at prevailing digitizer and touch technologies.

Almost since the very start of pen computing, Wacom's inductive technology has dominated the digitizer market with its precise, sleek pens that do not need a battery. The pen functionality of the Windows XP Tablet PC Edition, which was launched in 2002, was clearly designed for the Wacom pen, and the technology, for the most part, works very well. The primary problem with active pen solutions, though, is that you're dead in the water if you lose the pen, and despite tethering, spares, etc., it's just a matter of time until a pen gets lost.

Which is why resistive touch technology is being used as an alternative to inductive digitizers, and often in conjunction with them (so that the computer automatically switches from one to the other according to certain rules). The problem with resistive touch is that it is not very precise, not well suited for inking and recognition, and very prone to misinterpretation. After all, the digitizer must figure out how much pressure represents a "touch" and also differentiate between an intended touch (like from a stylus) and an unintended one (like the pressure of the palm of your hand). Resistive touch doesn't work very well with Windows and its tiny check boxes and scrollers, though legions of Windows CE/Windows Mobile users have learned to live with it.

The respective shortcomings of inductive and resistive digitizer technologies led Apple to use projected capacitive touch, where a) it's either a touch or not a touch (no shades of gray depending on pressure), and b) multi-touch is possible. Combine that with Apple's interface magic and you have the elegant, effortless and seductive iPhone. Bingo, the perfect solution for tablets.

Or is it? Over the past year we've seen multi-touch functionality added to a lot of tablets and notebooks. I've tried several of them, and none worked very well. Those systems would have a few demo showcase functions, but capacitive touch and multi-touch really did not make the systems easier to use, and they were just another feature rather than the main mode of operation (which is what makes the iPhone such a hit). So simply "having multi-touch" is not enough. It may even work against a product.

This morning I came across a press release from Xplore Technologies, one of the earliest supporters and providers of vertical market tablet computers, in which its president, Mark Holleran, lauds the launch of the iPad as a great opportunity for the tablet form factor. Holleran points out the ease of use of tablets and views the launch of the iPad as a sign that "the tablet PC industry is poised for wider acceptance and accelerated growth."

I do think Holleran is on to something, but even if the iPad is a rousing success, it'll still be a challenge to translate the iPad/iPhone user experience into an equally satisfying solution for vertical market tablets.

Posted by conradb212 at 04:38 PM | Comments (0)

March 17, 2010

Consumerization of rugged markets?

A few weeks ago I wrote an article on Windows Mobile and the vertical markets and concluded with the question, "So what will the small but significant number of vendors who make and sell Windows Mobile devices do as their chosen operating system platform looks increasingly dated and is becoming a target of customer dissatisfaction?" I got some good (and rather concerned) feedback on that column, and I think it's an issue that is not going to go away.

Yesterday I saw an article entitled "Delays Decimate Microsoft's Enterprise Mobile Market Share" at channelinsider.com, and they asked, "So, what of the rugged device market? A market largely dominated by bulky devices running Windows Mobile from manufacturers like Motorola and Intermec. Howe (Director of Anywhere Consumer Research at Yankee Group) says that enterprise applications are becoming more and more prevalent on consumer-grade smart phones, and the rugged hardware manufacturers will become more and more niche-focused."

What they're saying is that enterprise and vertical market functionality is increasingly becoming available in inexpensive, standard consumer products, and the people who use that functionality do not want to walk around with two phones or handhelds. That's been pretty much accepted for a while now, as evidenced by the number of ruggedized handhelds that have integrated phones. The problem, though, as Howe puts it in the channelinsider.com article, is that "No one wants to go around looking like a UPS guy when they are out at the movies."

And perhaps an even bigger problem is that no one, including the UPS guy, wants to put up any longer with a clumsy, recalcitrant user interface that fights you every step of the way. Not when the iPhone and Android and Palm have shown us that it can be done so much better.

What will happen? I honestly don't think that Microsoft's mantra that handhelds need Windows CE because it leverages enterprise expertise washes anymore. Not when the handheld platform has been neglected to the degree Windows CE has been neglected. It's much more likely that Windows CE is still alive in vertical markets because it would be a leap of faith to trust Apple or Google or open source to care about the relatively small vertical markets (even though some orders, like UPS's, can be in the hundreds of thousands).

Yet, the fact is that I can now take an iPhone, which doesn't even have a scanner, and scan barcodes with its built-in camera. Or take pictures with it that are far better than anything I've seen out of the integrated cameras on Windows CE devices. And due to the laws of physics, small devices are often inherently more rugged (and easier to ruggedize) than large ones. Does that mean we'll soon see slightly modified smartphones do the job of rugged handhelds?

Probably not, but the thought definitely enters the mind.

Posted by conradb212 at 08:31 PM | Comments (0)

March 10, 2010

Will the iPad replace my iPhone?

I wrote this column for the blog at iPhoneLife Magazine, a terrific resource for iPhone owners (or anyone interested in the iPhone) that's published by my old friend Hal Goldstein who used to be a friendly competitor when we published the print version of Pen Computing Magazine.

The article really has nothing to do with rugged computing, but I think it's relevant here anyway because a) the fate of the Apple iPad will have a big impact on how tablets are viewed in the coming years, and b) because of the mobile industry's never-ending struggle to find form factors that are really right for a given job.

So here's what I contemplated:

This week I will order my iPad. Though I know it'll take a bit longer, I am aiming for the 3G model with 32GB of storage. When I get it, I will sign up for the unlimited data plan, forking over an even larger part of my disposable income to AT&T every month. What I do wonder is whether the iPad will replace my iPhone.

Silly question you may say. The iPad is not a phone, so how could it replace the iPhone? True, but I really don't consider my iPhone as primarily a phone. It is, in fact, a pretty crappy phone, with voice quality worse than virtually any cellphone I've ever had, going back to the original Motorola "brick." But I do need a phone for the few calls I make, and it doesn't make sense to carry a much more convenient little fliphone in addition to the iPhone, and so, yes, the iPhone is my phone, too. But if I checked the number of minutes I use my iPhone as a phone versus for everything else, everything else would account for about 95%, at least.

That's because the iPhone has pretty much become my information and entertainment device of choice. Before I leave the house I check the weather and temperature on the iPhone so I know what to wear. I get my news from the iPhone's USA Today and CNN apps (and even a couple of local and foreign newspapers), and more detailed news from the NY Times on the iPhone. I keep in touch with my Facebook friends on my iPhone. I read e-books on it. I play games on it. I use it when I go running and want to keep track of my time. I use it to check prices and read reviews while shopping. I check sports scores, the load on my servers, new messages on websites I post on. I do all that on my iPhone because it's so darn handy and convenient, and because it is good enough to do all those things. Had anyone told me a few years ago that, yes, it WILL be possible to use the web on a tiny device not as just a technology demonstration, but because it really works, I probably would not have believed it. After all, everyone had tried and it just didn't work. Until the iPhone.

So now the iPad will do everything the iPhone can, but on a much bigger screen. No more squinting, no more screen rotating to make columns more easily readable, no more constant pinching to zoom in and out. That will all be a thing of the past as what we have all been waiting for is now here with the iPad, the book/magazine reading experience in an electronic device. Because that is the one remaining hang-up that keeps print newspapers and mags in business; they are more convenient than reading on a laptop computer.

But now I wonder if the iPad will do everything the iPhone can, and do it better. Will I appreciate the much larger screen, or will it simply make the iPhone experience big and unwieldy? Will I have much higher expectations from a "real" computer like the iPad than I have of the little iPhone? For example, will I still tolerate the lack of Flash on the iPad? Will iPhone apps still look so terrific and clever on a much bigger screen, or will I expect real computer functionality? Will I start whining about the lack of "real" software? But most importantly, will I be able to use the iPad like I use the iPhone, just whipping it out wherever I am? Because if not, it may not work, and the new big iPhone will be something else that'll have to fly, or fail, on its own merits.

Posted by conradb212 at 10:36 PM | Comments (0)

February 24, 2010

Windows Mobile and the vertical markets

While Windows Mobile has pretty much ceased to be a factor in consumer markets, it remains very firmly entrenched in industrial and vertical markets, where its market share is probably larger than that of Windows on desktops and notebooks. The good news is that as long as Microsoft continues to dominate the desktop, the leverage of Windows programming tools and expertise will probably all but guarantee a continuing role for Windows CE and Windows Mobile. That said, the rapid vanishing act of Windows Mobile in the consumer markets simply must be disconcerting to those whose business depends on it.

I won't go into the long and checkered history of Windows CE here, nor into Microsoft's bewildering meandering with nomenclature or the disruptive inconsistency and frequent course changes. It all has become an almost impenetrable mess even for longtime followers of Microsoft's smallest OS. Unfortunately, Windows Phone 7, Microsoft's latest knee jerk reaction to a changing smartphone market that has essentially relegated Windows Mobile into insignificance, casts more doubt and shadows on Windows Mobile than ever.

While in the real world, the one where manufacturers make and sell rugged mobile products, we continue to see Windows CE 5.0/6.0 and Windows Mobile 6 and 6.1, in the hype and announcement world, Microsoft has announced Windows Phone 7 OS, a trendy me-too music player interface trying to leverage the floundering Zune music player platform while copying iPhone and social networking concepts. It's hard to see how Windows Phone 7 could make a dent into the smartphone market, and it is most certainly not the future in commercial and vertical markets.

So where does that leave vertical markets, which probably aren't thrilled at the prospect of being stuck with an increasingly obsolete Microsoft mini OS? Not in a very good position. Let's face it, the odd Windows Mobile 6.5 interface is essentially unsuitable for vertical markets. And now that Microsoft, scrambling to remain relevant in the mobile market, is putting its eggs into the projected capacitive (multi) touch basket, it's hard to see how any of the older versions of Windows CE/Windows Mobile (now renamed "Windows Mobile Classic") may benefit from the Windows Phone 7 OS. Yet, something with "7" in it must happen to at least give the impression that Windows Mobile is moving forward (and to benefit from the relative shine of Windows 7).

So we have a situation where only last year, Microsoft's entertainment and devices division president Robbie Bach waxed enthusiastically about WinMo 6.5 ("It will give you access to more websites than you will be able to get to on an iPhone that will work actively and work well. It really is a much better experience.") and now the future of 6.5 already seems quite uncertain.

Personally, I think what may happen is that Microsoft will quietly integrate Windows CE/Mobile into its Windows Embedded Products business. That area already includes Windows Embedded CE in addition to Windows Embedded Standard, Windows Embedded Enterprise, Windows Embedded POSReady, Windows Embedded Server, Windows Embedded NavReady, and so on. The stated purpose of Windows Embedded CE is to "develop small footprint devices with a componentized, real-time operating system. Used in a wide array of devices, including portable navigation and communications devices." That makes sense.

One problem with this approach is that one part of the appeal of Windows CE/Windows Mobile was always that people were already familiar with its look and feel. Today, that look and feel is ancient, just as are the very visible underpinnings of Windows CE that essentially date back to the last millennium. And with Windows Mobile Pocket PCs gone and Windows Mobile phones irrelevant, that part of the leverage is gone as well. A new interface approach is sorely needed, but if Windows Mobile 6.5 and Windows Phone 7 are any indication, Microsoft's thrust is in the Zune player and social networking arena.

So what will the small but significant number of vendors who make and sell Windows Mobile devices do as their chosen operating system platform looks increasingly dated and is becoming a target of customer dissatisfaction? That's a good question. You can never count Microsoft out, but after all the fumbling with their mobile OS over the years, hopes for a cohesive, logical and compelling direction for Windows Mobile seem optimistic.

Posted by conradb212 at 07:45 PM | Comments (0)

February 16, 2010

A look at Intel's new Core i3/i5/i7 processors and how they will affect rugged computing

Just when most manufacturers of rugged mobile computers have switched from earlier platforms either to Intel Atom or Core processors, Intel raises the ante again with new Atoms and the next generation of Core processors. In essence, the Core 2 Duo that has served the mobile community long and well is being replaced by a next generation of mobile chips with higher performance, newer technology, better integration, improved efficiency, and smaller package sizes.

The new Intel Core i3, Core i5 and Core i7 processors come in numerous versions with two or four cores, clock speeds ranging from 1.06 to 3.33 GHz, maximum power dissipation of 18 to 95 watts, different process technologies, different degrees of integration and different complementing chipsets.

Unfortunately, while the difference between Intel's older Core 2 Solo and Core 2 Duo processors was pretty obvious, differentiating between the Core i3, Core i5 and Core i7 chips can be quite confusing. As a rule of thumb, the 3/5/7 sort of represent Intel's "good," "better," and "best" processor solutions in any given category, just like BMW makes 3, 5, and 7 series cars (though the analogy only loosely applies). Core i3 processors, for example, do not have the Intel TurboBoost feature that provides extra performance via automatic overclocking and seems to be more than just a marketing feature. Core i7 processors generally have more cache and support more of the special Intel features and technologies than Core i5 and Core i3 processors. There is, however, more than a bit of overlap in functionality and performance, and figuring out which one is best suited for a task won't be simple.

As of February 2010, Intel has announced about three dozen of the new Core i3/i5/i7 processors. About a third of them are designated as embedded processors, which makes them especially interesting for embedded systems designers due to considerations such as package size, structural integrity, error correcting code memory, system uptime and, perhaps most importantly, a 7-year extended life cycle.

Intel has always had an excessive fondness for code names, and it's no different with the new Core processors. It's therefore useful to know that Intel distinguishes between the generally desktop-oriented "Piketon" platforms, which use either two-core 32nm "Clarkdale" processors or four-core 45nm "Lynnfield" processors (and usually have TDPs that make them unsuitable for most mobile applications), and the mobile "Calpella" platforms, which use two-core 32nm "Arrandale" processors with lower thermal design powers (generally 18 to 35 watts).

So let's take a quick look at Calpella and Piketon.

In essence, "Calpella" is Intel's new "low power" platform, though there is now a much sharper differentiation between the really low power Atoms and the low power but rather high performance new Core processors. The Calpella class "Arrandale" CPUs are based on the latest 32nm lithography. They are are essentially the successors of the mobile Core 2 Duo CPUs and come in standard, low voltage, and ultra low voltage versions at various processor clock speeds.

There are, however, some interesting differences: As a first in this class of Intel CPUs, the memory controller and reasonably powerful integrated graphics with HD hardware acceleration and other new capabilities are now part of the processor, which means no more conventional Front Side Bus and "Northbridge" part of the chipset complementing the processor. These integrated graphics can be turned off when they are not needed, and Nvidia (who is probably not that thrilled about this Intel move) has already announced their "Optimus" technology (see what it is) that automatically determines whether to use the integrated graphics and extend battery life, or use an external NVIDIA GPU to boost graphics.

Like the Core 2 Duos, "Arrandale" processors have two cores, but the new chips use Intel's HyperThreading technology, which presents each physical core as two logical cores, making the operating system think it is dealing with four. The new chips also add L3 cache, while the Core 2 Duo chips only had L1 and L2 cache. They require DDR3 RAM that supports higher speeds (up to 1,333MHz). An interesting new technology is Intel "TurboBoost," which automatically steps up processor core speed if it detects that the CPU is operating below certain power, current, and temperature limits.
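The effect of HyperThreading is easy to see from software: the operating system simply reports more logical processors than there are physical cores. A quick Python check (using the third-party psutil package for the physical count):

    # HyperThreading as the OS sees it: logical processors vs. physical cores.
    import os
    import psutil  # third-party package: pip install psutil

    logical = os.cpu_count()                     # what the scheduler sees
    physical = psutil.cpu_count(logical=False)   # actual cores
    print(logical, physical)   # on a two-core Arrandale with HyperThreading: 4 2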

The new embedded Intel Core i5/i7 processors range from ultra low voltage models with a base clock frequency of 1.06GHz and a thermal design power of 18 watts, to low voltage models with base clock frequencies up to 2.0GHz and TDPs of 25 watts, and standard voltage models with base clock frequencies as high as 2.66GHz and a 35 watt TDP. This means they're suitable primarily for higher end, high performance rugged notebooks and tablets, but not for lower end systems that require the still significantly lower power draw of Atom-based processor technology (or emerging alternate solutions such as the NVIDIA Tegra).

"Piketon"-class processors also include a variety of Intel's new Core i3, Core i5, and Core i7 CPUs but unlike the "Arrandale" versions they are mostly standard voltage, higher-powered chips that include both older 45nm technology "Lynnfield" versions of the chips (four cores but no integrated graphics) as well as newer 32nm "Clarkdale" versions with two cores and integrated graphics. These are performance-oriented processors with TDP ratings of 73 to 95 watts and thus unsuitable for most mobile applications.

What about performance? We haven't had a chance at benchmarking any rugged systems with the new processors yet. Intel and other benchmarks of all new Core processors suggest a hefty 30-60% performance increase over equivalent Core 2 Duo processors at roughly the same TDP levels. Literature and previews also suggest that the performance of the integrated graphics processor is improved by perhaps about 50% over that of the predecessor GM45 platform. There is also said to be better 3D performance, high definition video hardware acceleration, audio and other advancements.

It should be interesting to see who is first in making the new i5/i7 chips available and how they'll work out in rugged systems.

For a comparison table of all Intel Core i3/i5/i7 released through January 2010, see here.

Posted by conradb212 at 06:22 PM | Comments (0)

January 28, 2010

Talking with Paul Moore, Fujitsu's Senior Director of Product Development

The other day I had a very interesting hour-long conversation with Paul Moore, who is Senior Director of Mobile Product Development at Fujitsu. The call was arranged by Fujitsu's ever helpful Wendy Grubow to give me a chance to talk with Paul about the Fujitsu Lifebook T4410 Tablet PC that's currently in the RuggedPCReview.com lab for evaluation and testing.

Fujitsu, of course, has been into tablets longer than most and probably has more experience than any other Tablet PC and convertible vendor. Fujitsu had the PoqetPad and 325Point tablets a decade before Microsoft reinvented the Tablet PC in 2002, and the company is now in something like the 40th generation of tablet technology. Yes, the 40th. During the 1990s, Fujitsu built a successful business around vertical market slate computers, most notably the Point and Stylistic models, with the latter line carrying on to this day. For a while Fujitsu also offered Windows CE-based devices such as the PenCentra line. Fujitsu also offered small, business-oriented notebooks with pens when almost no one else did. What it all boils down to is that no one has more corporate DNA in tablet and slate computers, in any number of form factors.

Paul pointed out that at this point, Fujitsu is the only company that offers both slate AND convertible computers. There are many that have a notebook convertible in their lineups, such as Dell and HP, and there are some that only offer tablets, such as Motion Computing, but no one offers both in their market (one could argue that DRS ARMOR and a couple others do offer both platforms, but those are in the heavily rugged markets).

Anyway, it was interesting to hear Paul say that Fujitsu is seeing a heavy migration from tablet to convertible. Customers are transitioning from the Stylistics to the more conventional Lifebook convertible notebooks that can also be used as slates by rotating the display and laying it down flat on top of the keyboard. That probably explains why Fujitsu is now down to a single model in the Stylistic line, the Stylistic ST6012, whereas the company offers no fewer than six different convertibles (the Lifebook T1010, T1630, T2020, T4310, T4410, and T5010).

With Panasonic making a big issue out of their rugged computers still being made in Japan, I asked Paul if the Fujitsu tablets and convertibles are also still made in Japan. The answer was yes, all Lifebook tablets are made in Japan, and all E-Series machines as well. However, while Panasonic pretty clearly draws a connection between made in Japan and much lower failure rates, Fujitsu makes no such claim. Paul said failure rate stats are compiled, but the vast differences in the markets served make any meaningful comparison essentially impossible.

I asked Paul why Fujitsu does not market its computers as "business-rugged," "semi-rugged," or one of the other ruggedness categories. The unequivocal answer: We don't have rugged tablets. Ours are durable, well-built, according to the markets we serve. We don't lose many customers because of ruggedness requirements. Fair enough. Full or even partial ruggedness can add a lot of cost and weight, so if it is not needed, why add it. Paul points out useful features that prolong the life of a computer, like a user-cleanable dust filter, accelerometer-based hard disk protection, a display hinge that rotates in both directions so it won't get damaged by inadvertently turning it the wrong direction, and so on.

With reference to the rotating display hinge, I asked Paul whether he knew why all Tablet PCs since 2001 have been designed with the same exact rotating hinge that lets users rotate the display and then fold it flat on top of the keyboard, LCD facing up. This is a good solution, but in notebook mode the display flexes when you tap it with the pen. In the 1990s there had been several alternate designs that minimized or eliminated the flex problem, but they are all gone. Paul said he wasn't aware of any patent protection or other reason why designers should be limited to the rotating hinge, but it's a solution that works; flexing is not an issue when the device is used in tablet mode, and with the increasing importance of touch, flexing again is not an issue. Cost, too, might be a reason for staying with standardized solutions.

We also discussed the inherent suitability of a full desktop operating system for tablet and touch use. In my opinion, Windows itself has always been a major factor standing in the way of widespread tablet adoption; it's simply not suitable for pen operation. Paul felt that Windows 7 has made great strides towards better usability, but that in vertical markets it's really all about custom applications anyway, and those are usually optimized for whatever input medium is used.

With the recent advent of Intel's new Piketon and Calpella processor/chipset platforms I asked Paul what Fujitsu's plans were for the Intel Core i3/i5/i7 processors. His answer was that, for the most part, they prefer to use standard voltage processors that generally cost less, offer better performance, and represent an overall better value for users. Based on the benchmark result of our review unit that's equipped with a 2.53GHz Core 2 Duo P8700 with a thermal design power of 25 watts, we see no immediate reason for a chip upgrade: the T4410 scored the highest overall performance results of any Tablet PC we have ever tested, and it still had an idle power draw of just 9.9 watts, barely more than most Atom-based systems.

Posted by conradb212 at 06:35 PM | Comments (0)

January 26, 2010

Tablet hype at fever pitch

A day before an Apple event where Steve Jobs will announce a new computing device, the hype about tablets is at an absolute fever pitch. Experts are popping out from the woodwork, showering us with their wisdom and predictions, most apparently believing that Microsoft invented and introduced the tablet in 2001, which couldn't be farther from the truth. But, perhaps, if enough instant experts say it's so, history has been rewritten. What will those instant experts do when they discover that the original early 1990s IBM Thinkpad was a tablet, and that we had the same exact tablet hype back in 1989/92?

That said, if Apple indeed releases a tablet device, it may well change things quite a bit.

Posted by conradb212 at 05:08 PM | Comments (0)

January 07, 2010

Slate and tablet computers: learning from the past

According to CNN, tablet-sized computers are now "a much-hyped category of electronics." True. The Associated Press says, "Tablet-style computers that run Windows have been available for a decade." Yes, and a lot longer than that. And a PC World editor states, "Tablet PC's are not new. The slate form factor portable computer has been around for almost a decade, since Microsoft initially pushed the concept with its Windows XP Tablet PC Edition." Nope. Microsoft did not initially push the concept with the XP Tablet PC Edition. Microsoft released a tablet OS way before that, in 1991, and even then it was just a reaction to what others had done before.

This shows how soon we forget. Or perhaps how effective current coverage has been in creating the impression that Microsoft invented tablet computers in 2001, rewriting history in the process. Fact is, slate and tablet computers have been around for a good 20 years, and in 1991, there was as much hype about slates as we have today.

A bit of slate computer history

In the late 1980s, early pen computer systems generated a lot of excitement, and there was a time when it was thought they might eventually replace conventional computers with keyboards. After all, everyone knows how to use a pen, and pens are certainly less intimidating than keyboards.

Pen computers, as envisioned in the 1980s, were built around handwriting recognition. In the early 1980s, handwriting recognition was seen as an important future technology. Nestor, co-founded by physicist Charles Elbaum and Nobel laureate Leon Cooper, developed the NestorWriter handwriting recognizer. Communication Intelligence Corporation created the Handwriter recognition system, and there were many others.

In 1991, the pen computing hype was at a peak. The pen was seen as a challenge to the mouse, and pen computers as a replacement for desktops. Microsoft, seeing slates as potentially serious competition to Windows computers, announced Pen Extensions for Windows 3.1 and called them Windows for Pen Computing. Microsoft made some bold predictions about the advantages and success of pen systems that would take another ten years to even begin to materialize. In 1992, products arrived. GO Corporation released PenPoint. Lexicus released the Longhand handwriting recognition system. Microsoft released Windows for Pen Computing. Between 1992 and 1994, a number of companies introduced hardware to run Windows for Pen Computing or PenPoint. Among them were EO, NCR, Samsung (with its 1992 PenMaster), Dauphin, Fujitsu, TelePad, Compaq, Toshiba, and IBM. Few people remember that the original IBM ThinkPad was, as the name implies, a slate computer.

The computer press was first enthusiastic, then very critical when pen computers did not sell. They measured pen computers against desktop PCs with Windows software and most of them found pen tablets difficult to use. They also criticized handwriting recognition and said it did not work. After that, pen computer companies failed. Momenta closed in 1992. They had used up US$40 million in venture capital. Samsung and NCR did not introduce new products. Pen pioneer GRiD was bought by AST for its manufacturing capacity. AST stopped all pen projects. Dauphin, which was started by a Korean businessman named Alan Yong, went bankrupt, owing IBM over $40 million. GO was taken over by AT&T, and AT&T closed the company in August 1994 (after the memorable "fax on the beach" TV commercials). GO had lost almost US$70 million in venture capital. Compaq, IBM, NEC, and Toshiba all stopped making consumer market pen products in 1994 and 1995.

By 1995, pen computing was dead in the consumer market. Microsoft made a half-hearted attempt at including "Pen Services" in Windows 95, but slate computers had gone away, at least in consumer markets. Pen computing lived on in vertical and industrial markets, where companies such as Fujitsu Personal Systems, Husky, Telxon, Microslate, Intermec, Symbol Technologies, Xplore, and WalkAbout made and sold many pen tablets and pen slates.

That was, however, not the end of pen computing. Bill Gates had always been a believer in the technology, and you can see slate computers in many of Microsoft's various "computing in the future" presentations over the years. Once Microsoft reintroduced pen computers as the "Tablet PC" in 2002, slates and notebook convertibles made a comeback, and new companies such as Motion Computing joined the core of vertical and industrial market slate computers specialists.

So now tablets, or slates as Ballmer called them in his CES speech, are once again a "much-hyped category of electronics." The difference is that this time, thanks to Apple and the iPhone, tablets are to have multi-touch.

Let's hope all this works. Technology has come a very long way since those early days of tablet computers, but hype is never good if it's based on a flood of me-too products of a concept that has yet to prove it can work.

For an illustrated history of tablets and slates, see excerpts of "The Past and Future of Pen Computing" by RuggedPCReview.com editor Conrad H. Blickenstorfer, presented as a keynote address at the Taipei International Convention Center in December of 2001.

Posted by conradb212 at 04:37 PM | Comments (0)

January 04, 2010

Getac now offers 5-year warranties!

Sometimes the most amazing news is not a product announcement. That's what I thought when I saw Getac's press release about offering 5-year "bumper-to-bumper" warranties for all their rugged notebook computers. That's a long time.

According to Getac, the new warranty covers all of their fully rugged computers (i.e. the A790, B300, E100, M230 and V100 models) delivered on or after January first of this year. And the warranty includes "damage that occurs due to accidental acts and exposure to environmental conditions". According to Getac president Jim Rimay, they did that because in these tough economic times, computers are more likely replaced on a 5-year cycle instead of the 3-year upgrading cycle of more prosperous times. By offering a full 5-year warranty, customers will not incur additional service/warranty fees if they keep their equipment longer. The 5-year warranty is also a welcome change, the press release says, to governments and other large entities where getting approval for equipment repair can be a lengthy and involved process (it can, I've been there).

Five years is a long time, and especially so for a product that is designed to be used outdoors and under demanding environmental conditions where it is much more likely that computers are dropped, bumped around, rained on, and just generally experience conditions far from those in a nice, warm, clean office. It'd be interesting to know the actual mechanics of the warranty, what all is included, if certain items are excluded, what the turn-around is, shipment costs and so on. I am sure Getac thought this through, and we'll put in an inquiry to the folks at Getac.

How important are warranties and service in this field? Extremely so. I've personally visited the service and repair facilities of the leaders in the rugged computer market and came away more than impressed. Unlike in the commercial market where service is often hit-or-miss, with rugged systems failure rates, failure statistics and service turn-around times are meticulously recorded and managed. That's because with rugged systems, total cost of ownership matters and a good reputation for service and a good warranty definitely represent a strategic advantage.

Getac is on to something here, and offering a 5-year warranty definitely offers significant value-added to their products.

Posted by conradb212 at 04:18 PM | Comments (0)

December 22, 2009

New Atom processors: N450, D410 and D510

On December 21, 2009, Intel announced the next generation of Atom processors. The new generation of Atom processors includes the single core N450, the single core D410 and the dual-core D510.

Up to this announcement, millions of netbooks (as well as related devices such as tablets and boards) used the Atom N270 processor with its two companion chips, the ICH7M I/O chip and the 945GSE graphics and memory controller. The combo of the latter two is known as the Intel 945GSE Express chipset and makes for a total of three chips. Of the N-Series processors released prior to this latest announcement, the Atom N280 was really just a very slightly faster N270 (1.66GHz vs 1.6GHz), and the Atom 330 (technically not N-Series, but still in the "Diamondville" family, as opposed to the more industrial "Silverthorne" Z-Series Atoms) was a dual-core version of the desktop-oriented Atom 230.

With the new chips, the big news is that Intel reduced the chip count from three to two by integrating the graphics and memory controller into the CPU itself. The old ICH7M I/O controller chip is replaced with the Intel NM10 Express. This means fewer chips to mount, lower power consumption, and, not mentioned, one less reason to seek third party chipsets (such as NVIDIA's Ion Graphics Processors).

Of the three new processors, the N450 is specifically geared towards netbooks, whereas the D410 and D510 processors, all working in conjunction with the new NM10 I/O controller, are geared towards low-end desktops. The new NM10 I/O controller consumes just two watts compared to the older ICH7M southbridge's 3.3 watts. More amazingly, while the old 945GSE northbridge chip with its GMA950 graphics and memory controller consumed six watts, the Graphics Media Accelerator 3150-based integrated solution adds only about three watts to the netbook-oriented N450: the chip's maximum TDP (thermal design power, a measure of power consumption) is 5.5 watts, versus the 2.5 watts of the N270 without graphics.
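Summing up the figures Intel quotes makes the platform-level difference clearer (a quick tally in Python of the TDP numbers mentioned above):

    # Platform TDP totals from the figures quoted above (watts):
    old_platform = 2.5 + 3.3 + 6.0   # N270 CPU + ICH7M + 945GSE = 11.8 W, three chips
    new_platform = 5.5 + 2.0         # N450 (graphics/memory on-die) + NM10 = 7.5 W, two chips
    print(old_platform, new_platform)   # 11.8 7.5

So on paper the integrated approach saves a bit over four watts at the platform level, even though the CPU package itself is rated higher.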

From what I can tell, the GMA 3150 has hardware acceleration for MPEG-2 but not for H.264, so there's still no HD hardware decoding, which means a third-party HD decoder chip will come in handy. Onboard video is now likely to move from 17:10 aspect ratio 1024 x 600 pixel displays to somewhat more palatable 1366 x 768 pixel ones, with significantly higher (2048 x 1536) external analog video possible (though some reports say that the N-Series chip is limited to 1400 x 1050, which would be less than what we have now). Somewhat surprisingly for a new chip, memory support is for DDR2 instead of the newer DDR3 standard.

Transistor count goes from the N270's 47 million to 225 million in the new single-core models and 317 million in the new dual-core chip. Since the graphics and memory controllers account for about 133 million of those transistors, the CPU alone goes from 47 to 92 million. What exactly the extra 45 million transistors do is not clear, as the tech specs look pretty much the same.

Note that Intel targets the D410 and D510 processors specifically at desktops. Though the D410 has the same clock speed and uses the same NM10 I/O controller, its max TDP is almost twice that of the N450, 10 watts versus just 5.5. That's likely due to the graphics core running at twice the speed in the D-Series chips (400 vs 200MHz).

Overall, it doesn't look like the new Atoms, which have the Intel 64 extensions, will bring much of a performance improvement to netbooks and netbook-level rugged or embedded devices. Reducing the chip count from three to two is nice, but the Z-Series processors already had that. Graphics seem somewhat improved, but not enough to make a huge difference, and there's still no HD playback hardware support. I am also not quite sure why the D410 and D510 processors are aimed at the desktop when the D410 chip combo has a total system TDP that's about the same as that of the N270 and N280 (12 vs 11.8 watts), and the dual-core D510 just a bit more (15 vs 11.8 watts). Also interesting is that Intel highlights the smaller footprint when it was a larger footprint that was lauded at the introduction of the "large package" P and PT series of industrial processors just a little while ago.

Overall, it's good to see these new Atom chips, although I can't help feeling that Intel looked out for itself more than adding compelling value for consumers.

Here is Intel's list of the entire Atom processor family.

Posted by conradb212 at 05:47 PM | Comments (0)

December 18, 2009

The Atom processor predicament

Well, this is going to be interesting. Despite the Intel Atom chips' modest performance, consumers have bought millions and millions of those little netbooks. I am quite certain they bought them because of the low price that made netbooks an impulse buy as opposed to spending more for a "real" notebook computer.

Whether or not customers are happy with their netbooks largely depends on how they use the computers. The small display with 1024 x 600 pixel resolution is confining for almost any real work as there's just not enough real estate. And while the term "netbook" implies that the devices are especially well suited for accessing the web and browsing around, that really isn't true. Netbooks are generally sluggish browsers and mostly unable to deliver adequate multimedia performance. And those who hoped to run HD video on their netbooks struck out completely, because first-gen netbooks simply couldn't do that at an acceptable pace.

On the other hand, the netbooks' small size and weight made them wonderful travel companions, and with an extended battery they practically ran forever on a charge (well, six hours or more in the case of my Acer Aspire One). And when hooked up to a big screen and a full-size keyboard, netbooks work really well as office computers. I hook up my little Acer to a 1680 x 1050 pixel 22-inch wide-screen.

However, we always want more, and so netbooks have been creeping up in size and power. Display size went from 7 to 8.9 inches, then 10.1 and now 12.1 inches. Which means netbooks are morphing ever closer to standard notebook range, which also means customers will continue to want and expect more. I mean, if the netbooks are so large now, why not an optical drive, and could we have the screen just a bit larger yet? Obviously, what customers really want is a device that costs as little as a netbook, but is as large and powerful as notebooks were before they became hefty giants with 19-inch ultra-wide-format displays.

Problem is, the Atom N270 simply isn't up to powering anything more than a little netbook, and even that just marginally. So Intel released the very slightly more powerful N280 and the dual-core Atom 330. And NVIDIA came up with the NVIDIA Ion graphics chipset that is supposed to work better with Atom N-Series chips than Intel's own chipset. I recently read a review of the Asus Eee PC 1201N netbook that uses both the Atom 330 chip and the NVIDIA chipset, has a 1366 x 768 12.1-inch screen and lists for US$499. According to the review, you can now actually watch HD video, play many games, and things feel quite a bit less sluggish. Battery life is less than it was for the older, smaller netbooks, of course, and for 500 bucks you can easily get a "real" notebook with far higher performance and many more features.

Why do I bring all this up? Because the rugged market has also heavily invested in Atom technology and almost everyone has Atom devices in their lineup or pipeline. Almost all of them are based on either the Atom N270 or the Z510/530/540, i.e. the first generation of Atoms, the minimal ones with "targeted" performance. And now, just as we're starting to see nicely optimized Atom systems that live up to battery life expectations, some of those initial chips are already being replaced by the N280 and Atom 330, and soon by next-gen Atom chips. That's bad news for rugged manufacturers whose first-gen Atom products are just now becoming available.

The moral of the Atom story is, at least for vertical market manufacturers: pick an Atom chip that Intel is likely to support for several years, and make certain the drivers are fully optimized and all the power saving features are fully implemented. Atom can deliver superior battery life and acceptable performance, but manufacturers must carefully target those products so customers won't be disappointed. We've seen Atom-based machines that use hardly less battery power than devices with much more powerful processors. That won't do. And we've seen some where non-optimized graphics drivers made the machines painful to use.

Using an automotive analogy, with the Atom Intel created a small and miserly 4-cylinder engine for fuel-efficient vehicles, one that provides adequate performance as long as the car isn't too big and heavy and customers haven't been led to have unrealistic expectations. With the new and upcoming Atom chips, Intel is already making bigger, more powerful engines, obsoleting the earlier ones and giving in to the demand for more horsepower at the expense of efficiency and good design.

Posted by conradb212 at 01:41 AM | Comments (0)

October 29, 2009

Apple stores supposedly transitioning from WinMo to iPod Touch

Anyone who's ever been to an Apple store for an appointment or service knows the weird procedure where someone greets you at the door, takes your info, and then wirelessly sends it to some other Apple people who come greet you when it's your turn. The same goes for making payments away from the main desk and so on. It all works, but it's a bit odd, and even weirder is that some of that mobile check-in and checkout is done on non-Apple hardware (Symbol, actually) running Windows CE software. Supposedly it was done that way because Apple mobile gear couldn't handle bar codes and credit cards and such.

I always thought that was strange because there are all sorts of scanning and credit card processing apps available for the iPhone. And, in typical iPhone fashion, they are being used in cool, innovative ways. For example, there's an app ("Red Laser") that scans a barcode and then instantly checks the Web for the best prices for that product. That way you always know whether you're getting a good deal. There are also numerous apps for credit card processing. That should not come as a surprise in an era where banks are starting to allow you to remotely "deposit" checks from an iPhone.

Anyway, the folks at ifoapplestore.com now report that Apple stores may be transitioning to iPod Touches with an advanced scanner accessory and point-of-sale (POS) software for checkout. Other businesses will probably follow their lead. And I can easily see iPhones and iPods being used in more industrial applications thanks to all those ruggedized cases available now (my favorite is the Otterbox Defender). Can iPhone-based industrial-strength vertical market apps be far behind?

Posted by conradb212 at 06:36 PM | Comments (0)

October 23, 2009

Windows 7

Well, the much advertised public release date of Windows 7 has come and gone. The equivalent of "War and Peace" has been written on how wonderful it is and on how Microsoft "got it right" this time. Maybe they have and maybe they haven't. Here at RuggedPCReview.com, we've used Windows 7 on some of the rugged hardware we've had here for testing and evaluation recently and, frankly, it looked so much like Vista that we barely noticed anything was different.

At this point, I have mixed feelings. Almost all the rugged hardware that comes in here still runs Windows XP or the Tablet PC Edition or, increasingly, one of the embedded versions of Windows. It was actually interesting to see all those "XYZ recommends Vista" tag lines on manufacturers' websites and promotional materials when most of their machines really still ran XP.

So now Windows 7 is here, and Microsoft has been quite successful in creating the buzz that it's new and leaner and faster than Vista. Some of the industry pundits were practically falling all over themselves heaping praise upon Microsoft, so much so that it was almost embarrassing. Steve Wildstrom at Business Week, whose straightforward opinions I greatly respect, was quite critical of the unacceptable upgrade path from XP to Windows 7 (reinstall every app from scratch) and how long the upgrade takes, but he also then said Windows 7 was "something truly better."

I think whether or not Windows 7 is indeed something truly better will eventually determine its fate. It looks so much like Vista that, had it not been for Vista's questionable reputation, Microsoft probably would have simply called the "new" OS Vista Service Pack 3. As is, that wasn't an option. From a PR standpoint, Vista was so damaged that almost anything would look better. So creating something that is not as bad as Vista is like General Motors improving the Corvair back in the 1960s. It really was a pretty good car in the end, but Ralph Nader's "Unsafe at Any Speed" had damaged the Corvair beyond repair. So from that point of view, having Windows 7 look like Vista and simply saying it's better than Vista may not have been a great idea.

But let's assume that Windows 7 is better than Vista and that Microsoft really has learned and listened. Then you still have the problem that a good number of users will have to upgrade from XP to Windows 7, which so happens to be perhaps Windows 7's most frustrating point. That particularly applies to corporate users where many shops never migrated to Vista at all. It's conceivable that Windows 7, Vista-like though it is, may indeed cause a lot of companies to finally make the migration from XP, but that may mostly be because by now XP is two generations out of date and Microsoft very actively discourages the use of XP.

Only time will tell. It seems almost unthinkable that the world will wholesale reject another Microsoft OS the way Vista was rejected. I mean, a company cannot continue to have 90+% of the market when its new products are rejected. This is why Windows 7 is hugely important to Microsoft. If it's another failure, and the coming weeks and months will tell whether the media enthusiasm will give way to user frustration or not, then, Redmond, we have a problem. If the Vista flop is forgiven like Windows ME was eventually forgiven, Ballmer & Co will likely breathe a huge sigh of relief.

Does it all matter in the rugged space? Not as much as it matters in the consumer and commercial markets. The major players will make sure their product lines are able to run Windows 7 well. And an increasing number may look to Windows Embedded, now that it's called Windows Embedded Standard and "XP" has been banished from the name, though for now it's still really XP (Windows Embedded Standard 2011 will be Windows 7-based).

As expected, Apple is having a field day with the Windows 7 release, running one funny "I'm a PC and I'm a Mac" commercial after another. And just as many would love to have iPhone ease-of-use and functionality on their industrial handhelds, many wish the Mac OS were available on rugged machines. But it's not, and so we truly hope that Windows 7 will give the world a productive and reliable computing platform to work on.

Posted by conradb212 at 07:37 PM | Comments (0)

October 07, 2009

Getac to offer multi-touch on its V100 rugged Tablet PC

Multi-touch has been all the rage ever since Apple showed the world the effortless elegance and utility of the iPhone's two-finger pinch and spread to zoom in and out. So what is multi-touch? Basically, it means the touch screen is able to accept simultaneous input from more than one position. While on the iPhone, multi-touch is currently limited to two fingers, there is theoretically no limit as to the number of simultaneous touches.

What is multi-touch good for? Well, Apple's super-elegant zooming certainly got everyone's attention, but multi-touch can also be used for things like rotating with a two-finger screw-in or screw-out motion. In addition, multi-touch can be used for gestures, and the functionality can be built into vertical market custom applications. The math behind the basic gestures is surprisingly simple, as the little sketch below shows.
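
For the technically curious, here's a minimal Python sketch of the arithmetic behind two-finger pinch/zoom and rotate gestures as described above. The coordinates and function names are hypothetical; real touch controllers and their APIs deliver touch points in hardware-specific ways.

    import math  # math.dist requires Python 3.8 or later

    def pinch_zoom_factor(p1_start, p2_start, p1_now, p2_now):
        """Scale factor: current finger spread divided by initial spread."""
        d0 = math.dist(p1_start, p2_start)
        d1 = math.dist(p1_now, p2_now)
        return d1 / d0  # >1 means spread (zoom in), <1 means pinch (zoom out)

    def rotation_angle(p1_start, p2_start, p1_now, p2_now):
        """Rotation in degrees: change in angle of the line between the fingers."""
        a0 = math.atan2(p2_start[1] - p1_start[1], p2_start[0] - p1_start[0])
        a1 = math.atan2(p2_now[1] - p1_now[1], p2_now[0] - p1_now[0])
        return math.degrees(a1 - a0)

    # Fingers spread from 100 to 140 pixels apart: a 1.4x zoom.
    print(pinch_zoom_factor((100, 100), (200, 100), (80, 100), (220, 100)))
    # Second finger swings down 20 pixels: roughly an 11-degree rotation.
    print(rotation_angle((100, 100), (200, 100), (100, 100), (200, 120)))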

While Apple iPhone achieves its multi-touch capability with projected capacitive touch screen technology, that wouldn't work very well in industrial applications where users often wear gloves. For those applications you need a more traditional resistive (pressure-sensitive) touch screen.

There are currently a number of companies working on providing resistive multi-touch systems. Among them are Stantum, Touchco, SiMa Systems, and several others. Some of these products are in the development stage, others are currently available, and each technology is targeted at certain types of applications.

On October 7, 2009, Getac announced multi-touch capability for its V100 rugged Tablet PC. According to Getac's press release, this marks a first for rugged computers, and the multi-touch feature will enable users to rotate maps and pictures, zoom in and out of manuals and other documents, move and edit, navigate, and employ a series of special gestures that go beyond what is possible with traditional touch screens that only recognize a single touch.

While the technology used by Getac wasn't mentioned in the press release materials, Getac added an explanatory page to its website (see here). Getac resellers and developers will certainly have an interesting tool to work with.

Posted by conradb212 at 05:54 PM | Comments (0)

Gorilla Glass -- lighter and tougher display protection

On October 6, 2009, Motion Computing announced that their C5 and F5 were the first Tablet PCs to use Corning's Gorilla Glass. What is Gorilla Glass? In its press release, Motion states that it is "thin-sheet glass that was designed to protect against real-world events that cause display damage."

To learn more I scheduled a call with Corning's Dr. Nagaraja Shashidhar. To prepare myself I checked Corning's very informative page on Gorilla Glass. They have some videos there that show the glass being bent and steel balls falling onto it. The glass neither shatters nor breaks. In fact, it's hard to believe it's glass at all. It looks more like a very thin sheet of some polycarbonate plastic or acrylic. But it is glass.

The secret, according to Dr. Shashidhar, lies in a special chemical ion-exchange strengthening process that results in what Corning calls a "compression layer" on the surface of the glass. The primary purpose of that layer is to act as an armor that guards against the nicks and tiny cracks that then result in the glass breaking. And even if there are tiny nicks, the layer keeps them from propagating.

What's amazing is just how thin the glass is. Corning makes it in thicknesses ranging from 0.5mm to 2mm, or 1/50th to 1/12th of an inch. The Gorilla Glass used in the Motion tablets is just 1.2mm thick, yet it provides the protection of a much thicker layer of protective glass at a fraction of the weight. And a thinner layer of protective glass doesn't only mean less weight, it also makes for a more natural feel when using the tablet. With thick glass it sometimes looks like the tip of the pen hovers far above the actual screen. That's not the case with the Gorilla Glass-equipped Motion tablets.

I had actually had some face time with a Motion F5 tablet with the new glass before Motion announced it. I took the opportunity to not only examine the new display, but also benchmark performance and battery life with the new and more powerful processor Motion now uses for the C5 and F5. I also did side-by-side comparisons between an original Motion F5 and the latest model (see full report).

I must admit that it's a bit hard to figure out all the F5's display technologies. You start with a Hydis display that now has AFFS+ technology for not only a totally perfect viewing angle in all directions, but also superior brightness. You then add the Gorilla Glass cover that significantly increases the durability of the display. On top of it all is Motion's View Anywhere, which is an anti-reflective sputtered coating on the front side of the glass that is optically bonded to the display.

How does it work? Extremely well. Between the super-wide viewing angle (which makes for an unbelievably "stable" display) and the excellent sunlight viewability, this is a machine that you can really use outdoors. The Gorilla Glass adds peace of mind (no, I didn't try to break it). And the Gorilla Glass also has another benefit that may turn out to be quite a selling point for Motion: it's nearly immune to smudges. There's nothing worse than a display that's full of grime and fingerprints, and that just doesn't seem to be an issue with Gorilla Glass.

So there. It's a funny name, Gorilla Glass, but it's definitely a good thing. And I am not surprised that Motion is the first to have it on a tablet. They always seem to adopt new stuff first.

Posted by conradb212 at 02:47 AM | Comments (0)

September 10, 2009

Gotcha, fool! Your friends at AT&T

The other day we tested a rugged handheld in the RuggedPCReview.com lab. The device so happened to have a SIM slot because it also worked as a phone and a WWAN data communicator. I so happen to have an unused phone with a SIM in it, and so I decided to use that SIM for testing the rugged handheld. Why do I have an unused phone? Because it's on one of AT&T's 2-year service contracts. It's just a crappy throw-away phone, but thanks to AT&T I am now paying for it for another year whether I am using it or not.

So I stick that SIM into the review handheld, make three local calls and load a couple of pages of the RuggedPCReview.com website. Works fine. I take the card out and return it to the unused AT&T phone.

So then I get the bill. That'll be $14.83 for 1,483KB -- a penny per kilobyte -- for loading one or two large webpages. Thank you very much, AT&T. This kind of highway robbery is precisely why I have completely stopped making any call that I am not certain is covered in my "plan." I am not even calling my mom anymore because I have no clue what outrageous amount AT&T may charge me for a call to Europe.

But wait, there's more.

I was on vacation in the Caribbean for a week. I took my iPhone with me, not because I was going to make a call (heavens no, not with AT&T in an unknown situation!!!), but because the iPhone is a little computer/camera/vidcam/PDA that I take everywhere. Well, apparently six people called my phone while it was in the Caribbean. I never answered. "That'll be a buck 99 for each call, fool. Haha. Gotcha again. - Your friends at AT&T."

And then AT&T and the other telcos wonder why we loathe them so much.

With voice/data increasingly integrated into rugged handhelds and notebooks, be very careful. That SIM in your machine has "Sucker!!!" written all over it.

Posted by conradb212 at 09:30 PM | Comments (0)

July 30, 2009

Deal killers: The Telco 2-year contracts

Years ago, when some exciting new piece of technology came along I simply could not resist buying it. When the first Newton came out I plunked down seven hundred bucks, just to see how it worked and because I simply had to have one. Likewise when Compaq released the pen-based Concerto in the mid-1990s. And when that same Compaq came out with its first iPAQs, I bought one.

You can't do that anymore these days. That's because virtually every piece of technology now includes a phone, and in order to get service you have to sign up for a 2-year contract with the telephone company. Not gonna happen. If I could pick and choose service or just try out a service, I'd probably have a Palm Pre by now, and each of my notebooks and tablets would probably have a wireless card in it. As is, I'd have to sign up for 2-year contracts for each of those devices. Not gonna happen, ever.

So instead of having a Palm Pre and being able to tell friends and anyone out there interested in reading my blogs and articles what I think about it, I couldn't care less. Am I going to sign with Sprint just to get a Palm Pre? Not gonna happen. Sprint is the company that sent me to collection three times after I cancelled a fully paid and expired contract. Am I going to sign with Verizon or anyone else for TWO YEARS just to get wireless in my notebook? Not gonna happen. Ever.

I know, enough people sign those obnoxious contracts because they see no other option. But those of us who love technology and always had the latest and greatest to write about and take wherever we went don't do that anymore. We can't. The telcos' greed has killed it all.

Posted by conradb212 at 10:53 PM | Comments (0)

July 13, 2009

The dangers of product photography

While most of the press either uses official product photography supplied by PR agencies or press centers, or takes quickie snapshots with their smartphones, we here at RuggedPCReview.com do it the hard way. We do our own product photography and always make sure that the devices are shown in the environment they are most likely going to be used in. That isn't always easy.

I was reminded of that as we recently needed to do product photography on a good half dozen rugged machines. These were rugged and ultra-rugged computers designed to be used on forklifts, in trucks, on bulldozers and other such heavy-duty equipment. Well, it so happened that there was a significant construction site nearby where a large number of utility company trucks, dozers, graders and lifts were prepping a parcel of land for who-knows-what. Construction hadn't really started yet, so the property wasn't fenced in, and all that heavy-duty machinery was just a perfect prop for the product photography I wanted.

So I filled the back of my car with rugged computers, seven in all, and headed for the construction site. For a couple of hours, Carol, our intrepid product photographer, posed the machines on bulldozers, trucks and all sorts of heavy equipment, taking a couple hundred great shots. But we were also sweating bullets as all of a sudden it occurred to us that law enforcement might show up and inquire as to what, exactly, we were doing and where, exactly, all those computers were coming from. The rugged tablets, panels and notebooks we photographed looked like they belonged in the trucks we took pictures of much more than they looked like they belonged to us.

As it turned out, while a few police vehicles drove by, no one stopped and asked what we were doing. And so we didn't have to explain why we were carrying about US$25,000 worth of rugged computers from a construction site into the back of our car. Obviously, we could have explained, but it might have taken an hour or two and perhaps a trip downtown in the back of a police cruiser.

Posted by conradb212 at 09:14 PM | Comments (0)

June 30, 2009

Where rugged computers come from

Where do rugged computers come from? Not always where you think. In an increasingly global marketplace the old business model of companies designing, making, selling and servicing their own products is increasingly going by the wayside. These days, it's more likely that one company thinks of a product, hires another to design it, has it built by a third, a fourth one markets and sells it, and a fifth one does the service. As a result, it's becoming pretty difficult to figure out who does what, and where the computers we buy and use actually come from.

For us here at RuggedPCReview.com, this global marketplace often means a good deal of detective work when trying to figure out who actually makes a machine. You could argue that a computer is a computer and it's not really important who designed and manufactured it. That may be so for some, but I really like to know who did the design, who specified the features, and where manufacturing took place. It'd be silly to praise a company for their excellent design when, in fact, all they did was strike a deal with a Chinese manufacturer and put their label on the machine. There's nothing wrong with that, and many companies do a great job searching for good products that they then sell and service in the US. But it'd still be good to know the actual origin and background of a machine.

What are some of the different business models?

  • There are resellers that sell machines from other companies.

  • There are distributors which carry machines from a variety of sources and often put their own names on the machines.

  • There are vendors and system integrators that sell value-added third party machines under their own name. They may or may not have exclusive arrangements with their suppliers.

  • There are companies that have their own engineering resources and jointly develop machines with Taiwanese or Chinese manufacturers.

  • There are companies that design their own machines, but have them built by a Taiwanese or Chinese contract manufacturer.

  • And finally, there are those who still design and manufacture their own machines.

However, it doesn't end there. Some of the Asian manufacturers have their own relationships and interconnections. As a result, we've seen machines where the top part came from one Asian company and the bottom part from another. We've seen machines seemingly made by Taiwanese manufacturers also being marketed by Chinese companies, apparently under reseller agreements (by and large we assume that machines are made in countries with lower manufacturing costs and marketed or re-sold in countries with higher costs). It can get really confusing.

There are also an awful lot of vendors out there, some of which we had never even heard of. This morning, for example, I came across the Chinese Evoc Group, which has been around since 1993 and makes a large variety of rugged, embedded and industrial computers and components, including some interesting-looking panel PCs and rugged notebooks (check the Evoc JNB-1404 and Evoc JNB-1502 rugged notebooks).

Does it even matter where all those computers come from? Probably not to consumers. Whether the Dell or HP notebook at OfficeMax is actually made by Quanta or by Wistron hardly matters (though it really concerns me that apart from CPUs, some other chips and software, almost nothing is made in the US anymore). All those Taiwanese OEMs are top notch, and an increasing number of the Chinese ones as well. It does matter to us, though.

Knowing, and reporting on, all those lesser known Asian OEMs means finding the hidden gems, the companies whose products we'd love to see on the US market. Covering them may lead to OEM deals with US and European companies, and such relationships can be win-win arrangements for all involved. Our feedback may also help them adjust their products for the US and other Western markets that often have different values, priorities and expectations. In that sense, I hope that we at RuggedPCReview.com can be a clearinghouse and conduit of information.

Posted by conradb212 at 07:28 PM | Comments (0)

June 12, 2009

Palm and Windows Mobile and how the iPhone really changed everything

With all the hoopla over the much anticipated release of the Palm Pre in early June of 2009, I thought about the ever-changing fortunes of the mobile platforms in our industry.

Disregarding some smaller players and initiatives, here's the big picture: In 1993, the Apple Newton made news when then Apple CEO John Sculley pushed it hard and predicted that such devices and their infrastructure would one day be a trillion dollar industry. Sculley was scorned for that remark, as was the Newton for its various shortcomings. But the Newton, way ahead of its time, was still good enough to get Microsoft to respond with its own mobile platform, just as a few years prior Microsoft had responded when pen computing, in the form of GO's PenPoint operating system, threatened to compete with Windows.

So Windows CE was introduced in 1996, together with a lineup of little clamshell handhelds. The same year, Palm Computing released the little Palm Pilot that no one thought was going to be successful because it neither had a keyboard (considered mandatory after the Newton handwriting recognition fiasco) nor an expansion slot. But much to everyone's surprise, the Palm Pilot took off while Windows CE devices quickly garnered a reputation for being clumsy and underpowered.

Microsoft's approach was to reluctantly add features and gradually allow more powerful hardware, always concerned that such devices might eat into the much more lucrative low-end notebook market, just as it now worries about netbooks. Microsoft's hardware partners played along and came up with some amazingly innovative devices (yes, you could get a Windows CE-based "netbook" with a 10-inch display and 800 x 600 resolution ten years ago), but even that didn't work against Palm, which sold handhelds by the millions and adeptly crafted a "Palm economy" and thriving developer community that quickly dwarfed Microsoft's tentative and fragmented efforts.

At some point, Microsoft had the chutzpah to steal from Palm by trying to launch a handheld platform called the "Palm PC," but Palm's lawyers quickly nixed that, and the renamed, ho-hum platform went nowhere. In a last-ditch attempt, Microsoft nuked its multiple processor architecture approach around the turn of the millennium and tried again with the "Pocket PC," a markedly improved platform that has survived, in almost unchanged form, to this day.

Palm, in the meantime, thrived and reached a 75% global market share. When I gave a keynote presentation at the Taipei International Convention Center in 2001 on the future of pen computing and PDAs, I noted that Palm's OS was aging and Windows CE was gaining market share and might catch Palm within four or five years, but no one really believed that. Yet it happened, in a remarkable, unlikely succession of events that saw Palm fumble its leading position away and sink into virtual irrelevance, while Microsoft, hardly more adept with its own mobile efforts, repositioned Windows CE as, essentially, an embedded platform for the vertical market.

That approach, while it made sense, wasn't actually one that I thought was automatically going to be successful. In the late 1990s, Symbol Technologies, now part of Motorola, had been one of the first to adopt non-proprietary operating systems in its products. At some point, they offered both a Palm OS product and a very similar one powered by Windows CE, and at the time we were told that the Palm device did far better. Yet, Symbol was one of the very few vertical market companies that chose Palm, whereas Microsoft was remarkably successful in quietly positioning Windows CE as sort of a low-cost subset of Windows that would leverage corporate IT expertise and investments.

So while a lot of people wondered why Microsoft couldn't do any better in the mobile space, it was probably because they didn't want to. In 2002 I reviewed the T-Mobile Pocket PC Phone, an early smartphone that was amazingly good and would still fit right into the smartphone landscape of today, both in terms of looks and performance. Yet, not much happened after that. HP pretty much gambled away the "iPAQ" brand that came into its possession when they took over Compaq. Taiwanese and Korean companies became the new driving force, with the likes of HTC and Samsung setting trends and directions. And somehow the notion took hold that every handheld had to be a phone, which, in the US at least, meant being forced into overpriced 2-year contracts with telcos that couldn't care less about anything other than profit.

The reason why Windows CE became so successful is not because it's so good. It's a nice workmanlike effort, to be sure, but it's clumsy, sluggish and about as agile as a riverboat. It only took over because a) the proprietary computing platforms of earlier handhelds were no longer acceptable, b) Palm let it win by self-destructing, and c) IT uses Windows and Windows CE sort of fits in. So there. It works, but it's ugly, really ugly.

It took Apple with the iPhone to demonstrate just how ugly Windows CE was. Unlike the Newton, the iPhone got it right from the start, and it totally redefined how a mobile device should work. Its effortless elegance is exactly what people want, and Apple made it look natural and easy. The iPhone is human interface engineering at its very best. It may not meet all the IT-mandated checkmarks (yet) and thus earned stern finger-wagging from some corporate types, but even they probably have an iPhone in their pockets. Once you know how simply and beautifully things can work, you never want to go back.

In a sense it's déjà vu all over again. Apple has a better product and a better idea, but Microsoft still dominates the desktop. Palm, back from the pretty-much-dead, tries again with a slick little box, just like the Palm Pilot once was, only this time they're copying Apple. The question in my mind is how long even workers and industrial users are willing to put up with klutzy, clumsy Windows CE now that almost everyone knows how well handheld electronics can work.

Posted by conradb212 at 04:27 PM | Comments (0)

April 10, 2009

Atom platform expands, but does it have a clear direction?

In the days of the 386, 486 and even early Pentium processors, it used to be fairly easy to follow Intel's chips as they mainly differed in clock speed. These days, staying on top of Intel's various offerings has become an almost full-time job. That even goes for Intel's low-end Atom chips which, despite resurrecting some older Intel technologies such as HyperThreading, at first seemed to simplify the matter of processor selection. It didn't really turn out that way. Intel has been very successful in positioning the Atom processor as new, exciting, efficient and just generally the way to go, but it's really not that simple.

For example, "Atom" has from the start referred to two very different processor families.

The initial generation of Atom processors was the Z5X0, codenamed "Silverthorne," with a tiny 13 x 14 mm package footprint. They were targeted at mobile internet devices (MIDs) and used the also entirely new "Poulsbo" System Controller Hub. The processor has about 47 million transistors, which is more than the Pentium 4 had. Bus frequency is 400 or 533MHz (the 533MHz parts support Intel's HyperThreading). Thermal Design Power is between 0.85 watts for a low-end 800MHz version without HyperThreading, and 2.65 watts for a 1.86GHz version with HyperThreading. The chipset uses about 2.3 watts, which means total CPU and chipset consumption isn't even 5 watts. And the chipset has hardware support for H.264 and other HD decoding. However, as the combo is targeted at internet devices, there is PATA but no SATA support.

A second family of Atom processors, the N2X0, codenamed "Diamondville," was meant for standard low-cost PCs and netbook-type devices. The N2X0 is similar in many ways to the Z5X0 platform, but uses a somewhat larger 22 x 22 mm package. The N270 has a TDP of 2.5 watts and costs less than US$44; the same-speed Atom 230 has a TDP of 4 watts and costs US$29. As of now, the N2X0 processor generally uses a version of the older i945 chipset. In order to reduce the chipset's power consumption to 5.5 watts, its frequency (and performance) have been lowered as well; that version is called the i945GSE and is paired with the N270. The 230 chip, geared towards desktops, uses the i945GC, which is quicker but also uses 18 watts! Note that the i945's GMA 950 IGP is not able to decode HD signals. The N2X0 can also be used with SiS chipsets, though I haven't seen any such systems.
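
Since the key differences between the two families are scattered across several paragraphs, here's a compact Python summary of the figures quoted above -- a snapshot of what's stated in this post, not a complete spec sheet.

    atom_families = {
        "Z5X0 'Silverthorne' (MID)": {
            "package_mm": (13, 14),
            "cpu_tdp_watts": "0.85 to 2.65",
            "chipset": "'Poulsbo' SCH (~2.3W)",
            "disk_interface": "PATA only",
            "hd_decode": True,   # hardware H.264/HD decoding
        },
        "N2X0 'Diamondville' (netbook/PC)": {
            "package_mm": (22, 22),
            "cpu_tdp_watts": "2.5 (N270), 4 (230)",
            "chipset": "i945GSE (~5.5W) or i945GC (18W)",
            "disk_interface": "SATA",
            "hd_decode": False,  # GMA 950 can't decode HD
        },
    }

    for family, specs in atom_families.items():
        print(family)
        for key, value in specs.items():
            print(f"  {key}: {value}")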

From the looks of it, system designers have been struggling to figure out whether to use the Z5xx or the N2xx chip. In netbooks it was a slam dunk for Diamondville, as almost all netbooks use the 1.6GHz N270. However, there are exceptions. When Panasonic introduced its Toughbook CF-H1 Mobile Clinical Assistant, it came with the 1.86GHz Atom Z540 processor. And when Samwell, one of Taiwan's major OEMs in the semi-rugged and rugged space, introduced what is essentially a rugged tablet version of a netbook, they also picked a "Silverthorne" processor, in this case the Z530P.

I am not sure what drives the decision to go with an Atom N270 versus an Atom Z530. On the surface, they seem to have about the same performance and use about the same amount of power. One glaring difference in their specifications is that the N2XX series supports the ever-important SATA (serial ATA) disk interface whereas the Z5XX does not and needs to use PATA drives. On the other hand, the technically inclined point out that the N2XX's use of a very slow version of the already dated i945 chipset makes for sluggish graphics performance and that the i945's GMA 950 IGP is not able to decode HD signals. Anyone who has tried playing back high-def video on an N270-based netbook knows the pain. However, both versions of the Atom score about the same on the two benchmark suites we use here at RuggedPCReview (PassMark 6.1 and CrystalMark 2004). The Z5xx, in fact, scored very low in 3D graphics, which one would assume is at least of some importance in any "mobile internet device."

But things are getting more interesting yet. Despite what on the surface appears to be the more lucrative "Diamondville" market with its many millions of N270 chips, on April 8, 2009, Intel announced the expansion of the Z5xx platform with a new high-end version, the 2GHz Z550, and a new gas miser version, the "up-to-1.2GHz" Z515. At the same time, Intel spoke of an entirely new Atom platform called "Moorestown" that combines the "Lincroft" system-on-chip with the "Langwell" hub, of which as of now all I know is that it uses a lot of acronyms and is still based on 45nm manufacturing technology.

On the N2xx horizon, there is the N280 processor, and apparently also a dual core Atom chip. There is not much material out there on those, and I need to look more into it.

There was another development. For embedded computing Intel quietly expanded the Z5XX platform with larger form factor versions that carry a "P" in their name, and then special "large form factor with industrial temperature options" versions marked with a "PT." I was aware that Intel would release a "large package" version of the Atom, but not the timing and the purpose. Well, this happened in March of 2009 when Intel added the "large form factor" Atom 1.1GHz Z510P and 1.6GHz Z530P as well as the "large form factor with industrial temperature option" 1.1GHz Z510PT and 1.33GHz Z520PT. What does that mean? In essence, the P and PT versions look like larger chips. Instead of the tiny 13x14mm package of the original Z5xx chips, they use a 22x22mm package, which is actually the same size as the N2xx chips. As far as temperature range goes, 0 to 70 degrees Celsius (32 to 158 degrees Fahrenheit) is considered "commercial," whereas -40 to 85 degrees Celsius (-40 to 185 degrees Fahrenheit) is considered "industrial." Interestingly, only the "PT" series processors support the industrial temperature range; the "P" series versions are listed with the same commercial temperature range as the initial chips.

Intel's updated Z5xx product brief now stresses fairly strongly that there are industrial as well as commercial temperature range packages for both the Z5xx processors and their complementing US15W system controller hubs (GMA 500 graphics, I/O controller and memory controller). The brief also stresses that the small footprint versions are for space-constrained handheld and embedded devices whereas the large form factor is pitched for designs without tight space restrictions but with industrial temperature requirements. So why then do the "P" processors still have the same commercial temperature rating? Probably because the large package also includes "an integrated heat spreader" that "further contributes to its value for thermally constrained, fanless applications." Since the thermal design power of these chips was already tiny, I am not sure what the integrated heat spreader does, or why it was necessary.

In terms of performance, the "P" large form factor and "PT" large form factor/industrial temperature range chips appear unchanged, though the TDP is up a bit from 2.0 to 2.2 watts. However, if you compare Intel's summary sheets for the Z530 and the Z530P, it looks like the Z530P chip is missing Intel Virtualization Technology as well as Demand Based Switching. Virtualization technology, according to Intel, allows "consolidating multiple environments into a single server or PC," which I believe means the CPU acts as if it were multiple CPUs operating independently so you can run different operating systems at the same time. Demand Based Switching is described as an enhanced version of Intel's SpeedStep technology (see description), which is available in both versions of the Z530. These are generally fairly involved server-oriented features and I am not sure what their relevance to the new "large package" Atom processors is.

In any case, the "large package" also has a different "ball pitch," which refers to the spacing of the little balls of solder that replace pins on the underside of these tiny processor packages. From what I can tell, the 0.6mm ball pitch of the original Z5xx series requires high density interconnects (HDI) on the printed circuit boards, and those are more difficult to do and also more finicky--not what one would want in a rugged product (for an example of these issues, read this). So the "P" series would address that issue with its larger package size whereas the "PT" series would appeal to automotive and other transportation and industrial applications that often have a -40 to 185 degrees Fahrenheit requirement.

Now add to this that Atom chips, despite all the hoopla and market acceptance, are pretty poor performers, benchmarking no better than the lowly original Core Solos. Graphics performance, especially, is weak (what's considered weak in one device can be more than adequate in another, of course). There's the low power consumption, of course, but even that is not a given. We've benchmarked exceedingly thrifty Core 2 Duo machines as well as power-guzzling Atom systems, so proper setup and configuration are an issue.

Sometimes it almost seems like the Atom is sort of a trial balloon, one where Intel very successfully created an attractive image of a hip processor, but is also somewhat aimlessly trying out various applications to see where the Atom will fit and stick.

Posted by conradb212 at 02:20 PM | Comments (0)

January 15, 2009

The Intel Atom processor phenomenon

Frustrated with the small display and insufficient battery life of your mobile or handheld computer? Is it also too big and just not quick enough? And you can't stand a fan coming on and the thing getting so hot you can barely touch it? Welcome to the world of mobile computing where optimizing mutually exclusive goals is the order of the day. As a result, manufacturers of mobile gear are fighting a never-ending struggle to find the best compromise -- and it is always a compromise -- between size, weight, usability, performance and battery life. The screen should be large enough to be useful. Size and weight should be such as to render the device as mobile as possible. Performance should at least be adequate. And the battery must last long enough to get the job done. Long battery life either means a big battery or a device that doesn't use much power, and the latter is often preferable. Displays use a lot of power, especially with the backlight up high, but you simply need to see what you're doing and so display size may be a given.

Which gets us to the processor. There was a time when processors cost next to nothing and the mere thought of needing to cool them with a big fan would have been preposterous. When I bought my first IBM PC in 1981, it cost US$4,000, in 1981 dollars. It was powered by a 4.77MHz Intel 8088 processor that you could buy at any electronics store for about six dollars (the folks who proclaim that ALL electronics components have become so much cheaper obviously weren't around in 1981). Intel managed to parlay the processor business into a near monopoly, with Microsoft and Intel going lock-step in a mutually advantageous game of creating ever more resource-intensive software. Microsoft made Windows bigger and bigger, and Intel delivered the processors needed to run it. That's what got us to a point where software needs minutes to boot, and the processor, chipset and graphics card all need big fans for cooling. Oh, and while the cost of computers has come way down, the cost of Intel processors has gone way, way up. A big new one can cost a thousand dollars, and even more modest ones approach the cost of low-end notebooks. A halfway decent Core 2 Duo costs more than a little Acer Aspire One netbook.

How can Acer, and everyone else who makes small, inexpensive computers do it? Increasingly by using the Intel Atom processor, which is smaller, uses less power, and costs relatively little. Why did Intel do it? Because they found themselves in a predicament. Microsoft increasingly insists that every computer must run Windows proper. The 1990s experiment with Pocket PCs is essentially over. By insisting that small platforms had to be compatible with Windows, yet making sure they didn't get powerful enough to be a threat to the Windows business, Microsoft successfully kept the wings of mobile devices clipped, to the extent where they eventually disappeared as viable platforms. Just the other day I came across a press release from a major manufacturer of rugged handheld computers that said its customers increasingly demanded full Windows even on handheld devices. And that gets us right back to the Atom processor.

Now cost isn't as much of a factor in the vertical marketplace as it is in the consumer market. I am not saying cost doesn't matter, but a market where a device may cost US$4,000 has a bit more leeway than one where customers expect US$800 pricing. What does matter, though, is size, weight and battery life. So what Intel did with the Atom processors is essentially remove the processor as a major power consumption factor. What do I mean by that? Well, an average Core 2 Duo desktop processor uses around 65 watts, a mobile version about 35 watts. There are chips that use considerably more or a bit less, but those are the rough numbers.

Now how do we know how much power a processor uses? After all, Intel sells them using a weird nomenclature that, unlike the watt rating on a light bulb, says nothing about power draw. Instead, Intel usually provides what they call the "Thermal Design Power," or TDP. TDP is described as "The maximum amount of heat which a thermal solution must be able to dissipate from the processor so that the processor will operate under normal operating conditions." There's a good deal of debate as to what TDP actually means and how it relates to real world power consumption of a processor. But for the sake of the argument, let's assume we're talking watt-hours and the processor is in a battery-powered computer. We can easily compute the battery's watt-hours by multiplying its voltage by its amp-hour rating. A powerful notebook computer battery may provide 75 watt-hours, just enough to run a typical desktop processor for an hour (and that's without the power needed for the display and everything else in the notebook). A frugal notebook processor with a TDP of 25 watts would run three hours, and that sounds about right (in the real world, the processor uses power conservation modes most of the time, but you have to add in the power used by all the other computer components).
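
Here's that back-of-the-envelope arithmetic as a few lines of Python -- a deliberately crude upper-bound model that assumes the processor runs flat-out at its TDP and ignores the display and everything else, just to show the orders of magnitude involved. The TDP figures are the ones quoted in this post.

    battery_wh = 75.0  # the "powerful notebook battery" from the text

    # Runtime in hours = battery watt-hours / processor watts.
    for cpu, tdp_watts in [("typical desktop processor", 65.0),
                           ("frugal notebook processor", 25.0),
                           ("Atom N270", 2.5)]:
        hours = battery_wh / tdp_watts
        print(f"{cpu}: {hours:.1f} hours at full TDP")
    # typical desktop processor: 1.2 hours at full TDP
    # frugal notebook processor: 3.0 hours at full TDP
    # Atom N270: 30.0 hours at full TDP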

Now what does an Atom processor use? Between 0.6 and 4 watts. There are two different families of Atom chips, one geared towards mobile Internet devices (MIDs) and one towards netbooks and other low-cost PCs. The most popular chip in mobile computing is probably the 1.6GHz Atom N270, which has a TDP of 2.5 watts. That's the chip you find in almost all current (early 2009) netbooks and in many embedded components. Why two families? Because MIDs and PCs have different feature requirements. MIDs are usually multimedia-oriented and power consumption is totally crucial because the devices are so small. Netbooks and similar devices generally rely more on compatibility and standard PC interfaces (like SATA).

So where do the Atom processors fit in as far as power consumption goes? Well, 2.5 watts is sensationally low compared to just about anything else available. The generally unloved Intel Core Solo uses about 5.5 watts in its ultra-low power version (U1300/1400/1500), the Core 2 Solo (U2100/2200) about the same, the mobile Core 2 Duos between 10 watts (U7500) and 45 watts (Q9100/9300). So the most popular Atom processor uses less than half the power of a Core Solo and only a small fraction of that of the Core 2 Duo chips.

Now keep in mind that processors need corresponding chipsets, and those use power, too. Intel designed a super-efficient chipset to go with the MID-oriented Z5xx series of Atom chips that was once codenamed Silverthorne. That chipset, the "Poulsbo System Controller Hub," can do high definition video decoding and other neat stuff required in consumer multimedia devices, and it only uses about 2.3 watts. However, it does not support serial ATA and some other essentials, which rules it out for many computing applications. The N2x0 series of Atom chips uses the i945GSE, which is a slowed-down version of an older Intel chipset, the i945. That's good as far as compatibility goes, but there is no high-def decoding and 3D performance is low. The i945GSE uses about 5.5 watts, so overall consumption of the N270 and the chipset is still only about 8 watts, but it's not exactly a state-of-the-art solution.

How about performance? This is where it gets a bit complicated because overall "performance" of a computer depends not only on the CPU, but also the chipset, the memory, the hard disk or SSD, overall system configuration and -- very important -- the OS platform and software loaded. That said, we run fairly extensive benchmarks on all systems that come to our lab, and so far we've found that an average Atom N270 device scores roughly one third of that of a 2.5GHz Core 2 Duo T9400, about 30% less than that of a 1.2GHz Core Duo U2500, about the same as a 1.2GHz Intel Core Solo U1400, and about 50% better than that of a 1GHz Celeron M 373. So we're talking decent, but certainly not blazing speed.
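
To make those relative numbers easier to compare, here's a tiny Python sketch that puts the benchmark ratios quoted above on a common scale, with the Atom N270 as the baseline. The ratios come straight from the text; the printout is just arithmetic, not new measurements.

    # Rough composite benchmark scores relative to the Atom N270 (= 1.0).
    ratios_vs_n270 = {
        "Core 2 Duo T9400 (2.5GHz)": 3.0,        # N270 scores about 1/3 of it
        "Core Duo U2500 (1.2GHz)":   1.0 / 0.7,  # N270 scores about 30% less
        "Core Solo U1400 (1.2GHz)":  1.0,        # about the same
        "Atom N270 (1.6GHz)":        1.0,        # baseline
        "Celeron M 373 (1GHz)":      1.0 / 1.5,  # N270 scores about 50% more
    }
    for chip, ratio in sorted(ratios_vs_n270.items(), key=lambda kv: -kv[1]):
        print(f"{chip:26s} {ratio:5.2f}x")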

As far as architecture goes, the Atom is an interesting mix of old and new technologies. It's definitely state-of-the-art in terms of miniaturization, using Intel's hafnium-based high-k manufacturing process. That is fancy terminology describing the use of different conductor materials to make even tinier transistors possible. The architecture of the chips is less advanced. There's only a single core, though Intel uses the old HyperThreading technology known from as far back as the Pentium 4. There are also advanced new power savings technologies.

Overall, the Atom is certainly an interesting marketing phenomenon. At this point, everyone is clamoring to get onboard the Atom bandwagon, and somehow Intel managed to stay clear of the nuclear power connotation though one would expect that from a name like "Atom." Intel, though, stresses the hafnium-based manufacturing process, and hafnium's primary use is in control rods in nuclear power plants, so that may be the "Atom" connection. In any case, even with the sub-optimal chipset situation, the lack of some features, and only moderate performance, Atom is hot. And in the new Intel world order of massively expensive processors, Atom is cheap, too, with prices well under US$100 depending on the type and version. I've seen $44 mentioned for the N270, and about the same for some of the low-end Z5x0 chips plus their Poulsbo chipset. Oh, and if you wonder what the difference is between the N270 and the 230: both run at the same speed, but the N270 is for mobile applications whereas the 230 uses a bit more power (4 watts) and is paired with a considerably more power-hungry version of the i945 chipset, making the Atom 230 more suitable for desktop use.

As usual, there are numerous expert opinions out there, and the overall consensus seems to be that, for now, the Atoms just represent Intel's first step into the small form factor embedded and MID markets that are pretty much dominated by ARM-based designs.

With Intel's resources and marketing savvy, Atom as a "low power" processor platform may well be here to stay. As is, they are off to an amazingly good start.

For much more information on the Silverthorne platform, check Intel's Intel Atom processor Z5xx Series.

Posted by conradb212 at 04:42 PM | Comments (0)

January 05, 2009

The amazing success of "netbooks"

These days, "netbooks" get a lot of press. You'd think a "netbook" were some sort of miraculous new device, a technological breakthrough that lets you do new and wondrous things. In fact, "netbooks" are nothing more than little notebooks. There is absolutely nothing new or exciting about them. And there is nothing that makes them earn the "netbook" name.

Nor are they new. There have been numerous attempts at selling downsized miniature laptops over the years, going back to the early 1990s and before. None were ever successful. People simply did not want an underpowered mini version of a notebook with a small screen and a keyboard that was not full size. Apparently that's changed and "netbooks" sell by the millions. Go figure.

One difference perhaps is that technology has come a long way. Even an underpowered mini notebook can do just about anything anyone would ever need in terms of computing. Standard word processing, scheduling, spreadsheets, presentations, email and internet access tasks can all be done on a mini notebook. Let's take a look at what "netbooks" offer:

For the most part they are clamshells measuring about 10 x 6.5 inches and weighing between two and three pounds. They have displays measuring between 7 and 10 inches diagonally, usually with WSVGA resolution, which means 1024 x 600 pixels. Their keyboards are usually around 90%-scale, which is infuriating because that makes touch-typing a pain, and also because there'd actually be enough room for a full QWERTY layout by making the punctuation keys smaller, but apparently Taiwanese and Chinese ODMs and OEMs do not realize that. Memory is usually limited to a gigabyte, though some can be expanded to a gig and a half. Storage is either via flash for Linux-based netbooks or a generously-sized hard disk for Windows-based units. Most come with a rudimentary onboard cam, SD card or multi-card slots and, of course, Bluetooth and WiFi. And most are powered by Atom chips, generally the 1.6GHz N270.

How do they work? It depends on your expectations. Benchmark performance is about a third of that of a modern notebook, so routine stuff can take much longer than you're used to. The biggest limitation is the small screen. My Acer Aspire One, one of the most popular netbooks, has an 8.9-inch screen which is bright and sharp, but 1024 x 600 pixels simply isn't enough for anything these days. Working with it becomes a continuous struggle for screen real estate, which means turning off unneeded toolbars and a lot of scrolling, scrolling, scrolling. The term "netbook" is also a total misnomer, as the one thing where the current generation of netbooks falls way behind is fast web access. Pages take forever to load.

If they are such a pain to use, why do I have a netbook? Because they have a lot going for them, too. My Acer One runs Windows XP speedily on 1.5GB of RAM, and the 160GB hard disk is both quick and large enough. With its 6-cell battery the little Acer can run as long as six hours on a charge, and sometimes more. I like its dual SD card slots. I occasionally miss an optical drive, but have my office network set up so I can access the DVD drive of a desktop. Most of all, I like the Acer's small and handy size. Packing and transporting even a compact notebook is usually a pain, but the little Acer netbook fits absolutely anywhere. Even its power supply is tiny. In my office, I hook it up to a 20-inch LCD and a full-size keyboard and mouse. I get full 1600 x 1200 pixel resolution, which makes working on the little Acer feel like working on a "real" computer.

So, "netbooks" they are not. But there does seem to be a good-size niche for surprisingly competent little notebooks that go for less than US$400. Price is definitely an issue. I'd rather have a more rugged device with a touch screen. Fujitsu and Panasonic and others make them, but for several times the money. Why not a rugged netbook with a very small price? It might sell in large quantities.

Posted by conradb212 at 05:29 PM | Comments (0)

December 19, 2008

The problem with Linux

On the surface, Linux should be a huge winner, and in many respects it is. Hey, what more can one want than a free operating system with mostly free software that runs on just about anything? I've been using Linux for many years for just that reason. Free. No hassles with activation, copy protection, and other pesky schemes meant to keep pirates away yet only inconveniencing customers.

So why hasn't Linux taken over? Because it's too complex. Sure, there are distributions that install simply and easily, but you can also spend hours trying to get one little thing to work right. Linux is a giant patchwork of code from all over the world. Perhaps the biggest challenge is that almost all Linux developers think Linux is so simple that absolutely everyone should be able to perform arcane steps and procedures.

Linux suffers from the expert syndrome. The expert syndrome is what makes academics speak in nearly incomprehensible language. It makes them look and sound important and, in their minds, is a reflection of their superior intellect and knowledge. Coders, likewise, revel in acting as if their most complex systems were child's play and anyone who does not master them must be an idiot. Some of the instructions for Linux are so complex and incomplete as to make it impossible for anyone who does not already know the systems to install things or make them work.

In all my time of working with Linux I've found perhaps a handful of truly useful tutorials and instructions. Sadly, this puts an incredibly productive global community of Linux coders and developers squarely at odds with the rest of humanity, who can no more compile a kernel than split an atom. The rest of humanity also does not appreciate being talked down to when it comes to doing simple things like properly extracting a file, making a wireless connection work, or numerous other things that should be simple and self-explanatory but, in Linux, are not.

Unfortunately, I do not see a solution to this problem. You either have tightly controlled empires like Microsoft or Apple where things are centrally controlled and packaged, or you have loosely knit global communities of techies with all their human brilliance and flaws. So things will likely continue the way they have for decades, with Linux being both a terrific solution and one that can be endlessly frustrating.

Posted by conradb212 at 04:32 PM | Comments (0)

November 21, 2008

Smartphone & Pocket PC Magazine -- the shortsightedness of letting an incredible resource die

With Microsoft sitting on billions of dollars in cash and spending many millions on comedian Jerry Seinfeld and a silly Vista campaign, the one magazine that has covered Pocket PCs and Windows Mobile for many years has just died due to lack of support. I am talking about Smartphone & Pocket PC Magazine, published by Thaddeus Computing Inc. Those guys were publishing magazines on small Microsoft-powered computers for almost a quarter of a century, yet neither Microsoft nor Hewlett Packard apparently cares enough about real, quality coverage of their products to at least use this incredible magazine as a venue for advertising, let alone treat it as the important, invaluable partner in spreading the word about handheld computers that it was.

Having founded and run several print magazines myself, I know all about the work and hardship that goes into creating a quality magazine, and how things are all different in this day and age of the Internet and web. Advertising dollars are increasingly going away from print, and people no longer want to wait for information to appear in print. Everything is available instantly. Yet, the information on the web is .... different. In a way it almost does not compete with print. How else would one explain the fact that there appear to be more magazines on newsstands than ever? I myself absolutely cannot imagine life without computers and the web, yet I have a good dozen print magazine subscriptions that I never intend to give up. Magazines and the web are as different as radio and TV -- both convey information and entertain, but in different ways. Unfortunately, tech companies like Microsoft do not seem to understand that, and the phone companies who have taken over the smartphone business are clueless about the market that has fallen into their laps.

Fact is, online is becoming much like TV -- far too many channels and nothing to watch. It's all commercials and infomercials. You have to channel-flip not because you can, but because you're constantly avoiding commercials and seeking something, anything, meaningful to watch. And quality is getting lost in a vast sea of drivel. You can google a particular product and instantly get 10,000 references to it, mostly junk. By now the web is jam-packed with virtually content-free sites that are just landing pages for ads and more ads. Even reputable sites are doing it: two paragraphs of content and then commercial bombardment. The ever more popular "customer reviews" are often little more than "this product sucks!", "no, this product is the best ever" slugfests, and the same goes for bulletin boards where there is endless posting and almost no factual information. Even with the by now almost suffocating commercialization, it's all worth it, of course. But it is NOT a replacement for a good print magazine.

When I look at the final copy of Smartphone & Pocket PC Magazine (the Smartphone & Pocket PC Super Resource Guide Dec/Jan 2009) I see a hundred pages of superb, comprehensive information, a reference guide I am certain to keep around for years. You'd have to visit literally thousands of websites to get that amount of good information, and even then you would not get the quality. A complete and total spec list of ALL smartphones with touch screens? Check. A complete and total spec list of ALL PDAs? Check. Reviews and ratings of hundreds of the best software apps? Check. A complete analysis of GPS on Windows Mobile, including product reviews and comprehensive comparison charts? Check. Detailed reviews of the leading and upcoming smartphone platforms? Check. And that is just a small part of it. If a consultant were given the task of compiling the huge wealth of information contained in just one issue of Smartphone & Pocket PC Magazine, it'd cost many tens of thousands of dollars, and probably hundreds of thousands. For a company like Microsoft to let such an incredible resource die -- a resource that does nothing but promote Microsoft's mobile embedded platform -- is simply unimaginable. Spending millions on nonsensical commercials and sitting on billions, yet not supporting real, quality, serious information simply does not compute. The cost of supporting a resource like Smartphone & Pocket PC Magazine that provides real information is absolutely minuscule compared to the billion here, billion there mentality of big business.

Lacking any meaningful support from the Windows Mobile side of things, Thaddeus Computing is now going on to cover the iPhone platform with their new iPhone Life magazine. It'll be an uphill battle as now they'll be dealing with one single hardware and software vendor (Apple), one single service provider (AT&T), and application software vendors who do all of their selling through Apple's App Store, so the impact of print advertising will be less traceable than ever. The iPhone is hugely popular, of course, but neither will people buy another iPhone (they're locked into a 2-year contract) nor can they buy another model (there's only one). The phone companies have historically not supported enthusiast magazines and there is no indication they ever will. They also don't "get it," something at least the Microsoft field people certainly did.

But won't Apple be thrilled to see one of the most respected niche and enthusiast publishers switch allegiance? Likely not, if they even notice. Apple is sitting on its own billions of cash, but I am fairly certain none of it will go to supporting a small magazine that could spread high quality news and real information at an annual cost that's a tiny fraction of the interest on Apple's cash reserves alone. And AT&T, which in the U.S. has a service monopoly on the iPhone? Hah.

So best of luck to the folks at Thaddeus Computing. It's an absolute crying shame to see Smartphone & Pocket PC Magazine die, and those in the Windows Mobile industry who let that happen deserve to be accused of colossal, inexcusable shortsightedness. Maybe someone will come to their senses and buy Thaddeus. 25 years of experience and commanding knowledge of the major serious mobile platform in the world AND they know how to compile and present information AND they have all the magazine distribution channels in place AND running them for a year probably costs peanuts? No brainer if you ask me.

Posted by conradb212 at 03:27 PM | Comments (0)

November 18, 2008

Thoughts about ingress protection: eliminate potential points of failure

The most commonly used measure for protection against the elements is the IP rating, or Ingress Protection rating. The IP rating consists of two numbers where the first indicates protection against solids and the second protection against liquids. Solid ratings go from 0 to 6, with 6 meaning the best protection. Liquid ratings go from 0 to 8, with 8 meaning the highest protection. Essentially, the purpose of these ratings is to determine how well a device can keep out dust and water. As far as liquids go, the purpose of the rating is not to signify waterproofing for underwater operation (though IP68 means a device is indeed waterproof) but how well a piece of equipment can keep out water during normal operation in the field. What could happen, for example, is that a device gets exposed to rain, or even strong driving rain during a storm. In a marine setting it is possible for a device to suddenly become exposed to heavy seas, and it may need to be protected against that.
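
Purely as an illustration, here's a minimal sketch of how those two digits decode. The level descriptions follow the standard IEC 60529 scheme; the function itself is mine, not part of any standard:

```python
# Minimal sketch: decoding an IP rating into its two protection levels.
SOLIDS = {
    0: "no protection", 1: "objects > 50 mm", 2: "objects > 12.5 mm",
    3: "objects > 2.5 mm", 4: "objects > 1 mm", 5: "dust protected",
    6: "dust tight",
}
LIQUIDS = {
    0: "no protection", 1: "dripping water", 2: "dripping water when tilted",
    3: "spraying water", 4: "splashing water", 5: "water jets",
    6: "powerful water jets", 7: "temporary immersion",
    8: "continuous immersion",
}

def describe_ip(rating: str) -> str:
    """Decode a rating like 'IP67' into its solids/liquids meaning."""
    solids, liquids = int(rating[2]), int(rating[3])
    return f"{rating}: {SOLIDS[solids]} / {LIQUIDS[liquids]}"

print(describe_ip("IP67"))  # IP67: dust tight / temporary immersion
```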

All of this needs to be tested and certified, and the way it is usually done is by following standard procedures that describe a controlled lab testing setup, like document 60529 issued by the International Electrotechnical Commission (IEC).

The problem is that lab tests do not always accurately predict what may happen in real life. In that respect the ratings should really be considered guidelines rather than hard data. Consider, for example, two devices that both carry an IP67 rating. One of them has no external ports other than a single surface mount connector used to provide interfacing via a port replicator or dock. The other has a variety of commonly used ports, all protected by individual rubber plugs. That machine may also have an externally accessible expansion slot and an easily replaceable battery, each nicely sealed via o-rings and other high quality seals. Which device do you think is more at risk for leaking?

I'd say the second as it has multiple areas of entry as opposed to just one. No matter how well engineered the device may be, the probability of something going wrong is higher. A protective cover may not be pushed in all the way. A seal may have shrunk or gotten broken. A door was inadvertently left open. It can happen.

A compromised seal may not necessarily mean a leak into the inside of the device. The port itself may carry enough sealing in addition to the protection provided by its cover to ward off damage. Then again, it may not. The bottom line is that the simplest and most foolproof protection is best.

Anything mission-critical should be fail-safe. Fail-safe means that if a system fails, it must fail in its safe state. A relay that snaps closed when it loses power is an example. The problem with protective rubber and other seals is that none are fail-safe. They are all fail-fail. So the best way to proceed is to have as few potential points of failure as possible.

What that means is that, all else being equal, a device with fewer possible points of failure will almost always be a better choice as far as protection is concerned.
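
To put a rough number on that intuition, here's a small sketch; the 1% per-seal failure chance is an assumption purely for illustration:

```python
# If each seal independently has a small chance of being compromised,
# the chance that at least one fails grows quickly with the seal count.
def p_any_failure(n_seals: int, p_seal: float = 0.01) -> float:
    """Probability that at least one of n independent seals fails."""
    return 1 - (1 - p_seal) ** n_seals

print(f"{p_any_failure(1):.3f}")  # single sealed connector: 0.010
print(f"{p_any_failure(6):.3f}")  # six plugs/doors/slots:   0.059
```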

Posted by conradb212 at 11:12 PM | Comments (0)

November 10, 2008

Benchmarking popular mobile Intel processors

Well, we finally managed to benchmark a mobile device with an Atom processor. Like everyone else, I was wondering where Atom performance fits in. The Thermal Design Power (TDP) of the 45nm Atom processors is so ridiculously low that it's impossible to even make an educated guess. There are, of course, a number of different Atom processors out there, but one that appears to be popular in small mobile devices is the Atom N270.

The N270 is a single-core processor that runs at 1.6GHz and has a TDP of 2.5 watts -- significantly less than even an ultra-low voltage Intel Core Solo and only a small fraction of the power consumption of your average consumer notebook. There are other system parts that use power, and for now Intel doesn't offer Atom-compatible chipsets that are nearly as miserly as the processor itself. Further, a lot of the advanced features we've come to take for granted in Intel Core processors are simply not part of the Atom. Instead, Intel resorted to the hyper-threading technology from its past. It's all quite complex and it probably takes a chip design expert to tell how various Intel technologies impact performance.

What we can do is run benchmarks, and that's what we did on an Atom N270-powered Acer Aspire One netbook, an exceedingly handy little clamshell computer with a WSVGA 8.9-inch display and a weight of just over two pounds. The tiny Acer came with a gigabyte of RAM, a 160GB 5400rpm disk, and ran Windows XP. Our standard benchmark suite, PassMark, did not complete and so we switched to CrystalMark 2004R2. Here are the results:

PERFORMANCE COMPARISON        Intel A110    Core Solo U1400    Atom N270    Core Duo U2500
Clock speed                   800MHz        1.2GHz             1.6GHz       1.2GHz
Test Unit                     GETAC E100    Motion F5          Acer One     Xplore 104C4
Thermal Design Power (TDP)    3.0 watts     5.5 watts          2.5 watts    10.0 watts
ALU                           3026          4565               5544         9291
FPU                           3682          5343               5370         11124
MEM                           2732          4989               4442         6132
HDD                           3614          3252               7900         6381
GDI                           3040          4239               3293         3987
D2D                           2530          4221               2912         3899
OGL                           738           1151               684          1187
Overall CrystalMark           19362         27760              30145        42001

These figures suggest that systems equipped with the Atom N270 are quite a bit quicker than machines with the Atom's predecessor chip, the A110, but only a bit faster than the first-gen Intel Core Solo. The 1.6GHz Atom N270 is no match for the 1.2GHz Core Duo U2500 that's used in a number of high-performance Tablet PC slates. The high clock speed of the single core N270 is therefore a bit misleading. Clock cycle for clock cycle, the unloved Core Solo is more powerful.
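
For what it's worth, here's the simple per-clock arithmetic behind that claim, using the overall CrystalMark scores and clock speeds from the table above. It's a rough sketch that ignores, among other things, that the U2500 has two cores:

```python
# Overall CrystalMark score divided by clock speed, from the table above.
results = {
    "Intel A110":      (19362, 0.8),
    "Core Solo U1400": (27760, 1.2),
    "Atom N270":       (30145, 1.6),
    "Core Duo U2500":  (42001, 1.2),
}
for chip, (score, ghz) in results.items():
    print(f"{chip:16s} {score / ghz:8.0f} points per GHz")
# The Core Solo U1400 works out to roughly 23,100 points per GHz versus
# the Atom N270's roughly 18,800 -- the per-clock gap the text refers to.
```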

However, in a lean, smartly designed system with enough RAM and a speedy disk, such as the Acer One netbook, the N270 can deliver both power and economy. The Acer feels fairly quick, and it runs about 2-1/2 to three hours on a small 24 watt-hour 3-cell battery and 5-1/2 to six hours on a 49 watt-hour 6-cell battery.

Posted by conradb212 at 09:59 PM | Comments (0)

October 15, 2008

Ultra-rugged waterproof displays

In RuggedPCReview we usually cover mobile computers, i.e. systems that combine processing, storage, data input and display all in one unit. That, however, doesn't mean that all mobile systems must be all-in-one devices. Tablets and slates, for example, are often used in conjunction with an external display and full-size keyboard when used in a stationary environment, and there really is no compelling need for vehicle and panel mount systems to be all-in-one.

I was reminded of that when I came across some very interesting display products from a company called Digital Systems Engineering, located in Scottsdale, Arizona. They have the DVE Raptor display where DVE stands for "Driver Vision Enhancement." It is a ruggedized LCD display designed to operate under the kind of extreme environmental conditions encountered in tactical wheeled and tracked vehicles. The 10.4-inch SVGA display is sunlight readable with a super-strong 1,000 nits backlight (standard notebooks have less than 200 nits), good vertical and horizontal viewing angles, and zero color shift.

What's most amazing, though, is the Raptor display's environmental specs. It carries an IP67 rating, which means it is not only totally sealed against dust, but it is also waterproof to the extent where it is submersible. Hopefully that won't happen in a tactical vehicle, but this display will continue to operate under water. It can also operate in an extremely wide temperature range of -40 to 158 degrees Fahrenheit, handle any degree of humidity, and operate at 45,000 feet of altitude. Needless to say, the milled aluminum and heavily sealed and protected display has been shock and vibration tested to MIL-STD-810F specs.

The screen, which only weighs a bit over eight pounds, is also MIL-STD-3009 compliant. MIL-STD-3009 (also referenced as DOD-STD-3009) sets requirements for aircraft display equipment for use with night vision imaging systems. For mobile computers that generally means they must not interfere with night vision equipment in a cockpit. Part of this document is the U.S. Navy MIL-HDBK-87213 Revision A (Electronically/Optically Generated Airborne Displays) that describes, among other things, criteria for legibility of electro-optical display equipment and daylight readability in bright environments, which is a military requirement. This can be an issue with daylight readable displays marketed to the government and armed forces.

If the indestructible Raptor is overkill, Digital Systems Engineering has a line of MSM monitors where MSM stands for Mil Spec Monitor. These come in various display sizes (8, 10, 12 and 15 inches) and are lighter than the Raptor. Despite IP67 sealing, they only weigh between 3.5 pounds (8.4-inch display) and 6.9 pounds (15-inch display). Yet, the MSMs are MIL-STD-3009, MIL-L-85762A and MIL-PRF-22885 compliant and have an incredibly bright 1,400 nit backlight in addition to anti-reflective and anti-glare surface treatment, making them viewable under any lighting conditions.

To learn more about those super-rugged monitors, check Digital Systems Engineering's website at http://www.digitalsys.com.

Posted by conradb212 at 04:18 PM | Comments (0)

September 30, 2008

Why is no one using the Marvell speedy and powerful PXA320?

When we reviewed the TDS/Trimble Nomad last year here at RuggedPCReview.com, I marveled over the machine and noted, "The 800 MHz Marvell PXA320 processor certainly had something to do with it. The difference between it and the 624MHz PXA270 is much larger than we expected."

In fact, the chip performed so well in the Nomad that I was certain other manufacturers would quickly follow suit and use the formidable PXA320 chip as well. Interestingly, that didn't happen. If I remember correctly, the only other product I've come across that uses the PXA320 is the Aceeca Meazura MEZ2000, which I think is still in the planning stage. Everyone else still seems to be using the older PXA27x, even in new designs. The PXA27x is certainly a good and time-proven processor, but it is no match for the PXA320 when it comes to performance.

Maybe something is going on that I am not aware of. Maybe Marvell isn't pushing the chip and it's such a secret that no one realizes technology has advanced. Maybe it's too expensive, or has some drawbacks I am not aware of. As is, the Nomad with its powerhouse PXA320 chip appears to continue to enjoy a significant performance edge over anyone else out there.

Posted by conradb212 at 12:36 AM | Comments (0)

September 26, 2008

The digitizer mystery

Imagine if someone had patented hard disks so iron-clad that no one else could make them. Or that an enterprising company had legally locked up LCDs such that it had a monopoly. If that were the case, we might still have giant, sluggish 20 megabyte (not gigabyte!) hard disks and computing as we know it would not be possible. And we'd all get eye strain from using smallish, barely readable antediluvian STN displays. That would be a bad situation. As is, fierce competition propels progress, and as a result we have the most wondrous products brought upon by innovation and improvement.

Except in one area.

Digitizers.

How much progress has there been since I began reviewing pen computers back in 1993? Basically none. And as far as I can tell, that sad situation sits squarely in Wacom's court. Wacom's patented digitizer technologies have resulted in Wacom having almost 96% market share in Japan, and a good 70% in the rest of the world. The Wacom digitizers I used on 1993 pen computers worked, sort of, but were hugely frustrating because it was essentially impossible to calibrate them. The Wacom digitizers I have used in vastly better and more powerful computers in 2008 worked, sort of, but were hugely frustrating because it's essentially impossible to calibrate them. I mean, there are any number of touch screens where you can calibrate 25 points or more, do edge compensation, and all sorts of other cool stuff geared towards enhancing precision and improving the user experience. A Wacom digitizer calibration? Four points, and that's it. Along the edge of the screen, the digitizer is often so badly off that it becomes frustrating to use it.
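
To illustrate what's at stake: four points only pin down an affine (linear) correction, while 25 or more points give you enough data to least-squares-fit a higher-order mapping that can also straighten out the edges. A minimal sketch, with synthetic, made-up warp data purely for illustration:

```python
import numpy as np

def fit_and_eval(design_cols, screen):
    """Least-squares fit screen positions from the given design columns
    and return the mean residual error in pixels."""
    X = np.column_stack(design_cols)
    coeffs, *_ = np.linalg.lstsq(X, screen, rcond=None)
    residual = X @ coeffs - screen
    return np.sqrt((residual ** 2).sum(axis=1)).mean()

rng = np.random.default_rng(1)
screen = rng.uniform(0, 1024, size=(25, 2))        # true pen positions
raw = screen + 0.0002 * (screen - 512) ** 2        # synthetic edge warp
x, y, one = raw[:, 0], raw[:, 1], np.ones(len(raw))

# What a 4-point calibration can model (affine) vs. what 25 points allow.
print(f"affine model:    {fit_and_eval([x, y, one], screen):6.2f} px mean error")
print(f"quadratic model: {fit_and_eval([x*x, y*y, x*y, x, y, one], screen):6.2f} px mean error")
```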

I've complained about this for pretty much as long as I can remember, and there hasn't been any change. Anything else in computing has improved dramatically. What gives? Is Wacom's technology inherently incapable of working better? Is no one else able to come up with a better alternative because of patent blocks? I don't know, but between Microsoft's marginal handling of the Tablet PC and the dismal performance of the Wacom digitizer, pen computing is where it is.

There. End of sermon. I just had to say it.

Posted by conradb212 at 01:53 AM | Comments (0)

September 02, 2008

MIL-STD-810F 509.4 and thoughts on salt water exposure

During a week of scuba diving off Roatan island in Honduras, I had first-hand experience of what salt water exposure can do to equipment. I took several underwater cameras with me for testing and used them on up to four daily dives to 85+ feet with each lasting an hour or more. I thoroughly rinsed off the equipment after each dive, but still found that salt accumulated under rubber coatings, inside screw holes, under screw heads and inside or under anything that allows water to go under or moisture to seep in. After I returned back home I soaked all equipment again in my bathtub and then cleaned each part and component. Without that, adjustment screws, hinges and joints could seize and the equipment would quickly deteriorate due to longer-term corrosion.

I remember when Panasonic showed me the results of their Toughbook corrosion testing on an invitational tour of their facilities in Osaka back in 2002. Without special consideration of salt water and salt fog exposure, there could quickly be appalling damage as shown on the picture to the right. Panasonic explained how they had been approached with requests for such testing, performed the salt water and salt fog tests, and were surprised to see the extent of the damage. They then systematically changed design and materials to ward off or minimize the effects of salt. This benefitted all subsequent Toughbooks, and also showed Panasonic how to develop special solutions for customers who use their products in environments where they are exposed to salt fog and water.

When you look at these pictures it becomes obvious that sealing alone is not enough when it comes to salt water exposure. Sealing standards only tell how well a product keeps dust and water out of the inside of the unit. They don't tell what salt can do to components that lay outside of the sealing barriers. What can salt do when it gets under a keyboard? Inside a hinge? Underneath protective doors? The result can be ugly. Nothing can ever ward off salt entirely when a product is used in marine environments. Users need to keep computers away from excess exposure as much as possible, and equipment needs to be cleaned meticulously after any exposure. That means that cleaning must be possible in the first place, which means that places that are potentially exposed to salt water and fog must be accessible. There are just a whole bunch of additional considerations.

This is why the famous MIL-STD-810F (Department of Defense Test Method Standard for Environmental Engineering Considerations and Laboratory Tests) document includes a 9-page section on Salt Fog testing.

MIL-STD-810F Method 509.4 describes testing methods to determine the effectiveness of protective coatings and finishes on materials for corrosion, electrical effect and physical effects. The tests can also determine the effects of salt deposits on the physical and electrical aspects of materiel. The product is exposed to salt fog mist from a 5% salt solution via atomizers at about 95 degrees Fahrenheit for a minimum of four alternating 24-hour periods, two wet and two dry. The product is then examined for salt deposits that can clog or bind components, electrical malfunction, and potential short and long-term impact of any observed corrosion.

The reason why I am writing this all down is because my return coincided with an announcement from GETAC that its impressive B300 rugged notebook had received Salt Fog certification. Here's part of their press release:

LAKE FOREST, CA. – September 2, 2008 – GETAC Inc., a leading innovator and manufacturer of rugged computers that meet the demands of field-based applications, announced today that its B300 ruggedized notebook PC received full Salt Fog certification based on testing standards set by the Department of Defense (MIL-STD-810F – 509.4). Salt Fog is a specialized test used to evaluate and determine the effectiveness of protective coatings and finishes on materials to repel salt corrosion and may also be applied to determine the effects of salt deposits on the physical and electrical aspects of materials. Adding the Salt Fog certification to an already robust and rugged notebook PC makes the GETAC B300 the ideal choice for military installations, marine applications such as the Coastguard and other industries where salt or salt air can impact equipment performance.

“Salt is one of the most aggressive chemical compounds in the world,” said Jim Rimay, president, GETAC. “Salt will quickly corrode a computer’s exterior, impair vital electrical system functions through salt deposits and have a physical impact by restricting free movement of its mechanical components. The B300 addresses these issues with its Salt Fog certification and elevates it to an elite status among ruggedized computers for safe and uninterrupted operation in any location, especially in coastal regions of the world.”

We recently did a detailed hands-on test of the Getac B300 and found it to be a very impressive machine full of clever engineering and innovation. A combination of optical coatings and superbright backlight make the screen readable in the brightest sunlight, and amazing power conservation methods can extend battery life to a stunning 12 hours. It's good to see that the company also invests in testing against one of the less-often mentioned environmental threats to mobile computers -- salt fog exposure. While most specs include resistance to drops and vibration, salt fog/water exposure can destroy a piece of equipment just as surely. Once the corrosion is detected, it's usually too late, so it's nice to see Getac take proactive steps.

MIL-STD-810F, however, only describes testing methods, and not the criteria that determine passing tests. It would therefore be nice to know what Getac found during its tests, and what the company did to make the B300 as immune to salt fog damage as possible.

Posted by conradb212 at 02:31 PM | Comments (0)

August 06, 2008

The Motion Computing F5

We've had the Motion Computing F5 tablet here in the lab for a while. The F5 is a follow-up to Motion's C5 medical market tablet, which was a rather unique design solution that received a lot of positive feedback. The folks at Motion are generally right on the mark, and have been ever since some former Dell people formed the company back in 2002 or so to take on Fujitsu with a Tablet PC slate. At the time no one gave them much of a chance to prevail in a market that Fujitsu practically owned with their Stylistic pen tablets, but Motion pulled it off. I remember a dinner meeting with Motion founders Scott Eckert and David Altounian in San Francisco where they showed me the prototype of their initial tablet. It wasn't substantially different or better than what Fujitsu had at the time, but it was immediately obvious that the Motion folks truly believed in their product and that they had a very clear focus. That never changed. Whereas tablets are just a small part of Fujitsu's overall business, they are the only thing Motion does. It's been six years now, and Motion never wavered from their mission. And somehow they always managed to stay ahead of the curve, with new technologies generally available in Motion products sooner than anywhere else.

I don't know what the thought process was that led to the design of the original C5 medical tablet, but it was certainly a smart decision to go after the medical market. It's a tough one to break into for a variety of reasons, but also one where mobile systems can make a huge impact. At Kaiser, the HMO I use, they finally have terminals in almost every examination room so they can call up patient info, and they can now also call up x-rays onscreen, but it took them forever, and I still see no portable electronics. I suppose it's the same elsewhere.

The Motion C5 was an attempt to provide a portable computer that could do more and was easier to integrate into the daily workflow of medical people. So they made it small and light and gave it an integrated handle to easily carry it around. They integrated an RFID reader and a bar code reader and also a camera. They also made it white so it fits in with all the other medical equipment, and it's easy to wash and disinfect. Motion also created a small, handy dock for it. So the overall idea was to provide a small computer that was easy to carry around and that included all sorts of data capture methods. It all still depended on systems integrators to package the hardware with medical systems software, and then have hospitals actually pick it up and use it. I am not sure how many did, but the Motion C5 was, and currently still is, probably the best mobile hardware for such projects.

When I first looked at the C5 I wondered why Motion limited the platform to just one market. True, it's a potentially huge market, but the C5 seemed sturdy enough to be used in other mobile applications, and it already carried IP54 sealing, which means it didn't mind a bit of rain and some spills. Motion apparently agreed and created a second version of the C5, the F5. They called this one a "Field Tool" -- not the greatest of names, but obviously an attempt at communicating that this computer should be seen as a tool for jobs rather than a conventional computer.

I must admit, I had a bit of a hard time with the F5. When I wrote about the C5, I had no problem seeing the design decisions that had been made to make this computer just right for the medical market. The size, the shape, the features, the color and so on. The F5 is gray instead of white, but other than that, it's the same computer. It does include Motion's "View Anywhere" display because unlike the C5, the F5 would probably be used outdoors where sunlight viewability counts. Beyond that, though, there doesn't seem to have been much additional thought on how to make a computer best suited for use in the field.

The way I see it, the field IS different from a hospital. You won't always have a dock to charge a computer, and so the fairly small battery of the C5 may not be enough. And in the field it does come in handy to have a USB port or two and perhaps even an old serial port for some arcane instrument or measuring tool you need to hook up. And having some sort of expansion slot also comes in handy. Wireless communication is great and we can't do without, but it's been my experience that even with Bluetooth and WiFi, there are times when it's a lot simpler to just copy files onto a USB key or a SD card than to send them. The F5 can't do that as it doesn't have any ports or slots and totally relies on wireless or the dock.

All of this made it a bit more difficult to review the product. I am used to Motion having a very clear rationale for a machine, and in this case the rationale seemed to be that the healthcare C5 was good enough to be offered for other markets. That was probably a good idea, but something still doesn't feel quite right. Even the "View Anywhere" display that I remember as effective from previous reviews of Motion tablets seemed rather low-contrast compared to other sunlight-viewable technologies on the market.

The F5 is also one of the few machines that uses the Intel Core Solo processor. The Solo is essentially a Core Duo with one core not used, sort of like an 8-cylinder engine with only four cylinders running to conserve fuel. It is an economical chip, with a thermal design power of just 5.5 watts, which is only a bit more than half of what a Core Duo chip running at the same clock speed uses. Problem is that benchmark performance is much lower, too, and generally closer to the lowly Intel A110 than even an ultra-low-power Core Duo. The F5 is no slouch at all, at least with Windows XP, but with Motion always being at the forefront of technology I wonder why they didn't just use an Atom processor instead. They did switch from the Core Solo U1400 to a Core 2 Solo U2200, which is said to include better caching and even more power-saving technologies, so perhaps that was the right move for now.

Anyway, just a few thoughts on what is, in fact, an interesting and welcome addition to the hardware alternatives available to those who need to implement computing solutions in the field. The official review of the Motion Computing F5, with pics and specs and all is here.

Posted by conradb212 at 02:03 PM | Comments (0)

June 23, 2008

Tablet PC: We could use a hammer....

"We could use a hammer..." That's the tag line of MobileDemand's latest video in their Tablet PC Torture Chamber Series where a man uses a Tablet PC to hammer a bunch of large nails into a board. The video is the latest in a series of increasingly sophisticated and outrageous demonstrations of just how tough their Tablet PC is.

Usually, rugged equipment is dropped or exposed to water to show that it can survive the kind of punishment encountered in the field. MobileDemand's earlier videos pretty much followed that tradition. xTablets were dropped, exposed to showers, rolled down a hill and so on. But soon the videos showed drops more extreme than anything that would likely happen in the real world. And instead of being exposed to a showerhead, the computer was strapped to the top of a car and run through a car wash five times, with the computer running and its display on camera during the whole ordeal.

And now the "We could use a hammer..." video. It's very smart. No one would actually use a computer as a hammer (though, come to think of it, I've used a variety of objects as hammers when none was handy), but the image of using that sophisticated piece of electronic equipment as a hammer certainly drives the point home, no pun intended.

Using the xTablet Tablet PC computer as a hammer really serves to illustrate a point: shock and vibration do happen in the field. If you use a machine in a truck or as a data capture device you do not intend to damage it, but sooner or later it will fall. And constant vibration affects the computer. Eventually things can happen. Electrical parts may touch and short-circuit. Fasteners and ties may come loose. Structural pieces may crack. Seals may deform and begin leaking. Electrical contacts may become unreliable. The display panel may get out of alignment. Wiring may chafe. Materials may fatigue and then break. Parts may deform. And so on. At best, sealing may be compromised, electrical noise may be introduced, and individual parts are headed for failure. At worst, the computer fails.

This is why manufacturers usually provide test data, typically showing how a product performed when subjected to the procedures described in MIL-STD-810F. Those procedures try to replicate conditions actually encountered in the field during transportation and operation. That makes sense, but the testing is quite involved and not very easy to interpret. Witness the following caution regarding acceleration testing found in MIL-STD-810F 514.5:

Care must be taken to examine field measured response probability density information for non-Gaussian behavior. In particular, determine the relationship between the measured field response data and the laboratory replicated data relative to three sigma peak height limiting that may be introduced in the laboratory test.

That's a mouthful, and the results are even more difficult to read. A general integrity test may then yield results such as, say, a power spectral density of 0.04G²/Hz, 20 to 1000Hz, descending 6dB/oct to 2000Hz. MobileDemand, like all the other serious rugged equipment vendors and manufacturers, has its gear tested in accordance with the MIL-STD-810F (and other) procedures, but what has more impact, some tech specs comprehensible only to engineers or a video of a man using that rugged Tablet PC as a hammer while it keeps right on working?
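
As an aside, here's a minimal sketch of what such a spec boils down to: the overall vibration level, in Grms, is the square root of the area under the PSD curve, with a -6 dB/octave roll-off corresponding roughly to the PSD falling as 1/f². The numbers come straight from the example spec above:

```python
import math

def psd(f: float) -> float:
    """The example PSD profile quoted above, in G^2/Hz."""
    if 20 <= f <= 1000:
        return 0.04
    if 1000 < f <= 2000:
        return 0.04 * (1000.0 / f) ** 2   # -6 dB/oct roll-off
    return 0.0

# Numerically integrate the PSD from 20 to 2000 Hz.
steps = 100000
df = (2000 - 20) / steps
area = sum(psd(20 + (i + 0.5) * df) * df for i in range(steps))
print(f"overall level: about {math.sqrt(area):.1f} Grms")  # ~7.7 Grms
```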

"We could use a hammer..."

Brilliant.

To see the "We could use a hammer..." video, click this Blip.tv link.

Posted by conradb212 at 03:46 PM | Comments (0)

May 28, 2008

Electrovaya settles patent infringement suit

An interesting situation: An intellectual property company named Typhoon Touch Technologies announced Electrovaya had settled a patent infringement lawsuit by Typhoon and Nova Mobility Systems "for an undisclosed sum representing a royalty payment of at least 20% on past and future sales of its Scribbler Tablet PCs in the United States. Additionally, Electrovaya formally recognized the validity of Typhoon’s patents at issue in the litigation and acknowledged infringement of one or more of the patent claims." (see here)

20% on past and future sales of a tablet? Wow! And recognizing the validity of a patent? That's even more amazing given the vague and confusing nature of many patents. So what is this patent for? That would be US patents 5,379,057, issued January 3, 1995 and 5,675,362, issued October 7, 1997. They both have the same abstract:

"A portable, self-contained general purpose keyboardless computer utilizes a touch screen display for data entry purposes. An application generator allows the user to develop data entry applications by combining the features of sequential libraries, consequential libraries, help libraries, syntax libraries, and pictogram libraries into an integrated data entry application. A run-time executor allows the processor to execute the data entry application."

The drawings accompanying both patents show a tablet computer like the ones Momenta, IBM, NCR, GRiD, Samsung, Fujitsu, Dauphin, TelePad, Toshiba and many others offered for sale in the early 1990s. The picture on the right shows the drawing included in the 1995 patent and a couple of computers that precede it. The two computers I added for comparison's sake are a 1993 IBM ThinkPad 700/710 and a 1992 Dauphin DTR1. On the surface it's hard to see how a 1995 patent for a "self-contained general purpose keyboardless computer" could impact a 2008 Electrovaya slate when numerous companies made such computers already in the early 1990s. Then again, patents are finicky things and their interpretation is up to courts.

Anyway, the patents in question were issued to Microslate, a company that was certainly a pen computing pioneer with its ultra-rugged Datellite touch screen computers (see one of our early reviews of it in Pen Computing here).

Interestingly, Typhoon also sued Dell, Xplore, Sand Dune (the Tablet Kiosk folks) and Motion for infringement on touch screen technology and seeks damages for lost profits. Motion reached some sort of settlement. Typhoon apparently thinks that the patent in their possession covers just about the entire mobile market: "manufacturing, selling, offering for sale, and/or importing a variety of portable computer products, including but not limited to tablet PCs, slate PCs, handheld PCs, personal digital assistants (PDAs), ultra mobile PCs (UMPCs), smart phones, and/or other products covered by the patents-in-suit."

The suit has a co-plaintiff in Nova Mobility Systems, located in Tempe, Arizona. Nova, interestingly, offers the SideARM handheld. The SideARM was originally conceived by long defunct Melard and then became part of Microslate's lineup, the very company that was assigned those two patents. Typhoon's Form 10QSB shows that they bought the patents from Nova Mobility and agreed to pay them a 10% royalty from enforcements. So Microslate, an early player in the rugged slate market, sat on the patents all this time, then sold them, and now they are supposed to cover virtually every mobile device ever made even though such devices existed long before the patents? Elegant.

We're all in favor of respecting intellectual property, but figuring out what exactly that means isn't always easy. When I was a kid many decades ago I envisioned a little black box that told me everything I wanted to know by simply asking a question and let me communicate with anyone who had one. I doodled drawings of it. Does that mean I own the exclusive rights to cellphones, smartphones, Google and the entire web? Sadly not. But it would really be nice to at least have 20% of all those sales.

Posted by conradb212 at 04:36 PM | Comments (0)

May 26, 2008

XP Embedded: When benchmarks lie

Providing rugged mobile computers is a constant exercise in trade-offs and balancing. Screens get bigger and brighter, processors get faster, disks get larger, and customers want all that, without paying for it in the form of larger batteries and more weight. The problem, really, is that battery technology has not kept pace with the rest of the circuitry inside a computer, and so batteries struggle to provide enough juice to keep everything running for long. When you think about it, it's pretty bizarre that the very machines that are supposed to go as fast as possible often annoy their users by constantly trying to go to sleep, stand by, hibernate or shut off. Or that they come factory-configured to run at half speed and with the backlight dimmed.

The increasing power demand of the latest electronics (and in the processor department, their cost) has driven many manufacturers to look for alternate solutions. One is to pick a much simpler processor that consumes a lot less power. That approach, however, has its own problems. Two primary ones, in fact. The first is that customers think a machine with a "slow" processor cannot possibly be very powerful. And second that, in fact, it isn't. Fortunately there's a solution, albeit one that is only suitable for certain tasks and applications.

An embedded operating system.

See, a general purpose OS, like Windows XP Professional, is just that, general purpose. You can do anything you want with it, and run anything you want on it. With that in mind, Microsoft equipped Windows XP with all the drivers and software and utilities one could possibly need. The result is a rather large operating system with numerous processes and services running all the time, all consuming memory and power, and having the potential to slow even a powerful machine to a crawl.

An embedded operating system is totally different. The idea is to only use what you need to perform a certain task and leave everything else behind. This greatly reduces the size of the operating system and dramatically reduces hardware requirements. XP Embedded is generally used for smart, connected and service oriented commercial and consumer devices that do not need all of Windows XP, yet can still run thousands of existing Windows applications. An embedded OS can easily be as small as 40MB and it's even possible to cut it all down to around 8MB with a bootable kernel.

XP Embedded is not one-size-fits-all. A company will determine exactly what a machine is for and what it should be able to do. They then include as many components (hence the term "componentized" operating system) as they need. There are over 10,000 components available and it's easy to create lean, nimble embedded OS platforms that can still do sophisticated high level tasks like advanced multimedia, browsing, communications or whatever a task requires. An embedded OS can even run as a real-time OS via third party plug-ins. Essentially you get the power of the basic Windows XP engine, but without any overhead you don't need.
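
The mechanics are easy to picture: pick the components a task needs, let dependencies pull in the rest, and total the footprint. A purely hypothetical sketch -- these component names and sizes are invented for illustration, not actual XP Embedded database entries:

```python
# name: (approximate size in MB, dependencies) -- all values hypothetical
COMPONENTS = {
    "kernel":       (8,  []),
    "filesystem":   (6,  ["kernel"]),
    "networking":   (10, ["kernel"]),
    "browser":      (12, ["networking", "filesystem"]),
    "media_player": (9,  ["filesystem"]),
}

def build_image(features):
    """Resolve dependencies and total the footprint of the resulting OS."""
    selected = set()
    def add(name):
        if name not in selected:
            selected.add(name)
            for dep in COMPONENTS[name][1]:
                add(dep)
    for feature in features:
        add(feature)
    return selected, sum(COMPONENTS[n][0] for n in selected)

image, size_mb = build_image(["browser"])
print(sorted(image), f"-> {size_mb} MB")   # a lean, single-purpose image
```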

Which means that in an embedded systems machine, benchmarks do not necessarily tell the true story. They simply measure raw power, but not how efficiently that power is put to use. What all this boils down to is that a mobile computer with an embedded OS can be much faster than you'd think it is based on its hardware specs. In fact, we reviewed some that were so quick that almost no one would believe they ran on a low-power, inexpensive processor and just a minimum of RAM. So benchmarks would tell one story, real world performance another.

This is not to say that an embedded OS is the perfect solution for all mobile computing tasks. But it can be for organizations that build their own customized, componentized OS. And for those who have very clearly defined applications that work within the confines of an embedded OS.

Posted by conradb212 at 03:50 PM | Comments (0)

May 21, 2008

What happened to Symbol!?

Symbol Technologies was always one of my favorite companies. I visited their headquarters in Holtsville, Long Island several times over the years and always came away impressed with their sleek designs and willingness to try out new ideas. That feistiness carried over into some aggressive acquisitions (like the bitter fight with Telxon) and, after some financial incongruities, the sale of Symbol itself. Now Symbol is part of Motorola, but it isn't very clear what kind of part.

A good year or so after the acquisition Symbol seems to have been halfway absorbed into Motorola, but if you go by the Motorola website it's almost impossible to figure out how. Symbol is only listed as carrying bar code scanners, mobility software, and OEM scan engines, but no longer any handheld computers. The former Symbol handhelds have become sort of stateless, popping up under "Mobile Computers" without any brand name at all. So the former Symbol MC50, for example, is now just an "MC50," presumably somehow by Motorola.

It's actually quite sad to see all that. Symbol's once proud state-of-the-art handhelds now languish, carrying on in some way with dated processors and even more dated software. Some have unceremoniously been discontinued whereas others seem destined to just die from neglect. The MC35, MC50, and MC70 had a very promising career ahead of them when they were introduced, but now they are aging rapidly. The emphasis appears to be on the big and fairly conventional MC9000 Series of handhelds. They come in a variety of permutations with various size keypads, and they remain reasonably up-to-date with Windows Mobile 5.0 and Marvell (why does almost everyone still call them Intel when Intel sold the business a long time ago?) PXA270 processors.

There may well be method to this madness, and the decision to focus Symbol entirely on scanners may be a good one. Obvious it's not. And it's truly sad to see Symbol's proud legacy of handheld computers rapidly go to seed. I mean, make them part of.... SOMETHING!

Another sad thing is Motorola's website itself. It must rank right up there with the most confusing, least user-friendly ones I've seen. It's not surprising the company is in such trouble. The impression you get along every step of the way is, "We don't know who or what we are, or what we want to be!"

Frankly, as is, I think Symbol, and its customers, would have been a whole lot better off with Symbol intact and independent. Spin them off so they can get back to business, Motorola.

Posted by conradb212 at 01:21 AM | Comments (0)

May 06, 2008

A video says more than a thousand pictures

While it's still not entirely clear how the YouTube phenomenon is changing our view of the world, changed it has. Initially we thought YouTube and its many competitors were simply repositories for stuff people recorded off TV, but that has changed. These days, if anything happens anywhere, whether it's important or not, it'll be on YouTube in a moment.

However, the YouTube phenomenon has also led to entirely more serious changes in how things are being portrayed to the world. Specifically, video is being used to show what products can do. But that's not new, you might say. No, the idea of using video to highlight a product is not new, but the way video is being used now is. In the olden days, videos were mostly polished commercials, the kind we watch on TV (unless we have TiVo). YouTube gave video sort of an underground flavor. It's not glitzy footage created by Madison Avenue types, but clips done by us, the people.

Last fall, for example, we thought it might be fun to do an underwater video of one of the products we reviewed. It was by no means professional quality; we just used a little Casio digital camera with a YouTube mode. Then we set up a tripod in a pool, I donned my scuba gear and, bingo, video of a handheld computer being used underwater. This went up on YouTube with a rather innocuous title, "Trimble Nomad computer goes diving." Amazingly, even with this non-provocative title and very utilitarian keywords (trimble, tds, rugged, scuba, waterproof), the video has been viewed over 4,000 times in the few months since. Another one we did a bit later, of the Juniper Systems Archer Field PC, has also been viewed almost 2,500 times. Hmmm....

Turns out, an increasing number of entrepreneurial companies are taking advantage of the YouTube phenomenon by rolling their own underground videos. One of our sponsors, MobileDemand, has been playing a leading role by creating a number of videos that demonstrate the toughness and ruggedness of their xTablet slate computer. The result is a series of increasingly better and more outrageous videos that are both funny and compelling. While I never warmed up to Panasonic's omnipresent "Legally we can't say..." commercials/videos/billboards/print ads, MobileDemand makes their point much more convincingly (and at infinitely lower cost). And while the origins of the idea are clearly based on the YouTube syndrome, MobileDemand is running its videos on Blip.tv which has much better video quality.

If you haven't seen one of the MobileDemand videos you can do so right here by running the clips embedded in this paragraph. You see their flagship product being tossed around, thrown off a hill, and strapped to the top of a car and taken through a car wash. In a loose adaptation of the MIL-STD-810F "drop test" (officially called MIL-STD-810F Method 516.5, Procedure IV -- Transit Drop), you see the xTablet being dropped, rapid-fire, 26 times. To drive the point home they use the computer to pound a nail into a wooden board. All the while, video is running on the computer's screen so you can see that it still works and never skips a beat. That's pretty clever. Oh, and knowing that outdoor footage of a screen that is not outdoor-viewable isn't exactly compelling, the MobileDemand folks make sure it's abundantly clear that theirs IS outdoor-viewable. It's all done in a fun, "YouTube" way. To demonstrate that their tablet's display, usually the most vulnerable part of a rugged computer, can take a direct hit, they drop a full beer can onto it. And then, to make sure folks realize that a beer can dropped from a few feet packs a punch, they drop one onto a guy's midsection. Ouch!

A video can clearly say more than a thousand pictures. That's because we've all become jaded with mere images. We all know how easily they can be edited, modified and faked. Video, that's another story. It's hard to fake a video of a guy hammering a big nail with his computer. Which means, for now, demonstrating products on funky videos is a great idea. It certainly doesn't replace images or the printed word as video is a serial medium that you pretty much have to watch from start to end as opposed to glossing over "random access" print.

Posted by conradb212 at 07:59 PM | Comments (0)

March 18, 2008

Shrinking military spending an opportunity for mobile vendors?

What I am about to write is based on assumptions and conjecture. It has to do with military procurement. And more specifically, military procurement of rugged mobile technology.

We've all heard about the proverbial $600 toilet seats and other supposed gross waste of resources. We also somehow assume that the military has ultra-advanced equipment and secret weapons that are more sophisticated than anything we can think of. In the same respect, having served in the military, I know that the armed services often use equipment that, by civilian and commercial standards, is completely and utterly obsolete. So what is true? That the military has incredible gee-whiz weaponry and gadgets, or is it all tried-and-true (and rather old) stuff?

Most likely some of both. When you peruse the product lineups of some of the defense contractors you see some shockingly obsolete stuff in there. Machinery powered by ancient Pentium chips, murky LCDs, a complete lack of modern interfaces and so on. Heck, our fighter planes are positively ancient if you applied the standards of, say, the automotive industry. Sure, they are said to be equipped with the latest computer gadgetry, but still, how up-to-date can decades-old designs be?

Anyway, I really want to talk about how all of this relates to the cost of rugged mobile equipment. In a recent summary report, Venture Development Corporation (VDC) reported that military spending on expensive rugged mobile technology may dry up in coming years. They also stated that this will leave an interesting opening for a new class of "good-enough" hardware that can fill most requirements, or all, at a considerably lower price. What this means is that the military may stop paying premium prices for traditional military market equipment from traditional military market vendors. So instead of simply ordering a successor model from an established (and presumably expensive) vendor, they may look around for less costly alternatives.

This indeed may present an interesting opening for some companies that have not traditionally dealt with the military market. It also means that such companies will have to take a crash course in how to deal with the military, learn more about requirements and certifications, and about service and sales cycles. Truth be told, we've seen a good number of "civilian" rugged handhelds that we believe could serve the military quite well whereas some of the traditional gear makes us wonder about its usefulness.

So are some vendors just a small learning curve and a few modifications away from being serious contenders for armed forces contracts? Or is dealing with the government simply too cumbersome to even attempt for anyone other than the handful of defense contractors?

Costs, of course, are relative. Given that a very simple ankle fracture without any complications or anything cost a friend of mine the appalling amount of $28,000 five years ago, I can only imagine what the military's health care cost must be. Perhaps, compared to that, it simply doesn't matter whether a handheld costs $1,500 or $5,000.

Posted by conradb212 at 07:40 PM | Comments (0)

March 10, 2008

Keeping track of who makes (and sells) what

Keeping RuggedPCReview.com updated is no easy task. In the olden days, when we started Pen Computing Magazine back in 1993, there was only a small handful of companies that offered ruggedized equipment. These days, even giant companies like Dell are realizing that adding durable and ruggedized equipment makes a lot of sense. I mean, in a mobile world not everyone is well-served with a flimsy, plasticky notebook that can't handle the potential abuse of a day on the job.

Anyway, keeping track of things... Not only is it quite a job to stay on top of every tech upgrade (and with Intel adding and changing processors every few weeks those come hot and heavy), it's often even more difficult figuring out who makes what and where it's being sold. For many years now, most notebooks sold in the world have been made by a fairly small number of Taiwanese and, increasingly, Chinese OEMs. For a while we licensed Pen Computing Magazine to a publishing company in Taiwan, and I had a chance to go to Taipei to see them and also make a presentation on Tablet PCs at the Taipei International Convention Center. My hosts arranged for interviews with most of the major OEMs, such as Compal, Quanta, Mitac, FIC, Tatung and so on. That was very informative, but it's difficult to keep track of the ever-changing alliances between OEMs, ODMs, resellers, partners and customers.

So what does that mean for all the hundreds of rugged products listed and described at RuggedPCReview.com? Most are manufactured, though not necessarily designed, by an OEM in Taiwan. Many are joint productions where a computer company designs a product and then has it built by an OEM. Or the various aspects of design are divided in some way. Or a product is available from several vendors, but is customized for particular markets for different vendors. Sometimes there are exclusives. Other times the same machine is sold under different labels. There are also cases where an OEM sells a product under its own name while that same product is also sold by other companies under different labels. In short, the supply chain can take almost any shape.

As for us here at RuggedPCReview.com, we always try to know who exactly makes a product. That's primarily so that we can state facts. If a product is really good, we'd like to know who deserves the praise. It makes no sense to heap praise on an OEM when the design actually comes from elsewhere. Or, the other way around, celebrate the genius of a reseller when they really did not design the product at all.

But that's not all of it. Another problem for us is that larger resellers do not necessarily offer the same machines in all markets. This morning, for example, I updated some product listings and realized that some of the old Dolch products were still listed under Kontron, the German company that had taken over Dolch in February of 2005. We had often marveled at Dolch's various rugged platforms at industry tradeshows and were a bit saddened to see them get absorbed. After all, Dolch had been building rugged machines since 1987. So we relisted whatever Kontron took over as Kontron machines and added new contact information. Kontron had also created a new website, kontronmobile.com.

At the time, Kontron's CEO was quoted as saying, "This investment presents an excellent opportunity for Kontron to further expand its embedded computer solutions in the USA and Europe on mobile platforms for government and defense programs." Well, apparently it was not such a great opportunity after all as Kontron's US website now states, "Thank you for your interest in mobile rugged computing. This line of products was recently acquired by Azonix, a division of Crane Company." Crane so happens to be a multinational with over 10,000 employees. Azonix Corporation is located in Billerica, Massachusetts, and was set up in 1981 as a design and manufacturing firm specializing in rugged, high-precision measurement and control products. Some of the former Dolch/Kontron products are now part of the Azonix Military Grade Solutions product lineup, in competition with the likes of DRS Tactical and General Dynamics.

The Dolch/Kontron/Azonix NotePAC, however, looked familiar to me, and it turns out to be a GETAC machine, the A790. On a hunch I went to the German Kontron website, and it turns out that Kontron continues to sell rugged notebooks in that, and other, markets, just not in the US. In fact, the German Kontron lineup does not hide its GETAC origins. They have a whole line of Kontron NotePACs, all carrying the same model numbers as the corresponding GETAC machines.

Nothing wrong with all that, of course. It's just another example of how everything is going global. But after all is said and done, customers need to know who they can call if they need service and support. And then it is good to know they're dealing with a reliable, competent company that doesn't just slap a badge on a machine and push it out the door. In the end, it is that support and that local connection that matter and factor big into that holy grail of vertical market mobile computing, the Total Cost of Ownership.

Posted by conradb212 at 06:48 PM | Comments (0)

March 03, 2008

Where will Intel's Atom chip fit in?

On March 3rd, 2008, Intel introduced the low-power Atom processor designed specifically for mobile internet devices. While desktop chips draw as much as 35 watts of thermal design power (TDP) and even ultra-low power Core Duos draw almost 10 watts, the Atoms will draw from 0.6 to 2.5 watts. Intel stresses that the chip is not a shrunken version of a desktop chip, but designed from the ground up. In a series of YouTube-style videos, various Intel spokespeople describe Atom's use. It goes into really inexpensive ($250-400) notebooks. It is "Intel's architecture for mobile devices." It is for "devices that fit in pockets." And it is "the basis of new sexy: low power and small." And no fan is needed. Does this mean the Atom processors are meant to replace the ARM-based PXA processors that Intel jettisoned to Marvell?

It's really confusing with processors these days. Back in the early days of mobile computing everyone knew what to expect from an 8088 processor (including price, which was about $5), and then, say, a 386/16 or a 486/33. People even had a "feel" for how fast a Pentium 90 was going to drive an early Windows computer. Later, Intel's product lines mushroomed, but it was still kind of possible to guess how each would perform because in the public's mind, the clock speed of a computer chip determined how fast it was. Then Intel did away with that also, sort of, and now we have slower processors that are faster and faster ones that are slower. Processors are no longer sold on their specifications, but on what wonderful things Intel says they will do for us.

For those of us in the mobile field, one problem with Intel has always been that the company really had no mobile chips. Whatever found its way into notebooks was generally a crippled desktop processor. Sometimes crippled in terms of technology (like when one of two cores was simply disconnected as in the unloved Core Solo) and sometimes by running the poor thing with so little juice that it barely moved.

But, you may say, Intel also had the PXA processors, specifically developed for handheld devices. Yes, they did, and it is not entirely clear why. Think back to the beginnings of Windows CE in the mid 1990s (it was introduced at Comdex 1996, to be exact). Windows CE began as a multi-processor architecture platform. Unlike desktop Windows PCs that almost exclusively relied on Intel, CE devices had a choice of several chip architectures. There was support for Hitachi's SuperH architecture and two variants of Silicon Graphics' MIPS engine, and then Microsoft announced support of the 486 and Pentium, the PowerPC 821, and the ARM architecture. I don't think the first three ever became real, but ARM support sure did. Anyway, the competition among chip manufacturers was heavy and resulted in sort of an "arms race" to deliver faster and more integrated chipsets. There quickly were faster versions of the Hitachi SH-3, Philips introduced the TwoChipPic set, and NEC the 4100 family. Toshiba announced its entry with the MIPS-based TX39 family of RISC processors (perhaps one of the quickest CE chips ever), and Digital Equipment Corporation the StrongARM 1100. And there was AMD with its 486-compatible Elan variants. Now that is competition.

Sadly, all that changed with Pocket PC 2002 when Microsoft dropped support of the MIPS, SH, and X86 architectures and mandated the use of an ARM core, which at the time meant the SA1110 "StrongARM" and the ARM72xT and ARM92xT. That swiftly eliminated a whole bunch of CE device manufacturers from the market, and some never came back. At least, we thought at the time, ARM processors were made by Intel, Motorola, Texas Instruments, and ARM itself, but even then we assumed that there would be an emphasis on the Intel StrongARM and Intel's XScale architecture.

XScale, of course, prevailed and was soon found in virtually all Windows CE devices. Now let's remember that StrongARM really wasn't an Intel invention at all. It originated with none other than the once mighty Digital Equipment Corporation, the supermini powerhouse that once seemed destined to replace IBM, but then meekly imploded and sold itself to Compaq, which meekly imploded and sold itself to HP. Somewhere along the process Intel picked up StrongARM and quickly morphed it into XScale. I remember several somewhat awkward conference calls where Intel reps tried to explain how XScale was different from StrongARM. In the end it really didn't matter as the Intel PXA chips became fairly competent workhorses for millions of Windows CE-powered devices.

However, XScale had fatal flaws. First, it couldn't run "real" Windows. Second, it wasn't a very lucrative business. And third, it was not invented here. So off it went, to Marvell. Marvell Technology Group -- a silicon solutions high tech firm based in Santa Clara, California -- officially took over Intel's communications and applications processors in November of 2006 and has since launched the PXA 3xx series, consisting of the high-end PXA320 running at 806MHz, the cost-optimized low-end PXA 300, and the PXA310. The 806MHz PXA320 is a scorcher as we found out in a review of the Trimble/TDS Nomad rugged handheld. Unfortunately, Marvell's marketing is so low-key that hardly anyone knows they exist. Check the tech specs of just about any Windows CE device and it still says "Intel PXA." And despite the remarkable power of the PXA320 chip, few have picked it up. Shame, that.

So now we have the Intel Atom chip. Designed from the ground up for mobile devices. Designed for cheap computers costing just 250-400 bucks. Not a shrunken desktop chip, but still one with 47 million transistors. One that goes into devices that fit into pockets but also on desktops, and those inexpensive notebooks. And then there's the new sexy, "low power and small." Why "Atom"? Because "it's the smallest element of computing."

Along with the Atom chip also comes Centrino Atom. With "Centrino" being a rather successful Intel strategy of bundling various Intel components and making the package look superior to just an Intel processor combined with third-party components, Centrino Atom is no surprise. Centrino Atom will include an Atom chip and companion chips for graphics and wireless for "the best mobile computing and Internet experience on these new devices."

The thermal design power (TDP) specs are certainly impressive. Just 0.6 to 2.5 watts, as opposed to almost ten for an ultra-low power Core Duo processor. And the 45nm process is unimaginably microscopic (the PXA processors use 90nm) and certainly a testament to Intel's expertise. Thermal design power, of course, is a somewhat odd measurement. It just describes, according to the Wikipedia entry, the "maximum amount of power the cooling system in a computer is required to dissipate."
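
To put those numbers in perspective, a bit of back-of-the-envelope battery math helps. Here is a minimal Python sketch, assuming a hypothetical 25 watt-hour battery and treating TDP as a crude stand-in for average draw (strictly speaking it isn't -- TDP bounds heat dissipation, not consumption):

    # Rough battery life at various power draws. The 25 Wh pack is a
    # hypothetical figure for illustration, not any specific device's spec.
    battery_wh = 25.0
    for label, watts in [("Atom, low end", 0.6), ("Atom, high end", 2.5),
                         ("ULV Core Duo", 10.0)]:
        print(f"{label}: {battery_wh / watts:.1f} hours")
    # -> 41.7 hours, 10.0 hours, 2.5 hours

At 0.6 watts the same pack lasts days; at ten watts it's drained by lunch. That, in a nutshell, is why these TDP figures matter for devices that fit in pockets.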

To me, the question is where the chip will really fit in. One of the Intel clips has the spokesperson showing an OQO type of little computer with a slide-out keyboard. Quite obviously, the overall goal is to provide the kind and quality of internet access we've all become used to, and even more so since Apple showed that "real" browsing is possible even on something as small as the iPhone.

So what does Atom mean for the manufacturers of all those PXA-powered devices? With Marvell taking such a low-key approach, are they hustling to see if Atom perhaps is a better alternative? I am certain Intel hopes so. What are the respective power requirements? I don't think I've ever seen a TDP spec for the PXA chips. Whatever specs there are for the PXA320 would indicate substantial capabilities and power, but so far we haven't seen any device that takes advantage of all of its remarkable range of multimedia features (see Marvell PXA320 features).

There are, of course, other considerations. For example, we're seeing new products with Intel's A100/A110 chips that are part of Intel's UMPC 2007 platform. Those chips, essentially lower-power Pentium M cores, also use 90nm technology, run at 600 and 800MHz, and have 3-watt TDPs. Will these be totally replaced by the Atom chips that appear to span a range from 500MHz to 1.8GHz at lower or equal TDPs?

Time will tell.

Posted by conradb212 at 05:53 PM | Comments (0)

February 20, 2008

What do we make of Geode, VIA and Intel A100 powered devices?

As of late, I've seen an increasing number of small tablet-style devices that run Windows but do not use one of Intel's heralded Core processors, or even one of their lower-powered predecessor chips. That inevitably brings up the central conundrum the industry has been dealing with for the past 15 years or so. After dabbling with Windows CE in its various versions, Microsoft has pretty much decided that "real" Windows is the way to go. Any device that is not solely dedicated to performing a single task, or running a single custom app, will likely do other things or have to communicate with other computers. And that is when the problems start. Anything that doesn't run "real" Windows will inevitably have browser problems, missing drivers and plug-ins, and so on. Might as well give up and build a small device with real Windows. That can be done, but real Windows was designed for desktops and powerful laptops. It wants plenty of processing power and a big screen lest it all become an exercise in frustration.

So here we are, with Vista taxing even the most powerful machines and even XP desktops struggling to keep up with the myriad of functions and giant applications and add-ons and start-up programs and other gunk. Heck, my own personal 2GB Gateway notebook takes so long to boot Vista or bring up programs that I usually have meandered off to some other task by the time it's done. And yet, I see Intel plugging its Ultra Mobile Platform 2007 with its A100 and A110 processors running at 600 and 800MHz, and AMD its Geode LX800 and LX900 at 500 and 600MHz. VIA's ultra low voltage C7-M runs at 1-1.5GHz and is probably in a somewhat different class, but in all instances we're far from Intel Core Duo and Core 2 Duo specs.

The question simply becomes this: Can a tablet powered by one of these chips really run Windows XP without its owner quickly giving up on it because it is too slow?

Unfortunately, there isn't an easy answer. See, it's really all a matter of software. Let's not forget that a couple of decades ago perfectly functional computers booted faster and ran their spreadsheets, word processors and databases more quickly than what we have today, all on a few megabytes of memory and 16MHz processors. We have vastly more functionality today, but it's all become so complex that it often barely moves, and that is WITH powerful processors.

So why not simply scale back the software? That's a good idea but far from simple. If only we could just load Windows 98 onto a new machine and make it do whatever we need. It'd probably fly even on a -- by today's standards -- vastly underpowered machine. Sadly, it'd also be almost useless because it couldn't connect to anything and would be incompatible with almost everything.

So the answer is to use today's software that speaks today's protocols and runs today's drivers, but remove as much overhead as possible. That can be done in several ways. You can, for example, load a standard operating system but do away with all the clutter and shovelware today's computers come with. You also remove all unnecessary startup programs, all unneeded background processes and so on. That still results in a big system, but it's surprising how much speed can be recovered by putting Windows on a diet.
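
As an illustration of that first approach -- trimming rather than replacing -- here is a minimal, Windows-only Python sketch that simply lists the entries in one common source of startup bloat, the "Run" registry key. Auditing before deleting is the safe first step; the key path and the standard winreg module are real, everything else is just a sketch:

    # List (not delete!) the programs Windows launches at startup from the
    # machine-wide Run key. Python's standard winreg module is Windows-only.
    import winreg

    RUN_KEY = r"Software\Microsoft\Windows\CurrentVersion\Run"

    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, RUN_KEY) as key:
        index = 0
        while True:
            try:
                name, command, _type = winreg.EnumValue(key, index)
            except OSError:  # raised when there are no more values
                break
            print(f"{name}: {command}")
            index += 1

A per-user equivalent lives under HKEY_CURRENT_USER at the same path, and services and scheduled tasks deserve the same scrutiny.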

Another approach is using Windows XP Embedded. What does "embedded" mean? Basically that you only pick those parts of a componentized operating system that you absolutely need for a task. Standard Windows XP or Vista loads a computer with everything under the sun, whether you ever need it or not. An embedded version of Windows XP has ONLY what a device needs to do its job. That means it will be limited, but it will also be faster and use fewer resources. XP Embedded is especially well suited to running from a relatively small flash disk.

Yet another approach is to use one of the various Linux variants. Standard Linux distributions also have grown over the years and they now need much more space and have far larger resource requirements than they used to, but they are generally still smaller and faster than Windows. And since Linux is free and all its major applications are free, there can be substantial cost savings. Not everything is free, of course; companies who create custom applications to run on Linux systems can and will charge for licenses and upgrades.

All this gets me back to the original question: can a small slate computer with a minimal processor and minimal resources really run Windows at an acceptable pace? Does it all make sense? Some rather prestigious manufacturers seem to think so. Getac announced its lightweight rugged E100 tablet that uses an Intel A110 chip. Roper Mobile Technology announced the Geode-powered Duros Tablet PC. HTC's intriguing "Shift" can run both Windows and a clipped version of Windows Mobile, with Windows running on an Intel A110. And there is a whole slew of other small devices that roughly follow what once was the Microsoft "Origami" ultra-mobile PC spec. All do Windows, and all use one of those ultra-economical processors (I hate the term "low-power" as it implies low performance rather than high energy efficiency) that are supposed to provide an adequate user experience while still delivering halfway decent battery life.

What I'd really like to do, and I hope we get a chance here at RuggedPCReview.com, is to compare the Windows XP, XP Embedded and Linux versions of some of those machines side-by-side. I somehow cannot imagine that anything running XP on a 600MHz processor will be blindingly quick when even my 3GHz desktop is a slug, but it's entirely possible that a lean and specially configured rugged tablet with one of those high-efficiency (see, I didn't say "low power") processors is just what the doctor ordered.

Posted by conradb212 at 08:05 PM | Comments (0)

January 24, 2008

Panasonic -- Still top of the heap?

We just finished taking another detailed look at an old acquaintance, a Toughbook from Panasonic. Now called the CF-30, it's a descendant of the original Toughbook that goes back many years and essentially created a whole new market. The way that came about was that a number of Japanese companies that had once dominated the US laptop market found it increasingly difficult to be profitable. At some point the US launched protectionist measures against TFT LCD panels, making them more expensive. And the Taiwanese were beginning to move in.

Panasonic's approach was to seek new ways and they decided to gamble on a niche they had discovered. As notebooks were increasingly used in the field, customers became unhappy with standard laptops breaking all the time. It really wasn't the laptops' fault. They were built to be used at home and in an office, and then being shuttled back and forth. But with companies now deploying them for all sorts of field applications, they just couldn't handle it. So Panasonic conceived the idea of notebooks that were as elegant and powerful as standard laptops, but a lot tougher. And they came up with the "Toughbook" moniker, which was brilliant.

For many years, Panasonic owned the market. It wasn't that they were so much better than the rest, but their products sure looked better, and they had giant Matsushita behind them, so there were plenty of resources and off-the-shelf components right inside the company. And they knew the importance of industrial design. Compared to the utilitarian-looking competition at the time, Panasonic's ruggedly handsome Toughbooks were simply in a league of their own.

Panasonic also did a terrific job working with the press. In the heydays of vertical market print publications, when we did Pen Computing Magazine, Panasonic's PR folks always made sure we were informed of every new product. They made review units available and just generally helped us in every way to get information and hands-on time with the units so that we could keep our readers informed. So we reviewed many Toughbooks, liked most and criticized some. Panasonic was always appreciative of feedback and apparently passed constructive criticism on to their engineers as the machines steadily improved.

But time does not stand still, and the only constant is change. The rest of the industry began catching up and Panasonic, as the market leader, had a bullseye on their back. They were everyone's target. All of a sudden, superb industrial design was no longer exclusively found at Panasonic. One look at currently available rugged and semi-rugged notebooks shows that it's a real race now, and one where Panasonic no longer automatically has an edge.

There are other issues. Relationships matter, and after many years of superb access to Panasonic through a couple of long-term PR people, things changed and it became next to impossible to get anything from Panasonic. Seemingly every contact with them was from a different PR person. So when we emailed one of them, s/he was already no longer with the company, or the PR firm had changed. Not good. Whoever we deal with does their best, of course, and sometimes things just cannot be helped.

Anyway, we finally did get another longer term hands-on with a Toughbook. As described in detail in our review on the site, the Toughbook CF-30 is almost unchanged. Which is really a good thing. After all those years, that particular platform -- the traditional full-size rugged notebook -- is as mature and perfected as it gets. And having talked to Matsushita's engineers and designers in Japan, and having seen the production facilities in Osaka and Kobe, I am not surprised at the extremely high level of execution, fit and finish. It's probably nearly impossible to match Panasonic's sheer perfection when it comes to working wizardry with magnesium or applying the most eye-catching finish to it.

And Panasonic certainly keeps the machine technologically up-to-date. The one we reviewed had an Intel Core Duo processor, but by the time the review was over, in January 2008, Pana had already revved the machine again and it now has a Core 2 Duo and a few other enhancements, albeit not enough to change the name from CF-30 to CF-31 just yet.

Outdoor viewability is becoming ever more important, and there has been a lot of progress in that field. Our technology editor, Geoff Walker, is an expert on the subject, and thanks to him we have a pretty good idea of the state-of-the-art. From what I can tell, and from what I have seen with my own eyes, Panasonic is not completely at the forefront with their outdoor displays, but they are close. No display is anywhere near perfect yet, but the progress that's been made is amazing, and current technology can only do so much against the sun.

But is the CF-30 still on top? That's hard to say. In terms of look and finish, it remains unsurpassed, but it is an aging platform. The touchpad was just plain unresponsive and certainly didn't make the machine easy to use. In the olden days, a quick call to our sources at Panasonic might have yielded an explanation as to why a particular type of touchpad was used, but these days the path of communication is longer. Fortunately, today's company websites contain so much information that grabbing a missing spec is usually just a lookup away, but, alas, as pretty and professional as Panasonic's Toughbook website looks, it is a total bear to navigate and find anything. If it takes me several screens to actually find a product, something's wrong. And the confusing, inconsistent way Panasonic literature and online resources handle ruggedness specs is not doing them any favors. And Panasonic's "Legally we can't say...." campaign we're assaulted with in every airport and business magazine, well, the less said the better.

But what about other Panasonic products? Well, most are still there and more or less the same. I saw the prototype of the very compact CF-18 notebook convertible at Panasonic in Japan back in 2002, and we later reviewed the final product. It's almost six years later now, and the CF-18 has become the CF-19. Is it still the best? Maybe, maybe not. GETAC's V100 competes with it now, and when we reviewed that rather excellent machine we wondered whether Panasonic had kept up.

Don't get me wrong. The Panasonic CF-30 is an awesome machine. But the world has changed, and it's not clear to me if Panasonic has made all the right moves.


Posted by conradb212 at 05:05 PM | Comments (0)

November 22, 2007

Thoughts about rugged handhelds -- the Juniper Archer

For the past few weeks we've had an Archer Field PC from Juniper Systems. "Field PC" is perhaps a bit of a misnomer as "PC" generally implies a Windows-based computer. The Archer is Windows-based alright, but it's Windows Mobile, so it's really a Pocket PC or whatever Microsoft is trying to call handhelds these days. We still generally call these machines Pocket PCs, or just PDAs, the term Apple originally used when it came out with the Newton back in 1993.

Creating a "rugged" PDA isn't easy. And just like "rugged" notebooks or slate computers, the degree of ruggedness varies greatly. Commercial products really don't have that problem. It's the electronic guts and then a plastic case that should look good, be small and light, and hold up in daily use. It doesn't have to be waterproof or be able to absorb punishment, like drops or getting crushed and so on.

For mobile computers used in field work, things are very different. If you use a machine outdoors, all sorts of stuff can happen. For one thing, outdoors is not an air-conditioned 72 degrees all year round. It can get very cold and very hot. Some electronics don't like that. Also, outdoors it rains. And sometimes pours. And a handheld terminal may even fall into a puddle or get sloshed by water some other way. Dropping it is a distinct possibility. And that generally happens when you pull it out of a bag or pocket, or while holding it. So it should survive drops of four to five feet. There's other stuff to consider. If it goes up in a military airplane, pressure may be an issue. If it's strapped to a truck, vibration can be the killer issue. And in certain flammable environments it is imperative that there is no chance the device can ignite anything with a little spark or arc. There's more, but one thing isn't usually listed: if a device must be rugged, it's likely going to be used outdoors, and outdoors there is sunlight. So the display must be readable outdoors. That's never included in ruggedness specs as it is, technically, not an environmental exposure issue. But it's part of what a rugged device must be.

So how do manufacturers go about building rugged handhelds? In many different ways. While the guts of a Windows Mobile/CE device are fairly standard, rugged housings most definitely are not. As a result, almost everyone does it in a different way. Here at RuggedPCReview.com, we love looking at, and analyzing, those different design approaches.

In a way, making a handheld tougher is not that different from making a slate or notebook computer tougher. Seek the traditional weak points and eliminate them. Consider all possible accidents and challenges and address them. And since building a rugged device usually means higher cost, larger size, and higher weight, have a very clear view of what exactly you're trying to achieve. The design must be just right for its intended use.

So how does all that apply to the Archer handheld built by the friendly folks at Juniper Systems in Logan, Utah? Well, they have a history of catering to agricultural markets, and then branched into all sorts of other outdoor markets, like surveying, forestry, fisheries and so on. So whatever they build should be fairly waterproof, able to handle a drop, and just generally be a tool that its owner can take along on a hard day's work in the wild, without having to baby the computer.

When you first see the Archer, and usually you see the one with the bright orange protective molding, it has a friendly look that is far removed from some of the deadly-serious designs that, if they were in a Pixar movie, would probably say, "Sir, unless you're military and have proper clearance, you are not authorized to touch me. Please step away." When you look at the Archer, instead, pumpkin comes to mind. Same orange, same texture. That provides excellent visibility, which is a good thing if you accidentally drop it in the woods and then have to backtrack to find it. For that, bright orange is much better than camouflage.

But take a closer look and the Archer is a rather nasty wolf in sheep's clothing. The friendly elastomer overmold comes off easily and underneath it's a hefty case made of magnesium. Hefty as in you could probably take a sledgehammer to it. I described all of this in the review, but seeing this "compartmentalized" approach to designing a rugged device was really interesting. The Juniper engineers must have said, "Look, if we enclose the whole box in a waterproof and dustproof shell, how are we going to have connectivity? Hardly possible. So let's separate things into a totally sealed core and then protect that with rubber molding that can easily be replaced. And we just seal the electronic contacts and leave the actual jacks exposed. Think that'll work?"

It does, with some limitations. The Archer's housing is certainly an "armored core" and close to invulnerable, but dust and water can get into the jacks and other places. Which means the Archer DOES offer great connectivity in an ultra-rugged device, and if it falls into the water or hits a dust storm it will not fail, but afterwards you have to take it apart and dry and clean everything outside of the armored core.

A couple of months ago we did a little stunt with the Trimble/TDS Nomad by actually taking it scuba diving. It was just in a pool, but it made for great video and underwater pics. I wanted to do the same with the Archer after we determined that it could handle it, but the water was pretty cold by now and so we just dropped it into the pool. Juniper's most helpful Pat Trostle had told me how they often display the Archer in a fishtank at trade shows, but that they keep an eye out for air bubbles, which usually mean the thing is flooding. I've flooded a few underwater cameras in my day and know what Pat meant. So when bubbles emanated from the Archer upon being dropped into the pool, I felt a little burst of anxiety until I remembered that, of course, there would be some bubbles. They come from the air escaping the outside overmold and the plastic block that houses the interface jacks. No water gets inside the core, of that I was sure. And none did. But it had to be taken apart and carefully cleaned and dried afterwards. Professionals would do that anyway, so no worries there. Saltwater may be a bit of an issue, though, and I wonder if Juniper has data on the long-term effects of repeated contact with saltwater.

Later, we did drop tests by carelessly swiping the Archer off a wall and down onto a rather rough driveway surface. We did that two or three times and I was afraid the unit would go back to Juniper with some good scratches. Amazingly, no scratches at all. That is impressive. Rugged devices with exposed metal almost always scratch. Apparently not this one.

Like many mobile computers, the Archer can be expanded in a number of ways, via SD and CF card slots. That way customers can use their own choice of expansion cards rather than being stuck with whatever is integrated into the unit. That's a good solution, and Juniper offers several extended caps that fit over such expansion cards. Amazingly, they claim that even with those caps in place over expansion cards, the unit retains the exact same IP67 ingress protection rating. That is a tall order. The way they do it is by separating the extension caps into two pieces. One is a precision-engineered adapter plate with an o-ring type of seal. The cap then screws on top of that. It works beautifully. But as anyone familiar with underwater housings knows, the o-ring approach depends totally on having immaculately maintained o-rings or sealing plates (which is what Juniper uses). Rings are, as far as I am concerned, easier to maintain as they can be replaced. The soft rubber sealing plate in our adapter was slightly deformed, and I wondered if it still sealed properly. I didn't want to risk flooding the machine and thus didn't put it to the test.

It was an interesting experience, reviewing the Archer. It is fully up to the job and probably suitable for a far wider range of applications than Juniper currently pursues. But it also showed me again that design of professional equipment is only one part of the whole package. The other is the care the professional him/herself takes in working with, and maintaining, the equipment. These are tools for tough jobs, and good professionals always treat their tools with care and respect.

Posted by conradb212 at 05:08 PM | Comments (0)

November 15, 2007

Tests and reviews - how much punishment?

I love rugged machinery, and so does everyone else here at RuggedPCReview.com. When a new machine comes in, everyone wants to see it, touch it, comment on it, and speculate how much abuse it can take. And this is where it gets interesting, the degree of abuse.

Rugged machines are, by design, conceived and built to take a beating and survive. But the only way to know for sure whether they indeed CAN take a beating is to administer one. And whether or not we should do that is a sensitive issue. A lot of this equipment is not inexpensive. So do we take a $4,000 computer, drop it, twist it, spill coffee on it, try to see if the screen is really scratch-proof and whether it's really water-proof? And then send back, at best, a severely banged-up machine, and at worst, one that is destroyed? Dvorak may get away with that, and maybe some of the few remaining big print magazines, but I am not sure most eval unit coordinators would look upon a publication with such a reputation with great favor.

That puts us in an interesting situation. We really think that rugged equipment should be just as rugged as manufacturers say it is, and sometimes we have doubts. We also see some stuff we are not very fond of. For example, glossy metallic surfaces that can and will get scratched in an instant simply should not be on a rugged machine, no matter how cool they look. But even there, do we just mention that in a review, or see just how badly it scratches (or not), document that, and then send it back?

Most rugged machines come with ruggedness specs. MIL-STD results are listed and perhaps compliance with other testing procedures as they may vary from country to country. That can include inhouse testing and third-party independent tests in labs. Now I have seen many of those torture chambers -- those of Panasonic, GD-Itronix and Intermec, to name a few. I've seen machines being baked, shaken, rattled, dropped, scratched, exposed to extreme humidity, vibration, pressure, materials fatigue testing and more. The tests are real, and they certainly reveal weak points that are then addressed.

Problem is that the reported testing results are not always very informative. MIL-STD testing means just that; a piece of equipment has been tested in accordance with the procedures mandated in a MIL-STD document. Often it is not reported what the outcome was, or if the machine even passed. Or only part of the test results are included in the specs. So prospective customers often do not have enough data to really compare. Some of the big companies in the field are guilty of not including truly meaningful ruggedness specs, and that doesn't do anyone a favor.

Sometimes we do go beyond simply describing a machine and administer our own torture testing. When Trimble/TDS claimed their Nomad handheld was waterproof to the extent that it would survive for an hour in a full meter of water, we decided to see if that was really so. I made sure they were okay with that. We used scuba gear and actually took it for a dive. I used it underwater and pushed the specs. The Nomad went down to maybe seven feet, it stayed underwater for a good while, and it survived. It worked underwater, and I even used its handwriting recognition down there. It's all on video and up on YouTube.

As a result, some manufacturers may be reluctant to send us their gear because -- hey -- those guys at RuggedPCReview may actually check the ruggedness specs for themselves. Others send us gear with the specific request to do so.

A current example: Toshiba makes a remarkable machine, the R500 notebook. It is ultra-light and definitely not fully rugged. But it has an awesome outdoor-viewable display and was designed to take the kind of punishment that may occur on the road. I think a Toshiba rep called it "executive-rugged". The R500's display case is very flexible, so much so that we had our doubts whether it'd hold up to any abuse. Well, Toshiba explained it was designed that way, and there is even a video showing the machine taking abuse, its LCD being twisted to a frightful extent, and surviving. We're tempted to see if we can duplicate that, but should we? The last thing I want to do is send the R500 back with a busted display.

For the most part, all this doesn't pose a dilemma. Most of the time the official test results are very clear and we see no reason to doubt them, nor would we have the ability to duplicate the torture testing. But the question does come up at times, and hence this column.

What we would like to challenge the rugged industry to do is this: State all ruggedness specs fully and clearly enough so readers will know what exactly the machine passed, and, more importantly, what it means.

Posted by conradb212 at 02:43 PM | Comments (0)

September 08, 2007

Underwater computing?

Underwater computing? Now that's a novel concept. For the past 15 years I've been dealing with rugged computing equipment, machines that can be dropped, survive in dusty environments, continue to operate whether it's scorching hot or really cold. They can also handle rain, though these days the trend seems to be surviving an accidental coffee or soda spill onto the keyboard. Sort of like cupholders in cars have become a make-or-break feature, second only to how many DVD screens for entertainment they have.

Anyway, it's not unreasonable to expect computers to come in contact with water. It covers 70% of the planet. People hang out around water. It rains. So we might expect a rugged handheld to continue to function if it is exposed to water. Why am I thinking of that? Well, maybe it's because I took up diving last year and have since been exposed to some pretty amazing equipment that does work underwater.

For example, divers depend on dive computers. That's because diving subjects the human body to much higher pressure than it is subjected to on the surface. To counteract that pressure, the air a scuba diver breathes is also much denser. At a depth of 33 feet, for example, the pressure is twice that on the surface, and the air that is released from the scuba tank via the regulator is also twice as dense. That means that the partial pressure of nitrogen is twice as high, and according to William Henry's law, more nitrogen dissolves into body tissues. Once the diver comes up and the pressure lessens, that nitrogen is released from the tissues again. Normally it just goes into the bloodstream and is safely breathed out through the lungs. However, if the diver ascends too quickly, or if s/he has absorbed a large amount of nitrogen during a long, deep dive, the released nitrogen can form bubbles, and that can have dire, and at times deadly, consequences. Divers used to compute safe dive times on dive tables, and that is still being taught in scuba classes, but almost everyone uses a dive computer these days. Dive computers are sophisticated devices that continually measure depth and compute absorbed nitrogen. They show numerous values on their displays, tell the diver how much longer s/he can stay at a given depth, and when it is time to go up.
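
To make the numbers concrete, here is a minimal Python sketch of the ambient-pressure arithmetic described above, using the usual recreational-diving approximation of one atmosphere per 33 feet of seawater. Real dive computers layer far more elaborate tissue-compartment models on top of this; the function names here are mine, for illustration only:

    # Ambient pressure and nitrogen partial pressure at depth, assuming
    # the rule of thumb of 1 atm per 33 feet of seawater. Real dive
    # computers use multi-compartment tissue models; this is just the
    # pressure arithmetic that feeds them.
    N2_FRACTION = 0.79  # roughly the nitrogen fraction of air

    def ambient_pressure_atm(depth_ft):
        return 1.0 + depth_ft / 33.0

    def nitrogen_partial_pressure_atm(depth_ft):
        # Dalton's law: partial pressure = gas fraction * total pressure
        return N2_FRACTION * ambient_pressure_atm(depth_ft)

    for depth in (0, 33, 66, 99):
        print(f"{depth:3d} ft: {ambient_pressure_atm(depth):.2f} atm, "
              f"ppN2 {nitrogen_partial_pressure_atm(depth):.2f} atm")

At 33 feet both the ambient pressure and the nitrogen partial pressure have doubled, which is exactly why, per Henry's law, tissues load up on nitrogen twice as fast down there.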

Needless to say, dive computers must be totally and completely reliable. Failure is not an option. Leaking is not an option. Bugs are not an option. And wimpy battery life is not an option. And they must be able to handle not just a bit of splashing, not just a few minutes at three feet, but potentially hours at hundreds of feet. Without failing, ever. My dive computer has a wireless connection to my air tank so that it knows how much air I have left. After using the computer for a YEAR, the battery is still at 95%. Extreme "technical" diving may require very sophisticated dive computers to perform numerous life-supporting tasks at depths of many hundreds of feet. Sure, some look just like watches, and we're used to trusting watches to survive swimming and snorkeling and a bit of diving. But many are larger -- sophisticated devices bigger than smartphones or PDAs, with large displays and several controls.

But it's not just dive computers. It is also cameras. As a reviewer of rugged mobile computing equipment I have an appreciation for one of the standards by which we judge a machine's ability to protect itself from dust and water, the Ingress Protection, or IP, rating. A handheld rated at IP56 is one tough machine and can likely survive in just about any environment. If, in addition, it can survive four foot drops, well, that is a very rugged device, and it probably looks like one, too.
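
For readers who haven't memorized the notation: the two digits of an IP rating grade two separate things, protection against solids (dust) and against liquids, per the IEC 60529 standard. Here is a minimal sketch decoding the ratings that come up in this post (only the relevant protection levels are listed):

    # Decode the two digits of an IP rating per IEC 60529. Only the
    # protection levels mentioned in this post are included here.
    SOLIDS = {5: "dust-protected", 6: "dust-tight"}
    LIQUIDS = {
        6: "protected against powerful water jets",
        7: "protected against temporary immersion to 1 m",
        8: "protected against continuous immersion beyond 1 m",
    }

    def decode_ip(rating):
        solids, liquids = int(rating[2]), int(rating[3])
        return f"{rating}: {SOLIDS[solids]}, {LIQUIDS[liquids]}"

    for r in ("IP56", "IP67", "IP58"):
        print(decode_ip(r))

So IP56 keeps out dust and water jets, while a second digit of 7 or 8 promises survival under actual immersion.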

Diving exposed me to equipment that can do all that, and more. Case in point: a camera that Olympus makes. If you were to look at the Stylus 770 SW, you'd see a snazzy, handy little digital camera measuring 3.6 x 2.3 x 0.8 inches and weighing a bit over six ounces, battery included. It looks very elegant with a matte-silver finish. It has a bright 2.5-inch LCD display that's larger than those on most smartphones. The camera has about a dozen hardware controls, mostly pushbuttons, but also a navigation disk. There is a microphone and a speaker. What is special about it?

It is rated IP58. It can survive 5-foot drops. It is crushproof. It can operate at 14 degrees Fahrenheit. And it can be operated in 33 feet of water.

It does that without any protective case at all. No rubber bumpers, nothing. Just very intelligent design, meticulous manufacturing, and good sealing. It costs just over US$300.

As a diver, I took that camera down to not only 33 feet, but 67 feet, and later 77 feet. It stayed underwater for a good hour. No problem at all. At the maximum depth I reached, the water pressure was so great that some of the push buttons were pushed in. And a small black rectangle showed up in the center of the LCD, from the water pressure. But it continued to take pictures.

What those dive computers and cameras like the Olympus 770 SW show is that it is possible to create sophisticated electronic devices that can function underwater. I totally agree that there probably isn't a great need for handhelds you can take diving. Then again, some people out there might just like to have one. Most likely, we haven't even really started to think about possible applications.

I am pretty sure military divers would make good use of an underwater rugged computer. And commercial divers would, too. Even recreational divers might just love to take a handheld underwater, or perhaps a tablet so they can write on it or doodle or draw. Divers communicate mostly via hand signals, and those are often misunderstood. As an alternative they write on little slates. A computer or electronic slate would certainly be much better. As I write this I am supposed to follow up on a new underwater texting technology -- texting like SMS on cellphones. My guess is whatever device is used for that must be rugged and quite waterproof.

As is, we have a brand-new Trimble/TDS Nomad rugged handheld in our lab. It is a very tough handheld computer with an IP67 rating and thus was designed to survive immersion in water. We may put that to the test and record the performance on video. Simply don scuba gear and find a nice comfy spot somewhere at a depth of six or seven feet. Then see if it works. Without, of course, exceeding design specs. It's been pointed out that touch screens have not been designed to deal with water pressure and may thus fail to operate properly. At a depth of seven feet, the pressure on the touch screen would indeed be about 21% higher than on the surface, and this might make it inoperable, depending on design.
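
That 21% figure, by the way, falls right out of the same one-atmosphere-per-33-feet rule of thumb used earlier:

    # Seven feet of water adds 7/33 of an atmosphere on top of the
    # 1 atm already pressing down at the surface.
    extra = 7 / 33
    print(f"{extra:.1%}")  # -> 21.2%

Whether a touch screen shrugs that off or registers it as one continuous press is exactly the kind of thing such a test would reveal.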

Is rugged underwater computing on the horizon? Is there a need for it? Personally I think there is. There are practical applications. And besides, it is always interesting to see if something can be done. Hey, Olympus did it, with a vengeance.

Posted by conradb212 at 11:12 PM | Comments (0)

July 31, 2007

Marvell, not Intel

I spend a lot of time updating the vast database of rugged devices listed and reviewed here at RuggedPCReview.com. Specs change all the time but the rugged and mobile computing industry is usually very modest when it comes to press releases and announcements. It's not like certain other fields where every new cellphone ringtone or executive promotion warrants a major PR campaign. So the way we go about it is making the rounds of all the companies, via their web sites, and check for updated specs.

One thing I noticed is that even in updates, almost everyone continues to refer to the "Intel XScale" processor, the family of chips that power almost all Windows Mobile and Windows CE devices. Well, Intel doesn't make them anymore. They sold that business to a company named Marvell. Here's what happened:

Marvell Technology Group -- a silicon solutions high tech firm based in Santa Clara, California -- decided to become a supplier in the cellphone and consumer electronics markets, and officially took over Intel's communications and applications processors in November of 2006. The original deal between Intel and Marvell was made in June of 2006 when Marvell agreed to buy the business from Intel for US$600 million. Under Intel's watch, the XScale PXA series was used in a wide variety of devices, with the PXA2xx used in Windows Mobile devices and the PXA9xx in such handhelds as the Blackberry 8700. Early on, Intel benefitted greatly from Microsoft's decision to switch Windows CE from a more or less open processor platform to mandating XScale. The deal between Intel and Marvell took several months to complete as Marvell had to find a manufacturer for the chips. Under the deal, Marvell took over the 3rd generation XScale processors, codenamed Monahans, and the 1.25GHz successor to the PXA27x Bulverde processors.

Losing no time, in December 2006 Marvell launched the PXA 3xx series, consisting of the high-end PXA320 running at 806MHz, the cost-optimized low-end PXA300, and the PXA310. The PXA300 and 310 run at clock speeds up to 624MHz, with the 310 adding VGA video playback. The PXA320 is able to scale between 806MHz and 624MHz to conserve power when full performance isn't needed. The chip is also more energy-efficient than the predecessor Bulverde processor, especially under heavy video and audio load. The PXA320 can run VGA resolution video at 30 frames per second and support a 5-megapixel digital camera and video telephony, all at lower power consumption than the older XScale chips. The first products with the 806MHz PXA320 are now appearing, such as the recently released Trimble Nomad.
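
The clock scaling bit deserves a quick illustration. Dynamic power in CMOS logic scales roughly with C x V^2 x f, so even a pure frequency drop with no voltage change buys a proportional saving. A sketch with illustrative numbers, not Marvell's actual figures:

    # Relative dynamic CMOS power after scaling frequency (and optionally
    # voltage): P is roughly proportional to C * V^2 * f.
    def relative_dynamic_power(f_new, f_old, v_new=1.0, v_old=1.0):
        return (f_new / f_old) * (v_new / v_old) ** 2

    saving = 1 - relative_dynamic_power(624, 806)
    print(f"~{saving:.0%} less dynamic power")  # -> ~23% at the same voltage

And if the voltage can come down along with the clock, the savings compound quadratically, which is where most of the real-world benefit comes from.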

From what we can tell, Marvell will continue to offer both the XScale PXA27x family as well as the older PXA255. But they are now Marvell chips, and no longer Intel chips. So let's do a global search and replace: It's Marvell XScale PXA and no longer Intel XScale PXA.

The emergence of the PXA 3xx processors is exciting. More performance and more capabilities at lower power consumption. That's great. We can't wait to do hands-on reviews of the first Marvell XScale powered devices!

Posted by conradb212 at 11:11 PM | Comments (0)

July 25, 2007

The RuggedPCReview Blog launches

Well, we finally added a blog section to RuggedPCReview.com. Yes, I know, everyone and their uncle has a blog these days, but I think it definitely makes sense to have one at a site like this where we are compiling information on just about every rugged mobile device out there. As is, our front page lists daily news and alerts readers to additions to the site, but often there is more than that. What's in a review is not always the whole story -- there's more to tell. Impressions, circumstances, interactions with PR people, engineers, product managers, testing, all the stuff that generally does not go into a review. That's one thing.

Another is that we tend to have our own opinions on matters. Be it new developments in the field, new products, company acquisitions, mergers, or consolidations. Anything that affects the rugged industry landscape. Or promising new technologies, and how we see them fitting into rugged computing. Sometimes we do factory visits, and those are always fascinating and provide new insights.

Other times we have gripes, or come across stuff that simply doesn't make sense. So we may wonder, "What were they thinking?!?" and contemplate that. Or we may be presumptuous enough to offer commentary and recommendations, or share our views on developments.

Finally, we go to shows. We see new stuff. We talk to people. All that will go in here. And it won't just be us guys here at RuggedPCReview.com doing all the commenting and blogging. No, we'll invite guest bloggers to share their views and insights, so that you'll get as broad a cross section of opinions as possible. So if you have something to say or contribute, let us know via email to cb@ruggedtablet.web.fc2.com!

Posted by conradb212 at 11:11 PM | Comments (0)

 