Monday, February 28, 2011

Facebook plans to resume address, phone sharing

Despite congressional criticism, Facebook is planning to resume the aborted rollout of a feature that allowed the optional sharing of addresses and mobile phone numbers.


Facebook said in a letter (PDF) released today that it is evaluating different ways to "enhance user control" over information sharing that would go into effect "once the feature is re-enabled."


The social-networking site encountered some criticism in January after announcing the feature, which allowed applications to request permission to access user information. Only if the user clicked "Allow" was information shared.
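For readers unfamiliar with how such prompts worked, here is a rough, hypothetical sketch of a permission request using the Facebook JavaScript SDK of that era; the option name, permission names, and response fields shown are assumptions for illustration, not details taken from Facebook's letter.

```typescript
// Hypothetical sketch of how an app of that era might have asked for the new
// permissions through Facebook's JavaScript SDK. The "perms" option, the
// permission names, and the response fields are assumptions, not details
// confirmed by the article.
declare const FB: {
  login(
    callback: (response: { session: unknown | null; perms?: string }) => void,
    options?: { perms: string }
  ): void;
};

function requestContactInfo(): void {
  FB.login((response) => {
    if (response.session && response.perms) {
      // The user clicked "Allow," so the app may read the granted fields.
      console.log("Granted permissions:", response.perms);
    } else {
      // The user declined, so no address or phone number is shared.
      console.log("Permission request denied or cancelled.");
    }
  }, { perms: "user_address,user_mobile_phone" });
}
```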


Only three days after announcing the platform update, Facebook voluntarily delayed it, with Douglas Purdy writing that "we are making changes to help ensure you only share this information when you intend to do so."


Reps. Ed Markey (D-Mass.) and Joe Barton (R-Texas), who have a history of assailing tech companies including Apple and Google over perceived data transfer snafus, suggested in a letter (PDF) on February 2 that the pop-up permissions window was insufficient "given the sensitivity of personal addresses and mobile phone numbers compared to other information users provide Facebook."


Facebook's response, prepared by Marne Levine, vice president for global public policy, stressed that applications that run on the Facebook platform have long had the ability to ask for information. For example, Levine wrote, "a photo-printing application that prints photos for a user requests permission specifically to access a user's photo; a social-gaming application that allows users to play a game with his or her friends requests permission to access the user's friends list."


In last month's announcement that dealt with contact information, Levine wrote, "we allowed applications to ask users for that information, through a permissions screen...that provided clear and conspicuous notice to the user regarding what information the application is seeking."
And in response to the politicians' point about minors, Levine said that anyone under 13 is prohibited from using Facebook, and the company is "actively considering" whether to allow applications to request this information from minors who are 13 or older.


Markey said in a statement today that he's not satisfied with Facebook's response.
"I don't believe that applications on Facebook should get this information from teens, and I encourage Facebook to wall off access to teen's contact information if they enable this new feature," Markey said. "Facebook has indicated that the feature is still a work in progress, and I will continue to monitor the situation closely to ensure that sensitive personal user data, especially those belonging to children and teenagers, are protected."


Separately, Facebook announced last week that it's asking for comments on a proposed revamp of its privacy policy that's meant to make it easier to understand.


Source: http://news.cnet.com/

Friday, February 25, 2011

New Google search ranking algorithm promotes higher-quality sites

Our goal is simple: to give people the most relevant answers to their queries as quickly as possible. This requires constant tuning of our algorithms, as new content—both good and bad—comes online all the time.

Many of the changes we make are so subtle that very few people notice them. But in the last day or so we launched a pretty big algorithmic improvement to our ranking—a change that noticeably impacts 11.8% of our queries—and we wanted to let people know what’s going on. This update is designed to reduce rankings for low-quality sites—sites which are low-value add for users, copy content from other websites or sites that are just not very useful. At the same time, it will provide better rankings for high-quality sites—sites with original content and information such as research, in-depth reports, thoughtful analysis and so on.

We can’t make a major improvement without affecting rankings for many sites. It has to be that some sites will go up and some will go down. Google depends on the high-quality content created by wonderful websites around the world, and we do have a responsibility to encourage a healthy web ecosystem. Therefore, it is important for high-quality sites to be rewarded, and that’s exactly what this change does.

It’s worth noting that this update does not rely on the feedback we’ve received from the Personal Blocklist Chrome extension, which we launched last week. However, we did compare the Blocklist data we gathered with the sites identified by our algorithm, and we were very pleased that the preferences our users expressed by using the extension are well represented. If you take the top several dozen or so most-blocked domains from the Chrome extension, then this algorithmic change addresses 84% of them, which is strong independent confirmation of the user benefits.

So, we’re very excited about this new ranking improvement because we believe it’s a big step in the right direction of helping people find ever higher quality in our results. We’ve been tackling these issues for more than a year, and working on this specific change for the past few months. And we’re working on many more updates that we believe will substantially improve the quality of the pages in our results.

To start with, we’re launching this change in the U.S. only; we plan to roll it out elsewhere over time. We’ll keep you posted as we roll this and other changes out, and as always please keep giving us feedback about the quality of our results because it really helps us to improve Google Search.

Posted by Amit Singhal, Google Fellow, and Matt Cutts, Principal Engineer

Intel's Thunderbolt: What you need to know


Intel's long-awaited Light Peak technology, now known formally as Thunderbolt, is finally available on its first consumer device, and the company today unveiled more details about when we'll be seeing it in consumer PCs and gadgets.


First unveiled at the Intel Developer Forum back in 2009, the data transfer tech promises to replace a handful of ports with one that can do more things, and do them faster.


Its first inclusion in a computer is in Apple's MacBook Pro line, which was refreshed earlier today with Thunderbolt ports across the line (see CNET's hands-on here). Intel followed up a few hours later with a press conference about the technology, as well as its plans to bring it to computers and devices over the next year or so.


To help readers better understand what the technology is and why it matters, CNET has put together this FAQ.


What is Thunderbolt? 
Thunderbolt is Intel's new input/output technology that promises to bring transfer speeds that exceed what is currently available with USB 3.0, as well as extending that speed across several devices at once. In terms of where you'll see it, Thunderbolt will appear as a new port on laptops and PCs, as well as on devices that support it.
  
The technology itself makes use of existing DisplayPort and PCI-Express data protocols to open up what you can do with a single port into multiple uses and at high speeds. This includes "daisy chaining" up to seven Thunderbolt-equipped devices together, while retaining full speed across all of them at once.


How fast is it? 
Thunderbolt currently runs with a top speed of 10Gbps, though it promises to one day top 100Gbps in data throughput when it moves from copper wire to optical fiber. In the interim, copper wire has both speed and cable-length limits, keeping cables at 3 meters or less. The data transfer is also bidirectional, meaning it can both transmit and receive data at the same time, and at its top speed.


During Intel's press conference about the technology this morning, the company demonstrated it working on a MacBook Pro, pulling four raw, uncompressed 1080p video streams through a Thunderbolt storage array and feeding them into a Thunderbolt-attached display, all while topping 600MBps in transfer speed. An earlier test of just file transfers had gotten it up to 800MBps.


To put this in perspective against what's been available up to this point, that's twice as fast as the theoretical limit of USB 3.0, 20 times faster than USB 2.0, and 12 times faster than FireWire 800.
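Those multiples check out with a quick back-of-the-envelope calculation, assuming the commonly cited nominal rates of 5Gbps for USB 3.0, 480Mbps for USB 2.0, and 800Mbps for FireWire 800 (assumed figures, not numbers from Intel's presentation):

```typescript
// Back-of-the-envelope check of those multiples, using commonly cited nominal
// rates (assumed here, not figures from Intel's presentation).
const thunderboltGbps = 10;
const nominalRatesGbps: Record<string, number> = {
  "USB 3.0": 5,        // 5 Gbps
  "USB 2.0": 0.48,     // 480 Mbps
  "FireWire 800": 0.8, // 800 Mbps
};

for (const [name, rate] of Object.entries(nominalRatesGbps)) {
  console.log(`${name}: Thunderbolt is ~${(thunderboltGbps / rate).toFixed(1)}x faster`);
}
// Prints roughly 2.0x, 20.8x, and 12.5x -- in line with the "twice,"
// "20 times," and "12 times" figures above.
```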
A demo at this morning's Intel press conference gave an idea of what the technology is capable of in a video editing and viewing workflow, as well as in a file transfer.





When can I get it? 
The long and the short of it is that you can get Thunderbolt today, so long as you buy Apple's MacBook Pro, which is the first laptop to ship with a Thunderbolt port as a standard port across its entire line.
As for its arrival in PC laptops and desktops, the company today estimated that we won't see it there until early next year, given OEM design cycles. In the interim, a slew of Thunderbolt-ready devices, such as hard drives and displays, will arrive this spring. One of the first will be a LaCie external hard drive called the Little Big Disk, which packs multiple solid-state drives into a single Thunderbolt-equipped enclosure.


Will I be able to add it to my old PC or laptop? 
If your old machine is a PC you built, replacing its motherboard with one that will carry Thunderbolt will do the trick. During Intel's press conference today, the company stayed mum on offering it as an expansion to PCs through PCI Express slots, or laptops through ExpressCard technology.


Does this replace USB? 
Intel is positioning Thunderbolt as an "adjacent" technology, one that will complement USB rather than replace it. In any case, USB's ubiquity means it's not going anywhere just yet, and Intel has said it plans to support USB 3.0 in future chipsets alongside Thunderbolt.


How much will it cost? 
Intel has stayed mum on cost beyond saying that it is competitive with other high-performance I/O solutions. As for its inclusion in the new MacBook Pros, it has been added as a standard feature across the entire line rather than as a paid add-on at the time of configuration.
The same cost principle applies to Thunderbolt's cables. Because Thunderbolt is not an open specification, companies cannot simply make their own cables under a license, though that could change later in the product's lifespan.




Source: http://news.cnet.com

Thursday, February 24, 2011

Symbian is here to stay, says Nokia


Nokia stalwart Symbian isn't quite making its exit yet even though the world's largest phone maker is switching to Microsoft's Windows Phone 7 as its primary smartphone platform.


"Just because we're changing our direction in terms of smartphone platform, it doesn't mean that the existing platform is completely broken," said Vlasta Berka, general manager for Nokia Singapore, Malaysia, and Brunei, at the launch of the E7 smartphone in Singapore today. "We still have obligations to our users, developers, business partners, and customers."
According to Nokia, there are currently 200 million Symbian users around the world. The Finnish outfit said it expects to sell about 150 million Symbian devices going forward.


"Symbian is here to stay. Symbian will still be around, but it's just going to go somewhere around the corner," Berka added.


The latest E7 smartphone features a 4-inch AMOLED display, physical QWERTY keyboard, 8-megapixel camera with dual-LED flash, 16GB of onboard memory and a suite of enterprise solutions. It retails at S$989 ($774) in Singapore.


Berka also tried to allay fears that Nokia will cease support for the Symbian platform. He said over 50 improvements, from visual to performance enhancements, will be rolled out for the rest of this year.


Last week, Nokia Chief Executive Stephen Elop noted that the Qt development environment used for Symbian devices would not be usable for Windows Phone 7 applications. He said this was to ensure that differing platforms don't confuse developers or consumers.


According to Gartner, Symbian's market share dropped from 46.9 percent in 2009 to 37.6 percent in 2010.


Source: http://asia.cnet.com/

Tuesday, February 22, 2011

It's time for Google Docs to work offline



Google is betting on a future with ubiquitous, affordable, wireless, high-speed Internet access. That may be smart in the long run, but this week that philosophy drove me straight back into the arms of Microsoft.
My technology choices generally come down to pragmatism rather than religion, and it was pragmatism that led me to embrace Google Docs last year. I like the fact that I can work simultaneously on multiple computers--indeed, even on mobile phones these days--and that multiple people can easily collaborate. My requirements for advanced formatting and formulas are low enough that I generally can put up with the shortcomings.


Here's what I don't like, though: For Google Docs, you need a network connection.
I just spent five days at the Mobile World Congress show in Barcelona. Contrary to what one might hope for a show devoted to the latest in mobile communications, the wireless networking at the show generally ranged somewhere from crippled to crushed.


For reasons that baffle me, network giant Cisco sponsored the show's Wi-Fi, with signage in the halls touting it and attendees receiving a flier explaining how to use it. I'd have thought that Cisco, a company with a brand to promote and protect, would have learned by now to steer clear of tech trade shows in which auditoriums filled with Net-enabled gadgets bring wireless networks to their knees.


I eventually got by with a Vodafone 3G dongle plugged into my computer's USB port, but that only worked some of the time (the dongle was too bulky to use alongside the other USB port, for example). And of course the data plan is expensive, I had to unplug it much of the time, and connecting to the network was slow.


Under these circumstances, was I going to rely on a word processor that needed a network connection? Not a chance.


Thus, it was back to Microsoft Word for me during the show.
I recognize that these trade show circumstances might be a little extreme when it comes to network failings, but there have been plenty of times driving around my previous home in California and my present one in England in which the network doesn't work for me. Taking the train into London, a classic commuter scenario if there ever was one, is one example.


Google had tried to enable offline Google Docs in years past using its now-discontinued Gears plug-in. That didn't work for me for a number of reasons: First, I use a Mac when traveling, and Gears broke with the release of Mac OS X 10.6, aka Snow Leopard. Second--and maybe this was some kind of user error--I just found it awkward.


I wasn't alone. The relatively low usage of the feature probably minimized the pain when Google announced last year that it was temporarily ditching the offline feature in a Google Docs overhaul that I otherwise like for its new abilities.


"We need to temporarily remove offline support for Docs starting May 3rd, 2010. We know that this is an important feature for some of you, and we are working hard to bring a new and improved HTML5-based offline option back to Google Docs," said product manager Anil Sabharwal in a blog post at the time.
How long will we have to wait? In December, Google promised that offline Google Docs will return "early in 2011." An eighth of the way into the new year, I'm looking at my clock, and Google isn't commenting on any particulars at this stage.


What's the holdup? First, I suspect, is browser support for a new standard called IndexedDB, aka the Indexed Database API. A general consensus backing IndexedDB emerged only a year ago, and browser support is only arriving now.
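For the curious, here is a minimal, hypothetical sketch of what caching a document locally with IndexedDB looks like, written against the standardized API rather than the transitional, vendor-prefixed versions shipping in early-2011 browsers; the database, store, and field names are invented:

```typescript
// Minimal sketch: caching a document locally with IndexedDB so it can be read
// and edited offline. Database, store, and field names are invented, and the
// vendor-prefix fallbacks reflect the transitional browsers of the time.
const idb: IDBFactory =
  (window as any).indexedDB ||
  (window as any).mozIndexedDB ||
  (window as any).webkitIndexedDB;

const openRequest = idb.open("offline-docs", 1);

openRequest.onupgradeneeded = () => {
  // Create a simple store for documents, keyed by document id.
  openRequest.result.createObjectStore("documents", { keyPath: "id" });
};

openRequest.onsuccess = () => {
  const db = openRequest.result;
  const tx = db.transaction("documents", "readwrite");
  tx.objectStore("documents").put({
    id: "doc-123",
    title: "Trip notes",
    body: "Edited on the train, with no connection in sight.",
    dirty: true, // flag the change so it can be synced later
  });
  tx.oncomplete = () => console.log("Document cached locally.");
};
```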


Aside from the browser issues, Google has some re-engineering to do as well. The earlier offline approach relied on a storage mechanism in Gears very similar to a browser technology called Web SQL Database. But facing Mozilla and Microsoft opposition, Web SQL lost out to IndexedDB.
In a perfect world, offline Google Docs would be an invisible, unnoticeable step away from online Docs. That means first and foremost that I'd be able to edit a document without an Internet connection, of course, with changes being synced with the online incarnation once a Net connection was re-established. But it would mean more than that. I also should be able to create new documents, search my archive, and perform file-management tasks such as adding a document to a collection.
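As a rough, hypothetical sketch of that sync step--using generic browser APIs, not anything Google has announced--offline edits could simply be queued and flushed when connectivity returns; the endpoint and payload below are invented:

```typescript
// Hypothetical offline-edit queue: changes made without a connection are held
// locally and pushed to the server once the browser reports it is back online.
// The endpoint URL and payload shape are invented for illustration.
interface PendingEdit {
  docId: string;
  content: string;
  editedAt: number;
}

const pendingEdits: PendingEdit[] = [];

async function pushToServer(edit: PendingEdit): Promise<void> {
  await fetch(`/api/documents/${edit.docId}`, {
    method: "PUT",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(edit),
  });
}

function saveEdit(docId: string, content: string): void {
  const edit: PendingEdit = { docId, content, editedAt: Date.now() };
  if (navigator.onLine) {
    void pushToServer(edit);
  } else {
    pendingEdits.push(edit); // hold the change until connectivity returns
  }
}

// Flush everything that was queued as soon as the connection comes back.
window.addEventListener("online", async () => {
  while (pendingEdits.length > 0) {
    await pushToServer(pendingEdits.shift()!);
  }
});
```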


Those features are among the most basic actions one takes for granted in the Microsoft Office world. Although Google Docs shows promise, without those features, it's profoundly broken until that perfect network arrives.




Source: http://news.cnet.com/

Sunday, February 20, 2011

Nokia baits developers with free Windows Phones


Microsoft and Nokia may have spent much of Mobile World Congress trying to convince attendees of the merits of their fresh, new partnership, but the harder step will perhaps be persuading Nokia's loyal Symbian developers to shift allegiance.


There's nothing like free stuff to help change one's mind. To that end, Nokia will be giving away one E7 smartphone and one Nokia-made Windows Phone to each developer in Launchpad, its program for mobile app developers, according to information procured by SlashGear.


In addition, Nokia will waive its tech support charges for developers for the next three months.
Developer support is one key to making Nokia's Microsoft pairing successful, especially as the two companies are poised to battle Canadian BlackBerry maker RIM for market relevance behind Apple's iOS and Google's Android platforms.


On the handset front, Nokia will also feel pressure from Sony Ericsson, as both brands will keep a hungry eye trained on the US smartphone market.




Source: http://www.cnet.com/

Friday, February 18, 2011

Investing in Groupon Would Be a Bum Deal



I've yet to receive a Groupon offer I couldn't refuse. Most of the offers that hit my in-box are for discounts at day spas or hair salons. Clearly, they're targeting a different gender, and at no point has Groupon asked me for any information that would help it prepare offers that might be more to my liking. Both Google and Facebook have loads of personal data about their users -- and I'm certain they will use it to tailor personalized deals.


Despite Groupon's recent high-profile missteps, enthusiasm over the daily deal site's anticipated initial public offering doesn't seem to be dissipating.


That's unfortunate, at least in my opinion -- and my thoughts have nothing to do with Groupon running what were widely perceived as culturally insensitive Super Bowl ads or the bad deal it offered on Valentine's Day flowers.


The bottom line is that Groupon, as a business, is a poor investment.


For months, we've been hearing that Groupon is poised to launch an IPO sometime this year. Recent reports indicate it could happen as early as this spring.


That would be a wise move on the company's part, if you believe the Wall Street investment bankers who peg Groupon's value at US$15 billion. Among social media companies, only Facebook, which has an estimated value of $50 billion, is believed to be worth more.


A Bad Business Model


If Groupon proceeds with an IPO, the well-connected investors who are able to purchase the true initial shares will probably turn a nice profit  as the stock skyrockets in the short term. The regular people who buy in later won't be as lucky. Many of them are likely to end up feeling as if they bought into a Ponzi scheme.


My lack of enthusiasm over Groupon's prospects is based on two simple facts:


There is nothing unique about Groupon's business model; and
Groupon is doing almost nothing to enhance the model.
Groupon's business is built around a simple premise: Offer one discount a day on a product or service that users can purchase in their home towns. The discounts are often substantial -- sometimes as much as 90 percent.


Everybody loves a bargain, so it makes sense that the Groupon site has become popular. The company reportedly generates $1 billion in annual revenue by offering deals to 50 million subscribers in 35 countries.


Too Much Competition


There are many problems with Groupon's business model, however, starting with the fact that it's very easy to copy. That became evident when Groupon started its international expansion and ran into 147 competitors in Japan alone.


There also are numerous websites now peddling software programs that purportedly will allow just about anyone to create a Groupon clone. One of these sites will let you download a basic version of its software for free; the "premium" version is $95.


It's farfetched to think anyone could create a true competitor to Groupon with a $95 piece of software, but more formidable entities -- including Google and Facebook -- certainly could build services that would make Groupon nervous.


Facebook Deals is already operating in the U.S. and is set to launch in Europe and Canada, and Google Offers is expected to debut any day now.


Groupon thought it had effectively conquered the competition in Japan by shelling out $10 million to acquire the company that had amassed the largest share of that country's daily deal market.


Poor Order Fulfillment


Then, a problem that has plagued Groupon in the U.S. cropped up in Japan. Its new acquisition started selling coupons to more customers than the local businesses offering the deals could handle. This became major news in Japan when a restaurant on the outskirts of Tokyo ran out of the caviar it was supposed to be serving as part of a Groupon New Year's dinner package and began serving what customers described as limp fish instead.


The Valentine's Day fiasco -- in which people who purchased coupons for $20 off flowers at a special FTD-Groupon website later learned they could have gotten a better deal by going to the regular FTD site without a coupon -- proves that Groupon is not paying close enough attention to ensuring its customers actually get the discount they're expecting.


Beyond the fulfillment issues, Groupon has weaknesses that companies like Google and Facebook should find easy to exploit. Chief among these is the lack of personalization.


No Personalized Deals


I've subscribed to Groupon for several months now, and I've yet to receive an offer I couldn't refuse. Most of the offers that hit my in-box are for discounts at day spas or hair salons. Clearly, they're targeting a different gender, and at no point has Groupon asked me for any information that would help it prepare offers that might be more to my liking.


Both Google and Facebook have loads of personal data about their users -- and I'm certain they will use it to tailor offers to specific users on their deal sites.


Even Foursquare, which I have criticized as being less than a viable business in the past, is working on personalization. It apparently is close to launching a feature that gives users recommendations on new places to visit based on the places they have previously checked in.


A Missed Opportunity


Meanwhile, Groupon is offering refunds to customers who bought the FTD Valentine's Day coupons and still apologizing to the people of Tibet for giving the impression that it was exploiting their problems for financial  gain with its Super Bowl ad.


One big ironic twist to this story is that it was only after Groupon turned down Google's $6 billion purchase offer that Google started working on Google Offers.


Ultimately, Groupon's management team may find itself wishing it had taken that deal when it had the chance.


Source: http://www.ecommercetimes.com/

Thursday, February 17, 2011

The New Wave of App Development


The rapid growth of mobile devices is happening alongside several related trends, and together they're fundamentally changing the way software is developed. Tablets are gaining in popularity over netbooks, smartphones are getting smarter, enterprise workers want to use their phones for both business and personal tasks, and a growing amount of developer activity is happening online.


Several new and ongoing phenomena, all intertwined, will mark the next phase of application development.


There will likely be more emphasis on mobile apps. Tablets, particularly the iPad, are already cannibalizing netbook sales, and as mobile devices get multicore processors with more power and capabilities, they'll increasingly be useful in the enterprise.


Add to that the consumerization of technology, which is a fancy way of saying that workers use their personal mobile devices both at work and in their personal lives, and the demand for mobile apps gets a boost.


"Large enterprises are already starting to employ significant numbers of mobile developers in-house," Leila Modarres, vice president of marketing at DeviceAnywhere told TechNewsWorld. "In 2011, the majority of medium and large enterprises will be working on rolling out apps to their workforces, and in a lot of instances they'll develop them from scratch."


There will be an increase in Web app creation as developers seek to create products once that will run on multiple platforms, Modarres said.


An increasing amount of appdev activity will be conducted online  as enterprises seek to reduce costs, and that will further boost the concept of Platform as a Service (PaaS).




Going Mobile


The growing market penetration of tablet devices as well as smartphones "means that smart devices of one sort or another will slowly become the de facto communications, work and leisure tools," DeviceAnywhere's Modarres suggested.


That will give rise to the automation of mobile app testing.


"Enterprises already rely on test automation outside of the mobile world," Modarres pointed out. "Now, as they embrace mobile platforms, they want to extend the same time-saving and efficiency benefits to their work with mobile applications."


The Rise of Web Apps


There will also be a move toward developing Web apps because they'll let developers write apps once and run them on any platform.


"If you're a developer and want to be able to write once and write on many devices, the browser becomes a very compelling option," Dan Nguyen, vice president of product management at OpenWave, told TechNewsWorld.


The move toward browser-based development dovetails nicely into the shift toward mobile apps because many of the major companies offering browsers, such as Apple and Mozilla, are basing these products on WebKit.


"With new smartphones and now with Apple, Android, RIM (Research In Motion) and HP (NYSE: HPQ) through webOS using WebKit in their browsers, WebKit is becoming the de facto core of the new smartphone browsers," Nguyen stated.


"Unifying around WebKit offers uniformity that didn't exist before on the desktop or smartphone, and that reduces the fragmentation we have had to deal with in terms of apps," Nguyen elaborated. "Now apps are rendered consistently on browsers."


This will be especially useful for mobile apps, because there were "orders of magnitude more fragmentation" on the mobile side, Nguyen said.


Further, Web-based development can tap a broad pool of talent, Nguyen pointed out. "There are orders of magnitude more Web developers than Objective C developers," he remarked. "Also, tools for Web development are broader and more mature than those for mobile development, Web development having started back in the 1990s."


Appdev for the Non-Geek


New browser-based development platforms like the one from Mobiflex let anyone develop a mobile app using drag and drop, without knowing anything about technology or appdev.


"We give a whole range of users out there that don't have coding skills build their app in a visual way," George Adams, cofounder and CEO of Mobiflex, told TechNewsWorld.


Using Mobiflex to create an app is similar to creating slides in PowerPoint presentations, Adams said. "If you know how to do spreadsheets or PowerPoint you can build a complex, robust app that can run natively on either your iPhone or Android phone."


Users create their app layout on Mobiflex's platform, place buttons in the appropriate locations and select either the Android or iOS platform to run their app on. Mobiflex's code then translates the flow diagram and visual representation of the app screen in WYSIWYG fashion into native code for the smartphone platform of choice, Adams said.


While Mobiflex is currently targeting smartphones and tablets, its model can work for the enterprise, Adams contended.


Meanwhile, Microsoft has released WebMatrix, a free Web development tool that it claims lets anyone create, customize and publish websites easily. This includes IIS Express, a development Web server; ASP.NET, a Web framework; SQL Server Compact, an embedded database; and Razor, a new inline syntax for coding pages that adds dynamic functionality to HTML.


Moving Into PaaS


Increasingly, developers will turn to Platforms as a Service (PaaS) because they want to focus on creating apps rather than on managing servers and installations, Paul Kopacki, vice president of developer relations at Salesforce.com (NYSE: CRM), told TechNewsWorld.


"That's why so many developers are increasingly embracing cloud computing platforms like Heroku and Force.com," he added.


There will be two trends in PaaS, according to Vidur Apparao, chief technology officer at LiveOps.


One is the expanded use of domain-specific language, and the other is the exposure of PaaS services and capabilities to broadly focused languages and frameworks.


"These may seem like contrary trends, but are actually not mutually exclusive," Apparao told TechNewsWorld. "In fact, they support the same underlying goals -- the removal of barriers to adoption of PaaS services and support of a variety of types of applications and styles of application development."


Successful PaaS frameworks of the future will allow rapid application development through domain-specific languages within the framework, as well as easy integration and use within broadly focused languages and frameworks, Apparao predicted.


Increasingly, established SaaS (Software as a Service) vendors will begin offering PaaS-like services, Apparao forecast.


"SaaS vendors have invested in robust, scalable, multitenant architectures, and the most successful vendors will respond to developers' increasing demands for openness and flexibility," he explained. "SaaS companies with rich, powerful Web services APIs will begin to host developer logic for more time-critical tasks and will create the facilities required to offer developer sandbox instances of their platform."


Mobilizing PaaS


PaaS will begin moving toward mobile integration, Rob Christensen, senior vice president at EDL Consulting, told TechNewsWorld.


"Standard protocols such as REST and Web service support are included in today's PaaS environments," Christensen said.


While this is critical for mobile integration, Christensen expects to see PaaS environments add features such as data synchronization, authentication and built-in push notification. 
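To make the REST point concrete, here is a small, hypothetical sketch of a mobile Web client talking to a PaaS-hosted service; the base URL, authentication scheme, and resource shape are invented for illustration:

```typescript
// Hypothetical mobile client calling a REST endpoint exposed by a PaaS-hosted
// service. The base URL, token handling, and resource shape are all invented.
interface Lead {
  id?: string;
  name: string;
  phone: string;
}

const BASE_URL = "https://example-paas.invalid/api/v1";

async function createLead(lead: Lead, token: string): Promise<Lead> {
  const response = await fetch(`${BASE_URL}/leads`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      // Authentication would typically use a bearer token or API key.
      Authorization: `Bearer ${token}`,
    },
    body: JSON.stringify(lead),
  });
  if (!response.ok) {
    throw new Error(`Request failed with status ${response.status}`);
  }
  return (await response.json()) as Lead;
}
```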


Source: http://www.technewsworld.com

Wednesday, February 16, 2011

Cloud Security Takes More Than Superficial Teamwork


Doing business with a cloud provider necessarily means cooperating with a third party on some level in terms of data security. Sharing information does not come naturally to many security organizations, but working most effectively with a cloud provider requires a level of trust over and above what that organization has with other third parties. Successful strategies in the cloud will both trust and verify.


Teamwork is important. We all know this to be the case whenever we do anything in a group with other people. But arbitrary -- even directionless -- teamwork doesn't produce success by itself, no matter what the motivational poster might tell you. There are different kinds of teamwork.


Consider, for example, the difference between a three-legged race and a relay race. In both cases, teamwork defines success -- victory or defeat is directly decided by how well the individuals can work together cooperatively. But there's a crucial difference between the dynamics of the teamwork involved in those two cases: While a three-legged race limits the team as a whole to the performance of the weakest team member, the relay race combines strengths of each team member cumulatively. In other words, tie an Olympic sprinter to a couch potato and you get a slow-running team (no matter how many gold medals the sprinter has won in the past). Take that same team and make it a relay? Well, you might still be competitive (depending on who the competition is).


In the first case, the weakness of the slower runner limits them both; in the second case, the strength of the faster runner makes up for the slower one.




Which Kind of Cooperation Do You Need?


There's a lesson here for those of us working to secure cloud-based services. In a cloud model, there's no question that security  tasks need to be performed cooperatively. Because a cloud service can span multiple environments, different sets of personnel -- in most cloud models, at multiple companies -- need to jointly operate controls and coordinate efforts.


Take, for example, trying to trace a given security event to its source over multiple systems. You might need application logs from a SaaS provider, OS logs from an IaaS provider, and logs from internal systems the client organization maintains. Joint effort -- teamwork -- is required to do this without stepping on each other's toes.
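As a toy illustration of that kind of joint tracing, log entries from each party could be merged into a single timeline keyed on a shared request identifier; the log format and sample entries below are invented:

```typescript
// Hypothetical sketch: merging security logs from a SaaS provider, an IaaS
// provider, and internal systems into one timeline for a single request.
// The log shape and sample data are invented for illustration.
interface LogEntry {
  source: "saas-app" | "iaas-os" | "internal";
  timestamp: string; // ISO 8601
  requestId: string;
  message: string;
}

const allLogs: LogEntry[] = [
  { source: "internal", timestamp: "2011-02-16T09:01:05Z", requestId: "req-42", message: "Outbound call to SaaS API" },
  { source: "saas-app", timestamp: "2011-02-16T09:01:06Z", requestId: "req-42", message: "Login attempt rejected" },
  { source: "iaas-os",  timestamp: "2011-02-16T09:01:07Z", requestId: "req-42", message: "Repeated connections from same host" },
];

function traceEvent(logs: LogEntry[], requestId: string): LogEntry[] {
  return logs
    .filter((entry) => entry.requestId === requestId)
    .sort((a, b) => a.timestamp.localeCompare(b.timestamp));
}

// Follow one suspicious request across all three environments.
for (const entry of traceEvent(allLogs, "req-42")) {
  console.log(`${entry.timestamp} [${entry.source}] ${entry.message}`);
}
```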


But as in the racing examples, different types of cooperation apply in the cloud world too. Models that just "glue" existing security measures (processes and controls) together are like the three-legged race. The weakest set of process or control measures, be they on the vendor or the customer  side of the equation, can limit the security of the overall system. Unfortunately, this is where most cloud arrangements tend to start, and it undermines one of the key promises of the cloud: Namely, that leveraging economies of scale benefits everyone over the long term.


So how can we deliver on that promise? How can we move to a different model that leverages both vendor and customer strengths rather than locking them both into inheriting the weaker paradigm? To start with, it requires embracing some new (and potentially scary) ways of thinking about security -- ways that might run contrary to how we've approached security in the past.


Principle 1: Delegate


When it comes to securing a cloud environment, organizations can find themselves in a scary place. On the one hand, they retain liability for breaches, hacks, etc. Who takes the public image "hit" of showing up in the papers if something gets hacked? The customer does. So an organization moving to the cloud rightly needs to consider where that liability lies and protect themselves accordingly. But at the same time that they retain liability, they also need to be able to delegate responsibility in order to ensure security controls operate effectively.


Why is delegation important? Because the vendor is the party best equipped to be the "eyes and ears" when it comes to control implementation. This is because they are closest to the asset, and also because they can potentially have economies of scale. In other words, not only are they likely to be the first to know about anomalous conditions that impact an asset (such as malware, DDoS, and other attacks), but they also can (and should) maintain specialized niche personnel who are in a position to respond to those situations. The vendor will, quite likely, be your "first responder" -- so giving them room to operate (within, of course, structured and defined boundaries) can have a real payoff.


Principle 2: Data Sharing


Also, in order for control operation to be effective, organizations need to make sure that the vendor has sufficient information about the environment to keep data safeguarded and to plan accordingly. The vendor should be maintaining expertise in cloud-specific security architecture considerations and also should best understand the nuances of their own environment. So they need to understand the role, sensitivity, and criticality of the systems they're maintaining.


But it's the customer organization, and not the vendor, that has the institutional knowledge required to do this -- where security controls are, what mitigating and compensating controls have been put in place where areas of risk exist, data sensitivity, and regulatory context. But data sharing -- particularly sharing the intimate details of this type of security-sensitive information -- does not come naturally to many security organizations. Let's face it: A philosophy of openness isn't usually part of established security wisdom.


To the contrary, many organizations espouse the belief that specific details of security controls, as well as areas of operational risk, should be kept out of public view to prevent an attacker from capitalizing on that information. But in a cloud relationship, that stance de facto limits the ability of both parties to operate at maximum potential.


It may not be the most comfortable thing for security organizations to accept, but working most effectively with a cloud provider requires a level of trust over and above what that organization has with other third parties. Understandably, this level of trust isn't easy to come by -- and quite frankly, not every vendor warrants it.


Trust'n'Verify


So successful strategies in the cloud will both trust and verify. To verify that delegated security responsibilities are being performed appropriately, document who is responsible for what and establish a feedback loop to make sure those responsibilities are being taken seriously. It is important here to make sure there is no ambiguity with respect to responsibility. If there is an application issue, can your vendor deploy a patch? Should they? You need to decide this ahead of time and give your vendor a framework within which to work.


You also need some way to measure activity so that you can have confidence that appropriate action is being taken. Is your vendor responsible for patching? Establish a process to check for missing patches on those assets to make sure the job is getting done according to expectations. Let your vendor do what they are (supposed to be) good at, but keep an ear to the ground to make sure they're not dropping the ball. If your vendor is worthy of your trust, measuring their performance can open up the true value of cloud by moving to a mutual-trust level of operations; if your vendor is unworthy, at least you'll know it so you can start looking out for someone else who is. 
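To make that concrete, here is a hypothetical sketch of such a patch check, comparing what the vendor reports against your own expected baseline; all asset and patch identifiers are invented:

```typescript
// Hypothetical verification sketch: compare the patches a vendor reports as
// applied against the patches you expect per asset. All identifiers invented.
type PatchReport = Record<string, string[]>; // asset id -> applied patch ids

function findMissingPatches(
  expected: PatchReport,
  vendorReported: PatchReport
): Record<string, string[]> {
  const missing: Record<string, string[]> = {};
  for (const [asset, patches] of Object.entries(expected)) {
    const applied = new Set(vendorReported[asset] ?? []);
    const gaps = patches.filter((p) => !applied.has(p));
    if (gaps.length > 0) {
      missing[asset] = gaps;
    }
  }
  return missing;
}

// Example: one asset is missing a patch the vendor was responsible for.
const expectedBaseline: PatchReport = { "web-01": ["KB-100", "KB-101"], "db-01": ["KB-200"] };
const vendorReport: PatchReport = { "web-01": ["KB-100"], "db-01": ["KB-200"] };
console.log(findMissingPatches(expectedBaseline, vendorReport)); // { "web-01": ["KB-101"] }
```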


Source: http://www.technewsworld.com

Tuesday, February 15, 2011

When Groupon goofs, everyone notices


Few companies have changed the e-commerce world in the recent past as much as Groupon, a local-deals broker that has gotten the nation hooked on half-price massages, discounted restaurant bills, and packages offering rock-climbing and yoga combos (though, ideally, not at the same time). It's earned rave reviews for customer service, thanks in part to its hiring of underemployed comedians as copywriters and service reps.
Yet Groupon has taken a beating in the past few weeks--not in terms of traffic, and not from the rise of any of its several dozen smaller competitors--but just because of a few bonehead moves. Its much-talked-about television ad campaign, kicking off with a Super Bowl spot, used C-list celebrities to mock charitable donation ads, and was seen by many as so tasteless that the company pulled the plug on it. Then, Groupon users nationwide were furious when a Valentine's Day-themed deal with an online flower retailer redirected users to a site with jacked-up prices, rendering the discount useless and raising concerns that Groupon isn't properly vetting its partner retailers.


The problem for Groupon isn't that it's making these mistakes. It's that it's making them as a company that, while barely two years old, sends deals to more than 60 million e-mail in-boxes, has sold more than 39 million "deals" according to new internal data, and plans to file for an initial public offering later this year. To put things into perspective, when Facebook was that age, it was still a year away from the landmark launch of its developer platform, and it wasn't even possible to register for Facebook unless you had an e-mail address from an approved school or business.


The rise of Groupon has been unlike anything else we've seen in the recent boom in tech companies for a lot of reasons, not the least of which is the fact that it rocketed smack into the mainstream without much time in the domain of insidery early adopters.


They say the Bay Area's technology culture is a bubble--not necessarily in terms of overvaluation, but in terms of isolation. It's more like a cocoon. Companies that grow there are, typically, entitled to a period of quasi-gestation in which they can screw up, and people will be vocal, but those who are actually noticing and listening are a relatively restricted set. Twitter's servers used to go haywire on a near-daily basis, but the service was so restricted to tech enthusiasts that pundit Robert Scoble was its most popular user. Facebook, though founded in a college dorm on the East Coast, kept its numbers low with the e-mail address requirement and was well ensconced in Valley culture by the time it opened up the gates.


Groupon, firing out e-mails to the Deep South and Mountain West and Mid-Atlantic from its sprawling headquarters in Chicago, was not afforded that privilege. It certainly has a quirky start-up attitude--but that's exactly what clashed with the "real world" when the offbeat humor of its Super Bowl ad offended the mass market. And it seems to be grappling with the tech-industry vision of being a platform rather than a media company, connecting advertisers with customers while remaining the universally appealing brand humming away in the background. The heavy publicity surrounding the bogus flower deal last week might hint to some that Groupon is focusing too much on being everywhere and losing its focus on the quality of its content.
In contrast, when Facebook released its disastrous Beacon advertising product--in late 2007, when it had roughly the same number of users that Groupon does now--its users largely didn't notice. The product was launched at a small press conference in New York, subsequently ripped apart by the press, and was watered down within weeks. The average Facebook user likely never even saw a Beacon ad; true mainstream interest in the company's inner workings didn't take off until well over a year later. Facebook was still very much in the cocoon.


Groupon's got a bit of a catch-22 on its hands. It's big and obviously proud of its choice spot in the mainstream and outside-the-Valley attitude--could you ever see Facebook, Twitter, or LinkedIn buying a Super Bowl ad? And Groupon is big enough for the occasional slip-up to have truly visible reverberations. Still, it's not so big that a PR crisis could be swallowed up by the sheer size of the rest of the company. Google's launch of Google Buzz was disastrous, but the fate of the Mountain View, Ca., conglomerate was hardly resting upon the lightweight, experimental product. Caught in between these two phases of development, Groupon is like a kid who grew too tall too quickly and now finds that everyone notices when he trips or hits his head on things.



The good news for the company is that its loyal users seem to have gotten over the Super Bowl revulsion pretty easily, and that CEO Andrew Mason's frankness about pulling the ad campaign seems to have helped. YouGov, a research firm that measures "brand perception," plotted the positive and negative buzz about Groupon on its scale of 100 (very positive) to -100 (very negative), and found that in the days following the Super Bowl ad, Groupon's score fell from 14.4 to 5.3 but then shot back up to 26.6 after Mason wrote an apologetic blog post.
YouGov hasn't yet measured the change in Groupon brand perception in the wake of the botched flower deal.

Both follies are high-profile shortcomings that customers ought to forget about the next time they see a killer deal that they simply can't pass up. One of Groupon's many competitors, LivingSocial, might have something to say about that: While it has never put forth an explicit "we're better than Groupon" message, it's proven remarkably savvy at weaseling its way into situations where it's getting pitted against the bigger site. On Super Bowl Sunday, LivingSocial purchased a pre-game ad--something that probably wouldn't have happened if Groupon hadn't been advertising during the game.

Previously, LivingSocial had raised a bucketload of funding from Amazon.com and then, perhaps thanks to the new business relationship, offered a killer Amazon.com deal and attracted plenty of new members in the process. In late January, traffic firm Experian Hitwise reported that after the Amazon deal, LivingSocial went from pulling in one-tenth the traffic of Groupon to nearly half.

But in aiming straight for Groupon's market, LivingSocial is in the same spotlight. The moment it messes up on something--and knowing young technology companies, it will--that's going to be Groupon's gain.

Either way, it's a rare look at the rise of a Web company that's grown outside the industry's famed bubble, or cocoon, or whatever you want to call it.




Saturday, February 12, 2011

Porn studio could teach Apple, Google about cloud



Home video innovations always seem to lead back to porn.


The fingerprints of the adult-film industry can be found on the development of VHS and Blu-ray disc. Soon, the sector may teach us about the cloud.




Pink Visual, a porn studio with a history of embracing new technologies, appears to be among the first filmmakers in the United States to offer the kind of streaming-video features that Apple and Google were said to be considering last year.


Instead of storing digital movies they own on computer hard drives, Pink Visual customers will be able to store clips they buy from the studio on the company's servers, said Quentin Boyer, a company spokesman. For a one-time fee, buyers can access their films from PVLocker.com anytime and as often as they choose.
Digital shelves


Computing done over the Internet is commonly referred to as "the cloud," and this is where the next generation of digital entertainment is supposedly headed. Sources in the movie industry told CNET last year that Google and Apple have spoken to some of the Hollywood film studios about providing "digital shelves" on which users could store movies, songs, and other media. Buyers could then access their flicks from Web-connected devices.


With PC hard drives getting crammed, features like this stirred some excitement. Unfortunately for cloud fans, however, the Google and Apple services have yet to show up.


While cloud video and music services are packed with potential, say supporters, there are few success stories to support the claim. A smattering of music services have enabled users to store and stream songs on their servers, including Lala. Apple acquired the struggling company in December 2009 and later shut it down. A pioneering cloud music service, MP3tunes.com, was sued by EMI, one of the four largest record companies, for copyright violations. That case is ongoing.


In video, Pink Visual could help determine whether these cloud services are a slice of heaven or just vapor. The company will be among the first to tackle issues of pricing and copyright protection and, most importantly, to gauge consumer demand.


Liability 
Already, managers at Pink Visual are asking important questions.
There was a debate at the porn studio about whether it should store and stream content created by other filmmakers, according to Boyer. At least at the start, Pink Visual will handle only its own content but could open up its cloud later, he said.


Why the hesitation?


"We don't want to accidentally have a lot of liability," Boyer said. "We don't want to become fertile ground for copyright infringement [should users upload pirated content to the company's cloud]."


Another question that managers have yet to decide is what would happen to a customer's content should Pink Visual go out of business. Boyer said that the company would try to make good but acknowledged that this is a contingency that Pink Visual hasn't considered since the company is doing well financially.


This seems like the logical spot to address the suspicions that many Internet users have about cloud services. Skeptics argue cloud services could try to charge buyers every time they access their movies. They note that there are costs associated with streaming video and that someone has to pay for it. But Boyer said emphatically there are no hidden charges.


He said that Pink Visual buys bandwidth at a low cost and there's no plan to trap customers inside a silo. He added that for people who don't like the cloud model, the company plans to continue to offer traditional downloads, subscriptions and even DVDs.


Unlikely tech ambassadors 
That pornographers could become the film industry's cloud-video ambassadors shouldn't surprise anyone. Perhaps Hollywood should consider the adult-film market as a technology testing ground. Porn studios have typically operated on much slimmer margins than their counterparts in mainstream movies. The adult-movie business has earned a reputation for placing big bets on technological advances to help solve some of its unique problems, not the least of which has been providing privacy to consumers of its content.


Moving the adult-film experience out of shabby porn theaters and into the home was a huge breakthrough for the industry. From VCRs to live Internet sex shows, the sector has proven itself to be a technology hotbed. Ground zero is in the steamy section of Los Angeles known as the San Fernando Valley--a 30-minute drive from Hollywood--where hundreds of porn studios are based, including Pink Visual.


Similar to other digital-media industries, adult film has been marked by white-hot competition and consolidation. Pink Visual didn't go untouched. In 2006, the studio went from a high of 130 employees to half that number, Boyer said. Managers there began looking to the cloud for answers on how to differentiate the company from rivals.


"Our strategy is when a new technology comes out, we need to do it first and best," Boyer said. "Everyone used to think that the secret to succeeding in this business was strictly about the quality of the porn. We know it is more than that. In the early days of Web porn, what separated [winners and losers] was the quality of hosting. People hated buffering."


Whatever happens at Pink Visual, Josh Martin, a senior analyst with Strategy Analytics, cautioned against coming to any hasty conclusions about cloud video. He says we're still in early days and nothing close to a successful business model has emerged.


"Ultimately, you still have to lots of people trying lots of different things," Martin said. "I don't know how you can introduce a simple, easy and elegant solution into such a fragmented world."


A representative from a company called HotMovies.com e-mailed to say that it has had a version of the cloud available for years. It sells unlimited rentals, which enable users to access films anytime from the company's servers. The HotMovies.com rep said, however, that most of its customers prefer a system that lets them buy access to the company's films for specific units of time. I don't think that this kind of plan would work for mainstream viewing, but it's interesting that the adult-film industry has already begun wrestling with these issues.






Source: http://news.cnet.com/

Friday, February 11, 2011

Nokia, Microsoft becoming Windows Phone bedfellows

Microsoft and Nokia announced a broad mobile phone partnership today that joins two powerful but lagging companies into mutually reliant allies in the mobile phone market.
As expected, Nokia plans to use Microsoft's Windows Phone 7 operating system as part of a plan to recover from competitive failings detailed in Nokia Chief Executive Stephen Elop's "burning platform" memo.

But it's deeper than just an agreement to install the OS on Nokia's phones. Instead, the companies call it an attempt to build a "third ecosystem," acknowledging that competing with Apple's iOS and Google's Android involves a partnership that must encompass phones, developers, mobile services, partnerships with carriers, and app stores to distribute software.

Nokia CEO Stephen Elop, left, and Microsoft CEO Steve Ballmer explained their companies' new tight alliance for mobile phones at an analyst and strategy meeting in London.

"There are other mobile ecosystems. We will disrupt them. There will be challenges. We will overcome them. Success requires speed. We will be swift," Elop and Microsoft CEO Steve Ballmer said in a boldly worded open letter. "Together, we see the opportunity, and we have the will, the resources and the drive to succeed."

The companies will cooperate tightly under an agreement they have so far described only as proposed, not final. Under the deal, Windows Phone 7 would become Nokia's "principal" operating system, and Nokia would help Microsoft develop it and ensure a broad range of phones using it are available globally.

Nokia will use many Microsoft online services, many of which trail their Google rivals, such as Bing for search and maps and AdCenter for advertisements.

When it comes to the sales part of the ecosystem, each company brings something to the deal. Microsoft phones will be able to link up with Nokia's agreements for carrier billing--a popular option in parts of the world where credit cards are less common. And Nokia will fold its own app store into the Microsoft Marketplace.


Elop the optimist


In a press conference today, Elop and Ballmer touted the alliance as good for both companies' aspirations.

"A pessimist sees the difficulty in every opportunity, but an optimist sees the opportunity in every difficulty," Elop said, quoting England's historic prime minister Winston Churchill as he spoke in London. "I am an optimist."

It's not immediately clear what needs to be done to make the deal final; "specific details of the deal are being worked out," the companies said.

Nokia, once the dominant power of the mobile phone industry, has ceded the smartphone initiative to Apple's iPhone and Google's Android, and Elop believes Nokia's own Symbian and MeeGo operating systems aren't competitive. Microsoft has tried for years to penetrate the mobile phone market, and although it now has a credible option with Windows Phone 7, it trails Android when it comes to developer interest and the breadth of phones available.

The two companies can expect their combined might will be more convincing for software authors debating whether they need to bring their apps to yet another ecosystem. But it's not yet clear how the alliance will extend to another hot new market, tablets, where Microsoft prefers Windows instead of the Windows Phone operating system. In contrast, iOS and Android developers enjoy the same mobile operating system on phones and tablets.

An end for Symbian, MeeGo


The partnership means a gradual end for both Symbian and MeeGo, Nokia's mainstream and next-generation smartphone operating systems, respectively. Symbian will be gradually phased out in favor of Windows Phone, though Nokia expects to ship 150 million more Symbian phones in the meantime.

And although Nokia will ship a single MeeGo-based smartphone later this year, Elop consigned it to near-oblivion by calling it merely "an opportunity to learn." MeeGo engineers will "change focus into exploration" of future devices and user services.

Nokia seriously considered using Google's Android operating system, but feared it would ultimately be a boon for Google and a bane for Nokia, Elop said.

"We would have difficulties differentiating within that ecosystem," Elop said. Joining the Android world would make it even bigger, "with prices, profits, and everything being pushed down, and value moving to Google."


Nokia cost-cutting


Along with Nokia's new operating system strategy come major internal changes. Leadership in the United States, a large market now awakened to smartphones but where Nokia trails particularly badly, is being replaced. Bureaucracy will be cut. Research and development costs will be cut but R&D will become more productive. And there will be substantial layoffs, in Finland and elsewhere, Elop said, though the company will not move headquarters anywhere else.

"Already evolving rapidly at Nokia is a change in attitude and behavior" that shows "the fighting spirit of Nokia worldwide and the fighting spirit of the Finnish people," Elop said. In communicating with employees about the change, including an company-wide message earlier this morning, he added, "We've been vocal and transparent about the challenge. I think it's having a positive effect."

Nokia will pay Microsoft royalties for use of its operating system, but money will apparently flow the other way, too, Ballmer and Elop said.

"We have different forms of value transfer in different directions," Elop said. "We have new opportunities that come from advertising and new forms of monetization."

"We the ability to activate users and put them in a position to be more effective in what we call local commerce," added Ballmer. "Nokia has not just got mapping but other assets that we will import, if you will. There are a set of financial considerations for that as well."

Elop and Ballmer detailed their proposal immediately before the vast Mobile World Congress trade show in Barcelona, Spain, where a large number of new Android phones and tablets can be expected.

It's uncertain what effect the alliance will have. Microsoft has had strong operating system partnerships with multiple competing PC makers, but the Nokia alliance, with mutually developed products and shared road maps, appears much deeper than the average relationship Microsoft has with hardware makers. That could encourage those who've made strong Android commitments--HTC, Motorola, Sony Ericsson, LG Electronics, Samsung, and more--to double down. After all, they're all enjoying a period of relative freedom with Nokia in its present relatively uncompetitive state, and strongly pushing Windows Phone products arguably would be abetting the enemy.

Although the alliance is nonexclusive, Nokia clearly is becoming the premier Windows Phone partner. "There are things we are planning to do with Nokia that are unique," Ballmer said.

The announcement was accompanied by a YouTube video featuring Microsoft and Nokia's chief executives praising the deal.

"Today, Nokia and Microsoft intend to enter into a strategic alliance," Elop said in the video, a precursor of a turnaround plan he's set to detail later today at an analyst conference in London. "Together, we will bring consumers a new mobile experience, with stellar hardware, innovative software, and great services. We will create opportunities beyond anything that currently exists."

Ballmer said the partnership "brings the brands mobile consumers want, like Bing, Office, and of course Xbox Live."


Source: http://news.cnet.com/