Who's who of the cloud market

Seemingly every tech vendor has a cloud strategy, with new products and services dubbed “cloud” coming out every week. But who are the real market leaders in this business? Research firm Gartner’s answer lies in its Magic Quadrant report for the infrastructure as a service (IaaS) market, presented in the Gartner’s IaaS Magic Quadrant: a who’s who of cloud market article.

It is interesting that missing from this quadrant figure are big-name companies that have invested a lot in the cloud, including Microsoft, HP, IBM and Google. The reason is that the report only includes providers that had IaaS clouds in general availability as of June 2012 (Microsoft, HP and Google had clouds in beta at the time).

Gartner reinforces what many in the cloud industry believe: Amazon Web Services is the 800-pound gorilla. Gartner also found one big minus for Amazon Web Services: AWS has a “weak, narrowly defined” service-level agreement (SLA), which requires customers to spread workloads across multiple availability zones. AWS was not the only provider whose SLA details drew criticism.

Read the whole Gartner’s IaaS Magic Quadrant: a who’s who of cloud market article to see Gartner’s view of the cloud market today.

1,065 Comments

  1. Tomi Engdahl says:

    The only way is down for NetApp, HP Enterprise and IBM storage – study
    Dell-EMC combo to be top dog in 2017 enterprise on-premises spend
    http://www.theregister.co.uk/2016/01/25/netapp_hpe_and_ibm_storage_declines/

    In its inaugural Voice of the Enterprise: Storage Study, 451 Research forecasts public cloud storage spend to double in two years – with NetApp, HPE and IBM falling down the supplier rankings as Amazon’s AWS and Microsoft’s Azure bulldoze their way in.

    We have seen a copy of the report: 451 asked its enterprise research base “which vendor does your organization currently spend the most with on storage in 2015 and which will it spend the most on in 2017?”

    The top five suppliers in 2015 were EMC (28.7 per cent), NetApp (12.9 per cent), Dell (12.7 per cent), Hewlett Packard Enterprise (9.9 per cent) and IBM (9.5 per cent). 451 said AWS and Azure muscled their way into the top five in 2017, but what happened to the others?

    Public cloud storage spend to double in two years – reliable sources
    It’s a pain! Data and storage capacity growth
    http://www.theregister.co.uk/2016/01/21/public_cloud_storage_spend_to_double/

  2. Tomi Engdahl says:

    Microsoft Continues to Obscure Its Real Cloud Revenue
    http://www.wired.com/2016/01/microsoft-continues-to-obscure-its-real-cloud-revenue/

    Microsoft beat analyst expectations for its most recent quarter, pulling in a profit of 78 cents a share on $25.7 billion in revenue, versus a predicted 71 cents on $25.26 billion. The company touted strong growth in its “Intelligent Cloud” division as one of the main highlights of the quarter. Year over year, the division grew by 5 percent to $6.3 billion in revenue. That certainly bodes well for Microsoft, which is seeking to reinvent itself as a mobile and cloud computing company, markets led by rivals like Apple, Amazon, and Google. But Microsoft’s numbers say little about how it actually compares to those rivals.

    At first glance, it appears that Microsoft is making far more on its cloud services than Amazon, which made $2.41 billion last quarter from its Amazon Web Services division. The problem is that, in reporting its results, Microsoft bundles its Azure line of cloud services with Windows Server and other traditional enterprise software sales together under the label “Intelligent Cloud” without revealing what percentage of that total actually comes from Azure. That makes an apples to apples comparison with Amazon Web Services impossible.

  3. Tomi Engdahl says:

    Amazon Misses Big—But Don’t Give Up on the Company Just Yet
    http://www.wired.com/2016/01/amazon-misses-big-but-dont-give-up-on-the-company-just-yet/

    This was supposed to be a stellar quarter for Amazon. Its fourth quarter earnings report covers the all-important holiday season, and by all indications, Amazon had killed it. It added more than 3 million Prime members in the third week of December alone. It broke shipping records. And sales of Amazon devices doubled compared to a year ago.

    But when it comes to Wall Street, there’s always a possibility for a plot twist. Today Amazon shares are plummeting in after-hours trading—as of this writing, it’s down more than 15 percent—after the company didn’t meet the soaring expectations of analysts and everyone else.

    Over the past three months, the e-commerce giant reported a net income of $1 per share on revenue of $35.75 billion. Great, right? Especially for a company with a reputation for minuscule profits. What’s more, that’s a 22 percent increase from $29.3 billion in revenue during the same time last year.

    But there’s nothing quite like setting great expectations and then not quite meeting them.

    Another trend Amazon can be proud of: its ever-expanding cloud business. Amazon Web Services’ fourth quarter revenue grew 69 percent year over year, generating $2.41 billion in sales. But the crazy growth of its cloud is slowing down a bit: two quarters ago, Amazon posted 82 percent growth in AWS revenue; last quarter, sales grew 78 percent year over year. Still, $2.41 billion beats analyst estimates of $2.38 billion.

  4. Tomi Engdahl says:

    New York Times:
    Facebook to shut down Parse, the toolkit for mobile developers it acquired in 2013; developers will have a year-long window to migrate their data — Facebook to Shut Down Parse, Its Platform for Mobile Developers — Facebook acquired Parse, a toolkit and support system for mobile developers, in 2013.

    Facebook to Shut Down Parse, Its Platform for Mobile Developers
    http://bits.blogs.nytimes.com/2016/01/28/facebook-to-shut-down-parse-its-platform-for-mobile-developers/?_r=0

    Facebook acquired Parse, a toolkit and support system for mobile developers, in 2013. At the time, the social network’s ambitions were high: Parse would be Facebook’s way into one day harnessing developers to become a true cloud business, competing alongside the likes of Amazon, Google and Microsoft.

    Those ambitions, it seems, have fallen back to earth. On Thursday, Facebook said it plans to shut down Parse, the services platform for which it paid upwards of a reported $85 million.

    “We know that many of you have come to rely on Parse, and we are striving to make this transition as straightforward as possible,” Kevin Lacker, co-founder of Parse, said in a blog post. “We enjoyed working with each of you, and we have deep admiration for the things you’ve built.”

    Most of what Parse does involves things most people will never see. Parse helps developers with support and tools, so that independent programmers can spend more time writing code and less time on keeping up the back end. Developers who use Parse include those at Quip, a productivity app, and Expedia’s Orbitz, a travel website. Facebook would make money from Parse by storing data from developers and sending customers product notifications.

    Achieving that goal, however, would be no easy feat. Microsoft, Google and Amazon have similar developer offerings, along with a much richer set of other computing tools and services that developers need. Amazon Web Services, in particular, has in the past two years stressed both its developer tools and analytic services, so companies can think about what to build next. In every case, these companies can also benefit by selling other computing services, like complex commercial databases, which Facebook does not provide.

  5. Tomi Engdahl says:

    Microsoft cloud email services winning the race for enterprise adoption
    Adoption is up and Redmond has taken a lead
    http://www.theinquirer.net/inquirer/news/2444386/microsoft-cloud-email-services-winning-the-race-for-enterprise-adoption

    ENTERPRISES OF ALL SIZES are willingly surrendering their emails to the cloud, according to the analysts at Gartner, and the bulk of them are relying on Microsoft to keep them up in the air and spinning.

    The cloud, in case you missed it, is everywhere. Even your nan uploads her photos to the cloud. Cloud email services have been embraced by consumers, but have been welcomed more cautiously in the business world. Until now, that is, according to a new Gartner cloud and email report.

    The leading firms in this area are Google and Microsoft. The latter seems to have the edge, perhaps because Microsoft solutions are as entrenched in business as tedious meetings. Google is getting its game together, however, thanks to a mix of improvement and marketing.

    “Although it is still early days for cloud email adoption, Microsoft and Google have achieved significant traction among enterprises of different sizes, industries and geographies,” said Nikos Drakos, a research vice president at Gartner.

    “Companies considering cloud email should question assumptions that public cloud email is not appropriate in their region, size or industry. Our findings suggest that many varied organisations are already using cloud email, and the number is growing rapidly.”

    “Among public companies using cloud-based email, Microsoft is more popular with larger organisations and has more than an 80 per cent share of companies using cloud email with revenue above $10bn,” added Jeffrey Mann, research vice president at Gartner.

    “Google’s popularity is better among smaller companies, approaching a 50 per cent share of companies with revenue less than $50m.”

  6. Tomi Engdahl says:

    IBM Is Finally Embracing the Cloud—It Has No Other Choice
    http://www.wired.com/2016/02/ibm-learns-to-stop-worrying-and-love-the-cloud-for-real/

    Startup founder Alex Polvi has a name for the biggest idea in the world of information technology. And, yes, it doubles as a hashtag, a six-character encapsulation of this sweeping movement: #GIFEE.

    The acronym has nothing to do with time-wasting animations in your Slack feed. It stands for “Google Infrastructure For Everyone Else!” (exclamation point optional). Nowadays, in the world of IT, the big idea is to give everyone else their own incarnation of the state-of-the-art infrastructure Google built to run its Internet empire. And that’s good news for everyone else. Or, rather, almost everyone else.

    This big idea presents a conundrum for venerable tech giants like HP and Microsoft and IBM. For so long, these giants sold a very different type of IT infrastructure, and it kept their profit margins high. The #GIFEE movement undercuts the old way of doing things. But in recent years, Microsoft has regained some of its mojo by embracing the #GIFEE ideal—and embracing it wholeheartedly (though I’m sure they call it something else). And now, it looks like IBM has made the same leap of faith. Google—and, just as importantly, Amazon—left the company no choice.

  7. Tomi Engdahl says:

    Google Forms Gets Templates, Add-On Support And More
    http://techcrunch.com/2016/02/10/google-forms-gets-templates-add-on-support-and-more/

    Google Forms, the company’s tool for creating and analyzing surveys, is getting a major update today. Google is adding a slew of new features to the service that range from support for templates to new options for analyzing surveys.

    Google’s last update to Forms brought a design update and added the ability to add logos, videos and GIFs to surveys.

    With today’s release, you can now select from a number of templates for standard surveys like customer feedback, quizzes and event sign-up right from the Google Forms home screen.

    Google is also adding a few new features for analyzing surveys. Google Apps for Work and Education users can now easily see who has responded to a survey, for example (and prod those who haven’t with the help of the new “send reminder email” feature).

    https://www.google.com/forms/about/

  8. Tomi Engdahl says:

    Janakiram MSV / Forbes:
    Google launches alpha version of Google Cloud Functions to let devs run single-purpose functions without full server hosting, similar to AWS Lambda — Google Brings Serverless Computing To Its Cloud Platform — Google Cloud Functions, the latest addition to the Google Cloud Platform …

    Google Brings Serverless Computing To Its Cloud Platform
    http://www.forbes.com/sites/janakirammsv/2016/02/09/google-brings-serverless-computing-to-its-cloud-platform/#574fb3ca25b8

    Google Cloud Functions, the latest addition to the Google Cloud Platform, enables developers to upload JavaScript code snippets that are triggered in response to a few events. This new service, which is in Alpha, is available to select customers whose accounts are whitelisted by Google. Cloud Functions complements existing compute services such as App Engine, Compute Engine, and Container Engine.

    In the recent past, serverless computing has gained industry attention mainly due to its simplicity and “NoOps” model. Developers follow the fire-and-forget paradigm where they upload individual code snippets that are hooked to a variety of events at runtime. This model offers a low-touch, no-friction deployment mechanism without any administrative overhead. Serverless computing and microservices are ushering in a new form of web-scale computing.

    According to the official documentation, Google Cloud Functions is a lightweight, event-based, asynchronous compute solution that allows developers to create small, single-purpose functions that respond to cloud events without the need to manage a server or a runtime environment. It supports JavaScript functions that are executed within a managed Node.js runtime environment. There is no requirement to provision or configure virtual machines. The uploaded snippets are invoked by events raised by Google Cloud Pub/Sub and Google Cloud Storage. They can be synchronously invoked through HTTP calls which enable on-demand execution of functions.
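    To make the event-driven model concrete, here is a minimal sketch (my own illustration, not taken from Google’s documentation) of what such a single-purpose function can look like in TypeScript, assuming the early Node.js-style signature of an event object plus a completion callback; the function name and payload fields are made up for this example:

        // Hypothetical single-purpose function in the spirit of Cloud Functions.
        // The event shape and the "message" field are assumptions for illustration.
        interface PubSubEvent {
          data?: { message?: string };
        }

        // Invoked by the platform when an event (for example a Pub/Sub message) arrives.
        export function onMessage(
          event: PubSubEvent,
          callback: (err?: Error) => void
        ): void {
          const text = (event.data && event.data.message) || "(empty message)";
          console.log(`Received: ${text}`);
          callback(); // signal completion so the runtime can tear the instance down
        }

    The point of the sketch is that the developer supplies only the function body; provisioning, scaling and teardown of the underlying servers are handled by the platform.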

    Google Cloud Pub/Sub provides reliable messaging between applications. It enables applications to talk to each other in a loosely coupled fashion.

    Google is not the first to launch a serverless computing service. Amazon Web Services has one in the form of AWS Lambda. Launched at AWS re:Invent 2014, Lambda has become one of the most popular services of AWS. In a Gigaom report published last year, I analyzed the potential of AWS Lambda and its significance in driving Amazon’s microservices strategy.

    Why AWS Lambda is a Masterstroke from Amazon
    https://gigaom.com/2015/01/09/why-aws-lambda-is-a-masterstroke-from-amazon/

  9. Tomi Engdahl says:

    Yevgeniy Sverdlik / Data Center Knowledge:
    Netflix shuttered its last data center in January after a seven-year migration to the cloud, now everything runs on AWS

    Netflix Shuts Down Final Bits of Own Data Center Infrastructure
    http://www.datacenterknowledge.com/archives/2016/02/11/netflix-shuts-down-final-bits-of-own-data-center-infrastructure/

    It’s done and dusted. Since sometime last month, everything Netflix does runs on Amazon Web Services, from streaming video to managing its employee and customer data.

    In early January, whatever little bits of Netflix that were still running somewhere in a non-Amazon data center were shut down, Yuri Izrailevsky, the company’s VP of cloud and platform engineering, wrote in a blog post Thursday.

    To be sure, most of Netflix had already been running in the cloud for some time, including all customer-facing applications. Netflix has been one of the big early adopters of AWS who famously went all-in with public cloud. Thursday’s announcement simply marks the completion of a seven-year process of transition from a data center-based infrastructure model to a 100-percent cloud one.

  10. Tomi Engdahl says:

    AWS Bolsters High Performance Computing Offering With NICE Acquisition
    http://techcrunch.com/2016/02/12/aws-bolsters-high-performance-computing-offering-with-nice-acquisition/

    AWS attempted to enhance its high performance computing offering today when it purchased NICE, an Italian software and services company, for an undisclosed price.

    NICE provides a set of tools and technologies that were attractive to AWS, and brings with it an international clientele, which should help AWS expand its market with a set of customers who have high-end compute requirements.

    Among the technology that NICE owns is a nifty tool they call Desktop Cloud Visualization (DCV), which provides remote access to 2D and 3D applications, giving engineers, game designers and others access to their designs and the high-end hardware to make them work in the cloud, no matter what desktop or laptop they are using.

    Another piece of technology called the NICE EnginFrame could also be particularly attractive to AWS. It enables customers to run high-end computing environments like HPC clusters, data, licenses and batch & interactive applications inside a standard browser.

    “The NICE acquisition gives Amazon a good trove of IP, access to good clients and a good presence in Europe, the Middle East and Africa (EMEA). These IP capabilities help with performance and some newer features that can benefit the overall Amazon platform,” Wang told TechCrunch.

    It’s worth noting that AWS has had a substantial presence in Europe for some time with offices throughout the EU including Italy where NICE is located.

    Just last year Amazon launched new C4 instances for companies with high performance computing requirements in the cloud.

    High Performance Computing
    https://aws.amazon.com/hpc/

  11. Tomi Engdahl says:

    AWS Plans To Launch A UK Region By The End Of 2016
    http://techcrunch.com/2015/11/06/aws-plans-to-launch-a-uk-region-by-the-end-of-2016/

    Amazon Web Services (AWS) is aggressively expanding the number of different geographic regions it offers its services in. As Amazon CTO Werner Vogels announced today, the company plans to launch a UK region by the end of 2016 (“or early 2017”).

    That’s Amazon’s third region in the European Union. Amazon’s facilities in Dublin were long the only option for European developers who wanted to host their applications close to their home markets. AWS then launched its first region in Frankfurt, Germany last year.

    The announcement of the upcoming UK region comes only a day after AWS also announced that the company plans to launch a region in South Korea, too (its fifth in the Asia-Pacific region).

  12. Tomi Engdahl says:

    Hazy outlook for Verizon cloud storage after compute gets axed
    Change of direction in Verizon’s data centre jet stream
    http://www.theregister.co.uk/2016/02/15/hazy_outlook_for_verizon_cloud_storage/

    The Grand Old Duke of Verizon has marched his troops up the public cloud storage hill; is it time to march them down again?

    In January 2014 Verizon was going to use HDS’ object-based Content Platform (HCP), known as Himalaya, in a public cloud storage offering.

    At the time Verizon Terremark CTO John Considine said: “We are building a dynamic ecosystem of enterprise-class networking, storage, and software-based capabilities that will run on top of our new Verizon Cloud infrastructure and Hitachi Data Systems is an important part of this plan.”

    High up in the sky, changes in the direction of high-altitude winds can be seen with mares’ tails, cirrus cloud streams that change direction. This typically signals a change in the weather. It could be that Verizon’s data centre cirrus clouds are changing direction.

    Fast forward two years and Verizon is abandoning its public cloud server ambitions with the VM server instance facility shut down on April 12.

    Verizon says its Virtual Private Cloud (VPC) and Cloud Storage services will remain in operation. You may ask yourselves, for how long? If Verizon can cut cloud compute services in the face of competition from the Amazon meat-grinder then you might think it could cut cloud storage as well.

  13. Tomi Engdahl says:

    Frederic Lardinois / TechCrunch:
    Microsoft now selling licenses to deploy Red Hat Enterprise Linux on Azure, also announces Azure support for Walmart’s OneOps app lifecycle management platform

    Microsoft Brings Red Hat Enterprise Linux To Azure
    http://techcrunch.com/2016/02/17/microsoft-brings-red-hat-enterprise-linux-to-azure/

    Microsoft is now selling Red Hat Enterprise Linux licenses. Starting today, you will be able to deploy Red Hat Enterprise Linux (RHEL) from the Azure Marketplace and get support for your deployments from both Microsoft and Red Hat.

    In addition, Microsoft today announced that it is now offering certified Bitnami images in the Azure Marketplace and it now supports Walmart‘s (yes — that Walmart‘s) open source OneOps application lifecycle management platform. Until today, OneOps only offered a machine image for Amazon’s AWS platform.

    Seeing the words ‘Microsoft’ and ‘Linux’ together in a single sentence may still come as a shock to a few people, but Microsoft says more than 60 percent of images in the Azure Marketplace are now Linux-based.

    As Red Hat’s Mike Ferris, its senior director for business architecture, and Microsoft’s director of program management for Azure Corey Sanders told me earlier this week, the two companies are also working closely together on supporting customers who choose to go the RHEL route on Azure. Red Hat and Microsoft’s support specialists are actually sitting together to answer their customers’ questions, which is a first for both companies.

  14. Tomi Engdahl says:

    Ram Ramanathan / Google Cloud Platform Blog:
    Google Cloud Vision API, which detects the contents of images, enters public beta

    Google Cloud Vision API enters Beta, open to all to try!
    http://googlecloudplatform.blogspot.fi/2016/02/Google-Cloud-Vision-API-enters-beta-open-to-all-to-try.html

    Today, we’re announcing the beta release of Google Cloud Vision API. Now anyone can submit their images to the Cloud Vision API to understand the contents of those images — from detecting everyday objects (for example, “sports car,” “sushi,” or “eagle”) to reading text within the image or identifying product logos.

    With the beta release of Cloud Vision API, you can access the API with the location of images stored in Google Cloud Storage, along with the existing support for embedding an image as part of the API request. We’re also announcing pricing for Cloud Vision API and have added additional capabilities to identify the dominant color of an image. For example, you can now apply Label Detection on an image for as little as $2 per 1,000 images, or Optical Character Recognition (OCR) for $0.60 per 1,000 images. Pricing will be effective starting March 1st.
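    As a rough illustration of what a request can look like, here is a sketch based on the public images:annotate REST endpoint (my own example, not code from this article); the bucket path and API key are placeholders, and authentication details may differ in practice:

        // Hypothetical Node.js/TypeScript call to the Cloud Vision REST API.
        import * as https from "https";

        const body = JSON.stringify({
          requests: [
            {
              // Reference an image by its Cloud Storage location, as described above,
              // or embed it as base64 content instead.
              image: { source: { imageUri: "gs://my-bucket/photo.jpg" } },
              features: [
                { type: "LABEL_DETECTION", maxResults: 5 },
                { type: "TEXT_DETECTION" }
              ]
            }
          ]
        });

        const req = https.request(
          {
            hostname: "vision.googleapis.com",
            path: "/v1/images:annotate?key=YOUR_API_KEY",
            method: "POST",
            headers: {
              "Content-Type": "application/json",
              "Content-Length": Buffer.byteLength(body)
            }
          },
          (res) => {
            let data = "";
            res.on("data", (chunk) => (data += chunk));
            res.on("end", () => console.log(data)); // labels, OCR text, etc.
          }
        );
        req.write(body);
        req.end();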

    Cloud Vision API supports a broad set of scenarios from:

    Insights from your images: Powered by the same technologies behind Google Photos, Cloud Vision API detects broad sets of objects in your images — from flowers to popular landmarks
    Inappropriate content detection: Powered by Google SafeSearch, Cloud Vision API moderates content from your crowd sourced images by detecting different types of inappropriate content.
    Image sentiment analysis: Cloud Vision API can analyze emotional attributes of people in your images, like joy, sorrow and anger, along with detecting popular product logos.
    Text extraction: Optical Character Recognition (OCR) enables you to detect text within your images, along with automatic language identification across a broad set of languages.

    Since we announced the limited preview of Google Cloud Vision API in early December, thousands of companies have used the API, generating millions of requests for image annotations. We’re grateful for your feedback and comments and have been amazed by the breadth of applications using Cloud Vision API.

    Google Cloud Vision API is our first step on the journey to enable applications to see, hear and make information in the world more useful.

  15. Tomi Engdahl says:

    Finnish startup creates a giant cloud services ecosystem

    Finnish startup Flashnode aims high: it is developing a cloud compatibility standard that would link the software packages companies use to each other. The project has already received plenty of interest, as a number of software companies are already involved in the process of creating the standard.

    The standard aims to combine the software packages that support companies’ business processes into one larger cloud services ecosystem. This would provide significant opportunities to accelerate financial management processes and would greatly ease the workload of accounting firms. It would be an advantage particularly for the small and medium-sized enterprises involved.

    Source: http://www.tivi.fi/Kaikki_uutiset/suomalainen-startup-luo-jattimaista-pilvipalvelujen-ekosysteemia-6305501

  16. Tomi Engdahl says:

    Google cloud wobbles as workers patch wrong routers
    Is ‘Sorry about that, we promise to learn from our mistakes’ any way to run a cloud?
    http://www.theregister.co.uk/2016/03/01/google_cloud_wobbles_as_workers_patch_wrong_routers/

    Add another SNAFU to the long list of Google cloud wobbles caused by human error: this time the Alphabet subsidiary decided to patch the wrong routers.

    The wobble wasn’t a big one: it lasted just 46 minutes and only hit Google Compute Engine instances in the us-central1-f zone. Of course it wasn’t minor if yours was one of the 25 per cent of network flows that just didn’t make it into or out of that region.

    Here’s Google’s explanation of the problem’s root cause:

    “Google follows a gradual rollout process for all new releases. As part of this process, Google network engineers modified a configuration setting on a group of network switches within the us-central1-f zone. The update was applied correctly to one group of switches, but, due to human error, it was also applied to some switches which were outside the target group and of a different type. The configuration was not correct for them and caused them to drop part of their traffic.”

    Face, meet palm. Repeat.

  17. Tomi Engdahl says:

    Spare a reserved cloud instance, gov? Microsoft’s $1bn, 70k charity sales pitch
    Tech philanthropy’s problem with giving
    http://www.theregister.co.uk/2016/01/20/microsoft_one_billion_pledge/

    Microsoft is donating $1bn worth of its cloud services over three years to charities and non profits.

    The software giant used the annual Davos World Economic Forum media ego fest to unveil plans to recruit 70,000 non-profits to Microsoft cloud services in the next three years. Others promoted themselves at Davos in different ways.

    Microsoft hasn’t said whether qualifying organisations will get Microsoft’s cloud services for free or a reduced price.

    Organisations will get Microsoft’s PaaS Azure, Power BI, CRM Online and Enterprise Mobility Suite.

    Also, Microsoft will increase by 300 the number of grants to the Microsoft Azure research program that hands out free Azure compute and storage.

  18. Tomi Engdahl says:

    There’s a courier here says he’s got 50TB of cloud data for you
    Amazon’s Snowball array-for-rent now available for data export
    http://www.theregister.co.uk/2016/03/03/aws_snowball_exports_now_available/

    Amazon Web Services has announced it will happily deliver 50TB of cloud data to your doorstep.

    The data will arrive wrapped in a Snowball, the rugged 50TB array the company revealed last year as a way to import data to its cloudy storage services.

    Amazon’s idea with Snowball is that lots of people want to adopt cloud storage, but balk at the time required to make big uploads, bandwidth costs and AWS’ own data movement fees. Snowball comes with an Ethernet port that AWS assumes you’ll use to pump the thing full of data before trucking it to one of Jeff Bezos’ cut-price bit barns. The device encrypts data, so even if the snowball falls off the back of a truck and rolls somewhere dangerous, your data should be safe.

    AWS has now reversed that process: if you have data in its S3 cloud storage service you can order it to be downloaded onto a Snowball and have it trucked back to you. Again, speed and cost are the issues.

    Amazon’s applied all its shipping-stuff-through-meatspace skills to the export process, as the Snowball arrives complete with return address label so that once your internal upload is done, sending the array home is easy.

  19. Tomi Engdahl says:

    AppleInsider:
    IBM’s SleepHealth debuts as first ResearchKit app for IBM Watson Health Cloud, to study how sleep quality impacts daytime activities

    SleepHealth debuts as first ResearchKit app & study to support IBM Watson Health Cloud
    http://appleinsider.com/articles/16/03/02/sleephealth-debuts-as-first-researchkit-app-study-to-support-ibm-watson-health-cloud

    Almost one year after IBM announced Watson Health Cloud, a cognitive computing platform built in partnership with Apple, Johnson & Johnson and Medtronic, the computing giant on Tuesday debuted SleepHealth, an app and ResearchKit study investigating the connection between sleep habits and health.

  20. Tomi Engdahl says:

    Cisco CTO: Containers will ride to private cloud’s rescue. Oh yes!
    Translation: We’re touting services but please don’t forget to buy our on-prem kit
    http://www.theregister.co.uk/2016/03/03/cisco_cto_says_cloud_no_mo/

    Cisco Partner Summit: The emergence of containers will spark a renaissance for on-premises data centers, thus luring many businesses away from public cloud services, Cisco CTO Zorawar Biri Singh reckons.

    Speaking at the Cisco Partner Summit in San Diego, Singh said he believes as much as 30 per cent of public cloud workloads will be going offline in the next five years as customers opt instead for local data centers based on container stacks.

    Singh predicted that, as companies become more comfortable developing and deploying data centers with containers, larger deployments with public clouds will make less sense financially for many.

    “It is very expensive at that scale, as IT practitioners see simpler container-based infrastructure come out, they will build more smaller container-based data centers,” he said.

    Singh notes that Cisco would, well, obviously stand to profit from such a trend, though he argues that, with its focus on networking and UCS, Switchzilla has less to lose from public cloud growth than other server vendors.

    “There is a misperception that we are super exposed,” he said.

    “Overall port count decreases over time, but it is not as hard hit as compute and storage.”

    “We know exactly where our revenue base is, we are investing more in software because it is a natural balance,” he said. “There is nothing here that is a crazy leap.”

  21. Tomi Engdahl says:

    Larry Dignan / ZDNet:
    Hewlett Packard Enterprise launches Haven OnDemand, the cloud version of its big data analytics software, on Azure — HPE launches machine-learning-as-a-service on Microsoft Azure — HPE Haven OnDemand provides application programming interfaces as well as services for enterprise software.

    HPE launches machine-learning-as-a-service on Microsoft Azure
    http://www.zdnet.com/article/hpe-launches-machine-learning-as-a-service-on-microsoft-azure/

    HPE Haven OnDemand provides application programming interfaces as well as services for enterprise software.

  22. Tomi Engdahl says:

    Cade Metz / Wired:
    How Dropbox moved 90% of its files from Amazon’s S3 to data centers designed by its own engineers over two-and-a-half years — The Epic Story of Dropbox’s Exodus From the Amazon Cloud Empire — If you’re one of 500 million people who use Dropbox, it’s just a folder on your computer desktop …

    The Epic Story of Dropbox’s Exodus From the Amazon Cloud Empire
    http://www.wired.com/2016/03/epic-story-dropboxs-exodus-amazon-cloud-empire/

    If you’re one of 500 million people who use Dropbox, it’s just a folder on your computer desktop that lets you easily store files on the Internet, send them to others, and synchronize them across your laptop, phone, and tablet. You use this folder, then you forget it. And that’s by design. Peer behind that folder, however, and you’ll discover an epic feat of engineering. Dropbox runs atop a sweeping network of machines whose evolution epitomizes the forces that have transformed the heart of the Internet over the past decade. And today, this system entered a remarkable new stage of existence.

    For the first eight years of its life, you see, Dropbox stored billions and billions of files on behalf of those 500 million computer users. But, well, the San Francisco startup didn’t really store them on its own. Like so many other tech startups in recent years, Dropbox ran its online operation atop what is commonly called “the Amazon cloud,”

    But not anymore. Over the last two-and-a-half years, Dropbox built its own vast computer network and shifted its service onto a new breed of machines designed by its own engineers, all orchestrated by a software system built by its own programmers with a brand new programming language. Drawing on the experience of Silicon Valley veterans who erected similar technology inside Internet giants like Google and Facebook and Twitter, it has successfully moved about 90 percent of those files onto this new online empire.

    It’s a feat of extreme engineering, to be sure. But the significance of this move extends well beyond Dropbox. Rather ironically, it highlights how cloud computing is rapidly transforming the way businesses operate.

    Today, more and more companies are moving onto “the cloud”—not off. By 2020, according to Forrester, cloud computing will be a $191 billion market, with giants like Google and Microsoft challenging Amazon with their own cloud services.

    Amazon, which declined to comment for this story, just reported $2.41 billion in revenue for its Amazon Web Services division during the fourth quarter of last year, or more than $9.6 billion in annualized sales

    But some companies get so big, it actually makes sense to build their own network with their own custom tech and, yes, abandon the cloud. Amazon and Google and Microsoft can keep cloud prices low, thanks to economies of scale. But they aren’t selling their services at cost. “Nobody is running a cloud business as a charity,” says Dropbox vice president of engineering and ex-Facebooker Aditya Agarwal. “There is some margin somewhere.” If you’re big enough, you can save tremendous amounts of money by cutting out the cloud and all the other fat. Dropbox says it’s now that big.

  23. Tomi Engdahl says:

    Frederic Lardinois / TechCrunch:
    Google adds support for Microsoft Office, Facebook at Work, Slack and others to its single sign-on solution

    Google adds support for Microsoft Office, Facebook at Work, Slack and others to its single sign-on solution
    http://techcrunch.com/2016/03/14/google-adds-support-for-microsoft-office-facebook-at-work-slack-and-others-to-its-single-sign-on-solution/

    Google doesn’t just offer its own web-based productivity apps, but it also offers a service for business users who want to use Google as an identity provider for accessing other online services using the widely used SAML standard.

    Today, Google is adding a few new options to this program, which now includes a number of Google competitors. Among the 14 new pre-configured options are the likes of Microsoft Office 365, Facebook at Work, New Relic, Concur, Box, Tableau, HipChat and Slack.

  24. Tomi Engdahl says:

    Frederic Lardinois / TechCrunch:
    Google’s new Analytics 360 Suite for enterprise marketers directly challenges Adobe’s Marketing Cloud

    Google announces Analytics 360 Suite for enterprise marketers
    http://techcrunch.com/2016/03/15/google-announces-analytics-360-suite-for-enterprise-marketers/

    Google is launching a new product for enterprise marketers today that will directly challenge Adobe’s Marketing Cloud and similar services.

    The Google Analytics 360 Suite will combine Google Analytics Premium (now called Google Analytics 360) and Adometry (which it acquired in 2014 and which is now called Attribution 360), with an enterprise-class version of Google’s Tag Manger and three new products (Audience Center 360, Data Studio 360 and Optimize 360) into a single solution for marketers.

  25. Tomi Engdahl says:

    Jack Clark / Bloomberg Business:
    Google to boost hiring for cloud division, plans 12 new data centers in next 18 months — Google’s Greene Hastens Cloud Expansion in Race With Amazon — Company plans 12 new cloud data centers in next 18 months — Cloud division to boost hiring, marketing under Diane Greene

    Google’s Greene Hastens Cloud Effort to Catch Amazon
    http://www.bloomberg.com/news/articles/2016-03-22/google-s-greene-hastens-cloud-expansion-in-race-with-amazon

    Google’s new cloud chief Diane Greene had unsettling news for employees at an internal sales meeting this month in Las Vegas: They weren’t taking corporate customers seriously enough and needed to sell harder, be hungrier and less complacent.

    That was an unusual message at Google, which typically venerates technology over sales and marketing. But it was a necessary one. Google is third in cloud computing, an increasingly popular way for companies to run their IT operations. That’s a $20 billion-a-year business forecast to grow 35 percent over the next year, according to Gartner Inc.

    To climb this ranking, the Alphabet Inc. subsidiary will massively expand its network of data centers, a move that fits with Google’s tendency to rely on technological solutions to challenges.

  26. Tomi Engdahl says:

    Google Cloud Platform Blog:
    Node.js on Google App Engine goes beta — We’re excited to announce that the Node.js runtime on Google App Engine is going beta. Node.js makes it easy for developers to build performant web applications and mobile backends with JavaScript. App Engine provides an easy to use platform …

    Node.js on Google App Engine goes beta
    https://cloudplatform.googleblog.com/2016/03/Node.js-on-Google-App-Engine-goes-beta.html?m=1

    We’re excited to announce that the Node.js runtime on Google App Engine is going beta. Node.js makes it easy for developers to build performant web applications and mobile backends with JavaScript. App Engine provides an easy to use platform for developers to build, deploy, manage and automatically scale services on Google’s infrastructure. Combining Node.js and App Engine provides developers with a great platform for building web applications and services that need to operate at Google scale.

    Getting started with Node.js on App Engine is easy. We’ve built a collection of getting started guides, samples, and interactive tutorials that walk you through creating your code, using our APIs and services and deploying to production. When running Node.js on App Engine, you can use the tools and databases you already know and love. Use Express, Hapi, Parse-server or any other web server to build your app. Use MongoDB, Redis, or Google Cloud Datastore to store your data.
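    For a flavour of how small such a service can be, here is a minimal sketch (my own illustration, not taken from Google’s guides) of an Express app of the kind App Engine can run; the greeting and file layout are assumptions:

        // Hypothetical minimal web service for a Node.js App Engine deployment.
        import express from "express";

        const app = express();

        app.get("/", (_req, res) => {
          res.send("Hello from Node.js on App Engine");
        });

        // App Engine passes the port to listen on via the PORT environment variable.
        const port = Number(process.env.PORT) || 8080;
        app.listen(port, () => console.log(`Listening on port ${port}`));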

    https://cloud.google.com/appengine/

  27. Tomi Engdahl says:

    Google reveals rapid cloud expansion
    Announces a dozen new regions by end of 2017
    http://www.theregister.co.uk/2016/03/23/google_reveals_rapid_cloud_expansion/

    Google has decided to play catch-up with Amazon Web Services (AWS), announcing that it’s going to add 12 regions to its cloud services by the end of 2017.

    The rollout will start with a Western US region hosted in Oregon, and an East Asia region to be built in Tokyo. Each of these regions will have multiple availability zones, Google says.

    Its existing regions are Eastern US (South Carolina), Central US (Iowa), Western Europe (Belgium), and an East Asia region hosted in Taiwan.

    Google Cloud Platform adds two new regions, 10 more to come
    https://cloudplatform.googleblog.com/2016/03/announcing-two-new-Cloud-Platform-Regions-and-10-more-to-come_22.html

  28. Tomi Engdahl says:

    IBM LinuxONE: Who Needs the Cloud?
    http://www.linuxjournal.com/content/ibm-linuxone-who-needs-cloud

    IBM has long been a stalwart supporter of, and participant in the Open Source community. So IBM’s announcement of the LinuxONE platform last year should have come as a surprise to no one. The ultimate goal for LinuxONE, however, may be a bit more surprising.

    LinuxONE is a computing platform designed specifically to take optimum advantage of any or all of the major distributions of Linux: SUSE, Red Hat and, starting in April, Canonical’s Ubuntu as well. All models have just undergone a significant refresh, adding even more features and capabilities, including faster processors, more memory and support for larger amounts of data. There are two LinuxONE models. The LinuxONE Emperor is designed primarily for large enterprises. According to IBM, it can run up to 8,000 virtual servers, over a million Docker containers and 30 billion RESTful web interactions per day, supporting millions of active users. The Emperor can have up to 141 cores, 10 terabytes of shared memory, and 640 dedicated I/O (input/output) processors. The LinuxONE Rockhopper model is a more entry-level platform aimed at mid-sized businesses. It is available with up to 20 cores running at 4.3GHz and 4TB of memory for performance and scaling advantages, and is capable of supporting nearly a thousand virtual Linux servers on a single footprint. Both LinuxONE systems support KVM (Kernel-based Virtual Machine), with the initial port being supported by SUSE’s distribution.

  29. Tomi Engdahl says:

    Introduction to Cloud Infrastructure Technologies
    https://www.edx.org/course/introduction-cloud-infrastructure-linuxfoundationx-lfs151-x

    Learn the fundamentals of building and managing cloud technologies directly from The Linux Foundation, the leader in open source.

    What you’ll learn

    Basics of cloud computing
    Characteristics of the different cloud technologies
    Working knowledge on how to choose the right technology stack for your needs

  30. Tomi Engdahl says:

    Enterprise revenues power Red Hat past $2bn barrier
    Linux spinner claims hybrid cloud growth
    http://www.theregister.co.uk/2016/03/23/red_hat_2_billion_revenue_q4_fy_2016_results/

    Red Hat is in the enviable position of having become the first open-source firm to break the $2bn revenue barrier.

    The Linux spinner has reported full-year revenue of $2.05bn, an increase of 14 per cent, from subscriptions, training and services. Net income was up 10 per cent to $199m.

    For its fourth quarter Red Hat reported $543m in revenue – growing 17 per cent year on year – with net income of $53m, up 11 per cent on 2015.

    Red Hat was the first open-source firm to break the psychologically important – for software firms – $1bn barrier in its fiscal year 2011, announced in March 2012.

    Red Hat has achieved its targets by focusing squarely on the enterprise and on the server, unlike its consumer and/or desktop-obsessed rivals, and other distros.

    Red Hat has promoted independent, vendor-neutral cloud with CloudForms – its Infrastructure as a Service – and Red Hat Enterprise Linux OpenStack Platform.

    Cloud, particularly independent cloud, is a tough road in the world of AWS and Microsoft Azure, neither of which are open source, as Red Hat discovered with CloudForms. RHEL is, though, an option for Penguins on both.

    “Our revenue from private Infrastructure-as-a-Service, PaaS and cloud management technologies is growing at nearly twice as fast as our public cloud revenue did when it was at the same size,”

  31. Tomi Engdahl says:

    Frederic Lardinois / TechCrunch:
    Google debuts Cloud Machine Learning Platform to assist in developing pre-trained machine learning models and building new models from scratch — Google launches new machine learning platform — Google today announced a new machine learning platform for developers at its NEXT Google Cloud Platform user conference in San Francisco.

    Google launches new machine learning platform
    http://techcrunch.com/2016/03/23/google-launches-new-machine-learning-platform/

    Google today announced a new machine learning platform for developers at its NEXT Google Cloud Platform user conference in San Francisco. As Google chairman Eric Schmidt stressed during today’s keynote, Google believes machine learning is “what’s next.” With this new platform, Google will make it easier for developers to use some of the machine learning smarts Google already uses to power features like Smart Reply in Inbox.

    The service is now available in limited preview.

    “Major Google applications use Cloud Machine Learning, including Photos (image search), the Google app (voice search), Translate and Inbox (Smart Reply),” the company says. “Our platform is now available as a cloud service to bring unmatched scale and speed to your business applications.”

    Google’s Cloud Machine Learning platform basically consists of two parts: one that allows developers to build machine learning models from their own data, and another that offers developers a pre-trained model.

    TechCrunch:
    Google opens access to its speech recognition API, going head to head with Nuance
    http://techcrunch.com/2016/03/23/google-opens-access-to-its-speech-recognition-api-going-head-to-head-with-nuance/

    Google is planning to compete with Nuance and other voice recognition companies head on by opening up its speech recognition API to third-party developers. To attract developers, the service will be free at launch, with pricing to be introduced at a later date.

    We’d been hearing murmurs about this service developing for weeks now. The company formally announced the service today during its NEXT cloud user conference, where it also unveiled a raft of other machine learning developments and updates, most significantly a new machine learning platform.

    The Google Cloud Speech API, which will cover over 80 languages and will work with any application in real-time streaming or batch mode, will offer a full set of APIs for applications to “see, hear and translate,” Google says. It is based on the same neural network tech that powers Google’s voice search in the Google app and voice typing in Google’s Keyboard.
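    For a sense of what batch-mode use looks like, here is a sketch of a request body for a non-streaming recognition call (my own illustration, assuming the shape of the later v1 REST API rather than the limited-preview API discussed here; the audio location and encoding are placeholders):

        // Hypothetical request body for a batch speech recognition call.
        const recognitionRequest = {
          config: {
            encoding: "LINEAR16",      // raw 16-bit PCM audio (assumed for this example)
            sampleRateHertz: 16000,
            languageCode: "en-US"      // one of the 80+ supported languages
          },
          audio: { uri: "gs://my-bucket/recording.raw" }  // or inline base64 "content"
        };

    Streaming mode instead sends audio in chunks over a long-lived connection and returns interim results as the audio arrives.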

  32. Tomi Engdahl says:

    Ron Miller / TechCrunch:
    Google Stackdriver helps IT get unified view across AWS and Google Cloud — Today at the GCPNext16 event in San Francisco, Google announced the launch of Google StackDriver, a tool that gives IT a unified tool for monitoring, alerting, incidents management and logging complete …

    Google Stackdriver helps IT get unified view across AWS and Google Cloud
    http://techcrunch.com/2016/03/23/google-stackdriver-helps-it-get-unified-view-across-aws-and-google-cloud/

    Today at the GCPNext16 event in San Francisco, Google announced the launch of Google StackDriver, a tool that gives IT a unified tool for monitoring, alerting, incidents management and logging complete with dashboards providing visual insights across each category.

    Google purchased Stackdriver, a Cambridge, MA company, in 2014, when it was mostly devoted to AWS cloud monitoring. Google then helped the team incorporate that work into the Google Cloud Platform while continuing to support AWS (making it an extremely valuable tool for companies that support both platforms).

    Stackdriver is not only bringing these capabilities together into a single tool, which is in itself a valuable service, it is also making them highly customizable. For example, if you know that your application has problems when there are memory spikes, rather than CPU spikes, you can fashion an alert that lets you know when your application has a memory issue. Theoretically, this should enable you to react before the problem gets out of control.

    The logging capabilities let you search across your GCP and AWS clusters from a single interface.

    It’s trying to differentiate itself from the competition, particularly AWS, even while supporting AWS in this tool.

  33. Tomi Engdahl says:

    Mary Jo Foley / ZDNet:
    Microsoft starts rolling out Office 365 Connectors, which pull content from other apps and services into Office 365 Groups shared inboxes

    Microsoft starts rolling out Office 365 Connectors as part of Groups
    http://www.zdnet.com/article/microsoft-starts-rolling-out-office-365-connectors-as-part-of-groups/

    Microsoft is starting to roll out Office 365 Connectors, a feature that pulls content and updates from Microsoft and third-party apps and services, like Twitter, Trello, RSS and more, directly into Outlook.

  34. Tomi Engdahl says:

    Jordan Novet / VentureBeat:
    Microsoft debuts Cognitive Services with 22 APIs available now, including some for vision, speech, knowledge, and search, priced per transaction but free to try

    Microsoft launches Cognitive Services based on Project Oxford and Bing
    http://venturebeat.com/2016/03/30/microsoft-cognitive-services-project-oxford/

    Microsoft today announced updates to its portfolio of machine learning tools. Until now they have fallen under the Project Oxford name, but now they are being rebranded to Microsoft Cognitive Services. The Project Oxford website now redirects to the new Cognitive Services website.

    In total there are 22 APIs available in Cognitive Services now, said Microsoft senior program manager lead Cornelia Carapcea.

    And now there are prices for the new services, along with application programming interfaces (APIs) made available from Microsoft’s Bing search division. Developers can still try out these services for free.

  35. Tomi Engdahl says:

    Andrew Cunningham / Ars Technica:
    Microsoft says Windows 10 is now on over 270M active devices, up from 200M in January, making it the fastest growing version of Windows

    Microsoft: Windows 10 has over 270 million active users
    http://arstechnica.com/gadgets/2016/03/microsoft-windows-10-has-over-270-million-active-users/
    Brisk adoption rate continues eight months after Windows 10′s initial launch.

  36. Tomi Engdahl says:

    The Epic Story of Dropbox’s Exodus From the Amazon Cloud Empire
    http://www.wired.com/2016/03/epic-story-dropboxs-exodus-amazon-cloud-empire/

    If you’re one of 500 million people who use Dropbox, it’s just a folder on your computer desktop that lets you easily store files on the Internet, send them to others, and synchronize them across your laptop, phone, and tablet. You use this folder, then you forget it. And that’s by design. Peer behind that folder, however, and you’ll discover an epic feat of engineering. Dropbox runs atop a sweeping network of machines whose evolution epitomizes the forces that have transformed the heart of the Internet over the past decade. And today, this system entered a remarkable new stage of existence.

    For the first eight years of its life, you see, Dropbox stored billions and billions of files on behalf of those 500 million computer users. But, well, the San Francisco startup didn’t really store them on its own. Like so many other tech startups in recent years, Dropbox ran its online operation atop what is commonly called “the Amazon cloud,”

    But not anymore. Over the last two-and-a-half years, Dropbox built its own vast computer network and shifted its service onto a new breed of machines designed by its own engineers, all orchestrated by a software system built by its own programmers with a brand new programming language.

  37. Tomi Engdahl says:

    Triple Play: GitHub’s Code Now Lives in Three Places at Once
    http://www.wired.com/2016/04/github-now-three-places-keep-code-connected/

    On the Internet, everything can be everywhere. And that’s true in more ways than one. If your phone goes online—no matter where you are in the world—you can theoretically visit every last bit of information uploaded to the global network of machines we call the Internet. And by that same logic, all this information can also be stored in so many different places.

    The Google search engine doesn’t sit on one machine in one location. It resides on thousands of machines in computer data centers across the globe. The same goes for Facebook and Twitter and Dropbox. If these tech giants are doing their jobs right, each individual piece of data they store is sitting not just in one place but in many places, in case of emergency.

    Some companies do this kind of thing better than others. But among the biggest and best services, it’s the norm. They even ensure redundantly distributed data within individual data centers.

    Today, the power of redundancy was reaffirmed by GitHub, the online service that has become the world’s de facto repository for open source software, software freely available to the world at large. This morning, the eponymous San Francisco company that runs the service announced that it’s now storing projects using a new system called DGit, short for Distributed Git, to ensure everything sits in many places, not just one.

    GitHub is, in essence, making itself look more like Google or Facebook.

    Rule of Threes

    GitHub is already a vastly distributed system. Based on software called Git, invented by open source granddaddy Linus Torvalds, GitHub operates in a wonderfully smooth way. Coders download a complete copy of an open source project onto their own machines and, as they make changes, they can easily merge these changes back into the central repository. The result is that myriad copies of each project are spread across the net, which makes for a great backup if GitHub ever goes belly up or otherwise disappears from the face of the Earth.

    But with DGit, GitHub has gone a step further. The central repository is now stored not just on one machine but on three machines. If two go down, the project is still available to everyone, and the system then rebuilds additional replicas on other machines. “What DGit does is that it makes Git a lot more aware of the environment it’s in and where it’s being stored,”

  38. Tomi Engdahl says:

    Jeff Bezos: AWS Will Break $10 Billion This Year
    https://news.slashdot.org/story/16/04/07/2334254/jeff-bezos-aws-will-break-10-billion-this-year

    Jeff Bezos is bullish on the cloud, pegging AWS’ sales for this year at $10 billion in a recent letter to shareholders. But he said there was a surprising source of that success: the company’s willingness to fail. That said, with AWS now spanning 70 different services, Amazon can afford to let some fail as long as a few, like EC2 and S3, keep winning.

    Bezos wrote: “One area where I think we are especially distinctive is failure. I believe we are the best place in the world to fail (we have plenty of practice!), and failure and invention are inseparable twins.”

    Jeff Bezos: AWS will break $10 billion this year — driven by Amazon’s failures
    In Jeff Bezos’ recent letter to shareholders, the Amazon CEO said that AWS is hitting $10 billion in sales
    http://windowsitpro.com/cloud/jeff-bezos-aws-will-break-10-billion-year-driven-amazons-failures

  39. Tomi Engdahl says:

    Arik Hesseldahl / Re/code:
    Box teams up with AWS and IBM for Box Zones, which lets firms store files locally in Europe, Asia — Box Teams Up With Amazon, IBM to Offer Local Storage Worldwide — Cloud storage company Box is making an effort to widen its appeal as the office file storage and sharing product …

    Box Teams Up With Amazon, IBM to Offer Local Storage Worldwide
    http://recode.net/2016/04/12/box-teams-up-with-amazon-ibm-to-offer-local-storage-worldwide/

    Cloud storage company Box is making an effort to widen its appeal as the office file storage and sharing product of choice among large companies in regulated industries that have offices spread out all over the world.

    The company, founded by CEO Aaron Levie, said it is teaming up with Amazon’s cloud computing division Amazon Web Services and with the computing giant IBM to offer file-storage options in Germany, Ireland, Japan and Singapore. The aim is to address concerns some companies face about storing critical information on computers that are physically located in a country where they operate.

    It seems like a small thing, but for companies in certain industries like life sciences, financial services and health care, it’s a big deal. They’re often subject to data sovereignty laws that require them to keep the data they work on stored locally within the country where they operate. That has made it hard for some to fully embrace cloud apps like Box.
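
    To show what region pinning looks like in practice, here is a small, illustrative sketch using a generic object store (AWS S3 via the boto3 library), not Box’s own API: the bucket is created in Frankfurt, and objects written to it stay in that region unless they are explicitly copied out. The bucket name and object key are placeholders.

        # Illustrative only: pinning stored objects to a specific geography
        # with a generic object store (AWS S3 via boto3); this is not Box's API.
        # Bucket name and key are placeholders.
        import boto3

        REGION = "eu-central-1"  # Frankfurt, Germany

        s3 = boto3.client("s3", region_name=REGION)

        # Buckets outside us-east-1 must state their region explicitly.
        s3.create_bucket(
            Bucket="example-regulated-records",
            CreateBucketConfiguration={"LocationConstraint": REGION},
        )

        # The object now resides on infrastructure in the chosen jurisdiction.
        s3.put_object(
            Bucket="example-regulated-records",
            Key="clinical/2016/report-001.pdf",
            Body=b"...",
        )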

    Reply
  40. Tomi Engdahl says:

    Julia Fioretti / Reuters:
    Microsoft offers first major endorsement of new EU-U.S. data pact — Microsoft (MSFT.O) became on Monday the first major U.S. tech company to say it would transfer users’ information to the United States using a new transatlantic commercial data pact and would resolve any disputes with European privacy watchdogs.

    Microsoft offers first major endorsement of new EU-U.S. data pact
    http://www.reuters.com/article/us-microsoft-dataprotection-eu-idUSKCN0X81IN

    Microsoft (MSFT.O) became on Monday the first major U.S. tech company to say it would transfer users’ information to the United States using a new transatlantic commercial data pact and would resolve any disputes with European privacy watchdogs.

    Data transfers to the United States have been conducted in a legal limbo since October last year when the European Union’s top court struck down the Safe Harbour framework that allowed firms to easily move personal data across the Atlantic in compliance with strict EU data transferral rules.

    EU data protection law bars companies from transferring personal data to countries deemed to have insufficient privacy safeguards, of which the United States is one, unless they set up complex legal structures or use a framework like Safe Harbour.

    Microsoft said it would sign up to the EU-U.S. Privacy Shield, the new framework that was agreed by Brussels and Washington in February to fill the void left by Safe Harbour and ensure the $260 billion in digital services trade across the Atlantic continues smoothly.

    Reply
  41. Tomi Engdahl says:

    Frederic Lardinois / TechCrunch:
    Microsoft is bringing automatic video summarization, Hyperlapse, OCR and more to Azure Media Services
    http://techcrunch.com/2016/04/14/microsoft-is-bringing-automatic-video-summarization-hyperlapse-ocr-and-more-to-its-azure-machine-learning-service/

    Azure Media Services, Microsoft’s collection of cloud-based tools for video workflows, is about to get a lot smarter. As the company announced at the annual NAB show in Las Vegas today, Media Services will now make use of some of the tools Microsoft developed for its machine learning services for video, as well.

    This means Media Services can now automatically select the most interesting snippets from a source video, for example, to give you a quick summary of what the full video looks like.

    In addition, Microsoft is building face detection into these tools and the company is including its ability to detect people’s emotions (something the company’s Cognitive Services already do for still images).
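
    For readers who want a feel for the still-image side of this, the sketch below calls the standalone Emotion API that Cognitive Services exposed at the time. The endpoint URL, API version and response shape are assumptions based on the service’s documented pattern from that period, and the subscription key and image URL are placeholders.

        # Hedged sketch of calling the Cognitive Services Emotion API for a
        # still image. Endpoint, version and response shape are assumptions;
        # the key and image URL are placeholders.
        import requests

        ENDPOINT = "https://westus.api.cognitive.microsoft.com/emotion/v1.0/recognize"
        SUBSCRIPTION_KEY = "<your-cognitive-services-key>"

        def recognize_emotions(image_url):
            resp = requests.post(
                ENDPOINT,
                headers={
                    "Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY,
                    "Content-Type": "application/json",
                },
                json={"url": image_url},
            )
            resp.raise_for_status()
            # Expected: one entry per detected face, each with a faceRectangle
            # and a dict of emotion scores (anger, happiness, sadness, ...).
            return resp.json()

        for face in recognize_emotions("https://example.com/crowd.jpg"):
            print(face.get("faceRectangle"), face.get("scores"))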

    Reply
  42. Tomi Engdahl says:

    Alex Konrad / Forbes:
    DigitalOcean borrows $130M to expand global infrastructure ahead of new storage product launch scheduled for December

    Cloud Computing Upstart DigitalOcean Borrows $131 Million To Add Storage And Expand Into India
    http://www.forbes.com/sites/alexkonrad/2016/04/14/digital-ocean-borrows-131-million/#4f7573bd5022

    In his goal to offer the simplest and most developer-friendly cloud computing option on the market, DigitalOcean CEO Ben Uretsky is content to let Amazon, Microsoft and Google battle it out for supremacy among the biggest corporate customers. But even that goal comes at considerable cost.

    DigitalOcean announced Thursday that it has raised $130 million in a credit facility that the company plans to use in large part to expand its global infrastructure as it prepares a key new product launch in December.

    The startup had previously raised $123 million from investors

    DigitalOcean says it’s adding about 20,000 customers a month and its all-time user base has more than tripled to 700,000 from 200,000 in 2014. Users have launched 13 million of the company’s virtual private servers, which it calls Droplets, since its founding in 2012, all but 1.5 million of those in the past two years.

    Those numbers point to rapid growth, but in absolute terms they can still look like a drop in the bucket compared to the overall footprint of users at big businesses working with Amazon Web Services, Microsoft and newer arrival Google.

    Reply
  43. Tomi Engdahl says:

    Microsoft Details Security Responsibilities for Azure Cloud Customers
    http://www.securityweek.com/microsoft-details-security-responsibilities-azure-cloud-customers

    Microsoft Publishes White Papers on Incident Response and Shared Responsibility for Azure Cloud Customers

    Incident response is a hot topic. The cloud is a hot topic. But how do you respond to incidents in the cloud that you may not even know about? It’s a difficult issue that could cause problematic relations between enterprises and the big clouds like Microsoft Azure, Amazon Web Services (AWS) and Google.

    Now Microsoft is setting the ground rules with two new documents: Shared Responsibilities for Cloud Computing, and Microsoft Azure Security Response in the Cloud.

    The first specifies Microsoft’s view of its own responsibilities (and therefore, by omission, the enterprise CISO’s responsibilities); while the second gives an outline of how Microsoft will actually respond.

    Microsoft believes that there is a fairly fundamental split in responsibility. The cloud provider is responsible for the physical aspects of the cloud IT infrastructure and the software that it provides. The customer is responsible for its own data.

    It sounds simple, but there will inevitably be problems. Forensic proof will become important. If a customer’s data is modified via a flaw in Microsoft software that Microsoft doesn’t recognize, there could be issues.

    This is where the CISO needs to understand both documents – because there are some incidents that Microsoft will not report to the customer.

    And as every CISO knows, visibility into the cloud is not always 20/20.

    Having suggested that Microsoft’s definition of its and its customers’ responsibilities could cause operational difficulties, the remainder of the Incident management document is informative and valuable. It first outlines the roles that should be involved in a response, and then describes the plan itself.

    The plan itself involves five stages: Detect, Assess, Diagnose, Stabilize, and Close.
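
    Those five stages map naturally onto a simple state machine. The sketch below is purely illustrative (it is not Microsoft tooling) and shows one way a team might track an incident through the stages in order, without skipping any.

        # Illustrative tracker for the five stages named above:
        # Detect, Assess, Diagnose, Stabilize, Close. Not Microsoft tooling.
        from datetime import datetime, timezone

        STAGES = ["Detect", "Assess", "Diagnose", "Stabilize", "Close"]

        class Incident:
            def __init__(self, incident_id, summary):
                self.incident_id = incident_id
                self.summary = summary
                self.history = []      # (stage, timestamp, note)
                self.stage_index = -1  # no stage recorded yet

            def advance(self, note=""):
                """Move to the next stage; stages cannot be skipped."""
                if self.stage_index + 1 >= len(STAGES):
                    raise ValueError("Incident is already closed")
                self.stage_index += 1
                stage = STAGES[self.stage_index]
                self.history.append((stage, datetime.now(timezone.utc), note))
                return stage

        # Walk one hypothetical incident through the full lifecycle.
        incident = Incident("INC-0042", "Suspicious access to a storage account")
        for note in ["alert fired", "scope confirmed", "root cause found",
                     "mitigation applied", "post-mortem filed"]:
            print(incident.advance(note))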

    Reply
  44. Tomi Engdahl says:

    The outlook for cloud service providers is excellent: research house IDC predicts that cloud IT infrastructure investments will grow 18.9 percent this year.

    As a result, sales of traditional IT infrastructure systems are expected to shrink by four percent this year, although traditional systems remain the largest category, accounting for 62.8 percent of all end-user infrastructure investments.

    Of all infrastructure investment, public cloud accounts for 14.1 percent and private cloud for 11.1 percent.

    ElasticHosts CEO Richard Davies says customers benefit in particular from next-generation container technology, which improves the scalability of IT services.

    “In the public cloud, customers pay only for the computing they use, so capacity no longer sits idle,”

    Ethernet ports lead cloud infrastructure growth, with investments increasing by 26.8 percent during the next 12 months.

    Servers and storage devices will grow at 12.4 and 11.3 percent respectively.

    Source: http://www.tivi.fi/CIO/pilvi-infran-myynti-kasvaa-lahes-viidenneksen-tana-vuonna-6542309

    Reply
  45. Tomi Engdahl says:

    Ingrid Lunden / TechCrunch:
    IBM inks video deals with AOL, CBC, more; debuts quality live-stream over ‘commodity’ Internet
    http://techcrunch.com/2016/04/18/ibm-inks-video-deals-with-aol-cbc-more-debuts-quality-live-stream-over-commodity-internet/

    IBM today unveiled some significant strides forward in its bid to be a major player in the world of online and cloud-based video services, three months after the company acquired live-streaming startup Ustream and formed a cloud video unit. AOL (which owns TechCrunch), the Canadian Broadcasting Corporation, Comic-Con and Mazda have all signed on for IBM to provide online video solutions.

    And IBM’s video unit is also launching two new services to expand its footprint in live-streaming: a product that will let media companies produce high-quality live-streams over ordinary broadband connections; and an enterprise CDN product that lets companies broadcast live-stream video within their firewalls without impacting other traffic.

    Reply
  46. Tomi Engdahl says:

    Natalie Gagliordi / ZDNet:
    IBM beats Q1 earnings targets with quarterly revenue of $18.7B; strategic businesses including cloud and analytics grew revenue 14% year-over-year — IBM beats Q1 earnings targets on double-digit cloud growth — IBM said quarterly revenue from its strategic businesses including cloud …

    IBM beats Q1 earnings targets on double-digit cloud growth
    http://www.zdnet.com/article/ibm-beats-q1-earnings-targets-on-double-digit-cloud-growth/

    IBM said quarterly revenue from its strategic businesses, including cloud and analytics, increased 14 percent year-over-year; those businesses now represent 37 percent of the company’s revenue.

    IBM’s cloud revenue climbed 36 percent to $10.8 billion over the last 12 months.

    Reply
  47. Tomi Engdahl says:

    Telstra unveils Cloud Gateway, adds AWS to cloud services
    http://www.zdnet.com/article/telstra-unveils-cloud-gateway-adds-aws-to-cloud-services/

    Telstra has revealed its hybrid cloud solution for business customers, aiming to provide secure access to multiple cloud services.

    Telstra has announced a business multi-cloud connecting solution to support the use of hybrid cloud services including those from Microsoft, Amazon Web Services (AWS), VMware, and IBM.

    The solution, called Cloud Gateway, allows customers to connect directly to multiple public cloud environments via Telstra’s IP network

    Cloud Gateway will enable Australian customers to connect to Microsoft Azure, Office365, AWS, IBM SoftLayer, and VMware vCloud Air, while international customers can only access AWS and IBM SoftLayer for now.

    Reply
  48. Tomi Engdahl says:

    Frederic Lardinois / TechCrunch:
    Microsoft’s Azure Container Service is now generally available
    http://techcrunch.com/2016/04/19/microsofts-azure-container-service-is-now-generally-available/

    Azure Container Service, Microsoft’s container scheduling and orchestration service for its Azure cloud computing service, is now generally available.

    The service, which allows its users to choose either Mesosphere’s Data Center Operating System (DC/OS) or Docker’s Swarm and Compose to deploy and orchestrate their containers, was first announced in September 2015 and hit public preview this February.

    As Microsoft’s CTO for Azure (and occasional novelist) Mark Russinovich told me, he believes this ability to use both Docker Swarm/Compose and the open-source components of DC/OS — both of which are based on open-source projects — makes the Azure Container Service stand out from some of its competitors.

    Microsoft also believes that using these open-source solutions means its users can easily take their workloads and move them on-premise when they want (or move their existing on-premise solutions to Azure, too, of course).

    Azure Container Service
    Deploy and manage containers using the tools you choose
    https://azure.microsoft.com/en-us/services/container-service/
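
    The portability point is easy to illustrate: with the classic Swarm option, the cluster exposes a Docker API endpoint, so the same client code runs against Azure Container Service or an on-premise Swarm simply by changing the endpoint. The sketch below uses the Docker SDK for Python; the endpoint is a placeholder and a real deployment would also supply TLS credentials.

        # Minimal sketch of the portability argument: the same Docker client
        # code targets a Swarm endpoint whether the cluster runs in Azure
        # Container Service or on-premises. The endpoint is a placeholder;
        # a real setup would add TLS configuration.
        import docker

        SWARM_ENDPOINT = "tcp://my-swarm-master.example.com:2375"  # placeholder

        client = docker.DockerClient(base_url=SWARM_ENDPOINT)

        # The Swarm manager picks a node and schedules the container there.
        container = client.containers.run(
            "nginx:alpine",
            detach=True,
            ports={"80/tcp": 8080},
            name="hello-acs",
        )
        print(container.id)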

    Reply
  49. Tomi Engdahl says:

    John Ribeiro / Computerworld:
    Mesosphere makes its data center management software, DC/OS, open source

    Mesosphere open-sources data center management software
    http://www.computerworld.com/article/3058287/data-center/mesosphere-open-sources-data-center-management-software.html

    The startup is backed in this move by over 60 tech companies, including Hewlett Packard Enterprise and Microsoft

    Derived from its Datacenter Operating System, a service that Mesosphere set out to build as an operating system for all servers in a data center as if they were a single pool of resources, the open-source DC/OS offers capabilities for container operations at scale and single-click, app-store-like installation of over 20 complex distributed systems, including HDFS, Apache Spark, Apache Kafka and Apache Cassandra, the company said in a statement Tuesday.

    DC/OS is built around the Apache Mesos kernel for distributed tools including analytics, file systems and Web servers; Mesosphere founder Benjamin Hindman and colleagues at the University of California, Berkeley developed Mesos in 2009.

    But the minimalist approach used to develop Mesos proved inadequate when it came to running most applications, as other functionality such as service discovery, load balancing, user/service authentication and authorization, and command-line and user interfaces had to come from components that run alongside or on top of Mesos, Hindman said in a blog post.

    “By open sourcing DC/OS we’re enabling organizations of all sizes to harness the same computing infrastructure as the Twitters and Apples of the world,”

    While some of the technologies in DC/OS such as Mesos were already open source, others such as the GUI and the Minuteman load balancer were proprietary technologies developed by Mesosphere.

    Some of the components that Mesosphere built as part of its Data Center Operating System, and are included in DC/OS, are Marathon, a container orchestrator platform; Universe, which provides an app store-like experience for deploying distributed systems and additional management components; tools for operating the DC/OS from the Web or command line; and GUI-based installers for on-premises and cloud.
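
    As a rough illustration of how Marathon is driven in practice, the sketch below submits a minimal application definition to Marathon’s HTTP API (commonly served at /v2/apps on port 8080). The host name is a placeholder and the field values are deliberately simple.

        # Rough sketch: hand an application to Marathon, the container
        # orchestrator mentioned above, via its HTTP API. Host is a placeholder.
        import requests

        MARATHON_URL = "http://marathon.example.com:8080/v2/apps"  # placeholder

        app_definition = {
            "id": "/demo/hello",
            "cpus": 0.1,
            "mem": 64,
            "instances": 3,  # Marathon keeps three copies running
            "container": {
                "type": "DOCKER",
                "docker": {"image": "nginx:alpine", "network": "BRIDGE"},
            },
        }

        resp = requests.post(MARATHON_URL, json=app_definition)
        resp.raise_for_status()
        print("Deployment accepted:", resp.json().get("deployments"))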

    Reply
  50. Tomi Engdahl says:

    Google brings robust cluster scheduling to its cloud
    http://www.computerworld.com/article/2491299/cloud-computing/google-brings-robust-cluster-scheduling-to-its-cloud.html

    Google Cloud users can now run Docker jobs alongside their Hadoop workloads in the same cluster

    Google is drawing from the work of the open-source community to offer its cloud customers a service to better manage their clusters of virtual servers.

    On Monday, the Google Cloud Platform started offering the commercial version of the open-source Mesos cluster management software, offered by Mesosphere.

    With the Mesosphere software, “You can create a truly multitenant cluster, and that drives up utilization and simplifies operations,” said Florian Leibert, co-founder and CEO of Mesosphere. Leibert was also the engineering lead at Twitter who introduced Mesos to the social media company.

    Reply
