Gartner Magic Quadrant for Cloud IaaS 2017

Gartner has published a new Magic Quadrant for infrastructure-as-a-service (IaaS), and the results should not be surprising to anybody. Consider this posting an update to my cloud market posting from a few years back. Here is reporting on the newest cloud market trends from two sources:
Gartner puts AWS, Microsoft Azure top of its Magic Quadrant for IaaS | ZDNet

https://www.google.fi/amp/www.zdnet.com/google-amp/article/gartner-puts-aws-microsoft-azure-top-of-its-magic-quadrant-for-iaas/

Amazon Web Services (AWS) and Microsoft Azure dominate the infrastructure-as-a-service field, according to Gartner, which released its IaaS Magic Quadrant.

However, Google Cloud is emerging as a key challenger.

Gartner confirms what we all know: AWS and Microsoft are the cloud leaders, by a fair way

https://www.google.fi/amp/s/www.theregister.co.uk/AMP/2017/06/19/gartner_confirms_what_we_all_know_aws_and_microsoft_are_the_cloud_leaders_by_a_fair_way/

Paranormal parallelogram for IaaS has Google on the same lap, IBM and Oracle trailing


  1. Tomi Engdahl says:

    ‘Biggest Data Center’ To Be Built in Arctic
    https://hardware.slashdot.org/story/17/08/15/1726209/biggest-data-center-to-be-built-in-arctic

    A small town in the remote north of the Arctic Circle is set to be home to the world’s largest data center. From a report:
    The firm behind the project, Kolos, says the chilled air and abundant hydropower available locally would help it keep its energy costs down. The area, however, suffers the country’s highest rate of sick leave from work, which may be related to its past as a mining community. The US-Norwegian company says it has already raised “several million dollars” for the project from Norwegian private investors. However, it is still working with a US investment bank to secure the remaining necessary funds.

    Record-sized data centre planned inside Arctic Circle
    http://www.bbc.com/news/technology-40922048

    Plans to build the world’s “largest” data centre are being made public.

    The facility is set to be created at the Norwegian town of Ballangen, which is located inside the Arctic Circle.

    Cheap energy

    Tech consultancy Gartner says the dominance of the big public cloud providers has meant private endeavours have needed to seek scale of their own in order to keep their prices competitive.

    “There’s always a danger with this kind of thing that providers rush to build capacity that outstrips what the market requires,” added David Groombridge, research director at Gartner.

    “But in terms of data centres, it’s hard to see consumer-driven demands dropping off and there’s the promise of the internet-of-things, with millions of sensors generating information that will need to be processed.

    “So, unless there are radical new technologies that come along very quickly to help compress data, we will need the resources that these kind of facilities provide.”

  2. Tomi Engdahl says:

    Norway gets the world’s largest data center

    Data center newcomer Kolos is starting off at full steam. The Norwegian company is building the world’s largest data center in Ballangen, near Narvik. When all servers are in use, the center will need gigawatts of power.

    A gigawatt is a staggering power requirement. For example, Google’s Hamina center is a 72-megawatt facility and Facebook’s center in Sweden consumes 120 megawatts.

    According to Kolos, the northern data center starts with 70 megawatts of consumption. All the energy used in the plant comes from renewable sources. Norway has a lot of hydropower.

    Source: http://www.etn.fi/index.php/13-news/6677-norjaan-tulee-maailman-suurin-datakeskus

  3. Tomi Engdahl says:

    Microsoft launches Azure Event Grid, a fully managed event routing service
    https://techcrunch.com/2017/08/16/microsoft-launches-azure-event-grid-a-fully-managed-event-routing-service/

    Microsoft announced a new product in its Azure line-up in preview today that will make it easier for developers to build event-based applications.

    The Azure Event Grid makes events (like uploading a picture or video, clicking a button, updating a database, etc.) first-class Azure objects. Event Grid complements Azure Functions and Azure Logic Apps, Microsoft’s existing serverless offerings, and gives developers access to a fully managed event routing service. This new service gives them the flexibility to ingest and react to virtually any event — whether that’s happening inside Azure or on a third-party service or in an existing application.

    Developers can use Event Grid to route events to specific endpoints (or even multiple endpoints) and filter them as necessary.

    “Serverless” has always been a misnomer, given that even the most serverless of serverless applications still needs to run on servers. Still, the basic idea behind serverless platforms is that you can use this model to build event-driven applications without having to worry about the underlying infrastructure.

    Indeed, Microsoft director of Azure Compute Corey Sanders told me that Event Grid actually sits on top of Service Fabric, Microsoft’s platform for building microservices.

    Event Grid takes the ideas of Azure Functions and Logic Apps a bit further, though, thanks to its built-in ability to take inputs from any application with the help of webhook endpoints.

    Out of the box, Event Grid also supports Azure Blob Storage, Resource Manager, Application Topics, Event Hubs, Azure Functions, Azure Automation and Logic Apps, with support for other Azure-based services, including the new CosmosDB database service and IoT Hub, coming later this year. Given that IoT applications are a logical fit for this service, it’s actually a bit of a surprise that support for IoT Hub isn’t part of this initial release.

    Pricing for Event Grid is based on the number of operations you process. The first 100,000 operations are free; after that, you pay $0.60 per million operations.

    Operations are defined as any ingress, advanced match, delivery attempt or management call.
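
    As a back-of-the-envelope illustration of that pricing model, the monthly bill can be estimated from the operation count. The figures below are the preview prices quoted above and may well change:

    ```python
    def event_grid_cost(operations: int,
                        free_tier: int = 100_000,
                        price_per_million: float = 0.60) -> float:
        """Estimate a monthly Event Grid bill: the first `free_tier`
        operations are free, the rest cost `price_per_million` per
        million (preview prices quoted above)."""
        billable = max(0, operations - free_tier)
        return billable / 1_000_000 * price_per_million

    # 50 million operations in a month:
    print(round(event_grid_cost(50_000_000), 2))  # -> 29.94
    ```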

  4. Tomi Engdahl says:

    Five characteristics of cloud computing
    http://www.controleng.com/single-article/five-characteristics-of-cloud-computing/d3d1cbc797f4149ba98d082e466f08f3.html

    Cloud computing’s characteristics and benefits include on-demand self-service, broad network access, and rapid elasticity and scalability.

    As cloud computing services mature both commercially and technologically, it will be easier for companies to maximize the potential benefits. Knowing what cloud computing is and what it does, however, is just as important. The National Institute of Standards and Technology (NIST) defines cloud computing as it is known today through five particular characteristics.

    1. On-demand self-service
    2. Broad network access
    3. Multi-tenancy and resource pooling
    4. Rapid elasticity and scalability
    5. Measured service

    Cloud computing resource usage is metered, and manufacturing organizations pay accordingly for what they have used. Resource utilization can be optimized by leveraging charge-per-use capabilities. This means that cloud resource usage—whether virtual server instances that are running or storage in the cloud—gets monitored, measured and reported by the cloud service provider.
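
    A toy illustration of the measured-service model described above: a charge-per-use invoice is just metered usage multiplied by unit rates. All resource names and prices below are made up for illustration:

    ```python
    # Hypothetical metered-usage invoice, in the spirit of the
    # "measured service" characteristic: pay only for what was used.
    usage = {  # resource: (amount used, unit price in dollars)
        "vm_hours":   (720, 0.05),   # one instance running all month
        "storage_gb": (500, 0.02),
        "egress_gb":  (100, 0.08),
    }

    invoice = sum(amount * rate for amount, rate in usage.values())
    print(f"${invoice:.2f}")  # -> $54.00
    ```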

  5. Tomi Engdahl says:

    Jon Fingas / Engadget:
    Code42 says it will end its cloud backup service Crashplan for Home on Oct 23, 2018 and focus instead on its business services — If you rely on Crashplan as a remote backup for your computer, you’re going to have to find an alternative in short order. Code42 is phasing out its Crashplan …

    Crashplan drops its cloud backup service for home users
    You’ll have to find an alternative to safeguard your files online.
    https://www.engadget.com/2017/08/22/crashplan-drops-cloud-backups-for-home-users/

    If you rely on Crashplan as a remote backup for your computer, you’re going to have to find an alternative in short order. Code42 is phasing out its Crashplan for Home service as it switches its focus to business users. The company has stopped offering new or renewed Home subscriptions as of August 22nd, and the service will shut down entirely on October 23rd, 2018. If you haven’t moved your files elsewhere by then, you’re out of luck. The team is trying to make the transition as gentle as possible, at least. It’s extending all Home subscriptions by 60 days to give people time to find alternatives, and it’s offering discounts for both its own Small Business tier and a preferred alternative, Carbonite.

    You don’t have to go to either of those options, of course. Alternatives like Backblaze exist if you need to safeguard absolutely everything, and you can use free or low-cost services like Google Drive if you’re just interested in protecting a limited number of can’t-lose files.

    The move isn’t entirely shocking, especially in an era where ISP data caps make it impractical to upload the entire contents of your PC. Businesses are more likely to need that absolute protection, and their tendency to subscribe in bulk makes them tempting targets.

  6. Tomi Engdahl says:

    Salvador Rodriguez / Reuters:
    Google says it will announce specs for Titan, a security chip that scans cloud hardware for evidence of tampering, on Thursday

    Google touts Titan security chip to market cloud services
    http://www.reuters.com/article/us-alphabet-google-titan-idUSKCN1B22D6

    SAN FRANCISCO (Reuters) – Alphabet Inc’s (GOOGL.O) Google this week will disclose technical details of its new Titan computer chip, an elaborate security feature for its cloud computing network that the company hopes will enable it to steal a march on Amazon.com Inc (AMZN.O) and Microsoft Corp (MSFT.O).

    Titan is a chip the size of a tiny stud earring that Google has installed in each of the many thousands of computer servers and network cards that populate the massive data centers powering Google’s cloud services.

    Google is hoping Titan will help it carve out a bigger piece of the worldwide cloud computing market, which is forecast by Gartner to be worth nearly $50 billion.

  7. Tomi Engdahl says:

    Tom Krazit / GeekWire:
    Google Cloud Platform launches cheaper Standard Tier traffic networking option that uses public internet, not its fiber

    Google unveils a new, cheaper networking option for cloud customers: the public internet
    https://www.geekwire.com/2017/google-unveils-new-cheaper-networking-option-cloud-customers-public-internet/

    Google Cloud Platform customers will have a new option when selecting the type of network used to deliver their traffic to their users: they can keep using Google’s network, or they can save some money with the new option of using public transit networks.

    Google has long argued that one of the best reasons to use its public cloud service is the strength of its fiber network, developed and enhanced for more than a decade to support the global data centers powering its search engine. But there are some applications that don’t require that level of performance, and so Google is now offering a cheaper networking service that uses the transit networks that deliver the bulk of traffic to internet service providers, said Prajakta Joshi, product manager for cloud networking at Google.

  8. Tomi Engdahl says:

    Google Unveils a New, Cheaper Networking Option For Cloud Customers: the Public Internet
    https://tech.slashdot.org/story/17/08/23/1910252/google-unveils-a-new-cheaper-networking-option-for-cloud-customers-the-public-internet?utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+Slashdot%2Fslashdot%2Fto+%28%28Title%29Slashdot+%28rdf%29%29

    Google has long argued that one of the best reasons to use its public cloud service is the strength of its fiber network, developed and enhanced for more than a decade to support the global data centers powering its search engine. But there are some applications that don’t require that level of performance, and so Google is now offering a cheaper networking service — costing between 24 and 33 percent less — that uses the transit networks that deliver the bulk of traffic to internet service providers.

    Google unveils a new, cheaper networking option for cloud customers: the public internet
    https://www.geekwire.com/2017/google-unveils-new-cheaper-networking-option-cloud-customers-public-internet/

  9. Tomi Engdahl says:

    Paul Sawers / VentureBeat:
    Google adds a firewall to its App Engine platform, letting developers restrict apps to certain users or regions, available in beta

    Google launches App Engine firewall so developers can easily restrict specific IP addresses
    https://venturebeat.com/2017/08/24/google-launches-app-engine-firewall-to-make-it-easier-for-developers-to-restrict-access-from-specific-ip-addresses/

    Google has announced a new firewall feature that allows developers and administrators using App Engine to easily restrict access from specific traffic sources.

    Google App Engine (GAE), a managed platform within the broader Google Cloud suite of services that competes with the likes of Amazon Web Services (AWS) and Microsoft Azure, is a web framework and platform for developing and scaling apps hosted on Google’s cloud.

    Whether during testing or for other reasons entirely, developers may wish to open up new apps to just a few specific groups of users or perhaps prevent certain regions from accessing them. It is already possible to restrict access based on IP address, but this requires implementing access controls within the code, and even then the requests are still allowed in the door, which not only consumes resources but can also cost companies money for traffic they don’t need or want. With the new Google App Engine firewall, launching today in beta, developers provide a set of rules through the App Engine Admin API, Google Cloud Console, or gcloud command-line tool and specify the IP addresses to block or allow. And that’s about it.
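
    As a rough sketch of the first-match-wins semantics such priority-ordered IP rules typically follow — this illustrates the concept only, not the actual App Engine firewall API, and the rule list and addresses are hypothetical:

    ```python
    import ipaddress

    # Hypothetical rule list in the spirit of App Engine firewall rules:
    # evaluated in priority order, first match wins, default action last.
    rules = [
        (100, "deny",  "203.0.113.0/24"),   # block a test range
        (200, "allow", "198.51.100.0/24"),  # allow a partner network
    ]
    DEFAULT_ACTION = "allow"

    def check(ip: str) -> str:
        """Return the action for an incoming IP address."""
        addr = ipaddress.ip_address(ip)
        for _priority, action, cidr in sorted(rules):
            if addr in ipaddress.ip_network(cidr):
                return action
        return DEFAULT_ACTION

    print(check("203.0.113.7"))    # -> deny
    print(check("198.51.100.10"))  # -> allow
    print(check("192.0.2.1"))      # -> allow (default)
    ```

    The point of enforcing this at the platform edge, rather than in application code, is that blocked requests never reach the app and so never consume billable resources.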

  10. Tomi Engdahl says:

    VMware has cracking Q2, explains how it will beat Azure Stack
    vSphere sales forecast changed from long-term-decline to long-term-flat
    https://www.theregister.co.uk/2017/08/25/vmware_q2_2018/

  11. Tomi Engdahl says:

    VMware Cloud is now live on AWS — and IT pros just did a little happy dance
    https://techcrunch.com/2017/08/28/vmware-cloud-is-now-live-on-aws-and-it-pros-just-did-a-little-happy-dance/?utm_source=tcfbpage&sr_share=facebook

    When VMware announced it was partnering with AWS last fall, it turned more than a few enterprise heads. After all, we’re talking about one company that dominates virtual machines on-prem, and the other in the public cloud. Together, the two companies make a powerful combination

  12. Tomi Engdahl says:

    Synergy Research Group:
    SRG: enterprise SaaS market up 31% YoY in quarterly revenue to almost $15B; Microsoft still market leader, followed by Salesforce, Adobe, Oracle, SAP

    Microsoft Leads in SaaS Market; Salesforce, Adobe, Oracle and SAP Follow
    https://www.srgresearch.com/articles/microsoft-leads-saas-market-salesforce-adobe-oracle-and-sap-follow

    New Q2 data from Synergy Research Group shows that the enterprise SaaS market grew 31% year on year to reach almost $15 billion in quarterly revenues, with collaboration being the highest growth segment. Microsoft remains the clear leader in overall enterprise SaaS revenues, having overtaken long-time market leader Salesforce a year ago. Microsoft was already rapidly growing its SaaS revenues, but in Q2 its acquisition of LinkedIn gave its SaaS business a further boost. In terms of overall SaaS market rankings, Microsoft and Salesforce are followed by Adobe, Oracle and SAP, with other leading companies including ADP, IBM, Workday, Intuit, Cisco, Google and ServiceNow. It’s notable that the market remains quite fragmented, with different vendors leading each of the main market segments. Among the major SaaS vendors those with the highest overall growth rates are Oracle, Microsoft and Google.

  14. Tomi Engdahl says:

    Barb Darrow / Fortune:
    HPE plans to acquire Boston-based Cloud Technology Partners, which helps clients deploy cloud computing on multiple vendors, to be HPE’s 5th acquisition in 2017 — Hewlett-Packard Enterprise said Monday that it will acquire Cloud Technology Partners, a Boston-based company …

    HPE Shopping Spree Continues With Purchase of This Cloud Specialist
    http://fortune.com/2017/09/05/hpe-buys-cloud-technology/

    Hewlett-Packard Enterprise said Tuesday that it will acquire Cloud Technology Partners, a Boston-based company that helps business customers plan and build cloud computing capabilities.

    Terms of the deal were not disclosed.

    Seven-year-old CTP works with businesses to determine which cloud technology—be it from Microsoft, Amazon Web Services, Google, or the non-vendor aligned OpenStack—is best for the customer’s needs. It then helps corporate customers plan out how they will run their information technology on that cloud (or clouds, if spread out across multiple vendors).

  15. Tomi Engdahl says:

    VMware and AWS launch VMware Cloud on AWS hybrid service
    http://www.lightwaveonline.com/articles/2017/09/vmware-and-aws-launch-vmware-cloud-on-aws-hybrid-service.html?cmpid=enl_lightwave_lightwave_datacom_2017-09-05

    VMware, Inc. (NYSE: VMW) and Amazon Web Services, Inc. (AWS), an Amazon.com company (NASDAQ: AMZN), have announced availability of VMware Cloud on AWS. The integrated hybrid offering, the outcome of a strategic alliance VMware and AWS announced in October 2016, will provide VMware’s software-defined data center (SDDC) to the AWS Cloud. Customers will be able to run applications across operationally consistent VMware vSphere-based private, public, and hybrid cloud environments, with optimized access to AWS services, the companies say.

  16. Tomi Engdahl says:

    Microsoft, Adobe advance partnership with new cross-cloud productivity integrations
    http://www.zdnet.com/article/microsoft-adobe-advance-partnership-with-new-cross-cloud-productivity-integrations/

    Microsoft and Adobe are continuing to integrate more of their cloud services — such as Microsoft Teams and Adobe’s Sign e-signature service — across each other’s portfolios.

  17. Tomi Engdahl says:

    Jordan Novet / CNBC:
    AWS says that beginning October 2, it will start charging EC2 customers by the second, not hour; competitors Azure and Google Cloud charge by the minute

    Amazon Web Services will now charge by the second, its biggest pricing change in years
    https://www.cnbc.com/2017/09/18/aws-starts-charging-for-ec2-by-the-second.html

    The move comes four years after Google outdid AWS with per-minute pricing.
    Historically, AWS has charged by the hour for its EC2 cloud computing service.

    The move is historically significant. Since AWS became available in 2006, it has charged by the hour. Then, in 2013, Alphabet’s Google, which had introduced its direct competitor to AWS a year earlier, said it would start charging by the minute, after a 10-minute minimum. Microsoft’s Azure followed suit shortly thereafter.

    Now Amazon is hitting back by becoming even more granular when it comes to making people pay only for the computing resources they use, with a one-minute minimum.

    The price change is only applicable for Linux virtual machines, AWS’ chief evangelist, Jeff Barr, wrote in a blog post.

    While the per-second pricing could mean companies will end up paying less money for certain workloads, the change might also lead companies to be more experimental with their use of EC2 instances for certain types of computing.

    New – Per-Second Billing for EC2 Instances and EBS Volumes
    https://aws.amazon.com/blogs/aws/new-per-second-billing-for-ec2-instances-and-ebs-volumes/

  18. Tomi Engdahl says:

    IBM packs 120TB into a carry-on bag, for snow-balling cloud uploads
    Whither the rapid file transfer app Aspera that Big Blue acquired in 2014?
    https://www.theregister.co.uk/2017/09/19/ibm_cloud_mass_data_migration_appliance/

    IBM’s decided to join AWS and Google in the appliances-to-haul-data-into-the-cloud market, by launching an appliance called “IBM Cloud Mass Data Migration”.

    Readers familiar with AWS’ Snowball or the Google Transfer Appliance already know what this box is all about: there’s 120TB of disk inside, a couple of Ethernet ports and the plan is you’ll hire one from Big Blue and fill it full of your stuff.

    The appliance uses AES 256 encryption and RAID-6 data protection, so your data should survive the overnight UPS trip from your premises to IBM’s nearest cloud data centre. Once it arrives, Big Blue will upload it all into the cloud rather faster than would be possible if you tried to upload it over your internet connection.

    At US$395 per hire, it will also likely be rather cheaper than wide-area-over-the-wire transfer. It’s a US-only offer for now, but IBM promises “regional and global expansion coming soon in the EU.”
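
    To see why shipping disks beats the wire, a quick estimate of how long pushing 120TB takes over even a dedicated 1Gbit/s link (a best case that ignores protocol overhead; real shared links would be slower):

    ```python
    def transfer_days(terabytes: float, megabits_per_second: float) -> float:
        """Days needed to push a dataset over a link at the given line
        rate, ignoring protocol overhead (so this is a best case)."""
        bits = terabytes * 1e12 * 8
        seconds = bits / (megabits_per_second * 1e6)
        return seconds / 86400

    # 120 TB over a dedicated 1 Gbit/s line:
    print(round(transfer_days(120, 1000), 1))  # -> 11.1
    ```

    An overnight courier trip plus a fast datacenter-side upload comfortably beats eleven days of saturating a gigabit link.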

  19. Tomi Engdahl says:

    Oracle promises SLAs that halve Amazon’s cloud costs
    Larry Ellison also pledges ‘Autonomous Database’ to cut the cost of – gulp – the people who run databases
    https://www.theregister.co.uk/2017/09/20/oracle_cloud_pricing/

    Oracle chair and chief technology officer Larry Ellison has pledged to undercut Amazon Web Services pricing by 50 per cent for infrastructure-as-a-service and platform-as-a-service, in part by increasing use of automation.

    Big Red staged a Cloud event on Tuesday, at which Ellison said that the primary cost of running platform-as-a-service (PaaS) is labour and opined that human involvement in running databases or middleware is best avoided.

    “The way we want to compete in PaaS is to deliver a high degree of automation to our customers,” he said. “By automating a lot of those services, we reduce the amount of labour a customer would need to expend to run the database or run the middleware and also reduce the amount of human error associated with that labour.”

    “How expensive could an error be?” Ellison asked, then answered by saying “I don’t know … if you don’t patch the database at Equifax, that could be expensive.”

  20. Tomi Engdahl says:

    Brian Jackson / IT World Canada:
    Manifold, which enables developers to find, buy, and manage several cloud services without being locked into a single cloud platform, announces $15M Series A

    Jevon MacDonald returns with new startup, Manifold, ‘by developers for developers’
    https://www.itworldcanada.com/article/jevon-macdonald-returns-with-new-startup-manifold-by-developers-for-developers/396751

    As public cloud infrastructure providers have become more popular, on-premises vendors like to joke that they are like Hotel California for your data: you can never leave. Today, a startup that is co-headquartered in Halifax and San Francisco is launching with a solution to that problem.

    Manifold provides a way for developers to find, buy, and manage services across multiple cloud providers through a single interface. At the helm of the new startup is Canadian entrepreneur Jevon MacDonald, who previously launched GoInstant.

  21. Tomi Engdahl says:

    Manifold Makes Managing Cloud Developer Services Easy
    http://www.linuxjournal.com/content/manifold-makes-managing-cloud-developer-services-easy?utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+linuxjournalcom+%28Linux+Journal+-+The+Original+Magazine+of+the+Linux+Community%29

    We love it here when superheroes drop their cloak of invisibility, emerge from stealth mode and reveal themselves to the world. Of course we do—it’s the geek in us! Manifold has just done exactly that, emerged from stealth mode and is claiming to be the easiest way to find, buy and manage essential developer services.

    Manifold, cloud-agnostic, is working to redefine the developer services ecosystem with a platform that allows developers to find, buy and manage their favorite services easily—from email to logging—without being locked in to any single cloud platform. With Manifold, developers no longer are restricted by the confines of any particular cloud. Instead, they can create stacks tailored specifically for their project needs.

    “The modern development stack is complex. Until now, there has been no easy way for developers to discover and manage the mix of services needed to create modern applications without resorting to the one-size-fits-all offerings of the monoclouds”,

  22. Tomi Engdahl says:

    Dell EMC, Veeam eagerly clamber onto Microsoft’s Azure Stack: I love it more, no, I love it more
    Azure Stack and Hyper-V support ignition for MS
    https://www.theregister.co.uk/2017/09/28/dell_emc_and_veeam_azure_stack/

    Orlando’s Microsoft Ignite conference saw Dell EMC and Veeam worshipping at the cloudy on-premises Azure Stack shrine with protection support offerings and more.

    Dell EMC already supports Microsoft’s Azure Stack with Dell servers and networking and EMC storage running Microsoft’s public cloud in an on-premises incarnation. That’s sold by its own sales force.

    It has now laid three extra offerings at the feet of the Azure Stack gods:

    XC Series support for Hyper-V and Azure Stack,
    channel can sell Dell EMC Cloud for Microsoft Azure Stack, and
    Dell EMC Ready Bundle for Microsoft SQL Server.

    The XC Series are hyper-converged systems OEM’d from Nutanix and sold by Dell EMC alongside its in-house VxRail and Rack hyperconverged systems, which are vSphere-centric.

  23. Tomi Engdahl says:

    Oracle promises ‘highly automated’ security in self-driving database
    Larry Ellison is keen on ‘Anything we can possibly do to reduce human intervention’
    https://www.theregister.co.uk/2017/10/02/oracle_openworld_2017_larry_ellison_keynote_day_one/

    OPENWORLD 2017 Oracle has kicked off its annual OpenWorld conference with a pledge of heavy automation in the company’s “autonomous” database, plus plenty of snark directed at Amazon Web Services.

    Jeans-toting CTO Larry Ellison kicked off Big Red’s four-day San Franciscan extravaganza with a not-so-slick presentation that had no real big surprises.

    The main item on the agenda was the firm’s upcoming autonomous database, with Ellison offering up a smidge more detail than he did when teasing it during a recent announcement on cloud pricing.

    “If you eliminate human behaviour, you eliminate human error,” the CTO said. “My autopilot flies my plane a lot better than I do,” he added, in a bid to make sure we all know he has a jet.

    The planned cyber security offering – details of which attendees were told will emerge on Tuesday – will see the database use machine learning technologies to detect when it is being attacked.

    It will then automatically patch itself, rather than waiting for a human to schedule downtime.

    “It’s our computer versus their computers in cyber warfare, and we have to have a lot better computers, and more automation if we’re going to defend our data,” Ellison said.

    In a dig at the recent Equifax scandal, Ellison said: “The worst data thefts in history have occurred after a patch was available to prevent the theft. The patches just weren’t applied; how is that possible?”

    He later said that in that situation, “someone lost their job”, before adding that it wasn’t just the CEO in the firing line: “Nobody is safe.”

  24. Tomi Engdahl says:

    5 obstacles holding your hybrid cloud strategy back
    http://www.cloudpro.co.uk/cloud-essentials/hybrid-cloud/7074/5-obstacles-holding-your-hybrid-cloud-strategy-back

    Overcoming these challenges is key to a successful cloud migration strategy

    Several obstacles need to be taken into account, aside from the well-known concerns about security. Planning to overcome these issues should be an essential part of a cloud migration strategy.

    Complexity: With the explosion of web services, mobile devices and new technologies, managing the complexity of an expanding data centre environment across a hybrid cloud is a significant challenge. Choosing the right service offerings at the right service levels on different data management frameworks across a blend of cloud resources can be daunting.

    IT agility: IT service delivery is about meeting the needs of the business. As those needs change, IT must adapt and respond quickly. For years, IT organisations have been working towards developing agility within the data centre.

    Data control: Businesses building their own data centres and private clouds can retain control of their data. Extending that environment to the public cloud necessitates giving up some control of infrastructure and applications, but the responsibility to control business data must remain firmly with the organisation.

    Skills: If a skills gap exists in an organisation, that shouldn’t be seen as an obstacle to moving to a hybrid cloud model. Selecting the right consulting services partner can help fill the gap.

    Vendor lock-in: Although choosing a cloud service provider to complement a set of IT services is a means to deliver a flexible, dynamic environment, it doesn’t necessarily mean ongoing flexibility among different cloud providers. For many organisations, cloud provider lock-in can be a significant hurdle.

  25. Tomi Engdahl says:

    Mainframes are hip now! Compuware fires its dev environment into cloud
    But analysts say good luck convincing newcomers
    https://www.theregister.co.uk/2017/10/02/compuware_shows_off_shiny_new_mainframe_cloud_ide_toy/

    In an attempt to entice new blood to those dinosaur systems of record known as mainframes, Detroit software firm Compuware has moved its development environment to the cloud.

    The company’s flagship mainframe Agile/DevOps product Topaz is now available on Amazon Web Services.

    As Compuware and competitors such as IBM and Micro Focus have realised, in addition to the challenge of convincing firms to take on high licensing usage fees, there’s a mainframe developer shortage. The industry has struggled to convince students to learn ancient programming languages such as COBOL.

    Compuware Introduces Cloud Access to Mainframe Development
    Topaz Availability on AWS is Industry-first, Transforming Enterprises’ Ability to Quickly Modernize COBOL
    https://globenewswire.com/news-release/2017/10/02/1138589/0/en/Compuware-Introduces-Cloud-Access-to-Mainframe-Development.html

  26. Tomi Engdahl says:

    Hierarchical Storage Management is back. And this time it’s cloudy
    KompriseCloud can now shunt data between different cloud storage operators
    https://www.theregister.co.uk/2017/10/03/komprise_supports_colder_public_cloud_tiers_for_older_data/

    Data management software vendor Komprise has added extra cold cloud tiers to its data lifecycle manager, which moves data to slower access storage tiers without affecting its accessibility.

    The company has also introduced analytics to its metadata library, data confinement and ONTAP 9 support.

    Komprise’s software moves data between tiers without leaving stubs. It instead uses dynamic links, making search, management and access easier than before.

    With this announcement the Komprise Intelligent Data Management (KIDM) product moves to v2.6. It already supports AWS S3 and Google Nearline as public cloud tiers and adds colder data tiers behind them; S3 Infrequent Access and Glacier from Amazon, and Coldline for Google, calling this cross-cloud tiering.
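    The age-based tiering described above can be sketched as a simple policy table that maps how long ago data was last accessed to a target storage tier. The thresholds and tier ordering below are illustrative assumptions, not Komprise’s actual defaults:

    ```python
    from datetime import datetime, timedelta

    # Hypothetical thresholds (days since last access) -> target tier.
    # Real lifecycle policies are configurable; these values are made up.
    TIER_POLICY = [
        (30, "S3 Standard"),            # hot data stays on the fast tier
        (90, "S3 Infrequent Access"),   # warm data moves to a cheaper tier
        (None, "Glacier"),              # anything older goes to the coldest tier
    ]

    def pick_tier(last_access: datetime, now: datetime) -> str:
        """Map a file's last-access time to a storage tier."""
        age_days = (now - last_access).days
        for limit, tier in TIER_POLICY:
            if limit is None or age_days <= limit:
                return tier

    now = datetime(2017, 10, 3)
    print(pick_tier(now - timedelta(days=10), now))   # recently used, stays hot
    print(pick_tier(now - timedelta(days=400), now))  # cold, goes to Glacier
    ```

    The key design point is that the policy only consumes metadata (access times), so files can be scanned and migrated without touching their contents.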

    Reply
  27. Tomi Engdahl says:

    Google backs up Firebase with a second realtime NoSQL silo
    Cloud Firestore aspires to scale better
    https://www.theregister.co.uk/2017/10/04/google_backs_up_firebase_with_a_second_realtime_data_store/

    Google’s twin fetish manifested itself in its Firebase platform-as-a-service offering on Tuesday through the introduction of a second realtime NoSQL database.

    Firebase has been growing in popularity among mobile and web developers thanks in part to the Firebase Realtime Database, which provides a way to synchronize data across multiple clients more or less instantaneously.

    This turns out to be quite useful for games or social apps, where you want everyone to be on the same page, and isn’t particularly easy to do when users may be located all over the world.

    Realm offers something similar called the Realm Mobile Database.

    And now Google has something similar too, Cloud Firestore, a doppelgänger with distinctions.

    Google exhibits double vision fairly often. It backs up products the way people back up data. It has two cloud platforms (Google Cloud Platform and the mobile-oriented Firebase). It maintains two operating systems (Android and ChromeOS, not to mention the nascent Fuchsia), two email services (Gmail and Inbox, incestuous though they may be), and two chat applications (Allo and Hangouts). It developed two programming languages (Go and Dart). And for a time, Google Video co-existed with YouTube.

    Cloud Firestore has been designed to address the shortcomings of Firebase Realtime Database: data structuring, querying, and scaling.

    Toward that end, it’s more structured. Where Firebase stores data in a single JSON tree, Cloud Firestore relies on a hierarchical document model. Data gets stored as key-value pairs in objects called documents that are organized into collections, which can contain subcollections.
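    The hierarchical document model described above can be sketched with a toy in-memory structure. These classes are hypothetical, for illustration only, and are not the real Cloud Firestore SDK:

    ```python
    # Toy model of the collection/document hierarchy: collections hold
    # documents, documents hold key-value fields plus named subcollections.
    class Document:
        def __init__(self):
            self.fields = {}           # key-value pairs
            self.subcollections = {}   # name -> Collection

        def collection(self, name):
            return self.subcollections.setdefault(name, Collection())

    class Collection:
        def __init__(self):
            self.documents = {}        # document id -> Document

        def doc(self, doc_id):
            return self.documents.setdefault(doc_id, Document())

    # Example: a chat app where each room document owns a "messages"
    # subcollection, instead of one giant JSON tree.
    rooms = Collection()
    room = rooms.doc("room-1")
    room.fields["topic"] = "cloud databases"
    msg = room.collection("messages").doc("msg-1")
    msg.fields["text"] = "hello"
    ```

    The contrast with a single JSON tree is that a query against one collection (say, a room’s messages) never has to traverse or download sibling subtrees.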

    Reply
  28. Tomi Engdahl says:

    Oracle Announces New Cloud Security Services
    http://www.securityweek.com/oracle-announces-new-cloud-security-services

    Oracle announced this week at the company’s OpenWorld convention the launch of new cloud security services and improvements to existing products.

    One of the new offerings is the Oracle Identity Security Operations Center (SOC), a context-aware intelligence and automation solution designed to help organizations detect and respond to sophisticated threats targeting users, applications, data and cloud workloads.

    The Identity SOC leverages the newly released Oracle Security Monitoring and Analytics Cloud Service, which provides security incident and event management (SIEM) and user and entity behavioral analytics (UEBA) capabilities.

    Two other major components of the Identity SOC are the Oracle CASB (Cloud Access Security Broker) Cloud Service, which enables organizations to protect business-critical cloud infrastructure and data, and the Oracle Identity Cloud Service, described by the company as a “next-generation comprehensive security and identity platform.”

    Reply
  29. Tomi Engdahl says:

    Say Hi to Subutai
    http://www.linuxjournal.com/content/say-hi-subutai

    What Is Subutai?

    Subutai is an open-source project and platform that lets anyone share, barter or rent computer resources to create clouds from the edge rather than centralized locations. Available devices can attach to these clouds hovering on the edge. We started calling it Social Cloud Computing, but technically, Subutai is a dynamic p2p multi-cloud made possible thanks to Lightweight Linux Containers and software-defined networking. Think Amazon’s Virtual Private Cloud, but running on your computers and the computers of social contacts who share their computer resources with you. Or, think AirBnB on computers for the people’s cloud.

    Subutai partners with the Digital Life Collective, a member co-operative that researches, develops, funds and supports what we call “tech we trust”—those technologies that put the individual’s autonomy, privacy and dignity first, or that support those technologies that do. Our tech, not their tech.

    How Does It Work?

    You set how much of your computers’ resources you’re willing to share with others. Rules and quotas are used to share with contacts from your social-media accounts. Once your network of friends, family and colleagues share with you, the stage is set to create clouds across shared computer resources.

    When someone creates a cloud, peer computers authorized to share resources with the cloud’s owner swarm together (like bees) to form an n-way virtual private network (VPN). A peer is a group of computers with resources that can be shared with others. A peer can be a rack of computers or a single virtual machine running on your laptop.

    Peers contribute resources into the VPN as Linux container hosts. Whatever the underlying hardware, operating system or virtualization technology, resources are presented canonically to environments as containers. The VPN provides secure connectivity between these containers across the internet.

    Template containers can be fired up based on Docker images to install infrastructure rapidly.

    Subutai is ready and mature.

    There’s a lot more to Subutai than just software. We’ve designed a broadband modem that is also a turnkey Subutai Appliance and can be used for IoT applications. We call it the Liquid Router because we made it the Swiss army knife of IoT gateways: it has Raspberry Pi, Arduino and PMod headers. Yeah, we’re f’ing crazy, but the cloud router/IoT gateway will soon do something no other broadband router can do: it will effortlessly allow average broadband users to mine for Subutai’s cryptocurrency while sharing, bartering or renting computer resources. Hence, it’s obviously also a physical cryptocurrency wallet.

    Conquer The Cloud
    https://subutai.io/

    Your next personal computer won’t be just another laptop, tablet, or wearable device: it will be EVERYTHING surrounding you, connected fluidly by adaptive peer-to-peer clouds driven by social interaction!

    The Subutai platform lets you easily mine for cryptocurrency and share, barter, or rent computing resources without having to be a programming expert.

    Subutai helps connect individuals from all over the globe so they may share resources to create secure cloud environments across their peers. Just as a cloud can float down onto land and is thus called fog, IoT devices attach when available and needed to your cloud as it floats around with you.

    Reply
  30. Tomi Engdahl says:

    10 years on, Amazon CTO reflects on DynamoDB launch
    Vogels recounts how Dynamo paper became AWS NoSQL giant
    https://www.theregister.co.uk/2017/10/06/10_years_on_amazons_cloud_chief_reflects_on_dynamodb_launch/

    Amazon CTO Werner Vogels this week marked the 10th anniversary of his Project Dynamo whitepaper, the blueprint for what would become the DynamoDB platform.

    The paper [PDF], presented in October 2007 at the ACM Symposium on Operating Systems Principles, describes an Amazon-designed backend system to overcome the weaknesses of its Oracle database.

    “We prioritized focusing on requirements that would support high-scale, mission-critical services like Amazon’s shopping cart, and questioned assumptions traditionally held by relational databases such as the requirement for strong consistency,” Vogels explained.

    “Our goal was to build a database that would have the unbounded scalability, consistent performance and the high availability to support the needs of our rapidly growing business.”

    http://www.allthingsdistributed.com/files/amazon-dynamo-sosp2007.pdf
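    One of the paper’s core ideas is quorum-based replication: each key is stored on N replicas, a write waits for W acknowledgements and a read for R replies, and choosing R + W > N guarantees that every read quorum overlaps every write quorum, so a read contacts at least one replica holding the latest acknowledged write. A minimal sketch of that arithmetic:

    ```python
    # Dynamo-style quorum check: with N replicas, W write acks and R read
    # replies, overlap (R + W > N) means reads intersect the latest write.
    def quorums_overlap(n: int, r: int, w: int) -> bool:
        return r + w > n

    # A configuration discussed in the paper: N=3, R=2, W=2.
    print(quorums_overlap(3, 2, 2))  # overlapping quorums
    print(quorums_overlap(3, 1, 1))  # no overlap: eventual consistency only
    ```

    Lowering R or W below the overlap threshold trades consistency for latency and availability, which is exactly the trade-off Vogels describes making for the shopping cart.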

    Reply
  31. Tomi Engdahl says:

    Amazon, Azure, Google will eat all the IT. Google, let us be your cake fork, pleads Nutanix
    3 IT giants – just 1 on-prem/hybrid stack partner opening…
    https://www.theregister.co.uk/2017/10/06/nutanix_google_amazon_azure/

    Analysis An IT trio of giants will soon dominate the market – and there appears to be one on-premises/hybrid stack partner opening, Nutanix told us.

    A conversation with Sudheesh Nair, Nutanix’s president, revealed the firm’s mindset: it envisages three great beasts in the IT jungle – Amazon, Azure and Google – each with their public cloud platforms and easy-to-provision, scale and pay data centre services.

    These are moving to providing operational functions as an abstraction layer on top of operational components, such as databases and servers, and, underneath them, raw compute and various tiers of storage.

    Enterprises have increasingly looked at the public cloud with fondness, and hybrid on-premises and public cloud IT is the new normal.

    On-premises IT isn’t going away, but its boundaries are blurring as hybrid IT suppliers try to make the cloud on-premises-like in some ways, and the on-premises world cloud-like, offering cloud-style provisioning, payment, scaling and relative simplicity.

    Reply
  32. Tomi Engdahl says:

    Amazon beams: We’re best cloud buds with General Electric
    Hey, but aren’t you forgetting *cough* Azure? *cough*
    https://www.theregister.co.uk/2017/10/06/amazon_brags_it_is_general_electrics_preferred_cloud_provider/

    Amazon boasts that it is the “preferred” cloud provider for General Electric, but, in this multi-cloud world, that’s not quite the case.

    GE began moving its applications to the cloud in 2014, with AWS taking the bulk of the dollars. The company has migrated “more than 2,000 applications”, we’re told.

    The thing Amazon left out of its marketing materials is that GE is also working with Microsoft. In July, Redmond announced the energy firm was making its industrial applications development platform, Predix, available on Azure.

    A GE spokesperson told The Register: “GE is migrating its IT applications to AWS, which allows us to focus on building strong internal apps, instead of running our own data centers. The Predix platform already runs on AWS.

    “For the Predix platform, our partnership with Microsoft continues as planned. By bringing Predix to Azure, we are helping our mutual customers connect their IT and OT systems to drive actionable intelligence from data. With Microsoft, we have also integrated with tools such as Power BI (which we demonstrated at our Minds + Machines event last year) and have a roadmap for further technical collaboration and integration with additional Microsoft services that we will share later this month.”

    According to Gartner research, the Infrastructure-as-a-Service market grew from $16.8bn in 2015 to $22.1bn in 2016, up 31 per cent. Amazon had about $9.8bn of IaaS public cloud revenue in 2016, a 44.2 per cent market share, while Microsoft took in about $1.6bn for a 7.1 per cent share.

    Reply
  33. Tomi Engdahl says:

    The public cloud and SaaS are growing at a rapid pace

    The market for public cloud services will grow by 18.5 per cent this year. According to Gartner, software sold specifically as a service (SaaS) is now growing faster than expected.

    SaaS sales will reach $58.6 billion this year. Strong demand is also lifting the overall cloud market faster than previously believed.

    The biggest cloud computing providers are Microsoft, Amazon and, in China, Alibaba. These names will not come as a surprise to anyone.

    According to Gartner, the public cloud market will grow to $305.8 billion next year, to over $355 billion the year after, and to $411 billion in 2020. Within a few years, the cloud market is expected to grow larger than the semiconductor market.

    Source: http://etn.fi/index.php?option=com_content&view=article&id=6993&via=n&datum=2017-10-12_14:49:00&mottagare=30929

    Reply
