Cloud Trends for 2018

https://www.linkedin.com/pulse/10-cloud-trends-2018-walter-/?trackingId=kll0IUQZE1Tr7oZAPhYVwg%3D%3D&lipi=urn%3Ali%3Apage%3Ad_flagship3_feed%3B9CwvlfYtSDGDz8xsBlHxRg%3D%3D&licu=urn%3Ali%3Acontrol%3Ad_flagship3_feed-object

Here are some cloud trends for you. 

46 Comments

  1. Tomi Engdahl says:

    5 cloud computing trends for 2018
    https://enterprisersproject.com/article/2017/12/5-cloud-computing-trends-2018?sc_cid=7016000000127ECAAY

    What can you expect in the year ahead? Multi-cloud and containers loom large, IT leaders and cloud experts say

    1. Multi-cloud goes mainstream – and open source is the portability enabler

    2. IT turns from cloud adoption to cloud optimization

    3. Hybrid cloud expands and powers related trends

    4. Containers, orchestration, and microservices join the “A” list

    5. CIOs learn to scale DevOps in a lean way

  2. Tomi Engdahl says:

    Why you should pick strong consistency, whenever possible
    https://cloudplatform.googleblog.com/2018/01/why-you-should-pick-strong-consistency-whenever-possible.html?m=1

    Do you like complex application logic? We don’t either. One of the things we’ve learned here at Google is that application code is simpler and development schedules are shorter when developers can rely on underlying data stores to handle complex transaction processing and keeping data ordered. To quote the original Spanner paper, “we believe it is better to have application programmers deal with performance problems due to overuse of transactions as bottlenecks arise, rather than always coding around the lack of transactions.”1
    Put another way, data stores that provide transactions and consistency across the entire dataset by default lead to fewer bugs, fewer headaches and easier-to-maintain application code.
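    The "let the data store handle it" point can be sketched with any transactional database. Below is a minimal illustration using Python's built-in sqlite3; the account table and amounts are invented purely for the example:

```python
import sqlite3

# In-memory database with two accounts; schema and balances are
# invented purely to illustrate transactional behaviour.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (name TEXT PRIMARY KEY, balance INTEGER)")
conn.executemany("INSERT INTO accounts VALUES (?, ?)",
                 [("alice", 100), ("bob", 50)])
conn.commit()

def transfer(conn, src, dst, amount):
    """Move money atomically: either both rows change or neither does."""
    with conn:  # opens a transaction; commits on success, rolls back on error
        cur = conn.execute(
            "UPDATE accounts SET balance = balance - ? "
            "WHERE name = ? AND balance >= ?",
            (amount, src, amount))
        if cur.rowcount != 1:
            raise ValueError("insufficient funds")
        conn.execute("UPDATE accounts SET balance = balance + ? WHERE name = ?",
                     (amount, dst))

transfer(conn, "alice", "bob", 30)

try:
    transfer(conn, "alice", "bob", 1000)  # fails; transaction rolled back
except ValueError:
    pass

balances = dict(conn.execute("SELECT name, balance FROM accounts"))
```

    The `with conn:` block is the whole point: the application never writes compensation logic for the partial-failure case, which is exactly the kind of hand-rolled code the Spanner quote argues against.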

  3. Tomi Engdahl says:

    Big Data 2018: 4 Reasons To Be Excited, 4 Reasons To Be Worried
    https://www.eetimes.com/document.asp?doc_id=1332835

    Four reasons to be excited:

    Machine-learning methods become more accessible
    Data will not be in short supply
    Big data tools reach more effectively into the enterprise
    Infrastructure rises to support big data volume and velocity

    Four reasons to be worried:

    Necessary skills are in critically short supply
    Privacy concerns become actionable
    Data interoperability remains limited
    Security flaws threaten data integrity

  4. Tomi Engdahl says:

    Solving Big Problems with Big Data
    With solutions-focused pragmatism and an engineering mindset, Luís Amaral identifies intersections and patterns to understand complex, seemingly unrelated issues.
    http://www.mccormick.northwestern.edu/magazine/fall-2017/solving-big-problems-with-big-data.html

  5. Tomi Engdahl says:

    Over 500 hyperscale data centers globally foreseen by 2019
    http://www.cablinginstall.com/articles/pt/2018/01/over-500-hyperscale-data-centers-globally-foreseen-by-2019.html?cmpid=enl_cim_cim_data_center_newsletter_2018-01-16&pwhid=e8db06ed14609698465f1047e5984b63cb4378bd1778b17304d68673fe5cbd2798aa8300d050a73d96d04d9ea94e73adc417b4d6e8392599eabc952675516bc0&eid=293591077&bid=1974690

    According to a new report by Synergy Research, by the end of 2017, there were 390 global hyperscale data centers, with 44 percent located in the United States. Coming in second, at eight percent, was China, with Japan and the U.K. at six percent each, and Germany rounding out the pack at five percent.

    Over 500 Hyperscale Data Centers by 2019
    http://www.transformingnetworkinfrastructure.com/topics/hyperscale-data-centers/articles/436387-over-500-hyperscale-data-centers-2019.htm

    Dinsdale anticipates we will see over 500 hyperscale facilities before 2019 ends, with 69 currently in the development stages. Synergy notes that each of the 24 companies tracked operates an average of 16 data center sites. No surprise: Google, Amazon, Microsoft and IBM are the largest cloud providers, each operating a minimum of 45 data center locales, with at least three per region (North America, Latin America, APAC, and EMEA). Oracle and Alibaba will also have a large data center presence, per the report.

  6. Tomi Engdahl says:

    DCD>Energy Smart highlights EU data center heat recovery
    http://www.datacenterdynamics.com/events/dcdenergy-smart-highlights-eu-data-center-heat-recovery/99569.article

    DCD’s event at the Stockholm Brewery Conference Center in March 2018, will help the digital infrastructure industry deal with rising energy demands.

    Recycling generates profits

    “Through the adoption of various energy efficiency measures, the data center industry together with the energy utilities can build scalable, flexible, and green data centers which are dynamic in their infrastructure,” says Jan Sjögren, head of global ICT centers building operations at Ericsson who will be speaking at the event. “There is a great opportunity for the data center to recycle their waste heat, where we can potentially save on energy cost whilst generating profits as producers.”

    Across Europe, cities such as Amsterdam, Paris, Odense, Dresden and Stockholm are betting big on this approach, creating a new business model for the tech industry worldwide.

    “Rising energy consumption is of great concern to the data center industry. The trend towards utilizing clean energy will redefine future data center location strategies. With the rise of edge computing, we will see a network of distributed compute. Edge compute close to dense population areas, and large data centers in close proximity to power plants, with re-use of energy, will increasingly benefit operators,” says Tor Björn Minde, CEO at RISE SICS North AB, who will be speaking on this topic at the event.

    “Data centers seek solutions to increase energy efficiency and lower cost”

  7. Tomi Engdahl says:

    DCIM market poised for big growth through 2021: Analyst
    http://www.cablinginstall.com/articles/pt/2018/01/dcim-market-poised-for-big-growth-through-2021-analyst.html?cmpid=enl_cim_cim_data_center_newsletter_2018-01-16&pwhid=e8db06ed14609698465f1047e5984b63cb4378bd1778b17304d68673fe5cbd2798aa8300d050a73d96d04d9ea94e73adc417b4d6e8392599eabc952675516bc0&eid=293591077&bid=1974690

    According to the report, the global DCIM market accounted for USD 546.00 Million in 2015 and is expected to reach USD 1650.82 Million by 2021, growing at a CAGR of around 20.3% between 2016 and 2021.

    As defined by the report’s summary, “Data center infrastructure management optimizes and serves business assessment of IT infrastructure. The change in the functionality of DCIM is presenting huge market opportunities for several enterprises. Automated asset location, change management, touch-based technology, mobile applications, and asset auto discovery are the latest advances in the DCIM market.”

    Data Center Infrastructure Management (DCIM) Market Poised to Bring in $1650.82 Million by 2021
    Global Data Center Infrastructure Management (DCIM) Market is Expected to Reach USD 1650.82 Million by 2021
    http://www.sbwire.com/press-releases/data-center-infrastructure/release-909208.htm

  8. Tomi Engdahl says:

    AWS brings Go support to its Lambda serverless platform
    https://techcrunch.com/2018/01/16/aws-brings-go-support-to-its-lambda-serverless-platform/?ncid=rss&utm_source=tcfbpage&utm_medium=feed&utm_campaign=Feed%3A+Techcrunch+%28TechCrunch%29&utm_content=FaceBook&sr_share=facebook

    With this move, Lambda now supports Go, JavaScript, Node.js, Java, C# and Python. Google’s Lambda competitor, Cloud Functions, which is still lingering in beta, currently only supports Node.js, while Azure Functions supports C#, JavaScript, F# and Java (with experimental support for Python, PHP, TypeScript, Batch, Bash and PowerShell).

    Go code on Lambda is executed in a standard go1.x runtime and developers can upload their code as a ZIP file through the AWS command line tool or in the Lambda console.
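    The ZIP-upload flow can be sketched in a few lines of Python. The gotcha for the go1.x runtime is that the archive must contain an executable whose file name matches the configured handler; the function name and role below are hypothetical, and the actual upload call is shown only as a comment:

```python
import io
import zipfile

def package_lambda(binary: bytes, handler_name: str = "main") -> bytes:
    """Zip a compiled Go binary the way the Lambda go1.x runtime expects:
    a single executable whose file name matches the configured handler."""
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
        info = zipfile.ZipInfo(handler_name)
        info.external_attr = 0o755 << 16  # the binary must be executable
        zf.writestr(info, binary)
    return buf.getvalue()

# Stand-in for the output of `GOOS=linux go build -o main`; a real
# deployment would read the compiled binary from disk instead.
zip_bytes = package_lambda(b"\x7fELF...placeholder...")
zip_names = zipfile.ZipFile(io.BytesIO(zip_bytes)).namelist()

# The upload itself goes through the AWS CLI, the console, or boto3,
# e.g. (not run here; names are hypothetical):
#   boto3.client("lambda").create_function(
#       FunctionName="hello-go", Runtime="go1.x", Handler="main",
#       Role=role_arn, Code={"ZipFile": zip_bytes})
```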

  9. Tomi Engdahl says:

    6 Ways Cloud Computing Is Transforming Healthcare Systems
    https://www.eetimes.com/author.asp?section_id=36&doc_id=1332854

    Here are six areas in which health clouds are resolving key challenges for the healthcare community.

    The global healthcare cloud computing market is expected to reach $9.48 billion in 2020 from $3.73 billion in 2015 — a 20.5% compound annual growth rate. The market is expected to be dominated by North America, followed by Europe, Asia, and the Rest of the World (RoW).
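    The 20.5% figure is just the compound annual growth rate implied by those two endpoints, which is easy to verify:

```python
# CAGR implied by growth from $3.73B (2015) to $9.48B (2020), i.e. 5 years.
start, end, years = 3.73, 9.48, 5
cagr = (end / start) ** (1 / years) - 1
print(f"{cagr:.1%}")  # → 20.5%
```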

    Access to healthcare: In remote parts of the country and for patients with busy schedules, accessing healthcare is a major issue. Telehealth and virtual care solutions are gaining popularity.

    Medication adherence: Patients not following the drug regimen prescribed by their doctor results in avoidable re-admissions to the hospital, costing the U.S. healthcare system $290 billion annually. The market for products that remind patients when it is time to take their medicine, keep a log, and automatically order refills is expanding at a rapid pace.

    Drug theft and counterfeiting: Theft, counterfeiting, and the sale of expired medicine are some of the problems that can be controlled by monitoring the supply chain.

    Resource inefficiency: The escalating cost of healthcare is always a hot topic among policy makers, and no real solution has been implemented to date. One of the major factors adding to the cost of healthcare is inefficient use of resources such as medical staff and equipment, along with the lack of easy access to patient pools for clinical studies.

    Personal data privacy: Each healthcare organization maintaining its own medical records is a nightmare for data security and for compliance with the Health Insurance Portability and Accountability Act (HIPAA). Not to mention that it adds significant cost for organizations.

    Uniform medical records: Each hospital or care provider using its own Electronic Health Record (EHR) system does not serve the consumer well. Not only does it add cost to the healthcare system, since each hospital has to maintain a different system, but it also makes it harder for a patient to change care providers.

    Cloud computing and healthcare is a match made in heaven to improve the quality of life for our society.

  10. Tomi Engdahl says:

    AWS Lambda Go vs. Node.js performance benchmark: updated
    https://hackernoon.com/aws-lambda-go-vs-node-js-performance-benchmark-1c8898341982

    Just this week AWS announced the release of Go for their Lambda service. This is pretty exciting, as Go straddles a great niche between Java and Node.js with regard to type safety, programming model and performance.

  11. Tomi Engdahl says:

    Google is launching a new digital store to sell cloud-based software
    https://techcrunch.com/2018/01/30/google-is-launching-a-new-digital-store-to-sell-cloud-based-software/?ncid=rss&utm_source=tcfbpage&utm_medium=feed&utm_campaign=Feed%3A+Techcrunch+%28TechCrunch%29&utm_content=FaceBook&sr_share=facebook

    Google is launching a digital store that will offer cloud-based software to companies and other organizations. Bloomberg, which reported the news a bit earlier, notes the move is just the juggernaut’s latest effort to ensure that cloud leaders, and specifically Amazon Web Services, don’t leave the company in the dust.

  12. Tomi Engdahl says:

    Microsoft’s Azure Event Grid hits general availability
    https://techcrunch.com/2018/01/30/microsofts-azure-event-grid-hits-general-availability/?ncid=rss&utm_source=tcfbpage&utm_medium=feed&utm_campaign=Feed%3A+Techcrunch+%28TechCrunch%29&utm_content=FaceBook&sr_share=facebook

    With Event Grid, Microsoft introduced a new Azure service last year that it hopes will become the glue that holds together modern event-driven and distributed applications. Starting today, Event Grid is generally available, with all the SLAs and other promises this entails.

    Using Event Grid, developers can easily connect virtually any service to another, no matter where they run.

    The basic idea here is that Event Grid can route information about an event (say a new file is uploaded to a storage service) and then route that to another service to process this data (maybe for image analysis) — and you can even fan this event notification out to multiple services, too. That’s a core feature for every serverless application.
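    The routing-and-fan-out idea can be sketched in a few lines. The event field names below follow Azure's published Event Grid event schema; the handlers and subject path are hypothetical:

```python
# Minimal fan-out router in the spirit of Event Grid: an event is matched
# by type and delivered to every subscribed handler.
subscriptions = {}  # event type -> list of handlers

def subscribe(event_type, handler):
    subscriptions.setdefault(event_type, []).append(handler)

def publish(event):
    for handler in subscriptions.get(event["eventType"], []):
        handler(event)

results = []
# Two hypothetical consumers of the same storage event: one analyses the
# image, one writes an audit record -- the fan-out the article describes.
subscribe("Microsoft.Storage.BlobCreated",
          lambda e: results.append(("analyze", e["subject"])))
subscribe("Microsoft.Storage.BlobCreated",
          lambda e: results.append(("audit", e["subject"])))

publish({
    "eventType": "Microsoft.Storage.BlobCreated",  # field names per Event Grid schema
    "subject": "/blobServices/default/containers/photos/blobs/cat.jpg",
    "data": {"contentType": "image/jpeg"},
})
```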

  13. Tomi Engdahl says:

    Microsoft buys gaming services startup PlayFab to bolster its Azure platform
    https://techcrunch.com/2018/01/29/microsoft-buys-cloud-gaming-startup-playfab-to-bolster-its-azure-gaming-platform/

    In the latest chapter of GAFAM’s continuing bid to conquer online gaming, Microsoft has acquired PlayFab, which helps game developers launch their titles online more quickly with simplified back-end services. The startup will be integrated into Microsoft’s Azure gaming group.

    The Seattle-based startup had raised around $13 million in funding from investors. Terms of the deal weren’t disclosed.

    “Together, Azure and PlayFab will further unlock the power of the intelligent cloud for the gaming industry, enabling game developers and delighting gamers around the world,” Kareem Choudhry, Microsoft’s corporate VP of gaming, said in a blog post.

    PlayFab offered game developers a platform to host and operate online games and the analytics tools to help understand and monetize users.

    Microsoft acquires PlayFab, accelerating game development innovation in the cloud
    https://blogs.microsoft.com/blog/2018/01/29/microsoft-acquires-playfab-accelerating-game-development-innovation-cloud/

  14. Tomi Engdahl says:

    Google to expand cloud network with new data centers, subsea cables
    http://www.cablinginstall.com/articles/pt/2018/01/google-to-expand-cloud-network-with-new-data-centers-subsea-cables.html?cmpid=enl_cim_cim_data_center_newsletter_2018-01-30&pwhid=e8db06ed14609698465f1047e5984b63cb4378bd1778b17304d68673fe5cbd2798aa8300d050a73d96d04d9ea94e73adc417b4d6e8392599eabc952675516bc0&eid=293591077&bid=1988611

    Google is reportedly expanding its existing cloud network with new data centers this year and new subsea cables in 2019. It will further add data centers to five regions in 2018, including Montreal and the Netherlands in Q1, followed by Los Angeles, Finland and Hong Kong later in the year.

    Google expands network with new data centers, subsea cables
    https://www.cnet.com/news/google-expands-network-with-new-data-centers-subsea-cables/

    Google will build five new regional data centers in 2018 and three subsea cables in 2019 to further grow its worldwide network.

  15. Tomi Engdahl says:

    5 data center trends to watch in 2018
    http://www.cablinginstall.com/articles/pt/2018/01/5-data-center-trends-to-watch-in-2018.html?cmpid=enl_cim_cim_data_center_newsletter_2018-01-30&pwhid=e8db06ed14609698465f1047e5984b63cb4378bd1778b17304d68673fe5cbd2798aa8300d050a73d96d04d9ea94e73adc417b4d6e8392599eabc952675516bc0&eid=293591077&bid=1988611

    1. 2018 is the year of the Edge Data Center — “It’s important to note that the edge is not here to replace the cloud. Edge computing serves as the decentralized extension of the campus networks, cellular networks, data center networks, or the cloud. [Many] edge solutions are designed to complement data center and cloud services.”

    2. All-flash solutions will eliminate a number of design challenges — “All-flash solutions should be in your design considerations if they’re not already. Interviewed customers reported significant power and cooling savings when they replaced legacy disk storage with all-flash technologies.”

    3. It’s all about converged technologies in 2018 and beyond — “This type of architecture aims to remove siloes of resources, challenges around administration, and issues with scale.”

    4. Prepare your data center for hybrid — “On the theme of decentralization, hybrid cloud continues to be a dominant factor when it comes to data center design and integration with cloud.”

    5. More investment made in data center efficiency — “Airflow management and computational fluid dynamics have helped data center operators create more efficiency and better understand how to design their data centers.”

  16. Tomi Engdahl says:

    Google’s cloud revenue is substantially below competitors.

    Google’s Diane Greene says billion-dollar cloud revenue already puts them in elite company
    https://techcrunch.com/2018/02/01/googles-diane-greene-says-billion-dollar-cloud-revenue-already-puts-them-in-elite-company/?utm_source=tcfbpage&sr_share=facebook

    It has long been believed that the big three in the cloud consisted of AWS, Microsoft and Google, with IBM not doing too badly either. But in its earnings call with analysts today, Google revealed it’s pulling in a billion dollars a quarter in combined cloud revenue. That’s a figure that Google’s Diane Greene says already puts her company on elite footing, but which is substantially below what competitors have been reporting.

    It’s worth noting that in Q4, Canalys reported that Microsoft had grown the fastest, at 98 percent, with Google second at 85 percent growth; still quite brisk, but not the fastest.

    That may be so, but it’s hard to ignore that a $4 billion run rate is not even equal to a quarter of revenue for any of Google’s main cloud competitors. While it’s hard to do a pure comparison of cloud revenue because there is no standard way of measuring it, we do know that Amazon reported AWS revenue today of $4.331 billion. Meanwhile, Microsoft passed a $20 billion total cloud run rate last year and IBM reported revenue of $17 billion for the year in its most recent earnings report, which breaks down to more than $4.25 billion a quarter.

  17. Tomi Engdahl says:

    Larry Dignan / ZDNet:
    Oracle to expand its autonomous technology across its PaaS offerings in app development, app and data integration, analytics, and system and identity management

    Oracle expands its autonomous technology across its cloud platform
    http://www.zdnet.com/article/oracle-expands-its-autonomous-technology-across-its-cloud-platform/

    The autonomous technology in the Oracle Autonomous Database will now be rolled out to application development, app and data integration, analytics and system and identity management.

  18. Tomi Engdahl says:

    Google releases Cloud TPU beta, GPU support for Kubernetes
    Google said a limited quantity of TPUs is available today.
    http://www.zdnet.com/article/google-releases-cloud-tpu-beta-gpu-support-for-kubernetes/

    Google Cloud announced Monday that Cloud TPUs are available in beta on Google Cloud Platform.

    Short for Tensor Processing Unit, TPUs are designed for machine learning and tailored for Google’s open-source machine learning framework, TensorFlow. The specialized chips can provide 180 teraflops of processing to support training machine learning algorithms, and have been powering Google data centers since 2015.

    “We designed Cloud TPUs to deliver differentiated performance per dollar for targeted TensorFlow workloads and to enable ML engineers and researchers to iterate more quickly,” Google wrote in a Cloud Platform blog.

    “Over time, we’ll open-source additional model implementations. Adventurous ML experts may be able to optimize other TensorFlow models for Cloud TPUs on their own using the documentation and tools we provide.”

  19. Tomi Engdahl says:

    Apple now relies on Google Cloud Platform and Amazon S3 for iCloud data
    https://techcrunch.com/2018/02/27/apple-now-relies-on-google-cloud-platform-and-amazon-s3-for-icloud-data/?utm_source=tcfbpage&sr_share=facebook

    It’s no secret, Apple has been relying on third-party cloud companies for iCloud. And CNBC spotted an interesting tidbit in Apple’s own documents. The company now relies on Amazon S3 and Google Cloud Platform’s storage product to store iCloud data.

    The company mentions that user files are divided into tiny chunks and encrypted. The encryption keys and metadata information are stored on Apple’s own servers. But the encrypted files are stored on third-party services.
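    The architecture described here — encrypted chunks on third-party storage, keys and metadata kept separately — can be sketched as follows. The XOR keystream cipher is a deliberately toy stand-in (do not use it for real encryption), and the chunk size is shrunk for illustration; the point is the separation of ciphertext from keys:

```python
import hashlib
import os

CHUNK_SIZE = 4  # tiny on purpose; real systems use far larger chunks

def keystream_xor(key: bytes, data: bytes) -> bytes:
    """Toy stream cipher (SHA-256 in counter mode). Illustrative only;
    applying it twice with the same key restores the plaintext."""
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(b ^ k for b, k in zip(data, out))

def store_file(blob: bytes):
    """Return (ciphertext chunks for third-party storage,
    per-chunk key table kept on the provider's own servers)."""
    chunks, keys = [], []
    for i in range(0, len(blob), CHUNK_SIZE):
        key = os.urandom(16)  # per-chunk key, never sent to the storage service
        chunks.append(keystream_xor(key, blob[i:i + CHUNK_SIZE]))
        keys.append(key)
    return chunks, keys

def restore_file(chunks, keys) -> bytes:
    return b"".join(keystream_xor(k, c) for c, k in zip(chunks, keys))

original = b"user file contents"
remote_chunks, local_keys = store_file(original)
roundtrip = restore_file(remote_chunks, local_keys)
```

    Whoever holds only `remote_chunks` (the third-party storage side) learns nothing usable without `local_keys`, which is the property the document is describing.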

    While users have no idea that Amazon and Google are managing their iCloud data, Amazon and Google can’t do anything with those files without the encryption keys. So it seems highly unlikely that Amazon and Google are looking at your data.

    “The encrypted chunks of the file are stored, without any user-identifying information, using third-party storage services, such as S3 and Google Cloud Platform,” you can read in the document.

    In the past, Apple has mentioned Microsoft Azure in its partners. The wording of the document isn’t really clear. Apple could be using more storage services without naming them directly.

    In all cases, this is a great example of asymmetric competition. While Apple and Google are fighting really hard to grab market share of the smartphone market, Apple is also Google’s client. Apple also competes with Amazon and Microsoft in other areas.

  20. Tomi Engdahl says:

    New Azure servers to pack Intel FPGAs as Microsoft ARM-lessly embraces Xeon
    ‘Intel Xeon Scalable Processor’ hailed as ‘cornerstone for new platform’ with servers customised for different roles
    http://www.theregister.co.uk/2017/07/12/all_azure_servers_to_pack_intel_fpgas_as_microsoft_armlessly_embraces_new_xeons/
    Microsoft may have said ARM servers provide the most value for its cloud services back in March, but today it’s given Intel’s new Xeons a big ARM-less hug by revealing the hyperscale servers it uses in Azure are ready to roll with Chipzilla’s latest silicon and will all use Chipzilla’s field programmable gate arrays.

    Those servers are dubbed “Project Olympus” and Microsoft has released their designs to the OpenCompute Project.

    Redmond also praises the Xeon Scalable Processors as being jolly powerful and all that, which will help Azure to scale and handle different workloads. But it’s the news that Redmond’s all-in with Intel Arria FPGAs that must be warming cockles down Chipzilla way, as using Xeons as the main engine and tweaking them for different roles with FPGAs is Intel’s strategy brought to life.

    IBM’s also embraced the new Xeons, gushing that it will be the first to offer them on bare metal cloud servers. But not, in all likelihood, the first to use them at all: Google claims to have been running them since June 1st, 2017.

  21. Tomi Engdahl says:

    Tom Krazit / GeekWire:
    Cloudflare launches Cloudflare Workers, an edge computing service for developers using its network, charging devs $0.50 for every 1M tasks used by their apps — Cloudflare is ready to take the wraps off a new service designed for developers creating Internet-of-Things apps that want to capitalize …

    Cloudflare to open an edge computing service for developers using its network
    https://www.geekwire.com/2018/cloudflare-open-edge-computing-service-developers-using-network/

    Cloudflare is ready to take the wraps off a new service designed for developers creating Internet-of-Things apps that want to capitalize on the proximity benefits provided by edge computing.

    Cloudflare Workers was first introduced last September, and Cloudflare is expected to announce Tuesday that it is now generally available for developers to check out. The new service runs on hardware that Cloudflare has installed in more than 125 data centers around the world to power its anti-DDoS (distributed denial of service) attack service, and it allows developers to write JavaScript applications through the Service Worker API that will run much closer to their users than might otherwise be possible with standard cloud services.

    “For quite some time, we have understood that there is real power in deploying applications that ran incredibly close to where users are on the internet,”

    About 1,000 users have been playing with Cloudflare Workers since the company opened the service up to a broader beta program in January following the September announcement. “I’ve been surprised by how dramatically different all of the applications people have built are; it doesn’t feel like there is a bound to them yet,” Prince said.

    The benefits of edge computing are just starting to make their way into the world, although lots of folks have been talking about it for a while. It’s a recognition of the fact that as connected devices spread throughout the world, it quickly makes more sense to execute a lot of the code running those devices as physically close to them as possible, as waiting for instructions from a remote cloud data center won’t always cut it for real-time IoT devices.

  22. Tomi Engdahl says:

    Google Cloud Platform Blog:
    In collaboration with Ubisoft, Google launches Agones, an open source, multiplayer, dedicated game server hosting system built on top of the Kubernetes platform — In the world of distributed systems, hosting and scaling dedicated game servers for online, multiplayer games presents some unique challenges.

    Introducing Agones: Open-source, multiplayer, dedicated game-server hosting built on Kubernetes
    http://cloudplatform.googleblog.com/2018/03/introducing-Agones-open-source-multiplayer-dedicated-game-server-hosting-built-on-Kubernetes.html

  23. Tomi Engdahl says:

    Oracle’s cloud biz heading in the wrong direction right now
    https://techcrunch.com/2018/03/20/oracles-cloud-biz-heading-in-the-wrong-direction-right-now/?utm_source=tcfbpage&sr_share=facebook

    Oracle announced its quarterly earnings last night, detailing that its cloud business grew 32 percent to $1.6 billion in the quarter. That might sound good at first blush, but it’s part of three straight quarters of slowing growth — a fact that had investors jittery overnight. It didn’t get better throughout the day today, with Oracle’s stock plunging over 9 percent as of this writing.

    When you consider that enterprise business is shifting rapidly to the cloud, and that the cloud business in general is growing quickly, Oracle’s cloud numbers could be reason for concern.

    Oracle’s cloud revenue broke down as follows: SaaS, up 33 percent to $1.2 billion, and platform and infrastructure revenue combined, up 28 percent to $415 million. To put those figures into context, consider that last quarter Alibaba reported overall cloud revenue of $533 million, which was up a whopping 104 percent year over year.

    Looking purely at Infrastructure services, Canalys reported that in the third quarter of 2017, Microsoft grew at around 90 percent year over year, while Google grew around 75 percent YoY. Even market leader Amazon, which controls over 30 percent of the market, had around a 40 percent growth rate, fairly remarkable given its size.

    “Cloud revenues including SaaS, PaaS and IaaS [all cloud business combined] are expected to grow 19% to 23% in USD, 17% to 21% in constant currency,” she told analysts this week.

  24. Tomi Engdahl says:

    Netlify wants to make it easier for web developers to use AWS Lambda event triggers
    https://techcrunch.com/2018/03/20/netlify-wants-to-make-it-easier-for-web-developers-to-use-aws-lambda-event-triggers/?utm_source=tcfbpage&sr_share=facebook

    Netlify has a vision of changing the way we develop websites, making it simpler to connect the front-end design to backend services execution. Today, the company announced another step in that vision when it introduced AWS Lambda functions on Netlify.

    The company aims to reduce much of the complexity associated with web development. You design your front end in HTML and JavaScript, then Netlify helps you connect to a set of services you might be using such as Stripe for payments or MailChimp for email newsletter management. Netlify has abstracted away the concept of a web server, which it says is slow to deploy and hard to secure and scale. By shifting from a monolithic website to a static front end with back-end microservices, it believes it can solve security and scaling issues and deliver the site much faster.

  25. Tomi Engdahl says:

    Google brings DDoS protection and other new security features to its cloud
    https://techcrunch.com/2018/03/21/google-brings-new-security-features-to-its-cloud/?utm_source=tcfbpage&sr_share=facebook

    As far as DDoS attacks go, Google also today announced a new service called Cloud Armor (see, GCP can do naming right sometimes!). Cloud Armor is both a DDoS and application defense service that provides all the usual IP white- and blacklisting tools and integrates with Google’s Global Load Balancing service.

    Other Cloud Platform security updates include new logging tools, updates to the Data Loss Prevention API and new tools for managing access to GCP resources. The Google Cloud Platform is now also FedRamp certified at the Moderate Impact level, though unless you work for the U.S. federal government or a state or local agency, you probably don’t care much about that.

  26. Tomi Engdahl says:

    Google Cloud tackles applications performance monitoring
    https://techcrunch.com/2018/03/28/google-cloud-tackles-applications-performance-monitoring/?utm_source=tcfbpage&sr_share=facebook

    As Google builds out its cloud platform, it has been continually taking tools and services it has created in-house for its own team and putting them out in the world for its customers as products. Today, it added a key ingredient for developers building applications on the Google Cloud Platform when it announced a suite of application performance management tools called Stackdriver APM.

  27. Tomi Engdahl says:

    Blair Hanley Frank / VentureBeat:
    Amazon announces AWS Secrets Manager, which allows developers to programmatically insert credentials in applications without writing them into the source code — Amazon Web Services announced a new service today that could solve one of the biggest security headaches facing users of the cloud platform.

    AWS Secrets Manager plugs a major cloud security hole
    https://venturebeat.com/2018/04/04/aws-secrets-manager-plugs-a-major-cloud-security-hole/

    Amazon Web Services announced a new service today that could solve one of the biggest security headaches facing users of the cloud platform. The AWS Secrets Manager will allow developers to programmatically insert the credentials their applications need without writing them into the source code itself or setting them as environment variables.

    Leaked credentials written into source code have been one of the biggest security risks for customers of the cloud platform. The Secrets Manager will let customers replace that risk with a small function that goes and pulls down the correct credentials when it’s run for database access and connections to other services.

    While AWS Secrets Manager works with credentials for databases managed by the cloud provider’s Relational Database Service, it also works with third-party API keys, like those provided by Twitter and other companies. The service also handles automatic rotation of those security credentials.
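    Programmatic retrieval is a single API call. The sketch below takes the client as a parameter so it works with boto3's real `secretsmanager` client or with a stub; the secret name and the JSON field names are hypothetical:

```python
import json

def get_db_credentials(client, secret_id: str) -> dict:
    """Fetch and parse a JSON secret instead of hard-coding credentials.
    `client` is expected to offer the boto3-style get_secret_value call,
    i.e. boto3.client("secretsmanager") in production."""
    response = client.get_secret_value(SecretId=secret_id)
    return json.loads(response["SecretString"])

# Stand-in client so the sketch runs without an AWS account.
class FakeSecretsClient:
    def get_secret_value(self, SecretId):
        return {"SecretString": json.dumps(
            {"username": "app", "password": "s3cret"})}

creds = get_db_credentials(FakeSecretsClient(), "prod/db")  # name hypothetical
```

    Because the lookup happens at run time, rotating the secret in Secrets Manager requires no code change or redeploy — the next call simply returns the new value.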

    (To be clear, this isn’t an AWS-only problem: users of other cloud platforms have similar issues with managing credentials for their applications.)

    In addition to the Secrets Manager, AWS also announced a new Firewall Manager that lets companies centrally control settings for the AWS Web Application Firewall across multiple accounts. Along similar lines, an update to the AWS Config Rules service will allow customers to manage different compliance rules for their configurations across multiple accounts.

    Finally, the cloud provider announced a Private Certificate Authority feature for its security certificate management service.

    Reply
  28. Tomi Engdahl says:

    AWS Launches New Tools for Firewalls, Certificates, Credentials
    https://www.securityweek.com/aws-launches-new-tools-firewalls-certificates-credentials

    Amazon Web Services (AWS) announced on Wednesday the launch of several tools and services designed to help customers manage their firewalls, use private certificates, and safely store credentials.

    Private Certificate Authority

    One of the new services is called Private Certificate Authority (CA) and it’s part of the AWS Certificate Manager (ACM). The Private CA allows AWS customers to use private certificates without the need for specialized infrastructure.

    AWS Secrets Manager

    The new AWS Secrets Manager is designed to make it easier for users to store, distribute and rotate their secrets, including credentials, passwords and API keys. The storage and retrieval of secrets can be done via the API or the AWS Command Line Interface (CLI), while built-in or custom AWS Lambda functions provide the capabilities for rotating credentials.

    AWS Firewall Manager

    The new AWS Firewall Manager is designed to simplify administration of AWS WAF web application firewalls across multiple accounts and resources. Administrators can create policies and set up firewall rules, which are then automatically applied to all applications regardless of the region where they are hosted.
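    A Firewall Manager policy is just a structured document handed to the FMS API. A hedged boto3 sketch of that shape — the policy name, rule-group id, and resource type below are placeholders, and real policies must reference existing WAF rule groups:

```python
import json

def waf_policy(name, rule_group_id):
    """Build a minimal, hypothetical Firewall Manager policy for AWS WAF."""
    return {
        "PolicyName": name,
        "SecurityServicePolicyData": {
            "Type": "WAF",
            # ManagedServiceData is itself a JSON string
            "ManagedServiceData": json.dumps({
                "type": "WAF",
                "ruleGroups": [{"id": rule_group_id,
                                "overrideAction": {"type": "NONE"}}],
                "defaultAction": {"type": "BLOCK"},
            }),
        },
        "ResourceType": "AWS::ElasticLoadBalancingV2::LoadBalancer",
        "ExcludeResourceTags": False,
        "RemediationEnabled": True,
    }

def apply_policy(policy):
    import boto3  # deferred; requires an FMS administrator account
    return boto3.client("fms").put_policy(Policy=policy)
```

    Once applied from the administrator account, the policy propagates to every in-scope account, which is the multi-account management the article describes.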

    Amazon EFS data encrypted in transit

    Amazon also announced that it has added support for encrypting data in transit for the Amazon Elastic File System (EFS), a file system designed for cloud applications that require shared access to file-based storage. Support for encrypting data at rest has already been available.

    Reply
  29. Tomi Engdahl says:

    AWS adds automated point-in-time recovery to DynamoDB
    https://techcrunch.com/2018/04/04/aws-adds-automated-point-in-time-recovery-to-dynamodb/?utm_source=tcfbpage&sr_share=facebook

    One of the joys of cloud computing is handing over your data to the cloud vendor and letting them handle the heavy lifting. Up until now that has meant they updated the software or scaled the hardware for you. Today, AWS took that to another level when it announced Amazon DynamoDB Continuous Backups and Point-In-Time Recovery (PITR).

    With this new service, the company lets you simply enable the new backup tool, and the backup happens automatically. Amazon takes care of the rest, providing a continuous backup of all the data in your DynamoDB database.
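    Enabling continuous backups is a single call against an existing table, and a restore always lands in a new table. A boto3 sketch, assuming the table names are placeholders:

```python
def pitr_spec(enabled=True):
    """Build the PointInTimeRecoverySpecification payload."""
    return {"PointInTimeRecoveryEnabled": enabled}

def enable_pitr(table_name):
    import boto3  # deferred; needs AWS credentials at runtime
    ddb = boto3.client("dynamodb")
    return ddb.update_continuous_backups(
        TableName=table_name,
        PointInTimeRecoverySpecification=pitr_spec(True),
    )

def restore_to(source_table, target_table, restore_datetime):
    import boto3
    ddb = boto3.client("dynamodb")
    # Restores always create a new table; the source keeps serving traffic.
    return ddb.restore_table_to_point_in_time(
        SourceTableName=source_table,
        TargetTableName=target_table,
        RestoreDateTime=restore_datetime,
    )
```

    Because the restore target is a fresh table, recovering from a bad write is non-destructive: you restore to a side table, verify, then cut over.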

    Reply
  30. Tomi Engdahl says:

    87% of public cloud migrators are keeping or increasing their investment in private cloud
    https://www.redhat.com/en/engage/hybrid-cloud-strategy-20180222?sc_cid=7016000000127ECAAY

    In this recent Forrester study, 300 IT leaders share why their migration strategy includes both public and private cloud—and the kinds of challenges they’re facing along the way.

    Reply
  31. Tomi Engdahl says:

    Azure’s new Serial Console gives you a direct window into the dark heart of your VMs
    https://techcrunch.com/2018/03/26/azures-new-serial-console-gives-you-a-direct-window-into-the-dark-heart-of-your-vms/?utm_source=tcfbpage&sr_share=facebook

    Azure developers and sysadmins have long asked for the ability to get access to a serial console for their virtual machines (VMs). While it’s generally easy enough to log into a VM after it has booted, things get far more complicated if the machine doesn’t boot for some reason. Troubleshooting that can be a nightmare. With today’s launch of the Serial Console in the Azure portal, developers get a full view of their machine’s boot process that should make fixing these kinds of issues far easier.

    Reply
  32. Tomi Engdahl says:

    SSM Parameter Store for keeping secrets in a structured way
    https://medium.com/@tdi/ssm-parameter-store-for-keeping-secrets-in-a-structured-way-53a25d48166a

    AWS Systems Manager Parameter Store (SSM) provides a secure way to store configuration variables for your applications. You can access SSM via the AWS API directly from within the app, or via the AWS CLI. SSM can store plaintext parameters or KMS-encrypted secure strings. Since parameters are identified by ARNs, you can set fine-grained access control on your configuration bits with IAM. A truly versatile service!
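    The "structured way" in the title refers to hierarchical parameter names. A minimal boto3 sketch of that pattern plus a KMS-backed SecureString read — the `/prod/web/db_password` path is an invented example:

```python
def param_path(env, app, name):
    """Build a hierarchical parameter name, e.g. /prod/web/db_password."""
    return "/{}/{}/{}".format(env, app, name)

def get_secure_param(name):
    import boto3  # deferred; needs AWS credentials and ssm:GetParameter rights
    ssm = boto3.client("ssm")
    # WithDecryption=True transparently decrypts SecureString values via KMS
    resp = ssm.get_parameter(Name=name, WithDecryption=True)
    return resp["Parameter"]["Value"]
```

    The path convention is what makes IAM scoping easy: a policy can grant access to `/prod/web/*` only, so a staging role never sees production secrets.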

    Reply
  33. Tomi Engdahl says:

    Microsoft adds ransomware protection and file restore to OneDrive cloud storage
    Outlook.com also gets encrypted email support
    https://www.theverge.com/2018/4/5/17201660/microsoft-onedrive-files-restore-feature-ransomware-protection

    Reply
  34. Tomi Engdahl says:

    Full-Metal Packet is hosting the future of cloud infrastructure
    https://techcrunch.com/2018/04/21/full-metal-packet/?utm_source=tcfbpage&utm_medium=feed&utm_campaign=Feed%3A+Techcrunch+%28TechCrunch%29&utm_content=FaceBook&sr_share=facebook

    Cloud computing has been a revolution for the data center. Rather than investing in expensive hardware and managing a data center directly, companies are relying on public cloud providers like AWS, Google Cloud, and Microsoft Azure to provide general-purpose and high-availability compute, storage, and networking resources in a highly flexible way.

    Yet as workflows have moved to the cloud, companies are increasingly realizing that those abstracted resources can be enormously expensive compared to the hardware they used to own. Few companies want to go back to managing hardware directly themselves, but they also yearn to have the price-to-performance level they used to enjoy. Plus, they want to take advantage of a whole new ecosystem of customized and specialized hardware to process unique workflows — think Tensor Processing Units for machine learning applications.

    That’s where Packet comes in. The New York City-based startup’s platform offers a highly-customizable infrastructure for running bare metal in the cloud. Rather than sharing an instance with other users, Packet’s customers “own” the hardware they select, so they can use all the resources of that hardware.

    Even more interesting is that Packet will also deploy custom hardware to its data centers, which currently number eighteen around the world. So, for instance, if you want to deploy a quantum computing box redundantly in half of those centers, Packet will handle the logistics of installing those boxes, setting them up, and managing that infrastructure for you.

    Reply
  35. Tomi Engdahl says:

    Google Expands Cloud Connectivity Options for Enterprise Data Centers
    http://www.datacenterknowledge.com/google-alphabet/google-expands-cloud-connectivity-options-enterprise-data-centers

    Provides wider variety of connectivity partners, flexible onramp bandwidth

    Companies will soon have many more options than before for connecting their own data centers to Google’s global cloud platform.

    They will be able to choose from a list of certified carriers and colocation providers to connect to the nearest Google Point of Presence. Google Cloud Platform announced the new cloud onramp, called Partner Interconnect, Tuesday. The company expects it to be generally available “in the coming weeks.”

    Partner Interconnect is different from the current GCP onramp in that it gives clients the ability to choose the bandwidth they need. The current onramp, called Dedicated Interconnect, comes as a full 10Gbps circuit, while the new one offers partial circuits, ranging from 50Mbps to 10Gbps.

    Public cloud providers offer private connectivity to their networks to make their services more palatable for security and performance-conscious enterprises than consuming those services over the public internet. Infrastructure-as-a-Service offerings like GCP and Amazon Web Services and Software-as-a-Service products like Salesforce’s CRM and Microsoft’s Office 365 can now be accessed this way.

    About two dozen service providers are on the Partner Interconnect list. They include major carriers, such as AT&T, Verizon, BT, CenturyLink, Orange, NTT, and SoftBank, as well as the world’s two biggest data center providers: Equinix and Digital Realty Trust.

    Reply
  36. Tomi Engdahl says:

    Google’s New Spending Surge Shows a Company Playing Catch-Up
    http://www.datacenterknowledge.com/google-alphabet/googles-new-spending-surge-shows-company-playing-catch

    Data centers; three new undersea cables; processors, networking equipment, other machinery to power AI efforts drive CapEx spike

    Reply
  37. Tomi Engdahl says:

    By keeping its head in the cloud, Microsoft makes it rain on shareholders
    https://techcrunch.com/2018/04/26/by-keeping-its-head-in-the-cloud-microsoft-makes-it-rain-on-shareholders/?utm_source=tcfbpage&utm_medium=feed&utm_campaign=Feed%3A+Techcrunch+%28TechCrunch%29&sr_share=facebook

    Thanks in part to its colossal cloud business, Microsoft earnings are drenching shareholders in dollars.

    For the quarter ending March 31, 2018, the tech ringer from Redmond saw its revenue increase to $26.8 billion (up 16 percent) from $23.2 billion, with operating income up 23 percent to $8.3 billion, up from $6.7 billion.

    Reply
  38. Tomi Engdahl says:

    Cloud services demand booms in 1Q18: Synergy Research
    http://www.lightwaveonline.com/articles/2018/05/cloud-services-demand-booms-in-1q18-synergy-research.html?cmpid=enl_lightwave_lightwave_datacom_2018-05-01&pwhid=6b9badc08db25d04d04ee00b499089ffc280910702f8ef99951bdbdad3175f54dcae8b7ad9fa2c1f5697ffa19d05535df56b8dc1e6f75b7b6f6f8c7461ce0b24&eid=289644432&bid=2087289

    Yes, more businesses are moving their operations to the cloud. Demand for cloud infrastructure services grew a whopping 51% year-on-year in the first quarter of 2018, according to Synergy Research Group data. Amazon Web Services (AWS) led the way, but Microsoft, Google, and Alibaba also enjoyed hefty growth. The major players are increasing market share at the expense of smaller players, the market research firm adds.

    Quarterly cloud infrastructure service revenues (including IaaS, PaaS, and hosted private cloud services) reached almost $15 billion in the quarter, Synergy Research estimates. The quarter’s 51% growth rate not only topped that of the year-ago quarter but the full-year 2017 growth rate of 44% and the 2016 growth rate of 50% as well. Public IaaS and PaaS services, which make up the majority of the market, grew by 56% in the quarter. AWS enjoyed revenue growth in the first quarter it hadn’t seen since late 2016, according to Synergy Research.

    “Cloud growth in the last two quarters really has been quite exceptional,”

    Reply
  39. Tomi Engdahl says:

    Cloud sales potential bright in 2018, but China situation cloudy: LightCounting
    http://www.lightwaveonline.com/articles/2018/05/cloud-sales-potential-bright-in-2018-but-china-situation-cloudy-lightcounting.html?cmpid=enl_lightwave_lightwave_enabling_technologies_2018-05-03&pwhid=6b9badc08db25d04d04ee00b499089ffc280910702f8ef99951bdbdad3175f54dcae8b7ad9fa2c1f5697ffa19d05535df56b8dc1e6f75b7b6f6f8c7461ce0b24&eid=289644432&bid=2090749

    According to LightCounting, the optical components market benefited last year from strong sales related to cloud networking, and the market research firm anticipates this segment will remain strong in 2018. Robust demand from key U.S. data center operators for 100GbE transceivers continues, and deployment of the technology by Chinese cloud companies has commenced. However, the current trade situation between China and the U.S. could put a damper on overall sales in 2018, LightCounting suggests.

    Including cloud in China, LightCounting currently forecasts that 400GbE optics sales will expand the cloud market segment from approximately $2 billion in 2017 to over $6 billion in 2023

    The market research firm states that optics sales to Huawei and ZTE, including optics sold for Chinese deployments (illustrated in the figure as China Telecom only), were weaker than anticipated and were therefore a primary influence on the 2017 decline in the rest of the market.

    Notable drops in sales to these customers were reported in March 2017 by optical component and module suppliers, related to excess inventory Huawei and ZTE had accumulated in 2016. Suppliers report that business with these Chinese customers remains slower than anticipated, despite depletion of the excess inventory by the end of 2017.

    Reply
  40. Tomi Engdahl says:

    Steven J. Vaughan-Nichols / ZDNet:
    Microsoft and Red Hat unveil OpenShift on Azure, a jointly managed hybrid cloud service for Kubernetes container management, out in preview in coming months — Microsoft and Red Hat expand their partnership around hybrid cloud, Kubernetes container management, and developer productivity.

    ​Red Hat and Microsoft bring OpenShift to Azure
    https://www.zdnet.com/article/red-hat-and-microsoft-bring-openshift-to-azure/

    Microsoft and Red Hat expand their partnership around hybrid cloud, Kubernetes container management, and developer productivity.

    Reply
  41. Tomi Engdahl says:

    Say hello to Google One
    https://techcrunch.com/2018/05/14/say-hello-to-google-one/

    Google is revamping its consumer storage plans today by adding a new $2.99/month tier for 200 GB of storage and dropping the price of its 2 TB plan from $19.99/month to $9.99/month (and dropping the $9.99/month 1 TB plan). It’s also rebranding these storage plans (but not Google Drive itself) as “Google One.”

    Going forward, you’ll also be able to share your storage quota with up to five family members.

    That access to live experts — not some barely functional AI chatbot — comes with every Google One plan, including the $1.99/month 100 GB plan.

    Google already offered 24/7 support for paying business users with a G Suite account, but this is the first time it actively offers live support for consumers.

    It’s worth stressing that the existing free quota of 15 GB will remain.

    Reply
  42. Tomi Engdahl says:

    Google Compute Engine now offers VMs with up to 3844GB of memory
    https://techcrunch.com/2018/05/15/google-compute-engine-now-offers-vms-with-up-to-3844gb-of-memory/?utm_source=tcfbpage&utm_medium=feed&utm_campaign=Feed%3A+Techcrunch+%28TechCrunch%29&sr_share=facebook

    Sometimes, you just need more RAM. That’s especially true when you want to run memory-hungry enterprise applications like SAP’s HANA database or high-performance computing workloads. Until now, if you wanted Google Compute Engine to run applications like that, your options topped out at 624GB of memory. Starting today, though, the company goes beyond that with three new tiers that top out at 3844GB of memory and 160 virtual CPU cores.
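    The figures quoted match GCE’s n1-ultramem machine-type family. A small sketch for reasoning about the tiers — only the 160-vCPU/3844GB numbers appear in the article, so the smaller tiers listed here are an assumption for illustration:

```python
# Assumed n1-ultramem tiers; only the 160-vCPU / 3844GB figures come from
# the article, the smaller sizes are illustrative assumptions.
ULTRAMEM_TIERS = {
    "n1-ultramem-40": {"vcpus": 40, "memory_gb": 961},
    "n1-ultramem-80": {"vcpus": 80, "memory_gb": 1922},
    "n1-ultramem-160": {"vcpus": 160, "memory_gb": 3844},
}

def largest_tier(tiers):
    """Return the machine-type name with the most memory."""
    return max(tiers, key=lambda n: tiers[n]["memory_gb"])
```

    Note the constant memory-to-vCPU ratio (about 24GB per core) across the family, which is typical of machine-type families sized for in-memory databases.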

    Reply
