Computer trends 2017

I did not have time to post my computer technology predictions at the end of 2016. Because I missed the year-end deadline, I thought there was no point in posting anything before the news from CES 2017 had been published. Here are some of my picks on the current computer technology trends:

CES 2017 had 3 significant technology trends: deep learning goes deep, Alexa everywhere and Wi-Fi gets meshy. The PC sector seemed to be pretty boring.

Gartner expects IT sales to grow (2.7%), but hardware sales will see no growth and may even drop this year. TEKsystems’ 2017 IT forecast shows IT budgets rebounding from a slump in 2016 and IT leaders’ confidence high going into the new year. But challenges around talent acquisition and organizational alignment will persist. Programming and software development continue to be among the most crucial and hard-to-find IT skill sets.

Smartphone sales (expected to be 1.89 billion units) and PC sales (expected to be 432 million units) will not grow in 2017. According to IDC, PC shipments declined for a fifth consecutive year in 2016 as the industry continued to suffer from stagnation and a lack of compelling drivers for upgrades. Both Gartner and IDC estimated that PC shipments declined about 6% in 2016. Revenue in the traditional (non-cloud) IT infrastructure segment decreased 10.8 per cent year over year in the third quarter of 2016. The only PC category with potential for growth is ultramobile (which includes the Microsoft Surface and Apple MacBook Air). The need for memory chips is increasing.

Browsers suffer from JavaScript-creep disease: the browsing experience seems to get slower even though computers and broadband connections are getting faster all the time. Bloat on web pages has been going on for ages, and this trend seems set to continue.
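A rough way to see this JavaScript creep for yourself is to count how much script a page ships. Below is a minimal Python sketch (standard library only; the URL is just a placeholder, and a real audit would use browser tooling such as DevTools or Lighthouse):

# Rough page-bloat check: download one page, report HTML size and <script> tag count.
import re
import urllib.request

url = "https://example.com/"  # placeholder URL, replace with a page you want to inspect
with urllib.request.urlopen(url, timeout=10) as response:
    html = response.read()

script_tags = re.findall(rb"<script\b", html, flags=re.IGNORECASE)
print(f"HTML size: {len(html) / 1024:.1f} kB")
print(f"<script> tags on the page: {len(script_tags)}")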

Microsoft is trying all it can to make people switch from older Windows versions to Windows 10. Microsoft says that continued usage of Windows 7 increases maintenance and operating costs for businesses due to malware attacks that could have been avoided by upgrading to Windows 10. In Microsoft’s words, Windows 7 does not meet the demands of modern technology, and it recommends Windows 10. In February 2017 Microsoft ends its 20-year-long tradition of monthly security updates. The Windows 10 “Creators Update” is coming in early 2017 for free, featuring 3D and mixed reality, 4K gaming and more.

Microsoft plans to emulate x86 instructions on ARM chips, throwing a compatibility lifeline to future Windows tablets and phones. Microsoft’s x86-on-ARM64 emulation is coming in 2017. This capability is coming to Windows 10, though not until “Redstone 3” in the fall of 2017.

Parents should worry less about the amount of time their children spend using smartphones and computers and playing video games, because screen time is actually beneficial, the University of Oxford has concluded. 257 minutes is the amount of time teens can spend on computers each day before it starts to harm their wellbeing.

Outsourcing IT operations to foreign countries is not trendy anymore, and outsourcing companies live in uncertain times. India’s $150 billion outsourcing industry stares at an uncertain future. In the past five years, revenue and profit growth for the top five companies listed on the BSE have halved. Industry leader TCS has also felt the impact as it shifted its business model towards software platforms and chased digital contracts.

Containers will become hot this year, and cloud will stay hot. Research firm 451 Research predicts that containerization will be a US$762 million business this year and that containers will grow into a $2.6 billion software business by 2020 (roughly a 40 per cent annual growth rate).

Cloud services are expected to have a 22 percent annual growth rate. By 2020 the sector would grow from the current $22.2 billion to $46 billion. In Finland, 30% of companies now prefer to buy cloud services when buying IT (20 per cent of IT budgets go to the cloud). Cloud spend will make up over a third of IT budgets by 2017: cloud and hosting services will be responsible for 34% of IT budgets by 2017, up from 28% at the end of 2016, according to 451 Research. Cloud services have many advantages, but they also have disadvantages. In five years, SaaS will be the cloud that matters.
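As a sanity check on growth figures like these, the implied compound annual growth rate can be computed directly. Here is a minimal Python sketch; the market sizes are the ones quoted above, while the four-year horizon (assuming the base figures refer to 2016) is my assumption, which is why the results land near, rather than exactly on, the quoted rates:

# Implied compound annual growth rate (CAGR) for the market figures quoted above.
def cagr(start, end, years):
    """Constant annual growth rate that turns `start` into `end` over `years` years."""
    return (end / start) ** (1 / years) - 1

# Containers: ~US$0.762bn -> US$2.6bn by 2020 (quoted as roughly 40% per year).
print(f"Containers:     {cagr(0.762, 2.6, 4):.0%} per year")
# Cloud services: ~US$22.2bn -> US$46bn by 2020 (quoted as roughly 22% per year).
print(f"Cloud services: {cagr(22.2, 46.0, 4):.0%} per year")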

As the cloud grows, so does spending on cloud hardware by the cloud companies. Cloud hardware spend has hit US$8.4bn per quarter as traditional kit sinks, and 2017 is forecast to see cloud kit clock $11bn every 90 days. In 2016’s third quarter, vendor revenue from sales of infrastructure products (server, storage, and Ethernet switch) for cloud IT, including public and private cloud, grew by 8.1 per cent year over year to $8.4 billion. Private cloud accounted for $3.3 billion, with the rest going to public clouds. Data centers need lower-latency components, so Google searches for better silicon.

The first signs of the decline and fall of the 20+ year x86 hegemony will appear in 2017. The availability of industry leading fab processes will allow other processor architectures (including AMD x86, ARM, Open Power and even the new RISC-V architecture) to compete with Intel on a level playing field.

USB-C will now come to displays – the USB Type-C connector promises to become the single physical interface for practically all equipment. The HDMI connector will disappear from laptops in the future. Thunderbolt 3 is designed to work over the USB Type-C connector, but it’s not the same thing (Thunderbolt 3 is four times faster than USB 3.1).
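For context on the “four times faster” claim, the nominal link rates of the interfaces that share the Type-C connector compare roughly as follows. A minimal Python sketch; the rates are the commonly quoted theoretical maximums, ignoring encoding and protocol overhead:

# Nominal link rates carried over a USB Type-C connector, in Gbit/s.
link_rates_gbps = {
    "USB 3.1 Gen 1 (aka USB 3.0)": 5,
    "USB 3.1 Gen 2": 10,
    "Thunderbolt 3": 40,
}

for name, gbps in link_rates_gbps.items():
    print(f"{name:<28} {gbps:>3} Gbit/s  (~{gbps / 8:.1f} GB/s raw)")

print(f"Thunderbolt 3 vs USB 3.1 Gen 2: {40 // 10}x the raw bandwidth")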

World’s first ‘exascale’ supercomputer prototype will be ready by the end of 2017, says China

It seems that Oracle begins aggressively pursuing Java licensing fees in 2017. Java SE is free, but Java SE Suite and various flavors of Java SE Advanced are not. Oracle is massively ramping up audits of Java customers it claims are in breach of its licences – six years after it bought Sun Microsystems. Huge sums of money are at stake. The version of Java in contention is Java SE, with three paid flavours that range from $40 to $300 per named user and from $5,000 to $15,000 for a processor licence. If you download Java, you get everything – so you need to make sure you are installing only the components you are entitled to, and you need to remove the bits you aren’t using.
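To get a feel for the sums at stake, here is a minimal back-of-the-envelope Python sketch using the list prices quoted above; the user and processor counts are purely hypothetical:

# Back-of-the-envelope Java SE licensing estimate from the list prices quoted above.
named_users = 500        # hypothetical deployment size
processors = 16          # hypothetical number of licensed processors

per_user = (40, 300)              # USD per named user, low and high end
per_processor = (5_000, 15_000)   # USD per processor licence, low and high end

user_cost = (named_users * per_user[0], named_users * per_user[1])
cpu_cost = (processors * per_processor[0], processors * per_processor[1])

print(f"Named-user licensing: ${user_cost[0]:,} - ${user_cost[1]:,}")
print(f"Processor licensing:  ${cpu_cost[0]:,} - ${cpu_cost[1]:,}")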

The article Your Year in Review, Unsung Hero sees the following trends in 2017:

  • A battle between ASICs, GPUs, and FPGAs to run emerging workloads in artificial intelligence
  • A race to create the first generation of 5G silicon
  • Continued efforts to define new memories that have meaningful impact
  • New players trying to take share in the huge market for smartphones
  • An emerging market for VR gaining critical mass

Virtual Reality will stay hot on both PC and mobile. “VR is the heaviest heterogeneous workload we encounter in mobile—there’s a lot going on, much more than in a standard app,” said Tim Leland, a vice president for graphics and imaging at Qualcomm. The challenge is the need to process data from multiple sensors and respond to it with updated visuals in less than 18 ms to keep up with the viewer’s head motions, so the CPUs, GPUs, DSPs, sensor fusion core, display engine, and video-decoding block are all running at close to full tilt.
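A quick way to see why that 18 ms budget is so tight is to compare it against display refresh intervals. A minimal Python sketch; the refresh rates are typical headset values, not figures from the article:

# Motion-to-photon budget (18 ms, as quoted above) vs. display refresh interval.
MOTION_TO_PHOTON_BUDGET_MS = 18

for refresh_hz in (60, 90, 120):
    frame_time_ms = 1000 / refresh_hz
    frames_in_budget = MOTION_TO_PHOTON_BUDGET_MS / frame_time_ms
    print(f"{refresh_hz:>3} Hz display: {frame_time_ms:.1f} ms per frame, "
          f"~{frames_in_budget:.1f} frames to sample sensors, render and scan out")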

 


932 Comments

  1. Tomi Engdahl says:

    DDR5 Runs in Rambus’ Labs
    Debate rises over next-gen memory interface
    http://www.eetimes.com/document.asp?doc_id=1332322

    Rambus has working silicon in its labs for DDR5, the next major interface for DRAM dual in-line memory modules (DIMMs). The register clock drivers and data buffers could help double the throughput of main memory in servers, probably starting in 2019 — and they are already sparking a debate about the future of computing.

    The Jedec standards group plans to release before June the DDR5 spec as the default memory interface for next-generation servers. However, some analysts note it comes at a time of emerging alternatives in persistent memories, new computer architectures and chip stacks.

    “To the best of our knowledge, we are the first to have functional DDR5 DIMM chip sets in the lab. We are expecting production in 2019, and we want to be first to market to help partners bring up the technology,” said Hemant Dhulla, a vice president of product marketing for Rambus.

    DDR5 is expected to support data rates up to 6.4 Gbits/second delivering 51.2 GBytes/s max, up from 3.2 Gbits and 25.6 GBytes/s for today’s DDR4. The new version will push the 64-bit link down to 1.1V and burst lengths to 16 bits from 1.2V and 8 bits. In addition, DDR5 lets voltage regulators ride on the memory card rather than the motherboard.

    In parallel, CPU vendors are expected to expand the number of DDR channels on their processors from 12 to 16. That could drive main memory sizes to 128 Gbytes from 64 GB today.
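    The headline bandwidth figures follow directly from the per-pin data rate and the 64-bit bus width; a minimal Python sketch of that arithmetic (ignoring ECC and protocol overhead):

    # Peak DIMM bandwidth = per-pin data rate (Gbit/s) * bus width (bits) / 8 bits per byte.
    def peak_bandwidth_gb_s(data_rate_gbps, bus_width_bits=64):
        return data_rate_gbps * bus_width_bits / 8

    print(f"DDR4-3200: {peak_bandwidth_gb_s(3.2):.1f} GB/s")  # 25.6 GB/s, as quoted
    print(f"DDR5-6400: {peak_bandwidth_gb_s(6.4):.1f} GB/s")  # 51.2 GB/s, as quoted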

    Reply
  2. Tomi Engdahl says:

    What Comes After User-Friendly Design?
    https://tech.slashdot.org/story/17/09/19/1946256/what-comes-after-user-friendly-design

    “User-friendly” was coined in the late 1970s, when software developers were first designing interfaces that amateurs could use. In those early days, a friendly machine might mean one you could use without having to code. Forty years later, technology is hyper-optimized to increase the amount of time you spend with it, to collect data about how you use it, and to adapt to engage you even more. [...]

    What Comes After User-Friendly Design?
    The design industry needs a new way to talk to users–one that isn’t just friendly, but respectful.
    https://www.fastcodesign.com/90139957/what-comes-after-user-friendly-design

    “User-friendly” was coined in the late 1970s, when software developers were first designing interfaces that amateurs could use. In those early days, a friendly machine might mean one you could use without having to code.

    Forty years later, technology is hyper-optimized to increase the amount of time you spend with it, to collect data about how you use it, and to adapt to engage you even more. Meanwhile, other aspects of our tech have remained difficult to use, from long, confusing privacy policies to a lack of explanation on why and when your data is being collected, much less how it’s being protected. While some aspects of apps and platforms have become almost too easy to use–consider how often Facebook invites you to check out a friend’s latest update or new comment–others remain frustratingly opaque, like, say, the way Facebook crafts advertising to your behavior.

    The discussion around privacy, security, and transparency underscores a broader transformation in the typical role of the designer

    Reply
  3. Tomi Engdahl says:

    Hitachi Data Systems is no more! Arise the new ‘Hitachi Vantara’
    HDS, Pentaho and Hitachi Insight Group join forces, promise data-driven IoT fun
    https://www.theregister.co.uk/2017/09/20/hds_is_no_more_hello_hitachi_vantara/

    Hitachi Data Systems is no more: the venerable storage vendor has been subsumed into a new outfit called “Hitachi Vantara” that says it “helps data-driven leaders find and use the value in their data to innovate intelligently and reach outcomes that matter for business and society.”

    The new organisation remains a Hitachi subsidiary, and mashes together HDS, the IoT-focussed Hitachi Insight Group, and analytics outfit Pentaho into “a single integrated business”.

    Reply
  4. Tomi Engdahl says:

    Jessica Conditt / Engadget:
    Microsoft rolls out Minecraft’s “Better Together” cross-platform play update for mobile, VR, XBox, and Windows 10, says it will come to Switch later this year — The Better Together update brings the biggest set of changes to hit Minecraft in years, expanding the Community Marketplace …

    The big ‘Minecraft’ cross-platform update is live, but not on Switch
    Microsoft Studios CVP Matt Booty explains what happened with Better Together.
    https://www.engadget.com/2017/09/20/minecraft-cross-play-xbox-switch-ps4-nintendo-microsoft-sony-better-together/

    Reply
  5. Tomi Engdahl says:

    AMD Opteron Vs EPYC: How AMD Server Performance Evolved Over 10 Years
    https://hardware.slashdot.org/story/17/09/20/1952208/amd-opteron-vs-epyc-how-amd-server-performance-evolved-over-10-years

    Phoronix has carried out tests comparing AMD’s high-end EPYC 7601 CPU to AMD Opteron CPUs from about ten years ago, looking at the EPYC/Opteron Linux performance and power efficiency. Both on the raw performance and performance-per-Watt, the numbers are quite staggering though the single-threaded performance hasn’t evolved quite as much. The EPYC 7601 is a $4,200 USD processor with 32 cores / 64 threads.

    Opteron vs. EPYC Benchmarks & Performance-Per-Watt: How AMD Server Performance Evolved Over 10 Years
    https://www.phoronix.com/scan.php?page=article&item=amd-epyc-opteron&num=1

    Reply
  6. Tomi Engdahl says:

    GitLab freezes GraphQL project amid looming Facebook patent fears
    Promising query language garbled by legal lingo
    https://www.theregister.co.uk/2017/09/20/gitlab_suspends_graphql_project_over_facebook_license_terms/

    Using GraphQL, an increasingly popular query language for grabbing data, may someday infringe upon pending Facebook patents, making the technology inherently problematic for corporate usage.

    In an analysis posted to Medium and in a related discussion in the GraphQL repo on GitHub, attorney and developer Dennis Walsh observed that Facebook’s GraphQL specification doesn’t include a patent license. In other words: using GraphQL in your software may lead to your code infringing a Facebook-held patent on the technology in future.

    “The patents (as of a few weeks ago) were granted but not issued,” said Walsh in an email to The Register today. ”Damages can start before issuance but litigation cannot. But post-issuance, the threat is very real. My reading of two GraphQL granted applications and the GraphQL spec is that any properly implemented GraphQL server infringes.”

    Reply
  7. Tomi Engdahl says:

    Bill Gates says he’d do CTRL-ALT-DEL with one key if given the chance to go back through time
    Gives two-fingered salute to IBM designers for forcing us to use three-fingered salute
    http://www.theregister.co.uk/2017/09/21/bill_gates_says_hed_do_ctrlaltdel_with_one_key_if_given_the_chance/

    “The IBM hardware PC keyboard only had one way it could get a guaranteed interrupt generated. So clearly the people involved, they should have put another key on in order to make that work.”

    Gates also observed that “a lot of machines nowadays do have that as a more obvious function”. [And Sinclair's ZX Spectrum had a BREAK key – Ed]

    Rubenstein pressed, asking whether Gates regrets having chosen CTRL-ALT-DEL.

    “I am not sure you can go back and change the small things in your life without putting the other things at risk,” Gates responded, before adding: “Sure, if I could make one small edit I would make that a single key operation.”

    Reply
  8. Tomi Engdahl says:

    Chatbots: A load of hype or fancy lifehack for the lazy IT person?
    Some of these buggers are adept at handling the mundane
    https://www.theregister.co.uk/2017/09/21/the_bots_breaking_business/

    It’s the age of the chatbot. The chatbot revolution is coming. Unless it isn’t.

    Conversational User Interfaces are still very much on the Innovation Trigger rising curve of Gartner’s hype cycle. And the people who are the most hyped up about chatbots seem to be the people that sell platforms to build them on. As chatbots seep out of the consumer space and into our internal workplace tools, are we at risk of losing our job because a bot can do it better?

    Reply
  9. Tomi Engdahl says:

    Cloud fragmentation ‘widespread’ among businesses
    http://www.cloudpro.co.uk/leadership/cloud-essentials/7037/cloud-fragmentation-widespread-among-businesses

    Hybrid cloud is popular, but too many clouds cause silos – survey

    Cloud fragmentation is affecting business productivity, according to a new survey of nearly 700 IT professionals by Cloudify.

    The company said that almost a tenth of businesses have deployed five different clouds or more. Two-thirds of respondents said their organisation is suffering from siloism as a result, where certain tools and capabilities are locked to a particular cloud tool.

    To try and rectify the situation without losing the flexibility of a multi-cloud environment, firms are deploying management platforms to attempt to co-ordinate and organise cloud applications. Although this does help fix the problem, it comes at extra cost.

    “Organisations are looking to the cloud for performance and innovation, but unfortunately they often find themselves in midst of a highly fragmented cloud world, where technologies are superfluous and incongruous, and where siloism is stifling agility and innovation,” said Nati Shalom, CTO of Cloudify.

    Reply
  10. Tomi Engdahl says:

    10 tips for getting started with machine learning
    https://www.cio.com/article/3223397/artificial-intelligence/10-tips-for-getting-started-with-machine-learning.html

    Artificial intelligence and machine learning can yield game-changing solutions for enterprises. Here’s what senior IT leaders need to know to launch and maintain a successful machine learning strategy.

    Reply
  11. Tomi Engdahl says:

    Brian Womack / Bloomberg:
    Sources: Hewlett Packard Enterprise plans to start cutting 5,000 jobs, or about 10% of its staff, by the end of this year

    Hewlett Packard Enterprise Is Said to Plan About 5,000 Job Cuts
    https://www.bloomberg.com/news/articles/2017-09-21/hewlett-packard-enterprise-is-said-to-plan-about-5-000-job-cuts

    Hewlett Packard Enterprise Co. is planning to cut about 10 percent of its staff, or at least 5,000 workers, according to people familiar with the matter, part of a broader effort to pare expenses as competition mounts.

    The reductions are expected to start before the end of the year

    The cuts at the company, which has about 50,000 workers, are likely to affect workers in the U.S. and abroad, including managers

    Chief Executive Officer Meg Whitman has been jettisoning divisions since 2015, including personal computers, printers, business services and key software units. The moves are all part of an effort to make HPE more responsive to a changing industry that’s under pressure from cloud providers such as Amazon.com Inc. and Alphabet Inc.’s Google.

    Reply
  12. Tomi Engdahl says:

    Matthew Lynley / TechCrunch:
    SEC: MongoDB files for IPO, reports loss of $45.8M on revenue of ~$68M in 6 months ending July 31 — MongoDB, a database software company based in New York, has filed to go public with the Securities and Exchange Commission as it continues to burn a ton of cash despite its revenue almost doubling year-over-year.

    Database provider MongoDB has filed to go public
    https://techcrunch.com/2017/09/21/database-provider-mongodb-has-filed-to-go-public/

    MongoDB, a database software company based in New York, has filed to go public with the Securities and Exchange Commission as it continues to burn a ton of cash despite its revenue almost doubling year-over-year.

    The company, which provides open-source database software that became very attractive among early-stage startups, is one of a myriad of companies that have sought to go public by building a business around selling sophisticated tools for that software.

    Reply
  13. Tomi Engdahl says:

    Chatbots: A load of hype or fancy lifehack for the lazy IT person?
    Some of these buggers are adept at handling the mundane
    https://www.theregister.co.uk/2017/09/21/the_bots_breaking_business/

    It’s the age of the chatbot. The chatbot revolution is coming. Unless it isn’t.

    Conversational User Interfaces are still very much on the Innovation Trigger rising curve of Gartner’s hype cycle. And the people who are the most hyped up about chatbots seem to be the people that sell platforms to build them on. As chatbots seep out of the consumer space and into our internal workplace tools, are we at risk of losing our job because a bot can do it better?

    Back in April 2016, Facebook opened up the world of chatbots to businesses and brands.

    Fast forward to the present day and consumer chatbot adoption has been hit and miss. Despite more than 100,000 bots living on the Facebook Messenger platform, I interact with zero on a regular basis. One retailer in my area has abandoned their bot in favour in a new ordering app. The Singapore Government has a bot to ask about pollution levels or send feedback about their services, so maybe I’m just in the wrong market or the wrong demographic.

    So, how do you get past the hype?

    First, you have to decide if chatbots are a priority project. The reality of enterprise IT is that there are always fires to fight. Right now, we’ve got the business screaming at us about digital transformation and disruption, because our competitors will eat us for breakfast if we don’t do something. That’s maybe why consumer-facing bots are more prevalent than the use of bots for in-house capabilities. How much of a priority is a new chatbot compared to the rest of your transformational projects?

    Next, you’ll need to pick your platform. If you’re using a chat-based collaboration platform, that will influence what you use to build your bot, though there are a huge number of choices. Will you build your own using something like Hubot, IBM’s Watson or Microsoft’s Bot Framework, calling on Microsoft’s Cognitive Services? Will you look to a third party like ChattyPeople, Smooch or FlowXO? Look at where the data is that you want the bot to read from, interact with or change. Last thing we want is a data security nightmare on our hands.

    Reply
  14. Tomi Engdahl says:

    Ben Lang / Road to VR:
    Intel Scraps Plans to Launch Project Alloy Reference Headset, Pursuing Other VR R&D
    https://www.roadtovr.com/intel-scraps-plans-to-launch-project-alloy-reference-headset-pursuing-other-vrar-rd/

    Reply
  15. Tomi Engdahl says:

    Facebook U-turn: React, other libraries freed from unloved patent license
    Hybrid BSD pact will be replaced by MIT deal for some projects
    https://www.theregister.co.uk/2017/09/22/facebook_will_free_react_other_code_from_unloved_license/

    Faced with growing dissatisfaction about licensing requirements for some of its open-source projects, Facebook today said it will move React, Jest, Flow, and Immutable.js under the MIT license next week.

    “We’re relicensing these projects because React is the foundation of a broad ecosystem of open source software for the web, and we don’t want to hold back forward progress for nontechnical reasons,” said Facebook engineering director Adam Wolff in a blog post on Friday.

    Wolff said while Facebook continues to believe its BSD + Patents license has benefits, “we acknowledge that we failed to decisively convince this community.”

    Relicensing React, Jest, Flow, and Immutable.js
    https://code.facebook.com/posts/300798627056246/relicensing-react-jest-flow-and-immutable-js/

    Next week, we are going to relicense our open source projects React, Jest, Flow, and Immutable.js under the MIT license. We’re relicensing these projects because React is the foundation of a broad ecosystem of open source software for the web, and we don’t want to hold back forward progress for nontechnical reasons.

    Reply
  16. Tomi Engdahl says:

    Mark Hachman / PCWorld:
    Intel launches 8th-gen Core desktop chips, ranging from 4-core Intel Core i3-8100 at $117 to the highest-end 6-core Core i7-8700K for $359

    Intel launches 8th-gen Core desktop chips, claims new Core i7-8700K is its best gaming chip ever
    https://www.pcworld.com/article/3227418/components-processors/intel-launches-8th-gen-core-desktop-chips-claims-new-core-i7-8700k-is-its-best-gaming-chip-ever.html

    Intel’s new 8th-gen Core chips now include six cores on the high end, attacking one of AMD’s Ryzen advantages.

    Intel pushed further ahead into its 8th-generation Core series with the launch of its mainstream desktop chips on Sunday night, including the 6-core/12-thread Core i7-8700K, which Intel claims is its best gaming chip ever. Intel also beefed up its Core i5 family with 6-core parts, as well as its first quad-core Core i3.

    Orders for Intel’s new Core desktop chips will begin on Oct. 5, Anand Srivatsa, general manager of the desktop platform group at Intel, said. They will begin shipping later in the fourth quarter. Though Intel executives didn’t use the term, the new chips have been referred to as part of the “Coffee Lake” family.

    Reply
  17. Tomi Engdahl says:

    Richard Stallman vs. Canonical’s CEO: ‘Will Microsoft Love Linux to Death?’
    https://linux.slashdot.org/story/17/09/24/2132218/richard-stallman-vs-canonicals-ceo-will-microsoft-love-linux-to-death?utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+Slashdot%2Fslashdot%2Fto+%28%28Title%29Slashdot+%28rdf%29%29

    TechRepublic got different answers about Microsoft’s new enthusiasm for Linux from Canonical’s founder and CEO Mark Shuttleworth, and from Richard Stallman. Stallman “believes that Microsoft’s decision to build a Windows Subsystem for Linux (WSL) amounts to an attempt to extinguish software that users are free to run, copy, distribute, study, change and improve.”
    “It certainly looks that way. But it won’t be so easy to extinguish us, because our reasons for using and advancing free software are not limited to practical convenience,” he said. “We want freedom. As a way to use computers in freedom, Windows is a non-starter…” Stallman remains adamant that the WSL can only help entrench the dominance of proprietary software like Windows, and undermine the use of free software. “That doesn’t advance the cause of free software, not one bit,” he says… “The aim of the free software movement is to free users from freedom-denying proprietary programs and systems, such as Windows. Making a non-free system, such Windows or MacOS or iOS or ChromeOS or Android, more convenient is a step backward in the campaign for freedom…”

    For Shuttleworth, Windows’ embrace of GNU/Linux is a net positive for open-source software as a whole. “It’s not like Microsoft is stealing our toys, it’s more that we’re sharing them with Microsoft in order to give everyone the best possible experience,” he says.

    Will Microsoft love Linux to death? Shuttleworth and Stallman on whether Windows 10 is free software’s friend
    Is Microsoft’s newfound enthusiasm for open-source software genuine?
    http://www.techrepublic.com/article/will-microsoft-love-linux-to-death-shuttleworth-and-stallman-on-whether-windows-10-is-free-softwares/

    Reply
  18. Tomi Engdahl says:

    Blame Canada? $5.7m IBM IT deal balloons to $185m thanks to ‘an open bag of money’
    With all their hockey hullabaloo, and that crazy payout too
    http://www.theregister.co.uk/2017/09/21/that_57m_ibm_contract_ended_up_costing_185m_well_blame_canada/

    A CA$5.7m IBM contract to update payroll systems for the Canadian government has turned into a CA$185m boondoggle for the Great White North. That’s $4.62m and $150m in US currency, respectively.

    The relatively straightforward task of installing PeopleSoft HR software for a few government agencies has morphed into a massive project dubbed Phoenix, which is detailed in a 1,700-page contract that has been revised some three dozen times, CBC reported today.

    Big Blue itself may not be solely to blame for the fiasco. The government messed up when it took what was to be a simple software contract and expanded it into an effort to overhaul the payroll system used by hundreds of government agencies – which left IBM with a blank check for its services.

    “The statement of requirement could leave loopholes, could leave escape avenues in it,” said former Canadian Treasury Board analyst Roman Klimowicz. “IBM basically has an open bag of money to help themselves to.”

    And we learn that IBM had virtually no competition when it made its bid to build Phoenix. When the Canadian government put out its call for bids on the contract, Big Blue was the only party to step forward and offer a solution. It now has the rights to build Phoenix through the year 2019.

    Reply
  19. Tomi Engdahl says:

    Two-thirds of businesses losing money over poor cloud skills
    http://www.cloudpro.co.uk/operations/7057/two-thirds-of-businesses-losing-money-over-poor-cloud-skills

    Lack of cloud expertise stifling innovation and costing firms over £200m a year

    Nearly two-thirds of IT professionals think that a lack of cloud knowledge is holding their business back.

    According to a new report by the London School of Economics and Rackspace, the lack of cloud space skills is affecting businesses and innovation.

    The study, consisting of 950 IT decision makers and 950 IT pros, found that 64% of IT decision makers attribute the loss of revenue to the lack of cloud skills. The study estimated the total of lost revenue within UK organisations to be around £217,864,804 a year, making it clear that changes need to be made within cloud technology.

    In addition to the loss of revenue, the study found cloud knowledge contributing to a lack of innovation. In fact, 67% of IT pros believe that a greater knowledge of cloud technology could help organisations by increasing innovation and creativity.

    Almost half of IT decision-makers in the UK already attribute positive ROI to the cloud and more believe the cloud will increase positive ROI in the future. With 85% of IT pros saying that deeper cloud expertise within their organisation would help increase the cloud’s ROI, it is no wonder why companies are looking to fully utilise it. Unfortunately, companies are finding the implementation of cloud technology difficult, largely due to the workforce.

    Over half (56%) of IT decision makers claimed the skills required to help manage their organisation’s clouds are difficult to recruit. Competition for talent, the inability to offer competitive salaries, insufficient career progression and the lack of training also create difficulties in improving a company’s cloud management.

    Reply
  20. Tomi Engdahl says:

    What we can learn from the early cloud adopters
    http://www.cloudpro.co.uk/cloud-essentials/2795/what-we-can-learn-early-cloud-adopters-1

    Expectations may have been too high and knowledge scarce, but we have much to learn from the early adopters of cloud computing.

    The misconception cloud is insufficiently secure has been an issue since the beginning and has yet to be entirely rectified by the industry. As such, it remains a key concern and a primary goal when organisations think about moving over to the cloud. Malware, data theft, data leakages; they all remain as top security challenges.

    A recent report from Intel Security, entitled ‘Building Trust in a Cloud Sky’, showed organisations’ concerns about security and expertise, as well as a lack of resources, remains some of the most significant challenges for companies.

    Yet it also revealed some additional echoes from the past, with issues faced by early adopters still having their part to play in today’s cloud market. Specifically, organisations are still facing shortages of in-house IT staff with the appropriate skills, knowledge and experience of cloud technologies and security.

    The issue is that many organisations have yet to fully realise their ambitions when it comes to the cloud, and there are some lessons that still have to be learnt, and certainly there are some improvements to make.

    Reply
  21. Tomi Engdahl says:

    Analyst: Enterprises Trust Red Hat Because It ‘Makes Open Source Boring’
    https://linux.slashdot.org/story/17/09/25/0649247/analyst-enterprises-trust-red-hat-because-it-makes-open-source-boring

    Tech analyst James Governor reports on what he learned from Red Hat’s “Analyst Day”:
    So it turns out Red Hat is pretty good at being Red Hat. By that I mean Red Hat sticks to the knitting, carries water and chops wood, and generally just does a good job of packaging open source technology for enterprise adoption. It’s fashionable these days to decry open source — “it’s not a business”. Maybe not for you, but for Red Hat it sure is. Enterprises trust Red Hat precisely because it makes open source boring. Exciting and cool, on the other hand, often means getting paged in the middle of the night. Enterprise people generally don’t like that kind of thing…

    Red Hat is pretty good at being Red Hat
    http://redmonk.com/jgovernor/2017/09/21/red-hat-is-pretty-good-at-being-red-hat/

    So it turns out Red Hat is pretty good at being Red Hat. By that I mean Red Hat sticks to the knitting, carries water and chops wood, and generally just does a good job of packaging open source technology for enterprise adoption. It’s fashionable these days to decry open source – “it’s not a business”. Maybe not for you, but for Red Hat it sure is.

    Enterprises trust Red Hat precisely because it makes open source boring. Exciting and cool, on the other hand, often means getting paged in the middle of the night. Enterprise people generally don’t like that kind of thing.

    It’s been interesting to see the rise of OpenShift. What looked like a PaaS also-ran in versions one and two has emerged as a go-to Kubernetes distribution for enterprises. Red Hat now has more than 400 paying customers for OpenShift Container Platform.

    Reply
  22. Tomi Engdahl says:

    Chicago School Official: US IT Jobs Offshored Because ‘We Weren’t Making Our Own’ Coders
    https://news.slashdot.org/story/17/09/24/2327228/chicago-school-official-us-it-jobs-offshored-because-we-werent-making-our-own-coders

    “People still talk about it’s all offshored, it’s all in India and you know, there are some things that are there but they don’t even realize some of the reasons that they went there in the first place is because we weren’t making our own.”

    [Trailer] CS4All: Integrating Equity, Empowerment, and Opportunities
    https://www.youtube.com/watch?v=WLbgU3dmpb0

    Reply
  23. Tomi Engdahl says:

    Frederic Lardinois / TechCrunch:
    Microsoft launches new machine learning tools including Azure machine learning experimentation, workbench, and model management services

    Microsoft launches new machine learning tools
    https://techcrunch.com/2017/09/25/microsoft-launches-new-machine-learning-tools/

    For developers, the company launched three major new tools today: the Azure Machine Learning Experimentation service, the Azure Machine Learning Workbench and the Azure Machine Learning Model Management service.

    Reply
  24. Tomi Engdahl says:

    Andrew Brust / ZDNet:
    SQL Server 2017 runs on Linux, including in Docker containers, adds support for Python, and graph processing for NoSQL-like functionality

    Review: SQL Server 2017 adds Python, graph processing and runs on Linux

    Integration of Python for data science, graph processing for NoSQL-like functionality, and it runs on Linux as well as Windows. At almost 30 years of age, Microsoft’s flagship database has learned many new tricks.
    http://www.zdnet.com/article/review-sql-server-2017-adds-python-graph-processing-and-runs-on-linux/

    Reply
  25. Tomi Engdahl says:

    After Microsoft calls out HP Inc over stalled Windows 10 logins, HP bounces back with a fix
    Shove this tool into your PC if it’s getting stuck during startup
    https://www.theregister.co.uk/2017/09/25/hp_inc_laptop_login_fix/

    After enduring roughly two weeks of complaints, HP Inc has today produced a fix for folks struggling with blank screens on their computers.

    A Windows 10 update, released by Microsoft on September 12, caused HP PCs to get stuck showing black displays after users attempt to log in. Machine owners reported seeing nothing but blank monitors for up to five or ten minutes after entering their usernames and passwords.

    Last Tuesday, on September 19, a Microsoft spokesperson told us it was “working to resolve this” cockup, and referred affected readers to a webpage that seemed to link the problem to faulty information from HP. The page claimed some, cough, cough, manufacturers ship disks with “incorrect registry keys” installed that conflict with a service Microsoft uses to prepare applications, which is called app readiness.

    After acknowledging the problem three days later, and noting that a fix is looming, today a spokesperson for HP Inc told us it has published an online support page that offers two possible solutions for affected HP owners running Windows 10.

    According to the spokesperson for HP Inc, users can either download and run the tool from the above support page link, or wait for it to appear as an automatic update.

    HP Consumer Notebooks and Desktops – The Computer Hangs at a Black Screen After Windows Update
    https://support.hp.com/us-en/document/c05704298

    Reply
  26. Tomi Engdahl says:

    The power of JavaScript: ‘Gandalf of JS’ Wirfs-Brock on ECMAscript 2017
    Looking to the AI future
    https://www.theregister.co.uk/2017/09/26/allen_wirfs_brock_interview/

    Reply
  27. Tomi Engdahl says:

    Intel launches 8th-gen Core desktop chips, claims the Core i7-8700K is its best gaming chip ever
    Intel’s new 8th-gen Core chips now include six cores on the high end, attacking one of AMD’s Ryzen advantages.
    https://www.pcworld.com/article/3227418/components-processors/intel-launches-8th-gen-core-desktop-chips-claims-the-core-i7-8700k-is-its-best-gaming-chip-ever.html

    Reply
  28. Tomi Engdahl says:

    Intel Core i9-7980XE And Core i9-7960X Review: Intel Attacks AMD Threadripper
    https://hothardware.com/reviews/intel-core-i9-7980xe-and-core-i9-7960x-review-and-benchmarks

    Over the last few months, Intel slowly divulged a number of details regarding the two processors we’ll be covering today, its flagship 18-core, behemoth Core i9-7980XE and its slightly smaller sibling, the 16-core Core i9-7960X. The fact that these two processors were in the works was officially disclosed all the way back in May, no doubt in an attempt to quell the impending AMD Threadripper firestorm, but specifics like frequencies, cache, and TDP were initially held back.

    Those specifications were eventually revealed, and some benchmarks leaked as well, but today we’ve got the full, official scoop.

    Reply
  29. Tomi Engdahl says:

    If Data Is the New Oil, Are Tech Companies Robbing Us Blind?
    https://yro.slashdot.org/story/17/09/25/2120225/if-data-is-the-new-oil-are-tech-companies-robbing-us-blind

    Data is the new oil, or so the saying goes. So why are we giving it away for nothing more than ostensibly free email, better movie recommendations, and more accurate search results? It’s an important question to ask in a world where the accumulation and scraping of data is worth billions of dollars — and even a money-losing company with enough data about its users can be worth well into the eight-figure region.

    If data is the new oil, are tech companies robbing us blind?
    https://www.digitaltrends.com/cool-tech/data-ownership-question/

    Viewed uncharitably, companies like Google and Facebook can appear almost like the unscrupulous oil man Daniel Plainview from Paul Thomas Anderson’s 2007 movie, There Will Be Blood; offering little more than tokenistic gestures in exchange for what amounts to a goldmine.

    “The defense of this practice is that these companies provide ‘free’ services, and that they deserve some reward for their innovation and ingenuity,” Dr. John Danaher, a lecturer at the School of Law at NUI Galway, who writes about the intersection of the law and emerging technology, told Digital Trends. “That may well be true, but I would argue that the rewards they receive are disproportionate. The other defense is that many companies provide for some revenue-sharing agreements with more popular users, such as YouTube. That’s becoming more true, too, but it’s only a handful of users who can make decent money from this.”

    Reply
  30. Tomi Engdahl says:

    Seth Stevenson / Wall Street Journal:
    Interview with Satya Nadella and Bill Gates on political climate, how AI, quantum computing, and mixed reality will shape the future as Nadella debuts memoir — On the occasion of the publication of Nadella’s first book, out this fall, Nadella and his predecessor talk shop

    A Rare Joint Interview with Microsoft CEO Satya Nadella and Bill Gates
    On the occasion of the publication of Nadella’s first book, out this fall, Nadella and his predecessor talk shop
    https://www.wsj.com/articles/a-rare-joint-interview-with-microsoft-ceo-satya-nadella-and-bill-gates-1506358852

    Reply
  31. Tomi Engdahl says:

    Robert Hof / SiliconANGLE:
    Nvidia unveils AI acceleration software, partners with server makers Huawei and Lenovo, and cloud providers Alibaba, Baidu, and Tencent in China push

    Unveiling AI acceleration software, Nvidia jumps into China market with big partners
    https://siliconangle.com/blog/2017/09/25/unveiling-ai-acceleration-software-nvidia-jumps-china-market-big-partners/

    Reply
  32. Tomi Engdahl says:

    Emil Protalinski / VentureBeat:
    Firefox 57 beta, called Firefox Quantum, arrives with major visual overhaul and faster next-generation engine

    Firefox 57 beta arrives with major visual overhaul and next-generation browser engine
    https://venturebeat.com/2017/09/26/firefox-57-beta-arrives-with-major-visual-overhaul-and-next-generation-browser-engine/

    Mozilla today updated the beta version and developer version of its browser to Firefox Quantum. “Since the version number — 57 — can’t really convey the magnitude of the changes we’ve made, and how much faster this new Firefox is, we’re calling this upcoming release Firefox Quantum,” Mozilla explains. The new massive beta release is available for Windows, Mac, Linux, and Android.

    The new name signals Firefox 57 is a huge release that incorporates the company’s plans to build a next-generation browser engine (Project Quantum) that takes full advantage of modern hardware. The goal is to make Firefox the fastest and smoothest browser for PCs and mobile devices

    Reply
  33. Tomi Engdahl says:

    Don’t let lack of All Flash Array experience hold you back
    Bridging the chasm
    https://www.theregister.co.uk/2017/09/27/dont_let_a_lack_of_afa_experience_hold_you_back/

    There are many reasons why someone might not be using an All-Flash Array. It might be that you can’t get the budget, say, or that you think it has been over-hyped. Maybe you are sceptical about whether your application set would benefit from it, or if it would fit into your operations seamlessly.

    Reply
  34. Tomi Engdahl says:

    7 keys to a successful business intelligence strategy
    https://www.cio.com/article/2437838/business-intelligence/7-keys-to-a-successful-business-intelligence-strategy.html

    BI success requires more than just a strong technology platform. It takes laser focus on processes and personnel — and a business-first approach to gaining insights from data.

    Business intelligence (BI) is essential for business growth and competitive advantage, yet reaping benefits from BI requires more than implementing the technology that enables it.

    In fact, deploying the technology is the easiest part of any BI initiative, according to Boris Evelson, vice president and principal analyst at Forrester Research. Getting the personnel and processes portions right are much more challenging, he says.

    Following are seven essential components of any successful BI strategy, according to several BI experts.
    1. Give business ownership over BI
    2. Monitor BI use and adjust as necessary
    3. Validate, validate, validate
    4. Focus on business problems first, then on data
    5. Prioritize — and build in processes for improvement
    6. Upskill ‘citizen’ data scientists
    7. Empower staff to tell stories with data

    Reply
  35. Tomi Engdahl says:

    IBM launches unified data analytics system, promises machine learning for all
    Big Blue analytics boss wants to automate, automate, then automate some more
    https://www.theregister.co.uk/2017/09/28/ibm_launches_system_to_bring_analytics_to_data_promises_machine_learning_in_everything/

    IBM has announced a unified analytics system that allows data scientists to work across multiple data stores in ways the company said should eliminate time-consuming data integration and preparation.

    The Integrated Analytics System, launched at the Strata Data Conference in New York, aims to let data scientists develop and deploy models wherever data resides.

    Rob Thomas, general manager for IBM Analytics, said he has “declared that machine learning will be a part of everything we deliver”.

    Reply
  36. Tomi Engdahl says:

    HPE sharpening the axe for 5,000 heads – report
    All part of CEO Whitman’s ‘long-term ops and financial blueprint’
    https://www.theregister.co.uk/2017/09/22/hewlett_packard_enterprise_prepares_to_chop_5000_staff/

    Hewlett Packard Enterprise is about to release the trap door again with 5,000 employees, or almost 10 per cent of its workforce, expected to fall through it.

    The redundancies, first reported by Bloomberg, are part of the HPE Next initiative that CEO Meg Whitman hatched in June, a radical programme – the latest in a long line – to improve financial results and compete with cloud rivals.

    HPE confirms Belfast-based 3PAR engineering office to close
    No, we’re not switching focus to Nimble, spokesman says
    https://www.theregister.co.uk/2017/09/28/hpe_closing_3par_belfast_engineering_office/

    Hewlett Packard Enterprise is to shutter its 3PAR engineering office in Belfast, Northern Ireland, but the firm emphasised the move does not mean it is shifting focus from 3PAR to Nimble storage.

    The Belfast 3PAR team came into being in 2007 when HP set up a Global Centre of Software Engineering Excellence at Forsythe House, Cromac Square. It started developing cloud-based software for 3PAR storage.

    Reply
  37. Tomi Engdahl says:

    IBM Now Has More Employees In India Than In the US
    https://news.slashdot.org/story/17/09/28/2348215/ibm-now-has-more-employees-in-india-than-in-the-us

    Over the last decade, IBM has shifted its center of gravity halfway around the world to India, making it a high-tech example of the globalization trends that the Trump administration has railed against. Today, the company employs 130,000 people in India — about one-third of its total work force, and more than in any other country. Their work spans the entire gamut of IBM’s businesses, from managing the computing needs of global giants like AT&T and Shell to performing cutting-edge research in fields like visual search, artificial intelligence and computer vision for self-driving cars. One team is even working with the producers of Sesame Street to teach vocabulary to kindergartners in Atlanta.

    IBM Now Has More Employees in India Than in the U.S.
    https://www.nytimes.com/2017/09/28/technology/ibm-india.html

    IBM has shifted its center of gravity halfway around the world to India, making it a high-tech example of the globalization trends that the Trump administration has railed against.

    Reply
  38. Tomi Engdahl says:

    10 critical security skills every IT team needs
    https://www.cio.com/article/3228965/it-skills-training/10-critical-security-skills-every-it-team-needs.html

    Focus on hiring talent with the following security skills and your team will be equipped to prevent, protect and mitigate the damage of cybersecurity attacks — and speed recovery efforts.

    Following are 10 security skills your organization should focus on when staffing up or upskilling your security teams.
    1. Security tools expertise
    2. Security analysis
    3. Project management
    4. Incident response
    5. Automation/devops
    6. Data science and data analytics
    7. Scripting
    8. Soft(er) skills
    9. Post-mortem deep forensics
    10. Passion

    Finally, good security talent has a passion for their work and a desire to share that knowledge, says Antoniewicz. That can manifest itself in various ways, from picking up a new programming language to taking courses to actively sharing knowledge across their organization or at community meetups, he says.

    “A good security person will have a major passion for sharing, learning and growing their knowledge all the time,”

    If you already have professionals like this on board, do whatever you can to encourage and support them. “Develop teambuilding exercises, knowledge-sharing sessions, get-togethers, hack-a-thons, demos of new products or solutions, bug bounties — any way you can continue their engagement and add fuel to their fire,” he says.

    Reply
  39. Tomi Engdahl says:

    10 terabytes on a company server

    Toshiba has introduced a 10-terabyte HDD hard drive for business use, with improved performance compared to its predecessor. The data transfer speed has been increased by a quarter, but the advantages and disadvantages of HDD technology remain unchanged.

    The biggest advantage of mechanical hard drives is still their cheaper price compared to flash-based solutions. However, the bus used by the MQ06 is its biggest bottleneck: the SATA bus transfers data at six gigabits per second, which often is not enough.

    Numerous videos can be found online measuring the differences between SSDs and HDDs. The Windows operating system boots at least 50 per cent faster from an SSD, and with heavy applications loading can take 3-4 times longer on an HDD.

    Source: http://etn.fi/index.php?option=com_content&view=article&id=6900&via=n&datum=2017-09-27_15:45:03&mottagare=31202
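    To see why the SATA bus is called the bottleneck for flash, compare its usable throughput with typical device speeds; a minimal Python sketch (the 8b/10b encoding factor and the device figures are typical values, not from the source article):

    # Usable SATA 3 throughput after 8b/10b encoding, compared with typical device speeds.
    sata_line_rate_gbps = 6.0
    sata_usable_mb_s = sata_line_rate_gbps * (8 / 10) * 1000 / 8   # ~600 MB/s

    typical_mb_s = {
        "SATA 3 bus (usable)": sata_usable_mb_s,
        "7200 rpm HDD, sequential": 250,
        "SATA SSD": 550,
        "NVMe PCIe 3.0 x4 SSD": 3000,
    }

    for name, mb_s in typical_mb_s.items():
        print(f"{name:<26} ~{mb_s:>5.0f} MB/s")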

    Reply
  40. Tomi Engdahl says:

    Best SSDs: Q3 2017
    by Billy Tallis on September 28, 2017 7:30 PM EST
    https://www.anandtech.com/show/9799/best-ssds

    Reply
  41. Tomi Engdahl says:

    James Somers / The Atlantic:
    Experts shed light on how model-driven engineering can help programmers solve problems and reduce software errors more effectively than traditional programming

    The Coming Software Apocalypse
    A small group of programmers wants to change how we code—before catastrophe strikes
    https://www.theatlantic.com/technology/archive/2017/09/saving-the-world-from-code/540393/

    It’s been said that software is “eating the world.” More and more, critical systems that were once controlled mechanically, or by people, are coming to depend on code. This was perhaps never clearer than in the summer of 2015, when on a single day, United Airlines grounded its fleet because of a problem with its departure-management system; trading was suspended on the New York Stock Exchange after an upgrade; the front page of The Wall Street Journal’s website crashed; and Seattle’s 911 system went down again, this time because a different router failed. The simultaneous failure of so many software systems smelled at first of a coordinated cyberattack. Almost more frightening was the realization, late in the day, that it was just a coincidence.

    “When we had electromechanical systems, we used to be able to test them exhaustively,” says Nancy Leveson, a professor of aeronautics and astronautics at the Massachusetts Institute of Technology who has been studying software safety for 35 years. She became known for her report on the Therac-25, a radiation-therapy machine that killed six patients because of a software error. “We used to be able to think through all the things it could do, all the states it could get into.”

    Software is different. Just by editing the text in a file somewhere, the same hunk of silicon can become an autopilot or an inventory-control system. This flexibility is software’s miracle, and its curse. Because it can be changed cheaply, software is constantly changed; and because it’s unmoored from anything physical—a program that is a thousand times more complex than another takes up the same actual space—it tends to grow without bound. “The problem,” Leveson wrote in a book, “is that we are attempting to build systems that are beyond our ability to intellectually manage.”

    The software did exactly what it was told to do. The reason it failed is that it was told to do the wrong thing.

    Our standard framework for thinking about engineering failures—reflected, for instance, in regulations for medical devices—was developed shortly after World War II, before the advent of software, for electromechanical systems. The idea was that you make something reliable by making its parts reliable (say, you build your engine to withstand 40,000 takeoff-and-landing cycles) and by planning for the breakdown of those parts (you have two engines). But software doesn’t break.

    Intrado’s faulty threshold is not like the faulty rivet that leads to the crash of an airliner. The software did exactly what it was told to do. In fact it did it perfectly. The reason it failed is that it was told to do the wrong thing. Software failures are failures of understanding, and of imagination.

    This is the trouble with making things out of code, as opposed to something physical. “The complexity,” as Leveson puts it, “is invisible to the eye.”

    The attempts now underway to change how we make software all seem to start with the same premise: Code is too hard to think about. Before trying to understand the attempts themselves, then, it’s worth understanding why this might be: what it is about code that makes it so foreign to the mind, and so unlike anything that came before it.

    When you press your foot down on your car’s accelerator, for instance, you’re no longer controlling anything directly; there’s no mechanical link from the pedal to the throttle. Instead, you’re issuing a command to a piece of software that decides how much air to give the engine. The car is a computer you can sit inside of. The steering wheel and pedals might as well be keyboard keys.

    Like everything else, the car has been computerized to enable new features.

    Software has enabled us to make the most intricate machines that have ever existed. And yet we have hardly noticed, because all of that complexity is packed into tiny silicon chips as millions and millions of lines of code. But just because we can’t see the complexity doesn’t mean that it has gone away.

    As programmers eagerly poured software into critical systems, they became, more and more, the linchpins of the built world—and Dijkstra thought they had perhaps overestimated themselves.

    “Software engineers don’t understand the problem they’re trying to solve, and don’t care to.”

    What made programming so difficult was that it required you to think like a computer.

    “The problem is that software engineers don’t understand the problem they’re trying to solve, and don’t care to,” says Leveson, the MIT software-safety expert. The reason is that they’re too wrapped up in getting their code to work. “Software engineers like to provide all kinds of tools and stuff for coding errors,” she says, referring to IDEs. “The serious problems that have happened with software have to do with requirements, not coding errors.”

    “There’s 100 million lines of code in cars now,” Leveson says. “You just cannot anticipate all these things.”

    Barr’s team demonstrated that there were actually more than 10 million ways for the onboard computer to cause unintended acceleration. They showed that as little as a single bit flip—a one in the computer’s memory becoming a zero or vice versa—could make a car run out of control. The fail-safe code that Toyota had put in place wasn’t enough to stop it.

    There will be more bad days for software. It’s important that we get better at making it, because if we don’t, and as software becomes more sophisticated and connected—as it takes control of more critical functions—those days could get worse.

    Since the 1980s, the way programmers work and the tools they use have changed remarkably little. There is a small but growing chorus that worries the status quo is unsustainable. “Even very good programmers are struggling to make sense of the systems that they are working with,”

    “Visual Studio is one of the single largest pieces of software in the world,” he said. “It’s over 55 million lines of code. And one of the things that I found out in this study is more than 98 percent of it is completely irrelevant.”

    Computers had doubled in power every 18 months for the last 40 years. Why hadn’t programming changed?

    Chris Granger, who had worked at Microsoft on Visual Studio, was likewise inspired. Within days of seeing a video of Victor’s talk, in January of 2012, he built a prototype of a new programming environment. Its key capability was that it would give you instant feedback on your program’s behavior. You’d see what your system was doing right next to the code that controlled it. It was like taking off a blindfold. Granger called the project “Light Table.”

    In April of 2012, he sought funding for Light Table on Kickstarter. In programming circles, it was a sensation. Within a month, the project raised more than $200,000. The ideas spread. The notion of liveness, of being able to see data flowing through your program instantly, made its way into flagship programming tools offered by Google and Apple. The default language for making new iPhone and Mac apps, called Swift, was developed by Apple from the ground up to support an environment, called Playgrounds, that was directly inspired by Light Table.

    “A lot of those things seemed like misinterpretations of what I was saying,”

    Although code had increasingly become the tool of choice for creating dynamic behavior, it remained one of the worst tools for understanding it. The point of “Inventing on Principle” was to show that you could mitigate that problem by making the connection between a system’s behavior and its code immediate.

    “Nobody would build a car by hand,” he says. “Code is still, in many places, handicraft. When you’re crafting manually 10,000 lines of code, that’s okay. But you have systems that have 30 million lines of code, like an Airbus, or 100 million lines of code, like your Tesla or high-end cars—that’s becoming very, very complicated.”

    Bantégnie’s company is one of the pioneers in the industrial use of model-based design, in which you no longer write code directly. Instead, you create a kind of flowchart that describes the rules your program should follow (the “model”), and the computer generates code for you based on those rules. If you were making the control system for an elevator, for instance, one rule might be that when the door is open, and someone presses the button for the lobby, you should close the door and start moving the car.
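
    As a rough illustration of the idea (plain Python rather than a real model-based tool such as SCADE, with invented rule names), the "model" can be as small as a table of rules from which a generic controller is mechanically derived:

    # Sketch only: a declarative rule table plus a controller derived from it.
    from typing import Optional

    RULES = [
        # (door_state, request,        action)
        ("open",   "lobby_button", "close_door_then_move"),
        ("closed", "lobby_button", "move_to_lobby"),
    ]

    def next_action(door_state: str, request: Optional[str]) -> str:
        """Behaves like code 'generated' from the rule table above."""
        for state, req, action in RULES:
            if state == door_state and req == request:
                return action
        return "wait"   # default: no matching rule, do nothing

    assert next_action("open", "lobby_button") == "close_door_then_move"

    The point is that the designer edits the table, not the control flow; the executable logic follows from it automatically.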

    “Typically the main problem with software coding—and I’m a coder myself,” Bantégnie says, “is not the skills of the coders. The people know how to code. The problem is what to code. Because most of the requirements are kind of natural language, ambiguous, and a requirement is never extremely precise, it’s often understood differently by the guy who’s supposed to code.”

    On this view, software becomes unruly because the media for describing what software should do—conversations, prose descriptions, drawings on a sheet of paper—are too different from the media describing what software does do, namely, code itself. Too much is lost going from one to the other. The idea behind model-based design is to close the gap. The very same model is used both by system designers to express what they want and by the computer to automatically generate code.

    Of course, for this approach to succeed, much of the work has to be done well before the project even begins.

    The idea behind Esterel was that while traditional programming languages might be good for describing simple procedures that happened in a predetermined order—like a recipe—if you tried to use them in systems where lots of events could happen at nearly any time, in nearly any order—like in the cockpit of a plane—you inevitably got a mess. And a mess in control software was dangerous. In a paper, Berry went as far as to predict that “low-level programming techniques will not remain acceptable for large safety-critical programs, since they make behavior understanding and analysis almost impracticable.”

    Esterel was designed to make the computer handle this complexity for you. That was the promise of the model-based approach: Instead of writing normal programming code, you created a model of the system’s behavior—in this case, a model focused on how individual events should be handled, how to prioritize events, which events depended on which others, and so on. The model becomes the detailed blueprint that the computer would use to do the actual programming.
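
    A crude way to picture what such an event model captures (plain Python, not Esterel, with invented event names and priorities) is a declared priority table driving a generic reaction loop, so that ordering decisions live in the model rather than being scattered through hand-written code:

    # Illustrative sketch of priority-ordered event handling derived from a declared model.
    import heapq

    PRIORITY = {"engine_fire": 0, "stall_warning": 1, "altitude_alert": 2, "cabin_light": 9}

    def react(pending_events):
        """Handle all pending events, most critical first, regardless of arrival order."""
        queue = [(PRIORITY[event], event) for event in pending_events]
        heapq.heapify(queue)
        while queue:
            _, event = heapq.heappop(queue)
            print(f"handling {event}")

    react(["cabin_light", "stall_warning", "engine_fire"])
    # -> engine_fire, then stall_warning, then cabin_light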

    Today, the ANSYS SCADE product family (for “safety-critical application development environment”) is used to generate code by companies in the aerospace and defense industries, in nuclear power plants, transit systems, heavy industry, and medical devices. “My initial dream was to have SCADE-generated code in every plane in the world,”

    Part of the draw for customers, especially in aviation, is that while it is possible to build highly reliable software by hand, it can be a Herculean effort.

    traditional projects begin with a massive requirements document in English, which specifies everything the software should do

    The problem with describing the requirements this way is that when you implement them in code, you have to painstakingly check that each one is satisfied. And when the customer changes the requirements, the code has to be changed, too, and tested extensively to make sure that nothing else was broken in the process.

    The cost is compounded by exacting regulatory standards. The FAA is fanatical about software safety. The agency mandates that every requirement for a piece of safety-critical software be traceable to the lines of code that implement it, and vice versa. So every time a line of code changes, it must be retraced to the corresponding requirement in the design document, and you must be able to demonstrate that the code actually satisfies the requirement. The idea is that if something goes wrong, you’re able to figure out why;
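
    The bookkeeping behind that traceability requirement can be pictured with a small check like the one below (a hedged sketch: the requirement IDs and function names are invented, and real avionics certification tooling is far more involved):

    # Sketch of bidirectional traceability: every requirement maps to code, and
    # every traced code unit maps back to a known requirement.
    requirements = {"REQ-101": "Close door before moving", "REQ-102": "Stop at requested floor"}
    code_traces  = {"close_door": ["REQ-101"], "move_car": ["REQ-101", "REQ-102"]}

    traced = {req for reqs in code_traces.values() for req in reqs}

    untraced_requirements = set(requirements) - traced    # requirements no code implements
    orphan_traces = traced - set(requirements)            # code tracing to unknown requirements

    assert not untraced_requirements and not orphan_traces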

    “it’s a very labor-intensive process.” He estimates that before they used model-based design, on a two-year-long project only two to three months was spent writing code—the rest was spent working on the documentation.

    As Bantégnie explains, the beauty of having a computer turn your requirements into code, rather than a human, is that you can be sure—in fact you can mathematically prove—that the generated code actually satisfies those requirements.

    Still, most software, even in the safety-obsessed world of aviation, is made the old-fashioned way

    Most programmers feel the same way. They like code. At least they understand it. Tools that write your code for you and verify its correctness using the mathematics of “finite-state machines” and “recurrent systems” sound esoteric and hard to use, if not just too good to be true.

    It is a pattern that has played itself out before.

    You could do all the testing you wanted and you’d never find all the bugs.

    when assembly language was itself phased out in favor of the programming languages still popular today, like C, it was the assembly programmers who were skeptical this time

    No wonder, he said, that “people are not so easily transitioning to model-based software development: They perceive it as another opportunity to lose control, even more than they have already.”

    The bias against model-based design, sometimes known as model-driven engineering, or MDE, is in fact so ingrained that according to a recent paper, “Some even argue that there is a stronger need to investigate people’s perception of MDE than to research new MDE technologies.”

    “Human intuition is poor at estimating the true probability of supposedly ‘extremely rare’ combinations of events in systems operating at a scale of millions of requests per second,” he wrote in a paper. “That human fallibility means that some of the more subtle, dangerous bugs turn out to be errors in design; the code faithfully implements the intended design, but the design fails to correctly handle a particular ‘rare’ scenario.”
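
    The arithmetic behind that warning is worth spelling out: at a million requests per second, even a "one in a billion" combination is expected to show up within minutes (illustrative figures only):

    # Expected time until a one-in-a-billion per-request event first occurs
    # at one million requests per second.
    p_per_request = 1e-9
    requests_per_second = 1e6

    expected_seconds = 1 / (p_per_request * requests_per_second)
    print(expected_seconds)         # 1000.0 seconds
    print(expected_seconds / 60)    # ~16.7 minutes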

    TLA+, which stands for “Temporal Logic of Actions,” is similar in spirit to model-based design: It’s a language for writing down the requirements—TLA+ calls them “specifications”—of computer programs. These specifications can then be completely verified by a computer.
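
    The flavor of that idea (though not TLA+'s actual notation, and using a made-up toy system) can be sketched in a few lines of Python: enumerate every reachable state of a small model and check that an invariant holds in all of them:

    # Brute-force state exploration in the spirit of a model checker (not TLA+ itself).
    def step(state):
        """All possible next states of a toy two-counter system (nondeterministic)."""
        a, b = state
        return {(min(a + 1, 3), b), (a, min(b + 1, 3)), (0, b), (a, 0)}

    def invariant(state):
        a, b = state
        return a + b <= 6           # the property we claim always holds

    seen, frontier = set(), {(0, 0)}
    while frontier:
        state = frontier.pop()
        assert invariant(state), f"invariant violated in {state}"
        seen.add(state)
        frontier |= step(state) - seen

    print(f"checked {len(seen)} reachable states; the invariant holds in every one")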

    The language was invented by Leslie Lamport, a Turing Award–winning computer scientist.

    For Lamport, a major reason today’s software is so full of bugs is that programmers jump straight into writing code. “Architects draw detailed plans before a brick is laid or a nail is hammered,” he wrote in an article. “But few programmers write even a rough sketch of what their programs will do before they start coding.”

    “The idea that there’s some higher level than the code in which you need to be able to think precisely, and that mathematics actually allows you to think precisely about it, is just completely foreign. Because they never learned it.”

    Lamport sees this failure to think mathematically about what they’re doing as the problem of modern software development in a nutshell: The stakes keep rising, but programmers aren’t stepping up—they haven’t developed the chops required to handle increasingly complex problems.

    Newcombe isn’t so sure that it’s the programmer who is to blame.

    Most programmers who took computer science in college have briefly encountered formal methods.

    “I needed to change people’s perceptions on what formal methods were,”

    Instead, he presented TLA+ as a new kind of “pseudocode,” a stepping-stone to real code that allowed you to exhaustively test your algorithms—and that got you thinking precisely early on in the design process. “Engineers think in terms of debugging rather than ‘verification,’”

    In the summer of 2015, a pair of American security researchers, Charlie Miller and Chris Valasek, convinced that car manufacturers weren’t taking software flaws seriously enough

    “We need to think about software differently,” Valasek told me. Car companies have long assembled their final product from parts made by hundreds of different suppliers. But where those parts were once purely mechanical, they now, as often as not, come with millions of lines of code.

    “There are lots of bugs in cars,” Gerard Berry, the French researcher behind Esterel, said in a talk. “It’s not like avionics—in avionics it’s taken very seriously. And it’s admitted that software is different from mechanics.” The automotive industry is perhaps among those that haven’t yet realized they are actually in the software business.

    “We don’t in the automaker industry have a regulator for software safety that knows what it’s doing,”

    One suspects the incentives are changing. “I think the autonomous car might push them,” Ledinot told me—“ISO 26262 and the autonomous car might slowly push them to adopt this kind of approach on critical parts.” (ISO 26262 is a safety standard for cars published in 2011.) Barr said much the same thing: In the world of the self-driving car, software can’t be an afterthought. It can’t be built like today’s airline-reservation systems or 911 systems or stock-trading systems. Code will be put in charge of hundreds of millions of lives on the road and it has to work. That is no small task.

    “When your tires are flat, you look at your tires, they are flat. When your software is broken, you look at your software, you see nothing.”

  42. Tomi Engdahl says:

    The USB bus is now twice as fast

    The USB-IF (USB Implementers Forum) has released the final version of the USB 3.2 specification. Four years have passed since the previous standard, USB 3.1 SuperSpeed.

    With the 3.2 standard, the speed of the bus doubles from the current 10 gigabits per second to 20 gigabits per second. The speed increase is made possible by using more of the conductors in the USB Type-C connector.

    With the 3.2 standard, data runs over a USB Type-C connection at 20 gigabits per second, or 2.5 gigabytes per second. That is twice the speed of the 3.1 standard, but half the speed of Thunderbolt 3.

    Equipment manufacturers have not yet released any USB 3.2 equipment.

    Source: http://www.etn.fi/index.php/13-news/6927-usb-vaeylae-on-nyt-kaksi-kertaa-nopeampi

  43. Tomi Engdahl says:

    Good Linux news for Android

    The Linux kernel gets a new version every couple of months. In addition, some kernel releases are designated as long-term support (LTS) versions. Now this LTS support is being extended to as long as six years. This is good news not only for many ARM devices but also for Android.

    Google developer Iliyan Malchev announced at the Linaro Connect 2017 event that LTS support is being extended from two years to six years. Linux LTS maintainer Greg Kroah-Hartman confirmed on Twitter that kernel 4.4 will receive LTS support for six years.

    At this time, Google provides major functional updates for its Android versions for two years and security updates for three years.

    Source: http://www.etn.fi/index.php/13-news/6925-hyviae-linux-uutisia-androidille

  44. Tomi Engdahl says:

    BYOD might be a hipster honeypot but it’s rarely worth the extra hassle
    Security, compatibility, control… we enter another world of pain
    https://www.theregister.co.uk/2017/10/02/falling_out_of_love_with_byod/

    I have a confession: I’ve fallen out of love with Bring Your Own Device.

    Over the years, I’ve worked with, and administered, a number of BYOD schemes. I’ve even written positive things about BYOD.

    After all, what was not to love? Users provided the mobile equipment, the company didn’t need to worry about maintaining the kit, and yet it could treat the devices like company property, managing device and content securely.

    Just four years ago, Gartner reckoned by 2017 half of employers would be leaning on staff to supply their own smartphones or tablets. Somehow, this would let us deliver all kinds of business apps at the touch of a screen. Things like self-service HR or mobile CRM.

    Some ludicrous statements started being made: BYOD had become a critical plank in attracting millennials – a generation addicted to mobiles and social media – to your place of work. If you didn’t have a BYOD programme and the competition did, well, guess where that potential new hire wearing the chin thatch and lumberjack shirt would choose to work.

    The kit belongs to the user

    On the face of it, users owning the kit is a great idea. When they sign up to the scheme they’re agreeing that the equipment is their responsibility. It’s up to them to have a warranty that’ll get it fixed if it breaks. If it doesn’t work, that’s their problem. Well worth the price we paid to help fund the kit.

    Except it doesn’t work like that. Unless they’ve paid for a stonkingly expensive maintenance contract their kit will likely be on a collect-and-repair scheme, which means that if it exudes blue smoke (or simply goes silent on them) they’re without it for a few days while the vendor wrangles with it, bangs it with a hammer, and so on. So what do they do in the meantime? At the very least you’ll need to have a small stock of spare kit

    And even when the equipment is alive, this doesn’t mean it won’t get sick once in a while. Even my own kit has a bit of a hiccup sometimes…

    The next question is how you give the users connectivity into your systems. Connecting stuff you don’t own into the corporate network is a security nightmare – you absolutely don’t want to hook it in directly, because one outdated anti-malware package can wreak havoc with your world. So you have a number of options.

    First is the concept of a “quarantine” VLAN. The idea’s simple: when anything accesses the network for the first time in a session, the infrastructure puts it in a VLAN that can’t see much – generally it can’t see anything but the internet and a server that deals with network admission. The admission server won’t let the device join the proper LAN unless it’s convinced that the device’s OS is up-to-date with patches, that it’s running a suitable anti-malware package, and that the latter is also current with regard to its patches and virus signature files. Now, although it’s a simple idea it’s also relatively complex to implement and has a non-trivial cost: so unless your BYOD world is extensive, it may not be worth it.
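
    Stripped to its essentials, the admission decision looks something like the sketch below (illustrative only: real network admission control products check far more, and the device fields here are invented):

    # Simplified admission check behind a quarantine VLAN.
    from datetime import date

    def admit_to_corporate_lan(device: dict, today: date) -> bool:
        """Return True only if the device may leave the quarantine VLAN."""
        patches_current  = device["os_patch_level"] >= device["required_patch_level"]
        av_installed     = device["antivirus"] is not None
        signatures_fresh = av_installed and (today - device["signature_date"]).days <= 7
        return patches_current and av_installed and signatures_fresh

    laptop = {
        "os_patch_level": 20170914,
        "required_patch_level": 20170914,
        "antivirus": "SomeAV",
        "signature_date": date(2017, 10, 1),
    }
    print(admit_to_corporate_lan(laptop, date(2017, 10, 3)))   # True: every check passes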

    An alternative is to decide that anything BYOD needs to stay outside the network completely, and act simply as a dumb terminal to the corporate system. You generally achieve this using some kind of virtual desktop à la Citrix or VMware.

    Now I want to manage it

    Yes, there are sandbox-based apps that sit alongside the users’ own apps and are manageable centrally. But these cost money,

    So what’s the alternative?

    Well, you could just decide to buy a bunch of company-owned equipment. You’d devise a standard desktop/laptop build (maybe you’ll issue laptops and docking stations so rovers don’t have to have a desktop and a laptop) which your service desk was trained on.

    That old buzzword

    BYOD sounds – sounded – like a great idea. But it opened a whole new world of complexity in terms of support and device management that had not been foreseen beforehand.

    It raised soft problems, too: it greyed the lines of who owned the device and what you’re allowed to do with it. Deleting somebody’s files during an application update, for example, probably wouldn’t go down so well.

    No, much better to bring back control over the ownership and supply of devices.

  45. Tomi Engdahl says:

    Linux Marketshare on Desktops Apparently Hit 6.91% in September
    http://www.omgubuntu.co.uk/2017/10/linux-marketshare-6-91-percent-september-2017

    Desktop Linux marketshare hit an all-time high of 6.91% in September 2017, according to preliminary data from web analytics firm NetMarketShare.

    The figure is impressive but is also highly irregular and out of sync with the reported Linux marketshare from other companies like StatCounter and Wikimedia.

    Interestingly, if this figure is in any way accurate, it would mean that Linux marketshare has not only pretty much doubled in the space of 30 days but has overtaken macOS in the process!

  46. Tomi Engdahl says:

    How Microsoft built the world’s most powerful game console
    https://www.theverge.com/2017/10/2/16347446/microsoft-xbox-one-x-design-sony-playstation-4-pro-console-generations

    An inside look at the development of the Xbox One X, and what it means for the future of console gaming

    When discussing how he and his team designed the Xbox One X, Sparks says the process revolved around one pivotal benchmark. The new console had to be compact, more so than any comparable PC. In fact, it would need to be even smaller than the Xbox One S, itself a slim version of the original Xbox One. There was one big issue: the new console wouldn’t just have to be smaller, it also needed to be 40 percent more powerful.

    To tell the story of how its Xbox team accomplished such a feat, Microsoft invited us up to its Redmond, Washington headquarters to give an inside look at the design process behind the Xbox One X. But the company was also willing to go beyond the question of how and into the question of why — why make such a device, and why now?

  47. Tomi Engdahl says:

    SiliconANGLE:
    Oracle unveils 18c, an ML-driven self-patching and self-tuning database, coming in December for data warehousing and in June for online transaction processing

    Targeting cybersecurity, Larry Ellison debuts Oracle’s new ‘self-driving’ database
    https://siliconangle.com/blog/2017/10/01/larry-ellison-debuts-oracles-next-gen-self-driving-database/

    Oracle Corp. Sunday announced the next generation of its database, which founder and Chief Technology Officer Larry Ellison said will be able to handle key tasks such as critical software patches automatically.

    Ellison (pictured) also said that the autonomous database will help on another product to be introduced in detail Tuesday: new cybersecurity technology that will offer automated threat detection and immediate fixes for the threat. Both the new database and the new cybersecurity services are driven by machine learning, the branch of artificial intelligence that allows computers to learn on their own.

  48. Tomi Engdahl says:

    Oracle CEO Mark Hurd reads ‘mean tweets’ about his 2025 vision
    Hurd flames first after his 2016 predictions were live-trashed
    https://www.theregister.co.uk/2017/10/03/oracle_ceo_has_trump_moment_takes_on_mean_tweets/


    OpenWorld 2017 Oracle Co-CEO Mark Hurd appears not to have shrugged off past criticism of his predictions for the state of cloud computing in the year 2025, a staple of his recent appearances at Big Red’s OpenWorld gabfest.

    However, during today’s keynote, he decided to do something a little more … adversarial.

    In a moment reminiscent of celebrities reading less-than-kind social media feedback on talk shows, or perhaps president Trump’s late-night tweet storms against his detractors, Hurd flashed up a shot of some “mean tweets” he’d received about past predictions.

    These predictions included that 80 per cent of IT spending would be on cloud services, that the number of corporate-owned data centres would drop by 80 per cent and that there would be two SaaS providers by 2025.

    “They’re predictions, so you’d think that not many people wouldn’t find them that challenging; you wouldn’t get that much negative commentary,” Hurd said.

    The main thrust of Hurd’s speech was on the importance of cloud, combined with the increased pressure on companies to secure their systems and manage risk.

  49. Tomi Engdahl says:

    Java EE 8 takes final bow under Oracle’s wing: Here’s what’s new
    Long-delayed update adds support for modern web tech
    https://www.theregister.co.uk/2017/10/02/java_ee_8_takes_final_bow_under_oracles_wing/

    OpenWorld Java EE 8 arrived last month rather later than expected – but it landed in time for Oracle OpenWorld and JavaOne, which are taking place this week in San Francisco, California.

    Enterprise-flavored Java hasn’t seen an update since June 2013. Linda DeMichiel, Java EE 8 specification lead at Oracle, recounted the long road to delivering the update in a session on Monday at the JavaOne conference.

    “Java 8 is the next and probably last step in the Java EE brand for the enterprise platform here at Oracle,” said DeMichiel, in reference to Oracle’s plan to turn Java EE 8 over to the Eclipse Foundation.

  50. Tomi Engdahl says:

    Mainframes are hip now! Compuware fires its dev environment into cloud
    But analysts say good luck convincing newcomers
    https://www.theregister.co.uk/2017/10/02/compuware_shows_off_shiny_new_mainframe_cloud_ide_toy/

    In an attempt to entice new blood to those dinosaur systems of record known as mainframes, Detroit software firm Compuware has moved its development environment to the cloud.

    The company’s flagship mainframe Agile/DevOps product Topaz is now available on Amazon Web Services.

    As Compuware and competitors such as IBM and Micro Focus have realised, in addition to convincing firms to take on high licensing usage fees, there’s a mainframe developer shortage. Industry has struggled to convince students to learn ancient programming languages such as COBOL.

    Compuware Introduces Cloud Access to Mainframe Development
    Topaz Availability on AWS is Industry-first, Transforming Enterprises’ Ability to Quickly Modernize COBOL
    https://globenewswire.com/news-release/2017/10/02/1138589/0/en/Compuware-Introduces-Cloud-Access-to-Mainframe-Development.html

