Computer trends 2017

I did not have time to post my computer technology predictions at the end of 2016. Because I missed the year-end deadline, I thought there was no point in posting anything before the news from CES 2017 had been published. Here are some of my picks on the current computer technology trends:

CES 2017 had three significant technology trends: deep learning goes deep, Alexa everywhere, and Wi-Fi gets meshy. The PC sector seemed pretty boring.

Gartner expects IT spending to grow (2.7%), but hardware sales will see no growth and could even drop this year. TEKsystems’ 2017 IT forecast shows IT budgets rebounding from a slump in 2016 and IT leaders’ confidence high going into the new year, but challenges around talent acquisition and organizational alignment will persist. Programming and software development continue to be among the most crucial and hard-to-find IT skill sets.

Smartphone sales (expected to be 1.89 billion) and PC sales (expected to be 432 million) will not grow in 2017. According to IDC, PC shipments declined for a fifth consecutive year in 2016 as the industry continued to suffer from stagnation and a lack of compelling drivers for upgrades. Both Gartner and IDC estimated that PC shipments declined about 6% in 2016. Revenue in the traditional (non-cloud) IT infrastructure segment decreased 10.8 per cent year over year in the third quarter of 2016. The only PC category with potential for growth is ultramobile (which includes the Microsoft Surface and the Apple MacBook Air). Demand for memory chips is increasing.

Browsers suffer from JavaScript-creep disease: the browsing experience seems to become slower even though computers and broadband connections keep getting faster. Bloat on web pages has been going on for ages, and this trend seems set to continue.

Microsoft tries all it can to make people switch from older Windows versions to Windows 10. Microsoft says that continued use of Windows 7 increases maintenance and operating costs for businesses because of malware attacks that could have been avoided by upgrading to Windows 10; in Microsoft’s words, Windows 7 does not meet the demands of modern technology, and it recommends Windows 10. In February 2017 Microsoft ends its 20-year-long tradition of monthly security updates. The Windows 10 “Creators Update” is coming in early 2017 for free, featuring 3D and mixed reality, 4K gaming, and more.

Microsoft plans to emulate x86 instructions on ARM chips, throwing a compatibility lifeline to future Windows tablets and phones. Microsoft’s x86-on-ARM64 emulation is coming in 2017. This capability is coming to Windows 10, though not until the “Redstone 3” release in the fall of 2017.

Parents should worry less about the amount of time their children spend using smartphones, computers and video games, because screen time is actually beneficial, the University of Oxford has concluded. 257 minutes is the amount of time teens can spend on computers each day before it starts to harm their wellbeing.

Outsourcing IT operations to foreign countries is not trendy anymore, and outsourcing companies live in uncertain times. India’s $150 billion outsourcing industry stares at an uncertain future. In the past five years, revenue and profit growth for the top five companies listed on the BSE have halved. Industry leader TCS has also felt the impact as it shifts its business model towards software platforms and chases digital contracts.

Containers will become hot this year, and cloud will stay hot. Research firm 451 Research predicts that containerization will be a US $762 million business this year and will grow into a $2.6 billion software business by 2020 (roughly a 40 per cent annual growth rate).

Cloud services are expected to grow at a 22 percent annual rate: by 2020, the sector would grow from the current $22.2 billion to $46 billion. In Finland, 30% of companies now prefer to buy cloud services when buying IT (20 per cent of IT budgets go to the cloud). Cloud spend will make up over a third of IT budgets by 2017: cloud and hosting services will be responsible for 34% of IT budgets by 2017, up from 28% at the end of 2016, according to 451 Research. Cloud services have many advantages, but they also have disadvantages. In five years, SaaS will be the cloud that matters.
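
As a quick sanity check of the figures above, the sketch below (plain TypeScript; only the 22 percent growth rate and the $22.2 billion starting point come from the paragraph, the rest is simple compounding) shows that a 22 percent annual growth rate does carry the market from roughly $22 billion into the mid-$40-billion range in about four years:

    // Back-of-the-envelope compound growth check for the cloud services market.
    // Starting value and growth rate are the figures cited above.
    const startBillionUSD = 22.2;
    const annualGrowth = 0.22;

    for (let year = 1; year <= 4; year++) {
      const value = startBillionUSD * Math.pow(1 + annualGrowth, year);
      console.log(`Year ${year}: $${value.toFixed(1)} B`);
    }
    // Year 4: ~$49 B, the same ballpark as the cited $46 B by 2020.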

As the cloud grows, so does spending on cloud hardware by the cloud companies. Cloud hardware spend has hit US$8.4bn per quarter as traditional kit sinks, and 2017 is forecast to see cloud kit clock $11bn every 90 days. In 2016’s third quarter, vendor revenue from sales of infrastructure products (server, storage, and Ethernet switch) for cloud IT, including public and private cloud, grew by 8.1 per cent year over year to $8.4 billion. Private cloud accounted for $3.3 billion, with the rest going to public clouds. Data centers need lower-latency components, so Google searches for better silicon.

The first signs of the decline and fall of the 20+ year x86 hegemony will appear in 2017. The availability of industry leading fab processes will allow other processor architectures (including AMD x86, ARM, Open Power and even the new RISC-V architecture) to compete with Intel on a level playing field.

USB-C will now come to screens – the USB Type-C connector promises to become the single physical interface for practically all equipment. The HDMI connector will disappear from laptops in the future. Thunderbolt 3 uses the USB Type-C connector, but it’s not the same thing (Thunderbolt 3 is four times faster than USB 3.1).

World’s first ‘exascale’ supercomputer prototype will be ready by the end of 2017, says China

It seems that Oracle begins aggressively pursuing Java licensing fees in 2017. Java SE is free, but Java SE Suite and various flavors of Java SE Advanced are not. Oracle is massively ramping up audits of Java customers it claims are in breach of its licences – six years after it bought Sun Microsystems. Huge sums of money are at stake. The version of Java in contention is Java SE, with three paid flavours that range from $40 to $300 per named user and from $5,000 to $15,000 for a processor licence. If you download Java, you get everything – so you need to make sure you are installing only the components you are entitled to, and remove the bits you aren’t using.

The Your Year in Review, Unsung Hero article sees the following trends in 2017:

  • A battle between ASICs, GPUs, and FPGAs to run emerging workloads in artificial intelligence
  • A race to create the first generation of 5G silicon
  • Continued efforts to define new memories that have meaningful impact
  • New players trying to take share in the huge market for smartphones
  • An emerging market for VR gaining critical mass

Virtual reality will stay hot on both PC and mobile. “VR is the heaviest heterogeneous workload we encounter in mobile—there’s a lot going on, much more than in a standard app,” said Tim Leland, a vice president for graphics and imaging at Qualcomm. The challenge is the need to process data from multiple sensors and respond with updated visuals in less than 18 ms to keep up with the viewer’s head motions, so the CPUs, GPUs, DSPs, sensor fusion core, display engine, and video-decoding block are all running at close to full tilt.
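
To make the 18 ms figure concrete, here is a minimal motion-to-photon budget check (a TypeScript sketch; only the 18 ms total comes from the quote above, the per-stage times are invented placeholders for illustration):

    // Rough motion-to-photon budget for a mobile VR frame.
    // Only the 18 ms total budget is from the article; stage times are illustrative.
    const budgetMs = 18;

    const stageMs: Record<string, number> = {
      sensorFusion: 2,        // read and fuse head-tracking sensors
      appAndSimulation: 3,    // game/application logic
      gpuRender: 8,           // render both eye views
      reprojectionScanout: 4, // timewarp/reprojection and display scan-out
    };

    const totalMs = Object.values(stageMs).reduce((a, b) => a + b, 0);
    console.log(`${totalMs} ms used of ${budgetMs} ms budget: ` +
                (totalMs <= budgetMs ? "OK" : "over budget"));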

 


932 Comments

  1. Tomi Engdahl says:

    Open-source world resurrects Oracle-free Solaris project OmniOS
    People power!
    https://www.theregister.co.uk/2017/07/13/open_source_community_resurrects_omnios/

    The open-source community has fought back and resurrected the development of OmniOS – an Oracle-free non-proprietary variant of Solaris, which had been shelved in April.

    The development of OmniOS, a distribution of Illumos derived from Sun’s open-source flavor of Solaris, was killed after five years of work by web applications biz OmniTI.

    It was hoped OmniOS would be community-driven, simple to use, and fast to install and operate. However, the project was axed, as the project failed to make any cash out of the development and a community failed to emerge. Consequently all work stopped and support contracts were not renewed.

    OmniTI had expressed the hope that “the community” would take up further development of the OS. Some 14 weeks later, the launch of Swiss association OmniOS Community Edition was announced in a blog.

    OmniOS Community Edition r151022k
    http://www.omniosce.org/

    Reply
  2. Tomi Engdahl says:

    Adobe Should Open-Source Flash Player, Petition Requires
    Adobe plans to retire Flash Player by the end of 2020
    http://news.softpedia.com/news/adobe-should-open-source-flash-player-petition-requires-517227.shtml

    Adobe recently announced that it would be killing off Flash Player by the end of 2020, and tech giants like Microsoft, Apple, and Google revealed plans to migrate to alternative solutions and ensure a smooth transition for all users.

    But not everyone seems to be ready to let Flash Player go, especially given the impact it could have on the browsing experience, as millions of websites out there are believed to be using Flash content that would become obsolete once Adobe pulls the plug on its software.

    A petition published on GitHub by developer Juha Lindstedt calls for Adobe to open-source Flash Player, explaining that otherwise “future generations can’t access the past and games, experiments, and websites would be forgotten.”

    Petition to open source Flash spec
    https://github.com/pakastin/open-source-flash

    However Flash along with its sister project Shockwave is an important piece of Internet history and killing Flash and Shockwave means future generations can’t access the past. Games, experiments and websites would be forgotten.

    Open sourcing Flash and the Shockwave spec would be a good solution to keep Flash and Shockwave projects alive safely for archive reasons. Don’t know how, but that’s the beauty of open source: you never know what will come up after you go open source! There might be a way to convert swf/fla/drc/dir to HTML5/canvas/webgl/webassembly, or some might write a standalone player for it. Another possibility would be to have a separate browser. We’re not saying Flash and Shockwave player should be preserved as is.

    Reply
  3. Tomi Engdahl says:

    Adobe announces the end of Flash
    https://www.neowin.net/news/adobe-announces-the-end-of-flash

    It’s finally happening. Long regarded as a miserable blight on the world wide web, Flash is going to the great tech graveyard in the sky – or perhaps to the depths of hell – and a date has been set for its demise.

    Today, Adobe announced that it is “planning to end-of-life Flash”, and said it will “stop updating and distributing the Flash Player at the end of 2020”. For now, Adobe remains “committed to supporting Flash through 2020”, and will continue to distribute security patches, maintain OS and browser compatibility, and even add new features and capabilities “where needed”.

    With the availability of newer web standards, such as HTML5 and WebGL, Flash has little reason to exist; major browsers have already begun phasing out their support, including Firefox, Chrome, and Microsoft Edge, which started blocking Flash content by default with the Windows 10 Creators Update.

    Reply
  4. Tomi Engdahl says:

    Artem Tashkinov: Independent Hardware Vendors Hate Linux
    http://phoronix.com/scan.php?page=news_item&px=Tashkinov-2017-Linux

    Independent commentator Artem S. Tashkinov is back at it again with his latest thoughts on GNU/Linux and its problems in a post entitled “Why Linux/GNU might never succeed on a large scale”.

    Tashkinov has previously ranted about problems he views with Linux as well as other operating systems like Android and Windows 10. His latest controversial thoughts are on why he thinks GNU/Linux might never succeed on a large scale. But then again, many of you will probably agree GNU/Linux has already succeeded on an enormous scale — well, at least in clouds, servers, and workstations. When it comes to Linux on the desktop, most reports still put the overall Linux desktop at around 2% with the Linux gaming market-share at under 1%. And, of course, there still hasn’t been a break-through GNU/Linux smartphone that’s done well in overall markets.

    The “IT guy” argues that hardware vendors hate dealing with the Linux kernel over the lack of control, the frequent breakage of the Linux kernel API, the inability for some vendors to publish documentation on their hardware, and regressions in drivers created by the open-source community.

    When it comes to IHVs, there are plenty out there still scared of Linux or think it’s not worthwhile to invest in given the current desktop numbers, especially within the PR/marketing departments at some of these companies.

    While some points raised by Tashkinov have some merit such as around the Linux kernel’s lack of a stable API/ABI, others of you probably see things quite differently.

    Why Linux/GNU might never succeed on a large scale
    https://itvision.altervista.org/why-linux-gnu-might-never-succeed.html

    I’ve always wanted to talk about the Linux kernel and how well it supports various hardware devices and recently I got a perfect opportunity when I was asked the following question:

    “Why can’t computer manufacturers work with Linux, FreeBSD and other open source developers to make their hardware work properly first time? I think we all know the answer: Microsoft has a very tight grip over the computer manufacturers and they produce everything for Windows only and dare not offer alternatives in case Microsoft increases the price of Windows or withholds technical information”.

    In reality, independent hardware vendors hate to support the Linux kernel for many reasons:

    They cannot control their code in the Linux kernel.
    They cannot properly do QA/QC for eight kernel releases (at the time of writing there are eight supported kernel releases) and Linux developers love to break APIs all the time and new APIs sometimes require heavy modification of the code which means a new round of QA which costs a lot of money. Compare that to three supported Windows releases: 7, 8.1 and 10. Also consider the fact that only the display driver model substantially differs between these three Windows versions.

    Many hardware devices don’t have proper spec sheets and documentation, or such things cannot be published due to laws (e.g. HDMI algorithms, various TPMs devices, etc. etc. etc.) or out of a fear of competition.

    The Linux kernel “features” such massive code changes every release, often with a complete absence of QC/QA, that drivers might break due to very tangential changes in unrelated kernel subsystems. You just cannot develop drivers in such a massively unstable environment even if you tried.

    Linux kernel developers love to say, “release your specs and we’ll write the code for you” and while it’s true, too often they don’t have the resources to properly debug the code on a multitude of different devices and devices combinations.

    The open source development model might work for very basic very standard devices like motherboards (sans ACPI and software suspend), PCI NICs, mouse/KB controllers, USB buses, etc.

    It’s almost impossible to apply to GPUs (Radeon/Intel open source drivers don’t support many hardware features of respective GPUs), proprietary RAID/storage controllers, cameras, touch sensors, hardware sensors, various devices which implement image processing, encryption, protection and central management and many other classes of devices.

    Reply
  5. Tomi Engdahl says:

    Cryptocurrency miners are renting Boeing 747s to ship graphics cards
    http://www.pcgamer.com/cryptocurrency-miners-are-renting-boeing-747s-to-ship-graphics-cards/

    Have you ever had a moment where you didn’t know whether to laugh or cry? That’s the situation playing out in the graphics card market because of the cryptocurrency mining boom, a topic we’ve covered extensively in recent months. But just when we thought there was nothing left to report on the matter, it’s come out that some of the most active Ethereum miners are renting Boeing 747 airplanes to ship orders of graphics cards. Yes, seriously.

    That is the sort of money that is at stake here. Cryptocurrency is highly volatile, Ethereum included. For miners with massive setups, shipping by sea is just too slow.

    “Time is critical, very critical,” Marco Streng, chief executive of Genesis Mining, told Quartz. “For example, we are renting entire airplanes, Boeing 747s, to ship on time. Anything else, like shipping by sea, loses so much opportunity.”

    Ethereum miners are renting Boeing 747s to ship graphics cards and AMD shares are soaring
    https://qz.com/1039809/amd-shares-are-soaring-ethereum-miners-are-renting-boeing-747s-to-ship-graphics-cards-to-mines/

    Advanced Micro Devices’ (AMD) share price jumped after it beat revenue estimates thanks to cryptocurrency miners snapping up the firm’s graphics cards. Shares rose 11% after the chip company announced earnings on July 25, but the firm’s stock is up 152% over the last 12 months, making it the fourth best performer on the S&P 500, CNBC reported.

    Lisa Su, AMD’s chief executive, said the firm saw “elevated demand” from cryptocurrency miners during the quarter. This need for graphics cards helped AMD give a “solid beat” to analyst estimates,

    Reply
  6. Tomi Engdahl says:

    Mozilla launches experimental voice search, file-sharing and note-taking tools for Firefox
    https://techcrunch.com/2017/08/01/mozilla-launches-experimental-voice-search-file-sharing-and-note-taking-tools-for-firefox/

    Firefox is due for a comeback. A lot of the work the team spent on refactoring core parts of the browser is starting to pay off, and, while its market share continues to decline (especially on the desktop), Firefox today feels faster and leaner than it has in a long time. And today is a good day to give Firefox a new try because the team just launched three new Test Pilot experiments that bring voice search, built-in note taking and a tool for sending large files to the browser.

    These are obviously experimental tools and there’s no guarantee they will ever make it into a release version of Firefox. Indeed, the idea behind Test Pilot is to allow the Firefox team to test new concepts.

    Reply
  7. Tomi Engdahl says:

    Google Classroom passes 1 billion submitted assignments and gets 10 new features
    https://venturebeat.com/2017/08/01/google-classroom-passes-1-billion-submitted-assignments-and-gets-10-new-features/

    Google today announced 10 updates to Google Classroom and Google Forms aimed to help teachers save time and stay organized. At the same time, the company revealed that students have submitted more than 1 billion assignments since Classroom launched almost three years ago.

    Google Classroom first launched in August 2014 as a learning platform for schools looking to handle assignments, coursework, and grades in a paperless way. The company has updated the service multiple times since then, including by adding a Chrome extension, a Coursework API, and email updates for parents.

    Reply
  8. Tomi Engdahl says:

    7 Things EEs Should Know About Artificial Intelligence
    http://www.electronicdesign.com/industrial/7-things-ees-should-know-about-artificial-intelligence?code=UM_NN7TT3&utm_rid=CPG05000002750211&utm_campaign=12213&utm_medium=email&elq2=49c083757ae4409d9d7ebabf70a084d5

    What exactly are the basic concepts that comprise AI? This “Artificial Intelligence 101” treatise gives a quick tour of the sometimes controversial technology.

    Just what the devil is artificial intelligence (AI) anyway? You’ve probably been hearing a lot about it in the context of self-driving vehicles and voice-recognition devices like Apple’s Siri, Amazon’s Alexa, Google’s Assistant, and Microsoft’s Cortana. But there’s more to it than that.

    AI has been around since the 1950s, when it was first discovered. It has had its ups and downs over the years, and today is considered as a key technology going forward. Thanks to new software and ever faster processors, AI is finding more applications than ever. AI is an unusual software technology that all EEs should be familiar with. Here is a brief introductory tutorial for the uninitiated.

    Reply
  9. Tomi Engdahl says:

    IBM and Sony Cram Up To 330 Terabytes Into Tiny Tape Cartridge
    https://hardware.slashdot.org/story/17/08/02/1552214/ibm-and-sony-cram-up-to-330-terabytes-into-tiny-tape-cartridge?utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+Slashdot%2Fslashdot%2Fto+%28%28Title%29Slashdot+%28rdf%29%29

    IBM and Sony have developed a new magnetic tape system capable of storing 201 gigabits of data per square inch, for a max theoretical capacity of 330 terabytes in a single palm-sized cartridge

    To achieve such a dramatic increase in areal density, Sony and IBM tackled different parts of the problem: Sony developed a new type of tape that has a higher density of magnetic recording sites, and IBM Research worked on new heads and signal processing tech to actually read and extract data from those nanometre-long patches of magnetism.

    IBM and Sony cram up to 330 terabytes into tiny tape cartridge
    Sputtered magnetic layer, lubricant, and new heads enable massive 200Gb/inch density.
    https://arstechnica.co.uk/information-technology/2017/08/ibm-and-sony-cram-up-to-330tb-into-tiny-tape-cartridge/

    IBM and Sony have developed a new magnetic tape system capable of storing 201 gigabits of data per square inch, for a max theoretical capacity of 330 terabytes in a single palm-sized cartridge.

    For comparison, the world’s largest hard drives—which are about twice the physical size of a Sony tape cartridge—are the 60TB Seagate SSD or 12TB HGST helium-filled HDD. The largest commercially available tapes only store 15TB. So, 330TB is quite a lot.

    Sony’s new tape is underpinned by two novel technologies: an improved built-in lubricant layer, which keeps it running smoothly through the machine, and a new type of magnetic layer. Usually, a tape’s magnetic layer is applied in liquid form, kind of like paint—which is one of the reasons that magnetic tape is so cheap and easy to produce in huge quantities. In this case, Sony has instead used sputter deposition, a mature technique that has been used by the semiconductor and hard drive industries for decades to lay down thin films.

    The new lubrication layer, which we don’t know much about, makes sure that the tape streams out of the cartridge and through the machine extremely smoothly. Some of the biggest difficulties of tape recording and playback are managing friction and air resistance, which cause wear and tear and chaotic movements.
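
    As a rough cross-check of the headline numbers, the sketch below (TypeScript; the 201 Gbit/in² and 330 TB figures are from the article, the half-inch tape width is my assumption and formatting overhead is ignored) works out how much tape surface such a cartridge implies:

        // How much tape does 330 TB take at 201 Gbit per square inch?
        const arealDensityGbitPerSqIn = 201;   // from the article
        const capacityTB = 330;                // from the article
        const tapeWidthIn = 0.5;               // assumed half-inch tape

        const capacityGbit = capacityTB * 1e12 * 8 / 1e9;         // ~2.64e6 Gbit
        const areaSqIn = capacityGbit / arealDensityGbitPerSqIn;  // ~13,100 in^2
        const lengthM = (areaSqIn / tapeWidthIn) * 0.0254;        // inches to metres

        console.log(`~${Math.round(areaSqIn)} in^2, roughly ${Math.round(lengthM)} m of tape`);
        // ~13,134 in^2, i.e. several hundred metres of half-inch tape before servo
        // tracks and formatting overhead -- plausible for a palm-sized cartridge.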

    Reply
  10. Tomi Engdahl says:

    Julie Verhage / Bloomberg:
    Study of 116 unicorns founded after 1994 finds that about half used complex stock mechanics to raise valuation, devaluing employee and early shareholder stakes — Study finds private valuations aren’t grounded in reality — Employees, early investors often lose with stock provisions

    Here’s How Unicorns Trick You Into Thinking They’re Real
    https://www.bloomberg.com/news/articles/2017-08-02/the-magic-behind-many-unicorn-startups-complex-stock-contracts

    Unicorns aren’t real, and neither are the valuations ascribed to many of the startups that say they’re worth $1 billion or more.

    About half of private companies with valuations exceeding $1 billion, known as unicorns, wouldn’t have earned the mythical title without the use of complex stock mechanics, according to a study by business professors at the University of British Columbia and Stanford University. The tools used to negotiate a higher share price with investors often come at the expense of employees and early shareholders, sometimes drastically reducing the actual value of their stock.

    The chasm between public and private valuations is a topic of increasing prominence following several disappointing listings.

    The use of special investor protections has soared in recent years as startups chase dreams of becoming a unicorn. A lofty valuation can build credibility and help recruit talent in a tight labor market. But it has also complicated the already-opaque process of valuing a private business.

    https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2955455

    Reply
  11. Tomi Engdahl says:

    The traditional hard drive will disappear

    The death of the traditional mechanical HDD cannot be announced just yet, but in consumer devices the technology is slowly fading away. According to the research firm Trendforce, the HDD market keeps shrinking, even though business-class servers still rely on the disks as before.

    Mechanical recording is giving way to faster, lower-power and silent NAND flash. The HDD would disappear even faster if flash disks were not still considerably more expensive.

    According to Trendforce, over 170 million HDDs were delivered in the third quarter of 2011. In the first quarter of this year, fewer than a hundred million HDDs were delivered.

    At the same time, the capacity of the HDDs on the market has doubled in the last five years. This is explained by the fact that corporate servers have different requirements than laptops.

    According to Trendforce, the largest HDD manufacturer is Western Digital, followed by Seagate and Toshiba.

    Source: http://www.etn.fi/index.php/13-news/6619-perinteinen-kiintolevy-katoaa

    Reply
  12. Tomi Engdahl says:

    Intel’s Upcoming Coffee Lake CPUs Won’t Work With Today’s Motherboards
    https://hardware.slashdot.org/story/17/08/03/2221200/intels-upcoming-coffee-lake-cpus-wont-work-with-todays-motherboards

    Intel’s upcoming Coffee Lake CPUs won’t work with existing 200-series motherboards that support Kaby Lake, a manufacturer confirmed on Wednesday. In a Twitter post by Asrock last Saturday, the company confirmed the news when asked if “the Z270 Supercarrier [will] get support for the upcoming @intel Coffee Lake CPUs.” Their response: “No, Coffee Lake CPU is not compatible with 200-series motherboards.”

    Why this matters: The vast majority of new CPU sales are in new systems, and they likely won’t be impacted by the incompatibility. However, there’s also a very large and very vocal crowd of builders and upgraders who still swap out older, slower CPUs for newer, faster CPUs to maximize their investment. An upgrade-in-place doesn’t sell an Intel chipset, but it at least keeps them on the Intel platform.

    Facepalm: Intel’s upcoming Coffee Lake CPUs won’t work with today’s motherboards
    http://www.pcworld.com/article/3213387/computers/facepalm-intels-upcoming-coffee-lake-cpus-wont-work-with-todays-motherboards.html

    Builders and upgraders are crying foul as they lose their ability to upgrade-in-place on an older motherboard.

    Reply
  13. Tomi Engdahl says:

    Ubuntu Will Revert Window Controls To the Right-Hand Side in Next Release
    https://news.slashdot.org/story/17/08/03/1757237/ubuntu-will-revert-window-controls-to-the-right-hand-side-in-next-release

    In the survey, 46.2% of people said they prefer their window controls on the left-hand side and 53.8% said they prefer them on the right. The decision comes after seven years of window controls being on the left.

    Ubuntu will revert window controls to the right-hand side in next release
    https://www.neowin.net/news/ubuntu-will-revert-window-controls-to-the-right-hand-side-in-next-release

    Reply
  14. Tomi Engdahl says:

    Inside Mozilla’s Fight To Make Firefox Relevant Again
    https://news.slashdot.org/story/17/08/03/1713206/inside-mozillas-fight-to-make-firefox-relevant-again

    News outlet CNET has a big profile on Firefox today, for which it has spoken with several Mozilla executives. Mozilla hopes to fight back Chrome, which owns more than half of the desktop market share, with Firefox 57, a massive overhaul due November 14.

    https://www.cnet.com/special-reports/mozilla-firefox-fights-back-against-google-chrome/

    Reply
  15. Tomi Engdahl says:

    Rethinking SSDs In Data Centers
    A frenzy of activity aims to make solid-state drives faster and more efficient.
    https://semiengineering.com/rethinking-ssds-in-data-centers/

    Semiconductors that control how data gets on and off solid-state drives (SSDs) inside of data centers are having a moment in the sun.

    This surge in interest involves much more than just the SSD device. It leverages an entire ecosystem, starting with system architects and design engineers, who must figure out the best paths for data flow on- and off-chip and through a system. It also includes the I/O IP, the physical distance of the storage device from the server racks. And it includes the interconnect fabric, which in the case of the cloud and large enterprises increasingly relies on silicon photonics.

    There are four main issues that come into play around SSDs—latency, power, cost, and bandwidth to, from and between different SSDs. There is some overlap between these various elements, because power and performance may be traded off, depending upon the application, or simply accepted as the cost of doing business for some jobs. In every case, though, this is a balancing act between whatever is important to the users of that data. So a cloud communicating safety-critical information will have much higher performance requirements than a static records search or an online shopping transaction.

    Reply
  16. Tomi Engdahl says:

    Under the Hood of AMD’s Threadripper
    http://hackaday.com/2017/08/03/under-the-hood-of-amds-threadripper/

    Although AMD has been losing market share to Intel over the past decade, they’ve recently started to pick up steam again in the great battle for desktop processor superiority. A large part of this surge comes in the high-end, multi-core processor arena, where it seems like AMD’s threadripper is clearly superior to Intel’s competition. Thanks to overclocking expert [der8auer] we can finally see what’s going on inside of this huge chunk of silicon.

    AMD Threadripper Delidded, With a Multi-Core Surprise Under the Hood
    https://www.extremetech.com/computing/253248-amd-threadripper-delidded-multi-core-surprise-hood

    Reply
  17. Tomi Engdahl says:

    A Chrome Extension for Being a Jerk
    http://hackaday.com/2017/08/03/a-chrome-extension-for-being-a-jerk/

    What do you do to someone you want to make suffer, slowly? Specifically, at around 70% speed. To [Stephen], the answer is clear, you hit them where it really hurts: YouTube.

    Creatively named “Chrome Engine,” [Stephen]’s diabolical Chrome extension has one purpose: be annoying. Every day, it lowers the playback rate on YouTube by 1%. It’s a linear progression: 100% the first day, 99% the second day, 98% the third day, etc. It only stops 30 days later, once it hits its target rate of 70% of the original speed. This progression is designed to be slow enough not to be noticed. Its icon is nothing more than the standard Chrome icon, as [Stephen] firmly believes in the tactic of hiding in plain sight.
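
    The mechanism is simple enough to sketch. The snippet below is a rough approximation in TypeScript of what such a content script could do; it is not [Stephen]’s actual code, and the storage key name is made up:

        // Sketch of a "slow YouTube down by 1% per day" content script.
        // Not the actual Chrome Engine source; the storage key is hypothetical.
        const KEY = "install-date";

        const stored = localStorage.getItem(KEY);
        const installDate = stored ? Number(stored) : Date.now();
        if (!stored) localStorage.setItem(KEY, String(installDate));

        const days = Math.floor((Date.now() - installDate) / 86_400_000);
        const rate = Math.max(0.7, 1 - 0.01 * days); // 100% -> 70% over 30 days

        // Re-apply periodically so newly loaded videos also get the reduced rate.
        setInterval(() => {
          document.querySelectorAll("video").forEach(v => (v.playbackRate = rate));
        }, 1000);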

    Reply
  18. Tomi Engdahl says:

    Wayland Confirmed as Default for Ubuntu 17.10
    http://www.omgubuntu.co.uk/2017/08/ubuntu-confirm-wayland-default-17-10

    Wayland WILL ship as the default display server in Ubuntu 17.10 ‘Artful Aardvark’.

    Although this was always the intended plan, Ubuntu devs did express doubts about making the switch last month, with Ubuntu desktop team lead Will Cooke saying he felt that “Wayland isn’t ready yet”.

    But several weeks, lots of user testing, and one GUADEC later a decision is now in place: Ubuntu 17.10 will ship with Wayland as the default session.

    there is a stack of ideological, political and technical reasons why Ubuntu would want to switch to Wayland in this release, as soon as possible

    The Xorg session is still included

    If you’re a NVIDIA user, a big gamer, or happen to be using hardware that this next-gen display server tech doesn’t yet play nice with, do not panic. Although Wayland will be default it isn’t compulsory.

    Ubuntu will still have an X session available out of the box, ready to roll, a click or two away.

    Reply
  19. Tomi Engdahl says:

    IBM Claims Tape Density Record
    Teams with Sony to target cloud storage
    http://www.eetimes.com/document.asp?doc_id=1332110&

    IBM researchers have set a tape areal-density record of 201 gigabytes per square inch — 20 times the areal density of current commercial tape drives — enabling a single palm-sized cartridge to hold 330 terabytes of uncompressed data. IBM Research and Sony Storage Media Solutions, which developed the nano-grained sputtered tape used for the demonstration prototype, described the achievement in Tsukuba, Japan, today (Aug. 2) at The Magnetic Recording Conference (TMRC 2017).

    Reply
  20. Tomi Engdahl says:

    Khari Johnson / VentureBeat:
    Facebook moves entire translation backend to neural networks, built on Caffe2, to handle 4.5B+ translations per day, says it’s seen ~11% increase in accuracy — Facebook announced today that it has started using neural network systems to carry out more than 4.5 billion translations …

    Facebook now uses Caffe2 deep learning for the site’s 4.5 billion daily translations
    https://venturebeat.com/2017/08/03/facebook-now-uses-caffe2-deep-learning-for-the-sites-4-5-billion-daily-translations/

    Reply
  21. Tomi Engdahl says:

    Neural Nets in the Browser: Why Not?
    http://hackaday.com/2017/08/04/neural-nets-in-the-browser-why-not/

    We keep seeing more and more Tensor Flow neural network projects. We also keep seeing more and more things running in the browser. You don’t have to be Mr. Spock to see this one coming. TensorFire runs neural networks in the browser and claims that WebGL allows it to run as quickly as it would on the user’s desktop computer.

    https://tenso.rs/demos/fast-neural-style/
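
    For a sense of what “running a neural network in the browser” actually computes, here is a deliberately tiny forward pass of one dense layer in plain TypeScript (toy numbers, my own illustration; TensorFire’s contribution is doing this same arithmetic on the GPU through WebGL shaders rather than on the CPU):

        // One fully-connected layer with ReLU: y = relu(W x + b).
        // CPU version shown only to make the arithmetic concrete;
        // TensorFire evaluates layers like this inside WebGL.
        function dense(W: number[][], b: number[], x: number[]): number[] {
          return W.map((row, i) => {
            const sum = row.reduce((acc, w, j) => acc + w * x[j], b[i]);
            return Math.max(0, sum); // ReLU activation
          });
        }

        // Toy weights and input, purely illustrative.
        console.log(dense([[0.2, -0.5], [0.7, 0.1]], [0.1, -0.2], [1.0, 2.0]));
        // -> [ 0, ~0.7 ]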

    Reply
  22. Tomi Engdahl says:

    Julie Bort / Business Insider:
    Inside the world of Silicon Valley’s ‘coasters’ — the millionaire engineers who get paid gobs of money and barely work

    Inside the world of Silicon Valley’s ‘coasters’ — the millionaire engineers who get paid gobs of money and barely work
    http://www.businessinsider.com/rest-and-vest-millionaire-engineers-who-barely-work-silicon-valley-2017-7?op=1&r=US&IR=T&IR=T

    The least-secret secret in the Valley is called “resters and vesters,” or “coasters” – which refers to engineers who get paid big bucks without doing too much work, waiting for their stock to vest.

    She was having a bad reaction to her job.

    She was making $1 million a year, mostly in stock, and running a team of about three dozen people, she told Business Insider. And she had worked herself into a state of exhaustion in the three years since Facebook had acquired her previous company. The acquisition had been highly political, the integration wasn’t going well and she had been killing herself to make it more successful and protect her people from losing their jobs over it.

    As tired as she was, she couldn’t just quit this job. She owed a big chunk of money in taxes thanks to that stock and needed her salary to pay those taxes.

    But after getting violently ill at the thought of going to work, she decided not to go in. Not that day. Not ever again. And she knew she wouldn’t get fired.

    Because not going to work was actually her manager’s idea.

    The previous day she had told him she would be leaving the company at the end of the year, six months away. She wanted to spend the rest of the year wrapping up her projects but not taking on any more, collecting on the stock that would vest by year end and making the money she needed to pay her taxes.

    She panicked thinking he was firing her but he explained she wasn’t being terminated at all. “Just don’t come to work. You’re burned out and need a break. Just don’t talk about it and everyone will assume you’re on someone else’s team,” he told her.

    The manager’s proposal didn’t go over well.

    “Resting and vesting” is when an employee, typically an engineer, has an easy work load (if any job responsibilities at all) and hangs out on the company’s payroll collecting full pay and stock. Stock is often the bigger chunk of total compensation for a senior engineer than salary.

    Once she was in rest-and-vest mode, this engineer spent her time attending tech conferences, working on pet coding projects and networking with friends, quietly developing an idea for her next gig, a startup.

    She realized that her manager let her “rest and vest” to keep her quiet about the problems with that acquisition, so she had time to find her next thing.

    Business Insider talked to about a half a dozen people with direct knowledge of the rest-and-vest culture. Some were “fat cats” themselves

    Engineers can wind up in “rest and vest” jobs in a variety of ways.

    Years later, he landed at Microsoft and says he saw how Microsoft used high-paying jobs strategically, both within its engineering ranks and with its R&D unit, Microsoft Research. The company, he says, would nab hard-to-find experts in up-and-coming fields like artificial intelligence, robotics, natural speech language, quantum computing and so on, often allowing them to collect their Microsoft pay while maintaining a job as a professor or researcher at a university.

    “You keep engineering talent but also you prevent a competitor from having it and that’s very valuable,” he said. “It’s a defensive measure.”

    Another person confirmed the tactic, telling us, “That’s Microsoft Research’s whole model.”

    At other companies it’s less about defense and more about becoming indispensable.

    For instance, Facebook has a fairly hush bonus program called “discretionary equity” or “DE,”

    “DE” is when the company hands an engineer a massive, extra chunk of restricted stock units, worth tens to hundreds of thousands of dollars. It’s a thank you for a job well done. It also helps keep the person from jumping ship because DE vests over time. These are bonus grants that are signed by top execs

    These DE bonuses are not specifically designed as a mechanism for resting and vesting, but they can play a role in enabling it.

    “These are really smart people and they don’t leave. They’re in their mid-30s, pulling in seven figures a year, and they don’t have to work as hard. We say they’re just coasting,” he said.

    The 10x engineer

    Other “rest-and-vest” types are part of a tribe in the Valley known as “the 10x engineer,” a term used to describe someone said to be 10-times more effective than a so-called ordinary engineer.

    Legend has it that a 10x engineer can do in one hour what it would take others 10 hours to do. Some of these folks are just plain brilliant. Others aren’t necessarily smarter but they know every detail of a critical system.

    “One guy at Facebook didn’t seem to work a lot, but when the site would go down he could find things that couldn’t be found,” the engineer described.

    Other members of the “rest and vest” set are the “coasters,” the long-timers who have reached a company’s top engineering ranks and don’t need to work hard to stay there.

    They may not be 10x engineers, but they are institutional employees who know how to do just the right amount of work to get a good annual review and collect their next batch of stock grants.

    According to all the folks we talked to, Google is known as a place where this type of rester-and-vester flourishes.

    Reply
  23. Tomi Engdahl says:

    Stephen Shankland / CNET:
    Mozilla execs on company’s recovery efforts starting with Firefox 57, past missteps that led to Firefox losing market share, a possible membership plan, more

    Firefox fights back
    https://www.cnet.com/special-reports/mozilla-firefox-fights-back-against-google-chrome/

    Inside Mozilla, CEO Chris Beard and his team are preparing to outmaneuver Google’s Chrome browser. The battle begins in November, with their release of Firefox 57.

    Reply
  24. Tomi Engdahl says:

    Tom Simonite / Wired:
    A look at Google’s Ray Kurzweil-led internal group, Kona, which is responsible for neural net-powered Smart Reply feature in Gmail

    What Is Ray Kurzweil Up to at Google? Writing Your Emails
    https://www.wired.com/story/what-is-ray-kurzweil-up-to-at-google-writing-your-emails

    Ray Kurzweil has invented a few things in his time.

    His group powers Smart Reply, the feature on the Gmail mobile app that offers three suggested email replies for you to select with a tap. In May it rolled out to all of the service’s English-speaking users, and last week was presented to Spanish speakers too. The responses may be short—“Let’s do Monday” “Yay! Awesome!” “La semana que viene”—but they sure can be useful. (A tip: You can edit them before sending.) “It’s a good example of artificial intelligence working hand in glove with human intelligence,” Kurzweil says.

    And Kurzweil claims he’s just getting started. His team is experimenting with empowering Smart Reply to elaborate on its initial terse suggestions.

    Reply
  25. Tomi Engdahl says:

    Made in China 2025: AI in U.S. Factories? Not There Yet
    http://www.eetimes.com/document.asp?doc_id=1332106

    AI is not just for driverless cars, digital assistants, or movie recommendations anymore; in multiple industries, it’s a wave about to break. According to a recent McKinsey Global Institute study surveying 3,000 “AI-aware” companies around the globe, only 20 percent are using AI-related technologies in a core part of the business, but the majority expect to ramp up AI spending in the next three years.

    Other studies yield similar results. In an Infosys-funded survey of 1,600 business and IT leaders in seven countries, although only 25 percent said AI technologies were fully deployed and working within their organizations, 76 percent overall called AI critical to their companies’ success. Organizations with partially or fully deployed AI technologies expect them to contribute a 39 percent increase in revenue and a 37 percent reduction in costs by 2020. Businesses on average have been using AI for about two years and expect mature adoption in three more years.

    Echoing insights from industry analysts and other observers, these reports conclude that AI in manufacturing is nearing a tipping point in the emerging factories of the future. There is consensus that AI applications ranging from smart and collaborative robotics to virtual assistants will upend how factories operate, requiring a complete rethinking of plant designs, manufacturing footprints, and supply chain models. So it’s not surprising that most AI tech investment has come from internal R&D dollars at big tech-savvy companies like Amazon, Baidu, and Google, according to the McKinsey study.

    Reply
  26. Tomi Engdahl says:

    CGI to Cut 1,600 Jobs, Reshaping Workforce in Online Push
    https://www.bloomberg.com/news/articles/2017-08-02/cgi-boosts-spending-on-digital-shift-as-profit-misses-estimates

    Technology-service provider still boosting overall headcount
    Clients want to spend more on applications, automation: CGI

    Canadian technology-service provider CGI Group Inc. said it’s cutting 1,600 jobs while hiring in other areas as it responds to clients’ accelerating demands to provide more online services.

    The Montreal-based company will post a pretax expense of C$165 million ($131 million) over the next year to account for the job cuts and “address underutilized resources,” Chief Financial Officer François Boulanger told analysts on a conference call Wednesday. The stock had its biggest intraday drop since December.

    Reply
  27. Tomi Engdahl says:

    Nick Statt / The Verge:
    Report details how Apple, HP, other manufacturers influence Green Electronics Council regulators to undermine green tech standards of reusability and repair — A new report says the tech industry is using its outsized influence to combat environmental product standards

    Why Apple and other tech companies are fighting to keep devices hard to repair
    A new report says the tech industry is using its outsized influence to combat environmental product standards
    https://www.theverge.com/2017/8/3/16087628/apple-e-waste-environmental-standards-ieee-right-to-repair

    Apple is the largest company on Earth by market cap, and its success is derived from selling brand-new high-end smartphones consistently month after month. At the peak of its iPhone business, back in 2015, Apple sold a staggering 231.5 million smartphones. Though sales have begun to slow, that one device alone still accounts for more than 50 percent of Apple’s entire business. The company’s second quarter earnings results for 2017, reported on Tuesday, showed a quarterly profit of $8.7 billion, a majority of which came from the sale of 41 million iPhones.

    Reply
  28. Tomi Engdahl says:

    Christopher M. Schroeder / MIT Technology Review:
    A look at how Middle Eastern startups are overcoming cultural and other barriers to tap into a growing local taste for tech, from bitcoin to digital publishing

    A Different Story from the Middle East: Entrepreneurs Building an Arab Tech Economy
    https://www.technologyreview.com/s/608468/a-different-story-from-the-middle-east-entrepreneurs-building-an-arab-tech-economy/

    Middle Eastern startups are overcoming cultural and other barriers to tap into a growing local taste for technology, from Bitcoin wallets to digital publishing.

    At the end of March, it was announced that the largest e-commerce company in the Middle East and North Africa, Souq.com, would be acquired by Amazon for nearly $600 million. This was unusual: when Amazon enters a new geographic market, it typically does so by launching its existing platform and investing a lot of money to grow it. Instead, Amazon—apparently impressed by Souq.com’s management team, its technology, and its ability to navigate a complicated region—decided on a different strategy.

    A week after the announcement, at the Step Conference in Dubai, one of the most popular startup gatherings in the region, it felt as if lightning had struck. Over 2,000 aspiring entrepreneurs filled the arena, standing room only, for a panel with Souq.com founder ­Ronaldo ­Mouchawar.

    A few months earlier, Careem—the region’s fast-growing ride-sharing company—had been valued by venture capitalists at over $1 billion.

    One young aspiring entrepreneur taking copious notes on her laptop told me, “I can do this. I will do this.”

    Bankruptcies and visas

    The UAE’s government has recently made legal changes to encourage entrepreneurship. In 2016, the government enacted its first bankruptcy law. The freedom to fail, learn from failure, and quickly start the next enterprise has been crucial to the Silicon Valley blueprint, but in some parts of the Middle East, cultural traditions around debt and obligations to others had made failure a criminal act: executives could literally serve prison time. And during a time when some in America are fighting any expansion of H1-B visa programs, which allow foreigners to work in the country in specialized occupations, the UAE just announced a new visa offering residency to the best technologists from anywhere on earth.

    The Dubai government is embracing technology too. By the end of 2020, all government documentation and interactions will be available on blockchain, a decentralized record-keeping technology that verifies and records transactions securely. By 2019, as part of a strategy to improve efficiency and construction safety while reducing costs, 2 percent of all new construction will have to use 3-D-printed components in order to receive building permits, a number set to increase each year until it reaches 25 percent by 2030. The UAE even has its own space program; it plans to expand satellite efforts and launch the first Mars probe in the Arab world.

    “There is nothing like Dubai in the Middle East—nothing really like it anywhere.”

    Reply
  29. Tomi Engdahl says:

    Ian Cutress / AnandTech:
    Intel finalizes specs of Core i9-7980XE: 18-cores, 2.6GHz clock speed with 4.4GHz Turbo Boost, 24.75MB L3 Cache, 165W TDP; priced at $1999, ships September 25 — When Intel launched its new high-end desktop platform a few weeks ago, we were provided with Core-X CPUs from quad cores …

    Intel Finalizes Skylake-X Processor Specifications: 18-Cores, 4.4 GHz Turbo, 165W on September 25th
    by Ian Cutress on August 7, 2017 12:34 PM EST
    http://www.anandtech.com/show/11698/intel-finalizes-skylakex-processor-specifications-18cores-44-ghz-165w-on-september-25th

    Reply
  30. Tomi Engdahl says:

    Cisco’s server CTO says NVMe will shift from speed to capacity tier
    Raghunath Nambiar says data centres will be asked to do more with more
    https://www.theregister.co.uk/2017/08/08/cisco_ucs_cto_raghunath_nambiar_interview/

    NVMe storage is becoming denser and faster than other forms of storage and will therefore become a capacity tier, according to Cisco’s chief technology officer for UCS, Raghunath Nambiar.

    “Right now people are looking at NVMe from a performance point of view,” Nambiar told The Register in Sydney last week, “but the real game changer is going to be capacity.”

    Nambiar said 2.5 inch SSDs will soon hit seven-terabyte capacities, but “NVMe will go to 32 terabytes 18 months from now.” That density will mean that even small servers like the UCS B200, Cisco’s half-width workhorse, will be able to work with 64 terabytes of data in each server and plenty more across a blade chassis or a fabric.

    Nambiar said businesses will put that data to work with more intensive just-in-time analytics.

    Reply
  31. Tomi Engdahl says:

    Preview of AMD Ryzen Threadripper Shows Chip Handily Out-Pacing Intel Core i9
    https://hardware.slashdot.org/story/17/08/07/2048229/preview-of-amd-ryzen-threadripper-shows-chip-handily-out-pacing-intel-core-i9

    AMD is still days away from the formal launch of their Ryzen Threadripper family of 12 and 16-core processors but OEM system builder Dell and its Alienware gaming PC division had an inside track on first silicon in the channel. The Alienware Area-51 Threadripper Edition sports a 16-core Ryzen Threadripper 1950X processor that boosts to 4GHz with a base clock of 3.4GHz and an all-core boost at 3.6GHz. From a price standpoint, the 16-core Threadripper chip goes head-to-head with Intel’s 10-core Core i9-7900X at a $999 MSRP. In early benchmark runs of the Alienware system, AMD’s Ryzen Threadripper is showing as much as a 37 percent performance advantage over the Intel Core i9 Skylake-X chip in highly threaded general compute workload benchmarks like Cinebench and Blender.

    Exclusive: Ryzen Threadripper 1950X Performance First Look With Alienware Area-51 Threadripper Edition
    https://hothardware.com/reviews/amd-threadripper-1950x-performance-preview

    Reply
  32. Tomi Engdahl says:

    HP’s Latest Enterprise VR Workstation Is a Backpack
    https://www.designnews.com/electronics-test/hps-latest-enterprise-vr-workstation-backpack/138592786257249?cid=nl.x.dn14.edt.aud.dn.20170808.tst004t

    HP has unveiled the Z VR Backpack, a wearable PC for enterprise VR applications. But will the new design be a step forward or a detour on the road to wireless VR?

    As VR (virtual reality) looks to transform itself into an increasingly mobile (aka convenient) experience, HP has debuted a novel solution to the wires and tethers associated with VR today – a backpack.

    At SIGGRAPH 2017, HP announced the HP Z VR Backpack, a 10-lb, wearable PC with enough horsepower for both experiencing and creating VR content. While it is easy to imagine the entertainment potential of the Z VR, HP has made it clear that it wants its new wearable PC to be a catalyst for bringing more robust VR experiences to business and enterprise first and foremost. Safer simulation and training, virtual walkthroughs for architectural design, and better collaboration in virtual environments for product designers are just a few of the use cases cited by HP.

    At its core the Z VR Backpack is a Windows 10 PC with an Intel Core i7 processor, 32 GB of SDRAM, a Nvidia Quadro P5200 GPU, and up to 1 TB of internal storage. It measures in at 13.11 x 9.29 x 2.39 inches and weighs 10.25 lbs according to specs released by HP. The backpack is powered by a 55Whr lithium-ion battery and features two, external portable 74 Whr hot-swappable batteries. The Z VR can also be docked and serve as a desktop PC.

    Reply
  33. Tomi Engdahl says:

    Ian Cutress / AnandTech:
    Intel finalizes specs of Core i9-7980XE: 18-cores, 2.6GHz clock speed with 4.4GHz Turbo Boost, 24.75MB L3 Cache, 165W TDP; priced at $1999, ships September 25

    Intel Finalizes Skylake-X Processor Specifications: 18-Cores, 4.4 GHz Turbo, 165W on September 25th
    by Ian Cutress on August 7, 2017 12:34 PM EST
    http://www.anandtech.com/show/11698/intel-finalizes-skylakex-processor-specifications-18cores-44-ghz-165w-on-september-25th

    When Intel launched its new high-end desktop platform a few weeks ago, we were provided with Core-X CPUs from quad cores on the latest Kaby Lake microarchitecture, and 6/8/10 core parts on the Skylake-SP microarchitecture derived from the enterprise line and taking a different route to how the cache was structured over Skylake-S. At the time we were told that these latter parts would be joined by bigger SKUs all the way up to 18 cores, and up to $2000. Aside from core-counts and price, Intel was tight lipped on the CPU specifications until today.

    Reply
  34. Tomi Engdahl says:

    Program that repairs programs: how to achieve 78.3 percent precision in automated program repair
    https://www.microsoft.com/en-us/research/blog/program-repairs-programs-achieve-78-3-percent-precision-automated-program-repair/

    In February 2017, Microsoft and Cambridge University announced a DeepCoder algorithm that produces programs from problem inputs/outputs. DeepCoder, which operates on a novel yet greatly simplified programming language, cannot handle complex problems—general programming languages are still too hard for DeepCoder to master. So, currently, programmers don’t have to worry about being replaced by machines.

    But programmers have plenty of other worries, including programming bugs. Could machines assist programmers by taking over the task of bug fixes?

    DeepCoder: Learning to Write Programs
    https://www.microsoft.com/en-us/research/publication/deepcoder-learning-write-programs/#

    Reply
  35. Tomi Engdahl says:

    Micron Pushes Capacity Threshold in NVMe SSDs
    http://www.eetimes.com/document.asp?doc_id=1332136&

    Micron Technology unveiled its second generation of NVM Express (NVMe) SSDs at the Flash Memory Summit, using its 3D NAND to push capacities past 10TB.

    In an advance telephone briefing with EE Times, Dan Florence, SSD product manager for Micron’s Storage Business Unit, said the 9200 Series of NVMe SSDs were built from the ground up to break the shackles of legacy hard drive interfaces. The new storage portfolio is designed to address surging data demands while at the same time maximizing data center efficiency so customers can improve their overall total cost of ownership, he said, and is the storage foundation for the Micron SolidScale Platform, an NVMe over Fabric architecture ahead of standards development, announced earlier this year.

    Florence said the Micron 9200 SSD is up to 10 times faster than the fastest SATA SSDs, with transfer speeds up to 4.6 GB/s and up to one million read IOPS, making them ideal for such performance-oriented, high-capacity use cases as application/database acceleration, high-frequency trading, and high-performance computing. “NVMe just as an interface offers a lot advantages over the legacy interfaces that were really built for spinning media,” he said. “It cuts out a huge chunk of latency and obviously because it sits on the PCIe bus it offers a higher bandwidth which allows you to get much higher IOPS.”

    NVMe also offers better ease of use than previous iterations of PCIe, Florence added, which had a lot of custom drivers. The industry standard that allows NVMe to be plugged into pretty much any system with any operating system is helping to fuel its adoption.

    Reply
  36. Tomi Engdahl says:

    Samsung Promises 2018 Tbit NAND
    Z-NAND samples at 15-microsecond latency
    http://www.eetimes.com/document.asp?doc_id=1332135

    Samsung sketched out plans for a terabit 3D-NAND chip that it will ship next year as well as dense solid-state drives using its current chips. It also said that it is sampling the Z-NAND products that it announced last year at latency levels that match or beat Intel’s 3DXP memories.

    Samsung’s Tbit NAND will support data rates up to 1.2 Gbits/second and pack four terabytes in a package that stacks 32 die. The chip will embed peripheral circuits in a new metal bonding layer at the bottom of a cell stack as one way to hit the new density level, said Kye Hyun Kyung, Samsung’s executive vice president of flash products and technology, in a talk at the company’s Silicon Valley headquarters.
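
    The four-terabyte package figure follows directly from the stated density: 32 stacked 1-Tbit die give 32 Tbit, i.e. 4 TB. A one-line sanity check:

    # 32 stacked 1-Tbit die -> terabytes per package (8 bits per byte)
    die_count, tbit_per_die = 32, 1
    print(die_count * tbit_per_die / 8, "TB")   # -> 4.0 TB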

    The news was part of Samsung’s keynote at the Flash Memory Summit, where some rivals described chips using 96 layers and four bits per cell. Samsung’s current 512-Gbit chips use nine vertical channels and 64 layers built in a descending stair fashion for stability, up from four channels and 48 layers in the prior generation.

    Kyung said that versions of 3D-NAND with up to four bits per cell will fill most of the industry’s non-volatile memory needs.

    Reply
  37. Tomi Engdahl says:

    AnandTech:
    Intel’s new server-class SSD with “ruler” form-factor allows for up to 1 PB of storage in a 1U server, and features hot-swapping, PCIe Gen 5 readiness, and power and cooling benefits

    Intel Introduces “Ruler” Server SSD Form-Factor: SFF-TA-1002 Connector, PCIe Gen 5 Ready
    by Billy Tallis & Anton Shilov on August 9, 2017 3:00 PM EST
    http://www.anandtech.com/show/11702/intel-introduces-new-ruler-ssd-for-servers

    Intel on Tuesday introduced its new form-factor for server-class SSDs. The new “ruler” design is based on the in-development Enterprise & Datacenter Storage Form Factor (EDSFF), and is intended to enable server makers to install up to 1 PB of storage into 1U machines while supporting all enterprise-grade features. The first SSDs in the ruler form-factor will be available “in the near future,” and the form-factor itself is here for the long run: it is expandable in terms of interface performance, power, density and even dimensions.

    To better address client computers and some types of servers, Intel developed the M.2 form-factor for modular SSDs several years ago. While such drives have a lot of advantages when it comes to storage density, they were not designed to support functionality such as hot-plugging, and their cooling is yet another concern. By contrast, the ruler form-factor was developed specifically for server drives and is tailored to the requirements of datacenters. As Intel puts it, the ruler form-factor “delivers the most storage capacity for a server, with the lowest required cooling and power needs”.

    From a technical point of view, each ruler SSD is a long hot-swappable module that can accommodate tens of NAND flash or 3D XPoint chips, and thus offers capacities and performance levels that easily exceed those of M.2 modules.

    The initial ruler SSDs will use the SFF-TA-1002 “Gen-Z” connector, supporting PCIe 3.1 x4 and x8 interfaces with a maximum theoretical bandwidth of around 3.94 GB/s and 7.88 GB/s in each direction. Eventually, the modules could gain an x16 interface featuring 8 GT/s, 16 GT/s (PCIe Gen 4) or even 25 – 32 GT/s (PCIe Gen 5) per-lane data transfer rates (should the industry need SSDs with ~50 – 63 GB/s of throughput). In fact, the connectors are ready for PCIe Gen 5 speeds even now, but there are no hosts to support the interface yet.
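
    Those bandwidth figures follow from standard PCIe link arithmetic: per-lane transfer rate times lane count times 128b/130b encoding efficiency. A quick back-of-the-envelope check; the helper below is just an illustration built on well-known PCIe parameters:

    # Rough PCIe link bandwidth: per-lane rate (GT/s) x lanes x encoding efficiency.
    # PCIe 3.x/4.0/5.0 use 128b/130b encoding, so ~98.5% of raw bits carry data.
    def pcie_bandwidth_gbs(gt_per_s, lanes, payload_bits=128, line_bits=130):
        """Theoretical one-direction bandwidth in GB/s (ignoring protocol overhead)."""
        return gt_per_s * lanes * (payload_bits / line_bits) / 8   # 8 bits per byte

    print(round(pcie_bandwidth_gbs(8, 4), 2))    # PCIe 3.1 x4    -> ~3.94 GB/s
    print(round(pcie_bandwidth_gbs(8, 8), 2))    # PCIe 3.1 x8    -> ~7.88 GB/s
    print(round(pcie_bandwidth_gbs(32, 16), 1))  # PCIe Gen 5 x16 -> ~63 GB/s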

    One of the key things about the ruler form-factor is that it was designed specifically for server-grade SSDs and therefore offers a lot more than standards aimed at client systems. For example, compared to the consumer-grade M.2, a PCIe 3.1 x4-based EDSFF ruler SSD has extra SMBus pins for NVMe management, plus additional pins to charge power-loss-protection capacitors separately from the drive itself (thus enabling passive backplanes and lowering their costs). The standard is set to use a +12 V rail to power the ruler SSDs, and Intel expects the most powerful drives to consume 50 W or more.

    EDSFF itself has yet to be formalized as a standard, but the working group behind it already counts Dell, Lenovo, HPE, and Samsung among its promoters, and Western Digital as one of several contributors.

    Reply
  38. Tomi Engdahl says:

    Nvidia’s shares jump as AI and gaming drive graphics chip demand
    https://siliconangle.com/blog/2017/05/09/nvidia-shares-jump-ai-gaming-drive-graphics-chip-demand/

    Artificial intelligence and gaming once again boosted the fortunes of Nvidia Corp. as the maker of graphics chips reported better-than-expected results for the eighth quarter in a row.

    In its first fiscal quarter reported today, the company earned a profit of $507 million, or $533 million before certain expenses such as stock compensation, equaling 85 cents a share. That’s more than double a year ago. Revenue jumped 48 percent, to $1.94 billion.

    Profits blew away forecasts. Analysts had expected the company, a leader in graphics processing unit chips and systems used in high-performance gaming personal computers and servers used in artificial intelligence, to post a 68-cent profit on revenue of $1.9 billion.

    Nvidia also issued an outlook for the second quarter of $1.95 billion in revenue, give or take 2 percent. Gross profit margins are expected to be 58.4 percent, or 58.6 percent before certain expenses, both plus or minus a half-percentage point. For the first quarter, gross margins were 59.4 percent.

    NVIDIA Announces Financial Results for First Quarter Fiscal 2018
    http://www.marketwired.com/press-release/nvidia-announces-financial-results-for-first-quarter-fiscal-2018-nasdaq-nvda-2215162.htm

    Reply
  39. Tomi Engdahl says:

    What is ‘dark data’ and how could it be impacting your cloud migration?
    http://www.cloudpro.co.uk/it-infrastructure/cloud-management/6923/what-is-dark-data-and-how-could-it-be-impacting-your-cloud

    Too many businesses are tackling the problem of dark data with additional storage, but this is an expensive and inefficient fix

    Multi-cloud solutions are no longer simply an alternative for enterprise data storage, but are rapidly becoming part of the new normal in the business world. By utilising public, on-premises, private and non-cloud infrastructure, organisations are able to keep better control of their data and overall business strategy.

    However, more and more companies are beginning their migration to the cloud with no idea of the hidden pitfalls that could derail them further down the line. One of these is the growing issue of ‘dark data’, and the impact it can have on a company’s cloud migration.

    What is dark data?

    Dark data is information that is collected, processed and stored before likely never being used again. Because it is hidden, IT departments have no idea if it contains sensitive information or data that should have been deleted long ago, and is therefore a ticking time bomb of potential issues.

    “Similar to dark matter in physics, dark data often comprises most organizations’ universe of information assets,” Gartner describes. “Thus, organizations often retain dark data for compliance purposes only. Storing and securing data typically incurs more expense (and sometimes greater risk) than value.”

    Reply
  40. Tomi Engdahl says:

    Hiring in the age of dispersed technology spending
    http://www.cio.com/article/3209625/hiring-and-staffing/how-the-decentralization-of-it-impacts-tech-hiring.html

    The tech hiring landscape is shifting now that line-of-business units command their share of technology spending. The result? A rise in hybrid business-IT positions — and a need for a new approach to filling them.

    Technology is spreading beyond the confines of IT, and with this shift comes a change in control over technical hiring.

    Rather than rely on IT to configure and administer technologies they procure themselves, line of business (LOB) units such as HR, finance and marketing are hiring their own IT expertise in the form of hybrid positions that mix business and technical skills.

    “These roles are different from typical IT roles as they often focus on configuration and best practice implementation of external cloud solutions versus the traditional IT roles of developing software and implementing technology,” says Paul Watson, senior vice president at SenecaGlobal, a company offering IT solutions.

    But the shift toward hybrid LOB hiring doesn’t mean IT shouldn’t be involved. Here’s how IT leaders should approach hiring for technical roles outside of IT.

    Reply
  41. Tomi Engdahl says:

    The 7 hottest jobs in IT
    These emerging and resurging IT roles may be your best path forward in the years to come
    http://www.cio.com/article/3199084/careers-staffing/the-7-hottest-jobs-in-it.html

    If you’re burning out on your current gig, or feel that your role may be heading toward a dead end, it might be time for a change. To that end, we reached out to recruiters, executives, and tech pros, asking them to weigh in on the best opportunities they see evolving in the year ahead. What they came up with may surprise you: a mix of bleeding-edge tech and long-time standbys that make up the hottest jobs currently hiring in IT.

    Some of these high-demand roles come with signing bonuses, stock options, and the ability to work remotely, of course. More eyebrow-raising perks include college debt payoffs and planned sabbaticals.

    AI and deep learning engineers

    As AI speeds how we work with massive amounts of data and converts it into actionable insights, the area is starved for new talent. Corporate and consumer interest are on the rise in areas like automation and autonomous driving, which means engineers with deep learning experience are hard to find.

    VR/AR

    Recruiting firm Randstad recently reported that, despite being one of the most in-demand fields, there were fewer than 5,000 potential candidates for virtual reality jobs as of the end of last year.

    Security analyst

    With all the recent cybersecurity breaches and rise of advanced persistent threats, it should come as no surprise that security analysts are in high demand, marked by high starting salaries, potential for growth, and greater influence in the workplace these days.

    “[Security analysts] are expected to stay up to date on the latest intelligence, including hackers’ methodologies, in order to anticipate security breaches,”

    Cloud integrator

    According to IT association CompTIA, the evolution of IT can be divided into three stages: the mainframe era, the PC/internet era, and now the cloud/mobile era, where new technologies built with the cloud in mind will gain more traction, including machine learning and blockchain.

    Full-stack engineers

    Web users are increasingly demanding more robust, app-like consumer experiences, which has led to strong demand for front- and back-end web developers — and even more for those who combine those skills as full-stack engineers.

    “Technologies like progressive web apps are bringing the web experience closer to native on mobile platforms,”

    Data scientist

    As AI becomes part of the business toolkit, making decisions quickly based on large amounts of data is increasingly important to firms hiring new developers.

    “All developer roles are in high demand, but there is especially high demand for data scientists,”

    IoT engineer

    Randstad reports that job postings for IoT (internet of things) architects spiked more than 40 percent in the last year, and the company predicts that growth is just the start.

    “The internet of things is where the world of technology is going,” says Dino Grigorakakis, vice president of recruiting at Randstad. “Working as an IoT engineer has a lot of current and future opportunity, the position is often competitively compensated, and experience with IoT will prepare candidates to move forward within the information technology industry even if they choose to move away from working directly with the internet of things.”

    Reply
  42. Tomi Engdahl says:

    9 forces shaping the future of IT
    New technologies and approaches will free IT leaders to cut costs, save time and let machine intelligence do the heavy lifting.
    http://www.cio.com/article/3206770/it-strategy/9-forces-shaping-the-future-of-it.html

    IT is on the precipice of unprecedented change. Every company, now in the business of technology, is experiencing glimmers of larger shifts to come: automation, decentralized technology budgets, rapid adoption of cloud-based services, and most recently, artificial intelligence as a business necessity.

    Thanks to these emerging and converging trends, technology is increasingly freeing workers from routine tasks, from the warehouse to the C-suite. Massive amounts of data are being ingested in real time, as business decisions are beginning to be offloaded to machines, leaving more time to focus on planning, pursuing leads, and adopting new technologies.

    Reply
  43. Tomi Engdahl says:

    11 technologies developers should explore now
    From machine learning to digital twins, opportunities abound in emerging (and converging) tech trends
    http://www.cio.com/article/3191886/application-development/11-technologies-developers-should-explore-now.html

    New and evolving technologies are rapidly reshaping how we work—offering creative opportunities for developers who are willing to pivot and adopt new skills. We took a look at 11 tech trends experts say are likely to disrupt current IT approaches and create demand for engineers with an eye on the future.

    It isn’t all about The Next Big Thing. Future opportunities for developers are emerging from a confluence of cutting-edge technologies, such as AI, VR, augmented reality, IoT, and cloud technology … and, of course, dealing with the security issues that are evolving from these convergences.

    Internet of things security

    After tens of millions of connected devices were hijacked last year, even casual observers could see that unprotected IoT devices create nightmarish security problems.

    Artificial intelligence

    As we prepare for the next wave of autonomous vehicles, robots, and smart electronics, the demand for AI-savvy engineers is exploding.

    Machine learning

    A form of artificial intelligence, machine learning can take massive amounts of data to very quickly find patterns—like facial recognition—and solve problems, like recommending a movie to stream, without being explicitly programmed to do so.

    “Cognitive technologies, aided by bots and machine learning, will start to add value as organizations strive to find the ‘signals in the noise,’

    Data science

    Data science is another hot area, requiring multidisciplinary skills that vary by industry. Requirements can include experience with machine learning and AI to take large amounts of data and shape it in a form that can be used to make business decisions.

    Blockchain

    This means of creating a distributed ledger for transactions offers benefits in transparency and security, though a lack of standardization may slow its adoption across industries.

    Peter Loop, associate vice president and principal technology architect at Infosys, is bullish on the technology: “Despite misconceptions that blockchain is years away, we’ll see full deployments in financial services, insurance, and health care industries next year. This will completely disrupt our payment systems on an international scale.”

    Mesh app and service architecture (MASA)

    Apps that seamlessly stay connected as we move between home, commute, and work are increasingly in demand.

    “The purpose of a mesh network or app is that it will be highly available—everything connected to everything,” says Joseph Carson of Thycotic. “If the path is unavailable, it will find another device to establish the connection. We have seen this being used, for example, with the Tile tracker devices, which have created a community of tracking devices, and with bitcoin being a distributed ledger.”

    But some see a lack of device compatibility as a potential bottleneck.

    “Each vendor has their own way of trying to drive trust into this system, so they are all walled gardens, if they even exist at all,” says Derek Collison, formerly of Cloud Foundry and CEO of Apcera.

    Digital twins: Prepare to fail

    Software models tied to physical and virtual sensors can help predict product or service failures so that organizations are able to plan and assign resources to make repairs before the failure occurs. Advances in machine learning and the adoption of IoT technology are helping to bring down costs for this sort of predictive “digital twin” modeling, which boosts efficiency and can bring down operating costs over the life of, say, a jet engine or a power plant.

    Matias Woloski, CTO and co-founder of Auth0, says companies can also use digital twins in the concept and design stage, testing new products in simulations, then making changes until the engineers have the product they want. Findings from the digital twin are then used to build the product.

    “A few organizations have already launched digital-twin initiatives, although the primary projects leveraging this technology are the ones with large upfront development expense where the cost of failure is too high,” Woloski says.
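
    As a rough illustration of the digital-twin pattern described above, a software model of expected behaviour can run alongside live sensor readings and flag sustained divergence before it becomes a failure. The sketch below uses made-up assumptions (a bearing whose expected temperature rises linearly with load); it is not tied to any vendor’s twin platform:

    # Minimal "digital twin" sketch: a toy model of expected bearing temperature
    # runs next to live sensor data; sustained divergence between model and
    # sensor is treated as an early-warning sign.
    # The model, thresholds and data are illustrative assumptions only.

    def expected_temp_c(load_fraction, ambient_c=25.0):
        """Toy model: temperature rises roughly linearly with load."""
        return ambient_c + 40.0 * load_fraction

    def check_drift(readings, threshold_c=8.0, window=3):
        """Flag when the sensor exceeds the model by threshold_c for `window` samples in a row."""
        streak = 0
        for load, measured in readings:
            residual = measured - expected_temp_c(load)
            streak = streak + 1 if residual > threshold_c else 0
            if streak >= window:
                return True
        return False

    # (load fraction, measured temperature in C) pairs from a hypothetical sensor feed
    feed = [(0.5, 46.0), (0.6, 50.0), (0.6, 60.0), (0.7, 63.5), (0.7, 64.0)]
    print("schedule maintenance" if check_drift(feed) else "nominal")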

    Autonomous vehicles, robots, and appliances

    New opportunities are seen developing as AI and machine learning smarten up home devices, industrial equipment, cars, and drones. Research firm Gartner estimates that by 2020, automakers will send 61 million data-connected cars off production lines.

    “There are entire economies already cropping up in these areas,”

    Virtual and augmented realities

    After decades of hype, virtual reality and augmented reality finally seem to be having their moment. For those looking to develop products for these technologies, there are opportunities beyond creating isolated gaming experiences.

    “While these technologies are not pervasive yet, they definitely have matured in the last few years,”

    Humanlike assistants

    The next stage of AI could eliminate the clunky tools we now use to interact with the digital world. Importantly, these changes are also increasingly making their way into the office.

    “The workplace of the future is integrating intelligent apps into the day-to-day workplace to enhance overall productivity. We’re seeing significant levels of automation in IT that are driving 40 to 50 percent productivity improvements,”

    And the winner is … convergence

    While AI is probably the most frequently cited breakthrough technology of the year, the most important trend of 2017 may be the merging of emerging, disruptive technologies.

    Maarten Ectors of Canonical name-checked a dozen disparate technologies that, when joined, are much more than the sum of their parts: “the cloud, mobile, IoT, artificial intelligence, blockchain, augmented reality, voice interfaces, software-defined radio, Industry 4.0 [automation and data exchange in manufacturing], robotics, edge computing, and autonomous driving.”

    Rocket Software’s Spedding says the siloed technologies are converging partly because of a need for businesses to dig themselves out of their own data—for example, analyzing website traffic.

    “Add to that the increasing proliferation of new data sources, such as IoT,” he says, “and we see challenges just to keep up with the volume of information available to support business decision-making.”

    Reply
  44. Tomi Engdahl says:

    Micron Pushes Capacity Threshold in NVMe SSDs
    http://www.eetimes.com/document.asp?doc_id=1332136&

    Micron Technology unveiled its second generation of NVM Express (NVMe) SSDs at the Flash Memory Summit, using its 3D NAND to push capacities past 10TB.

    In an advance telephone briefing with EE Times, Dan Florence, SSD product manager for Micron’s Storage Business Unit, said the 9200 Series of NVMe SSDs were built from the ground up to break the shackles of legacy hard drive interfaces. The new storage portfolio is designed to address surging data demands while maximizing data center efficiency, so customers can improve their overall total cost of ownership, he said. It is also the storage foundation for the Micron SolidScale Platform, an NVMe-over-Fabrics architecture announced earlier this year, ahead of formal standards development.

    Reply
  45. Tomi Engdahl says:

    James Vincent / The Verge:
    DeepMind and Blizzard release SC2LE, a toolkit which includes an API, for AI research in real-time strategy game StarCraft II — What can computers learn from playing video games? Quite a lot actually — Teaching computers to play games has always been a useful (if somewhat crude) measure of their intelligence.

    DeepMind and Blizzard release new tools to train AI using Starcraft
    What can computers learn from playing video games? Quite a lot actually
    https://www.theverge.com/2017/8/9/16117850/deepmind-blizzard-starcraft-ai-toolset-api

    Teaching computers to play games has always been a useful (if somewhat crude) measure of their intelligence. But as our machines have gotten smarter, we’ve had to find new challenges for them. First it was chess, then Atari, then the board game Go, and now they’re taking on their biggest challenge yet: Starcraft.

    To be precise, Starcraft II, which researchers at Google’s AI subsidiary DeepMind say is the perfect environment for teaching computers advanced skills like memory and planning. Last year, DeepMind said it was going to work with Starcraft creator Blizzard to turn the space-based strategy game into a proper research environment for AI engineers, and today, that software is being released to the public.

    DeepMind and Blizzard open StarCraft II as an AI research environment
    https://deepmind.com/blog/deepmind-and-blizzard-open-starcraft-ii-ai-research-environment/
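
    The released toolkit (including the pysc2 Python component) exposes the game to agents through the usual observation/action/reward loop of reinforcement learning. The sketch below shows only the shape of that loop with placeholder classes; it is not the real pysc2 API, whose actual observation and action interfaces are documented in the DeepMind/Blizzard release.

    # Generic agent/environment loop of the kind SC2LE enables. The Environment
    # and RandomAgent classes are illustrative placeholders, NOT the pysc2 API.
    import random

    class Environment:
        """Stand-in for a game environment exposing observations, rewards and actions."""
        def __init__(self, episode_length=10):
            self.episode_length = episode_length
            self.step_count = 0

        def reset(self):
            self.step_count = 0
            return {"minimap": [[0] * 4 for _ in range(4)], "available_actions": [0, 1, 2]}

        def step(self, action):
            self.step_count += 1
            obs = {"minimap": [[0] * 4 for _ in range(4)], "available_actions": [0, 1, 2]}
            reward = 1.0 if action == 2 else 0.0          # toy reward signal
            done = self.step_count >= self.episode_length
            return obs, reward, done

    class RandomAgent:
        """Baseline agent: pick a random available action, as simple baselines do."""
        def act(self, obs):
            return random.choice(obs["available_actions"])

    env, agent = Environment(), RandomAgent()
    obs, total, done = env.reset(), 0.0, False
    while not done:
        obs, reward, done = env.step(agent.act(obs))
        total += reward
    print("episode reward:", total)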

    Reply
  46. Tomi Engdahl says:

    Tom Warren / The Verge:
    Microsoft unveils Windows 10 Pro for Workstations, aimed at server-grade PC hardware, expected to arrive this fall

    Microsoft reveals new Windows 10 Workstations edition for power users
    https://www.theverge.com/2017/8/10/16128072/microsoft-windows-10-pro-for-workstations-features

    Microsoft is officially unveiling Windows 10 Pro for Workstations today. While the operating system was originally rumored back in June, Microsoft is providing the full details on the special edition today. As expected, Windows 10 Pro for Workstations is primarily designed for server grade PC hardware and true power users. Windows 10 Pro for Workstations scales up for machines with a high number of logical processors and large amounts of RAM.

    The software giant is making four major changes to Windows 10 Pro for Workstations to support high-end PC hardware. Resilient File System (ReFS) will be enabled by default, providing more resilience against data corruption, optimizations for handling large data volumes, and auto-correction. Windows 10 Pro for Workstations will also include support for non-volatile memory modules (NVDIMM-N) with persistent memory. This means that read and write speeds will be as fast as possible, and files will still be there even if a workstation is switched off.

    Reply
  47. Tomi Engdahl says:

    DeepMind and Blizzard release new tools to train AI using Starcraft
    What can computers learn from playing video games? Quite a lot actually
    https://www.theverge.com/2017/8/9/16117850/deepmind-blizzard-starcraft-ai-toolset-api

    Reply
  48. Tomi Engdahl says:

    The software industry has a record number of jobs in Finland

    In Finland, there is proportionally more software work available than in other European countries.

    According to Eurostat’s recent statistics, 6.6% of the workforce is employed in the ICT sector in Finland. In other European countries, the average is 3.7 per cent.

    Janne Kalliola, president of the Code of Finland association, writes about this in Aamulehti. The association was founded by Finnish software companies.

    According to Kalliola, the industry’s number one problem is recruiting. The sector has difficulty finding workers in every European country.

    Over the past year, more than every second company in the Finnish industry suffered from recruitment problems.

    Source: http://www.tivi.fi/Kaikki_uutiset/ohjelmistoalalla-ennatysmaara-tyota-suomessa-6668413

    Reply
  49. Tomi Engdahl says:

    Why Everyone Is Hating on IBM Watson—Including the People Who Helped Make It
    http://gizmodo.com/why-everyone-is-hating-on-watson-including-the-people-w-1797510888

    You’ve probably seen the Watson commercials, where what looks like a sentient box interacts with celebrities like Bob Dylan, Carrie Fisher, and Serena Williams; or doctors; or a young cancer survivor. Maybe you caught the IBM artificial intelligence technology’s appearance in H&R Block’s Super Bowl commercial starring Jon Hamm. “It is one of the most powerful tools our species has created. It helps doctors fight disease,” Hamm says. “It can predict global weather patterns. It improves education for children everywhere. And now we unleash it on your taxes.”

    In the commercial, which advertises what is essentially a smart tax prep service, Watson is portrayed as a glowing sci-fi cube that holds the key to humankind’s greatest problems. But it’s not a wizard. And lately, several experts from Silicon Valley and Wall Street have spoken up, criticizing the people behind the curtain—asking if Watson is a joke or a savior for IBM. So why is everyone being so tough on Big Blue and its golden child?

    The splashy vision of Watson has been integral to IBM’s branding since the groundbreaking question-answering system made its debut in 2011 on Jeopardy! Now, thanks to billions of dollars of investment and years of aggressive marketing, Watson has come to represent AI in the popular imagination. Siri and Alexa may get more attention these days, but when it comes to big-data computing, Watson was the first to offer up its name, and it has remained a cocksure mascot.

    “What did you talk about? And what can IBM do for our country?”

    Of course, the answer is, a lot, according to Rometty: IBM can improve health care for vets, modernize the nation’s tech infrastructure, create jobs, and train the next generation of “new collar” workers. But it’s the “cognitive computing” (IBM’s buzzword for AI) of Watson that will help many of these visions come true. Peppered throughout the softball exchange were a dozen mentions of Watson. She made deceptively aggrandizing claims that by the end of this year Watson “will touch a billion people” and “be able to address, diagnose, and treat 80 percent of what causes 80 percent of the cancer in the world.”

    The interview was representative of IBM’s marketing strategy: promote Watson as a world-changing technology, describe it in quixotic ways, then let reporters run wild with clickbait-y headlines about a robot helping humans do their jobs

    IBM seems to believe the Watson brand can breathe new life into their company. And it sure could use some resuscitation right about now. IBM’s revenue has fallen for 22 consecutive quarters.

    What is Watson, really?

    When Watson debuted in 2011, it was an outstanding achievement that proved the century-old hardware company could still find its way to the bleeding edge. It ushered in the era of big-data computing, or as Rometty put it during her conversation with Cramer: “We are the ones that woke up the AI world here.”

    Watson was more than just a supercomputer that answered trivia questions. It could process natural language, including wordplay and unusually phrased questions. When Watson was given a question, dozens of algorithms analyzed the query and produced a list of responses, then ranked the answers. If the confidence score was high enough, Watson hit the buzzer.
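
    That pipeline, many scorers feeding a combined confidence value and a buzz-in threshold, can be illustrated with a toy sketch. Everything below (scorers, weights, evidence, candidates) is made up for illustration and is not IBM’s implementation:

    # Sketch of a confidence-thresholded answer pipeline in the style described
    # above: scorers rate each candidate, the scores are combined into a
    # confidence value, and the system only "buzzes" if the best answer clears
    # a threshold. Scorers, weights and data are illustrative only.

    def evidence_support(answer, evidence):
        """Fraction of evidence snippets that mention the candidate answer."""
        hits = sum(1 for snippet in evidence if answer.lower() in snippet.lower())
        return hits / max(len(evidence), 1)

    def length_prior(answer):
        """Toy prior: quiz-style answers tend to be short noun phrases."""
        return 1.0 if len(answer.split()) <= 3 else 0.3

    def rank_answers(candidates, evidence, buzz_threshold=0.6):
        scored = sorted(
            ((0.8 * evidence_support(a, evidence) + 0.2 * length_prior(a), a) for a in candidates),
            reverse=True,
        )
        best_conf, best = scored[0]
        if best_conf >= buzz_threshold:
            return best, round(best_conf, 2)   # confident enough to "buzz in"
        return None, round(best_conf, 2)       # stay silent

    evidence = [
        "Helsinki hosted the 1952 Summer Olympics.",
        "The 1952 Games were held in Helsinki, Finland.",
        "Oslo hosted the 1952 Winter Olympics.",
    ]
    print(rank_answers(["Helsinki", "Oslo", "Stockholm"], evidence))   # ('Helsinki', 0.73)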

    “IBM Watson is the Donald Trump of the AI industry—outlandish claims that aren’t backed by credible data.”

    Three years later, in 2014, IBM created the Watson business unit to figure out ways to use the technology to actually make money. Now “Watson” represents a whole array of AI technologies built and acquired by IBM—including sentiment analysis, voice recognition, and natural-language processing. UX and UI designers use these tools to create platforms for IBM clients in various industries. Many of these collaborations inspire a wave of buzz, generating headlines that read like the newspaper clippings collected by proud parents of a prodigy.

    Many of the products mentioned in those articles rely on machine learning—algorithms that “learn” how to perform tasks by processing massive amounts of data (hence the buzzword big-data computing) and finding patterns in the information.

    But the “cognitive computing” technologies under the Watson umbrella aren’t as unique as they once were. “In the data-science community the sense is that whatever Watson can do, you can probably get as freeware somewhere, or possibly build yourself with your own knowledge,”

    Reply
  50. Tomi Engdahl says:

    Q&A: How Can Artificial Intelligence Impact Smart Cities?
    http://www.electronicdesign.com/embedded/qa-how-can-artificial-intelligence-impact-smart-cities?code=UM_NN7SC1&utm_rid=CPG05000002750211&utm_campaign=12395&utm_medium=email&elq2=0a77e1cc74f249fe97bc825d320adefc

    Industry analyst Susan Etlinger shares her views on where AI technology is now, what needs to change for it to evolve, and its future as a utility.

    How is Big Data being handled?

    Right now, it is a bit different from country to country. For example, in Germany, it is highly regulated. In the U.S., it is less regulated. Then, there is the General Data Protection Regulation [GDPR, a European Union initiative], so the whole world is dealing with this in different ways.

    I think the most important thing for organizations is to be able to think about what questions they want to be able to answer and what services they will be able to provide, and then use data related to that (as opposed to collecting data and then later figuring out what to do with it). I think part of the problem now is that there is so much data but not that much information. Just because a sensor collects data does not necessarily mean that there is a good use for the data. There might be things that can be interesting but not valuable.

    I think one of the things we need to think about is the context in which we collect data, in addition to the potential uses to which we want to put the data. Then also do what is called “scenario planning” to try to determine the best thing that could happen, and the worst thing that could happen.

    How can artificial intelligence be used in smart cities?

    The massive amount of data makes AI different now than before. In addition, the algorithms are getting better and computers are able to handle Big Data more quickly. AI becomes interesting for smart cities when AI developers create systems that can learn from past experiences. For example, in a system where energy spikes tend to happen, AI can learn where they usually occur and under which circumstances. You can then make better use of your power grid. Other examples could be systems that, by learning, can provide services to disabled people or elderly people who might not have the opportunity to go grocery shopping, for example.
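
    A toy version of the energy-spike example: learn a per-hour baseline from historical load data and flag readings that fall far outside the usual range. The data, the 3-sigma rule, and the helper names below are illustrative assumptions, not a description of any real smart-city system:

    # Toy smart-grid anomaly check: learn a per-hour mean/stddev from historical
    # load data, then flag new readings that are far outside the usual range.
    from collections import defaultdict
    from statistics import mean, pstdev

    def learn_baseline(history):
        """history: list of (hour_of_day, load_mw). Returns {hour: (mean, stddev)}."""
        by_hour = defaultdict(list)
        for hour, load in history:
            by_hour[hour].append(load)
        return {h: (mean(v), pstdev(v)) for h, v in by_hour.items()}

    def is_spike(baseline, hour, load, n_sigma=3.0, min_band=1.0):
        """True if the reading deviates from the learned baseline by more than n_sigma."""
        mu, sigma = baseline[hour]
        return abs(load - mu) > n_sigma * max(sigma, min_band)

    history = [(18, 95), (18, 102), (18, 99), (18, 97), (3, 40), (3, 42), (3, 38)]
    baseline = learn_baseline(history)
    print(is_spike(baseline, 18, 101))  # within the usual evening range -> False
    print(is_spike(baseline, 3, 75))    # unusually high overnight load  -> True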

    I think the uses for AI are almost infinite. It is just a question of what are the right things to do, and being conscious of commercial potential and the potential downsides. The advantage is the ability to fix problems as they are beginning to happen instead of long after they have happened.

    How will AI systems be deployed?

    It is very complicated. Part of the challenge we have now is that a lot of AI’s resources and power have been gathered by very few companies. So there is sort of an AI monopoly that has started to happen. Google, Microsoft, and Facebook have all started AI ecosystems. In the future, we need to think about AI as a service and a utility itself. For now, at least, it seems that it might be good to start with the idea of having a service where you can be paid for volume of data, or paid per hour or per project to have access to that technology and apply it for particular use.

    Right now, not enough people know how to code and build intelligent systems at that level; the universities aren’t turning them out fast enough. That will probably change over time, and we’ll have a better workforce and better artificial-intelligence systems.

    Will everybody get access to artificial intelligence?

    I think in the future, AI is going to be a utility. It is going to be as normal as cloud computing today. It has taken 10 years for cloud computing to become something that companies accept, and some companies still don’t accept it. But I think it is going to take at least 10 years before we really have broad access to AI.

    Reply
