Computer trends 2017

I did not have time to post my computer technology predictions at the end of 2016. Because I missed the year-end deadline, I thought there was no point in posting anything before the news from CES 2017 had been published. Here are some of my picks on the current computer technology trends:

CES 2017 had three significant technology trends: deep learning goes deep, Alexa everywhere, and Wi-Fi gets meshy. The PC sector seemed to be pretty boring.

Gartner expects IT sales to grow (2.7%), but hardware sales will not grow and may even drop this year. TEKsystems’ 2017 IT forecast shows IT budgets rebounding from a slump in 2016, and IT leaders’ confidence high going into the new year. But challenges around talent acquisition and organizational alignment will persist. Programming and software development continue to be among the most crucial and hard-to-find IT skill sets.

Smartphone sales (expected to be 1.89 billion) and PC sales (expected to be 432 million) will not grow in 2017. According to IDC, PC shipments declined for a fifth consecutive year in 2016 as the industry continued to suffer from stagnation and a lack of compelling drivers for upgrades. Both Gartner and IDC estimated that PC shipments declined about 6% in 2016. Revenue in the traditional (non-cloud) IT infrastructure segment decreased 10.8 per cent year over year in the third quarter of 2016. The only PC category with potential for growth is ultramobile (which includes the Microsoft Surface and Apple MacBook Air). Demand for memory chips is increasing.

Browsers suffer from JavaScript-creep disease: the browsing experience seems to become slower even though computers and broadband connections are getting faster all the time. Bloat on web pages has been going on for ages, and this trend seems set to continue.

Microsoft is trying all it can to make people switch from older Windows versions to Windows 10. Microsoft says that continued usage of Windows 7 increases maintenance and operating costs for businesses through malware attacks that could have been avoided by upgrading to Windows 10. Microsoft’s message: Windows 7 does not meet the demands of modern technology, and it recommends Windows 10. In February 2017 Microsoft ends its 20-year-long tradition of monthly security updates. The Windows 10 “Creators Update” is coming in early 2017 for free, featuring 3D and mixed reality, 4K gaming, and more.

Microsoft plans to emulate x86 instructions on ARM chips, throwing a compatibility lifeline to future Windows tablets and phones. Microsoft’s x86-on-ARM64 emulation is coming in 2017. This capability is coming to Windows 10, though not until “Redstone 3” in the fall of 2017.

Parents should worry less about the amount of time their children spend using smartphones and computers or playing video games, because screen time is actually beneficial, the University of Oxford has concluded. 257 minutes is the time teens can spend on computers each day before it starts to harm their wellbeing.

Outsourcing IT operations to foreign countries is not trendy anymore, and companies live in uncertain times. India’s $150 billion outsourcing industry stares at an uncertain future. In the past five years, revenue and profit growth for the top five companies listed on the BSE have halved. Industry leader TCS too felt the impact as it made a shift in business model towards software platforms and chased digital contracts.

Containers will become hot this year, and cloud will stay hot. Research firm 451 Research predicts that containerization will be a US$762 million business this year and will grow into a $2.6 billion software business by 2020 (a 40 per cent annual growth rate).
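As a quick sanity check on those figures (my own arithmetic, not from the 451 Research report), the implied compound annual growth rate can be computed for different base years:

```python
# Implied compound annual growth rate (CAGR) between the $762 million
# and $2.6 billion figures quoted above, over three- and four-year spans.
def cagr(start, end, years):
    """Compound annual growth rate as a fraction."""
    return (end / start) ** (1 / years) - 1

print(f"2017 -> 2020 (3 years): {cagr(762e6, 2.6e9, 3):.0%}")
print(f"2016 -> 2020 (4 years): {cagr(762e6, 2.6e9, 4):.0%}")
```

The quoted 40 per cent rate falls between the three-year (~51%) and four-year (~36%) readings, so it is plausible depending on which base year is assumed.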

Cloud services are expected to see a 22 percent annual growth rate: by 2020 the sector would grow from the current $22.2 billion to $46 billion. In Finland, 30% of companies now prefer to buy cloud services when buying IT (20 per cent of IT budgets goes to cloud). Cloud spend will make up over a third of IT budgets by 2017: cloud and hosting services will be responsible for 34% of IT budgets by 2017, up from 28% at the end of 2016, according to 451 Research. Cloud services have many advantages, but they also have disadvantages. In five years, SaaS will be the cloud that matters.

As the cloud grows, so does spending on cloud hardware by the cloud companies. Cloud hardware spend has hit US$8.4bn per quarter as traditional kit sinks, and 2017 is forecast to see cloud kit clock $11bn every 90 days. In 2016’s third quarter, vendor revenue from sales of infrastructure products (server, storage, and Ethernet switch) for cloud IT, including public and private cloud, grew by 8.1 per cent year over year to $8.4 billion. Private cloud accounted for $3.3 billion, with the rest going to public clouds. Data centers need lower-latency components, so Google searches for better silicon.

The first signs of the decline and fall of the 20+ year x86 hegemony will appear in 2017. The availability of industry leading fab processes will allow other processor architectures (including AMD x86, ARM, Open Power and even the new RISC-V architecture) to compete with Intel on a level playing field.

USB-C will now come to screens – the C-type USB connector promises to become the single physical interface for all equipment. The HDMI connector will disappear from laptops in the future. Thunderbolt 3 is arranged to work over USB Type-C, but it’s not the same thing (Thunderbolt is four times faster than USB 3.1).

World’s first ‘exascale’ supercomputer prototype will be ready by the end of 2017, says China

It seems that Oracle Begins Aggressively Pursuing Java Licensing Fees in 2017. Java SE is free, but Java SE Suite and various flavors of Java SE Advanced are not. Oracle is massively ramping up audits of Java customers it claims are in breach of its licences – six years after it bought Sun Microsystems. Huge sums of money are at stake. The version of Java in contention is Java SE, with three paid flavours that range from $40 to $300 per named user and from $5,000 to $15,000 for a processor licence. If you download Java, you get everything – and you need to make sure you are installing only the components you are entitled to and you need to remove the bits you aren’t using.

Your Year in Review, Unsung Hero article sees the following trends in 2017:

  • A battle between ASICs, GPUs, and FPGAs to run emerging workloads in artificial intelligence
  • A race to create the first generation of 5G silicon
  • Continued efforts to define new memories that have meaningful impact
  • New players trying to take share in the huge market for smartphones
  • An emerging market for VR gaining critical mass

Virtual reality will stay hot on both PC and mobile. “VR is the heaviest heterogeneous workload we encounter in mobile—there’s a lot going on, much more than in a standard app,” said Tim Leland, a vice president for graphics and imaging at Qualcomm. The challenge is the need to process data from multiple sensors and respond with updated visuals in less than 18 ms to keep up with the viewer’s head motions, so the CPUs, GPUs, DSPs, sensor fusion core, display engine, and video-decoding block are all running at close to full tilt.
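A rough back-of-the-envelope reading of that 18 ms figure (my own arithmetic; it treats the budget as a single per-frame deadline, which is a simplification):

```python
# Minimum display refresh rate implied if the whole 18 ms motion-to-photon
# budget is treated as one frame deadline.
budget_ms = 18
min_refresh_hz = 1000 / budget_ms
print(f"{min_refresh_hz:.1f} Hz")  # roughly 55.6 Hz, i.e. at least ~60 Hz panels
```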

932 Comments

  1. Tomi Engdahl says:

    Nvidia Rolls Volta GPU For ‘AI Revolution’
    Volta taps 12nm TSMC and Samsung HBM2
    http://www.eetimes.com/document.asp?doc_id=1331729

    Machine learning is sparking a new era in computing, according to Nvidia’s chief executive, who hopes that his latest GPU, Volta, becomes its favorite fuel.

    The Volta announcement was the centerpiece of a two-hour keynote at GTC on “Powering the AI Revolution.” The annual Nvidia event attracted a record of more than 7,000 attendees, thanks to rising interest in using an expanding array of neural networks across a broadening horizon of applications from agriculture to pharmaceuticals and public safety.

    Reply
  2. Tomi Engdahl says:

    Why the Largest Companies in the World Count on Linux Servers
    http://www.linuxjournal.com/content/why-largest-companies-world-count-linux-servers?utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+linuxjournalcom+%28Linux+Journal+-+The+Original+Magazine+of+the+Linux+Community%29

    Linux started its life in the data center as a cheaper alternative to UNIX. At the time, UNIX operating systems ruled the industry and for good reason. They were performant, fault tolerant and extremely stable. They also were very expensive and ran on very proprietary hardware. A lot of the familiar utilities and applications developed for those UNIX platforms eventually were ported over to Linux. So, once Linux ran services like Apache, it came as no surprise that Linux would usurp and replace the very same technologies that once inspired its creation. The very best part was that Linux ran on commodity x86 hardware. At the end of the day, anyone could deploy a Linux server at a fraction of the cost to deploy something from Sun Microsystems, Silicon Graphics (SGI) or from any other UNIX distributor.

    Will Anything Make Linux Obsolete?
    http://www.linuxjournal.com/content/will-anything-make-linux-obsolete?utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+linuxjournalcom+%28Linux+Journal+-+The+Original+Magazine+of+the+Linux+Community%29

    Reply
  3. Tomi Engdahl says:

    History of Git
    http://hackaday.com/2017/05/11/history-of-git/

    Git is one of those tools that is so simple to use that you often don’t learn a lot of its nuance. You wind up cloning a repository from the Internet and that’s about it. If you make changes, maybe you track them, and if you are really polite you might create a pull request to give back to the project. But there’s a lot more you can do. For example, did you know that Git can track collaborative Word documents? Or manage your startup files across multiple Linux boxes?
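The startup-files trick mentioned above is usually done with a bare repository whose work tree is your home directory. A minimal sketch driven from Python (the demo directory, the file contents and the dotgit helper are illustrative, not from the article):

```python
# Track "dotfiles" with a bare Git repository whose work tree is a home
# directory. A temporary directory stands in for $HOME in this demo.
import os
import subprocess
import tempfile

home = tempfile.mkdtemp()                     # stand-in for $HOME
gitdir = os.path.join(home, ".dotfiles")      # bare repo: history only

def dotgit(*args):
    """Run git against the bare repo, using the home dir as work tree."""
    return subprocess.run(
        ["git", "--git-dir", gitdir, "--work-tree", home, *args],
        capture_output=True, text=True, check=True)

subprocess.run(["git", "init", "--bare", gitdir],
               capture_output=True, text=True, check=True)

with open(os.path.join(home, ".profile"), "w") as f:
    f.write("export EDITOR=vim\n")            # a dotfile worth tracking

dotgit("add", ".profile")
dotgit("-c", "user.email=demo@example.com", "-c", "user.name=Demo",
       "commit", "-m", "track .profile")
print(dotgit("log", "--oneline").stdout)
```

On a real machine you would point --work-tree at $HOME and alias the long git invocation to something short.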

    Git belongs to a family of software products that do revision (or version) control.

    Closed Tool

    A very large distributed team develops the Linux kernel. By late 1998 the team was struggling with revision management. A kernel developer, [Larry McVoy], had a company that produced a scalable distributed version control product called BitKeeper. Although it was a commercial product, there was a community license that allowed you to use it as long as you didn’t work on a competing tool while you were using the product and for a year thereafter. The restriction applied to both commercial and open source competition. Although the product kept most data on your machine, there was a server component, so the company could, in fact, track your usage of the product.

    In 2002, the Linux kernel team adopted BitKeeper. [Linus Torvalds] was among the proponents of the new system. However, other developers (and interested parties like [Richard Stallman]) were concerned about using a proprietary tool to develop open source.

    For the most part, things quieted down with only occasional flame skirmishes erupting here and there. That is until 2005 when [McVoy’s] company announced it would discontinue the free version of BitKeeper.

    As a result, two projects spun up to develop a replacement. Mercurial was one and Git, of course, was the other.

    Of course, both Mercurial and Git came to fruition, with Git becoming not only the kernel team’s version control system but the system for a lot of other people as well.

    Initial development is said to have taken a few days. Since the version 1.0 release in late 2005, the software has spawned more than one major website and has become the system of choice for many developers, both open source and commercial.

    Reply
  4. Tomi Engdahl says:

    IT experts in this field are now being snapped up: “You must have muscle”

    Earlier this year, Tieto’s IT service management team was hard pressed when as many as five experts moved to other companies. According to Sofigate CEO Juha Huovinen, ITSM specialists are currently very sought-after professionals.

    Sofigate recruited four of the people who left, though, according to Huovinen, through a normal recruitment process.

    “That’s right. Just this year, we have recruited 20 people from the industry. In Finland, we have altogether 60 ITSM experts,” he says.

    A major factor in the recruitment boom is the transition to service integration, the SIAM model, which has been a hot trend for three years now. There, one IT vendor manages the other suppliers on behalf of the customer. According to Huovinen, however, it may be problematic if one actor holds all the reins.

    Sought-after digital experts

    There is fierce recruitment competition in the ITSM labor market.

    “Yes, there is, especially in the digital era. But we are a management company; we do not seek coders but management experts,” Huovinen describes.

    Sofigate’s employees know how to run a digital business and are able to talk smoothly with both business and IT people. According to Huovinen, the company has been successful in its recruiting.

    Demand for SIAM services is on the rise

    Tieto’s information technology and service unit manager Patrik Ekström also recognizes the ITSM trend.

    “There is a growing need for ITSM experts in the market, and many players seek this expertise. We have also recruited more ITSM know-how at Tieto,” he says.

    There are a number of ITSM tools on the market. The people who moved to Sofigate were experts in Tieto’s ServiceNow tool. It manages Tieto’s production operations, including platform services, application services, and configuration management.

    Source: http://www.tivi.fi/Kaikki_uutiset/taman-alan-it-asiantuntijat-viedaan-nyt-kasista-pitaa-olla-muskelia-6648889

    Reply
  5. Tomi Engdahl says:

    Intel’s Itanium, once destined to replace x86 processors in PCs, hits end of line
    Intel has released its Itanium 9700 chip, but that also means the end for the processor family.
    http://www.pcworld.com/article/3196080/data-center/intels-itanium-once-destined-to-replace-x86-in-pcs-hits-end-of-line.html

    It’s the end of the line for Intel’s Itanium chip, a troubled processor family that spawned many product delays and bad blood between HP and Oracle.

    Intel on Thursday started shipping its latest Itanium 9700 chip, code-named Kittson, in volume. It’s the last of the Itanium chips, which first appeared in early 2001.

    Beyond Kittson, there will be no more chips coming from the Itanium family, an Intel spokesman said in an email. That ends a tumultuous, 16-year journey for Itanium, which Intel once envisioned as a replacement for x86 chips in 64-bit PCs and servers.

    Support for Itanium has dwindled over the past decade, which has led to its gradual death. Server makers stopped offering hardware, software development stalled, and Intel has been openly asking customers to switch to x86-based Xeon chips.

    When introduced in 2001, the Itanium instruction set was much more advanced than that of the x86. It had features like machine-check, ECC memory and RAS (reliability availability and serviceability). But the first Itanium chips were extremely power hungry.

    Intel hedged its bets and planned for Itanium 64—also called IA-64—to ultimately go down the stack from servers to PCs. But that never happened, and the market shifted quickly after AMD introduced the first 64-bit x86 server chips in 2003. That gave AMD a competitive edge over Intel, which still was offering 32-bit x86 chips.

    Intel got back to the drawing board and sped up the development of x86 chips with 64-bit extensions, which were released soon after. The transition disrupted Intel’s vision of Itanium as an architecture of the future for 64-bit servers and PCs. Instead, x86 chips started moving up the stack into more powerful servers.

    Intel subsequently made a concerted push to move customers from Itanium to x86 in 2012 when it introduced the new 15-core Xeon E7 v2 chip with Itanium-like error correction and RAS features.

    At the time, HP also made it easier to transition from Itanium to x86-based Xeon servers. The message from Intel and HP, notes Dean McCarron, principal analyst at Mercury Research, was clear: It was time to move on from Itanium.

    Companies with machines running on HP-UX and Integrity will likely remain committed to Itanium, and won’t transition for many years. But that won’t stop HPE from trying to get customers to move to x86.

    HPE will enable “customers to re-host their HP-UX workloads on Linux-based containers running on industry-standard x86 servers in the future,” said Jeff Kyle, director of product management for enterprise servers at HPE.

    Xeon and x86 will drive Intel into the future, especially with applications and data center designs changing.

    Reply
  6. Tomi Engdahl says:

    Open Source vs Proprietary: What organisations need to know
    Meeting IT and business priorities with a hybrid approach to open source and proprietary solutions
    https://www.sas.com/en_gb/whitepapers/open-source-vs-proprietary.html

    Open source technologies, like Hadoop, R and Python, have been vital to the spread of big data. However, production deployment of these technologies has its own, often unexpected costs and projects are not necessarily succeeding as hoped. However, organisations see clear benefits from both open source and proprietary solutions, with a mixture of the two perceived as delivering the best return on investment.

    This report, based on a survey of 300 senior IT and business decision-makers across the UK & Ireland, aims to create a better understanding of the key considerations for open source and proprietary analytics technology, drivers for business transformation and any limitations to success.

    Reply
  7. Tomi Engdahl says:

    TensorFlow: I want to like you, but you’re tricksy
    Wrestling with Google’s machine learning framework
    https://www.theregister.co.uk/2017/05/12/tensor_flow_hands_on/

    Users describe their ML models as dataflow graphs, combining a number of machine learning techniques into a single model. TensorFlow itself does nothing to reduce the learning curve of ML (in fact it might make it steeper), but Google’s framework does enormously simplify the deployment of ML models. If you think of ML model construction as data science, then TensorFlow is a data engineering tool for deployment.
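The dataflow-graph idea can be illustrated with a from-scratch toy (this is a sketch of the concept, not TensorFlow’s actual API): nodes describe operations, edges carry values, and nothing computes until the graph is run with inputs fed in.

```python
# Toy dataflow graph: build nodes first, then evaluate the graph by
# feeding values into placeholder nodes -- the core idea behind
# TensorFlow's deferred-execution model (TF 1.x style).
import operator

class Node:
    def __init__(self, op=None, *inputs):
        self.op, self.inputs = op, inputs

    def run(self, feed):
        if self in feed:                      # placeholder: value is fed in
            return feed[self]
        vals = [n.run(feed) for n in self.inputs]
        return self.op(*vals)

x = Node()                                    # placeholder input
w = Node(lambda: 3.0)                         # constant weight
product = Node(operator.mul, x, w)
output = Node(lambda v: max(0.0, v), product) # ReLU-style activation

print(output.run({x: 2.0}))   # 6.0
print(output.run({x: -1.0}))  # 0.0
```

The same graph can be run repeatedly with different fed values, which is what lets a framework map one model description onto many devices.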

    Google open-sourced TensorFlow in November 2015 under an Apache 2.0 license, with a 1.0 release in February this year that introduced a number of improvements. For example, TensorFlow had been limited to C++ and Python, but version 1.0 introduces interfaces to Go and Java, and there’s a C library for building new language interfaces.

    What makes TensorFlow so powerful for deployment is its ability to map ML models onto a number of hardware devices and platforms with very little change to the model. The TensorFlow framework will happily run on Android or iOS devices, on a laptop, on machines with one or more GPUs, or on huge networks of specialised machines with many GPU cards.

    Most ML can be divided into two stages: training a model and running the model to make inferences from new data. Generally it is the training phase that needs the most computational power: the model is fed data that has already been classified, and it learns (using a number of well-known techniques) the features in the data that allow it to identify those features in unknown data. Once a model has been trained, it can be used to infer new facts about data it hasn’t “seen” before; running this inference model generally takes much less power, so it can run on a mobile phone, for instance.

    The bigger the dataset used for training, the more power is required.
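The split between the two phases can be seen even in a toy model (a sketch in plain Python, not TensorFlow): training loops over labelled data many times, while inference is a single cheap evaluation.

```python
# Training vs inference on a one-weight linear model, fit by gradient
# descent on squared error. Training is the expensive loop; inference
# is one multiply.
def train(samples, epochs=200, lr=0.05):
    """Training phase: many passes over labelled data to learn a weight."""
    w = 0.0
    for _ in range(epochs):
        for x, y in samples:
            w -= lr * 2 * (w * x - y) * x     # gradient of (w*x - y)^2
    return w

def infer(w, x):
    """Inference phase: a single multiply -- cheap enough for a phone."""
    return w * x

data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]   # labelled samples of y = 2x
w = train(data)
print(round(w, 3), round(infer(w, 5.0), 3))
```

Here training does epochs × samples updates while inference does one multiplication, which is why trained models can be shipped to far weaker hardware than they were trained on.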

    What is it like to work with TensorFlow? I’m not going to pretend I’m in a position to start writing new ML models on the platform, but – luckily – we may not need to. The TensorFlow webpage contains a link to a TensorFlow model zoo library on GitHub with a number of user-contributed models. The zoo has around 27 exhibits at the moment, but it should be pointed out that other user-generated models are available around the interwebz, so – for instance – if you want a movie recommendation engine you could pick TF-recomm from user songgc.

    Reply
  8. Tomi Engdahl says:

    Natasha Singer / New York Times:
    How Google outmaneuvered Apple and Microsoft over five years to gain dominance in classrooms with Chromebooks and free apps like Classroom, Docs, and Gmail

    How Google Took
    Over the Classroom
    https://www.nytimes.com/2017/05/13/technology/google-education-chromebooks-schools.html

    The tech giant is transforming public education with low-cost laptops and
    free apps. But schools may be giving Google more than they are getting.

    Today, more than half the nation’s primary- and secondary-school students — more than 30 million children — use Google education apps like Gmail and Docs, the company said. And Chromebooks, Google-powered laptops that initially struggled to find a purpose, are now a powerhouse in America’s schools. Today they account for more than half the mobile devices shipped to schools.

    “Between the fall of 2012 and now, Google went from an interesting possibility to the dominant way that schools around the country” teach students to find information, create documents and turn them in, said Hal Friedlander, former chief information officer for the New York City Department of Education, the nation’s largest school district. “Google established itself as a fact in schools.”

    In doing so, Google is helping to drive a philosophical change in public education — prioritizing training children in skills like teamwork and problem-solving while de-emphasizing the teaching of traditional academic knowledge, like math formulas. It puts Google, and the tech economy, at the center of one of the great debates that has raged in American education for more than a century: whether the purpose of public schools is to turn out knowledgeable citizens or skilled workers.

    Reply
  9. Tomi Engdahl says:

    David Cassel / The New Stack:
    Dwindling expertise a big worry as COBOL still underpins much of the US financial infrastructure — Think COBOL is dead? About 95 percent of ATM swipes use COBOL code, Reuters reported in April, and the 58-year-old language even powers 80 percent of in-person transactions.

    COBOL Is Everywhere. Who Will Maintain It?
    https://thenewstack.io/cobol-everywhere-will-maintain/

    Think COBOL is dead? About 95 percent of ATM swipes use COBOL code, Reuters reported in April, and the 58-year-old language even powers 80 percent of in-person transactions. In fact, Reuters calculates that there’s still 220 billion lines of COBOL code currently being used in production today, and that every day, COBOL systems handle $3 trillion in commerce. Back in 2014, the prevalence of COBOL drew some concern from the trade newspaper American Banker.

    “The mainframe was supposed to have been replaced by farms of smaller commodity servers and cloud computing by now, but it still endures at many banks,” the trade pub reported.

    But should we be concerned that so much of our financial infrastructure runs on such an ancient foundation? American Banker found 92 of the top 100 banks were still using mainframe computers — and so were 71 percent of the companies in the Fortune 500. As recently as five years ago, the IT group at the Bank of New York Mellon had to tend to 112,500 different COBOL programs — 343 million lines of code, according to a 2012 article in Computerworld. And a quick Google search today shows the Bank of New York Mellon is still hiring COBOL developers.

    COBOL was originally developed in the 1950s as a stop-gap by the Department of Defense, but then computer manufacturers began supporting it, “resulting in widespread adoption,”

    There’s now some concerns about where the next generation of COBOL programmers will come from. In 2014, American Banker reported banks are “having trouble finding talented young techies who want to work in a bank and a shortage of people with mainframe and COBOL skills.”

    “COBOL isn’t as sexy as working with Elixir, or Golang,” argued The Next Web. COBOL historically hasn’t been the most attractive option for a hip young programmer

    Another commenter complained that “You will most likely spend the rest of your career doing maintenance work rather than any greenfield development. There is nothing wrong with that but not everybody likes the fact they can’t create something new.”

    They’re willing to pay almost anything, he told Reuters, and “You better believe they are nice since they have a problem only you can fix.”

    There are strong reactions to a recent article arguing banks should let COBOL die. “The idea that large corporations are simply going to move on from COBOL is out of touch with reality,” one commenter wrote on Hacker News. “It really can’t be overstated how deeply old COBOL programs are embedded into these corporations. I worked for one that had been using them since the language itself was created, and while they all could see the writing on the wall, the money to do the change simply wasn’t there.”

    Even if companies transitioned to Java, the problem could recur later. “Will a future generation of young programmers want to transition away from Java to a newer language — and companies will have to once again go through another expensive and time-consuming transition.”

    Reply
  10. Tomi Engdahl says:

    So your client’s under-spent on IT for decades and lives in fear of an audit
    Oh-so-trendy infrastructure as code could save your bacon
    https://www.theregister.co.uk/2017/05/12/infrastructure_as_code/

    Infrastructure as code is a buzzword frequently thrown out alongside DevOps and continuous integration as being the modern way of doing things. Proponents cite benefits ranging from an amorphous “agility” to reducing the time to deploy new workloads. I have an argument for infrastructure as code that boils down to “cover your ass”, and have discovered it’s not quite so difficult as we might think.

    Recently, a client of mine went through an ownership change. The new owners, appalled at how much was being spent on IT, decided that the best path forward was an external audit. The client in question, of course, is an SMB that had been massively under-spending on IT for 15 years, and there was no way they were ready for – or would pass – an audit.

    Trying to cram eight months’ worth of migrations, consolidations, R&D, application replacement and so forth into four frantic, sleepless nights of panic ended how you might imagine it ending. The techies focused on making sure their asses were covered when the audit landed. Overall network performance slowed to a crawl and everyone went home angry.

    Reply
  11. Tomi Engdahl says:

    For now, GNU GPL is an enforceable contract, says US federal judge
    The software hippies’ minds are going to be blown over this one
    https://www.theregister.co.uk/2017/05/13/gnu_gpl_enforceable_contract/

    A question mark over whether the GNU GPL – the widely used free-software license – is enforceable as a contract may have been resolved by a US federal judge.

    In a California district court, Judge Jacqueline Scott Corley refused [PDF] to accept what has been an uncomfortable legal precedent for the past decade. She ruled that the GNU General Public License – the GNU GPL – is an enforceable legal contract even though it is not actually signed.

    Loads and loads of free-software projects are covered by the GPL, from the Linux kernel to the GCC toolchain. The license is designed to ensure software code stays free, as in freedom; can be distributed for free, as in free beer; and can be used by anyone anywhere provided they adhere to the license.

    Hancom decided not to pay Artifex for a commercial license. Instead, it opted for the AGPL route, but then failed to obey the license and make freely available the changes it made to Ghostscript while integrating it into its product.

    Seeing as Hancom was effectively going down the closed-source commercial license lane but without paying a dime, Artifex knocked on the Koreans’ door. The Ghostscript developer demanded backdated license fees on the hundreds of millions of dollars Hancom had made from the sale of its – infringing – software. Hancom refused. Artifex sued.

    In its defense, Hancom claims various things, including that it is not based in the US so there cannot be an infringement in the US, but critically it makes two arguments that get to the heart of the enforceability of the GNU GPL:

    1. That since it did not sign anything when it downloaded Artifex’s software there is no contract to be enforced.
    2. That the contract claim is preempted by federal copyright law.

    At the end of April, Judge Corley denied Hancom’s motion to dismiss the case based on the claim that because the South Korean company didn’t actually sign something, no “mutual assent” had been demonstrated.

    “Not so,” she decided. “The GNU GPL provides that the Ghostscript user agrees to its terms if the user does not obtain a commercial license … these allegations sufficiently plead the existence of a contract.” She references a MedioStream v Microsoft case where Microsoft successfully sued for breach of a shrink-wrap license.

    As to the claim that the GNU GPL was preempted by federal copyright law, the judge took issue with the Jacobsen case acting as a precedent and highlighted an argument that, she notes, was “apparently not made in Jacobsen.”

    Reply
  12. Tomi Engdahl says:

    Linus Torvalds stops personally signing Linux kernel RC tarballs
    But Linux 4.12 rc1 made it out before Mother’s Day anyway, thanks to new kernel.org plan
    https://www.theregister.co.uk/2017/05/14/linux_4_12_rc1_released/

    Torvalds also points out that “I haven’t uploaded diffs or tar-balls for this rc.” That matters a bit because he adds: “Those should now be automagically generated by kernel.org for the rc’s, but that also means that they won’t be signed by my key. If you really care about signing, get the git repo and check the tag.”

    A small change, but one to be aware of for the security-conscious.

    Reply
  13. Tomi Engdahl says:

    Nvidia Rolls Volta GPU For ‘AI Revolution’
    Volta taps 12nm TSMC and Samsung HBM2
    http://www.eetimes.com/document.asp?doc_id=1331729

    Nvidia’s graphics processors hold a strong position in training neural nets for machine learning. “Every single cloud company in the world has Nvidia GPUs provisioned for a cloud service,” said founder and CEO, Jensen Huang.

    But it’s a hotly competitive field. More than a half-dozen startups are working on new architectures, two of them acquired last year by Nvidia’s largest rival, Intel. The x86 giant also bought established FPGA maker Altera, whose chips are used as accelerators in the data centers of Baidu and Microsoft.

    Rival AMD also is accelerating its rollout of new GPUs with its Vega chip due soon. However, AMD has only recently added a strong focus on machine learning to its pursuit of the game market.

    Reply
  14. Tomi Engdahl says:

    Cray Rolls Clustered Supercomputers for AI
    AI Targeted by Supercomputers
    http://www.eetimes.com/document.asp?doc_id=1331738&

    Cray Inc.’s new CS-Storm accelerated cluster supercomputers — the Cray CS-Storm 500GT and the Cray CS-Storm 500NX — boost their artificial intelligence (AI) capabilities with massive arrays of Nvidia Tesla graphics processing unit (GPU) accelerators for super-deep machine learning.

    Reply
  15. Tomi Engdahl says:

    Developer Creates An Experimental Perl 5 To Java Compiler
    https://developers.slashdot.org/story/17/05/14/0617225/developer-creates-an-experimental-perl-5-to-java-compiler

    Saturday night saw the announcement of an experimental Perl 5 to Java compiler. “This is the first release,” posted developer Flávio S. Glock — after 100 weeks of development. “Note that you don’t need to compile a Java file. Perlito5 now compiles the Perl code to JVM bytecode in memory and executes it.”

    Perl5 to Java compiler – first release
    http://blogs.perl.org/users/flavio_s_glock/2017/05/perl5-to-java-compiler—first-release.html

    Reply
  16. Tomi Engdahl says:

    Open Source SQL Database CockroachDB Hits 1.0
    https://news.slashdot.org/story/17/05/13/0045234/open-source-sql-database-cockroachdb-hits-10

    CockroachDB, an open source, fault-tolerant SQL database with horizontal scaling and strong consistency across nodes — and a name few people will likely forget — is now officially available. Cockroach Labs, the company behind its development, touts CockroachDB as a “cloud native” database solution — a system engineered to run as a distributed resource.

    Open source SQL database CockroachDB hits 1.0
    http://www.infoworld.com/article/3195773/database/open-source-sql-database-cockroachdb-hits-10.html

    Cockroach Labs is also offering its enterprise features as open source and trusting enterprises will pay for what they use in production

    Reply
  17. Tomi Engdahl says:

    IT sector growth outpaced other sectors – a shortage of experts slows the pace

    Finland’s IT sector grew more strongly than any other sector in 2014–2015.

    The total turnover of companies in the industry has already risen to more than 10 billion euros, report Balance Consulting and Valor in their latest industry overview.

    The sector is riding the wave of digitalisation: its share of corporate budgets grew, with spending focused specifically on the development of digital solutions. Correspondingly, IT budgets have left less money for other IT purchases. For example, the amount of money spent on servers and basic information technology has fallen sharply.

    The wave of digitalisation especially supports the growth of IT consultancies and software companies. However, a shortage of specialists is braking the rate of growth, estimates partner Antti Halonen of the consulting company Valor: “At the moment, the biggest problem for IT companies is not demand, but finding talent.”

    According to Halonen, the labor shortage is deepened by the fact that companies in other industries are also interested in digital professionals. There is also a risk of wage inflation: experienced experts know the labor market situation and plan their wage demands accordingly.

    Many IT companies are considering solving the expert shortage through acquisitions, since buying a startup with a good, ready-made team can be easier than hiring a larger group of experts individually. Manufacturing companies are also now acquiring software firms. Last year, for example, Wärtsilä acquired Eniram and Etteplan acquired Espotel.

    Strategic choices are also driving up the number of acquisitions: almost all listed Finnish IT companies name acquisitions as one of their key strategies. Venture capital investors are also pursuing IT companies.

    Source: http://www.tivi.fi/Kaikki_uutiset/it-alan-kasvu-ohitti-muut-alat-osaajapula-jarruttaa-tahtia-6649362

    Reply
  18. Tomi Engdahl says:

    John Mannes / TechCrunch:
    Partnership on AI adds Intel, Salesforce, SAP, others as it formalizes Grand Challenges, which provide incentives to tackle big issues with social ramifications

    The Partnership on AI adds Intel, Salesforce and others as it formalizes Grand Challenges and work groups
    https://techcrunch.com/2017/05/16/the-partnership-on-ai-adds-intel-salesforce-and-others-as-it-formalizes-grand-challenges-and-work-groups/

    Intel, Salesforce, eBay, Sony, SAP, McKinsey & Company, Zalando and Cogitai are joining the Partnership on AI, a collection of companies and non-profits that have committed to sharing best practices and communicating openly about the benefits and risks of artificial intelligence research. The new members will be working alongside existing partners that include Facebook, Amazon, Google, IBM, Microsoft and Apple.

    Collectively, the partners will be hosting a series of AI Grand Challenges to incentivize researchers to contribute to key roadblocks in the field and to address some of the social and societal ramifications of artificial intelligence research. The group is also announcing a best paper award for the greatest contribution to “AI, People, and Society,” to aid in addressing a similar goal.

    Reply
  19. Tomi Engdahl says:

    Gordon Mah Ung / PCWorld:
    AMD unveils Threadripper, a 16-core, 32-thread CPU for desktops, coming summer 2017 — It’s called Threadripper, and it will be the first consumer-focused chip with 16 cores and 32 threads of computing power. Yes, you read that right: 16 cores and 32 threads of computing power …

    It’s official: AMD’s Threadripper will bring a 16-core, 32-thread monster to the desktop
    Threadripper will ship this summer on consumer PCs, and Intel should be worried.
    http://www.pcworld.com/article/3197147/components-processors/its-official-amds-threadripper-will-bring-a-16-core-32-thread-monster-to-the-desktop.html

    It’s called Threadripper, and it will be the first consumer-focused chip with 16 cores and 32 threads of computing power. Yes, you read that right: 16 cores and 32 threads of computing power, confirming that the CPU wars will rage on at the high end.

    AMD’s Jim Anderson announced the high-end desktop chip during the company’s financial analyst day Tuesday. Details were sparse, but the CPU has been making the rounds in the rumor mills for the last few weeks, though the company had been expected to make the announcement at the upcoming Computex show.

    Reply
  20. Tomi Engdahl says:

    AMD Unveils ‘EPYC’ Server CPUs, Ryzen Mobile, Threadripper CPU and Radeon Vega Frontier Edition GPU
    https://hardware.slashdot.org/story/17/05/17/0245243/amd-unveils-epyc-server-cpus-ryzen-mobile-threadripper-cpu-and-radeon-vega-frontier-edition-gpu?utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+Slashdot%2Fslashdot%2Fto+%28%28Title%29Slashdot+%28rdf%29%29

    Today, at its financial analyst day, AMD lifted the veil on a number of new products based on the company’s Zen CPU architecture and next generation Vega GPU architecture. AMD CEO Lisa Su lifted a very large server chip in the air that the company now has branded EPYC. AMD is going for the jugular when it comes to comparisons with Intel’s Xeon family, providing up to 128 PCI Express 3.0 lanes, which Su says “allows you to connect more GPUs directly to the CPU than any other solution in the industry.”

    Reply
  21. Tomi Engdahl says:

    Tencent is creating an entire town dedicated to esports
    http://mashable.com/2017/05/16/esports-town-tencent/

    Tencent, China’s largest online games developer among other things, is building an entire town dedicated to esports.

    The town will be located in Wuhu, east China, where Tencent has just signed a framework agreement with the local government.

    The planned esports town will have an esports theme park, esports university, cultural and creative park, animation industrial park, a creative neighbourhood, a Tencent technology entrepreneurship community, and even a Tencent cloud data center.

    Reply
  22. Tomi Engdahl says:

    Job cuts at Indian outsourcing firms aren’t Donald Trump’s fault. Read why
    http://economictimes.indiatimes.com/tech/ites/job-cuts-at-indian-outsourcing-firms-arent-donald-trumps-fault-read-why/articleshow/58639601.cms

    India’s outsourcing firms are firing workers. Don’t blame it on President Donald Trump’s hawkish stance on US visas.

    The “end of hyper-globalization” story makes for compelling headlines. But the Indian code-writers’ misfortune has more prosaic roots in technology and customer tastes.

    Global corporations, the paymasters of Indian software vendors, are no longer so keen to ante up for application development and maintenance. The flab-shedding has been in the works for five years now; it’s only getting noticed in the age of Trump as muted hiring gives way to firings.

    The Indian industry is coy about the word “layoffs.”

    All this is only to be expected. As Bloomberg Intelligence analysts have argued, digital products are now the IT services industry’s growth driver. For all the Indian companies’ talk of social, mobile, analytics and cloud, the share of digital at Tata Consultancy Ltd. and Wipro ranges between 18 percent and 22 percent, compared with 42 percent to 45 percent at International Business Machines Corp. and Accenture Plc.

    Last October, when Gadfly wrote about the Indian outsourcing industry’s failure to embrace the future, Trump was yet to win the election. Offshoring firms like Infosys are hoping to counter his subsequent threat to curb H-1B visas by hiring more engineers in the U.S.

    Cognizant, Tata Consultancy and Infosys have all decided to return more cash. Yet pessimism continues to deepen

    Reply
  23. Tomi Engdahl says:

    If only India’s IT companies could focus on real transformation instead of appeasing Trump
    https://qz.com/982277/if-only-indias-it-companies-could-focus-on-real-transformation-instead-of-appeasing-trump/

    On May 02, Infosys, India’s second largest information technology (IT) company, announced that it will hire 10,000 American workers over the next two years. Three days later, on May 05, Nasdaq-listed Cognizant Technology Solutions, which has a large presence in India, also revealed plans to significantly ramp up hiring in the US.

    The companies spun these moves as part of the re-engineering underway at India’s IT sector—a decisive shift away from traditional back-office services to higher-end technology work.

    Infosys CEO Vishal Sikka explained that his company was hoping to “help invent and deliver the digital futures for our clients in the United States.” Cognizant president Rajeev Mehta maintained a similar line. “We are shifting our workforce largely in response to clients’ increasing need for co-innovation,” he said.

    But it’s hard to miss the timing.

    Reply
  24. Tomi Engdahl says:

    Infosys to Hire 10,000 American Workers Over the Next Two Years and Establish Four Technology and Innovation Hubs in the United States
    https://www.infosys.com/newsroom/press-releases/Pages/technology-innovation-hubs-usa.aspx

    Infosys (NYSE: INFY), a global leader in consulting, technology and next-generation services, today announced that it plans to hire 10,000 American workers over the next two years. As part of this initiative, Infosys will open four new Technology and Innovation Hubs across the country focusing on cutting-edge technology areas, including artificial intelligence, machine learning, user experience, emerging digital technologies, cloud, and big data.

    Reply
  25. Tomi Engdahl says:

    Now job losses at IBM India, 5,000 employees under fire
    http://economictimes.indiatimes.com/jobs/now-jobs-losses-at-ibm-india-5000-employees-under-fire/articleshow/58698521.cms

    MUMBAI: IBM may let go of at least 5,000 employees over the next few quarters, people familiar with the development told ET NOW, making it the latest in the job-cuts cycle that has plagued the IT sector in the past few weeks.

    IBM’s layoffs come close on the heels of similar moves at Infosys, Wipro and Cognizant, in what is turning out to be a difficult year for the IT sector.

    Reply
  26. Tomi Engdahl says:

    Zack Kanter / TechCrunch:
    How Amazon is eliminating internal inefficiencies and avoiding technological stagnation by exposing its internal operations to external competition — I co-founded a software startup in December. Each month, I send out an update to our investors to keep them updated on our progress.

    Why Amazon is eating the world
    https://techcrunch.com/2017/05/14/why-amazon-is-eating-the-world/

    Reply
  27. Tomi Engdahl says:

    Windows 10: Triumphs and tragedies from Microsoft Build
    Redmond’s OS needs to be cool for consumers, but its best chances are with business
    https://www.theregister.co.uk/2017/05/18/windows_10_microsoft_build/

    Microsoft presented its latest Windows 10 strategy to developers at its Build event in Seattle last week.

    Microsoft states that Windows 10 is now installed on more than 500 million devices, halfway towards the goal of 1 billion by 2018 that it set itself at Build 2015.

    In July 2016 the company acknowledged that its target was not realistic, telling the press that “due to the focusing of our phone hardware business, it will take longer than FY18 for us to reach our goal”.

    “Focusing” in this context meant killing Windows Phone, other than for a business niche the company (and partners like HP) thinks may exist for multi-purpose devices that work like a PC when docked.

    Build saw the announcement of a new Windows 10 release, the Fall Creators Update, along with key new features.

    Reply
  28. Tomi Engdahl says:

    ‘WannaCry Makes an Easy Case For Linux’
    https://linux.slashdot.org/story/17/05/18/1757205/wannacry-makes-an-easy-case-for-linux

    The thing is, WannaCry isn’t the first of its kind. In fact, ransomware has been exploiting Windows vulnerabilities for a while. The first known ransomware attack was the “AIDS Trojan,” which infected DOS machines back in 1989.

    The important question here is this: Have there been any ransomware attacks on the Linux desktop? The answer is no. With that in mind, it’s pretty easy to draw the conclusion that now would be a great time to start deploying Linux on the desktop. I can already hear the tired arguments. The primary issue: software. I will counter that argument by saying this: Most software has migrated to either Software as a Service (SaaS) or the cloud. The majority of work people do is via a web browser. Chrome, Firefox, Edge, Safari; with few exceptions, SaaS doesn’t care.

    WannaCrypt makes an easy case for Linux
    http://www.techrepublic.com/article/wannacrypt-makes-an-easy-case-for-linux/

    Ransomware got you down? There’s a solution that could save you from dealing with this issue ever again. That’s right. It’s Linux.

    Reply
  29. Tomi Engdahl says:

    Deep Algo offers simple code visualization for people who don’t know how to code
    https://techcrunch.com/2017/05/19/deep-algo-offers-simple-code-visualization-for-people-who-dont-know-how-to-code/?ncid=rss&utm_source=tcfbpage&utm_medium=feed&utm_campaign=Feed%3A+Techcrunch+%28TechCrunch%29&utm_content=FaceBook&sr_share=facebook

    “We make non-IT people understand code,” CEO Xavier Lagarrigue told TechCrunch at the event.

    After all, there is a lot that can go wrong when attempting to get a non-coder to build in code. Remember when everyone started building Geocities websites? It was all MIDI songs and ‘under construction’ GIFs.

    Reply
  30. Tomi Engdahl says:

    AI in 5 years: One CTO’s take
    https://enterprisersproject.com/article/2017/4/ai-5-years-one-ctos-take?sc_cid=7016000000127ECAAY

    How will artificial intelligence change the way we do business five years from now? In part two of our conversation with Seal Software CTO Kevin Gidney, he explains how AI is revolutionizing industries and changing ideas about what jobs will require humans.

    Reply
  31. Tomi Engdahl says:

    Liam Tung / ZDNet:
    ‘Safe and predictable’ Windows 10 S won’t run Linux, says Microsoft

    ‘Safe and predictable’ Windows 10 S won’t run Linux, says Microsoft
    http://www.zdnet.com/article/safe-and-predictable-windows-10-s-wont-run-linux-says-microsoft/?ftag=COS-05-10aaa0g&utm_campaign=trueAnthem:+Trending+Content

    Microsoft wants to clear up confusion about Windows 10 S and Linux distributions available on the Windows Store.

    Just because Linux distributions are coming to the Windows Store, it doesn’t mean they will work on laptops running Microsoft’s streamlined Windows 10 S.

    Microsoft wants to clear up any confusion over two recent announcements. At the beginning of May it unveiled Windows 10 S, a fast-booting, locked-down version of Windows 10 that can only install apps from the Windows Store and is restricted to Microsoft’s Edge browser.

    Windows 10 S ships with Microsoft’s $1,000 Surface Laptop, as well as with forthcoming third-party Windows laptops that will be priced from $189 to take on the Chromebook market.

    Reply
  32. Tomi Engdahl says:

    IBM tells thousands of remote employees to come back to office or find new jobs
    While selling benefits of “telework” to others, IBM forces relocation in stealth layoff.
    https://arstechnica.com/information-technology/2017/05/ibm-to-remote-workers-come-back-to-the-mothership-or-else/

    IBM, one of the earliest companies to embrace the concept of employees working en masse from home or small satellite offices, has informed thousands of employees that it’s time to return to the mothership—or find a new job. As The Wall Street Journal reports, this week is the deadline for remote employees—who make up as much as 40 percent of IBM’s workforce—to decide whether to move or leave.

    IBM once heralded the savings and productivity gains it won from its “Mobility Initiative.” The company has also made untold millions over the past two decades selling software and consulting services, such as its Sametime instant messaging and voice products, to companies looking to support far-flung workforces.

    Reply
  33. Tomi Engdahl says:

    Why The US Government Open Sources Its Code
    https://news.slashdot.org/story/17/05/21/2242244/why-the-us-government-open-sources-its-code?utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+Slashdot%2Fslashdot%2Fto+%28%28Title%29Slashdot+%28rdf%29%29

    The Federal Source Code Policy, released in August 2016, was the first U.S. government policy to support open source across the government… All new custom source code developed by or for the federal government must be available to all other federal agencies for sharing and reuse; and at least 20% of new government custom-developed code must be released to the public as open source. It also established Code.gov as a platform for access to government-developed open source code and a way for other developers to participate.

    Before this policy was released, agencies were spending a lot of money to redevelop software already in use by other government agencies. This initiative is expected to save the government millions of dollars in wasteful and duplicative spending on software development.

    Sharing America’s code
    https://opensource.com/article/17/5/sharing-americas-code

    White House senior advisor Alvand Salehi explains at OSCON why the federal government’s support of open source is “here to stay.”

    Reply
  34. Tomi Engdahl says:

    GitHub throws open doors on ‘app’ souk
    Online store will try to make developer tools work better for businesses
    https://www.theregister.co.uk/2017/05/22/github_opens_app_market/

    Social code repository GitHub is making it easier for developers to fetch data and spend money.

    In conjunction with the company’s Satellite 2017 conference in London on Monday, GitHub is rolling out GitHub Marketplace, a store for development tools that assist with tasks like continuous integration, code reviews, and project management.

    Kyle Daigle, senior engineering manager for GitHub’s API group, in a phone interview with The Register, said that some 60 per cent of GitHub’s business customers use an integration of some sort. An example might be crash reporting and error logging software Sentry.

    GitHub Marketplace
    Tools to build on and improve your workflow
    https://github.com/marketplace

    Reply
  35. Tomi Engdahl says:

    The eternal battle for OpenStack’s soul will conclude in three years. Again
    Beyond cars, toasters and American Gods
    https://www.theregister.co.uk/2017/05/22/window_closing_on_openstacks_future/

    Reply
  36. Tomi Engdahl says:

    Parallel programming masterclass with compsci maven online
    Dr Panda’s recent Swiss presentation free to view
    https://www.theregister.co.uk/2017/05/22/parallel_programming_101_201_and_1001/

    Dr DK Panda is a world-recognised expert on parallel programming and networking. He’s a Distinguished Scholar at The Ohio State University and his research group has developed the MVAPICH2 (high-performance MPI and MPI+PGAS) libraries for InfiniBand, iWARP, and RoCE with support for GPUs, Xeon Phi, and virtualization.

    High-Performance and Scalable Designs of Programming Models for Exascale Systems
    https://www.youtube.com/watch?v=rep3eZN9znM
    https://www.slideshare.net/insideHPC/highperformance-and-scalable-designs-of-programming-models-for-exascale-systems

    Reply
  37. Tomi Engdahl says:

    Timothy B. Lee / Vox:
    Google, Microsoft, Amazon, and others bolster cutting-edge AI tools for third-party developers, setting up the next tech platform war

    Artificial intelligence is getting more powerful, and it’s about to be everywhere
    https://www.vox.com/new-money/2017/5/18/15655274/google-io-ai-everywhere

    There wasn’t any one big product announcement at the Google I/O keynote on Wednesday, the annual event where thousands of programmers meet to learn about Google’s software platforms. Instead, it was a steady trickle of incremental improvements across Google’s product portfolio. And almost all of the improvements were driven by breakthroughs in artificial intelligence — the software’s growing ability to understand complex nuances of the world around it.

    Companies have been hyping artificial intelligence for so long — and often delivering such mediocre results — that it’s easy to tune it out. AI is also easy to underestimate because it’s often used to add value to existing products rather than creating new ones.

    But even if you’ve dismissed AI technology in the past, there are two big reasons to start taking it seriously. First, the software really is getting better at a remarkable pace. Problems that artificial intelligence researchers struggled with for decades are suddenly getting solved

    “Our software is going to get superpowers” thanks to AI, says Frank Chen, a partner at the venture capital firm Andreessen Horowitz. Computer programs will be able to do things that “we thought were human-only activities: recognizing what’s in a picture, telling when someone’s going to get mad, summarizing documents.”

    Reply
  38. Tomi Engdahl says:

    SSD price premium over disk faaaalling
    Updated IDC SSD forecast sees 44 per cent capacity CAGR 2016-2021
    https://www.theregister.co.uk/2017/05/22/ssd_price_premium_over_disk_falling/

    The general solid state drive price premium over disk should decline from 6.6x now to 2.2x in 2021, according to new IDC numbers.

    IDC updated its SSD forecast last week and Stifel analyst and MD Aaron Rakers has had a look at the updated forecast. It says SSD shipments will increase from around 157.5 million units shipped in 2016 to 312.9 million in 2021; a 15 per cent CAGR (compound annual growth rate).

    SSD capacity shipped will grow much faster, with a 44 per cent CAGR over the period. It is expected that SSD capacity shipped will grow from 7 per cent of total HDD + SSD capacity shipped in 2016 to about 19 per cent in 2021.
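    The growth rates quoted above are easy to sanity-check. A minimal sketch (the unit and price-premium figures are taken from the IDC/Stifel numbers above; the helper name is my own):

    ```python
    def cagr(start, end, years):
        """Compound annual growth rate between a start and an end value."""
        return (end / start) ** (1 / years) - 1

    # IDC unit forecast: 157.5 million SSDs shipped in 2016 -> 312.9 million in 2021
    print(f"Unit CAGR 2016-2021: {cagr(157.5e6, 312.9e6, 5):.1%}")  # ~15%, as quoted

    # Price premium over disk falling from 6.6x to 2.2x over the same 5 years,
    # assuming a steady annual rate of decline
    print(f"Annual premium change: {cagr(6.6, 2.2, 5):.1%}")
    ```

    The first figure lands at roughly 15 per cent, matching the quoted CAGR; the premium decline works out to about 20 per cent per year under the steady-decline assumption.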

    Reply
  39. Tomi Engdahl says:

    AWS signs Java ‘father’ James Gosling
    https://venturebeat.com/2017/05/22/aws-signs-java-father-james-gosling/

    Amazon Web Services has added another computer science heavyweight to its employee roster. James Gosling, often referred to as the “Father of Java,” announced on Facebook Monday that he would be joining the cloud provider as a distinguished engineer.

    Gosling came up with the original design of Java and implemented its first compiler and virtual machine as part of his work at Sun Microsystems. He left Sun in 2010 after the company was acquired by Oracle, spent a short time at Google, and most recently worked at Liquid Robotics designing software for an underwater robot.

    Reply
  40. Tomi Engdahl says:

    AMD Goes Epyc for Data Centers
    http://www.eetimes.com/author.asp?section_id=36&doc_id=1331764&

    So far AMD is making good progress in its return to profitability, but it’s too early to raise the “mission accomplished” banner.

    The key for AMD to return to more robust profitability is to move into more profitable segments of the industries that it already is in and reenter the server market. To do so, it has to move up the performance and price stack, as it did with the Ryzen processor for high performance PCs.

    The one area where AMD has the most growth opportunity is in the $21 billion server market. Around 2005, AMD had more than 20 percent market share of the x86 server business with its Opteron processors. The company now readily admits it has nearly zero presence in the data center market.

    AMD hopes to change all that with a new Zen-based server processor code-named Naples. At its analyst event this week it gave these chips the Epyc brand.

    Reply
  41. Tomi Engdahl says:

    IEEE says zero hot air in Fujitsu liquid immersion cooling for data centers
    http://www.cablinginstall.com/articles/pt/2017/05/ieee-says-zero-hot-air-in-fujitsu-liquid-immersion-cooling-for-data-centers.html?cmpid=enl_cim_cimdatacenternewsletter_2017-05-23

    Given the prodigious heat generated by the trillions of transistors switching on and off 24 hours a day in data centers, air conditioning has become a major operating expense. Consequently, engineers have come up with several imaginative ways to ameliorate such costs, which can amount to a third or more of data center operations.
    One favored method is to set up hot and cold aisles of moving air through a center to achieve maximum cooling efficiency. Meanwhile, Facebook has chosen to set up a data center in Lulea, northern Sweden on the fringe of the Arctic Circle to take advantage of the natural cold conditions there; and Microsoft engineers have seriously proposed putting server farms under water.

    Fujitsu, on the other hand, is preparing to launch a less exotic solution: a liquid immersion cooling system it says will usher in a “next generation of ultra-dense data centers.”

    Fujitsu Liquid Immersion Not All Hot Air When It Comes to Cooling Data Centers
    http://spectrum.ieee.org/tech-talk/computing/hardware/fujitsu-liquid-immersion-not-all-hot-air-when-it-comes-to-cooling-data-centers

    Reply
  42. Tomi Engdahl says:

    Brian Barrett / Wired:
    Intel to offer Thunderbolt 3 protocol specification to chipmakers royalty free next year and integrate Thunderbolt into its CPUs — Nearly two years ago, Intel gave a major boost to Thunderbolt, its zippy hardware interface, by embracing USB-C, the do-it-all port that will eventually eat the world.

    Intel’s Plan to Thunderbolt 3 All of the Things
    https://www.wired.com/2017/05/intels-plan-thunderbolt-3-things/

    Nearly two years ago, Intel gave a major boost to Thunderbolt, its zippy hardware interface, by embracing USB-C, the do-it-all port that will eventually eat the world. Now, the company’s attempting another kickstart, this time focusing on making Thunderbolt available to anyone who wants it.

    That may be more people than you’d think. Thunderbolt 3 comes equipped with transfer speeds of 40Gbps, which roughly works out to a 4K movie in 30 seconds. It can power devices, and connect to 4K peripherals. But Thunderbolt has had six years to go mainstream, and a combination of high cost and low availability has hampered its success.

    USB-C helped with that some: 180 Intel Core PCs now offer Thunderbolt 3, with another 30 or so expected by the end of the year. They’re accompanied by over 60 peripherals. Now, though, Intel is taking two steps to push that adoption even further: integrating Thunderbolt 3 into Intel CPUs, and then making the Thunderbolt protocol specification available to third-party chipmakers, royalty-free, next year.
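    The “4K movie in 30 seconds” figure can be checked with simple arithmetic: at 40 Gbps a link moves 5 GB/s, so 30 seconds corresponds to a file of about 150 GB. A minimal sketch assuming ideal line rate with no protocol overhead (the function name is my own):

    ```python
    def transfer_seconds(size_gb, link_gbps):
        """Ideal transfer time for size_gb gigabytes over a link_gbps gigabit/s link."""
        return size_gb * 8 / link_gbps  # 8 bits per byte, no protocol overhead

    print(transfer_seconds(150, 40))   # 30.0 seconds for a ~150 GB 4K movie
    print(transfer_seconds(150, 10))   # 120.0 seconds at USB 3.1 Gen 2 (10 Gbps) speeds
    ```

    Real-world throughput will be lower once protocol and filesystem overhead are accounted for, but the order of magnitude matches Intel’s claim.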

    Reply
  43. Tomi Engdahl says:

    Natalie Gagliordi / ZDNet:
    Google, IBM, and Lyft debut Istio, an open source tool for managing microservices from different vendors on cloud platforms — Istio gives developers a vendor-neutral way to connect, secure, manage, and monitor networks of different microservices on cloud platforms.

    Google, IBM, and Lyft launch open source project Istio
    http://www.zdnet.com/article/google-ibm-and-lyft-launch-open-source-project-istio/

    Istio gives developers a vendor-neutral way to connect, secure, manage, and monitor networks of different microservices on cloud platforms.

    Reply
  44. Tomi Engdahl says:

    Stephanie Condon / ZDNet:
    LinkedIn, HPE, and others launch Open19 Foundation, a nonprofit to support open and customizable datacenter designs

    LinkedIn, HPE launch nonprofit to support open datacenter designs
    http://www.zdnet.com/article/linkedin-hpe-launch-nonprofit-to-support-open-datacenter-designs/

    The Open19 Foundation, whose founding members also include GE Digital, Vapor IO and Flex, will support open hardware and open software projects.

    A group of technology companies including LinkedIn and HPE announced Tuesday that they’re launching the Open19 Foundation, a nonprofit aimed at supporting datacenter designs that are open, economical and customizable.

    Other founding members include Flex, GE Digital and Vapor IO.

    The foundation sounds similar to the Facebook-launched Open Compute Project. However, OCP shares datacenter designs that largely benefit major internet companies. The Open19 Foundation, by contrast, will focus on optimizing datacenters of any size, with datacenter models that can fit any location or facility footprint.

    One of the foundation’s first contributions will be the Open19 Platform industry specification, built and incubated by LinkedIn. It defines a cross-industry common server form factor using racks, cages and pre-defined network and power.

    “The Open19 Platform represents a simple solution for a complex problem,” Yuval Bachar, president of the Open19 Foundation and a principal engineer at LinkedIn, said in a statement

    Reply
  45. Tomi Engdahl says:

    Frederic Lardinois / TechCrunch:
    Microsoft now uses Git to develop Windows, along with its open source Git Virtual File System to help manage the ~300GB repository

    Microsoft now uses Git and GVFS to develop Windows
    https://techcrunch.com/2017/05/24/microsoft-now-uses-git-and-gvfs-to-develop-windows/

    Microsoft today announced that virtually all of its engineers now use the Git version control system to develop its Windows operating system. The Windows Git repository includes about 3.5 million files that weigh in at about 300GB when you check them into Git. Git, however, wasn’t built for a project of this size, so Microsoft developed the Git Virtual File System to be able to get the benefits of using Git without having to wait hours for even the simplest of Git commands to run.

    The code for the Git Virtual File system is now available under the MIT license on GitHub and open for community contributions.

    The move to Git took about three months. Before this, Microsoft used Source Depot to manage the Windows code, though other groups with smaller code bases also still use Team Foundation Server.

    Over the course of the last three months, Microsoft moved some Windows developers over to the new Git repository to test the system. Then, in March, it rolled out Git for all 2,000 engineers that work on the Windows OneCore team. Today, about 3,500 of the roughly 4,000 engineers on the Windows team are on Git.
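    The virtualization idea is easier to appreciate with a concrete command sequence. GVFS itself needs a compatible server, but mainline Git later grew a related mechanism, sparse checkout, that likewise avoids materializing the whole tree in the working directory. A minimal local sketch (the toy repository and paths are made up for illustration; requires Git 2.25 or newer):

    ```shell
    # Build a toy stand-in for a large repository
    git init --quiet big-repo
    mkdir -p big-repo/src/kernel big-repo/docs
    echo 'int main(void){return 0;}' > big-repo/src/kernel/main.c
    echo 'manual' > big-repo/docs/readme.txt
    git -C big-repo add -A
    git -C big-repo -c user.name=demo -c user.email=demo@example.com \
        commit --quiet -m 'initial import'

    # Clone it, then restrict the working tree to the paths you actually touch;
    # everything else stays out of the checkout until you ask for it
    git clone --quiet big-repo slice
    git -C slice sparse-checkout init --cone
    git -C slice sparse-checkout set src/kernel

    ls slice    # src is materialized, docs is not
    ```

    This only narrows the checkout; GVFS went further by also fetching object contents lazily over the network, which is what made hour-long Git commands on the 300GB Windows repository tractable.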

  46. Tomi Engdahl says:

    Google in, Google out
    https://techcrunch.com/2017/05/21/google-in-google-out/

    Call it the Triumph of the Stacks. I attended Google I/O this week, and saw a lot of cool things: but what really hit home for me, at the keynote and the demos and the developer sessions, was just how dominant Google has become, in so many different domains … and, especially, how its only real competition comes from the four other tech behemoths who dominate our industry’s landscape.

    Typically, Bruce Sterling saw it coming first, five full years ago:

    “The Stacks” […] Google, Apple, Facebook, Amazon and Microsoft. These big five American vertically organized silos are re-making the world in their image.

    Today the five companies he cited are the five most valuable publicly traded companies on the planet, and practically a software oligopoly. They all make hardware, too, but of course there are many more important hardware companies: Intel, AMD, ARM, Qualcomm, Nvidia, Samsung, Tesla, Taiwan Semiconductor, Hon Hai, et al. When you talk about software, though — you know, the stuff that’s eating the world — then you’re almost certainly, if indirectly, ultimately talking about the Stacks.

    Google has so many tentacles in so many pies that it was a bit mind-numbing to see them all sardined into a single event. (takes a deep breath) Polymer and Angular and AMP and Dart and Flutter and WebAssembly and Instant Apps and Analytics and Fabric. Compute Engine and App Engine and Firebase. Tango and Daydream. TensorFlow and Mobile Vision and Cloud TPUs. Android Phone, Android Wear, Android TV, Android Auto, Android Things, Android Pay. Google Home and Google Assistant and Google Play. And that’s without mentioning Maps, YouTube, Gmail, and Chrome — each of which has a billion-plus users — much less Alphabet’s “Other Bets.”

    That list also deliberately elides the Giant Google Fountain Of Money, i.e. the search and advertising engine that still provides 90% of Google’s income. There always seems to be a certain awkward disjoint between that colossal money machine and everything else Google does. “We use deep computer science to solve hard technical problems at scale,” Pichai said at the keynote, with relish.

  47. Tomi Engdahl says:

    The Competitive Advantage of NVMe SSDs in the Data Center
    http://www.eetimes.com/author.asp?section_id=36&doc_id=1331774&


    We are in the midst of a remarkable electronics era, where consumers are seeking immediate delivery of services, including data access. Companies that can rapidly distill important information from the vast ocean of data and provide a useful service to consumers are thriving.

    Services such as these have been made possible by major advances in computing power. Data centers are filled with computing systems that analyze lots of data rapidly. However, there is a tradeoff between how much data you want to process versus how quickly you want to process it.

    DRAM is the fastest form of memory, but it loses its contents without power, which is why it is called volatile memory. Hard disk drives (HDDs) provide lots of capacity, but are much slower. The real improvements have been with solid state drives (SSDs), which are based on NAND flash memory. The industry has made tremendous gains in performance, capacity, and cost-effectiveness in recent years by moving to V-NAND[1] as well as shifting to the NVMe (non-volatile memory express) protocol that uses the PCIe interface.

    An SSD using a PCIe x4 interface (featuring NVMe) can connect to the host computer at speeds up to 32Gbps, which is about 5x faster than the SATA interface commonly used for storage drives. The NVMe protocol, being optimized for flash memory, also reduces latency by over 3x compared to the SATA protocol.
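    The “about 5x” figure follows directly from the raw link rates and can be sanity-checked in a few lines (the numbers below are the nominal PCIe 3.0 and SATA III rates; usable throughput after line encoding is somewhat lower on both sides):

    ```python
    # Nominal (raw) link rates, in Gbps
    pcie3_per_lane = 8.0            # PCIe 3.0: 8 GT/s per lane
    pcie3_x4 = 4 * pcie3_per_lane   # x4 link -> 32 Gbps, the "up to 32Gbps" above
    sata3 = 6.0                     # SATA III: 6 Gbps

    ratio = pcie3_x4 / sata3
    print(f"PCIe 3.0 x4: {pcie3_x4:.0f} Gbps, SATA III: {sata3:.0f} Gbps, "
          f"ratio: {ratio:.1f}x")  # ~5.3x, i.e. "about 5x faster"
    ```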

    To illustrate and monetize the value of low latency, Amazon has determined that every 100ms of latency costs it one percent in sales. Google has determined that an extra 500ms in search page generation time dropped traffic by 20 percent. Another study showed that a broker could lose $4M per millisecond if his or her electronic trading platform is 5ms behind the
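    To make the quoted rules of thumb concrete, here is the arithmetic applied to a hypothetical business (the revenue and traffic figures are assumptions for illustration, not numbers from the studies):

    ```python
    # Amazon's rule of thumb quoted above: every 100 ms of latency costs 1% of sales
    annual_revenue = 50e9                  # hypothetical retailer: $50B/year (assumption)
    loss_per_100ms = 0.01 * annual_revenue
    print(f"extra 100 ms of latency: ${loss_per_100ms / 1e9:.1f}B/year in lost sales")

    # Google's rule of thumb: +500 ms page generation -> -20% traffic
    daily_searches = 1_000_000             # hypothetical traffic level (assumption)
    print(f"+500 ms: {int(daily_searches * 0.20):,} fewer daily searches")
    ```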

  48. Tomi Engdahl says:

    Big Data Reshapes Silicon
    The view of Compute 2.0 from Bristol
    http://www.eetimes.com/document.asp?doc_id=1331781

    The huge data sets collected by web giants such as Amazon, Google, and Facebook are fueling a renaissance of new chips to process them. Two of the latest efforts will be described at an annual conference on computer architecture in late June.

    Stanford researchers will describe Plasticine, a reconfigurable processor that sports nearly 100x better performance/watt than an FPGA while being easier to program. Separately, two veteran designers at Nvidia were part of a team that defined an inference processor that delivers more than twice the performance and energy efficiency of existing devices.

    The chips represent the tip of an iceberg of work. Intel acquired three machine-learning startups in the past year. Rival Samsung, along with Dell EMC, invested in Graphcore (Bristol, U.K.), one of a half-dozen independent startups in the area.

    Meanwhile, Nvidia is racking up rising sales for its GPUs as neural network training engines. Simultaneously, it is morphing its architecture to better handle such jobs.

    Google claims that neither its massive clusters of x86 CPUs nor Nvidia’s GPUs are adequate. So it has rolled out two versions of its own accelerator, the TPU.

    “This is Compute 2.0; it is absolutely a new world of computing,” said Nigel Toon, chief executive of Graphcore. “Google eventually will use racks and racks of TPUs and almost no CPUs because 98 percent of its revenues come from search,” a good application for machine learning.
