Computer technology trends for 2016

It seems that the PC market is stabilizing in 2016; I expect it to shrink only slightly. While mobile devices have been named as culprits for the fall in PC shipments, IDC said that other factors may be in play. It is still pretty hard to make any decent profit building PC hardware unless you are one of the biggest players – so again Lenovo, HP, and Dell are increasing their collective dominance of the PC market like they did in 2015. I expect changes like spin-offs and maybe some mergers with smaller players like Fujitsu, Toshiba and Sony. The EMEA server market looks to be a two-horse race between Hewlett Packard Enterprise and Dell, according to Gartner. HPE, Dell and Cisco “all benefited” from Lenovo’s acquisition of IBM’s EMEA x86 server organisation.

The tablet market is no longer a high-growth market – tablet sales have started to decline, and the decline continues in 2016 as owners hold onto their existing devices for more than three years. iPad sales are set to continue declining, and the iPad Air 3, to be released in the first half of 2016, will not change that. IDC predicts that the detachable tablet market is set for growth in 2016 as more people turn to hybrid devices. Two-in-one tablets have been popularized by offerings like the Microsoft Surface, with options ranging dramatically in price and specs. I am not myself convinced that the growth will be as strong as IDC forecasts, even though companies have started to purchase tablets for workers in jobs such as retail sales or field work (Apple iPads, Windows and Android tablets managed by the company). Combined volume shipments of PCs, tablets and smartphones are expected to increase only in the single digits.

All your consumer tech gear should be cheaper come July, as there will be fewer import tariffs on IT products: a World Trade Organization (WTO) deal agrees that tariffs on imports of consumer electronics will be phased out over 7 years starting in July 2016. The agreement affects around 10 percent of the world trade in information and communications technology products and will eliminate around $50 billion in tariffs annually.

In 2015 storage was rocked to its foundations, and those new innovations will be taken into wider use in 2016. The storage market in 2015 went through strategic foundation-shaking turmoil as the external shared disk array storage playbook was torn to shreds: the all-flash data centre idea has definitely taken off as an achievable vision in which primary data is stored in flash with the rest held in cheap and deep storage. Flash drives largely solve the disk drive access latency problem, so there is less need for hybrid drives. There is conviction that storage should be located as close to servers as possible (virtual SANs, hyper-converged industry appliances and NVMe fabrics). The hybrid cloud concept was adopted and supported by everybody. Flash started out in 2-bits/cell MLC form, which rapidly became standard, and TLC (3-bits/cell, or triple-level cell) started appearing. Industry-standard NVMe drivers for PCIe flash cards appeared. Intel and Micron blew non-volatile memory preconceptions out of the water in the second half of the year with their joint 3D XPoint memory announcement. Boring old disk tech got shingled magnetic recording (SMR) and helium-filled drive technology; the drive industry is focused on capacity-optimizing its drives. We got key:value store disk drives with an Ethernet NIC on board, and basic GET and PUT object storage facilities came into being. The tape industry developed a 15TB LTO-7 format.

The use of SSDs will increase and their price will drop. SSDs were in more than 25% of new laptops sold in 2015, are expected to be in 31% of new consumer laptops in 2016, and in more than 40% by 2017. The prices of mainstream consumer SSDs have fallen dramatically every year over the past three years while HDD prices have not changed much. SSD prices will decline to 24 cents per gigabyte in 2016. In 2017 they’re expected to drop to 11-17 cents per gigabyte (meaning a 1TB SSD on average would retail for $170 or less).
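
As a quick sanity check on those numbers, a few lines of Python show how per-gigabyte pricing translates into retail prices (a back-of-the-envelope sketch; the cents-per-gigabyte figures are simply the forecasts quoted above):

```python
# Back-of-the-envelope SSD pricing from the forecast per-gigabyte figures.
def retail_price(capacity_gb, cents_per_gb):
    """Approximate retail price in dollars for a drive of the given capacity."""
    return capacity_gb * cents_per_gb / 100

print(retail_price(1000, 24))  # 2016 forecast: 1TB at 24 c/GB -> 240.0 dollars
print(retail_price(1000, 17))  # 2017 high end: 1TB at 17 c/GB -> 170.0 dollars
print(retail_price(1000, 11))  # 2017 low end:  1TB at 11 c/GB -> 110.0 dollars
```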

Hard disk sales will decrease, but the technology is not dead. Sales of hard disk drives have been decreasing for several years now (118 million units in the third quarter of 2015), but according to Seagate, hard disk drives (HDDs) are set to stay relevant for at least 15 to 20 years. HDDs remain the most popular data storage technology as they are the cheapest in terms of per-gigabyte cost. While SSDs are generally getting more affordable, high-capacity solid-state drives are not going to become as inexpensive as hard drives any time soon.

Because all-flash storage systems with homogeneous flash media are still too expensive to serve as a solution for every enterprise application workload, enterprises will increasingly turn to performance-optimized storage solutions that use a combination of multiple media types to deliver cost-effective performance. The speed advantage of Fibre Channel over Ethernet has evaporated. Enterprises are also starting to seek alternatives to snapshots that are simpler and easier to manage, and that allow data and application recovery to a point just before a data error or logical corruption occurred.

Local storage and the cloud finally make peace in 2016 as the decision-makers across the industry have now acknowledged the potential for enterprise storage and the cloud to work in tandem. Over 40 percent of data worldwide is expected to live on or move through the cloud by 2020 according to IDC.

Open standards for data center development are now a reality thanks to advances in cloud technology, with Facebook’s Open Compute Project serving as the industry’s leader in this regard. This allows more consolidation for those that want it. Consolidation used to refer to companies moving all of their infrastructure to the same facility, but some experts have begun to question this strategy as the rapid increase in data quantities and apps in the data center has made centralized facilities more difficult to operate than ever before. Server virtualization, more powerful servers and an increasing number of enterprise applications will continue to drive higher I/O requirements in the data center.

Cloud consolidation starts in earnest in 2016: the number of options for general infrastructure-as-a-service (IaaS) cloud services and cloud management software will be much smaller at the end of 2016 than at the beginning. The major public cloud providers will gain strength, with Amazon, IBM SoftLayer, and Microsoft capturing a greater share of the business cloud services market. Lock-in is a real concern for cloud users, because PaaS players have the age-old imperative to find ways to tie customers to their platforms and aren’t afraid to use them, so advanced users want to establish reliable portability across PaaS products in a multi-vendor, multi-cloud environment.

The year 2016 will be harder for legacy IT providers than 2015 was. In its report, IDC states that “By 2020, More than 30 percent of the IT Vendors Will Not Exist as We Know Them Today.” Many enterprises are turning away from traditional vendors and toward cloud providers. They’re increasingly leveraging open source. In short, they’re becoming software companies. The best companies will build cultures of performance and doing the right thing – and will make data and the processes around it self-service for all their employees. Design thinking will guide companies that want to change the lives of their customers and employees. 2016 will see a lot more work in trying to manage services that simply aren’t designed to work together or even be managed – for example, getting whatever-as-a-service cloud systems to play nicely with existing legacy systems. So competent developers are the scarce commodity. Some companies are starting to see the cloud as a form of outsourcing that is fast burning up in-house IT operations jobs, with varying success.

There are still too many old-fashioned companies that just can’t understand what digitalization will mean for their business. In 2016, some companies’ boards still think the web is just for brochures and porn and don’t believe their business models can be disrupted. It gets worse for many traditional companies: Amazon, for example, is a retailer both on the web and, increasingly, for things like food deliveries. Amazon and others are playing to win. Digital disruption has happened and will continue.

Windows 10 will gain more ground in 2016. If 2015 was a year of revolution, 2016 promises to be a year of consolidation for Microsoft’s operating system. I expect Windows 10 adoption in companies to start in 2016. Windows 10 is likely to be a success for the enterprise, but I expect that word from heavyweights like Gartner, Forrester and Spiceworks, suggesting that half of enterprise users plan to switch to Windows 10 in 2016, is more than a bit optimistic. Windows 10 will also be used in China, as Microsoft played the game better with it than with Windows 8, which was banned in China.

Windows is now delivered “as a service,” meaning incremental updates with new features as well as security patches, but Microsoft still seems to work internally to a schedule of milestone releases. Next up is Redstone, rumoured to arrive around the anniversary of Windows 10, midway through 2016. Windows servers will also get an update: 2016 should include the release of Windows Server 2016, which brings updates to the Hyper-V virtualisation platform, support for Docker-style containers, and a new cut-down edition called Nano Server.

Windows 10 will get some of the already promised features that were not delivered in 2015. Windows 10 was promised for PCs and mobile devices in 2015 to deliver a unified user experience. Continuum is a new, adaptive user experience offered in Windows 10 that optimizes the look and behavior of apps and the Windows shell for the physical form factor and the customer’s usage preferences. The promise was the same unified interface for PCs, tablets and smartphones – but in 2015 it was delivered only for PCs and some tablets. Windows 10 Mobile for smartphones is finally expected to arrive in 2016, and its release may be the last roll of the dice for Microsoft’s struggling mobile platform. Microsoft’s Plan A is to get as many apps and as much activity as it can on Windows on all form factors with the Universal Windows Platform (UWP), which enables the same Windows 10 code to run on phone and desktop. Despite a steady inflow of new well-known apps, it remains unclear whether the Universal Windows Platform can maintain momentum with developers. Can Microsoft keep the developer momentum going? I am not sure. There are also plans for tools for porting iOS apps and an Android runtime, so expect delivery of some or all of the Windows Bridges (iOS, web app, desktop app, Android) announced at the April 2015 Build conference, in the hope of getting more apps into the unified Windows 10 app store. Windows 10 does hold out some promise for Windows Phone, but it’s not going to make an enormous difference. Losing the battle for the web and mobile computing is a brutal loss for Microsoft: when you consider the size of those two markets combined, the desktop market seems like a stagnant backwater.

Older Windows versions will not die in 2016 as fast as Microsoft and security people would like. Expect Windows 7 diehards to continue holding out in 2016 and beyond. And there are still many companies that run their critical systems on Windows XP, because “there are some people who don’t have an option to change.” Often the OS is running in automation and process control systems that run business- and mission-critical operations, in both the private sector and government enterprises; the US Navy, for example, is still using the obsolete Windows XP to run critical tasks. It all comes down to money and resources, but if someone is obliged to keep something running on an obsolete system, it is completely the wrong approach to information security.

Virtual reality has grown immensely over the past few years, but 2016 looks like the most important year yet: it will be the first time that consumers can get their hands on a number of powerful headsets for viewing alternate realities in immersive 3-D. Virtual reality will go mainstream as Sony, Samsung and Oculus bring consumer products to market in 2016. The whole virtual reality hype cycle could be rebooted as early builds of the final Oculus Rift hardware start shipping to developers, and HTC’s and Valve’s Vive VR headset should surface in the next few months. Expect a banner year for virtual reality.

GPU and FPGA acceleration will be widely used in high-performance computing. Both Intel and AMD have products with the CPU and GPU on the same chip, and there is software support for using the GPU (learn CUDA and/or OpenCL); many mobile processors also combine CPU and GPU on one chip. FPGAs are circuits that can be baked into a specific application but can also be reprogrammed later. There was lots of interest in 2015 in using FPGAs to accelerate computations as the next step after GPUs, and I expect that interest to grow even more in 2016. FPGAs are not quite as efficient as a dedicated ASIC, but they are about as close as you can get without translating the actual source code directly into a circuit. Intel bought Altera (a big FPGA company) in 2015 and plans to begin selling products with a Xeon chip and an Altera FPGA in a single package, possibly as early as 2016.
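
To make the GPU computing point concrete, here is a minimal vector-addition sketch in Python using PyOpenCL (my illustration: it assumes the pyopencl and numpy packages and a working OpenCL driver; a CUDA version would follow the same pattern):

```python
# Offload an element-wise vector addition to whatever OpenCL device is available.
import numpy as np
import pyopencl as cl

a = np.random.rand(1_000_000).astype(np.float32)
b = np.random.rand(1_000_000).astype(np.float32)

ctx = cl.create_some_context()   # pick an available OpenCL device (e.g. a GPU)
queue = cl.CommandQueue(ctx)

mf = cl.mem_flags
a_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
b_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=b)
out_buf = cl.Buffer(ctx, mf.WRITE_ONLY, a.nbytes)

program = cl.Program(ctx, """
__kernel void add(__global const float *a,
                  __global const float *b,
                  __global float *out) {
    int gid = get_global_id(0);
    out[gid] = a[gid] + b[gid];
}
""").build()

program.add(queue, a.shape, None, a_buf, b_buf, out_buf)  # one work-item per element

result = np.empty_like(a)
cl.enqueue_copy(queue, result, out_buf)  # copy the result back to the host
assert np.allclose(result, a + b)
```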

Artificial intelligence, machine learning and deep learning will be talked about a lot in 2016. Neural networks, which were academic exercises (but little more) for decades, are increasingly becoming mainstream success stories: heavy (and growing) investment in the technology – which enables the identification of objects in still and video images, words in audio streams, and the like after an initial training phase – comes from the formidable likes of Amazon, Baidu, Facebook, Google, Microsoft, and others. So-called “deep learning” has been enabled by the combination of the evolution of traditional neural network techniques, the steadily increasing processing “muscle” of CPUs (aided by algorithm acceleration via FPGAs, GPUs and, more recently, dedicated co-processors), and the steadily decreasing cost of system memory and storage. There were many interesting releases around the end of 2015: Facebook released portions of its Torch software in February 2015, while Alphabet’s Google division open-sourced parts of its TensorFlow system in November. IBM also turned up the heat under the competition in artificial intelligence by making SystemML freely available to share and modify through the Apache Software Foundation. So I expect 2016 to be the year these tools are tried in practice, and deep learning to be hot at CES 2016. Several respected scientists issued a letter warning about the dangers of artificial intelligence (AI) in 2015, but I don’t worry about a rogue AI exterminating mankind; I worry about an inadequate AI being given control over things that it’s not ready for. How will machine learning affect your business? MIT offers a good free introduction to AI and ML.
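
To make the “initial training phase” concrete, here is a minimal sketch of training a one-layer neural network with plain NumPy (my own toy example, illustrative only; real deep-learning work would use a framework such as Torch or TensorFlow):

```python
# Minimal single-layer neural network trained by gradient descent (NumPy only).
import numpy as np

rng = np.random.default_rng(0)

# Toy training data: learn the logical OR function.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [1]], dtype=float)

w = rng.normal(size=(2, 1))  # weights, adjusted during the training phase
b = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for _ in range(5000):
    out = sigmoid(X @ w + b)        # forward pass
    grad = out - y                  # gradient of the cross-entropy loss
    w -= 0.5 * X.T @ grad / len(X)  # update weights from the error signal
    b -= 0.5 * grad.mean(axis=0)

print(np.round(sigmoid(X @ w + b), 2))  # approaches [0, 1, 1, 1]
```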

Computers, which excel at big data analysis, can help doctors deliver more personalized care. Can machines outperform doctors? Not yet. But in some areas of medicine, they can make the care doctors deliver better. Humans repeatedly fail where computers – or humans behaving a little bit more like computers – can help. Computers excel at searching and combining vastly more data than a human can, so algorithms can be put to good use in certain areas of medicine. There are also things that can slow down development in 2016: to many patients, the very idea of receiving a medical diagnosis or treatment from a machine is probably off-putting.

The Internet of Things (IoT) was talked about a lot in 2015, and it will be a hot topic for IT departments in 2016 as well. Many companies will notice that security issues are important in it. The newest wearable technology – smart watches and other smart devices – responds to voice commands and interprets the data we produce: it learns from its users and generates appropriate responses in real time. Interest in the Internet of Things will also bring interest in real-time business systems: not only real-time analytics, but real-time everything. This will start in earnest in 2016, but the trend will take years to play out.

Connectivity and networking will be hot, and it is not just about IoT. CES will focus on how connectivity is proliferating in everything from cars to homes, realigning diverse markets. The interest will affect the job market: network jobs are hot, and salaries are expected to rise in 2016, as wireless network engineers, network admins, and network security pros can expect above-average pay gains.

Linux will stay big in the network server market in 2016. The web server marketplace is one arena where Linux has had the greatest impact: today, the majority of web servers are Linux boxes, including most of the world’s busiest sites. Linux also runs many parts of the Internet infrastructure that moves the bits from server to user, and it will continue to rule the smartphone market as the core of Android. New IoT solutions will most likely be built mainly using Linux in many parts of their systems.

Microsoft and Linux are not the enemies they were a few years ago. Common sense says that Microsoft and the FOSS movement should be perpetual enemies, but it looks like Microsoft is waking up to the fact that Linux is here to stay. Microsoft cannot feasibly wipe it out, so it has to embrace it. Microsoft is already partnering with Linux companies to bring popular distros to its Azure platform. In fact, Microsoft has even gone so far as to create its own Linux distro for its Azure data center.

Web browsers are moving more and more to 64-bit, as Firefox started the 64-bit era on Windows and Google is killing Chrome for 32-bit Linux. At the same time, web browsers are losing old legacy features like NPAPI and Silverlight. Who will miss them? The venerable NPAPI plugin standard, which dates back to the days of Netscape, is showing its age and causing more problems than it solves; native support will be removed from Firefox by the end of 2016. It was already removed from Google Chrome with very little impact. The biggest issue was the loss of support for Microsoft’s Silverlight, which broke several top streaming media sites – but they are actively switching to HTML5 in 2016. I don’t miss Silverlight. Flash will continue to be available owing to its popularity for web video.

SHA-1 will be at least partially retired in 2016. Due to recent research showing that SHA-1 is weaker than previously believed, Mozilla, Microsoft and now Google are all considering bringing the deadline forward by six months to July 1, 2016.
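
For illustration, the digest swap itself is trivial at the code level – Python’s standard hashlib shows the difference (a minimal sketch; the hard part of retiring SHA-1 is certificates and protocol negotiation, not the hash call):

```python
# Comparing a deprecated SHA-1 digest with its SHA-256 replacement.
import hashlib

data = b"example certificate contents"   # hypothetical input

print(hashlib.sha1(data).hexdigest())    # 160-bit digest, being retired
print(hashlib.sha256(data).hexdigest())  # 256-bit digest, the usual replacement
```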

Adobe’s Flash has been under attack from many quarters over security as well as for slowing down web pages. If you hoped Flash would finally die in 2016, you might be disappointed. Adobe seems to be trying to kill the name with a rebranding trick: Adobe Flash Professional CC is now Adobe Animate CC. In practice it probably does not mean much, but Adobe seems to acknowledge the inevitability of an HTML5 world: Adobe wants to remain a leader in interactive tools, and the pivot to HTML5 requires new messaging.

The trend of trying to use the same language and tools on both the user end and the server back-end continues. Microsoft is pushing its .NET and Azure cloud platform tools; Amazon, Google and IBM have their own sets of tools. Java is in decline. JavaScript is going strong on both the web browser and the server end with node.js, React and many other JavaScript libraries. Apple is also trying to bend its Swift programming language, now used mainly to build iOS applications, to run on servers with the Perfect project.

Java will still stick around, but its decline as a language will accelerate as new stuff isn’t being written in Java, even if it runs on the JVM. We will not see Java 9 in 2016, as Oracle has delayed the release by six months; The Register reports that Java 9 is delayed until Thursday, March 23rd, 2017, just after tea-time.

Containers will rule the world as Docker continues to develop, gains security features, and adds various forms of governance. Until now Docker has been for tire-kicking, used in production only by the early-adopter crowd, but that can change as vendors start to claim that they can properly manage big data and container farms.
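
As a small taste of the tooling, here is a sketch that drives Docker from Python using the Docker SDK (an assumption on my part: it requires the docker package and a running Docker daemon able to pull the alpine image):

```python
# Run a throwaway container and capture its output via the Docker SDK for Python.
import docker

client = docker.from_env()  # connect to the local Docker daemon

output = client.containers.run(
    "alpine",                           # small base image
    ["echo", "hello from a container"],
    remove=True,                        # clean the container up afterwards
)
print(output.decode().strip())          # -> hello from a container
```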

NoSQL databases will take hold as they come to be seen as “highly scalable” and “cloud-ready.” Expect 2016 to be the year when a lot of big brick-and-mortar companies publicly adopt NoSQL for critical operations. At its core, NoSQL can be seen as a key:value store, and the idea has also spread to storage systems: we got key:value store disk drives with an Ethernet NIC on board and basic GET and PUT object storage facilities.
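
A minimal sketch of that key:value idea in Python against Redis (my example; it assumes the redis package and a Redis server on localhost, and the same GET/PUT pattern applies whether the store is a NoSQL database or an object-storage drive):

```python
# Key:value basics with Redis -- the same GET/PUT idea that NoSQL stores
# and key:value object-storage drives expose.
import redis

r = redis.Redis(host="localhost", port=6379)

r.set("sensor:42:temp", "21.5")   # PUT a value under a key
value = r.get("sensor:42:temp")   # GET it back (returned as bytes)
print(value.decode())             # -> 21.5
```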

In the database world, Big Data will still be big, but it needs to be analyzed in real time. A typical big data project usually involves some semi-structured data, a bit of unstructured data (such as email), and a whole lot of structured data (stuff stored in an RDBMS). The cost of Hadoop on a per-node basis is pretty inconsequential, but the cost of understanding all of the schemas, getting them into Hadoop, and structuring them well enough to perform the analytics is still considerable. Remember that you’re not “moving” to Hadoop, you’re adding a downstream repository, so you need to worry about systems integration and latency issues. Apache Spark will also attract interest, as Spark’s multi-stage in-memory primitives provide more performance for certain applications. Big data brings with it responsibility – digital consumer confidence must be earned.
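
A short PySpark sketch shows what those in-memory primitives buy you (my example, assuming a local Spark installation; access.log is a hypothetical input file). The point is that cache() keeps the dataset in memory, so the second query does not re-read the file:

```python
# Spark's in-memory primitives: cache a dataset once, reuse it across queries.
from pyspark import SparkContext

sc = SparkContext("local[*]", "cache-demo")

logs = sc.textFile("access.log").cache()  # hypothetical input file, kept in memory

errors = logs.filter(lambda line: "ERROR" in line).count()
warnings = logs.filter(lambda line: "WARN" in line).count()  # reuses the cached data

print(errors, warnings)
sc.stop()
```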

IT security continues to be a huge issue in 2016. You might be able to achieve adequate security against hackers and internal threats, but every attempt to make systems idiot-proof just means the idiots get upgraded. Firms are ever more connected to each other and to the general outside world, so in 2016 we will see even more service firms accidentally leaking critical information and a lot more firms having their reputations scorched by incompetence-fuelled security screw-ups. Good security people are needed more and more – a joke doing the rounds among IT execs doing interviews is: “If you’re a decent security bod, why do you need to look for a job?”

There will still be unexpected single points of failure in big distributed networked systems. The cloud behind the silver lining is that Amazon or any other cloud vendor can be as fault-tolerant, distributed and well supported as you like, but if a service like Akamai or Cloudflare were to die, you would still stop. That’s not a single point of failure in the classical sense, but it’s really hard to manage unless you go for full cloud agnosticism – which is costly. This is hard to justify when the failure rate of content delivery networks is so low, so the irony is that their reliability means fewer businesses work out what to do if they fail. Oh, and no one seems to test their mission-critical data centre properly, because it’s mission-critical. So they just over-specify where they can and cross their fingers (= pay twice and get half the coverage for other vulnerabilities).

For IT start-ups it seems that Silicon Valley’s cash party is coming to an end. Silicon Valley is cooling, not crashing. Valuations are falling. The era of cheap money could be over and valuation expectations are re-calibrating down. The cheap capital party is over. It could mean trouble for weaker startups.

933 Comments

  1. Tomi Engdahl says:

    Battle lines are drawn: IBM prepares Power9 to take on Intel and ARM
    http://www.computerworld.com/article/3087541/data-center/battle-lines-are-drawn-ibm-prepares-power9-to-take-on-intel-and-arm.html?token=%23tk.CTWNLE_nlt_computerworld_servers_datacenter_2016-06-29&idg_eid=051598d6597df87056c54033166b3242&utm_source=Sailthru&utm_medium=email&utm_campaign=Computerworld%20Data%20Center%202016-06-29&utm_term=computerworld_servers_datacenter#tk.CW_nlt_computerworld_servers_datacenter_2016-06-29

    IBM predicts Power will take a double-digit server chip market share by 2020

    IBM has many goals with its upcoming Power9 chip, and one of them is to challenge the dominance of Intel’s x86 chips in the data center.

    The company wants chips based on Power architecture to take a double-digit server chip market share by 2020, Doug Balog, general manager for Power Systems at IBM, said in an interview.

    It’ll be a three-way battle between x86, Power and ARM, which has a similar goal of a double-digit market share in the next four years. IBM’s Power is off to a better start in terms of socket share, Balog said. IBM’s Power already is being used in servers, while ARM server processors are largely still being tested.

    Intel dominates the data center server chip market with a 90-plus percent market share. But IDC has predicted that Intel’s share will shrink as ARM-based chips and AMD’s x86-based Zen take away some of that lead.

    Power chips are already used in mainframes and high-end servers, and they are starting to show up in low-end and mid-range servers. In the second half of next year, IBM will start shipping servers based on Power9 architecture, which boasts significant upgrades from the current Power8.

    Until a few years ago, IBM was the only company selling Power servers, but today other server makers like Tyan and Supermicro are offering systems with the chips. IBM opened up the Power architecture three years ago through the OpenPower Foundation, which boasts members like Google, Samsung, and Nvidia.

    IBM is focused on selling Power servers costing more than US$6,000. Chinese vendors could sell Power servers at cheaper prices

    “They can compete with [IBM], I hope they do. That will be success in my view,” Balog said.

    Hyperscale servers are a big target market for Power9 chips

    Server requirements have changed in recent years with more focus on accelerated computing than on building technology around a chip

    IBM has doubled the number of Power9 CPU cores to 24, but Balog said co-processors like FPGAs (field-programmable gate arrays) and GPUs are playing a bigger role in server computation. Power9 will support a new interconnect called NVLink so CPUs can communicate with components significantly faster than existing PCI-Express 3.0.

    An interface called CAPI, linking FPGAs and new memory types to Power9 chips, will also be faster.

  2. Tomi Engdahl says:

    Devindra Hardawar / Engadget:
    AMD’s Radeon RX 480 graphics card offers solid VR and 1440p gaming performance starting at $200

    AMD’s Radeon RX 480 is the new king of budget video cards
    It delivers solid VR and 1440p gaming performance, starting at $200.
    https://www.engadget.com/2016/06/29/amds-radeon-rx-480-is-the-new-king-of-budget-video-cards/

    Instead of trying to build the biggest and most powerful video card on the market, AMD aimed at the low end with the Radeon RX 480. But that doesn’t make it any less exciting than NVIDIA’s recent powerhouse GeForce GTX 1080 and 1070 GPUs. AMD’s pitch for the RX 480 is simple: It’s a $200 card that’s VR ready. That’s huge, especially since the current batch of GPUs that meet minimum VR specs cost around $350.

    To be fair, AMD did prime the pump a bit by sending me the 8GB version of the RX 480. That version of the card will retail for around $239, a bit more than the $200 figure it reached with the 4GB model.

    Compared to the last AMD card I tested — the mammoth R9 Fury X — the RX 480 is elegant in its simplicity.

    Installing the RX 480 was like any other GPU: Plug it in a PCI Express slot and connect additional power (in this case, it’s a single 6-pin PSU cable). I hooked a 4K monitor into one of the three DisplayPort slots (there’s also an HDMI slot) and installed AMD’s latest drivers, and I was ready to start gaming.

    In most of the 3DMark tests, the RX 480 scored around half as well as the GTX 1080. That’s actually quite impressive, considering that the 1080 costs upward of $600.

  3. Tomi Engdahl says:

    Dell stops selling Android devices to focus on Windows
    Dell has discontinued Venue tablets with Android, and won’t be pushing out OS upgrades to current customers
    http://www.pcworld.com/article/3090466/android/dell-stops-selling-android-devices.html

    Dell has stopped selling Android devices as it steps away from slate-style tablets to focus on Windows 2-in-1 devices.

    The company isn’t refreshing the Venue line of Android tablets, and will no longer offer the Android-based Wyse Cloud Connect, a thumb-size computer that can turn a display into a PC. Other Android devices were discontinued some time ago.

    “The slate tablet market is over-saturated and is experiencing declining demand from consumers, so we’ve decided to discontinue the Android-based Venue tablet line,” a Dell spokesman said in an email.

    “We are seeing 2-in-1s rising in popularity since they provide a more optimal blend of PC capabilities with tablet mobility. This is especially true in the commercial space,”

    Dell won’t be offering OS upgrades to Android-based Venue tablets already being used by customers.

    “For customers who own Android-based Venue products, Dell will continue to support currently active warranty and service contracts until they expire, but we will not be pushing out future OS upgrades,” the spokesman said.

  4. Tomi Engdahl says:

    Microsoft President Brad Smith: Computer Science Is Space Race of Today
    https://news.slashdot.org/story/16/07/01/0514242/microsoft-president-brad-smith-computer-science-is-space-race-of-today

    Q. How is K-12 computer science like the Cold War? A. It could use a Sputnik moment, at least that’s the gist of an op-ed penned by Senator Jerry Moran (R., KS) and Microsoft President Brad Smith.

    Computer science is space race of today
    http://www.kansas.com/opinion/opn-columns-blogs/article86754587.html

    In the wake of the Soviet Union’s 1957 Sputnik launch, President Eisenhower confronted the reality that America’s educational standards were holding back the country’s opportunity to compete on a global technological scale. He responded and called for support of math and science, which resulted in the National Defense Education Act of 1958 and helped send the country to the moon by the end of the next decade. It also created the educational foundation for a new generation of technology, leadership and prosperity.

    Today we face a similar challenge as the United States competes with nations across the globe in the indispensable field of computer science. To be up to the task, we must do a better job preparing our students for tomorrow’s jobs.

    These fields and others offer computing jobs in Kansas that pay on average $72,128 – roughly 70 percent higher than the average Kansas salary of $42,020. Unfortunately, there are more than 3,000 unfilled computing jobs in the state.

    Nationally, it’s the same picture: There are more than 500,000 unfilled computing jobs – with a projected million computing openings by 2024.

    We’re at an important intersection of technology and agriculture. Enormous investments are being made in “farm tech” startups – more than $2.06 billion in the first half of 2015 alone – that will shape the future of farming. As the agricultural sector depends more on data from computers, our need for workers with a basic understanding of computer science grows.

    Meanwhile, nations as large as China and as small as Estonia are taking steps to ensure that computer science education is available to all of their students. That puts our future workforce at a disadvantage in the increasingly globalized economy.

  5. Tomi Engdahl says:

    Ben Lang / Road to VR:
    Google adds WebVR and VR Shell to beta and dev versions of Chrome on Android, potentially allowing all websites to be viewed in VR

    Google is Adding a VR Shell to Chrome to Let You Browse the Entire Web in VR
    http://www.roadtovr.com/google-is-adding-a-vr-shell-to-chrome-to-let-you-browse-the-entire-web-in-vr/

    Google is working to add fully immersive browsing capability to Chrome, allowing users to browse any part of the web in VR, not just those sites that are specially built for VR.

    Google has played an active role in helping to define and deploy ‘WebVR‘, a set of standard capabilities that allow for the creation of VR websites which can serve their content directly to VR headsets. But what about accessing the billions of websites already on the web? Today you’d have to take your headset on and off as you go from a WebVR site to a non-WebVR site. Google’s ultimate vision however is to let people stay in VR for all of their web browsing.

    https://webvr.info/

  6. Tomi Engdahl says:

    StoreServ’s ASIC architect must have one heckuva crystal ball
    Post-NAND explorations and adaptations; or, predicting future storage uses
    http://www.theregister.co.uk/2016/07/01/storeserv_asic_architecture/

    StoreServ arrays use special hardware, an ASIC, to accelerate storage array operations, and this is redesigned for each major generation of the arrays. The current design is generation 5.

    An ASIC design has to last for five years or so once systems using it start shipping

    The gen 5 ASIC has been active during a general array evolution from pure disk and hybrid flash/disk use to all-flash designs, with consequent dramatic reduction in media access latency. Nazari assumes that there will be an evolution to post-NAND media, such as Resistive RAM (ReRAM), with Memristor being in that general category, 3D XPoint and maybe STT-RAM (Spin Transfer Torque RAM) and PCM (Phase Change Memory). The ASIC should try to be agnostic and cover the general attributes exposed by these device technologies, such as lower latency and, perhaps, byte instead of NAND’s block erasability.

    Nazari said that HPE sees a role for XPoint as well as ReRAM. He said the HPE-SanDisk (now WDC) partnership was ongoing, and driven by HPE’s server operation. So he is seeing HPE servers (and others) using ReRAM and XPoint media with their sub-microsecond access latency when used in DIMM form. He is also expecting NVMe over Fabrics style networking with much lower network latency, which puts demands on the array to respond commensurately faster.

    He sees the StoreServ array world having six general components: accessing server hosts, host-array fabric (typically Fibre Channel now), array controller complex with the ASIC, controller-media fabric, and the array’s media drives.

    Nazari aims for the gen 6 ASIC to have NVMe over fabric optimisations. He is concerned that data services such as snapshot and replication should have latencies appropriate to the gen 6 ASIC era, meaning lower.

    He characterises a XEON core as having 150MB/sec bandwidth.

    In the gen 5 ASIC StoreServ arrays the controllers spend a lot of time waiting for locks, and he would like to reduce that time. One way is to add more queues to the hardware, more queues than cores, and HPE is working with HBA vendors – QLogic, Emulex, LSI, etc. – to add more queues to the array side of their adapter products; they are already there on the client side. This initiative is independent of specific Fibre Channel standards, such as 16Gbit/sec and 32Gbit/sec.

    gen 6 ASIC StoreServ arrays will fit right in to a post-NAND media and NVMe fabrics era

  7. Tomi Engdahl says:

    HP Inc. to Offer Personal Computers as a Service
    Corporate customers will be able to pay a fixed monthly fee per employee for computing equipment
    http://www.wsj.com/article_email/hp-inc-to-offer-personal-computers-as-a-service-1467322733-lMyQjAxMTE2MzM4MDIzMzA5Wj

    HP Inc. on Thursday said it plans to provide companies with personal computers and other devices as part of a service, the latest example of shifting business models as PC makers grapple with weak demand and other pressures.

    Corporate customers of HP’s new service will be able to pay a fixed monthly fee per employee for computing equipment, HP said, eliminating the need to pay upfront for hardware and letting companies shift capital spending to other purposes.

    But HP stressed other attractions beyond that selling point, which has long been available through computer rentals. HP said it would use software to manage how devices are deployed and used, helping customers make sure employees have sufficient processing power or data-storage capacity—or don’t have more sophisticated hardware than they need. The company also expects to monitor the health of components in the devices, so it can, for instance, provide replacement batteries before older ones wear out, HP said.

    Patrick Moorhead, an analyst with Moor Insights & Strategy, said the analytical capabilities are a key element of HP’s plans. “That’s the biggest thing that separates them” from existing computer-leasing and service offerings, he said.

    HP said it would ensure that all data is erased from devices that go out of service and that hardware is recycled, something that analysts say often doesn’t happen when companies buy or rent gear, risking the loss of sensitive information.

  8. Tomi Engdahl says:

    Matt Jarzemsky / Wall Street Journal:
    Report: Tech firms accounted for 46% of US private-equity buyouts so far this year, the sector’s highest level since at least 1995

    Private Equity Has a Crush on Tech
    http://www.wsj.com/article_email/private-equity-has-a-crush-on-tech-1467308434-lMyQjAxMTE2NDAzMTgwMTEyWj

    Buyout shops have become comfortable with some corners of the tech sector that they see as relatively stable, such as corporate-software providers

  9. Tomi Engdahl says:

    NVMe Brings Performance to Software-Defined Storage
    http://www.eetimes.com/document.asp?doc_id=1330028&

    NVM Express (NVMe) has begun to gather steam in the market, with many vendors releasing NVMe SSDs over the past year or so, while the supporting ecosystem has matured. In the meantime, Samsung Electronics Co., Ltd. and Red Hat are collaborating to give the emerging technology an extra push.

    At the recent Red Hat Summit, Samsung announced its NVMe (SSD) Reference Design platform, which can be used with Red Hat Ceph Storage, a software-defined storage platform. Red Hat Ceph/Samsung Reference Architecture is essentially a recipe to build unified storage for enterprise IT or cloud environments that handle transactional databases, machine-generated data and unstructured data.

    The reference architecture can be used as is or customized for a specific data center environment, and it’s the first reference architecture Samsung has announced for SSDs

    The company is seeing strong demand for NVMe SSDs as a means to support cloud and high-performance workloads. In the past, the company hasn’t done reference designs, nor is it looking to sell enterprise storage systems, but it sees collaborating with Red Hat as a way to make it easier to adopt the relatively new NVMe technology.

    The reference design is also open source, and can be deployed in an OpenStack environment to support the bandwidth, latency and IOPS requirements of high performance workloads and use cases, such as distributed MySQL databases.

    Napaa said working with partners such as Red Hat and developing reference architectures will enable Samsung to provide proof points for its NVMe SSDs in high performance systems.

    Samsung is not the only company Red Hat has worked with to optimize Ceph. It has collaborated with SanDisk on its InfiniFlash technology, which uses flash, but not SSDs.

  10. Tomi Engdahl says:

    Client-Side Performance
    http://www.linuxjournal.com/content/client-side-performance

    Web applications, when they first started, were dynamic only on the server side. Sure, they output HTML—and later, CSS and JavaScript—but the overwhelming majority of the processing and computation took place on the server.

    This model, of course, has changed dramatically in the last decade, to such a degree that you now accurately can claim to be a web developer and work almost exclusively in HTML, CSS and JavaScript, with little or no server-side component. Entire MVC frameworks, such as Ember.js, Angular.js and React.js, assume that you’ll be writing your application in JavaScript and provide you with the objects and infrastructure necessary for doing so.

    If you’re worried about the performance of your web application, you need to concern yourself not only with what happens on the server, but also with what happens in the browser. Some commercial performance-monitoring solutions already take this into account, allowing you to see how long it takes for elements to render, and then to execute, on your users’ browsers. However, there is also no shortage of open-source tools available for you to check and improve the ways in which your client-side programs are executing.

    Client-side code is written in JavaScript.

    Because so many modern web applications take place in JavaScript, the fact that you’re often loading JavaScript from remote servers means that the time it takes to render a page depends not just on the server speed, the network bandwidth and the page’s complexity, but also on the servers and networks serving such JavaScript, as well as those pages’ complexity. As a result, it’s generally considered to be good practice to load as many libraries as possible late in the game, at the bottom of your HTML page.

    Even better, you should consolidate your JavaScript files into a single file. This has a number of advantages. It means the user’s browser needs to download a single file, rather than many of them. If you include all of the JavaScript needed on your site in a single file, it also means that the file needs to be loaded only a single time.

    Better yet, you can run JavaScript code through a minimizer (or “minifier”), which removes comments, extraneous whitespace and anything else that isn’t necessary for client-side programs to run. By minifying JavaScript files, combining the files and then compressing the resulting combination, you can dramatically reduce the size of the JavaScript being sent to the user’s browser and ensure that it is loaded only once per visit to your website.
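
    To see why combining and compressing helps, here is a rough Python sketch (my illustration, with made-up stand-in file contents; a real site would serve the gzipped bundle with a Content-Encoding: gzip header):

    ```python
    # Combining "files" and gzip-compressing the result shrinks the download.
    import gzip

    # Hypothetical stand-ins for the contents of several JavaScript files.
    sources = [
        b"function add(a, b) { return a + b; }  // helper\n" * 200,
        b"var cache = {};  /* simple memo table */\n" * 200,
    ]

    combined = b"".join(sources)          # one file means one HTTP request
    compressed = gzip.compress(combined)  # what the server would actually send

    print(len(combined), "bytes combined ->", len(compressed), "bytes gzipped")
    ```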

    Download Time

    Once the JavaScript is in the user’s browser, things are both easier and harder to analyze. On the one hand, you (the developer) can download the program, test it and check the performance—and then, you also can use in-browser debugging tools to test and improve things.

    One of the most important tools offered by both Chrome and Firefox is the display of files being sent to the browser. Even if your site appears to be loading and rendering quickly, a quick look at the download timeline almost certainly will be somewhere between surprising and shocking to you.

    Even New Relic, which normally is considered a (commercial) server-side performance monitor, now offers some client-side performance checking.

    Once you have combined and compressed your JavaScript files, you seriously should consider putting them, as well as any other static assets (such as CSS files and images), on a content distribution network (CDN).

    Benchmarking JavaScript

    Although JavaScript has a (well deserved, I think) reputation for being a frustrating and quirky language, the fact is that modern JavaScript implementations run very quickly—assuming that you use the language in the right way. However, it’s sometimes hard to know where your program is spending most of its time. Fortunately, the Chrome developer tools (a part of the Chrome browser) include a profiling tool. Go to the “profile” tab in the developer tools, and select the CPU option before visiting your site. You can run through your site for a few seconds or minutes before stopping the profiling from taking place. Once you’ve done that, you’ll get a (very long) indication of which JavaScript programs were running, where they came from and how much time the CPU spent in each one.

    You similarly can ask the profiler to examine memory usage.

    In Firebug, the Firefox-based debugger, you can profile a page by going to the “console” tab and clicking on “profile”. You’ll see a table showing how much time was spent in each function, and what percentage of the total time was spent there.

    Summary

    Although server-side programming still is a vital part of the web, the client is where much of the action is, and where the user often perceives lags and slowness.

  11. Tomi Engdahl says:

    Hannah Kuchler / Financial Times:
    Pitchbook: VC investment in Europe dropped to $2.8B in Q2 2016 from $4.3B in Q2 2015; UK fell 5% to $994M

    European VC funding falls by a third ahead of Brexit vote
    http://www.ft.com/cms/s/0%2F185236be-3fdd-11e6-9f2c-36b487ebd80a.html?ft_site=falcon&desktop=true#axzz4DQyNNpeH

    Venture capital investment in European start-ups dropped by over a third in the second quarter, contributing to anxiety about the continent’s technology sector that is expected to be buffeted by Britain’s exit from the EU.

    In Europe, funding from venture capitalists fell from $4.3bn in the second quarter of 2015 to $2.8bn in the three months that ended last week, according to preliminary data provided by research firm Pitchbook.

    The fall in VC funding comes as technology executives in London worry whether the Brexit vote could jeopardise the UK capital’s future as a start-up hub for the continent. London is home to more than four in ten of Europe’s start-ups valued at $1bn or more

  12. Tomi Engdahl says:

    Brian Crecente / Polygon:
    Xbox Play Anywhere, which delivers participating games to Xbox One and Windows 10 PCs with a single purchase, launches September 13

    Xbox Play Anywhere launches Sept. 13
    Play Anywhere … as long as it’s an Xbox or PC
    http://www.polygon.com/2016/7/1/12077802/xbox-play-anywhere-launches-sept-13

    Xbox Play Anywhere, which delivers certain games to the Xbox One and Windows 10 PC with a single purchase, goes live on Sept. 13, Microsoft confirmed to Polygon this week.

    The new program was announced during last month’s E3 Microsoft press conference. Play Anywhere saves your progress to a single file, tracks a single set of achievements and won’t cost any more than a regular game. It also means that any DLC you purchase will be playable on both systems. This includes game add-ons, season passes, consumables and in-game unlocks.

    The service requires that you have the Windows 10 Anniversary Edition update on your PC, which hits on Aug. 2, and the summer update on your Xbox One console. It only supports the download version of the game, not the disc version.

  13. Tomi Engdahl says:

    The tick-tock story of how LinkedIn shopped itself to Microsoft, Salesforce and Google
    How the $26 billion deal went down.
    http://www.recode.net/2016/7/2/12085428/linkedin-microsoft-salesforce-google-deal-timeline

    Most Securities and Exchange Commission filings are dry affairs. LinkedIn’s latest, a proxy statement that details its acquisition by Microsoft and the interest of four other suitors, is a lively one!

    Paperwork after mergers and acquisitions do tend to be lively — particularly ones, like this, involving two public companies, as the seller wants to show to investors they negotiated the best price.

    LinkedIn’s does that in some detail. While Microsoft didn’t bid the most, its cash-only offering won out. The filing does not name the losing bidders, but refers to a particularly aggressive one as “Party A.”

  14. Tomi Engdahl says:

    Push-Button Generation of Deep Neural Networks
    http://www.eetimes.com/author.asp?section_id=36&doc_id=1330029&

    Commercial options exist for Deep Neural Networks. CEVA, for one, has just announced its second generation CEVA Deep Neural Network (CDNN2) featuring its Network Generator.

    The term “deep learning” refers to using deep (multi-layer) artificial neural networks to make sense out of complex data such as images, sounds, and text. Until recently, this technology has been largely relegated to academia. Over the past couple of years, however, increased computing performance coupled with reduced power consumption and augmented by major strides in neural network frameworks and algorithms has thrust deep learning into the mainstream.

    At the recent Embedded Vision Summit, for example, a machine vision demo of a deep neural network (DNN) running on an FPGA was identifying randomly presented images in real time

    As another example, researchers at MIT used a deep learning algorithm to analyze videos showing tens of thousands of different objects and materials being prodded, scraped, and hit with a drumstick.

    Two of the most popular frameworks for deep learning are Caffe-based networks and Google’s TensorFlow-based networks. Caffe is a well-known and widely-used machine-vision library that ported Matlab’s implementation of fast convolutional nets to C and C++; it was created with expression, speed, and modularity in mind; and it’s primarily employed by academics and researchers with some commercial use. TensorFlow is a relatively new alternative to Caffe that is supported and promoted by Google; it features a software library for numerical computation using data flow graphs; and it’s scalable and applicable to both research and commercial applications.

    Caffe was designed for image classification and is not intended for other deep-learning applications such as text or sound.

    There are several steps involved in creating a deep neural network. The first is to define and implement the network architecture and topology. Next, the network undergoes a training stage, which is performed offline on a powerful computing platform using tens or hundreds of thousands of images (in the case of a machine vision application). The result is a floating-point representation of the network and its “weights” (coefficients).

    More:
    http://www.embedded.com/electronics-blogs/max-unleashed-and-unfettered/4442308/Push-button-generation-of-deep-neural-networks

  15. Tomi Engdahl says:

    Why Tech Support Is (Purposely) Unbearable
    https://news.slashdot.org/story/16/07/04/216221/why-tech-support-is-purposely-unbearable

    Getting caught in a tech support loop — waiting on hold, interacting with automated systems, talking to people reading from unhelpful scripts and then finding yourself on hold yet again — is a peculiar kind of aggravation that mental health experts say can provoke rage in even the most mild-mannered person. Now Kate Murphy writes at the NYT that just as you suspected, companies are aware of the torture they are putting you through as 92 percent of customer service managers say their agents could be more effective and 74 percent say their company procedures prevented agents from providing satisfactory experiences. “Don’t think companies haven’t studied how far they can take things in providing the minimal level of service,”

    Why Tech Support Is (Purposely) Unbearable
    http://www.nytimes.com/2016/07/04/technology/why-tech-support-is-purposely-unbearable.html?_r=0

    Your face turns red. You shout things into the phone that would appall your mother.

    It’s called tech support rage.

    And you are not alone. Getting caught in a tech support loop — waiting on hold, interacting with automated systems, talking to people reading from unhelpful scripts and then finding yourself on hold yet again — is a peculiar kind of aggravation that mental health experts say can provoke rage in even the most mild-mannered person.

    Worse, just as you suspected, companies are aware of the torture they are putting you through.

    According to a survey conducted last year by the industry group International Customer Management Institute, or ICMI, 92 percent of customer service managers said their agents could be more effective and 74 percent said their company procedures prevented agents from providing satisfactory experiences.

    Moreover, 73 percent said the complexity of tech support calls is increasing as customers have become more technologically sophisticated and can resolve simpler issues on their own.

    Many organizations are running a cost-per-contact model, which limits the time agents can be on the phone with you, hence the agony of round-robin transfers and continually being placed on hold

    “Don’t think companies haven’t studied how far they can take things in providing the minimal level of service,” Mr. Robbins said. “Some organizations have even monetized it by intentionally engineering it so you have to wait an hour at least to speak to someone in support, and while you are on hold, you’re hearing messages like, ‘If you’d like premium support, call this number and for a fee, we will get to you immediately.’”

    The most egregious offenders are companies like cable and mobile service providers, which typically have little competition and whose customers are bound by contracts or would be considerably inconvenienced if they canceled their service. Not surprisingly, cable and mobile service providers are consistently ranked by consumers as providing the worst customer support.

    When things don’t make sense and feel out of control, mental health experts say, humans instinctively feel threatened.

    You can also find excellent tech support in competitive markets like domain name providers, where operators such as Hover and GoDaddy receive high marks.

  16. Tomi Engdahl says:

    Linux Desktop Marketshare Just Passed 2 Percent
    http://www.omgubuntu.co.uk/2016/07/linux-marketshare-reaches-2-percent

    Worldwide Linux marketshare has passed 2% for the first time, according to data from analytics company Net Market Share.

    The stat firm’s figure for Linux desktop usage in June 2016 is the highest share they’ve ever reported.

    Interestingly, Net Market Share do track Android and Linux separately, something that many similar companies do not.

    Fact Is: We’ll Never Really Know Linux Marketshare

    Trying to track Linux usage is always going to be fraught. With multiple distributions, kernels and browsers, there’s never a truly clear picture.

  17. Tomi Engdahl says:

    GPUs Power Ahead
    http://semiengineering.com/gpus-power-ahead/

    AI, ADAS, deep learning, gaming, and high performance computing continue to drive GPUs into new application areas.

    GPUs, long a sideshow for CPUs, are suddenly the rising stars of the processor world.

    They are a first choice in everything from artificial intelligence systems to automotive ADAS applications and deep learning systems powered by convolutional neural networks. And they are still the mainstays of high-performance computing, gaming and scientific computation, to name a few. Even well-known challenges in programming GPUs have not stopped their trajectory, thanks to better programming languages.

    The increase in GPU computing is very much tied to the explosion in parallel programming, said Roy Kim, Accelerated Computing Product Lead at Nvidia. “Going forward, and it has been true for a while now, the way you get performance out of processors is to go parallel for many, many reasons. It just happened to be that when Nvidia discovered that GPUs are great for general-purpose processing, it was around the time when a lot of researchers and developers were scrambling to figure out how to get more performance out of their code because they couldn’t rely on single cores to go any faster in frequency.”

    Big Data Meets Chip Design
    http://semiengineering.com/big-data-meets-chip-design/

    Reply
  18. Tomi Engdahl says:

    Brexit Threatens London’s Data Center Market
    http://www.eetimes.com/document.asp?doc_id=1330044&

    The historic decision by voters in the UK to leave the European Union has generated an atmosphere of uncertainty about the role of London in the multi-tenant data center and hosting markets, a new report by 451 Research claims.

    The report notes the Brexit vote could challenge the strategy of international hosting firms using London as their European base — and could possibly prompt providers to consider other locations. It’s also a huge people issue, with more than 1.5 million people employed in “digital companies” throughout the UK.

    “London is currently the European capital of the data center, hosting, and cloud markets,”

    Brexit Threatens London’s Data Center Market
    http://www.informationweek.com/data-centers/brexit-threatens-londons-data-center-market-/d/d-id/1326142

    “It’s considered somewhere that is easy to do business. It’s the most cosmopolitan and diverse location in terms of IT service providers, has the broadest supply, the largest choice of facilities, a common working language, and a skilled labor force — particularly in the tech-rich Thames Valley.”

    The uncertainty following the vote could mean other main metro markets for data center and hosting services in Europe — like Frankfurt, Paris, and Amsterdam — stand to gain from customers or providers looking to move their data or data center facilities out of the UK, Duncan said.

    However, Duncan cautioned that while European providers may be opportunistically hungry for a slice of London’s business, they face their own supply issues.

    “Perhaps Dublin potentially stands to gain the most from any immediate move: Some of the largest new data center builds are happening at the moment — including facilities for Facebook and Apple — and its location just across the Irish Sea and well-connected to transatlantic sub-sea cables might make it a convenient halfway house for some looking to have low-latency connectivity to the UK, but still be in the EU,” Duncan said.

    Reply
  19. Tomi Engdahl says:

    Glyn Moody / Ars Technica:
    Google’s DeepMind AI to use 1M anonymous NHS eye scans in five-year project to spot common diseases earlier — Privacy is unlikely to be an issue for this fully anonymised dataset. — Google’s DeepMind division has announced a partnership with the NHS’s Moorfields Eye Hospital

    Google’s DeepMind AI to use 1 million NHS eye scans to spot diseases earlier
    Privacy is unlikely to be an issue for this fully anonymised dataset.
    http://arstechnica.com/information-technology/2016/07/googles-deepmind-ai-to-use-1-million-nhs-eye-scans-to-spot-common-diseases-earlier/

    Reply
  20. Tomi Engdahl says:

    Bulgaria passes law requiring government software to be open source
    http://www.zdnet.com/article/bulgaria-passes-law-requiring-government-software-to-be-open-source/

    Amendments to the country’s Electronic Governance Act have been voted on in Parliament and are now in effect.

    Amendments have been passed by the Bulgarian Parliament requiring all software written for the government to be open source and developed in a public repository, making custom software procured by the government accessible to everyone.

    Article 58 of the Electronic Governance Act states that administrative authorities must include the following requirements: “When the subject of the contract includes the development of computer programs, computer programs must meet the criteria for open-source software; all copyright and related rights on the relevant computer programs, their source code, the design of interfaces, and databases which are subject to the order should arise for the principal in full, without limitations in the use, modification, and distribution; and development should be done in the repository maintained by the agency in accordance with Art 7c pt. 18.”

    A new government agency is charged with enforcing the law and setting up the public repository. A public register will also be developed in the next few weeks to track all projects, from inception through to technical specs, deliverables, and subsequent control, Bozhanov said.

    Existing solutions are unaffected. As part of the same law, all IT contracts are also made public.

    Reply
  21. Tomi Engdahl says:

    Algorithm can spot lies in emails and dating sites
    http://www.telegraph.co.uk/technology/2016/07/05/algorithm-can-spot-lies-in-emails-and-dating-sites/

    Researchers have created a computer program that can detect lies, be it an email, dating profile or visa application.

    The algorithm created at City University London can tell if a person is lying just by analysing their word use, structure and context, according to the researchers.

    To create the algorithm, researchers compared the text of tens of thousands of emails containing both lies and truthful content.

    The comparison revealed that people who are lying are less likely to use personal pronouns – such as “I”, “me”, “mine” – and tend to use more adjectives, such as “brilliant” and “sublime”

    The algorithm is better at detecting lies than the average human. People manage to spot a lie 54 per cent of the time, according to the researchers, whereas the computer lie detector detects it 70 per cent of the time.

    “Humans are startlingly bad at consciously detecting deception,”

    Untangling a Web of Lies: Exploring Automated Detection of Deception in Computer-Mediated Communication
    http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2576197
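
    As a rough illustration of the kind of surface features described above, here is a minimal Python sketch. The word lists, feature set, and example text are illustrative assumptions, not the City University London model (that model is described in the paper linked above).

    import re

    # Assumed toy lexicons; a real system would use a POS tagger and trained weights.
    FIRST_PERSON = {"i", "me", "my", "mine", "myself"}
    ADJECTIVES = {"brilliant", "sublime", "amazing", "gorgeous", "perfect"}

    def deception_features(text):
        # Rates of personal pronouns (liars use fewer) and adjectives (liars use more).
        words = re.findall(r"[a-z']+", text.lower())
        n = max(len(words), 1)
        return {
            "pronoun_rate": sum(w in FIRST_PERSON for w in words) / n,
            "adjective_rate": sum(w in ADJECTIVES for w in words) / n,
        }

    print(deception_features("My weekend was brilliant, simply sublime and perfect."))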

    Reply
  22. Tomi Engdahl says:

    How to tell if your company needs a chief digital officer
    http://www.cio.com/article/3091829/leadership-management/how-to-tell-if-your-company-needs-a-chief-digital-officer.html

    Transforming your business to a digital operation requires a dedicated professional who thrives as a change agent, a.k.a., a chief digital officer. Here’s how to determine whether or not your organization can benefit from another executive in the new C-suite.

    The CDO role has been around for some time, but the position continues to evolve as the C-suite becomes more crowded. “The CDO owns identifying the key areas where digital transformation can dramatically improve the customer experience and is the influencer who drives that change to happen in the organization,” says Mark Orttung, CEO at Nexient, a technology services provider that specializes in agile software development.

    Reply
  23. Tomi Engdahl says:

    App-V birthday to you, Win10: Virty tools baked in Anniversary update
    Only enterprises and schools invited to this party
    http://www.theregister.co.uk/2016/07/06/win_10_desktop_virt_for_volume_customers/

    Microsoft is packing its desktop virtualization into Windows 10 Anniversary Update next month – but you’ll need an Enterprise or Education agreement to receive it.

    From August 2, the client’s release date, Application Virtualization (App-V) and User Environment Virtualization (UE-V) will come as standard for both the Windows 10 Enterprise and Education editions. Until now, you had to download and install App-V and UE-V separately.

    However, those on Windows Professional who’d also installed App-V and UE-V won’t get the virtualization software with their upgrade package.

    Reply
  24. Tomi Engdahl says:

    Sysadmins: Use these scripts to fully check out of your conference calls
    ‘Sorry, can you repeat that?’
    http://www.theregister.co.uk/2016/07/07/sys_admins_use_these_scripts_to_fully_check_out_of_your_call_conferences/

    Rejoice, system admins; Splunk developer Josh Newlan has created a series of scripts that will, with the right tools, get you out of time-wasting teleconference meetings.

    The scripts, built on Splunk and IBM Speech to Text Watson but which can be ported to use open source tools, allow over-worked crushed souls to have relevant chatter automatically transcribed to notes for later perusal.

    In the event that the admin’s name is mentioned, Newlan’s majestic masterpiece will play pre-recorded audio of the admin apologising and asking for the question to be repeated.

    “This script listens to meetings I’m supposed to be paying attention to and pings me on hipchat when my name is mentioned,” Newlan wrote on Github.

    “It sends me a transcript of what was said in the minute before my name was mentioned and some time after.

    Using speech-to-text to fully check out during con calls
    https://github.com/joshnewlan/say_what
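
    A minimal re-creation of the idea (Newlan’s actual say_what script is at the GitHub link above and builds on Splunk, Watson speech-to-text, and HipChat; the name, buffer size, and plain-text alert below are simplifying assumptions):

    import collections
    import sys

    MY_NAME = "josh"     # assumed name to listen for
    CONTEXT_LINES = 10   # rolling buffer standing in for "the minute before"

    def watch(transcript, name=MY_NAME):
        buffer = collections.deque(maxlen=CONTEXT_LINES)
        for line in transcript:
            buffer.append(line.rstrip())
            if name in line.lower():
                # The real script pings HipChat and plays canned apology audio here.
                print("ALERT: name mentioned. Recent context:")
                print("\n".join(buffer))

    if __name__ == "__main__":
        watch(sys.stdin)  # e.g. pipe live speech-to-text output into this script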

    Reply
  25. Tomi Engdahl says:

    Alarming finding – only two percent of bugs are found

    Bad news for software code quality: bug-finding tools locate only about two percent of software errors. That is why researchers at New York University’s Tandon School of Engineering are developing new methods to improve the quality of the code.

    Their technique is called LAVA (Large-scale Automated Vulnerability Addition). LAVA experiments have shown that existing bug-finding tools are able to find only about two out of a hundred deliberately injected errors.

    Since the actual number of bugs in real software is not known, it cannot properly be assessed how well the tools work; injecting a known number of bugs gives a measurable baseline.

    Source: http://etn.fi/index.php?option=com_content&view=article&id=4666:halyttava-tieto-vain-kaksi-prosenttia-bugeista-loytyy&catid=13&Itemid=101
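
    The core idea behind LAVA is easy to sketch: seed a program with bugs you know about, run a bug-finding tool, and measure what fraction of the seeded bugs it reports. A toy Python illustration (the real LAVA injects bugs into C programs; the numbers here just mirror the article’s two-in-a-hundred figure):

    def measure_tool(tool, seeded_bug_ids):
        # Recall: fraction of deliberately injected bugs the tool actually reports.
        found = {bug for bug in tool() if bug in seeded_bug_ids}
        return len(found) / len(seeded_bug_ids)

    def weak_tool():
        # Hypothetical tool that reports only 2 of the 100 seeded bugs.
        return {13, 77}

    seeded = set(range(100))
    print(f"recall: {measure_tool(weak_tool, seeded):.0%}")  # recall: 2%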

    Reply
  26. Tomi Engdahl says:

    Unreal Engine and Unity To Get NVIDIA’s New VR Rendering Tech
    https://games.slashdot.org/story/16/07/07/2125224/unreal-engine-and-unity-to-get-nvidias-new-vr-rendering-tech

    NVIDIA has announced that Unreal Engine and Unity will see integrations of its new Simultaneous Multi-projection (SMP) rendering tech, which the company says can yield “a 3x VR graphics performance improvement over previous generation GPUs.” NVIDIA recently introduced the technology as a unique feature of its latest series of GPUs built on the ‘Pascal’ architecture. According to the company, Simultaneous Multi-projection allows up to 16 views to be rendered from a single point with just one geometry pass, whereas older cards would need to add an additional pass for each additional view.

    http://www.roadtovr.com/unreal-engine-unity-vr-simultaneous-multi-projection-smp-nvidia/#comments

    Reply
  27. Tomi Engdahl says:

    To buy from outside or to build it yourself – that is now the big software puzzle

    The debate among digitalizing businesses continues: is it better to build (and invest in) your own software systems, or to buy from outside the same application packages that your competitors have? Fortunately, there is a third option.

    The emergence of cloud services has led many to discover that not everything has to be made from scratch. The cloud frees resources to focus on the essentials, i.e. the areas where the company’s business is strong.

    Today, all companies want to be software companies. This is true even for the least digitalized industries, such as conservative farming.

    “Such a company selects bits and pieces from different IT vendors’ pre-made package solutions. The software packages are then, as it were, glued to the company with its own self-developed application logic into a functional whole that brings value to the company,” is how Lawson explains the solution to the dilemma.

    Lego-like solutions

    Research house Gartner also seems to believe in the spread of building-block software solutions.

    First, it predicts that by 2020 three quarters of applications will be built in-house. According to the researchers, companies shy away from ready-made package solutions and end up tuning their own application components, just as Lawson reasons.

    “These software components are increasingly obtained from startups, market disrupters, or highly specialized local software corporations,” Gartner’s researchers write.

    The building blocks selected can be Box cloud storage, Microsoft’s image recognition, IBM Watson artificial intelligence and analytics, or data from the company’s own ERP system.

    Low-code approach

    Companies are able to build simple mobile applications faster using services such as Microsoft Flow and Salesforce Lightning. Salesforce cloud application business group director Adam Seligman calls this the low-code approach.

    “Our customers tell us how they have to come up with new business ideas almost every day. Innovation must be delivered ever faster. We believe that low-code methods provide a good solution to these problems,” Seligman says.

    Uber rose on building blocks

    “One of the world’s most highly valued companies has outsourced 80 percent of the core functions of its business idea. Uber has kept in its own hands only the customer experience,” Patel adds.

    “IT managers must define their own role in this picture. A CIO does not need to be a builder of systems, or even that much of an IT architect; rather, like a product designer, he must be able to reconcile the company’s various architectures and applications with its business, products and services.”

    Werby thinks that new companies and their young leaders adopt the building-block approach to IT applications more readily than others.

    “Digital natives and cloud-native young people bake their own applications from their own dough. The rest of us wonder how the old and familiar IT systems are being torn open and how the pieces are sewn into software applications that comply with the new order.”

    Source: http://www.tivi.fi/Kaikki_uutiset/ostaako-ulkoa-vai-rakentaako-itse-kas-siina-iso-softapulma-6565156

    More:

    How to decide when to buy software and when to build it
    http://www.cio.com/article/3090963/application-development/how-to-decide-when-to-buy-software-and-when-to-build-it.html

    Every company may be a software company now, but that doesn’t mean you have to build it all yourself. When can you buy the same standard apps your competitors use? Why should you invest in building your own? And when you do build, how little code can you get away with? We explain.

    Reply
  28. Tomi Engdahl says:

    Big Pharma Is So 2015. Welcome to the Era of Big Software
    http://www.wired.com/2016/07/entering-age-big-software/

    Kellogg’s uses a cartoon tiger and elves to sell $14 billion worth of refined carbohydrates each year. But this calorie-laden corporation was once an idealistic startup. Created by the eccentric Dr. John Harvey Kellogg, Corn Flakes were intended as a health food that made it easier for the masses to adopt a vegetarian lifestyle. Kellogg’s was the Soylent of its day.

    Today, Pfizer is a $188 billion drug conglomerate. But there was a time when the biggest of “Big Pharma” companies was a lot like today’s Young Turks.

    These companies that we now think of as the epitomes of “Big Food” and “Big Pharma” were once humble startups. But as success begat success, they managed to dominate their markets for over a century. Over the next hundred years, we could see the same thing happen with the most high-minded of tech companies. Google, Amazon, Apple, and Facebook could grow to dominate the market in the same way Pfizer and Kellogg’s have dominated theirs. We could be witnessing the dawn of a new era: “Big Software.”

    Accelerating Innovation

    Conglomeration hasn’t hurt entrepreneurship in pharma or food. In fact, it has accelerated it.

    Efficient Entrepreneurship

    There are downsides to this new reality. Exits will likely be much smaller. We may not see another software startup approach Google’s half-trillion-dollar market value in the near term. Even Facebook’s $250 billion market cap will be hard to match. Founders and investors will just have to “settle” for more “humble” single-billion-dollar valuations—or maybe double digits.

    Every tech invention doesn’t need to become a company and not every business is built to last, but if you set out to create more value than you capture, everyone can win.

    Reply
  29. Tomi Engdahl says:

    Storyboarding is a useful tool for the software design process
    http://www.controleng.com/single-article/storyboarding-is-a-useful-tool-for-the-software-design-process/56501cdde516cadcf79c7d37fec31c54.html

    Though storyboarding adds an additional step to the software design process, it is invaluable to the client and the engineer when used in software development because it offers transparency and clarity to the client while streamlining the process for the developers and engineers.

    While storyboarding has long been considered a tool employed by those in the film and entertainment industry, its use in the software design process is becoming more expected and appreciated by both the developer and the client.

    Though storyboarding adds an additional step to the software design process, it is invaluable to the client and the engineer when used in software development. This process avoids confusion and miscommunication when a team is working on a set of complex ideas. It affords the ability to easily present a cohesive plan to the client. Storyboarding also offers transparency and clarity to the client, while streamlining the process for the developers and engineers.

    During the specification phase of development, screens that the software will display are drawn, either on paper, or using specialized software, to illustrate significant elements of the user experience.

    Focus on savings

    Storyboards can be configured to run with accurate navigation and user interaction. This provides a visual representation of the software as well as a process flow with the feel of a completed solution. Altering the storyboard is less time-consuming than making changes to an implemented piece of software representing a significant time and cost savings. The engineers then modify the storyboard and create a custom end product meeting the client’s specific needs.

    While a verbal description of a screen or product is useful, there is room for misinterpretation. Sample images provided by a storyboard allow little room for misunderstanding and help the user understand exactly how the software will be employed in real-world circumstances

    Basic sketches of a storyboard on paper can be helpful, but there are a variety of more advanced and efficient tools available to software engineers.

    Storyboarding is a crucial part of the software design process that allows designers, in collaboration with their clients, to capture all the relevant information needed to produce the most custom and tailored product.

    Reply
  30. Tomi Engdahl says:

    Casey Newton / The Verge:
    Satya Nadella and other Microsoft executives discuss the company’s AI plans and why it has a competitive advantage

    Exclusive: Why Microsoft is betting its future on AI
    Inside Satya Nadella’s plan to outsmart Google
    http://www.theverge.com/2016/7/7/12111028/microsoft-bot-framework-artificial-intelligence-satya-nadella-interview

    Satya Nadella bounded into the conference room, eager to talk about intelligence. I was at Microsoft’s headquarters in Redmond, WA, and the company’s CEO was touting the company’s progress in building more intelligent apps and services.

    No matter where we work in the future, Nadella says, Microsoft will have a place in it. The company’s “conversation as a platform” offering, which it unveiled in March, represents a bet that chat-based interfaces will overtake apps as our primary way of using the internet: for finding information, for shopping, and for accessing a range of services. And apps will become smarter thanks to “cognitive APIs,” made available by Microsoft, that let them understand faces, emotions, and other information contained in photos and videos.

    Microsoft argues that it has the best “brain,” built on nearly two decades of advancements in machine learning and natural language processing, for delivering a future powered by artificial intelligence. It has a head start in building bots that resonate with users emotionally, thanks to an early experiment in China. And among the giants, Microsoft was first to release a true platform for text-based chat interfaces — a point of pride at a company that was mostly sidelined during the rise of smartphones.

    Microsoft is proud of its work on AI, and eager to convey the sense that this time around, it’s poised to win.

    The company, as ever, talks a big game. Microsoft’s historical instincts about where technology is going have been spot-on. But the company has a record of dropping the ball when it comes to acting on that instinct.

    It organized a conference in San Francisco in June to promote cooperation among bot-makers. “We’re really interested in it being interoperable — we want it to be an ecosystem,”

    Of course, Microsoft isn’t alone in trying to build the defining platform for the next generation of computing — if conversation even turns out to be that platform. Every major tech company and a host of startups are building AI divisions, often with impressive results. But here it’s worth saying that comparing AI across companies is difficult to the point of being impossible. Much of what companies like Google, Facebook, and Amazon are working on remains unreleased.

    Benedict Evans, the resident futurist at venture capital firm Andreessen Horowitz, said in a recent blog post that the future of AI remains opaque. “This field is moving so fast that it’s not easy to say where the strongest leads necessarily are, nor to work out which things will be commodities and which will be strong points of difference,” he wrote. “Though most of the primary computer science around these techniques is being published and open-sourced, the implementation is not trivial — these techniques are not necessarily commodities, yet.”

    When bots do their work in the background, they can feel a little bit like magic

    Reply
  31. Tomi Engdahl says:

    Dean Takahashi / VentureBeat:
    Nvidia launches its first game, VR Funhouse, for HTC Vive VR on Steam, will release it as open source later this summer

    Nvidia boosts virtual reality with its physics-based VR Funhouse minigames
    http://venturebeat.com/2016/07/14/nvidia-boosts-virtual-reality-with-its-own-physics-based-vr-funhouse-mini-games/

    Virtual reality could be a big driver for gamer PCs with powerful graphics cards. That’s why Nvidia wants it to succeed. And to boost VR hardware sales, Nvidia is launching its own game, VR Funhouse for the HTC Vive VR headset. The title is the first game that Nvidia has ever released, and it’s available today on Valve’s Steam digital distribution system.

    Reply
  32. Tomi Engdahl says:

    Mary Jo Foley / ZDNet:
    Microsoft concedes that it won’t have Windows 10 installed on 1B devices by mid-2018, as previously projected — Microsoft isn’t going to make its self-imposed deadline of having Windows 10 installed on 1 billion devices by mid-2018, company officials have conceded.

    Microsoft: Windows 10 won’t hit 1 billion devices by mid-2018
    http://www.zdnet.com/article/microsoft-windows-10-wont-hit-1-billion-devices-by-mid-2018/

    Microsoft isn’t going to make its self-imposed deadline of having Windows 10 installed on 1 billion devices by mid-2018, company officials have conceded.

    A little over a year ago, with much fanfare, Microsoft execs drew a line in the sand, predicting that Windows 10 would be installed on 1 billion devices by mid-2018.

    But Microsoft officials conceded today, July 15, that they likely won’t make that deadline.

    My ZDNet colleague Ed Bott noted at the end of a blog post Friday that Microsoft officials still think they can hit the 1 billion Windows 10 mark, but that “it’s unlikely to happen by 2018 as originally projected”.

    “Windows 10 is off to the hottest start in history with over 350m monthly active devices, with record customer satisfaction and engagement.”

    Microsoft Windows and Devices chief Terry Myerson made the original claim at Build 2015, noting the 1 billion would encompass all kinds of devices that would run Windows 10 in some variant, including desktops, PCs, laptops, tablets, Windows Phones, Xbox One gaming consoles, Surface Hub conferencing systems, HoloLens augmented reality glasses, and various Internet of Things (IoT) devices. Officials said at that time the majority of those 1 billion devices would be PCs and tablets.

    But Windows Phones running Windows 10 Mobile were also expected to help Microsoft reach that total by mid-2018. Since April 2015, the bottom has fallen out of the Windows Phone market, with Microsoft officials conceding that Windows Phone isn’t much of a focus for Microsoft in calendar 2016.

    After one year, 10 lessons learned for Windows 10
    http://www.zdnet.com/article/after-one-year-10-lessons-learned-for-windows-10/

    It’s been a busy year in Redmond, with Windows 10 delivering three major releases to 350 million active users. Here’s a look back at some major milestones and stumbles along the way, and new predictions about when Windows 10 will hit its ambitious goal of a billion devices.

    Reply
  33. Tomi Engdahl says:

    Mark Bergen / Recode:
    Google recently shut down project to create high-end standalone VR headset with non-Android OS — Another sign that Google is putting its eggs into mobile VR. — Google recently shut down an internal project to create a high-end standalone virtual reality headset akin to devices …

    Google had an Oculus competitor in the works — but it nixed the project
    Another sign that Google is putting its eggs into mobile VR.
    http://www.recode.net/2016/7/15/12201032/google-virtual-reality-oculus-headset

    Google recently shut down an internal project to create a high-end standalone virtual reality headset akin to devices from Facebook’s Oculus and HTC, according to sources familiar with the plans.

    The decision likely stems from Google’s effort to streamline its more ambitious projects, an ongoing slog at the company. In this instance, Google is shifting more resources behind mobile VR — tools for companies to build apps, games and services on smartphones that use the nascent media — rather than expensive hardware.

    Google declined to comment.

    In May, the company released Daydream, a platform and reference design for new VR hardware that’s a more advanced version of its thrifty Cardboard headset. (Google also said it would be releasing its own Daydream headset.) The Daydream platform is built on the latest version of Google’s Android operating system.

    Meanwhile, a different VR project was germinated inside the X research lab (now a separate Alphabet company) with around 50 employees working on it, according to one source. More critically, that project was creating a separate operating system for the device, unique from Android.

    Now, it seems, that OS and project were scratched in favor of Android.

    But this move does indicate the company is less interested in competing directly with hardware from Facebook, Samsung, HTC and others, which have released pricier VR equipment.

    Reply
  34. Tomi Engdahl says:

    A site called “Motherboard” reports assembling a computer is too hard and a ‘nerve-wrecking [sic]’ process.

    PC Gaming Is Still Way Too Hard
    http://motherboard.vice.com/read/pc-gaming-is-still-way-too-hard?trk_source=popular

    Here’s Motherboard’s super simple guide to building your first gaming PC:

    Step 1: Have an unreasonable amount of disposable income.
    Step 2: Have an unreasonable amount of time to research, shop around, and assemble parts for your computer.
    Step 3: Get used to the idea that this is something you’re going to have to keep investing time and money in as long as you want to stay at the cutting edge or recommended specifications range for new PC games.

    The details, of course, are much more complicated, but that’s the gist of what it takes to enter the holy kingdom of PC gaming. If it sounds like a bad deal, I agree, which is why the majority of people are better off with an Xbox One or PlayStation 4, despite what the awfully self-titled “PC Master Race” might tell you.

    Reply
  35. Tomi Engdahl says:

    SoftBank Bought ARM
    http://hackaday.com/2016/07/18/softbank-bought-arm/

    $32 billion USD doesn’t buy as much as it used to. Unless you convert it into British Pounds, battered by the UK’s decision to leave the European Union, and make an offer for ARM Holdings. In that case, it will buy you our favorite fabless chip-design company.

    The company putting up 32 Really Big Ones is Japan’s SoftBank, a diversified technology conglomerate. SoftBank is most visible as a mobile phone operator in Japan, but their business strategy lately has been latching on to emerging technology and making very good investments. (With the notable exception of purchasing the US’s Sprint Telecom, which they say is turning around.) Recently, they’ve focused on wireless and IoT. And now, they’re going to buy ARM.

    We suspect that this won’t mean much for ARM in the long term. SoftBank isn’t a semiconductor firm, they just want a piece of the action. With the Japanese economy relatively stagnant, a strong Yen and a weak Pound, ARM became a bargain.

    Reply
  36. Tomi Engdahl says:

    Open-source Microsoft protocol aims to be a programming standard
    http://www.zdnet.com/article/open-source-microsoft-protocol-aims-to-be-a-programming-standard/

    Codenvy, Microsoft, and Red Hat have announced they are adopting a universal Language Server Protocol for integrated development environments.

    This may sound shocking. Keep in mind though that Microsoft has been embracing open-source methods at a deep level. And besides that, Microsoft has been working on bringing together Visual Studio with the open-source Eclipse integrated development environment. And, lest we forget, Microsoft just made it possible for you to run SQL Server, .NET Core 1.0, and ASP.NET on Red Hat Enterprise Linux (RHEL).

    So, when you put it together, it’s not too surprising that Microsoft and its open-source partners have created the Language Server Protocol (LSP). The LSP is a collaborative effort to provide a common way to integrate programming languages across code editors and integrated development environments (IDEs). The protocol extends developer flexibility and productivity by enabling a rich editing experience within a variety of tools for different programming languages.

    The LSP is an open-source project that defines a JavaScript Object Notation (JSON)-based data exchange protocol for language servers. The project is hosted on GitHub and licensed under the Creative Commons and MIT licenses.

    LSP is designed to promote interoperability between editors and language servers. The protocol also enables developers to access intelligent programming language assistants. These include such functions as: Find by symbol, syntax analysis, code completion, go to definition, outlining, and refactoring with their editor or IDE of choice.
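
    The wire format is simple enough to sketch: JSON-RPC 2.0 bodies framed with a Content-Length header. In the minimal Python example below, the method name and message shape follow the protocol; the file URI and surrounding plumbing are illustrative assumptions.

    import json

    def frame(message):
        # LSP frames each JSON-RPC message with a Content-Length header.
        body = json.dumps(message).encode("utf-8")
        header = f"Content-Length: {len(body)}\r\n\r\n".encode("ascii")
        return header + body

    # A "go to definition" request an editor would send to a language server.
    request = {
        "jsonrpc": "2.0",
        "id": 1,
        "method": "textDocument/definition",
        "params": {
            "textDocument": {"uri": "file:///project/app.py"},  # assumed URI
            "position": {"line": 12, "character": 6},
        },
    }
    print(frame(request).decode("utf-8"))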

    Reply
  37. Tomi Engdahl says:

    Rachael King / Wall Street Journal:
    IBM beats expectations as Q2 revenue drops 2.8% to $20.24B, 17th straight quarterly decline

    IBM Results Decline Amid Pockets of Growth
    Big Blue’s revenue has fallen for 17 straight quarters
    http://www.wsj.com/article_email/ibm-results-decline-again-but-top-expectations-1468873170-lMyQjAxMTA2OTE0ODExMTgwWj

    International Business Machines Corp. sales continued to decline in its most recent quarter, even as it made gains in strategic areas, such as cloud computing and data analytics.

    The Armonk, N.Y., company on Monday reported that its revenue for the second quarter dropped 2.8% to $20.24 billion, as its established businesses continue to lose ground to cloud computing services delivered over the internet.

    revenue was better than analysts’ expectation of $20.03 billion

    “Brexit didn’t help, but from everything we’ve seen we haven’t changed our view,”

    The computing giant said earnings fell to $2.5 billion, or $2.61 a share, from $3.45 billion, or $3.50 a share, one year earlier.

    IBM’s path to new businesses has been challenging. The company laid off tens of thousands of employees over the past year, even as it has hired tens of thousands of new employees in these new businesses.

    Reply
  38. Tomi Engdahl says:

    Reuters:
    Chinese consortium to acquire Opera’s browser and other businesses for $600M, except for ad, marketing, TV, and game-related operations — A $1.2 billion takeover of Opera Software by a group of Chinese internet firms fell through on Monday after failing to get regulatory approval in time …

    Chinese takeover of Norway’s Opera fails, alternative proposed
    http://www.reuters.com/article/us-opera-software-m-a-china-idUSKCN0ZY0CA

    A $1.2 billion takeover of Opera Software by a group of Chinese internet firms fell through on Monday after failing to get regulatory approval in time, sending the Norwegian browser firm’s shares to a seven-month low.

    The deal needed a green light from the United States and China, and one firm in the Chinese consortium said U.S. privacy concerns would have led to an investigation into some of Opera’s products that risked delaying the acquisition for up to a year.

    Opera and the Chinese group have instead come up with an alternative deal worth $600 million which strips out some products and services in a bid to overcome regulatory hurdles.

    Reply
  39. Tomi Engdahl says:

    Companies can save a third of software costs

    Companies can cut their software spending by up to 30 percent if they adopt new practices, believes research firm Gartner. The process is not easy.

    First, application configurations need to be optimized with the right tools.

    Second, the firm recommends recycling software licenses.

    Third, it recommends software license management tools, so-called SAM (software asset management) tools.

    Gartner forecasts that companies and organizations will spend $332 billion on software this year.

    Gartner points out that a program’s default settings are usually, from the company’s point of view, the most expensive option. This applies in particular to central server software.

    Source: http://etn.fi/index.php?option=com_content&view=article&id=4715:yritykset-voivat-saastaa-kolmanneksen-softakuluissaan&catid=13&Itemid=101

    Reply
  40. Tomi Engdahl says:

    OpenBSD distances itself from Linux

    OpenBSD is a free Unix-like operating system that is especially known for its strong investment in information security. Version 6.0 of the operating system is now in preparation, and it makes a clearer break with the competing Linux on many fronts.

    The OpenBSD development blog describes the changes coming in version 6.0, to be published at the beginning of September. On the security side, support for Linux emulation is disappearing.

    Source: http://etn.fi/index.php?option=com_content&view=article&id=4744:openbsd-tekee-eroa-linuxiin&catid=13&Itemid=101

    More: https://www.openbsd.org/60.html

    Reply
  41. Tomi Engdahl says:

    Why Big Business is usually last to the party
    Please Ms CFO, can we have some new hardware?
    http://www.theregister.co.uk/2016/07/29/why_big_business_is_usually_last_to_the_party/

    Big businesses tend to be exceptionally risk averse. There’s a general reluctance to adopt new, bleeding-edge technology because the priority – understandably – is to be able to maintain productivity.

    Small companies can live with the occasional glitch in systems – a couple of dozen people without email for a couple of hours is far from the end of the world. The same isn’t true if you have 10,000 people around the world relying on core systems, with thousands of pounds in lost revenue for every minute of downtime.

    Actually, though, the avoidance of adopting a new system or technology “because it’s risky” is dumb. What’s sensible is avoiding it because it’s too risky, that is, because the risk has been weighed and found to outweigh the benefit.

    Yes, I understand the aversion of big businesses to changing something that’s not broken. Fixing a dead system is one thing, but replacing something that works just fine with something a bit unknown can be daunting. But big businesses have a fairly fundamental advantage over the rest of us when it comes to improvements, upgrades and replacements: they’re … well … big.

    First of all, big businesses can afford proper development, test, staging and production environments for their systems. They can afford proper version management software, deployment systems that can roll new things in under strict control and which can roll back change in the event that something didn’t quite work correctly.

    Big businesses can also afford quality, qualified staff to manage all stages of an improvement.

    Big businesses also have more than an amorphous blob of people: they have locations, divisions, teams, departments, groups, … regardless of the structure there’s always some way to divide them into manageable chunks and hence manage the introduction of change in a controlled, piece-by-piece manner.

    Getting back to the subject of change, and the risk thereof, you’re right to ask yourself the cost of something going wrong – both financially and reputationally. You also need to ask about the cost of not changing, though – or, more accurately, you need to stop dancing around change using weasel-words to hide what you’re up to.

    If you’re looking for funds to replace a broken system, that system must have a value to the company. If it didn’t, you wouldn’t need to replace it. By not executing the replacement (which may be a like-for-like replacement, though often it’s a more modern alternative) you will incur a financial or reputational cost. And where you’re looking at case (b), the cost of not proceeding is Y-X – and were that difference not significant you wouldn’t be suggesting going to all that hassle.

    Reply
  42. Tomi Engdahl says:

    CB Insights:
    Only two of 21 tech companies acquired in 2016 for $1B+ were venture-backed unicorns
    https://www.cbinsights.com/blog/unicorn-overvaluation-acquisitions/

    Reply
  43. Tomi Engdahl says:

    Adi Robertson / The Verge:
    Valve will license its SteamVR Tracking system used in HTC Vive to developers for free, allowing custom motion controllers for the Vive headset — A new Valve program will let companies use the HTC Vive virtual reality tracking system for their own hardware.

    Valve is opening up its Vive VR trackers to custom hardware
    http://www.theverge.com/circuitbreaker/2016/8/4/12374508/valve-htc-vive-steamvr-tracking-license-custom-hardware-program

    A new Valve program will let companies use the HTC Vive virtual reality tracking system for their own hardware. Under the SteamVR Tracking license, product designers can buy sensors like those on the Vive headset and controllers, attach them to their own products, and then track them with the Vive’s base stations. This means that developers could make their own custom motion controllers for the Vive — or, in the long run, that Valve could establish a motion tracking standard for all kinds of objects.

    Valve isn’t attaching any licensing fees to its system; an FAQ says “the largest value for our customers and for Valve will come from allowing SteamVR Tracking to proliferate as widely as possible.”

    There’s also a modular reference object that people can use to start building their own projects

    The kit is free for developers, but in order to participate in the program, each company must send at least one person to a $3,000 training course held in Seattle. The first of these multi-day programs will be held in mid-September, and Valve says that at some point in the future, it hopes to eliminate the in-person training requirement.

    Valve doesn’t give a definitive list of how people will use its tech, but it references tracking for “VR golf clubs,” indoor drones, and custom head-mounted displays. The most obvious short-term option is custom peripherals for the HTC Vive itself

    Reply
  44. Tomi Engdahl says:

    Sam Byford / The Verge:
    Every Mac in Apple’s current lineup except the 12-inch MacBook is outdated; the 13-inch non-Retina MacBook Pro was last updated in June 2012, still costs $1.1K — One thousand, five hundred and fourteen days. Or: four years, one month, and twenty-four days.

    First Click: Apple should stop selling four-year-old computers
    August 4th, 2016
    http://www.theverge.com/2016/8/4/12373776/2012-macbook-pro-still-alive-not-dead-why

    One thousand, five hundred and fourteen days. Or: four years, one month, and twenty-four days.

    That’s how long it’s been since Apple released the last MacBook Pro to come without a Retina display.

    Nothing unusual about that, of course — technology moves on. Except it’s now August 2016, and Apple is inexplicably still selling the exact same laptop.

    MacRumors lists almost every Mac as “Don’t Buy”

    there’s a certain point at which it just starts to look like absentmindedness

    Joel Hruska / ExtremeTech:
    Apple’s slow Mac refresh cycle is mostly driven by the low rate of improvements in the PC hardware today — Most of Apple’s product lines are severely overdue for a refresh. Apart from the recently refreshed MacBook, many of the company’s Mac products are well over a year old.

    Apple’s stagnant product lines mostly reflect the state of the computer industry
    http://www.extremetech.com/computing/233058-apples-stagnant-product-lines-mostly-reflect-the-state-of-the-computer-industry

    Most of Apple’s product lines are severely overdue for a refresh. Apart from the recently refreshed MacBook, many of the company’s Mac products are well over a year old. The Mac Mini is nearly two, the high-end workstation Mac Pro is almost three, and the sole remaining non-Retina MacBook Pro is now more than four years old.

    Writing for The Verge, Sam Byford recently argued that “Apple should stop selling four year-old computers.” He’s not wrong to note the 2012-era MacBook Pro is pretty long in the tooth with its 4GB of RAM and Ivy Bridge-based processor, or that Apple has neglected specific products, like the Mac Mini.

    Meet the new CPU, same as the old CPU

    Setting aside the non-Retina MacBook Pro from 2012, most of Apple’s laptops are running on Broadwell or Skylake (the 2016 MacBook). There’s a single SKU left over from Haswell at the $1,999 price point, but the laptop lineup is pretty new.

    Apple has held off on making fundamental platform changes precisely because it’s been waiting for the underlying technology to advance enough to make the changes worthwhile. Given the frustration of sorting through hundreds of nearly identical laptops from multiple manufacturers every time a friend or family member asks for help choosing a laptop, I’m not sure I can blame them.

    Reply
  45. Tomi Engdahl says:

    Elizabeth Dwoskin / Washington Post:
    China floods Silicon Valley with cash: $6B+ invested total, $3B+ in the last 18 months; Alibaba’s troubled investment in Quixey shows how such deals can go bad — SAN FRANCISCO — Mountain View, Calif., start-up Quixey was the envy of many in Silicon Valley when the company announced …

    China is flooding Silicon Valley with cash. Here’s what can go wrong.
    https://www.washingtonpost.com/business/economy/new-wave-of-chinese-start-up-investments-comes-with-complications/2016/08/05/2051db0e-505d-11e6-aa14-e0c1087f7583_story.html

    Reply
  46. Tomi Engdahl says:

    Data Center IT Equipment Industry Demand, Supply and Forecast 2021
    http://prsync.com/reportsweb/data-center-it-equipment-industry-demand-supply-and-forecast–987972/

    The 2016 Market Research Report on Global Data Center IT Equipment Industry is a professional and in-depth study on the current state of the Data Center IT Equipment industry.

    Reply
  47. Tomi Engdahl says:

    Advanced analytics will put big data to good use
    http://www.edn.com/electronics-blogs/from-the-edge-/4442379/Advanced-analytics-will-put-big-data-to-good-use?_mc=NL_EDN_EDT_EDN_today_20160720&cid=NL_EDN_EDT_EDN_today_20160720&elqTrackId=16661d6dea0540ab83e2adc544e0808b&elq=22f8fabdd1444e8f9a456968c48d4226&elqaid=33125&elqat=1&elqCampaignId=28955

    According to Technavio, there are four emerging trends that are impacting the global big data enabled market. In a recent report on the topic, they outlined the impact of enterprise mobility, big data and cloud computing, social media, and analytics in the 2015-2019 timeframe.

    By 2019, the global big data enabled segment is forecast to exceed $209 billion, representing a CAGR in excess of 23% during the period. Amit Sharma, lead IT professional services research analyst at Technavio, indicates the rate of growth mirrors the increased data volume across all sectors, as well as the need to enhance productivity and efficiency.
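
    As a quick sanity check on those numbers (assuming the 23% CAGR compounds over the four years from 2015 to 2019; the report’s exact base year is not stated in this excerpt):

    # Back-of-the-envelope check, not a figure from the Technavio report itself.
    target_2019 = 209e9   # forecast market size in dollars
    cagr = 0.23           # compound annual growth rate
    base_2015 = target_2019 / (1 + cagr) ** 4
    print(f"implied 2015 market: ~${base_2015 / 1e9:.0f}B")  # ~$91B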

    The report stresses that mobile devices, applications, and services are reshaping enterprise business strategies, revenue models, and relationships with customers and partners.

    Cloud-based solutions are increasing efficiency and reducing cost associated with big data solution deployment. Big data analysis takes place via cloud-based solutions. Data is available “anywhere, anytime” and stored anywhere in the world.

    Becoming a major data source, social media enables instant feedback on products and services.

    Making strategic decisions based on big data now at its fingertips

    Reply
