Computer trends for 2015

Here comes my long list of computer technology trends for 2015:

Digitalisation is coming to change all business sectors and our daily work even more than before. Digitalisation also changes the IT sector: traditional software packages are moving rapidly into the cloud. The need to own or rent your own IT infrastructure is dramatically reduced. Automated applications for configuration and monitoring will become truly possible. Workloads in software implementation projects will be reduced significantly, as software needs less adjustment. Traditional IT outsourcing is definitely threatened. Security management is one of the key factors to change, as security threats increasingly come from the digital world. For the IT sector, digitalisation simply means: “cheaper and better.”

The phrase “Communications Transforming Business” is becoming the new normal. The pace of change in enterprise communications and collaboration is very fast. A new set of capabilities, empowered by the combination of Mobility, the Cloud, Video, software architectures and Unified Communications, is changing expectations for what IT can deliver.

Global Citizenship: Technology Is Rapidly Dissolving National Borders. Besides your passport, what really defines your nationality these days? Is it where you live? Where you work? The language you speak? The currency you use? If so, then we may see the idea of “nationality” quickly dissolve in the decades ahead. Language, currency and residency are rapidly being disrupted and dematerialized by technology. Increasingly, technological developments will allow us to live and work almost anywhere on the planet… (and even beyond). In my mind, a borderless world will be a more creative, lucrative, healthy, and frankly, exciting one. Especially for entrepreneurs.

The traditional enterprise workflow is ripe for huge change as the focus moves away from working in a single context on a single device to the workflow being portable and contextual. InfoWorld’s executive editor, Galen Gruman, has coined a phrase for this: “liquid computing.” The increase in productivity is promised to be stunning, but the loss of control over data will cross an alarming threshold for many IT professionals.

Mobile will be used more and more. Currently, 49 percent of businesses across North America have adopted between one and ten mobile applications, indicating significant acceptance of these solutions. Embracing mobility, when properly leveraged, promises to increase visibility and responsiveness in the supply chain. Increased employee productivity and business process efficiencies are seen as the key business impacts.

The Internet of things is a big, confusing field waiting to explode.  Answer a call or go to a conference these days, and someone is likely trying to sell you on the concept of the Internet of things. However, the Internet of things doesn’t necessarily involve the Internet, and sometimes things aren’t actually on it, either.

The next IT revolution will come from an emerging confluence of liquid computing plus the Internet of things. Those two trends are connected — or should connect, at least. If we are to trust the consultants, we are in a sweet spot for significant change in computing that all companies and users should look forward to.

Cloud will be talked about a lot and taken more into use. Cloud is the next generation of supply chain for IT. A global survey of executives predicted a growing shift towards third-party providers to supplement internal capabilities with external resources. CIOs are expected to adopt a more service-centric enterprise IT model. Global business spending for infrastructure and services related to the cloud will reach an estimated $174.2 billion in 2014 (up 20% from $145.2 billion in 2013), and growth will continue to be fast (“By 2017, enterprise spending on the cloud will amount to a projected $235.1 billion, triple the $78.2 billion in 2011”).
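
The growth figures quoted above hang together; here is a quick arithmetic check, using only the numbers in the paragraph:

```python
# Sanity-check the quoted cloud-spending figures (in $ billions).
spend_2013, spend_2014 = 145.2, 174.2
growth_2014 = (spend_2014 / spend_2013 - 1) * 100   # the ~20% jump quoted

spend_2011, spend_2017 = 78.2, 235.1                # "triple the 2011 figure"
cagr = ((spend_2017 / spend_2011) ** (1 / 6) - 1) * 100

print(f"2014 growth: {growth_2014:.1f}%")
print(f"implied 2011-2017 CAGR: {cagr:.1f}%")
```

So the 2017 projection amounts to roughly 20% compound growth per year over six years, which is consistent with the 20% jump seen from 2013 to 2014.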

The rapid growth in mobile, big data, and cloud technologies has profoundly changed market dynamics in every industry, driving the convergence of the digital and physical worlds, and changing customer behavior. It’s an evolution that IT organizations struggle to keep up with. To succeed in this situation, you need to combine traditional IT with agile and web-scale innovation. There is value in both the back-end operational systems and the fast-changing world of user engagement. You are now effectively operating two-speed IT (bimodal IT, two-speed IT, or traditional IT/agile IT). You need a new API-centric layer in the enterprise stack, one that enables two-speed IT.
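
To sketch what such an API-centric layer could look like (all class and field names here are invented for illustration, not any specific product): a thin facade gives fast-moving front-end teams a stable contract while the slow-moving back end evolves on its own schedule.

```python
# Sketch of an API facade for two-speed IT: the agile front end codes
# against a stable interface; the legacy back end evolves behind it.
class LegacyOrderSystem:
    """Stand-in for a slow-moving back-end system of record."""
    def fetch_record(self, order_id):
        return {"ORD_ID": order_id, "STATUS_CD": "S"}   # legacy field names

class OrderAPI:
    """Stable, API-centric layer that the fast-moving apps depend on."""
    STATUS = {"S": "shipped", "P": "pending"}

    def __init__(self, backend):
        self.backend = backend

    def get_order(self, order_id):
        raw = self.backend.fetch_record(order_id)
        # Translate legacy shapes into a clean contract once, here.
        return {"id": raw["ORD_ID"], "status": self.STATUS[raw["STATUS_CD"]]}

api = OrderAPI(LegacyOrderSystem())
print(api.get_order(42))
```

The agile side codes against get_order() and never sees the legacy field names, so the two sides can change at their own speeds.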

As Robots Grow Smarter, American Workers Struggle to Keep Up. Although fears that technology will displace jobs are at least as old as the Luddites, there are signs that this time may really be different. The technological breakthroughs of recent years — allowing machines to mimic the human mind — are enabling machines to do knowledge jobs and service jobs, in addition to factory and clerical work. Automation is not only replacing manufacturing jobs, it is displacing knowledge and service workers too.

In many countries the IT recruitment market is flying, having picked up to a post-recession high. Employers beware – after years of relative inactivity, job seekers are gearing up for change. Economic improvements and an increase in business confidence have led to a burgeoning jobs market and an epidemic of itchy feet.

Hopefully the IT department is increasingly being seen as a profit centre rather than a cost centre, with IT budgets commonly split between keeping the lights on and spending on innovation and revenue-generating projects. Historically IT was about keeping the infrastructure running, and there was no real understanding outside of that, but the days of IT being locked in a basement are gradually changing. CIOs and CMOs must work more closely to increase focus on customers next year or risk losing market share, Forrester Research has warned.

Good questions to ask: Where do you see the corporate IT department in five years’ time? With the consumerization of IT continuing to drive employee expectations of corporate IT, how will this potentially disrupt the way companies deliver IT? What IT process or activity is the most important in creating superior user experiences to boost user/customer satisfaction?

 

Windows Server 2003 goes end of life in summer 2015 (July 14, 2015). There are millions of servers globally still running the 13-year-old OS, with one in five customers forecast to miss the 14 July deadline when Microsoft turns off extended support. There were estimated to be 2.7 million WS2003 servers in operation in Europe some months back. This will keep system administrators busy, because only around half a year remains, and upgrading to Windows Server 2008 or Windows Server 2012 may prove difficult. Microsoft and support companies do not seem interested in continuing Windows Server 2003 support, so for those who need it, the custom pricing can be “incredibly expensive”. At this point it seems that many organizations have the desire for a new architecture and consider moving the servers to the cloud as one option.

Windows 10 is coming to PCs and mobile devices. Just a few months back, Microsoft unveiled a new operating system, Windows 10. The new Windows 10 OS is designed to run across a wide range of machines, including everything from tiny “internet of things” devices in business offices to phones, tablets, laptops and desktops to computer servers. Windows 10 will have exactly the same requirements as Windows 8.1 (the same minimum PC requirements that have existed since 2006: a 1GHz, 32-bit chip with just 1GB of RAM). A technical preview is available. Microsoft says to expect awesome things from Windows 10 in January. Microsoft will share more about the Windows 10 ‘consumer experience’ at an event on January 21 in Redmond and is expected to show the Windows 10 mobile SKU at the event.

Microsoft is going to monetize Windows differently than before. Microsoft Windows has made headway in the market for low-end laptops and tablets this year by reducing the price it charges device manufacturers, charging no royalty on devices with screens of 9 inches or less. That has resulted in a new wave of Windows notebooks in the $200 price range and tablets in the $99 price range. The long-term success of the strategy against Android tablets and Chromebooks remains to be seen.

Microsoft is pushing Universal Apps concept. Microsoft has announced Universal Windows Apps, allowing a single app to run across Windows 8.1 and Windows Phone 8.1 for the first time, with additional support for Xbox coming. Microsoft promotes a unified Windows Store for all Windows devices. Windows Phone Store and Windows Store would be unified with the release of Windows 10.

Under new CEO Satya Nadella, Microsoft realizes that, in the modern world, its software must run on more than just Windows. Microsoft has already revealed Microsoft Office programs for Apple iPad and iPhone. It also has an email client compatible with both the iOS and Android mobile operating systems.

With Mozilla Firefox and Google Chrome grabbing so much of the desktop market—and Apple Safari, Google Chrome, and Google’s Android browser dominating the mobile market—Internet Explorer is no longer the force it once was. The article “Microsoft May Soon Replace Internet Explorer With a New Web Browser” says that Microsoft’s Windows 10 operating system will debut with an entirely new web browser code-named Spartan. This new browser is a departure from Internet Explorer, the Microsoft browser whose relevance has waned in recent years.

SSD capacity has always lagged well behind hard disk drives (hard disks are in 6TB and 8TB territory while SSDs are primarily 256GB to 512GB). Intel and Micron will try to kill the hard drive with new flash technologies. Intel announced it will begin offering 3D NAND drives in the second half of next year as part of its joint flash venture with Micron. Later (in the next two years) Intel promises 10TB+ SSDs thanks to 3D Vertical NAND flash memory. Interfaces to SSDs are also evolving beyond traditional hard disk interfaces. PCIe flash and NVDIMMs will make their way into shared storage devices more in 2015. The ULLtraDIMM™ SSD connects flash storage to the memory channel via standard DIMM slots in order to close the gap between storage devices and system memory (less than five microseconds of write latency at the DIMM level).

Hard disks will still be made in large numbers in 2015. It seems that NAND is not taking over the data centre immediately. The big problem is $/GB. Estimates of shipped disk and SSD capacity out to 2018 show disk growing faster than flash. The world’s ability to make and ship SSDs is falling behind its ability to make and ship disk drives – for SSD capacity to match disk by 2018, we would need roughly eight times more flash foundry capacity than we have. New disk technologies such as shingling, TDMR and HAMR are upping areal density per platter and bringing down cost/GB faster than NAND technology can. At present, solid-state drives with extreme capacities are very expensive. I expect that in 2015 prices for SSDs will still be so much higher than hard disks that everybody who needs to store large amounts of data will want to consider SSD + hard disk hybrid storage systems.
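
To see why $/GB pushes people toward hybrid setups, here is a back-of-the-envelope calculation (the prices and capacities below are illustrative assumptions, not quotes from any vendor):

```python
# Rough $/GB comparison for hybrid storage sizing.
def cost_per_gb(price_usd, capacity_gb):
    return price_usd / capacity_gb

ssd = cost_per_gb(250.0, 512)      # assumed 512 GB SSD at $250
hdd = cost_per_gb(160.0, 4000)     # assumed 4 TB hard disk at $160

# A hybrid pool: small SSD tier for hot data, big disk tier for the rest.
hot_gb, cold_gb = 512, 8000
hybrid_cost = hot_gb * ssd + cold_gb * hdd
all_ssd_cost = (hot_gb + cold_gb) * ssd

print(f"SSD: ${ssd:.3f}/GB, HDD: ${hdd:.3f}/GB")
print(f"hybrid: ${hybrid_cost:.0f} vs all-SSD: ${all_ssd_cost:.0f}")
```

With numbers anywhere near these, an all-SSD pool costs several times more than a hybrid one, which is why hybrid storage looks attractive for large data sets.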

PC sales, and even laptop sales, are down, and manufacturers are pulling out of the market. The future is all about the device. We have entered the post-PC era so deeply that even the tablet market seems to be saturating, as most people who want one already have one. The crazy years of huge tablet sales growth are over. Tablet shipment growth in 2014 was already quite low (7.2%, to 235.7M units). There are no great reasons for growth or decline to be seen in the tablet market in 2015, so I expect it to be stable. IDC expects that the iPad will see its first-ever decline, and I expect that too, because the market seems to be increasingly taken by Android tablets that have turned out to be “good enough”. Wearables, Bitcoin or messaging may underpin the next consumer computing epoch, after the PC, internet, and mobile.

There will be new tiny PC form factors coming. Intel is shrinking PCs to thumb-sized “compute sticks” that will be out next year. The stick will plug into the back of a smart TV or monitor “and bring intelligence to that”. It has been likened to similar thumb PCs that plug into an HDMI port and are offered by PC makers with the Android OS and ARM processors (for example the Wyse Cloud Connect and many cheap Android sticks). Such devices typically don’t have internal storage, but can be used to access files and services in the cloud. Intel expects the stick-sized PC market to grow to tens of millions of devices.

We have entered the post-Microsoft, post-PC programming era: the portable REVOLUTION. Tablets and smartphones are fine for consuming information: a great way to browse the web, check email, stay in touch with friends, and so on. But what does a post-PC world mean for creating things? If you’re writing platform-specific mobile apps in Objective-C or Java then no, the iPad alone is not going to cut it. You’ll need some kind of iPad-to-server setup in which your iPad becomes a mythical thin client for the development environment running on your PC or in the cloud. If, however, you’re working with scripting languages (such as Python and Ruby) or building web-based applications, the iPad or another tablet could be a usable development environment. At least it is worth testing.
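
As a small illustration of the web-based workflow described above, here is a minimal sketch (the function names and port are my own assumptions) of a tool built with nothing but the Python standard library; once it runs on a server, any tablet browser can use it:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

def render_page(title):
    """Build the HTML that a tablet's browser will display."""
    return (f"<html><head><title>{title}</title></head>"
            f"<body><h1>{title}</h1></body></html>")

class ToolHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = render_page("Edited from a tablet").encode("utf-8")
        self.send_response(200)
        self.send_header("Content-Type", "text/html; charset=utf-8")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

def serve(port=8000):
    # Bind to all interfaces so a tablet on the same network can reach it.
    HTTPServer(("", port), ToolHandler).serve_forever()

# serve()  # uncomment to run, then browse to http://<server>:8000/ from the tablet
```

The point is that the whole "IDE" is a browser tab; the PC or cloud box only has to run the script.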

You need to prepare to learn new languages that are good for specific tasks. Attack of the one-letter programming languages: from D to R, these lesser-known languages tackle specific problems in ways worthy of a cult following. Watch out! The coder in the next cubicle might have been bitten and infected with a crazy-eyed obsession with a programming language that is not Java and goes by a mysterious one-letter name. Each offers compelling ideas that could do the trick in solving a particular problem you need fixed.

HTML5’s “Dirty Little Secret”: It’s Already Everywhere, Even In Mobile. Just look under the hood. “The dirty little secret of native [app] development is that huge swaths of the UIs we interact with every day are powered by Web technologies under the hood.” When people say Web technology lags behind native development, what they’re really talking about is the distribution model. It’s not that the pace of innovation on the Web is slower; it’s just solving a problem that is an order of magnitude more challenging than how to build and distribute trusted apps for a single platform. Efforts like the Extensible Web Manifesto have been largely successful at overhauling the historically glacial pace of standardization. Vine is a great example of a modern JavaScript app. It’s lightning fast on desktop and on mobile, and shares the same codebase for ease of maintenance.

Docker, meet hype. Hype, meet Docker. Docker: sorry, you’re just going to have to learn about it. Containers aren’t a new idea, and Docker isn’t remotely the only company working on productising containers. It is, however, the one that has captured hearts and minds. Docker containers are supported by very many Linux systems. And it is not just Linux anymore, as Docker’s app containers are coming to Windows Server, says Microsoft. What containerization lets you do is launch multiple applications that share the same OS kernel and other system resources but otherwise act as though they’re running on separate machines. Each is sandboxed off from the others so that they can’t interfere with each other. What Docker brings to the table is an easy way to package, distribute, deploy, and manage containerized applications.

Domestic software is on the rise in China. China is planning to purge foreign technology and replace it with homegrown suppliers: China is aiming to purge most foreign technology from banks, the military, state-owned enterprises and key government agencies by 2020, stepping up efforts to shift to Chinese suppliers, according to people familiar with the effort. In tests, workers have replaced Microsoft Corp.’s Windows with a homegrown operating system called NeoKylin (a FreeBSD-based desktop OS). Dell commercial PCs will preinstall NeoKylin in China. The plan is driven by national security concerns and marks an increasingly determined move away from foreign suppliers. There are cases of replacing foreign products at all layers, from applications and middleware down to infrastructure software and hardware. Foreign suppliers may be able to avoid replacement if they share their core technology or give China’s security inspectors access to their products. The campaign could have lasting consequences for U.S. companies including Cisco Systems Inc. (CSCO), International Business Machines Corp. (IBM), Intel Corp. (INTC) and Hewlett-Packard Co. A key government motivation is to bring China up from low-end manufacturing to the high end.

 

Data center markets will grow. MarketsandMarkets forecasts the data center rack server market to grow from $22.01 billion in 2014 to $40.25 billion by 2019, at a compound annual growth rate (CAGR) of 7.17%. North America (NA) is expected to be the largest region for the market’s growth in terms of revenues generated, but Asia-Pacific (APAC) is also expected to emerge as a high-growth market.

The rising need for virtualized data centers and incessantly increasing data traffic are considered strong drivers for the global data center automation market. The SDDC comprises software-defined storage (SDS), software-defined networking (SDN) and software-defined server/compute, wherein all three components are empowered by specialized controllers, which abstract the control plane from the underlying physical equipment. These controllers virtualize the network, server and storage capabilities of a data center, thereby giving better visibility into data traffic routing and server utilization.
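
As a toy illustration of that controller idea (the class and method names are my own invention, not any real SDN API), the controller makes decisions centrally and pushes the results down to the devices it manages:

```python
# Toy software-defined controller: the controller holds the desired
# state centrally; devices only apply what they are told.
class Device:
    def __init__(self, name):
        self.name = name
        self.rules = []          # data plane: rules pushed by the controller

    def apply(self, rule):
        self.rules.append(rule)

class Controller:
    """Central control plane: decides globally, then programs each device."""
    def __init__(self):
        self.devices = {}

    def register(self, device):
        self.devices[device.name] = device

    def program_path(self, src, dst, via):
        # One global decision fans out to every hop on the path.
        for hop in via:
            self.devices[hop].apply((src, dst))

ctrl = Controller()
for n in ("leaf1", "spine1", "leaf2"):
    ctrl.register(Device(n))
ctrl.program_path("10.0.0.1", "10.0.1.9", via=["leaf1", "spine1", "leaf2"])
```

Because all forwarding state flows through one place, the controller can see (and change) traffic routing and utilization across the whole data center, which is the visibility benefit the paragraph describes.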

New software-defined networking apps will be delivered in 2015. And so will software-defined storage. And software-defined almost anything (I am waiting for the day we see software-defined software). Customers are ready to move away from vendor-driven proprietary systems that are overly complex and impede their ability to rapidly respond to changing business requirements.

Large data center operators will be using more and more of their own custom hardware instead of standard PCs from traditional computer manufacturers. Intel is betting on (customized) commodity chips for cloud computing and expects that over half the chips it will sell to public clouds in 2015 will have custom designs. The biggest public clouds (Amazon Web Services, Google Compute, Microsoft Azure), other big players (like Facebook or China’s Baidu) and other public clouds (like Twitter and eBay) all have huge data centers that they want to run optimally. Companies like AWS “are running a million servers, so floor space, power, cooling, people — you want to optimize everything”. That is why they want specialized chips. Customers are willing to pay a little more for a special run of chips. While most of Intel’s chips still go into PCs, about one-quarter of Intel’s revenue, and a much bigger share of its profits, come from semiconductors for data centers. In the first nine months of 2014, the average selling price of PC chips fell 4 percent, but the average price of data center chips was up 10 percent.

We have seen GPU acceleration taken into wider use. Special servers and supercomputer systems have long been accelerated by moving calculations to graphics processors. The next step in acceleration will be adding FPGAs to accelerate x86 servers. FPGAs provide a unique combination of highly parallel custom computation, relatively low manufacturing/engineering costs, and low power requirements. FPGA circuits may provide a lot more performance at much lower power consumption, but traditionally programming them has been time-consuming. This can change with the introduction of new tools (the next step from techniques learned from GPU acceleration). Xilinx has developed SDAccel tools to develop algorithms in C, C++ and OpenCL and translate them to FPGAs easily. IBM and Xilinx have already demoed FPGA-accelerated systems. Microsoft is also doing research on accelerating applications with FPGAs.


If there is one enduring trend in memory design from 2014 that will carry through to next year, it’s the continued demand for higher performance. The trend toward high performance is never going away. At the same time, the goal is to keep costs down, especially when it comes to consumer applications using DDR4 and mobile devices using LPDDR4. LPDDR4 will gain a strong foothold in 2015, and not just to address mobile computing demands. The reality is that LPDDR3, or even DDR3 for that matter, will be around for the foreseeable future (as the lowest-cost DRAM, whatever that may be). Designers are looking for subsystems that can easily accommodate DDR3 in the immediate future, but will also be able to support DDR4 when it becomes cost-effective or makes more sense.

Universal memory for instant-on computing will be talked about. New memory technologies promise to be strong contenders for replacing the entire memory hierarchy for instant-on operation in computers. HP is working on memristor memories that are promised to be akin to RAM but able to hold data without power. The memristor is also denser than DRAM, the current RAM technology used for main memory. According to HP, it is 64 to 128 times denser, in fact. You could very well have 512 GB of memristor RAM in the near future. HP has what it calls “The Machine”, practically a researcher’s plaything for experimenting with emerging computer technologies. Hewlett-Packard’s ambitious plan to reinvent computing will begin with the release of a prototype operating system in 2015 (Linux++, in June 2015). HP must still make significant progress in both software and hardware to make its new computer a reality. A working prototype of The Machine should be ready by 2016.
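
The 512 GB figure follows directly from the claimed density multiples, assuming (my assumption, for illustration) an 8 GB DRAM module as the baseline:

```python
# If memristor memory is 64x to 128x denser than DRAM, a module that
# holds 8 GB as DRAM could hold 512 GB to 1 TB as memristor memory.
dram_module_gb = 8            # assumed baseline DRAM module size
for multiple in (64, 128):
    print(f"{multiple}x denser -> {dram_module_gb * multiple} GB per module")
```
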

Chip designs that enable everything from a 6 Gbit/s smartphone interface to the world’s smallest SRAM cell will be described at the International Solid State Circuits Conference (ISSCC) in February 2015. Intel will describe a Xeon processor packing 5.56 billion transistors, and AMD will disclose an integrated processor sporting a new x86 core, according to a just-released preview of the event. The annual ISSCC covers the waterfront of chip designs that enable faster speeds, longer battery life, more performance, more memory, and interesting new capabilities. There will be many presentations on first designs made in 16 and 14 nm FinFET processes at IBM, Samsung, and TSMC.

 

1,403 Comments

  1. Tomi Engdahl says:

    US Tech Companies Expected To Lose More Than $35 Billion Over NSA Spying
    http://yro.slashdot.org/story/15/06/09/1235221/us-tech-companies-expected-to-lose-more-than-35-billion-over-nsa-spying

    Citing significant sales hits taken by big American firms like Apple, Intel, Microsoft, Cisco, Salesforce, Qualcomm, IBM, and Hewlett-Packard, a new report says losses by U.S. tech companies as a result of NSA spying and Snowden’s whistleblowing “will likely far exceed” $35 billion

    U.S. tech companies expected to lose more than $35 billion due to NSA spying
    http://www.dailydot.com/politics/nsa-prism-fallout-35-billion-us-tech-firms/

    U.S. companies will likely lose more than $35 billion in foreign business as a result of the vast NSA-surveillance operations revealed by Edward Snowden, according to a new report from the Information Technology and Innovation Foundation (ITIF).

    “Foreign customers are shunning U.S. companies,” the report asserts, causing American businesses to lose out on foreign contracts and pushing other countries to create protectionist policies that block American businesses out of foreign markets.

    ITIF, a nonpartisan Washington, D.C.-based technology think tank founded by members of Congress, first estimated in 2013 that American losses as a result of the National Security Agency’s PRISM program, which centers on the collection of Internet communications from major American technology firms, would tally between $21.5 billion and $35 billion, with the U.S. cloud-computing industry bearing the brunt of the fallout.

    The actual losses “will likely far exceed $35 billion,” according to the ITIF report, because the entire American tech industry has performed worse than expected as a result of the Snowden leaks.

    The massive financial hit is likely one key reason leading major American tech firms, like Apple and Google, to not only include strong encryption in their smartphones, tablets, and services, but to also publicly oppose the outlawing of strong encryption by law-enforcement authorities like James Comey, director of the Federal Bureau of Investigation, and Manhattan District Attorney Cyrus Vance, Jr.

    Since the first Snowden leaks became public in 2013, foreign businesses and civilians around the world have repeatedly said in polls that American surveillance will cause them to abandon (or at least be extremely wary of) American tech products. U.S.-based companies, including Apple, Intel, Microsoft, Cisco, Salesforce, Qualcomm, IBM, and Hewlett-Packard, have reportedly suffered sales hits in Asia, Europe, and North America as a result of blowback against NSA spying.

    In particular, European cloud companies, like Cloudwatt, Hortnetsecurity, and F-Secure, proudly boast of their non-American credentials and their resistance to NSA spying against foreigners. And the French government has invested $150 million into two cloud startups designed to keep data out of U.S. hands.

  2. Tomi Engdahl says:

    How Today’s Low-Power X86 & ARM CPUs Compare To Intel’s Old NetBurst CPUs
    http://hardware.slashdot.org/story/15/06/09/020212/how-todays-low-power-x86-arm-cpus-compare-to-intels-old-netburst-cpus

    Phoronix celebrated their 11th birthday by comparing modern CPUs to old Socket 478 CPUs with the NetBurst Celeron and Pentium 4C on an Intel 875P+ICH5R motherboard. These old NetBurst processors were compared to modern Core and Atom processors from Haswell, Broadwell, Bay Trail and other generations.

    Comparing Today’s Modern CPUs To Intel’s Socket 478 Celeron & Pentium 4
    http://www.phoronix.com/scan.php?page=article&item=intel-478-retro&num=1

    With these processors from ~2003, they were compared to a variety of modern systems ranging from Intel Haswell Xeons and Core i7 EE systems to the low-power Intel Compute Stick to Intel Bay Trail and Broadwell NUCs to other x86 boxes. For making it really interesting, it was also compared to the NVIDIA Jetson TK1 with Tegra K1 quad-core Cortex-A15 SoC as well as a Freescale i.MX6 ARM board.

  3. Tomi Engdahl says:

    News & Analysis
    iOS 9, Mac OS X El Capitan Dominate WWDC 2015
    http://www.eetimes.com/document.asp?doc_id=1326819&

    The keynote presentations at this year’s Apple Worldwide Developers Conference (WWDC) were lengthy, but that was expected given the number of updates and services predicted to debut during the event.

    As predicted, many of the system upgrades involved stability and performance fixes, in addition to updates for many native apps. Here is a snapshot of what we learned today:

    OS X: El Capitan
    New Mac system El Capitan brings refinement and advancement to predecessor Yosemite. Apple is focusing on performance with this update and promises greater speed for opening and switching applications.

    iOS 9
    We also learned more details about iOS 9, which will be available on all devices that support iOS 8. Apple’s newest mobile OS is now available to all developers with a public beta version promised for July. When the full upgrade is available, you’ll only need 1.3GB of free space to download iOS 9.

  4. Tomi Engdahl says:

    Stress Is Driving Developers From the Video Game Industry
    http://games.slashdot.org/story/15/06/09/1529224/stress-is-driving-developers-from-the-video-game-industry

    For video game developers, life can be tough. The working hours are long, with vicious bursts of so-called “crunch time,” in which developers may pull consecutive all-nighters in order to finish a project—all without overtime pay.

    Faced with what many perceive as draconian working conditions, many developers are taking their skills and leaving video games for another technology sector.

    Is Stress Driving Tech Pros From Video Games?
    http://insights.dice.com/2015/06/09/is-stress-driving-tech-pros-from-video-games/

    For video game developers, life can be tough. The working hours are long, with vicious bursts of so-called “crunch time,” in which developers may pull consecutive all-nighters in order to finish a project—all without overtime pay.

    According to the International Game Developers Association (IGDA) Developer Satisfaction Survey (PDF), many developers aren’t enduring those work conditions for the money: Nearly 50 percent of respondents earned less than $50,000 annually. Average time spent in the game industry was nine years, during which most respondents had worked on 16 different projects. Company failures are common, if not routine.

    Layoffs are common, Blomquist added, with companies often letting people go once a project is near completion. It’s a depressing pattern for developers who endure intense production schedules to complete games, and one that’s received increasing attention from pundits and analysts.

    “The oft-used practice of quickly building up development teams and then laying off the majority when the project is complete isn’t conducive to a positive quality of life for individual game developers,” Kate Edwards, executive director of the IGDA, admitted, “even if the overall industry continues to see growth.”

    Faced with what many perceive as draconian working conditions, many developers are taking their skills and leaving video games for another technology sector. As Edwards noted, “People who opt to leave the game industry have a lot of options where their skills can be utilized. Programmers and software engineers easily find jobs at a wide range of IT-related companies.”

  5. Tomi Engdahl says:

    Matthew Lynley / TechCrunch:
    Cloud-based call-center tech provider Talkdesk raises $15M Series A — Talkdesk, A Startup That Spins Up Call Centers, Raises $15M — Talkdesk, which creates software that helps companies create digital call centers that integrate with their other internal company software, said it has raised $15 million.

    Talkdesk, A Startup That Spins Up Call Centers, Raises $15M
    http://techcrunch.com/2015/06/09/talkdesk-a-startup-that-spins-up-call-centers-raises-15m/

    Talkdesk, which creates software that helps companies create digital call centers that integrate with their other internal company software, said it has raised $15 million.

    Companies use Talkdesk to create centers where customers can call into in order for assistance and feedback. Those calls are connected to internal company databases like Salesforce and Zendesk, which bring up all the relevant information about a caller in order to help a customer service representative better resolve the issue or route that customer to the best person.

    For example, when someone calls customer service and the call is routed through Talkdesk, the software automatically puts information about the caller in front of the customer service representative. Examples of that would be order details, how much that person usually spends, their name, or other bits of information that are useful to representatives that help them more quickly resolve an issue.

    Reply
  6. Tomi Engdahl says:

    Puppet Enterprise 3.8 is Now Available
    https://puppetlabs.com/blog/puppet-enterprise-3.8-now-available?ls=content-syndication&ccn=Techmeme-20150520&cid=701G0000000F68e

    Puppet Enterprise 3.8 is here! The release includes powerful new provisioning capabilities for Docker containers, AWS infrastructure and bare-metal environments. In addition, we’ve introduced Puppet Code Manager, a new app for Puppet Enterprise to accelerate deployment of infrastructure changes in a more programmatic and testable fashion.

    Puppet Enterprise 3.8 introduces major enhancements to the Puppet Node Manager app first introduced with version 3.7. New capabilities help significantly accelerate provisioning time across heterogeneous environments and get you to Day 2 operations faster.

    Containers

    A new Puppet Supported module for Docker helps you easily launch and manage Docker containers within your Puppet-managed infrastructure. These new capabilities help avoid configuration issues with the Docker daemon running containers, so teams can spend less time troubleshooting configuration issues, and more time helping to develop and deploy great applications.
    Cloud environments

    Our new Amazon Web Services module makes it easy to provision, configure and manage AWS resources, including EC2, Elastic Load Balancing, Auto Scaling, Virtual Private Cloud, Security Groups and Route53.

    Reply
  7. Tomi Engdahl says:

    Multicore Powering Next-Gen Industrial Servers
    http://www.eetimes.com/author.asp?section_id=36&doc_id=1326822&

    Many-core processors powering industrial servers for control applications are extending the impact of Moore’s Law into the future.

    The hallmark of PC control has always been the benefits it can pass on to machine control applications through increases in PC performance, along with steadily falling component costs. In recent years, higher performance has been achieved using multi-core processors.

    But now, as a next step in the development of the technology, a new type of many-core industrial server offers both a higher number of processors and more cores per board. Current configurations for the industrial control market are available with up to 24 cores, and also provide both a much larger cache and higher clock rates.

    A key is that many-core industrial servers can provide the compute power to implement centralized control system architectures that are capable of controlling complex machinery and equipment from a central location.

    Reply
  8. Tomi Engdahl says:

    Smart Glasses Strap On AR
    http://www.eetimes.com/document.asp?doc_id=1326828&

    Smart glasses were all the rage at the sixth annual Augmented World Expo (AWE) here. In talks and on the expo floor, established companies and startups demonstrated AR designed for consumers and industrial segments.

    “From nurses with X-ray vision to technicians that can teleport, technology is bringing superpowers to the people,” said Ori Inbar, CEO and co-founder of AugmentedReality.org, the producing organization behind AWE. “Augmented and virtual reality, wearables and IoT are enabling us to be better at anything we do in work and life.”

    “As augmented reality and virtual reality technologies are receiving more attention in the enterprise space, smart glasses are becoming more than a tiny monocular display and are transforming into an immersive 3D experience,”

    Reply
  9. Tomi Engdahl says:

    Shine a light on the rogue IT that hides in the company shadows
    You might even find some gems
    http://www.theregister.co.uk/2015/06/10/shine_a_light_on_the_it_that_hides_in_your_companys_shadows/

    The rapid development of technology over the years has brought us a culture where people use technology wherever they are and whatever they are doing.

    Because most people could not afford their own computers, they relied on their employers to provide the technology.

    If you were the employer, this brought a useful benefit: you had control over the applications people used because you were providing them. Defining and implementing standards was easy: the holder of the purse strings held all the cards.

    Those days are gone, of course. We now carry around in our pockets and briefcases more processing power than anyone could have dreamt of then. And with high-speed mobile data services and open-source software you can do a ridiculous amount of processing on the move using phones, tablets and laptops.

    In fact if you are so inclined you can run most aspects of a small business using just a biggish smartphone (as I discovered when I was asked whether it was possible).

    Such progress, however, brings a problem we call “shadow IT”.

    Techopedia describes shadow IT as “IT solutions and systems created and applied inside companies and organisations without their authorisation”. Analyst Gartner’s website puts it more simply: “IT activity that occurs outside of IT”.

    Listen and learn

    It is particularly common to see people adopting their own range of mobile apps, for a couple of reasons.

    First, even if you have a policy of providing company-owned mobile handsets, it is unusual for every member of staff to be entitled to one. Those that aren’t tend to use their own handsets and hook them up to a cloud storage service like Google Drive to share documents. And if you are daft enough to allow them to sync directly with your company mail server they will use a variety of email applications that you don’t support.

    Second, even if you do provide company devices it is common to support only the core applications – email, calendaring and not a lot more – so users fill the gaps with their own selections.

    In an ideal world you would completely eradicate shadow IT. Dealing with the issues of departments trying to exchange documents in a variety of different formats can be a support headache.

    There are five core considerations you need to take into account to address shadow IT.

    1. Sort out your IT department’s GUI
    2. Have an IT department that works
    3. Put in policies and regulation
    4. Corral the mobile apps
    5. Work within the company’s goals

    Sometimes people simply have to be told: “Stop doing that, it is less important than X”.

    Reply
  10. Tomi Engdahl says:

    Reactions To Apple’s Plans To Open Source Swift
    http://apple.slashdot.org/story/15/06/10/0310248/reactions-to-apples-plans-to-open-source-swift

    At Apple’s WWDC 2015 event yesterday, Craig Federighi, Apple’s senior vice president of software engineering, announced that the company planned to open source the Swift language. Reaction to this announcement so far has sounded more or less like this: Deafening applause with undertones of “we’ll see.”

    “Programming languages are glue for SDKs, APIs and libraries. The real value of Swift will be whether it can realistically be used anywhere but Apple’s walled garden.”

    Reply
  11. Tomi Engdahl says:

    Open Sourcing Is No Longer Optional, Not Even for Apple
    http://www.wired.com/2015/06/open-sourcing-no-longer-optional-not-even-apple/

    The biggest round of applause at Apple’s Worldwide Developers Conference keynote yesterday didn’t come when the company announced new versions of iOS and OS X, or even the new Apple Music service. It came when Apple’s vice president of engineering Craig Federighi announced that the company will open source the next version of its programming language Swift.

    Why the excitement? Developers have demonstrated a growing preference for open source tools and platforms over the past 15 years. Apple, meanwhile, has pushed iOS developers towards its own in-house development technologies and away from third-party tools, such as Adobe Flash, that it deems inefficient. But even Apple can only risk alienating the developers on whom it relies for so many third-party apps and services so far. Coders have myriad options available to let them do their jobs the way they want; to keep them in-house, it turns out, Apple has to open up.

    To be sure, Swift is already growing like mad. But many other new programming languages have been created in recent years that are vying for devs’ attention. Facebook is experimenting with Hack and D; Google open sourced its Go language; and Mozilla just released the first full version of its language Rust. Each of these languages has its strengths and weaknesses and one or more of them could become the next de facto standard for software development. And each one is open source.

    Most significant, however, was Microsoft’s decision last year to open source its .NET framework. In doing so, Microsoft gave its official stamp of approval to all developers using its languages outside of the Windows ecosystem

    Faced with the prospect of developers using Microsoft tools to develop apps for Apple products, Apple really had no other choice but to make Swift equally dev-friendly.

    A company called Xamarin has long offered tools that allowed developers to use Microsoft’s languages to build software that could run on Windows, Linux, iOS, Android and more.

    That meant developers didn’t have to write apps in both Java, to target Android, and Objective C, to target iOS. They could even use the same code as the basis for desktop and server side applications as well.

    Although most iOS and OS X developers still use Objective C or Swift, Apple could be trying to head Microsoft off at the pass by making Swift available on other operating systems.

    How Open Is Open?

    In a blog post, Apple writes that its open source release will include core parts of the Swift ecosystem — including the compiler and standard library — under a standard license, though details remain sparse as to how open “open” will really be.

    Swift 2.0
    https://developer.apple.com/swift/blog/?id=29

    Reply
  12. Tomi Engdahl says:

    Apple’s new Xcode lets anyone sideload apps and emulators onto their iOS device
    http://www.pocketgamer.co.uk/r/iPhone/Emulation/news.asp?c=65781

    The new version of Xcode – the program that developers use to create apps and games for iOS – will let you send apps to your iOS device, even if they haven’t been approved by Apple.

    This means that you will be able to run apps that wouldn’t make it onto the App Store, like emulators, torrent clients, banned games, and other unavailable software.

    On the website for Xcode 7, Apple says it’s now “easier for everyone to build apps and run them directly on their Apple devices. Program membership is not required.”

    The feature is intended to let developers start building and testing apps before committing to the developer license

    https://developer.apple.com/xcode/downloads/

    Reply
  13. Tomi Engdahl says:

    USB is a universal interface found in almost all devices. However, the bus is no longer sufficient for every application — for example, video transfer from security cameras. HD and UHD (ultra-high-definition) video produce such a massive data stream that old USB 2.0 cannot keep up. Fortunately, USB 3.0 is arriving.

    A five-megapixel camera at 24 frames per second produces a 2.4 Gbps data stream. Older serial buses such as USB 2.0 and 802.11n WiFi simply cannot keep up with that camera data.

    For example, the actual usable bandwidth of USB 2.0 is only 280-320 megabits per second (35 to 40 megabytes per second), and most devices reach only 200-240 Mbps. Sustained USB 2.0 throughput depends greatly on the software driver and platform optimization.

    USB 3.0 is a very effective choice for developers of today’s embedded systems. Its dual-bus architecture allows communication with older devices, while the SuperSpeed bus offers data transfer rates of up to five gigabits per second.
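    The arithmetic behind those figures can be checked quickly. The 20 bits per pixel below is my assumption, chosen because it reproduces the quoted 2.4 Gbps; the rest of the numbers come from the text above.

```python
# Back-of-the-envelope check of the camera data rate quoted above.
pixels = 5e6          # five-megapixel sensor
fps = 24              # frames per second
bits_per_pixel = 20   # assumption: chosen to match the quoted 2.4 Gbps

camera_gbps = pixels * fps * bits_per_pixel / 1e9
usb2_effective_gbps = 0.32   # upper end of the 280-320 Mbps range above
usb3_signaling_gbps = 5.0    # SuperSpeed signaling rate

print(f"camera stream: {camera_gbps:.1f} Gbps")                        # 2.4 Gbps
print(f"USB 2.0 shortfall: {camera_gbps / usb2_effective_gbps:.1f}x")  # 7.5x
print(f"fits in USB 3.0? {camera_gbps < usb3_signaling_gbps}")         # True
```

    Even against USB 2.0's best-case effective throughput the camera stream is several times too fast, which is why the article calls USB 3.0 necessary rather than merely nice to have.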

    The USB Type-C connector will spread quickly, for example to laptops. Cypress Semiconductor has now introduced a complete development kit that lets equipment manufacturers build USB Type-C interface support for all the different display types.

    The CY4501 kit lets you develop a connection between a laptop and a box equipped with an HDMI, DVI or even VGA connector. The kit includes the necessary circuitry and software.

    The platform also supports the new USB Billboard device class, which allows a device with a USB Type-C output to communicate with a device that does not support the DisplayPort bus. DisplayPort is one of the alternate modes supported by the Cypress driver circuit. Such a connection needs only an HDMI/DVI/VGA protocol-conversion circuit, such as the one offered by MegaChips.

    Sources:
    http://etn.fi/index.php?option=com_content&view=article&id=2958:vanha-usb-ei-riita-videon-siirtoon&catid=13&Itemid=101
    http://etn.fi/index.php?option=com_content&view=article&id=2950:usb-c-liitanta-kaikkiin-nayttoihin&catid=13&Itemid=101

    Reply
  14. Tomi Engdahl says:

    Intel Memo on Layoffs Reported
    $300M savings sought amid PC declines
    http://www.eetimes.com/document.asp?doc_id=1326837&

    Intel may cut an untold number of research and administration positions, according to an internal memo obtained by The Oregonian/OregonLive. Following a reduced revenue outlook for 2015, the chip company could aim to cut its budget by $300 million.

    The memo was distributed to Intel managers and stated that the company will “prioritize investments, slow down hiring, and drive other efficiencies, including a performance-based involuntary separation package.” Cuts will begin June 15 and conclude one month later for an unspecified number of people, the Oregonian reported. Intel has roughly 106,000 employees worldwide.

    Intel declined to comment on the layoffs.

    Intel had projected 5% growth in 2015 but realigned that forecast in April to expect a flat revenue year. Market watchers say the PC market’s decline is continuing, and Intel’s spending has been 6-10% ahead of its model for more than a year.

    Reply
  15. Tomi Engdahl says:

    Women are fleeing from the digital sector, reckons UK.gov report
    Could it be all the dick-swinging bearded hipsters?
    http://www.theregister.co.uk/2015/06/11/women_dont_want_to_work_in_digital_report/

    Fewer women are working in the digital sector than a decade ago, according to a report by the UK Commission for Employment and Skills quango.

    Currently just one quarter of employees are women in digital businesses such as software development, visual effects and computer games – down from one-third in 2002 – according to the report.

    The report reckons two million people are employed in digital and creative industries, with the sector worth £137bn to the UK annually.

    It claimed a further 1.2m people will be needed to fill jobs in the sector by 2022.

    “Initiatives like TechFuture Girls that encourage young women to consider a career in this dynamic sector will play an important role in addressing this issue.”

    Reply
  16. Tomi Engdahl says:

    Video game console maker Ouya is in negotiations to sell itself, possibly to Razer
    http://www.cnet.com/news/video-game-console-maker-ouya-is-in-negotiations-to-sell-itself-possibly-to-razer/

    The company, whose app store was pitched as the go-to place for small and independent developers, is negotiating for employees to remain with the company after the sale.

    The possible sale marks a dramatic turn for Ouya, a once-high flying startup, whose video game console and app store were popular with small developers. While the company and its service will likely continue to run under Razer’s ownership, Ouya’s efforts to sell itself underscores the challenges of competing in the video game industry, which is fueled primarily by blockbuster games bought by often finicky customers.

    In many ways Ouya came to the market too early. In the three years since its debut, Internet-connected TV boxes have become a popular hobby of Silicon Valley, which has introduced devices like Google’s Android TV software and Chromecast streaming media stick, Amazon’s Fire TV and Roku, which has sold 10 million units since going on sale in 2008. Apple’s set-top box, called Apple TV, has sold 25 million units since it went on sale in 2007.

    While these devices have begun to take pride of place in people’s living rooms, they haven’t matched the success of Microsoft’s Xbox, Sony’s PlayStation or Nintendo’s Wii. Part of the reason, analysts say, is a lack of entertainment apps, particularly games.

    The company announced its namesake video game console on the crowdfunding site Kickstarter in 2012, promising to bring the ease of making games for smartphones and tablets to the big screen.

    Reply
  17. Tomi Engdahl says:

    Josh Constine / TechCrunch:
    Oculus unveils its consumer Rift VR headset — Oculus Unveils Consumer Rift Headset With Wireless Xbox One Controller — This is no developer kit. Oculus today gave the world the first look at its consumer virtual reality headset. It ships with a small, table-top camera on a stand …

    Oculus Unveils Consumer Rift Headset With Xbox One Controller
    http://techcrunch.com/2015/06/11/oculus-rift-consumer/

    This is no developer kit. Oculus today gave the world the first look at its Rift consumer virtual reality headset which will ship with a wireless Xbox One controller. It also comes with a small, table-top camera on a stand that watches a constellation of LED markers on the Rift to track your head movement.

    The partnership with Microsoft will also see the Rift work “natively” with Windows 10, plus play Xbox One games in the headset.

    The Rift is light enough to hold with one hand, and its black matte fabric-wrapped frame houses two OLED screens and integrated, removable headphones. It’s designed to allow people to wear glasses, and the part that touches your face can be replaced. An adjustable slider lets you change the distance between the eye lenses for faces of different sizes.

    Reply
  18. Tomi Engdahl says:

    How to find the perfect project manager
    http://www.cio.com/article/2933191/project-management/how-to-find-the-perfect-project-manager.html

    The prototypical project manager needs financial, scheduling and management skills to keep projects on time and on budget. They also must communicate effectively to diverse business and technical teams — and that’s just the beginning. Here’s how to target the elusive perfect project manager.

    It’s very hard to teach a non-communicator to communicate but it’s easy to teach them how to do an ROI or do something that’s more of a traditional project management item.

    “Companies have all realized there is power in technology, that it gives you a competitive advantage, it helps you be faster and more effective. And companies realize that the investment is significant around these projects, and it’s really important that you see the return on your investment,” says John Reed, senior executive director of IT staffing firm Robert Half Technology. “The best way to do that is to have a project manager experienced in being able to manage from scoping to implementation. That’s why you see this increase in demand for skilled project managers.”

    Reed estimates that salaries range from around $75,000 for junior-level program managers up into the six figures for more seasoned professionals.

    Hot skills alert: Tips for landing a plum project manager’s job
    http://www.cio.com/article/2933485/it-skills/hot-skills-alert-tips-for-landing-a-plum-project-managers-job.html

    As companies continue to invest heavily in new projects, those IT staffers with project management skills are hot commodities. Here’s how to get started.

    Karen Klein had a typical entrance into the project management profession, evolving into the role after working her way up the IT ranks.

    “It took a while to figure out that I was managing projects, managing task lists, risks and scheduling,” she says.

    With that experience under her belt, Klein charged ahead. She attended seminars on project management, sought out mentors in the field and read as much as she could on the topic. She then moved into full-time project management jobs

    It’s a common pathway, she says. “In small to midsize companies, the lines tend to get blurred and anyone who has the skills to manage projects ends up managing them,” Klein explains.

    Researchers and IT leaders say companies are investing heavily in new projects as they work to stay competitive in the expanding economy and catch up on initiatives that were sidelined during the Great Recession.

    The strong demand for project managers isn’t a temporary blip.

    What’s needed: Cool head, soft voice

    A project manager’s job is challenging, with duties that require a mix of business, technical and leadership skills. Typical responsibilities include developing project charters, defining a project’s scope and outlining objectives. Project managers must develop and maintain schedules and ensure that milestones are met. They often have to calculate estimates of ROI, get approvals of business cases and craft persuasive justifications for projects.
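    The ROI estimates mentioned above usually reduce to simple arithmetic. A minimal sketch (the project figures are invented for illustration):

```python
def roi(benefit, cost):
    """Return on investment as a fraction of the project cost."""
    return (benefit - cost) / cost

# Hypothetical project: $250k cost, $400k in expected benefits.
print(f"{roi(400_000, 250_000):.0%}")   # 60%
```

    The hard part of the job is not the formula but defending the benefit estimate in the business case.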

    “It’s the type of role where you’re working across organizations,”

    Reply
  19. Tomi Engdahl says:

    Intel vs. ARM—Will the Bear Strike Back?
    http://intelligentsystemssource.com/intel-vs-arm-will-the-bear-strike-back/?utm_source=Pubpress&utm_medium=email&utm_term=Intel+vs.+ARM%E2%80%94Will+the+Bear+Strike+Back%3F+&utm_content=ISS+Enewsletter+June%3A+Intel+vs.+ARM%E2%80%94Will+the+Bear+Strike+Back%3F&utm_campaign=1506_ISS_enewsletter

    Intel must be feeling like a bear being circled by wolves. The wolves are named ARM and they are gradually tearing meaty chunks out of the x86 bear. But beware of getting a bear cornered. A cornered bear can lash out with some pretty mean claws as well. For some time there has been a battle going on in the embedded space between the Intel Atom family of x86 low-power processors and a variety of ARM architecture-based devices, which are increasingly centered on the ARM Cortex M and R series cores. Intel’s advantage has been that the embedded world has largely been derived from that of the PC—high-volume, low-cost devices as well as widely used peripheral technologies like USB and PCI and now PCI Express. ARM’s big selling point has been its low power consumption, which despite all its efforts, Intel has never been able to match. This difference is being increasingly exacerbated with the surge in mobile devices, wireless connectivity and the need to save power at every step. This and the flattening of the PC market has allowed ARM and its many licensees to make steady gains. Add to this mix, Intel’s old nemesis, AMD, which has successfully marketed lines of processors based on its clean room-developed version of the x86 instruction set. Now even AMD is starting to move into the ARM arena.

    It appears that so far Intel’s strategy has been to more aggressively and specifically target the embedded arena with such things as Atom and Core-based SoCs that incorporate a range of on-chip peripherals and increasingly to move into the server arena with powerful versions of its multi-core Xeon processors. Unfortunately, the lust for power-savings is also now rampant in server farms, which consume enormous amounts of electricity and dissipate ungodly amounts of heat—all of which can be visualized as dollar signs by their managers. And now ARM is targeting the server arena with a new line of 64-bit ARM cores and AMD, among others, have started offering processors targeting the server market based on these new cores.

    Another thing that Intel has to contend with is the emergence of heterogeneous system architectures (HSAs) that integrate such elements as graphic processors, DSPs and FPGAs on the same die

    Both Altera and its major rival Xilinx currently offer families of CPU/FPGA hybrid processors on which the CPU is an ARM device.

    No FPGA company will be able to license something like Atom or Core i5 IP from Intel to build a competing device. So if Intel does acquire Altera, you can expect it to, of course, continue to support and develop Altera’s technologies and serve its customers. There were predictions that after its acquisition of Wind River, it would drop lines that served other processors, but that did not happen. We can, however, expect to see a discontinuation of ARM-based SoC FPGAs in favor of a line of hybrids based on Atom and Core architectures.

    If Intel does acquire a major programmable logic house like Altera with its built-in expertise, it will be able to turn out hybrid processors with a huge range of specific (network server) functions as well as generally programmable and configurable devices.

    Reply
  20. Tomi Engdahl says:

    How Much Python Do You Need To Know To Be Useful?
    http://developers.slashdot.org/story/15/06/11/1825250/how-much-python-do-you-need-to-know-to-be-useful

    Since Python is a general-purpose language, it finds its way into a whole lot of different uses and industries. That means the industry in which you work has a way of determining what you actually need to know in terms of the language, as developer Jeff Cogswell explains in a new Dice piece.

    Learning Enough Python to Land a Job
    http://insights.dice.com/2015/06/11/learning-enough-python-to-land-a-job/?CMPID=AF_SD_UP_JS_AV_OG_DNA_

    If you want a job programming in Python, prepare to do a lot of work beforehand. The language is easy to pick up, but you need to do more than just learn the basics; to get a job, you need to have a strong understanding of some pretty complex processes.

    Python is a general-purpose language, which means it isn’t used for just one purpose such as Web development. Rather, it’s used in many different industries, and the industry in which you choose to work will determine how you actually learn the language.

    For example, if you’re hired to write apps that interact with operating systems and monitor devices, you might not need to know how to use the Python modules for scientific and numerical programming. In a similar fashion, if you’re hired to write Python code that interacts with a MySQL database, then you won’t need to master how it works with CouchDB.

    Therefore, I’m going to suggest that there are three levels to learning the basics of Python:

    Learn the core language itself, such as the syntax and basic types; learn the difference between Python 2 and Python 3.

    Learn the commonly used modules, and familiarize yourself with other modules.

    Learn the bigger picture of software development with Python, such as including Python in a build process, using the pip package manager, and so on. This involves learning about different databases and other technology, depending on where you want to work.
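    As a tiny example of the first level — the Python 2 vs Python 3 differences worth knowing — both printing and division changed between the versions:

```python
# Two classic differences between Python 2 and Python 3.

# 1. print is a function in Python 3 (it was a statement in Python 2).
print("hello")

# 2. / is true division in Python 3; the same expression gives 3 in Python 2.
print(7 / 2)    # 3.5
print(7 // 2)   # floor division: 3 in both versions
```

    Knowing which behavior your target codebase expects is exactly the kind of basics-level knowledge the article is describing.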

    Reply
  21. Tomi Engdahl says:

    PHP At 20: From Pet Project To Powerhouse
    http://developers.slashdot.org/story/15/06/11/1659201/php-at-20-from-pet-project-to-powerhouse

    Ben Ramsey provides a look at the rise of PHP, the one-time ‘silly little project’ that has transformed into a Web powerhouse, thanks to flexibility, pragmatism, and a vibrant community of Web devs.

    PHP at 20: From pet project to powerhouse
    http://www.infoworld.com/article/2933858/php/php-at-20-from-pet-project-to-powerhouse.html

    The one-time ‘silly little project’ has transformed into a Web powerhouse, thanks to flexibility, pragmatism, and a vibrant community of Web devs

    When Rasmus Lerdorf released “a set of small tight CGI binaries written in C,” he had no idea how much his creation would impact Web development. Delivering the opening keynote at this year’s SunshinePHP conference in Miami, Lerdorf quipped, “In 1995, I thought I had unleashed a C API upon the Web. Obviously, that’s not what happened, or we’d all be C programmers.”

    In fact, when Lerdorf released version 1.0 of Personal Home Page Tools — as PHP was then known — the Web was very young.

    Back then, our options were limited when it came to server-side processing for Web apps. PHP stepped in to fill our need for a tool that would enable us to do dynamic things on the Web. That practical flexibility captured our imaginations, and PHP has since grown up with the Web. Now powering more than 80 percent of the Web, PHP has matured into a scripting language that is especially suited to solve the Web problem. Its unique pedigree tells a story of pragmatism over theory and problem solving over purity.

    Reply
  22. Tomi Engdahl says:

    Enterprise mobility slowed by security concerns
    http://www.cio.com/article/2934333/mobile/enterprise-mobility-slowed-by-security-concerns.html

    While mobile technology continues to move forward in all parts of the business, security issues threaten to slow the progress, according to attendees at this week’s MobileIron’s user conference.

    On the upside, mobility in the enterprise has room to grow.

    Even Uber is making moves in the mobility market with its new business initiative.

    Will security concerns derail mobile?

    Most companies, though, continue to face new mobile security challenges. Security can derail their mobile plans — “without it, we risk too much,” say polled attendees. It’s a big reason why MobileIron announced a slew of features to protect mobile enterprise data wherever it lives, such as the app, network and cloud.

    In the network, for instance, MobileIron protects data with multi-OS app VPN. In the cloud, MobileIron gives companies control over content encryption keys.

    Meanwhile, employees are concerned that companies secretly look at their private data on mobile devices. There’s a lot of private data, too. A recent MobileIron survey of millennials found that 87 percent say their “mobile device never leaves their side, night or day.” At MobileIron’s conference, attendees cited the challenge of gaining employee buy-in through education and communication as having the greatest risk of failure.

    That’s why MobileIron announced a visual privacy tool that lets employees know exactly what data the company can see, what actions the company can take. The goal is to build trust between employer and employee.

    “CIOs need new security, but it hasn’t been defined yet. They’ll have to buy again, and buy differently.”

    Reply
  23. Tomi Engdahl says:

    Microsoft CEO, Satya Nadella, Heads Top Tech Leader Rankings 2015
    http://www.juniperresearch.com/press/press-releases/microsoft-ceo-heads-top-leader-rankings-2015

    Below is the Top 10 in full

    Satya Nadella – Microsoft CEO
    Jony Ive – Apple SVP Design
    Min-Liang Tan – Razer Co-Founder & CEO
    Travis Kalanick – Uber CEO
    Reed Hastings – Netflix Co-Founder & CEO
    Jack Ma – Alibaba Founder & Chairman
    Paul Eremenko – Google ATAP Director
    Jeff Bezos – Amazon Founder & CEO
    Elon Musk – Tesla Chairman & CEO
    Lei Jun – Xiaomi CEO

    Reply
  24. Tomi Engdahl says:

    Notepad++ leaves SourceForge
    https://notepad-plus-plus.org/news/notepad-plus-plus-leaves-sf.html

    SourceForge was a good place; unfortunately, sometimes good places don’t last.

    Recently SF hijacked its hosted projects to distribute their wrapped crapware:

    SourceForge grabs GIMP for Windows’ account, wraps installer in bundle-pushing adware
    Black “mirror”: SourceForge has now taken over Nmap audit tool project
    What happened to Sourceforge? The full story between VLC and Sourceforge

    Obviously, the paid component per installation system is one of their important income generating scams. I would be fine with that, if they were the actual owners of the legitimate software. The real problem is, they are polluting these open source software installations for the purpose of filling their pockets by this scam,

    Such a shameless policy should be condemned, and the Notepad++ project will move entirely out of SourceForge.

    https://github.com/notepad-plus-plus/notepad-plus-plus

    Reply
  25. Tomi Engdahl says:

    Julie Bort / Business Insider:
    How Facebook’s Open Compute Project became a major force in data center hardware, with hundreds of companies, including HP, Foxconn, and Goldman Sachs on board — How Facebook is eating the $140 billion hardware market — It started out as a controversial idea inside Facebook.

    How Facebook is eating the $140 billion hardware market
    http://uk.businessinsider.com/facebook-open-compute-project-history-2015-6?op=1?r=US

    It started out as a controversial idea inside Facebook. In four short years, it has turned the $141 billion data-center computer-hardware industry on its head.

    Facebook’s extraordinary Open Compute Project is doing for hardware what Linux, Android, and many other popular products did for software: making it free and “open source.”

    That means that anyone can look at, use, or modify the designs of the hugely expensive computers that big companies use to run their operations — all for free. Contract manufacturers are standing by to build custom designs and to build, in bulk, standard designs agreed upon by the group.

    In software, open source has been revolutionary and disruptive. That movement created Linux, which is the software running most data centers around the world, and Android, the most popular smartphone platform in the world.

    Jonathan Heiliger dreamed up OCP in 2011 back when he was leading Facebook’s infrastructure team

    It started off with Facebook’s data centers.

    Most companies lease space in already existing data centers. But for huge tech companies like Google, Microsoft, Apple, and Amazon, it’s more efficient to build their own.

    The trouble was, in 2011, data centers were becoming known as one of the dirtiest, carbon-spewing parts of the tech industry.

    Facebook built its state-of-the-art data center in Prineville, Oregon, where it invented ways to use less electricity. So Facebook published the Prineville designs to contribute to the green data-center movement.

    Then it occurred to Heiliger: Why not share all of Facebook’s hardware designs?

    Heiliger argued that the technology, particularly the hardware, “is not our competitive advantage,” and that “open source should be a core tenet at Facebook.”

    There are some huge advantages to making hardware open source.

    Hardware engineers, no matter who they work for, could collaborate. Ideas would flow. New tech would be invented more quickly. Difficult tech problems would be fixed faster. And everyone would share equally in the results.

    It would be 180 degrees from the classic culture of patents and lawsuits and trade secrets that has ruled the tech industry for decades. But Facebook didn’t make hardware, so there was no risk to its business.

    Zuck was in. One argument was particularly persuasive: “A company in Mountain View thinks their tech was a differentiator. We didn’t believe that,” Heiliger says, referring to the fact that Google builds much of its own hardware and a lot of its own software and keeps most of that stuff a closely guarded secret.

    Now that OCP has become a phenomenon, Google’s top hardware-infrastructure guy (a legend in his world), Urs Hölzle, offers a begrudging respect for the project

    When asked about OCP, Hölzle told us, “It actually makes a lot of sense because it’s open source for hardware. It’s relatively basic today. It could be the start of something a little bit deeper.”

    “It will be relevant only for the very, very large companies — for the Facebooks, the Ebays, the Microsofts.”

    That’s because Heiliger did several smart things when he started this project.

    First, he hired Frank Frankovsky away from Dell to help Facebook invent hardware and to lead Open Compute Project. Frankovsky quickly became its face and biggest evangelist.

    Next, he got Intel, a much older company with lots of experience in open source, on board. Intel’s legal team set up OCP’s legal structure

    Then, he asked Goldman Sachs’ Don Duet to join the board.

    He knew they were onto something almost immediately at OCP’s first conference.

    “We thought maybe 50 people would show up.” Instead over 300 came. “That was incredible,” he remembers.

    Goldman has been happy to buy OCP servers.

    Duet says Goldman will never go back to buying servers the old way. “We’ve been clear to the vendor community. There’s no reason to go backwards. We didn’t go back after adopting open-source operating systems.”

    The man told him that OCP had turned his company into a $1 billion business, with hundreds of new customers.

    “You convinced us that it was the right thing to do and it was going to be ok, and we’re not only more profitable but we see new channels of business we hadn’t seen before. It wouldn’t have happened without you,”

    Last December, Frankovsky left Facebook to launch his own OCP-inspired hardware startup, an optical-storage venture still in stealth. He remains the chairman of the OCP project. And there have been other startups, like Rex Computing, launched by a teenage electronics wunderkind.

    But perhaps the biggest watershed moment for OCP happened just a few weeks ago, on March 10, 2015.

    He said HP’s server unit had agreed to become an OCP contract manufacturer and had launched a new line of OCP servers.

    Both HP and Dell had been watching and involved in OCP for years, even contributing to the designs. But behind the scenes they were not completely on board.

    One day, Frankovsky hopes that Cisco will follow HP’s lead and join the open-source hardware movement.

    The open-source hardware movement will lead to its own massive hits that will totally change the industry.

    And there’s a good reason for that, says Frankovsky: “Openness always wins, as long as you do it right. You don’t want to wind up on the wrong side of this one. It’s inevitable.”

    Reply
  26. Tomi Engdahl says:

    Steve Lohr / New York Times:
    IBM to invest in Apache Spark for real-time data analysis, will have more than 3,500 developers working on related projects — IBM Invests to Help Open-Source Big Data Software — and Itself — The IBM “endorsement effect” has often shaped the computer industry over the years.

    IBM Invests to Help Open-Source Big Data Software — and Itself
    http://bits.blogs.nytimes.com/2015/06/15/ibm-invests-to-help-open-source-big-data-software-and-itself/

    The IBM “endorsement effect” has often shaped the computer industry over the years. In 1981, when IBM entered the personal computer business, the company decisively pushed an upstart technology into the mainstream.

    In 2000, the open-source operating system Linux was viewed askance in many corporations as an oddball creation and even legally risky to use, since the open-source ethos prefers sharing ideas rather than owning them. But IBM endorsed Linux and poured money and people into accelerating the adoption of the open-source operating system.

    On Monday, IBM is to announce a broadly similar move in big data software. The company is placing a large investment — contributing software developers, technology and education programs — behind an open-source project for real-time data analysis, called Apache Spark.

    The commitment, according to Robert Picciano, senior vice president for IBM’s data analytics business, will amount to “hundreds of millions of dollars” a year.

    Reply
  27. Tomi Engdahl says:

    Jack Clark / Bloomberg Business:
    Oracle’s sales of new licenses decline as more startups embrace open source database software — Oracle Sales Erode as Startups Embrace Souped-Up Free Software — Dan Wagner, the chief executive officer of U.K.-based mobile payments company Powa Technologies Ltd., poses a challenge for database giant Oracle Corp.

    Oracle Sales Erode as Startups Embrace Souped-Up Free Software
    http://www.bloomberg.com/news/articles/2015-06-11/oracle-sales-eroded-as-startups-embrace-souped-up-free-software

    Dan Wagner, the chief executive officer of U.K.-based mobile payments company Powa Technologies Ltd., poses a challenge for database giant Oracle Corp.

    Wagner’s company last year began shifting away from pricey products from Oracle and International Business Machines Corp., replacing them with open-source software, which is freely available and can be modified. Now, Wagner said the closely held company is converting virtually all of its operations to free database software.

    “They scale and operate extremely well and they don’t cost anything,” Wagner said.

    Other companies share Wagner’s view and are shifting to software whose code is public. While the threat to Oracle has been around for years, it’s becoming more intense with recent improvements that make open-source technology more reliable — and appealing to a new generation of multibillion-dollar startups

    “There was pessimism for a decade on whether those things could stand up. The question is largely resolved,”

    The impact shows up in Oracle’s sales of new software licenses, which have declined for seven straight quarters compared with the period a year earlier. New licenses made up 25 percent of total revenue in fiscal 2014, down from 28 percent a year earlier — a sign the company is becoming increasingly dependent on revenue from supporting and maintaining products at existing customers and having a harder time finding new business.

    To blunt this, the Redwood City, California-based company is expanding efforts in cloud computing, which will let it sell packaged high-margin services to customers. That may help balance the slowdown in the basic business. It also operates an open-source database called MySQL.

    Free Programs

    One of the open-source technologies is the Cassandra database, which was created last decade and has been widely used by companies such as Apple Inc. and Netflix Inc.

    DataStax, for example, has a customer that paid about $500,000 in Oracle software licenses and now spends $90,000 with DataStax for a similar project

    “I think I’ve been in this industry too long to use Oracle,”

    Etsy, an online marketplace for hand-crafted goods, runs on a hodge-podge of open-source databases, primarily MySQL.

    ‘Sweet Spot’

    Not all applications are well-suited to open source, as the systems made by Oracle and others still have capabilities far in excess of the free systems, Palanca said.

    “You’re still going to have a class of applications for which these open-source solutions are not yet ready, and that is the continued sweet spot for Oracle,” she said.

    Startups’ Shift

    A Bloomberg survey of 20 startups valued at more than $1 billion supports the trend. The survey, which included companies such as Cloudflare Inc. and Pinterest Inc., found they placed open-source technologies at the heart of their businesses, with the exception of DocuSign, which had built around Microsoft’s SQL Server.

    None of the companies surveyed indicated they had a large Oracle database deployment for their main services, though many used bits of Oracle software to run aspects of their organizations.

    “A lot of the startups now go with MySQL or less expensive options,” said David Wolff, the CEO of Database Specialists, a database consultancy. “The only thing that people complain about with Oracle is how much it costs.”

    Companies can pay Oracle to get extra features of MySQL for $2,000 to $10,000 per computer it runs on, but none of the companies indicated this was the case. Others including Alibaba Group Holding Ltd., Facebook Inc., and Google Inc. have even built on MySQL to create their own free variant called WebScaleSQL.

    As open-source databases continue to improve, there may be less reason to pay for Oracle’s products.

    Reply
  28. Tomi Engdahl says:

    Deep Learning Machine Beats Humans in IQ Test
    http://www.technologyreview.com/view/538431/deep-learning-machine-beats-humans-in-iq-test/

    Computers have never been good at answering the type of verbal reasoning questions found in IQ tests. Now a deep learning machine unveiled in China is changing that.

    Just over 100 years ago, the German psychologist William Stern introduced the intelligence quotient test as a way of evaluating human intelligence. Since then, IQ tests have become a standard feature of modern life and are used to determine children’s suitability for schools and adults’ ability to perform jobs.

    These tests usually contain three categories of questions: logic questions such as patterns in sequences of images, mathematical questions such as finding patterns in sequences of numbers and verbal reasoning questions, which are based around analogies, classifications, as well as synonyms and antonyms.

    It is this last category that has interested Huazheng Wang and pals at the University of Science and Technology of China and Bin Gao and buddies at Microsoft Research in Beijing. Computers have never been good at these. Pose a verbal reasoning question to a natural language processing machine and its performance will be poor, much worse than the average human ability.

    Today, that changes thanks to Huazheng and pals who have built a deep learning machine that outperforms the average human ability to answer verbal reasoning questions for the first time.

    The end result is that words can be thought of as vectors in this high-dimensional parameter space. The advantage is that they can then be treated mathematically: compared, added, and subtracted like other vectors.

    This approach has been hugely successful. Google uses it for automatic language translation by assuming that word sequences in different languages represented by similar vectors are equivalent in meaning. So they are translations of each other.

    But this approach has a well-known shortcoming: it assumes that each word has a single meaning represented by a single vector. Not only is that often not the case, but verbal tests tend to focus on words with more than one meaning as a way of making questions harder.

    Huazheng and pals tackle this by taking each word and looking for other words that often appear nearby in a large corpus of text. They then use an algorithm to see how these words are clustered. The final step is to look up the different meanings of a word in a dictionary and then to match the clusters to each meaning.
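    A toy sketch of that clustering idea (not the authors’ actual algorithm; the mini-corpus and the “seed” words below are invented for illustration): collect the words that co-occur with an ambiguous term, then assign each context to whichever dictionary sense it overlaps most.

```python
from collections import Counter

# Invented mini-corpus with two senses of "bank"; a real system would use
# a large corpus and a proper clustering algorithm over embedding vectors.
corpus = [
    "the bank approved the loan and mortgage",
    "deposit money at the bank branch",
    "the river bank was muddy after rain",
    "fishing from the grassy bank of the river",
]

# Hand-picked "seed" words standing in for dictionary definitions of each
# sense (the paper's final step matches clusters to dictionary meanings).
seeds = {
    "finance": Counter(["loan", "money", "deposit"]),
    "river": Counter(["river", "water", "fishing"]),
}

def sense_of(sentence):
    """Assign a sentence's use of 'bank' to the sense whose seed words
    overlap its context words the most."""
    context = Counter(sentence.split())
    return max(seeds, key=lambda s: sum((seeds[s] & context).values()))

for sent in corpus:
    print(sense_of(sent), "<-", sent)
```

    Each sentence lands in the “finance” or “river” cluster purely from the company its words keep, which is the intuition behind the multi-sense vectors described above.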

    They compare this deep learning technique with other algorithmic approaches to verbal reasoning tests and also with the ability of humans to do it. For this, they posed the questions to 200 humans gathered via Amazon’s Mechanical Turk crowdsourcing facility along with basic information about their ages and educational background.

    And the results are impressive. “To our surprise, the average performance of human beings is a little lower than that of our proposed method,” they say.

    Human performance on these tests tends to correlate with educational background. So people with a high school education tend to do least well, while those with a bachelor’s degree do better and those with a doctorate perform best. “Our model can reach the intelligence level between the people with the bachelor degrees and those with the master degrees,” say Huazheng and co.

    Reply
  29. Tomi Engdahl says:

    Why marketers are betting big on predictive analytics
    http://www.cio.com/article/2934274/why-marketers-are-betting-big-on-predictive-analytics.html

    Give a marketer a sale, and you’ll keep his company afloat for a day; teach him to predict future sales, and you may just ensure his longevity.

    That, in essence, is the premise behind predictive marketing, a concept that’s increasingly taking hold in enterprises today.

    Tapping into the analytics trend that’s being felt throughout the business world as a whole, predictive marketing applies algorithms and machine learning to big data to help marketers direct their efforts in the most profitable directions. Predictive-analytics tools can help marketers gauge ahead of time what a particular customer will buy, for example, as well as when and how much. Equipped with that information, companies can tailor their campaigns accordingly.

    Amazon is a shining example: Its recommendations engine reportedly accounts for roughly 30 percent of the company’s sales.

    Successes like that may help explain investors’ excitement about predictive-analytics purveyors such as Lattice Engines.

    Reply
  30. Tomi Engdahl says:

    Intel inside: Six of the best affordable PC laptops
    Budget machines to leave you quids in and spoilt for choice
    http://www.theregister.co.uk/2015/06/15/product_roundup_six_afforable_laptops/

    The continued stagnation of the PC market is bad news for manufacturers, but good news for anyone who needs an affordable new laptop. Manufacturers are having to offer great value in order to attract buyers, and this means that you can now get some really attractive laptops that’ll do the business for a good few years in the £500-600 bracket.

    Reply
  31. Tomi Engdahl says:

    Hyperconvergence isn’t about hardware: it’s server-makers becoming software companies
    Clouds sell compute by the glass. On-premises kitmakers want to sell wine-as-a-service
    http://www.theregister.co.uk/2015/06/15/hypercovergence_isnt_about_hardware_its_servermakers_becoming_software_companies/

    Public cloud is supposed to be a mortal threat to enterprise hardware vendors, whose wares look clunky and costly compared to a servers-for-an-hour-for-cents cloud and the threat looks scary … until you actually use a public cloud for a while.

    The Reg increasingly hears that the cost of operating in a public cloud quickly adds up to sums that make on-premises kit look decently-priced. Communications costs to and from public clouds can quickly reach the same level as compute costs, which rather dents the servers-for-cents story. Compute itself isn’t cheap, either, once you do lots of it. Nor is storage. Then there’s the other stuff needed to run real workloads, like firewalls, load balancers, WAN optimisation and so on. They all cost cents-per-hour, too and once you’re not just doing test and dev in the public cloud you need them all. Those cents-per-hour add up.
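    A back-of-the-envelope sketch of how those per-hour charges compound over a month (every rate below is invented purely for illustration, not an actual cloud price list):

```python
# Hypothetical per-hour rates for the services a real workload needs;
# the numbers are made up solely to show how the charges accumulate.
rates_per_hour = {
    "compute": 0.10,
    "storage": 0.05,
    "comms_egress": 0.09,      # traffic to and from the cloud
    "load_balancer": 0.025,
    "firewall_wan_opt": 0.02,
}
HOURS_PER_MONTH = 730

monthly = {svc: rate * HOURS_PER_MONTH for svc, rate in rates_per_hour.items()}
for svc, cost in monthly.items():
    print(f"{svc:18s} ${cost:7.2f}/month")
print(f"{'total':18s} ${sum(monthly.values()):7.2f}/month")
```

    Even at these toy rates, compute is well under half the bill; the supporting services and traffic dominate, which is exactly the dynamic described above.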

    So while public cloud does have a very low sticker price, once you work at a decent scale operational costs soon get pretty close to on-premises levels.

    The gap comes from the fact that public clouds do excuse you from cabling, powering and cooling kit, fixing stuck fans, swapping out dead disks and a zillion other pieces of dull meatspace sysadminnery. Cloud also has massive redundancy that on-premises kit can’t match and elasticity that is tough to replicate in your own bit barn.

    Roll out the barrel

    The folks at Nutanix have come up with an interesting analogy to describe this situation. They liken the public cloud to a restaurant or bar where you can buy wine by the glass. Even if you really like the wine, going out for a glass every night makes no sense. It’s more sensible to buy a case you can drink at home. You’ll get the same wine, at a lower price, with more convenience and comfort.

    Nutanix’s argument is based on the company’s belief that its kit does most of the elasticity and redundancy of a public cloud. It’s far from alone in the belief it delivers a cloud-like experience – all enterprise hardware makers are trying to deliver a simpler, more reliable experience with easy scalability – so the wine-by-the-glass/wine-by-the-case analogy works for plenty of vendors.

    Reply
  32. Tomi Engdahl says:

    Cortana threatens to blow away ESC key
    Toshiba’s Windows 10 PCs to get top-left key for digital assistant
    http://www.theregister.co.uk/2015/06/15/windows_8_killed_start_now_cortana_threatens_the_esc_key/

    Toshiba USA has revealed that it will add a key dedicated to summoning Cortana, Windows 10’s digital assistant.

    Toshiba’s Jeff Barney, the company’s veep and GM for all things PC in North America, says the new button will appear on every keyboard the company sells.

    PCWorld reports that “the key will sit in the upper left area, near the function keys.”

    A quick glance at the many keyboards in Vulture South’s eyrie suggests that this means the ESC key is in peril.

    Toshiba’s motive seems to be putting Cortana front and centre in its new machines, so that users can more easily access its better-than-Siri search functions.

    Reply
  33. Tomi Engdahl says:

    Linus Torvalds asks kernel devs to take a break so he can too
    Linux 4.1 delayed by driver dramas and Linus’ holiday
    http://www.theregister.co.uk/2015/06/15/linus_torvalds_asks_kernel_devs_to_take_a_break_so_he_can_too/

    In May, Linux overlord Linus Torvalds warned that his holiday might delay the release of Linux 4.1.

    “So I’m on vacation, but time doesn’t stop for that, and it’s Sunday, so time for a hopefully final rc,” Torvalds writes. “It turns out it’s just as well that I wanted to drag the release out by a week so that I don’t have the merge window while on vacation.”

    Reply
  34. Tomi Engdahl says:

    OPEN WIDE: Microsoft Live Writer authoring tool going open source
    Hanselman: ‘I didn’t expect this little tweet reply to cause a ruckus’
    http://www.theregister.co.uk/2015/06/15/microsoft_live_writer_authoring_tool_open_source/

    Microsoft will release its blog authoring tool, Live Writer, as open source, according to a tweet from developer evangelist Scott Hanselman.

    The goal of Live Writer was to make it easy to author blog posts while working offline in Windows. The editor is based on embedded Internet Explorer, and despite that handicap the tool was immediately popular.

    Offline authoring generally beats typing into a browser, especially for bloggers out and about with intermittent connections, and Live Writer does most things right.

    Unlike Word, it generates clean HTML and gives easy access to HTML source. Firing up Live Writer was quicker than using full HTML authoring tools such as Dreamweaver (or Microsoft’s FrontPage), and publishing a post was a single click.

    However, Live Writer was always able to publish to a range of blogging platforms, including WordPress and TypePad. Live Writer supported the Moveable Type API (used by TypePad) and the Atom publishing protocol as well as the ability to post to Live Spaces and SharePoint.

    The product is still available today and you can download it here, though it has not been updated since 2012. As ever, you should do a custom install and unselect anything you do not need.

    However, it is still rather good. One of its smart features is that after configuring a blog account, it downloads the CSS style sheets from the blog so that it can present pretty much what the post will look like as you type and edit.

    Windows Essentials
    Blog with Writer, manage email with Mail, and more
    http://windows.microsoft.com/en-us/windows-live/essentials-other#essentials=overviewother

    Writer

    Create stunning blog posts in minutes, with photos, videos, maps, and more. Then publish them to any of your favorite blog service providers.

    Reply
  35. Tomi Engdahl says:

    Firefox Maker Battles to Save the Internet—and Itself
    http://www.technologyreview.com/featuredstory/537661/firefox-maker-battles-to-save-the-internet-and-itself/

    Mozilla helped an open Web flourish in the 2000s. Now it’s struggling to play a meaningful role on mobile devices.

    In Silicon Valley, most pioneers pursue big ideas and giant personal fortunes with equal zeal. Then there’s Mozilla, an innovation dynamo that refuses to get rich.

    More than 500 million people worldwide use Mozilla products. The company’s Firefox Internet browser is the top choice in countries ranging from Germany to Indonesia. But the company has no venture capital backing, no stock options, no publicly traded shares. It hardly ever patents its breakthroughs. Instead, Mozilla has a business model that’s as open and sprawling as the World Wide Web itself, where everything is free and in the public domain.

    For a long time, it seemed as if Mozilla’s idealistic engineers understood the future better than anyone. By building the Firefox browser with open-source software, Mozilla made it easy for all kinds of people to cook up improvements that the whole world could use. Independent developers in dozens of countries pitched in, creating add-ons that speeded up downloads, blocked unwanted ads, and performed other useful services. Firefox rapidly became the browser in which state-of-the-art development took place–on shoestring budgets.

    Suddenly, though, the Internet looks nightmarish to Mozilla. Most of the world now gets online on mobile devices, and about 96 percent of smartphones run on either the Apple iOS or Google Android operating systems. Both of these are tightly controlled worlds.

    “Many of the principles we associate with the Web–openness, decentralization and the ability of anyone to publish without asking permission from others–are at risk,” declared a lengthy blog post written in November 2014 by Mitchell Baker, chair of the Mozilla Foundation, the nonprofit vehicle that serves as the company’s ultimate owner.

    No matter that users and software developers seem to be thriving in this more structured new milieu, with nearly one billion Apple iOS and Google Android smartphones being sold each year. From Baker’s perspective, “frankly, this direction for the Internet sucks.”

    Reply
  36. Tomi Engdahl says:

    Let’s kill off the meaningless concept of SW-defined storage
    All in all, all things considered, it’s rubbish
    http://www.theregister.co.uk/2015/06/09/kill_meaningless_concept_sw_defined_storage/

    Software-defined storage has become a meaningless and useless concept.

    It doesn’t tell you anything useful beyond the vague idea that software drives the hardware. Well, yes, when virtually every storage hardware product you use is based on commodity hardware then it would, wouldn’t it?

    Saying a product is SW-defined storage is nowhere near good enough. And making this assertion — that because SW-defined storage in general is taking off and growing like a weed on steroids, by extension your SW-defined storage product will ramp up sales like a rocket — is just plain silly.

    SW-defined storage is storage software sold independently of hardware. That’s it. Think Nexenta, Scality, Atlantis, PernixData, DataCore, Maxta and other suppliers.

    It’s HW supplier-independent storage which the supplier then needs to tell you provides file, object or physical/virtual SAN block access with a scale-out/up/both design, etc, etc.

    Anything else is just pseudo-sophisticated marketing bollocks; ordure fit only for fan blade coating.

    Reply
  37. Tomi Engdahl says:

    Intranet, or nah?
    https://www.igloosoftware.com/blogs/inside-igloo/intranet_or_nah?utm_source=techmeme&utm_medium=referral&utm_campaign=blog

    Implementing a new software solution is a big deal. The question that people always forget to ask themselves before they start is strangely a pretty obvious one: Do I even need this thing, or nah? In this post, we give you an easy and foolproof way to answer this question.

    Get on it.

    The best way to figure out the workflow of your company is yet another low tech method, most closely associated with a high-tech process: the user story. What does that mean? It means you have to go talk to people. Start thinking who would be the power users of your hypothetical intranet. Who would be in charge of managing it? Who would represent a typical user, someone who uses it day in and day out, but lightly overall? Make as many categories as you need, and then book 15 min meetings with these people. Prepare a few questions related to the project, but also leave time for them to just talk about what they would like to see. Take notes, obviously.

    Now comes the important part, analyzing this data. You want to find a framework that works for you, and break down every interview into it.

    Do the thing.

    This whole process is useless if you don’t act on it. This sheet isn’t something you shelve. It’s the baseline for the whole project. Bring it to meetings, use it as data, as proof of what you are saying. Your superiors won’t be able to disprove words that are coming right out of their workers’ mouths.

    Reply
  38. Tomi Engdahl says:

    Peter Rubin / Wired:
    Sony says at least 30 games are being developed for Project Morpheus, will price the headset at “several hundred dollars” — Sony’s VR Push Hinges On The Game We Thought We’d Never Play — It’s really my own fault for forgetting about the spiders.

    Sony’s VR Push Hinges on the Game We Thought We’d Never Play
    http://www.wired.com/2015/06/sony-morpheus/

    Reply
  39. Tomi Engdahl says:

    Rachel King / ZDNet:
    Google Sheets updated with data analysis, visualization tools — At the most visual level, users will be able to preview formula results while typing, revealing both progress and potential errors nearly instantly. — Although it is not the most glamorous of apps in Google’s vast catalog …

    Google Sheets updated with data analysis, visualization tools
    http://www.zdnet.com/article/google-sheets-updated-with-data-analysis-visualization-tools/

    Although it is not the most glamorous of apps in Google’s vast catalog, spreadsheet editor Sheets is being tweaked to make larger data sets more digestible.

    At the most visual level, users will be able to preview formula results while typing, revealing both progress and potential errors nearly instantly.

    Sheets has been prepared for a few new functions, like adding calculated fields to apply formulas to pivot table data and the “GETPIVOTDATA” function for retrieving data from a pivot table.
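    As a hedged illustration of the new function (the function name is per Google’s documentation; the value name, cell reference, and field values here are invented):

```
=GETPIVOTDATA("SUM of Sales", A1, "Region", "West")
```

    This would pull the single aggregated value for the “West” region out of a pivot table anchored at A1, rather than referencing a cell position that shifts when the pivot table is rearranged.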

    Reply
  40. Tomi Engdahl says:

    Beyond the USA Freedom Act: How U.S. Surveillance Still Subverts U.S. Competitiveness
    http://www2.itif.org/2015-beyond-usa-freedom-act.pdf?_ga=1.114044933.369159037.1433787396

    Reply
  41. Tomi Engdahl says:

    Devs to pour Java into Amazon’s cloud after AWS Lambda update
    Event-driven model not just for JavaScript anymore
    http://www.theregister.co.uk/2015/06/16/aws_lambda_java_support/

    Amazon Web Services has expanded its AWS Lambda programming model to support functions written in Java, the cloud kingpin said on Monday.

    Lambda, which allows developers to run event-driven code directly on Amazon’s cloud without managing any application infrastructure, launched in November 2014 and initially only supported code written in JavaScript and Node.js.

    With Monday’s update, developers can now write their event handlers in Java 8, provided they code them in a stateless style that doesn’t make any assumptions about the underlying infrastructure.

    “We have had many requests for this and the team is thrilled to be able to respond,” AWS chief evangelist Jeff Barr said in a blog post.

    AWS Lambda functions can be invoked automatically whenever a variety of events take place in Amazon’s cloud. So, for example, you could set a function to be triggered whenever a certain Amazon Simple Storage Service (S3) storage bucket is modified, or to watch for events from the Kinesis data-processing service.

    Lambda functions can also be used as back ends for mobile applications that store data on the AWS cloud.

    Lambda functions written in Java can use any Java 8 features and can even invoke Java libraries. The handler code and any necessary JAR files are bundled up into a JAR or ZIP file for deployment on AWS.

    To make life easier for developers, Amazon has released an AWS Toolkit plugin for Eclipse that takes care of packaging and uploading handlers.
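
    For illustration, a minimal handler in the stateless style the update requires might look like the sketch below. The class and method names here are our own choices, not from the article; Lambda lets a plain Java method receive the event payload directly, so no AWS-specific types are needed for a simple case.

```java
// A minimal sketch of a stateless AWS Lambda handler in Java 8.
// HelloLambda and handleRequest are hypothetical names: Lambda's function
// configuration points at whichever public class and method you choose.
public class HelloLambda {

    // Lambda deserializes the incoming event into the input argument.
    // Keeping the method stateless (no instance fields, no assumptions
    // about the underlying host) matches the requirement described above.
    public String handleRequest(String name) {
        return "Hello from Lambda, " + name;
    }
}
```

    The compiled handler and any library JARs it needs would then be bundled into a JAR or ZIP file for deployment, as the article notes.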

    Reply
  42. Tomi Engdahl says:

    A server apocalypse can come in different shapes and sizes. Be prepared
    Plan Bs, from corrupted data to pandemics
    http://www.theregister.co.uk/2015/06/16/time_for_plan_b_server_disaster/

    Reply
  43. Tomi Engdahl says:

    Cinnamon 2.6 – a Linux desktop for Windows XP refugees
    Dual monitor support, better panels, and improvements in speed
    http://www.theregister.co.uk/2015/06/16/cinnamon_2_6_review/

    Cinnamon is best known as one of the two default desktops for Linux Mint, which is fast approaching its next major update. Mint 17.2, due later this year, will include the brand-new Cinnamon 2.6, which has just been released.

    So far, so standard – only Cinnamon is no longer just a Linux Mint desktop.

    Cinnamon is now available directly as part of Debian 8 and Fedora 22. Naturally, Cinnamon will work with many other distros as well, but its inclusion in the default installers for big names such as Debian and Fedora marks a turning point for Cinnamon: this really is no longer just an “alternative” desktop for a single distro.

    Part of that growth in popularity is no doubt related to Cinnamon sticking with the traditional desktop paradigm – panels with applets, a start menu and system tray.

    While some will find that “old fashioned”, clearly there are plenty of users looking for just that in a desktop. Indeed, Cinnamon would be my top pick for Linux newcomers, especially those looking to get off the now long abandoned Windows XP.

    There’s good reason too. Cinnamon manages to offer a desktop experience that’s both familiar and yet feels modern.

    Cinnamon 2.6 will be available by default in Linux Mint 17.2, slated to arrive by the end of the month, and in LMDE 2 Betsy. If you don’t want to wait you can install Cinnamon 2.6 in Linux Mint 17.1 as well.

    Reply
  44. Tomi Engdahl says:

    Microsoft finally finishes its PowerPC emulator
    Look what the nerd brought to a gaming conference
    http://www.theregister.co.uk/2015/06/15/microsoft_xbox_360_one_compatibility/

    Microsoft kicked off this year’s E3 gaming conference by announcing that its x86-powered Xbox One console can now play games built for the PowerPC-based Xbox 360.

    The Windows giant took center stage at the Los Angeles Convention Center to announce that its latest console will run games available for its predecessor. The biz claims that soon more than 100 Xbox 360 games will be playable on the One.

    Microsoft didn’t say exactly how the 360 emulation will work, although it has been working on it for some time.

    The hardware in the two machines is not similar: the 2013 Xbox One uses a custom 64-bit x86 chip based on the AMD Jaguar architecture, while the 2005 360 relied on a 64-bit IBM Xenon PowerPC processor. The x86 and PowerPC architectures are not compatible, so some emulation is needed to get games running.

    Game developers do not need to change any code for their 360 titles to run on the One, we’re told: popping in a 360 disc or downloading the title from the Xbox marketplace should be enough to run it on the XB1.

    Reply
  45. Tomi Engdahl says:

    Chip, PC Demand Continues Decline
    Some PC chips down as much as 20% in Q2
    http://www.eetimes.com/document.asp?doc_id=1326870&

    Demand for semiconductors continued to weaken in the second quarter with some PC assemblers in Asia suggesting a decline in purchases of some parts of as much as 15-20% from the second quarter of 2014, according to a report from Wall Street analysts at Deutsche Bank.

    “The iPhone supply chain and the automotive market continue to remain the only bright spots,” said analysts who visited nearly 30 companies in Taiwan, Korea and Japan in five days.

    Chip demand also is weak for tablets, TVs and many Android phones, according to the report that gave the chip sector a neutral ranking overall.

    Reply
  46. Tomi Engdahl says:

    Software Development: Not Written Here Is New Norm
    So long, source code
    http://www.eetimes.com/document.asp?doc_id=1326865&

    The new norm in the world of computing is code reuse, much of it proprietary third party or open source. Due to pressures of the market to produce software as fast as possible and at a low cost, many programmers are not doing what even a few years ago would be normal: writing their own original source code.

    The pressure to instead use software developed elsewhere is intense. According to a survey of developers in 2014 by Venture Development Corp., the size of embedded code base alone is increasing at roughly three times the rate of the number of embedded software developers being hired. Where the number of software engineers available is expected to rise 9.6 percent through 2016, the expected code base growth is estimated to grow by 18.6 percent over the same period. Overall, embedded developers included in VDC’s 2014 survey said 51.1 percent of their project budgets were spent on software, versus 41.8 percent in 2012. Equally telling, respondents indicated that 51 percent of the end product value in 2014 was produced by the software versus 35.8 percent in 2012.

    “Companies we surveyed said that they simply cannot keep pace in the embedded space with developers alone,” said Andre Girard, Senior Analyst at VDC. “More than 40% of the developers in our survey reported their projects are running behind schedule.”

    To deal with the disparity, embedded companies are currently using third party software in 44 percent of their designs. “Overall, 40.5% of respondents in medical device manufacturing, 28.6% in aerospace and defense, and 22.2% in auto and rail all expected to see an increase in commercial and other third-party code,” he said.

    “Given such pressures, companies and their developers would be stupid not to take advantage of all the software code and IP building blocks openly available, and of all of the sources by which it can be obtained to speed up their designs.”

    Reply
  47. Tomi Engdahl says:

    IBM Sparks Machine Learning
    Professionalizing Apache Spark
    http://www.eetimes.com/document.asp?doc_id=1326868&

    Just as IBM put Linux on the commercial open-source operating system (OS) map, it plans to do the same thing with Apache’s Spark platform — promising to educate more than one million data scientists and engineers as well as to build Spark into its analytics and commerce platforms, according to keynote speaker Beth Smith, General Manager of Analytics Platforms at IBM, at the Spark Summit 2015 (June 15-17, San Francisco).

    “IBM is fully committed to Spark as a foundational technology platform for driving analytics across every business in a fundamental way,” Smith told EE Times in advance of her keynote speech at the Spark Summit. “Spark will advance our user’s data strategies, driving business transformation and differentiation.”

    What is Spark?
    Apache Spark is an open-source cluster computing framework that originated at the University of California, Berkeley in 2009. It simplifies the process of developing “smart” distributed applications. By managing in-memory computing resources, it provides primitives that can boost performance by up to 100 times for applications like machine learning. Spark keeps often-used data in memory, rather than on mass storage devices, so it can be accessed quickly and repeatedly, which is why it is well suited to smart apps such as machine learning.

    Reply
  48. Tomi Engdahl says:

    TRIM and Linux: Tread Cautiously, and Keep Backups Handy
    http://linux.slashdot.org/story/15/06/16/201217/trim-and-linux-tread-cautiously-and-keep-backups-handy

    Algolia is a buzzword-compliant (“Hosted Search API that delivers instant and relevant results”) start-up that uses a lot of open-source software (including various strains of Linux) and a lot of solid-state disk, and as such sometimes runs into problems with each of these. Their blog this week features a fascinating look at troubles they faced with ext4 filesystems mysteriously flipping to read-only mode: not such a good thing for machines processing a search index, not just dishing it out.

    The rest of the story explains how they isolated the problem and worked around it; it turns out that the culprit was TRIM, or rather TRIM’s interaction with certain SSDs: “The system was issuing a TRIM to erase empty blocks, the command got misinterpreted by the drive and the controller erased blocks it was not supposed to. Therefore our files ended-up with 512 bytes of zeroes, files smaller than 512 bytes were completely zeroed. When we were lucky enough, the misbehaving TRIM hit the super-block of the filesystem and caused a corruption.”

    Since SSDs are becoming the norm outside the data center as well as within, some of the problems that their analysis exposed for one company probably would be good to test for elsewhere.

    “As a result, we informed our server provider about the affected SSDs and they informed the manufacturer. Our new deployments were switched to different SSD drives and we don’t recommend anyone to use any SSD that is anyhow mentioned in a bad way by the Linux kernel.”

    When Solid State Drives are not that solid
    https://blog.algolia.com/when-solid-state-drives-are-not-that-solid/

    UPDATE June 16:
    A lot of discussions started out by pointing out that the issue is related to the newly introduced queued TRIM. This is not correct. The TRIM on our drives is un-queued, and the issue we have found is not related to the latest changes in the Linux kernel to disable this feature.

    Broken SSDs:

    SAMSUNG MZ7WD480HCGM-00003
    SAMSUNG MZ7GE480HMHP-00003
    SAMSUNG MZ7GE240HMGR-00003
    Samsung SSD 840 PRO Series
    recently blacklisted for 8-series blacklist
    Samsung SSD 850 PRO 512GB
    recently blacklisted as 850 Pro and later in 8-series blacklist

    Working SSDs:

    Intel S3500
    Intel S3700
    Intel S3710

    Reply
