Computer trends for 2014

Here is my collection of trends and predictions for year 2014:

It seems that the PC market is not recovering in 2014. IDC is forecasting that the technology channel will buy in around 34 million fewer PCs this year than last. It seems that things aren’t going to improve any time soon (down, down, down until 2017?). There will be no let-up on any front, with desktops and portables predicted to decline in both the mature and emerging markets. Perhaps the chief concern for future PC demand is a lack of reasons to replace an older system: PC usage has not moved significantly beyond consumption and productivity tasks to differentiate PCs from other devices. As a result, PC lifespans continue to increase. The Death of the Desktop article says that, sadly for the traditional desktop, it is only a matter of time before its purpose expires, and that this will inevitably happen within this decade. (I expect that it will not completely disappear.)

While the PC business slowly declines, the smartphone and tablet business will grow quickly. Some time in the next six months, the number of smartphones on earth will pass the number of PCs. This shouldn’t really surprise anyone: the mobile business is much bigger than the computer industry. There are now perhaps 3.5-4 billion mobile phones, replaced every two years, versus 1.7-1.8 billion PCs replaced every 5 years. Smartphones broke down the wall between those industries a few years ago – suddenly tech companies could sell to an industry with $1.2 trillion annual revenue. Now you can sell more phones in a quarter than the PC industry sells in a year.
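A rough back-of-the-envelope check of that last claim, using only the installed-base and replacement-cycle figures above (the arithmetic is mine):

```python
# Back-of-the-envelope check of the phones-vs-PCs volume claim above.
# Inputs are the installed-base and replacement-cycle figures quoted in the text.
phones_installed = 3.5e9     # ~3.5-4 billion mobile phones in use
phone_cycle_years = 2        # replaced roughly every two years
pcs_installed = 1.75e9       # ~1.7-1.8 billion PCs in use
pc_cycle_years = 5           # replaced roughly every five years

phones_per_quarter = phones_installed / phone_cycle_years / 4
pcs_per_year = pcs_installed / pc_cycle_years

print(f"phones per quarter: ~{phones_per_quarter / 1e6:.0f} million")  # roughly 440 million
print(f"PCs per year:       ~{pcs_per_year / 1e6:.0f} million")        # roughly 350 million
```

Even at the low end of those estimates, quarterly phone volumes comfortably exceed annual PC volumes.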

Within a few years we will end up with somewhere over 3bn smartphones in use on earth, almost double the number of PCs. There are perhaps 900m consumer PCs on earth, and maybe 800m corporate PCs. The consumer PCs are mostly shared and the corporate PCs locked down, and neither is really mobile. Those 3 billion smartphones will all be personal, and all mobile. Mobile browsing is set to overtake traditional desktop browsing in 2015. The smartphone revolution is changing how consumers use the Internet, and this will influence web design.


The only PC sector that seems to have some growth is the server side. The Microservers & Cloud Computing to Drive Server Growth article says that increased demand for cloud computing and high-density microserver systems has brought the server market back from a state of decline. We’re seeing fairly significant change in the server market. According to the 2014 IC Market Drivers report, server unit shipment growth will increase in the next several years, thanks to purchases of new, cheaper microservers. The total server IC market is projected to rise by 3% in 2014 to $14.4 billion: the multicore MPU segment for microservers and NAND flash memories for solid-state drives are expected to see better numbers.

The Spinning rust and tape are DEAD. The future’s flash, cache and cloud article says that flash is the tier for primary data – the stuff christened tier 0. Data that needs to be written out to a slower response store goes across a local network link to a cloud storage gateway, which holds the tier 1 nearline data in its cache. The Never mind software-defined HYPE, 2014 will be the year of storage FRANKENPLIANCES article says that more hype around Software-Defined-Everything will keep the marketeers and the marchitecture specialists well employed for the next twelve months, but don’t expect anything radical. The only innovation is going to be around pricing and consumption models as vendors try to maintain margins. FCoE will continue to be a side-show and FC, like tape, will soldier on happily. NAS will continue to eat away at the block storage market and perhaps 2014 will be the year that object storage finally takes off.

The IT managers are increasingly replacing servers with SaaS article says that cloud providers will take on a bigger share of servers as the overall market starts declining. An in-house system is no longer the default for many companies. IT managers want to cut the number of servers they manage, or at least slow the growth, and they may be succeeding. IDC expects that anywhere from 25% to 30% of all the servers shipped next year will be delivered to cloud services providers. In three years, by 2017, nearly 45% of all the servers leaving manufacturers will be bought by cloud providers. The shift will slow server sales to enterprise IT. Big cloud providers are increasingly using their own designs instead of servers from big manufacturers. Data center consolidations are eliminating servers as well. IT managers will certainly be managing physical servers for years to come, but the number will be declining.

I hope that the IT business will start to grow this year as predicted. Information technology spending is set to increase next financial year, according to N Chandrasekaran, chief executive and managing director of Tata Consultancy Services (TCS), India’s largest information technology (IT) services company. IDC predicts that IT spending will increase by 5 per cent worldwide next year, to $2.14 trillion. The biggest opportunity is expected to lie in the digital space: social, mobility, cloud and analytics. The gradual recovery of the economy in Europe will restore faith in business. Companies are re-imagining their business, keeping changing digital trends in mind.

The death of Windows XP will be in the news many times during the spring, and there will be companies trying to cash in on it: Microsoft’s plan to end Windows XP support next spring has prompted IT service providers as well as competitors to invest in marketing their own services. HP is peddling its Connected Backup 8.8 service to customers to prevent data loss during migration. VMware is selling a cloud desktop service. Google is wooing users to switch to Chrome OS by making Chrome’s user interface familiar to wider audiences. The most direct way of exploiting XP’s end of life comes from Arkoon, a subsidiary of European defense giant EADS, which promises support for XP users who do not want to, or cannot, upgrade their systems.

There will be talk about what is coming from Microsoft next year. Microsoft is reportedly planning to launch a series of updates in 2015 that could see major revisions for the Windows, Xbox, and Windows RT platforms. Microsoft’s wave of spring 2015 updates to its various Windows-based platforms has a codename: Threshold. If all goes according to early plans, Threshold will include updates to all three OS platforms (Xbox One, Windows and Windows Phone).


Amateur programmers are becoming increasingly prevalent in the IT landscape. A new IDC study has found that of the 18.5 million software developers in the world, about 7.5 million (roughly 40 percent) are “hobbyist developers”, which is what IDC calls people who write code even though it is not their primary occupation. The boom in hobbyist programmers should cheer computer literacy advocates. IDC estimates there are almost 29 million ICT-skilled workers in the world as we enter 2014, including 11 million professional developers.

The challenge of cross-language interoperability will be talked about more and more. Interfacing between languages will be increasingly important: you can no longer expect a nontrivial application to be written in a single language. With software becoming ever more complex and hardware less homogeneous, the likelihood of a single language being the correct tool for an entire program is lower than ever. The trend toward increased complexity in software shows no sign of abating, and modern hardware creates new challenges. Mobile phones are starting to appear with eight cores sharing the same ISA (instruction set architecture) but running at different speeds, alongside streaming processors optimized for different workloads (DSPs, GPUs) and other specialized cores.
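As a trivial illustration of what crossing a language boundary involves (my own sketch, not an example from the article), the snippet below uses Python’s ctypes FFI to call the C math library’s cos() directly; even in this simplest case the caller has to spell out the C argument and return types by hand, which is a small taste of the interoperability problem.

```python
# Minimal cross-language call: Python invoking a C library function via ctypes.
# Illustrative sketch only; assumes a POSIX system where the C math library
# can be located by name.
import ctypes
import ctypes.util
import math

libm = ctypes.CDLL(ctypes.util.find_library("m"))

# Declare the C prototype: double cos(double). Getting argument and return
# types right at the boundary is exactly where mismatches between languages bite.
libm.cos.argtypes = [ctypes.c_double]
libm.cos.restype = ctypes.c_double

print(libm.cos(1.0))   # 0.5403023058681398
print(math.cos(1.0))   # same value via Python's own math module
```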

Yet another new USB connector type will be pushed to the market. The Lightning strikes USB bosses: Next-gen ‘type C’ jacks will be reversible article says that USB is to get a new, smaller connector that, like Apple’s proprietary Lightning jack, will be reversible. Designed to support both USB 3.1 and USB 2.0, the new connector, dubbed “Type C”, will be the same size as an existing micro USB 2.0 plug.

2,130 Comments

  1. Tomi Engdahl says:

    IT management and marketing can easily drift onto a collision course. The reason for this is their differing approaches to technology projects.

    IT executives and marketing managers proclaim to the world that they are the best of friends. Behind the curtains, however, cooperation is waning.

    “CIOs need to show marketing that they are aligned with its objectives. We do not very often see a bridge being built between these two functions,” says Matt Mobley, chief marketing and technology officer at customer marketing company Merkle.

    “I do not think anyone has yet managed to solve the current conflict between IT and marketing,” he continues.

    Mobley gives an example from a consumer products company. Its marketing boss wanted to bring all of its data processing platforms together and connect them with the devices on the network that stored data locally.

    Source: http://summa.talentum.fi/article/tv/10-2014/97764

    Reply
  2. Tomi Engdahl says:

    Kryder’s law craps out: Race to UBER-CHEAP STORAGE is OVER
    Bad news for cloud and archive
    http://www.theregister.co.uk/2014/11/10/kryders_law_of_ever_cheaper_storage_disproven/

    Kryder’s Law says disk storage will keep on getting cheaper. But it is increasingly looking like it won’t, which has baleful implications for cloud storage and archiving.

    Like Moore’s Law, Kryder’s Law is (a) not a law, merely a wonderfully apt observation for a prolonged but not eternal period of time, and (b) borne of an incredibly bold and confident view of technology.

    Mark Kryder was a Seagate SVP for research and its chief technology officer when an article titled Kryder’s Law – where he was interviewed – appeared in a 2005 issue of Scientific American. The “Law” declares that disk drive areal density would more than double every two years, meaning disk drive capacity would do likewise.

    The article said:

    Since the introduction of the disk drive in 1956, the density of information it can record has swelled from a paltry 2,000 bits to 100 billion bits (gigabits), all crowded in the small space of a square inch. That represents a 50-million-fold increase.

    That 50 million-fold increase took place over 41 years, a rate of improvement fantastically faster than Moore’s Law, which, the article said, involves “doubling the power and memory of computer semiconductors every 18 months”.

    After more than a quarter century of Kryder’s Law, the exponential increase in disk capacity at constant form factor, and thus exponential decrease in $/GB, almost everyone believed that long-term storage was effectively free.

    The assumption was that: “If you could afford to store some data for a few years, you could afford to store it forever. The cost of that much storage would have become negligible.” Only that is no longer true.

    Rosenthal discusses “the Kryder rate (the annual percentage drop in $/GB)” and says it is slowing, citing a graph from Preeti Gupta at UCSC in support.

    The graph “shows that disk is now about seven times as expensive as it would have been had the industry maintained its pre-2010 Kryder rate.”

    The 2011 Thai floods almost doubled disk capacity cost/GB for a while. Rosenthal writes: “The technical difficulties of migrating from PMR to HAMR, meant that already in 2010 the Kryder rate had slowed significantly and was not expected to return to its trend in the near future. The floods reinforced this.”

    “… what looks like a smooth Kryder’s Law curve is actually the superposition of a series of S-curves, one for each successive technology generation.”

    Each step change increase in capacity will take longer, be relatively more expensive, and disks will have to be engineered for longer life, meaning lengthier warranty periods.

    Rosenthal also works out the effect of a 20 per cent per year Kryder rate (from IHS iSuppli), IT budget growth at two per cent/year, an IDC data growth projection at 40 per cent/year and storage at 12 per cent of the IT budget.

    He reckons that, after eight years of this, the annual cost of storing all the data accumulated since year zero, relative to the cost in year zero, is 100 per cent of your IT budget.
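    Out of curiosity, here is a small sketch of that compounding effect using the quoted inputs. The model is my own simplification (keep every byte ever written and reprice it at each year’s $/GB), so it illustrates the dynamic Rosenthal describes rather than reproducing his exact figures.

    ```python
    # Sketch of the compounding described above: data kept forever grows ~40%/year,
    # $/GB falls only ~20%/year, the IT budget grows ~2%/year, and storage starts
    # at 12% of the budget. Simplified model, not Rosenthal's own calculation.
    kryder_rate   = 0.20   # annual drop in $/GB
    data_growth   = 0.40   # annual growth in newly generated data
    budget_growth = 0.02   # annual growth in the IT budget
    start_share   = 0.12   # storage's share of the budget in year zero

    price_per_gb = 1.0     # normalised units
    new_data     = 1.0
    budget       = new_data * price_per_gb / start_share
    stored       = 0.0

    for year in range(11):
        stored += new_data                   # keep everything ever written
        share = stored * price_per_gb / budget
        print(f"year {year}: storing it all costs {share:.0%} of the IT budget")
        price_per_gb *= 1 - kryder_rate
        new_data     *= 1 + data_growth
        budget       *= 1 + budget_growth
    # Under these assumptions the share climbs from 12% past the whole IT budget
    # within roughly a decade, which is the squeeze Rosenthal is pointing at.
    ```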

    Let’s turn to the cloud and the Amazon/Google/Microsoft cloud storage cost reductions. The three are in a race to the bottom, a race to zero, so to speak, as they try to wipe out other cloud storage competitors and persuade corporates to store their data in their clouds.

    “We see a future where storage is free and infinite.”

    He is quite, quite wrong. It won’t be free and it won’t be infinite.

    The Kryder rate slowdown will see to that, and business models based on a continuation of the Kryder rate at pre-Thai-flood levels will fail as their costs become untenable.

    Reply
  3. Tomi Engdahl says:

    Dense Flash Arrays Find New Niches
    http://www.eetimes.com/document.asp?doc_id=1324564&

    The early days of flash in storage arrays were defined by its selective use for handling high-priority data, in large part due to its higher cost per gig than spinning disks. But as flash gets cheaper, and traditional HDDs start to show their limitations for some applications, flash adoption is getting more widespread.

    The proliferation of flash in data centers can be seen with the growth and evolution of all-flash arrays.

    But price is not the only factor that might prompt the adoption of more flash in the data center or even an all-flash array. Application workloads and density are part of the equation.

    Skyera is not particularly unique in its use of flash, he said, but it has a unique path to go after certain markets. And like other flash storage players such as Violin and Pure Storage, it has the advantage of having designed a storage array from the ground up with flash in mind. Traditional storage players that are integrating flash into their products, in part due to acquisitions, are often tied to their legacy platforms, including software that was designed to manage spinning disks.

    More broadly, Sinclair said, it’s important to remember that flash is a technology, not a product, so what matters is how it’s being deployed. It is permeating the storage market, but that doesn’t mean all-flash arrays are destined to take over; there’s still a market for hybrid approaches.

    Flash initially found a home in the data center for specific pain points, and now it’s being considered for how it might help the overall bottom line of the business, such as supporting more transactions, which translates into higher revenue, Sinclair said.

    Reply
  4. Tomi Engdahl says:

    PowerVR Guns for AMD, Nvidia
    Series 7 targets notebook, console graphics
    http://www.eetimes.com/document.asp?doc_id=1324546&

    Imagination Technologies announced the latest turn of the crank for its graphics cores, which are mainly used in smartphones and tablets. The PowerVR Series 7 sports 35-60% better performance than the previous generation and adds a range of new features, including support for the first time for use in high-performance computing markets, where AMD and Nvidia currently hold sway.

    The Series 7 chips are the company’s first to support designs that deliver up to 1.5 TFlops, using up to 512 cores — 16 clusters, each with 32 arithmetic logic units. The high-end configuration targets notebook and console graphics, as well as servers running general-purpose GPU programs, typically under OpenCL. Nvidia has long pioneered with its Cuda environment for the GPU server, a diverse space of scientific and business applications.

    Reply
  5. Tomi Engdahl says:

    Mozilla Updates Firefox With Forget Button, DuckDuckGo Search, and Ads
    http://news.slashdot.org/story/14/11/11/0412210/mozilla-updates-firefox-with-forget-button-duckduckgo-search-and-ads

    The company is launching a new Forget button in Firefox to help keep your browsing history private, and adding DuckDuckGo as a search option.

    Reply
  6. Tomi Engdahl says:

    Mozilla Launches Browser Built For Developers
    http://tech.slashdot.org/story/14/11/10/1450201/mozilla-launches-browser-built-for-developers

    “Mozilla announced that they are excited to unveil Firefox Developer Edition, the first browser created specifically for developers that integrates two powerful new features, Valence and WebIDE that improve workflow and help you debug other browsers and apps directly from within Firefox Developer Edition. Valence (previously called Firefox Tools Adapter) lets you develop and debug your app across multiple browsers and devices by connecting the Firefox dev tools to other major browser engines. WebIDE allows you to develop, deploy and debug Web apps directly in your browser, or on a Firefox OS device”

    Mozilla Introduces the First Browser Built For Developers: Firefox Developer Edition
    https://hacks.mozilla.org/2014/11/mozilla-introduces-the-first-browser-built-for-developers-firefox-developer-edition/

    Reply
  7. Tomi Engdahl says:

    Firefox Developer Edition
    https://www.mozilla.org/en-US/firefox/developer/

    All your favorite dev tools and more

    Firefox Developer Edition brings your core dev tools together with some powerful new ones that will extend your ability to work across multiple platforms from one place. It’s everything you’re used to, only better. And only from Firefox.

    Reply
  8. Tomi Engdahl says:

    Reliable, secure and mobile IT is no longer enough for companies. Digital business requires startup-style agility. “CIOs must continue to be the bedrock but also manage the wild stream,” says Tina Nunno, a director at research firm Gartner.

    She was speaking to hundreds of CIOs at the company’s event in Barcelona. Gartner predicts that by 2017, 75 percent of IT organizations will have built the skills to respond swiftly to the needs of the moment.

    What does Tina Nunno say companies need to do to increase their responsiveness in IT? “Form a digital team and lead it quite differently from traditional IT. Decide how the IT budget is split between experimental, risky ventures and maintenance.”

    Traditionally, IT has aimed for reliable operation, and that has been the key skill of CIOs. The CIO’s work, however, is changing rapidly, and management practices need to change to meet the new kinds of requirements this creates.

    An ever smaller share of IT work is handled with money controlled by CIOs. IT decisions are dispersed into business units that are in effect IT startups within the company, the Gartner analysts illustrate.

    IT management must still perform its traditional functions amid the new experiments: when a business unit develops new applications for smartphones, the CIO’s role includes ensuring the security of the company and of the application.

    Source: http://summa.talentum.fi/article/tv/uutiset/107775

    Reply
  9. Tomi Engdahl says:

    Alibaba Singles’ Day sales hit $9.34B
    http://www.cnbc.com/id/102171362

    Jack Ma, founder and executive chairman of Chinese e-commerce giant Alibaba, said he hopes to fully bring the Singles’ Day shopping holiday to global consumers by 2019. This comes after the online shopping event topped $9 billion in sales, smashing previous records.

    Ma also said that he wants to take Alipay—Alibaba’s online financial services platform—public in the near future.

    Reply
  10. Tomi Engdahl says:

    NETWORKED SOCIETY CITY INDEX 2014
    http://www.ericsson.com/res/docs/2014/networked-society-city-index-2014.pdf

    To prepare the Networked Society City Index each year, we review the leading research literature and case studies exploring the connection between ICT and sustainable urban development.

    This year, we noted a clear shift in research emphasis, away from proving the case for ICT benefits to a focus on how city governments can maximize those benefits. As ICT is accepted as a natural springboard for growth and development, the 2014 City Index will follow that same focus.

    Reply
  11. Tomi Engdahl says:

    US, China, ink tariff-free technology trade pact
    Vendorland hopes other parties to WTO Information Technology Agreement will follow suit
    http://www.theregister.co.uk/2014/11/12/us_china_ink_trilliondollar_tarifffree_technology_trade_pact/

    The United States of America and the People’s Republic of China have agreed to abolish tariffs on each other’s technology products.

    Announced at the Asia-Pacific Economic Co-operation (APEC) summit in Beijing on Monday, the deal won’t immediately mean unfettered access for US and Chinese companies, because the agreement was struck under the World Trade Organisation’s Information Technology Agreement (ITA) that calls for tariffs on all IT products to be abolished.

    The Agreement’s been on the books since 1997 without ever being signed off by the 80 nations negotiating it, but that changed at the APEC leaders’ summit on Monday when China and the USA reached their agreement.

    Reply
  12. Tomi Engdahl says:

    The last PC replacement cycle is about to start turning
    When even senior sysadmins work on an iPhone connected to an Apple TV, the end is nigh
    http://www.theregister.co.uk/2014/11/12/the_last_pc_replacement_cycle_is_about_to_start_turning/

    The material culture of computing moves so rapidly – and so disposably – we very rarely glimpse such side-by-side comparisons of past to present. In IT, the past is junk: too slow, too big, and too hard.

    Almost my entire presentation consisted of video clips

    In the classroom, I took my brand new iPhone 6, plugged it into the lecture theatre’s HDMI port, and ran the whole presentation – in high definition, complete with nicely animated transitions – off my phone.

    This new iPhone is the second most powerful computer I own, only surpassed by the top-of-the-line MacBook Air I type this on, and far more powerful than my first MacBook Air, purchased in the long-ago days of 2011.

    Something has happened, right under our noses, so close we still can’t quite see it.

    “Need it for work,” he replied. “Gonna see if I can dump my laptop.”

    My friend runs the IT infrastructure for one of Australia’s most successful online retailers. It’s his job to make sure the customer-facing systems ringing up sales are available 24×7. Always on call, getting texts advising him of the status of his servers, services, and staff, he keeps a laptop close at hand, in case something ever needs his personal attention. Something always does.

    “Got a little Bluetooth keyboard to go along with it,” he continued. “When I’m in the office I’ll AirPlay it over to an Apple TV connected to a monitor. What’s the difference between that and a desktop?”

    The desktop has been dead for some years, resurrected to an afterlife of video editing and CAD. Laptops keep getting smaller and more powerful, but we’ve now reached a moment when they’re less useful than our smartphones.

    The computer as we have known it, with integrated keyboard and display, has lost its purpose in a world of tiny, powerful devices that can cast to any nearby screen (Chromecast & AirPlay), browse any website, and run all the important apps. Why carry a boat anchor when you can be light as a feather?

    Many of us will continue to sit before keyboards and monitors, physical affordances that interface us with the digital world. But the place where the magic happens, that keeps moving – once lugged over the shoulder, then tucked under the arm, and now stowed away in a pocket.

    Over the next decade, all of IT will reorient itself around this reality.

    Reply
  13. Tomi Engdahl says:

    729 teraflops, 71,000-core Super cost just US$5,500 to build
    Cloud doubters, this isn’t going to be your best day
    http://www.theregister.co.uk/2014/11/12/aws_cloud_turns_super_again/

    CycleCloud has helped hard drive giant Western Digital shove a month’s worth of simulations into eight hours on Amazon cores.

    The simulation workload was non-trivial: to check out new hard drive head designs, the company runs a million simulations, each of which involved a sweep of 22 head design parameters on three types of media.

    In that context, HGST’s in-house computing became a serious bottleneck, with each simulation run taking as much as 30 days to complete.

    Hence, in what it describes as the largest enterprise cloud run so far, CycleCloud spun up nearly 71,000 AWS cores for an eight-hour run.

    The company claims the cluster delivered 729 teraflops to run HGST’s MRM/MatLab app under the control of its CycleCloud cluster-creation software and the Chef automation system.

    The cloud outfit says it spun the app up from zero to 50,000 cores in 23 minutes, and calculates that the run, dubbed “Gojira”, completed nearly 620,000 compute-hours.
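    A quick sanity check of those figures (the inputs come from the article; assuming a flat eight-hour run at the peak core count is my own simplification):

    ```python
    # Back-of-the-envelope check of the figures quoted above. Inputs are from the
    # article; the flat 8-hour run at peak core count is a simplifying assumption.
    peak_cores  = 71_000
    total_flops = 729e12      # 729 teraflops aggregate
    run_hours   = 8

    print(f"~{total_flops / peak_cores / 1e9:.1f} GFLOPS per core")   # ~10 GFLOPS per core
    print(f"{peak_cores * run_hours:,} core-hours at peak")           # 568,000 core-hours
    # The quoted ~620,000 compute-hours is about 9% higher than peak cores x 8 hours,
    # which suggests the core count and run length quoted are rounded -- the numbers
    # still hang together at the order-of-magnitude level.
    ```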

    Reply
  14. Tomi Engdahl says:

    Big Data Goes Unicorn Hunting
    Thousands of data scientists needed
    http://www.eetimes.com/document.asp?doc_id=1324509&

    In today’s big data era, the U.S. is starving for data scientists, according to panelists at Dell World here. There are plenty of opportunities in data analytics, they said; the big challenge is in finding people who can pluck useful insights from the mountains of information.

    “Actually finding people who can extract insight, wisdom maybe, from increasingly diverse real-time sorts of data is truly the bottleneck,” Michael Chui, partner at research firm McKinsey Global Institute, said during a panel discussion. “The set of skills partly around statistics, partly around machine learning, around visualization, around being able to design experiments…these are the scarce resources.”

    Reply
  15. Tomi Engdahl says:

    Apple A8X’s GPU – GXA6850, Even Better Than I Thought
    by Ryan Smith on November 11, 2014 11:00 PM EST

    Apple’s SoC development is consistently on the cutting edge, so it’s always great to see something new, but Apple has also developed a love for curveballs. Coupled with their infamous secrecy and general lack of willingness to talk about the fine technical details of some of their products, it’s easy to see how well Apple’s SoCs perform but it is a lot harder to figure out why this is.

    Since publishing our initial iPad Air 2 review last week, a few new pieces of information have come in that have changed our perspective on Apple’s latest SoC. As it turns out I was wrong. Powered by what we’re going to call the GXA6850, the A8X’s GPU is even better than I thought.

    Since then, we have learned a few things that have led us to reevaluate our findings and discover that A8X’s GPU is even more powerful than GX6650.

    The second piece of information came from analyzing GFXBench 3.0 data to look for further evidence.

    Reply
  16. Tomi Engdahl says:

    Microsoft shocker: Open-source .NET, free Visual Studio, support for Linux, Mac, Android and iOS
    http://www.geekwire.com/2014/net-visual-studio-microsoft-open-source-cross-platform/

    In a major strategy change designed to expand its horizons in the cloud, Microsoft will take its key software development technologies into areas that the company has long considered enemy territory — giving developers new ways to use .NET and Visual Studio to make software not just for Windows but also for Linux, Mac OS X, iOS and Android.

    The landmark moves, announced this morning, include a plan to open-source the .NET core server runtime and framework, making it possible for outsiders to access and contribute to the code that powers Microsoft’s software development platform.

    As part of the change, Microsoft will give developers the ability to use the .NET runtime and framework to make server- and cloud-based applications for Linux and Mac.

    Microsoft is also releasing a new, full-featured version of Visual Studio 2013 that will be available at no cost to independent developers, students, small companies and others not making enterprise applications.

    And the company is releasing a preview of Visual Studio 2015 and .NET 2015 with new features for building applications that run on platforms including Windows, Linux, iOS and Android.

    Opening up Visual Studio and .NET to Every Developer, Any Application: .NET Server Core open source and cross platform, Visual Studio Community 2013 and preview of Visual Studio 2015 and .NET 2015
    http://blogs.msdn.com/b/somasegar/archive/2014/11/12/opening-up-visual-studio-and-net-to-every-developer-any-application-net-server-core-open-source-and-cross-platform-visual-studio-community-2013-and-preview-of-visual-studio-2015-and-net-2015.aspx

    Reply
  17. Tomi Engdahl says:

    Microsoft debuts Visual Studio 2015 and .NET 2015 previews, free Visual Studio Community 2013
    http://venturebeat.com/2014/11/12/microsoft-debuts-visual-studio-2015-and-net-2015-previews-free-visual-studio-community-2013/

    In addition to announcing that .NET is going open source and cross-platform, Microsoft today unleashed a torrent of Visual Studio news at its Connect() developer event in New York City. The company released Visual Studio 2015 Preview and .NET 2015 Preview, a new free Visual Studio Community 2013 offering, Visual Studio 2013 Update 4, a Visual Studio Online expansion, and a slew

    Top feature highlights of Visual Studio 2015 Preview include:

    Ability to create ASP.NET 5 websites that can run on multiple platforms, including Windows, Mac, and Linux
    Integrated support for building apps that run across Android, iOS, and Windows devices with integration of Visual Studio Tools for Apache Cordova (including new iOS debugging and seamless integration with TypeScript) as well as new Visual C++ tools for cross-platform library development
    Connected Services manager that lets developers discover and consume REST APIs in their applications, including support for Azure Mobile Services, Azure Store, Office 365, and Salesforce today
    Smart Unit Tests (based on the PEX technology developed by Microsoft Research) that analyze code and automatically generate unit tests to describe its behavior
    New coding productivity capabilities, particularly for C# and VB, thanks to built-in integration of the new “Roslyn” .NET compiler platform
    New language features in C# 6 to reduce boilerplate and clutter in everyday code, and new light bulbs in the editor that bring proactive refactoring and code fixing opportunities
    Support for breakpoint configuration and PerfTips, both available directly in context in the editor
    Edit and debug a single set of C++ source code and build it for Android, iOS, and Windows; integrated support for the Clang compiler and LLVM optimizer for targeting Android now and iOS “soon”
    More complete C++ 11 and C++ 14 support, as well as dozens of additional productivity features for C++ developers, including new refactorings, improved “Find in Files,” a Memory Diagnostics tool, and improved incremental builds

    Reply
  18. Tomi Engdahl says:

    Structural storage industry changes green light Cisco
    No legacy drag and server-led opportunity
    http://www.theregister.co.uk/2014/11/13/structural_storage_industry_changes_green_light_cisco/

    Structural changes in IT and storage technology are green lighting opportunities for Cisco as it has no legacy storage products holding it back and server-led storage opportunities are growing and growing.

    William Blair analyst Jason Ader has issued a note called “Day of Reckoning for IT Infrastructure Is Near” which talks of “massive secular disruption in the IT infrastructure market,” secular being analyst-speak for long-term and fundamental changes as opposed to cyclical changes, such as seasonal trends.

    Because of these changes “consolidation is inevitable, historical partnerships increasingly lose relevance, and activist and private equity investors gain greater sway.” Seem familiar?

    The IT industry restructuring era has already started;

    Dell has gone private
    HP and Symantec are each splitting in two
    IBM has sold off its PC, server and semiconductor operations
    Activist investors have got their hooks into EMC, Juniper, NetApp and others
    Tibco and Compuware have been the subject of private equity takeouts

    Ader says “The magnitude of industry change has taken many vendors by surprise and has left some in denial – a situation that occurred during previous technology transitions (mainframe to minicomputer from the 1970s to mid-1980s, minicomputer to client-server from the mid-1980s to late 1990s, and now client-server to cloud, starting in the 2000s). This is a textbook example of the “innovator’s dilemma,” in which incumbent vendors often fail to make the bold moves necessary to address disruptions in their markets, in the hopes of protecting their mature and lucrative businesses.”

    What are these changes?

    The impact of the cloud will have four aspects:

    It will limit vendor pricing over time as spending becomes more concentrated with a small number of large, powerful customers,
    Higher utilisation rates of cloud infrastructure will create a persistent headwind to hardware and software capacity requirements,
    Cloud companies are more likely to employ white-box and software-defined architectures to power their data centres,
    Cloud services are typically defined by consumption-based or subscription pricing, which can often be disruptive to traditional infrastructure vendor business models.

    White box products “decouple software control and intelligence from the underlying hardware of a computing, storage or networking device” and so disaggregate existing product stacks and lower margins.

    Vendor consolidation is overdue

    Ader reckons that one large vendor response is to build vertically-integrated (consolidated) product stacks as a way to increase margin. This will boost pressures for industry consolidation.

    Potential acquisition targets

    The William Blair analyst fingers Arista, CommVault, Juniper and Nimble Storage as potential buyout candidates

    Reply
  19. Tomi Engdahl says:

    Stop coding and clean up your UI, devs, it’s World Usability Day
    Of course we’ve shoved the usability website through the W3C HTML validator
    http://www.theregister.co.uk/2014/11/13/stop_coding_and_clean_up_your_ui_devs_its_world_usability_day/

    November 13th is World Usability Day, the annual event that urges all and sundry “to ensure that the services and products important to life are easier to access and simpler to use.”

    The day’s raison d’être is promoting good design, so that products and services are easy to use, rather than useless. Physical objects are the focus, but the day also recognises that software design is something worth addressing.

    One small gripe: we ran worldusabilityday.org/ through the W3C’s HTML validator, an exercise that produced five errors. None are massive usability SNAFUs, but when even the usability wonks have problems it may behove us all to have a think about the issue today.

    Reply
  20. Tomi Engdahl says:

    Data Center Study Reveals Top 5 SMART Stats That Correlate To Drive Failures
    http://hardware.slashdot.org/story/14/11/12/1946208/data-center-study-reveals-top-5-smart-stats-that-correlate-to-drive-failures

    Backblaze, which has taken to publishing data on hard drive failure rates in its data center, has just released data from a new study of nearly 40,000 spindles revealing what it said are the top 5 SMART (Self-Monitoring, Analysis and Reporting Technology) values that correlate most closely with impending drive failures. The study also revealed that many SMART values that one would innately consider related to drive failures actually don’t relate to them at all.

    The 5 SMART stats that actually predict hard drive failure
    http://www.computerworld.com/article/2846009/the-5-smart-stats-that-actually-predict-hard-drive-failure.html

    Backblaze’s analysis of nearly 40,000 drives showed five SMART metrics that correlate strongly with impending disk drive failure:

    SMART 5 – Reallocated_Sector_Count.
    SMART 187 – Reported_Uncorrectable_Errors.
    SMART 188 – Command_Timeout.
    SMART 197 – Current_Pending_Sector_Count.
    SMART 198 – Offline_Uncorrectable

    Backblaze counts a drive as failed when it is removed from a storage array and replaced because it has totally stopped working or because it has shown evidence of failing soon.
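    For anyone curious how these attributes look on their own machines, here is a rough sketch using smartmontools’ smartctl -A output (my own addition; it is not Backblaze’s methodology, and raw SMART values often need per-vendor interpretation):

    ```python
    # Rough sketch: flag non-zero raw values for the five SMART attributes listed
    # above. Requires smartmontools (smartctl) and usually root privileges.
    # Treat this as a first-pass warning, not a reimplementation of Backblaze's
    # failure criteria.
    import subprocess
    import sys

    PREDICTIVE_IDS = {5, 187, 188, 197, 198}

    def risky_attributes(device):
        out = subprocess.run(["smartctl", "-A", device],
                             capture_output=True, text=True, check=False).stdout
        flagged = []
        for line in out.splitlines():
            fields = line.split()
            # Attribute rows start with a numeric ID; the raw value is the last column.
            if fields and fields[0].isdigit() and int(fields[0]) in PREDICTIVE_IDS:
                raw = fields[-1]
                if raw.isdigit() and int(raw) > 0:
                    flagged.append((int(fields[0]), fields[1], int(raw)))
        return flagged

    if __name__ == "__main__":
        device = sys.argv[1] if len(sys.argv) > 1 else "/dev/sda"
        for attr_id, name, raw in risky_attributes(device):
            print(f"SMART {attr_id} ({name}): raw value {raw} -- keep an eye on this drive")
    ```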

    Reply
  21. Tomi Engdahl says:

    Linux Foundation Comments On Microsoft’s Increasing Love of Linux
    http://linux.slashdot.org/story/14/11/12/2122200/linux-foundation-comments-on-microsofts-increasing-love-of-linux

    Executive Director Jim Zemlin writes, “We do not agree with everything Microsoft does and certainly many open source projects compete directly with Microsoft products. However, the new Microsoft we are seeing today is certainly a different organization when it comes to open source. The company’s participation in these efforts underscores the fact that nothing has changed more in the last couple of decades than how software is fundamentally built.”

    Microsoft Appeals to Developers, Developers, Developers
    http://www.linux.com/news/featured-blogs/158-jim-zemlin/795282-microsoft-appeals-to-developers-developers-developers

    Former Microsoft CEO Steve Ballmer became infamous in 2006 after leading a Microsoft Windows meeting in a chant, “developers, developers, developers.” While the images of him clapping his hands and screaming became the target of the early social media and YouTube culture, he was right with his intention. Developers are the masters of the universe (at least in the world of software), and Microsoft gets it.

    Today the company is making a rather big announcement: It is open sourcing the server side .NET stack and expanding it to run on Linux and Mac OS platforms. All developers will now be able to build .NET cloud applications on Linux and Mac. These are huge moves for the company and follow its recent acknowledgement that at least 20 percent of Azure VMs are running Linux. This struck a chord in the Twittersphere but wasn’t all that surprising when you consider how pervasive Linux is in the cloud.

    These changes make us keenly aware of how much the software business has transformed over the last decade. Microsoft is redefining itself in response to a world driven by open source software and collaborative development and is demonstrating its commitment to the developer in a variety of ways that include today’s .NET news. A few years ago it was among the top 20 corporate contributors to the Linux kernel. It participates in the open SDN project, OpenDaylight, and the open IoT effort the AllSeen Alliance. And this year Microsoft joined the Core Infrastructure Initiative focused on funding critical open source projects running the world’s infrastructure. We do not agree with everything Microsoft does and certainly many open source projects compete directly with Microsoft products. However, the new Microsoft we are seeing today is certainly a different organization when it comes to open source.

    Today most software is built collaboratively.

    Reply
  22. Tomi Engdahl says:

    Don’t like droopy results, NetApp? Develop server-side SAN
    Declining revenues becoming a common theme
    http://www.theregister.co.uk/2014/11/13/netapp_develop_server_side_san_products_or_face_the_consequences/

    Reply
  23. Tomi Engdahl says:

    Microsoft takes .NET open source and makes it available for Linux and Mac
    Cautious welcome for Microsoft sea change
    http://www.theinquirer.net/inquirer/news/2381172/microsoft-takes-net-open-source-and-makes-it-available-for-linux-and-mac#

    MICROSOFT HAS MADE good on its promise to take the .NET framework that powers many aspects of Windows and its programs to open source, and in doing so, make it available to Linux and Mac users for the first time.

    The entire server stack will be posted to Github, starting with the next edition, an idea that would have been unfathomable less than two years ago.

    Announcing new governance model and project contributions to the .NET Foundation
    http://www.dotnetfoundation.org/blog/announcing-new-governance-model-and-project-contributions-to-the-net-foundation

    Reply
  24. Tomi Engdahl says:

    USB 3.0 chips released at Electronica
    http://www.edn.com/electronics-blogs/catching-waves/4437294/USB-3-0-chips-released-at-Electronica-?_mc=NL_EDN_EDT_EDN_analog_20141113&cid=NL_EDN_EDT_EDN_analog_20141113&elq=419b68d3c39040ed9ce05e02f676a43d&elqCampaignId=20162

    Global Industry Analytics projects that global sales of USB 3.0 enabled devices will reach 3 billion units by 2018. The major market driver is expected to be the need to increase transmission speeds between peripheral devices and computers.

    Reply
  25. Tomi Engdahl says:

    Welcome to the fast-moving world of flash connectors
    A guided tour
    http://www.theregister.co.uk/2014/11/13/flash_connectors/

    The standards

    Advanced Host Controller Interface (AHCI): When SATA first emerged we all had to scramble around to find drivers

    Suddenly there was a common standard that BIOS, operating system and controller could agree upon.
    AHCI is separate from the various SATA standards, although it exposes new functionality in those standards as they are added.

    SCSI has always been the server counterpart to consumer storage interconnects. Where SAS is the connector interface used by server drives, all the bits are actually communicating with each other using an evolution of the old SCSI standard.

    iSCSI is a specific implementation of the SCSI protocol. Not tied to any given interconnect, iSCSI is simply a means of transferring SCSI commands and data across a network.

    NVMe (non-volatile memory express) is the industry’s response to the proliferation of PCIe (peripheral component interconnect express) and now memory channel SSDs. Early PCIe storage, unlike SATA, required a different driver for each manufacturer, and support was not baked into operating systems. NVMe is to PCIe storage what AHCI was to SATA: it provides a common language for all the various bits to speak, and as a result native drivers exist for most modern operating systems (BSD, Linux, ESXi et al).

    SOP/PQI is the proposed SCSI protocol extension that would compete directly with NVMe.

    The old guard

    ATA, ATAPI, USB and cards are the old school ways of getting things done. Though it might surprise people, SSDs are still found for old parallel ATA connections as they are quite common in embedded devices. Card connectors, ranging from SD cards to USB sticks, can be found with ultra-low-end consumer flash as well as ultra-high-end enterprise SLC flash, if you know where to look. Again, the embedded space makes use of flash extensively and many a VMware ESXi server was built on a USB stick.

    SATA is probably the most common storage interconnect available today. It is predominantly used in tandem with the AHCI standard. Though SATA is significantly faster than its predecessor, parallel ATA, the use of AHCI makes it sub-optimal for flash.

    mSATA (mini-SATA) was an interconnect used for a short time to put SSDs into notebooks, embedded devices and so forth. mSATA was succeeded by M.2 and is not likely to be included in any future designs.

    SAS is the next most common storage interconnect available today. It was introduced around the same time as SATA as a means of bringing the technical advantages of high-speed serial interfaces to disks, controllers and drivers using the SCSI standard. SAS is just flat-out faster than SATA, which is half duplex: information flows in only one direction at a time. SAS is full duplex: you can read and write at the same time.

    Fibre Channel is a less common SCSI interconnect, popular with high-end enterprises. Despite the name, fibre optics is not required. Drives typically use a copper mechanical interconnect similar to SATA or SAS, plug into a hot-swap backplane and are connected to a host bus adapter (HBA).

    PCIe inside

    As you have probably worked out by now, current interconnect solutions are completely inadequate in fully utilising SSDs.
    And so, everyone hijacked PCIe. It is everywhere, from notebooks to servers, and there are even ARM SoCs with PCIe for tablets and embedded devices. It is as close to the CPU as you will get without using RAM slots, and it is fast.
    Using traditional PCIe slots to attach storage had a major drawback: you needed to power down the server to swap out the card.
    A series of new standards have emerged to bring PCIe to the drive, and they include port protocols to allow for important features like hot-swapping.

    PCIe outside

    SATA Express (SATAe) is the result of the move to PCIe: a standard defining a bunch of interconnects that can support traditional SATA drives as well as make PCIe lanes available. The protocol used is NVMe, and SATAe can most accurately be called “NVMe over PCIe”.

    SCSI Express is the evolution of SCSI to fill the same role: “SCSI over PCIe”.

    M.2, formerly known as the Next Generation Form Factor (NGFF), is the PCIe-enabled replacement for mSATA. M.2 is to all intents and purposes a SATAe connector. This means it exposes both a traditional SATA interface as well as PCIe lanes.

    As PCIe becomes the connector of choice for the average SSD, memory channel storage (MCS) – SSDs in the RAM slots – is taking up the role once served by PCIe SSDs.
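    As a small practical footnote (my own Linux-only sketch, not from the article), you can see which of these interconnect worlds your own drives live in straight from the kernel’s sysfs, without any vendor tooling:

    ```python
    # Linux-only sketch: classify block devices as spinning disks, SATA/SAS SSDs or
    # NVMe drives using sysfs attributes. Illustrative only; assumes the usual
    # /sys/block layout.
    from pathlib import Path

    for dev in sorted(Path("/sys/block").iterdir()):
        name = dev.name
        if name.startswith(("loop", "ram", "dm-", "md", "sr")):
            continue  # skip virtual devices and optical drives
        rotational = (dev / "queue" / "rotational").read_text().strip() == "1"
        if name.startswith("nvme"):
            kind = "NVMe SSD (PCIe-attached, NVMe protocol)"
        elif rotational:
            kind = "spinning disk (SATA/SAS behind AHCI or a SCSI stack)"
        else:
            kind = "non-rotational drive (most likely a SATA or SAS SSD)"
        print(f"{name}: {kind}")
    ```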

    Reply
  26. Tomi Engdahl says:

    Use Low-Code Platforms to Develop the Apps Customers Want
    http://www.cio.com/article/2845378/development-tools/use-low-code-platforms-to-develop-the-apps-customers-want.html

    Low-code, rapid development platforms provide a way to incorporate user feedback into apps during development. This improves the turnaround time for consumer-facing applications while ensuring that projects don’t turn into white elephants.

    If you’re developing an application for customers, but you only have a rough idea of what they want, then you face a Catch-22: You can’t specify the application’s requirements and develop it until you get feedback from customers, but they can’t provide feedback until you’ve developed it.

    How do you escape from this predicament? Many organizations have responded by using one of a growing breed of “low-code,” rapid development platforms.

    Clay Richardson, an analyst at Forrester Research, defines a low-code platform as one that enables fast application development and delivery with a minimum of hand coding. The platform should be easy to deploy and is likely to be used to develop customer-facing “systems of engagement.” Familiar names such as Red Hat, Software AG and Salesforce.com offer low-code platforms, as well as lesser-known companies such as Alpha Software, Claysys Technologies, Mendix and Mobideo.

    Low-code platforms certainly don’t eliminate hand coding altogether. As well as minimizing hand coding, though, they speed up application delivery by providing visual tools for the quick definition and assembly of forms and the rapid build-out of multistage workflows, Richardson says. They also allow the easy configuration of data models that help eliminate common data integration headaches.

    These platforms are useful for knocking together applications in a matter of days or weeks and getting them out for customers to try. Depending on how customers receive them, the applications can be abandoned as non-starters or developed in new iterations that incorporate user comments and suggestions.

    Low-code Platforms Work for Experienced and Novice Developers

    The drive for adopting low-code platforms tends to come from the chief marketing officer or the marketing team, Richardson believes. Nevertheless, the people actually using the tools tend to be existing full-time developers, who might otherwise code in Java, .NET or C#, rather than so-called “citizen developers” in the marketing department.

    However, there are low-code developers with a very different skill set to established developers as well. “These tend to be kids coming out of school with no programming background, but who can be trained in days to do this as a full-time job,” Richardson says. “They certainly couldn’t deliver anything in Java or C#, but they can deliver with these platforms – and they can do so at speed.”

    “This is important for applications when you don’t know what the business outcome will be,” he says. “In a world of slow, expensive development, these rapid platforms allow you to shorten the front end so you can get something on the table, tune it and get experience with it.”

    Reply
  27. Tomi Engdahl says:

    SQL Rises from the Ashes of Big Data
    http://www.big-dataforum.com/882/sql-rises-ashes-big-data

    So you thought SQL (“Sequel” for those in the know) was dead? Think again.

    According to an article in Forbes, SQL never really left—and never even lost its relevant status: “…it’s not so much a return to significance for SQL as recognition of its continuing relevance.”

    That continued relevance is a marriage of several different programs, which is the way the majority want to see it: “…most enterprise customers don’t see it as a contest between Big Data, NoSQL, and relational database technology.” According to Andy Mendelsohn, Executive Vice President of Oracle, “It’s not an ‘either/or’ thing, it’s an ‘and’ thing…They don’t want information silos; they want information integration…. now we can integrate data that’s stored in Oracle databases, NoSQL, and Hadoop into a single SQL query.” Getting rid of silos is the name of the game in big data analytics integration—allowing you to see patterns and ideas never otherwise imagined.

    So SQL is more than just here to stay. Mendelsohn adds, “SQL is a standard and all the vendors in this space follow that standard. That’s a big thing you find missing in the NoSQL world.”


    Reply
  28. Tomi Engdahl says:

    Five years of Go language:

    Half a decade with Go
    http://blog.golang.org/5years

    Five years ago we launched the Go project. It seems like only yesterday that we were preparing the initial public release

    At launch, there was a flurry of attention. Google had produced a new programming language, and everyone was eager to check it out. Some programmers were turned off by Go’s conservative feature set—at first glance they saw “nothing to see here”—but a smaller group saw the beginnings of an ecosystem tailored to their needs as working software engineers. These few would form the kernel of the Go community.

    After the initial release, it took us a while to properly communicate the goals and design ethos behind Go.

    Reply
  29. Tomi Engdahl says:

    Desktop Linux users beware: the boss thinks you need to be managed
    VMware reveals VDI for Linux desktops plan, plus China lab to do the development
    http://www.theregister.co.uk/2014/10/31/desktop_linux_users_beware_its_decided_you_need_management/

    Desktop Linux users beware: IT has noticed you and decided it’s time you were properly managed.

    So says VMware, which yesterday at its vForum event in China let it be known that it will deliver a desktop virtualisation (VDI) solution for Linux desktops.

    Virtzilla says it hasn’t bothered doing so before now because so few people use Linux on the desktop at work, and those that do are self-sufficient so IT leaves them to their own devices.

    But VMware says its customers now realise that in this highly-regulated age of the megabreach, unmanaged Linux desktops probably aren’t tenable. It therefore plans to take the bits of its Desktone desktop-as-a-service service – which already handles Linux desktops – and build an on-premises equivalent.

    Intriguingly, the product will be developed in China where VMware has just opened a new lab

    That’s a fair comment: a Linux desktop with an HTML5 browser can do just about anything required by a great many end users. If VMware can make Linux desktops more manageable, it’ll be helping a lot more folks than those already using Linux on the desktop. Virtzilla will also gain an interesting way to pitch VDI as a way to do desktop upgrades without Windows licences.

    Don’t wait up late for the VDI product: VMware says it “is expected in 2015”.

    Reply
  30. Tomi Engdahl says:

    Technology To Automate A Third Of U.K. Jobs Over Next 20 Years, Says Deloitte
    http://techcrunch.com/2014/11/11/automation-uk-jobs/?ncid=rss&cps=gravity

    Technology, automation and robotics destroy jobs by replacing human work with machines, and demand that workforces change up their skills to remain employable. This we know.

    But a new study by professional services firm Deloitte has quantified the rate of destruction for the U.K. jobs market over the next 20 years – predicting that around one-third (35 per cent) of existing jobs across the U.K. are under high risk of replacement via automation over this time period.

    The U.K. study links job destruction to rates of pay, with lower-paid jobs (paying less than £30,000) more than five times more likely to be replaced than higher-paid jobs (paying over £100,000).

    This link between lower paid jobs and automation suggests technology risks fueling growing wealth inequalities — unless education and training can be successfully reconfigured to upskill populations with the digital, management and creative skills that are at reduced risk of automation.

    The study found that lower paid jobs are almost eight times as likely to be replaced than higher paid jobs when looking specifically at London.

    A very large majority (84 per cent) of London businesses said the skills of their employees will need to change over the next 10 years, with ‘digital know-how’, ‘management’ and ‘creativity’ identified as skills increasingly in need, and ‘processing,’ ‘support and clerical work’ and ‘foreign languages’ less so.

    Reply
  31. Tomi Engdahl says:

    Your Incompetent Boss Is Making You Unhappy
    http://developers.slashdot.org/story/14/11/13/1613258/your-incompetent-boss-is-making-you-unhappy

    A new working paper shows strong support for what many have always suspected: your boss’s technical competence is the single strongest predictor of workers’ well-being, way ahead of other factors such as education, earnings, job tenure and public vs. private sector.

    Boss Competence and Worker Well-being
    http://www.andrewoswald.com/docs/NovArtzGoodallOswald2014.pdf

    Nearly all workers have a supervisor or ‘boss’. Yet there is almost no published research by economists into how bosses affect the quality of employees’ lives. This study offers some of the first formal evidence. First, it is shown that a boss’s technical competence is the single strongest predictor of a worker’s well-being. Second, we provide equivalent instrumental-variable results. Third, we demonstrate longitudinally that even if a worker stays in the same job and workplace then a newly competent supervisor greatly improves the worker’s well-being.

    Reply
  32. Tomi Engdahl says:

    Nvidia upgrades Shield tablet, gets Valve classics, and launches cloud-based tablet games
    http://venturebeat.com/2014/11/13/nvidia-upgrades-shield-tablet-gets-valve-exclusive-and-launches-cloud-based-tablet-games/

    Nvidia is going to make its Shield gaming tablet owners a lot happier soon.

    The graphics-chip company is launching a series of updates to its tablet, including an upgrade to the newest version of Google’s Android operating system. These upgrades show that Nvidia is serious about expanding from graphics chips to providing whole gaming systems to its hardcore fans.

    Nvidia will also launch what it’s calling the Green Box Bundle, which includes a 32GB Shield Tablet (LTE) with Android 5.0, a free version of Valve’s Half-Life 2: Episode One (and the earlier Half-Life 2 and Portal), and access to a bunch of cloud-based hardcore games over its cloud-based Nvidia Grid.

    Reply
  33. Tomi Engdahl says:

    Amazon: ARM Chipmakers Aren’t Matching Intel’s Innovation
    http://www.bloomberg.com/news/2014-11-13/amazon-arm-chipmakers-aren-t-matching-intel-s-innovation.html

    Amazon.com Inc. (AMZN), which operates some of the world’s largest data centers, said makers of chips that use ARM Holdings Plc (ARM)’s technology aren’t keeping up with Intel Corp. (INTC)’s pace of innovation.

    As a result, Amazon isn’t ready to start using alternatives to Intel’s chips in its servers, according to James Hamilton, a vice president for Amazon Web Services, which provides computing power and storage over the Internet to other companies.

    “It’s just not quite moving fast enough,” Hamilton said in an interview, referring to the pace of development of ARM-chip technology. He spoke after a presentation at Amazon’s annual Web-services conference in Las Vegas.

    Winning over clients such as AWS in server chips is critical for Advanced Micro Devices Inc. (AMD), Applied Micro Circuits Corp. and other companies seeking to loosen Intel’s grip on the lucrative market for server processors. They argue that Intel’s market share of 98 percent means that its customers pay too much and need alternatives.

    Data-center operators, many of which make their own servers, are looking for more efficient components, not just cheaper ones, since the price of powering and cooling the warehouses full of computers exceeds the cost of the equipment. As yet, it’s not worth replacing Intel-based machines, Hamilton said.
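
    The economics behind that remark are worth a quick sanity check. Below is a toy C sketch; every number in it (purchase price, power draw, PUE, electricity price, service life) is a made-up assumption for illustration only, not data from the article, but it shows how lifetime power and cooling can plausibly exceed the price of the box itself:

    /* Toy illustration with hypothetical numbers: lifetime power-and-cooling
       cost of one server versus its purchase price. */
    #include <stdio.h>

    int main(void)
    {
        double purchase_price = 2500.0; /* USD, assumed server price       */
        double avg_draw_kw    = 0.40;   /* assumed average IT load         */
        double pue            = 1.6;    /* facility power / IT power       */
        double usd_per_kwh    = 0.10;   /* assumed electricity price       */
        double years          = 5.0;    /* assumed service life            */

        double hours  = years * 365.0 * 24.0;
        double energy = avg_draw_kw * pue * hours;  /* kWh, incl. cooling  */
        double opex   = energy * usd_per_kwh;

        printf("power + cooling over %.0f years: $%.0f\n", years, opex);
        printf("purchase price:                  $%.0f\n", purchase_price);
        printf("opex / capex ratio:               %.2f\n", opex / purchase_price);
        return 0;
    }

    With these particular assumptions the energy bill comes out around $2,800 against a $2,500 machine, which is why a chip that does the same work in fewer watts can matter more than a chip that simply costs less.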

    Reply
  34. Tomi Engdahl says:

    Microsoft surpasses Exxon as 2nd most valuable co.
    http://hosted.ap.org/dynamic/stories/U/US_MICROSOFT_NO_2?SITE=AP&SECTION=HOME&TEMPLATE=DEFAULT

    SAN FRANCISCO (AP) — The bull run in Microsoft’s stock this past year has helped the tech giant surpass Exxon Mobil and seize the rank of the second most valuable company, behind Apple Inc.

    Under new CEO Satya Nadella, Microsoft has worked to overcome its reputation as a clumsy behemoth struggling to keep up with new tech trends and consumer habits. Nadella has cut expenses – and jobs – while pledging to refocus the company on mobile technology and cloud computing. His efforts have fueled a stock surge that drove Microsoft’s total market value above $410 billion on Friday.

    Reply
  35. Tomi Engdahl says:

    IBM, Nvidia land $325M supercomputer deal
    http://www.cnet.com/au/news/ibm-nvidia-land-325-million-supercomputer-deal/

    US Energy Department funds two huge machines that combine IBM and Nvidia chips with Mellanox networking. A further $100 million goes toward making faster next-gen supercomputers.

    In a Department of Energy deal worth $325 million, IBM will build two massive supercomputers called Sierra and Summit that combine a new supercomputing approach from Big Blue with Nvidia processing accelerators and Mellanox high-speed networking.

    The companies and US government agency announced the deal on Friday ahead of a twice-yearly supercomputing conference that begins Monday. The show focuses on the high-end systems — sometimes as large as a basketball court — that are used to calculate car aerodynamics, detect structural weaknesses in airplane designs and predict the performance of new drugs.

    The funds will pay for two machines, one for civilian research at the Oak Ridge National Laboratory in Tennessee and one for nuclear weapons simulation at the Lawrence Livermore National Laboratory in California.

    Reply
  36. Tomi Engdahl says:

    Multi-target IDE for 8-Bit CPUs
    http://hackaday.com/2014/11/15/multi-target-ide-for-8-bit-cpus/

    The project is called ASM80, and includes a code editor, a workspace to put all your code, compilers for the 8080/8085, Z80, 6502, 6800 and 6809 CPUs, emulators for all these CPUs, and emulators for a few Czech computers, the ZX Spectrum, and a few of [Grant Searle]‘s single board computers.

    What makes this project interesting is that the syntax for all the different CPUs is pretty much the same. It’s a real, modular code editor that supports macros and everything else you would expect from a code editor for these ancient computers.

    http://www.asm80.com/

    Reply
  37. Tomi Engdahl says:

    IBM and Nvidia to shame Tianhe-2 with 150 petaflop supercomputer
    US government throws $325m at firms to build world’s fastest systems
    http://www.theinquirer.net/inquirer/news/2381703/ibm-and-nvidia-to-shame-tianhe-2-with-150-petaflop-supercomputer

    THE US DEPARTMENT OF ENERGY (DoE) has thrown $325m at IBM and Nvidia to build the world’s fastest supercomputers by 2017.

    Dubbed Sierra and Summit, the two supercomputers are tipped to deliver more than three times the performance of those currently available.

    They are expected to perform at 100 petaflops and 150 petaflops, respectively, compared to the world’s current top super-computer, the Intel-powered Tianhe-2, which performs at 55 petaflops.

    This will be thanks to IBM’s OpenPower chips and Nvidia’s new Volta graphics chip, along with Mellanox’s high-speed networking kit.

    IBM is promising much-improved performance thanks to its new data-centric architecture, which will embed compute power everywhere data resides in the system, allowing for a “convergence of analytics, modelling, visualization and simulation, driving new insights at incredible speeds.”

    Reply
  38. Tomi Engdahl says:

    ‘Facebook At Work’ Could Be Coming To Your Office
    http://www.buzzfeed.com/claudiakoerner/facebook-at-work-could-be-coming-to-your-office

    The Financial Times reported Sunday that the social networking giant is trying to gain traction in the workplace. Facebook did not immediately comment on the report.

    Reply
  39. Tomi Engdahl says:

    Facebook is making ‘Facebook at Work,’ so you can Facebook at work
    http://mashable.com/2014/11/16/facebook-at-work-2/

    You already Facebook at work, so here’s “Facebook at Work.”

    Facebook is working on extending its network beyond the social realm and into the professional world, according to the Financial Times, citing anonymous sources.

    The company’s new, enterprise-focused product will be similar to the functionality of its current site, with a newsfeed, groups and messaging capability. However, it will also include collaborative tools for work on shared documents. Facebook at Work will be entirely separate from personal accounts, with no information from a user’s social profile appearing on his or her professional page, and vice versa.

    Reply
  40. Tomi Engdahl says:

    LSI driver bug is breaking VSANs, endangering data
    More VSAN hardware trouble for VMware
    http://www.theregister.co.uk/2014/11/17/lsi_driver_bug_is_breaking_vsans_endangering_data/

    VMware says its VSAN virtual storage array is selling well, earning hardware-makers’ attention and making plain the wisdom of the software-defined data centre.

    It may well be, but VSAN is also having some teething problems. Back in July, VMware was forced to change its recommended VSAN system configurations because VSANs were choking on suggested setups.

    Now comes news, thanks to VMware partner SynchroNet, that an LSI component used by several server-makers is causing some VSANs to fail.

    The component in question is the LSI 2208, a RAID controller SynchroNet’s John Nicholson says is used by the likes of Dell, HP and Cisco in their VSAN boxen. The 2208 is useful because it enables “pass-through” mode whereby the RAID controller is able to optimise use of spinning rust. Without a pass-through-enabled controller, VMware says “VSAN will not function efficiently” and “Performance on the VSAN datastore will not be maximized in this configuration.”

    Nicholson’s post suggests the problem has been known for a while, and that he’s been told by VMware that for now the best workaround is dropping to RAID 0.

    Reply
  41. Tomi Engdahl says:

    GTK+ Developers Call For Help To Finish Cross-Platform OpenGL Support
    http://tech.slashdot.org/story/14/11/16/1619248/gtk-developers-call-for-help-to-finish-cross-platform-opengl-support

    OpenGL support under GTK is getting into good shape to provide a nice, out-of-the-box experience by default on key platforms for the GTK+ 3.16 / GNOME 3.16 release in March. Native OpenGL support has been in mainline GTK+ for a few weeks now, and with it comes a new GtkGLArea widget that allows OpenGL drawing within GTK applications.

    OpenGL Support Is Looking Good For GTK+ 3.16, But Help Is Needed
    http://www.phoronix.com/scan.php?page=news_item&px=MTg0MDc

    Since that initial work landed, more GTK+ OpenGL code has been progressing, though right now it primarily benefits Linux X11 and Wayland users.

    Emmanuele Bassi of GNOME has issued a call for help from developers experienced with OpenGL on other platforms like Windows and OS X. GNOME developers are looking for experienced Windows / OS X developers that are familiar with using OpenGL contexts on those platforms to consider implementing the GdkGLContext API using WGL and AppleGL.
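
    For readers wondering what the new widget looks like from application code, here is a minimal sketch of a GtkGLArea-based program, written against the GTK+ 3.16 API and assuming libepoxy provides the GL entry points. It is only an illustration of the pattern described above, not code from the GTK+ project:

    /* build: gcc glarea.c `pkg-config --cflags --libs gtk+-3.0 epoxy` */
    #include <gtk/gtk.h>
    #include <epoxy/gl.h>

    /* The "render" signal fires whenever the widget needs repainting;
       GTK+ makes the GdkGLContext current before calling this handler. */
    static gboolean
    on_render (GtkGLArea *area, GdkGLContext *context, gpointer user_data)
    {
      glClearColor (0.2f, 0.2f, 0.2f, 1.0f);
      glClear (GL_COLOR_BUFFER_BIT);
      return TRUE; /* drawing handled */
    }

    int
    main (int argc, char *argv[])
    {
      gtk_init (&argc, &argv);

      GtkWidget *window  = gtk_window_new (GTK_WINDOW_TOPLEVEL);
      GtkWidget *gl_area = gtk_gl_area_new ();   /* the new widget in 3.16 */

      g_signal_connect (gl_area, "render",  G_CALLBACK (on_render), NULL);
      g_signal_connect (window,  "destroy", G_CALLBACK (gtk_main_quit), NULL);

      gtk_container_add (GTK_CONTAINER (window), gl_area);
      gtk_widget_show_all (window);
      gtk_main ();
      return 0;
    }

    On Linux X11 and Wayland this kind of code already works against mainline GTK+; the call for help above is precisely about making the same GdkGLContext machinery work on Windows (WGL) and OS X (AppleGL).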

    Reply
  42. Tomi Engdahl says:

    Ars Dissects Android’s Problems With Big Screens — Including In Lollipop
    http://mobile.slashdot.org/story/14/11/16/2242212/ars-dissects-androids-problems-with-big-screens—-including-in-lollipop

    Ars Technica writer Andrew Cunningham takes a detailed, illustrated look at how Android handles screens much larger than seven inches, going back to the first large Android tablets a few years ago, and including Android 5.0 (Lollipop) on the Nexus 10 and similar-sized devices. Cunningham is unimpressed with the use of space for both practical and aesthetic reasons, and says that problems crop up in areas that are purely under Google’s control, like control panels and default apps, as well as (more understandably) in third-party apps.

    The Nexus 10, Lollipop, and the problem with big Android tablets
    When it comes to tablets, Google doesn’t even follow its own design guidelines.
    http://arstechnica.com/gadgets/2014/11/the-nexus-10-lollipop-and-the-problem-with-big-android-tablets/

    I’ve never been tempted to buy a large widescreen tablet. They’re good at certain things, but they’re too wide for everything onscreen to be reachable if you’re holding it with both hands. They’re too tall for portrait mode to be comfortable for long stretches. One-handed use is generally tolerable at best. Smaller widescreen tablets like the Nexus 7 are nice because they’re closer in size and heft to books, but 10-inch-and-up widescreen tablets have always been too gawky for my taste.

    The problem two years ago was that the Android ecosystem was light on good tablet apps. There wasn’t a ton to do with that big screen, which meant there wasn’t much incentive to choose the Nexus 10 over an iPad or a smaller Android tablet. In examining Lollipop on the Nexus 10, our biggest questions are about the ways the redesigned OS and apps make use of that extra space.

    The Nexus 10 took 10-inch tablets back to the “blown-up phone” version of the UI, where buttons and other UI stuff was all put in the center of the screen. This makes using a 10-inch tablet the same as using a 7-inch tablet or a phone, which is good for consistency, but in retrospect it was a big step backward for widescreen tablets.

    Our biggest problem is the way apps look (1) on a screen this large and (2) in landscape mode. Even Google’s first-party apps don’t make great use of this space in their Lollipop and Material Design updates.

    A step up from the worst apps are the “acceptable” apps like Google Drive, Docs, Sheets, and Slides; Google Keep; Google Calendar; Google Maps; or Google+ Photos. These apps do nothing in particular to look great on a big tablet screen (they lack any kind of multi-pane layout, for example), but they at least take all the screen space they’re given and fill it up with stuff. Most of these apps also support swiping in from the left side of the screen to pop out the “hamburger” menu, a handy navigation option.

    Now, we’re not saying that it’s necessarily good user interface design to cram each and every screen full of as much stuff as will fit on it. But our chief frustration with the Nexus 10 is exactly the same as it was two years ago when we reviewed it. The tablet has a sharp, expansive screen, and Android and its apps do almost nothing useful with it.

    Reply
  43. Tomi Engdahl says:

    Open Source Self-Healing Software For Virtual Machines
    http://linux.slashdot.org/story/14/11/16/1846227/open-source-self-healing-software-for-virtual-machines

    Computer scientists have developed Linux-based software that not only detects and eradicates never-before-seen viruses and other malware, but also automatically repairs damage caused by them. If a virus or attack stops the service, A3 could repair it in minutes without having to take the servers down.

    Self-repairing software tackles malware
    http://www.sciencedaily.com/releases/2014/11/141113140011.htm

    University of Utah computer scientists have developed software that not only detects and eradicates never-before-seen viruses and other malware, but also automatically repairs damage caused by them. The software then prevents the invader from ever infecting the computer again.

    A3 is a software suite that works with a virtual machine — a virtual computer that emulates the operations of a computer without dedicated hardware. The A3 software is designed to watch over the virtual machine’s operating system and applications, says Eric Eide, University of Utah research assistant professor of computer science leading the university’s A3 team with U computer science associate professor John Regehr. A3 is designed to protect servers or similar business-grade computers that run on the Linux operating system. It also has been demonstrated to protect military applications.

    The new software called A3, or Advanced Adaptive Applications, was co-developed by Massachusetts-based defense contractor, Raytheon BBN, and was funded by Clean-Slate Design of Resilient, Adaptive, Secure Hosts, a program of the Defense Advanced Research Projects Agency (DARPA). The four-year project was completed in late September.

    There are no plans to adapt A3 for home computers or laptops, but Eide says this could be possible in the future.

    “A3 technologies could find their way into consumer products someday, which would help consumer devices protect themselves against fast-spreading malware or internal corruption of software components. But we haven’t tried those experiments yet,” he says.

    https://www.flux.utah.edu/project/a3

    Reply
  44. Tomi Engdahl says:

    Every third company is putting more money into IT

    There are still signs of recession in the Nordic ICT market. Nevertheless, roughly a third of the organizations participating in IDC’s Nordic CxO survey intend to increase their IT investments significantly, or at least somewhat. 42 percent plan to keep their budgets at current levels.

    “It’s a great thing that more and more IT decision-makers, on both the private and public side, say they are putting more money into information technology,” says IDC’s Nordic research director Jason Andersson.

    The survey found that four out of ten respondents are putting that money into infrastructure projects, over a third into necessary software purchases, and a fifth into new projects.

    “More than eight out of ten respondents said they need information technology to improve competitiveness and efficiency. That is a big change, and one of the reasons IT investments are growing.”

    “Cost reduction is still present and important, but other things are now prioritized above it. This year, projects to improve the customer experience topped the list in all the Nordic countries.”

    Second place on the priority list was taken by security, and third by mobility.

    “The importance of security-related projects will change depending on what happens in the world.”

    Source: http://summa.talentum.fi/article/tv/uutiset/109910

    Reply
  45. Tomi Engdahl says:

    Intel will merge PC and mobile CPU groups, just as the devices themselves are converging
    http://www.pcworld.com/article/2849052/intel-to-combine-pc-and-mobile-divisions-as-market-shifts.html

    Intel will combine its PC and mobile processor divisions under one roof, reflecting a changing market in which the line between tablets and laptops has blurred.

    The chip maker will form a new division at the start of next year called the Client Computing Group, which will include the teams that develop its Core processors for desktops and laptops, as well as those that develop its Atom chips for smartphones and tablets.

    Until recently, Intel served the PC market with its powerful Core processors and the smartphone and tablet markets with its low-power Atom chips, but those lines are no longer so clear. The emergence of hybrid computers, which can switch between a laptop and a tablet, has done much to blur the boundary.

    “Industry-wide, the lines have been blurring,” Mulloy said. “The question is whether we’re organized to map to where the market is going.”

    Reply
  46. Tomi Engdahl says:

    Intel’s cash-cow PC chip division to GOBBLE mobile unit in reorg
    Mighty Chipzilla won’t have money-losing unit to kick around anymore
    http://www.theregister.co.uk/2014/11/18/intels_cash_cow_pc_chip_division_to_gobble_mobile_unit_in_reorg/

    Intel plans to combine its money-printing PC and loss-making mobile processor groups – even as it comes under mounting pressure from shareholders to deliver results in the mobile computing market, where Chipzilla is a piker compared to upstart rival ARM.

    Intel CEO Brian Krzanich let slip the news in an email to employees on Monday, according to a report in the Wall Street Journal.

    Intel Plans Merger Of Mobile And PC Divisions
    by Brett Howse on November 18, 2014 12:40 AM EST
    http://www.anandtech.com/show/8731/intel-plans-merger-of-mobile-and-pc-divisions

    According to a report this evening from the Wall Street Journal, in an email sent to employees by Intel CEO Brian Krzanich, Intel has announced plans to merge their struggling Mobile division with the PC Division. The newly created Client Computing Group would be led by Kirk Skaugen, who currently heads the PC division for Intel. The change in reporting is announced to commence in the beginning of calendar year 2015.

    Reply
  47. Tomi Engdahl says:

    HTML5: It’s Already Everywhere, Even In Mobile
    http://mobile.slashdot.org/story/14/11/18/0121223/html5-its-already-everywhere-even-in-mobile

    Tom Dale has never been shy, and in a Q&A with Matt Asay on ReadWrite, the EmberJS co-founder and JavaScript evangelist makes the outspoken claim that open Web technologies are already everywhere, even in native mobile apps, and that it’s only a matter of time before they catch up to “all the capabilities of a native, proprietary platform.” Take that, Web-is-dead doomsayers.

    HTML5’s “Dirty Little Secret”: It’s Already Everywhere, Even In Mobile
    Just look under the hood, says EmberJS co-founder Tom Dale.
    http://readwrite.com/2014/11/17/html5-javascript-everywhere-mobile-tom-dale-emberjs

    Reply
  48. Tomi Engdahl says:

    Do Good Programmers Need Agents?
    http://developers.slashdot.org/story/14/11/18/0025237/do-good-programmers-need-agents

    A rock star needs an agent, so maybe a rock star programmer needs one, too. As described in The New Yorker, a talent agency called 10x, which got started in the music business, is not your typical head hunter/recruiter agency. “The company’s name comes from the idea, well established in the tech world, that the very best programmers are superstars, capable of achieving ten times the productivity of their merely competent colleagues.”

    The Programmer’s Price
    Want to hire a coding superstar? Call the agent.
    http://www.newyorker.com/magazine/2014/11/24/programmers-price?intcid=mod-most-popular

    The agency 10x has nearly eighty clients, mostly in North America, though one codes from India and one from beaches in Thailand.

    Solomon thought that he might be interested in AuthorBee’s use of Twitter. “He knows the Twitter A.P.I. in his sleep.”

    “What kind of price range are we talking about?” Bradley asked.

    “Ballpark, for this role you’re talking a hundred and fifty to two hundred and fifty dollars an hour.”

    Reply
