Computer trends for 2014

Here is my collection of trends and predictions for year 2014:

It seems that the PC market is not recovering in 2014. IDC is forecasting that the technology channel will buy around 34 million fewer PCs this year than last, and things aren’t going to improve any time soon (down, down, down until 2017?). There will be no let-up on any front, with desktops and portables predicted to decline in both mature and emerging markets. Perhaps the chief concern for future PC demand is a lack of reasons to replace an older system: PC usage has not moved significantly beyond consumption and productivity tasks to differentiate PCs from other devices. As a result, PC lifespans continue to increase. The Death of the Desktop article says that, sadly for the traditional desktop, it is only a matter of time before its purpose expires, and that this will inevitably happen within this decade. (I expect that it will not completely disappear.)

While the PC business is slowly declining, the smartphone and tablet businesses will grow quickly. Some time in the next six months, the number of smartphones on earth will pass the number of PCs. This shouldn’t really surprise anyone: the mobile business is much bigger than the computer industry. There are now perhaps 3.5-4 billion mobile phones, replaced every two years, versus 1.7-1.8 billion PCs replaced every 5 years. Smartphones broke down the wall between those industries a few years ago – suddenly tech companies could sell into an industry with $1.2 trillion in annual revenue. Now you can sell more phones in a quarter than the PC industry sells in a year.
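A quick back-of-the-envelope check of that replacement-cycle arithmetic (taking the midpoints of the quoted ranges is my own assumption, not the article's):

```python
# Implied annual unit sales from the installed-base and replacement-cycle
# figures quoted above (all base numbers are the article's estimates).
phones_installed = 3.75e9   # midpoint of 3.5-4 billion phones (assumed)
phone_cycle_years = 2       # phones replaced every two years
pcs_installed = 1.75e9      # midpoint of 1.7-1.8 billion PCs (assumed)
pc_cycle_years = 5          # PCs replaced every five years

phones_per_year = phones_installed / phone_cycle_years  # ~1.9 billion/year
pcs_per_year = pcs_installed / pc_cycle_years           # ~350 million/year

# One quarter of phone sales already exceeds a full year of PC sales.
phones_per_quarter = phones_per_year / 4
```

On these midpoints, a single quarter of phone shipments (~470 million) does indeed beat a whole year of PC shipments (~350 million), which is the claim the paragraph makes.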

Within a few years we will end up with somewhere over 3bn smartphones in use on earth, almost double the number of PCs. There are perhaps 900m consumer PCs on earth, and maybe 800m corporate PCs. The consumer PCs are mostly shared and the corporate PCs locked down, and neither is really mobile. Those 3 billion smartphones will all be personal, and all mobile. Mobile browsing is set to overtake traditional desktop browsing in 2015. The smartphone revolution is changing how consumers use the Internet, and this will influence web design.


The only PC sector that seems to have some growth is the server side. The Microservers & Cloud Computing to Drive Server Growth article says that increased demand for cloud computing and high-density microserver systems has brought the server market back from a state of decline. We’re seeing fairly significant change in the server market. According to the 2014 IC Market Drivers report, server unit shipments will grow in the next several years, thanks to purchases of new, cheaper microservers. The total server IC market is projected to rise by 3% in 2014 to $14.4 billion: the multicore MPU segment for microservers and NAND flash memories for solid-state drives are expected to see better numbers.

The Spinning rust and tape are DEAD. The future’s flash, cache and cloud article says that flash is the tier for primary data – the stuff christened tier 0. Data that needs to be written out to a slower-response store goes across a local network link to a cloud storage gateway, which holds the tier 1 nearline data in its cache. The Never mind software-defined HYPE, 2014 will be the year of storage FRANKENPLIANCES article says that more hype around Software-Defined-Everything will keep the marketeers and the marchitecture specialists well employed for the next twelve months, but don’t expect anything radical. The only innovation is going to be around pricing and consumption models as vendors try to maintain margins. FCoE will continue to be a side-show and FC, like tape, will soldier on happily. NAS will continue to eat away at the block storage market, and perhaps 2014 will be the year that object storage finally takes off.

The IT managers are increasingly replacing servers with SaaS article says that cloud providers are taking on a bigger share of the servers as the overall market starts declining. An in-house system is no longer the default for many companies. IT managers want to cut the number of servers they manage, or at least slow the growth, and they may be succeeding. IDC expects that anywhere from 25% to 30% of all the servers shipped next year will be delivered to cloud services providers. In three years, by 2017, nearly 45% of all the servers leaving manufacturers will be bought by cloud providers. The shift will slow server sales to enterprise IT. Big cloud providers increasingly use their own designs instead of servers from the big manufacturers. Data center consolidations are eliminating servers as well. To be sure, IT managers are going to be managing physical servers for years to come, but the number will be declining.

I hope that the IT business will start to grow this year as predicted. Information technology spending will increase next financial year, according to N Chandrasekaran, chief executive and managing director of Tata Consultancy Services (TCS), India’s largest information technology (IT) services company. IDC predicts that worldwide IT spending will increase by 5 per cent next year, to $2.14 trillion. The biggest opportunity is expected to lie in the digital space: social, mobility, cloud and analytics. The gradual recovery of the economy in Europe will restore faith in business. Companies are re-imagining their business, keeping in mind changing digital trends.

The death of Windows XP will be in the news many times during the spring, and there will be companies trying to cash in on it: Microsoft’s plan to end Windows XP support next spring has prompted IT service providers, as well as competitors, to invest in marketing their own services. HP is peddling its Connected Backup 8.8 service to customers to prevent data loss during migration. VMware is selling a cloud desktop service. Google is wooing users to switch to ChromeOS by making Chrome’s user interface familiar to wider audiences. Exploiting XP most effectively is Arkoon, a subsidiary of the European defense giant EADS, which promises support for XP users who do not want to, or cannot, upgrade their systems.

There will be talk about what is coming from Microsoft next year. Microsoft is reportedly planning to launch a series of updates in 2015 that could see major revisions for the Windows, Xbox, and Windows RT platforms. Microsoft’s wave of spring 2015 updates to its various Windows-based platforms has a codename: Threshold. If all goes according to early plans, Threshold will include updates to all three OS platforms (Xbox One, Windows and Windows Phone).


Amateur programmers are becoming increasingly prevalent in the IT landscape. A new IDC study has found that of the 18.5 million software developers in the world, about 7.5 million (roughly 40 percent) are “hobbyist developers,” which is what IDC calls people who write code even though it is not their primary occupation. The boom in hobbyist programmers should cheer computer literacy advocates. IDC estimates there are almost 29 million ICT-skilled workers in the world as we enter 2014, including 11 million professional developers.

The Challenge of Cross-language Interoperability will be talked about more and more. Interfacing between languages will be increasingly important: you can no longer expect a nontrivial application to be written in a single language. With software becoming ever more complex and hardware less homogeneous, the likelihood of a single language being the correct tool for an entire program is lower than ever. The trend toward increased complexity in software shows no sign of abating, and modern hardware creates new challenges. Mobile phones are now starting to appear with eight cores sharing the same ISA (instruction set architecture) but running at different speeds, plus streaming processors optimized for different workloads (DSPs, GPUs) and other specialized cores.
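As a minimal sketch of what cross-language interfacing looks like in practice (my own illustration, not something from the article), here is Python calling a plain C function from the standard math library through ctypes; the explicit signature declaration is what lets the two languages agree on how arguments are passed:

```python
import ctypes
import ctypes.util

# Locate and load libm, the C standard math library (present on any
# POSIX system; this example assumes such a system).
libm = ctypes.CDLL(ctypes.util.find_library("m"))

# Foreign calls need explicit type information: declare the C signature
# double cos(double) so ctypes marshals the argument and result correctly.
libm.cos.argtypes = [ctypes.c_double]
libm.cos.restype = ctypes.c_double

result = libm.cos(0.0)  # calls straight into compiled C code
```

Every cross-language boundary carries this kind of marshaling contract; get the declared types wrong and the call silently corrupts data, which is exactly why interoperability is hard at scale.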

Just another new USB connector type will be pushed to market. The Lightning strikes USB bosses: Next-gen ‘type C’ jacks will be reversible article says that USB is to get a new, smaller connector that, like Apple’s proprietary Lightning jack, will be reversible. Designed to support both USB 3.1 and USB 2.0, the new connector, dubbed “Type C”, will be the same size as an existing micro USB 2.0 plug.

2,130 Comments

  1. Tomi Engdahl says:

    One Windows? How does that work… and WTF is a Universal App?
    Microsoft’s strategy is to make Store apps popular. Good luck with that
    http://www.theregister.co.uk/2014/10/02/one_windows_how_does_that_work_and_wtf_is_a_universal_app/

    “Whether it’s building a game, or a line of business application, there’ll be one way to write a universal application that targets the entire product family,” said Microsoft Executive VP Terry Myerson, announcing Windows 10 yesterday.

    Windows 10, Myerson said, will embrace devices of every type, from tiny embedded systems to PCs with 80-inch screens.

    Developers will rightly be sceptical. Microsoft is notorious for introducing new APIs and frameworks (the building blocks of applications) and deprecating old ones, including numerous database APIs and game development with XNA.

    Windows 10 everywhere does not mean that the operating system will have the same features and user interface on every kind of device.

    “We’re not talking about one UI to rule them all,” said Myerson. The core of the OS may be the same, but there will be different APIs and features according to the device type. Microsoft’s idea though is that the Windows Runtime will be included on all them, to unify its platforms, and that developers will be able to write universal apps that run everywhere.

    But what is a universal app? The term describes a project type in Visual Studio, Microsoft’s primary development tool and is documented here.

    A universal app has multiple targets, each with its own code. There is also a shared code area. When you build the app, Visual Studio combines the shared code with the target-specific code, creating an executable for each target.

    The universal app concept is a good one, but unfortunately it still takes effort to support multiple targets. The idea is to move as much code as possible into the shared area, but with platforms as different as Windows Phone versus a full size Windows tablet, there will be both user interface code and device-specific code that cannot be shared.
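The shared-plus-target build model described above can be sketched as a loose analogy in Python (this is my own illustration of the idea, not Microsoft's tooling; all names here are invented):

```python
# Analogy only: a "universal app" build combines one shared code area with
# per-target code, producing one program per target, much as Visual Studio
# merges the shared area with each target project.

SHARED = {
    # Logic that is identical on every device goes in the shared area.
    "format_note": lambda text: text.strip().capitalize(),
}

TARGETS = {
    # Device-specific UI code cannot be shared and lives per target.
    "phone":  {"render": lambda note: f"[small screen] {note}"},
    "tablet": {"render": lambda note: f"[large screen] {note}"},
}

def build(target):
    """Combine the shared area with one target's code into one 'app'."""
    app = dict(SHARED)
    app.update(TARGETS[target])
    return app

phone_app = build("phone")
output = phone_app["render"](phone_app["format_note"]("  hello world "))
```

The developer's incentive is the same as in the article: push as much as possible into `SHARED`, because everything left in `TARGETS` must be written once per device family.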

    Another issue is that universal apps only support Windows Runtime (Windows Store app) targets. Standard Windows desktop apps are not included. This means that Microsoft’s One Windows strategy only makes sense if Windows Store apps become popular. Currently that is not the case.

    If you are developing an app for, say, Windows Phone, then a universal app makes sense if it can easily be adapted to run on PCs. Those developing line-of-business apps for Windows PCs face a different decision.

    The bottom line is that even if Microsoft delivers the Windows 10 kernel across all kinds of devices, and improves the commonality between the versions of the Windows Runtime included on those devices, universal apps remain a hard sell.

    Build universal Windows apps that target Windows and Windows Phone
    http://msdn.microsoft.com/en-us/library/windows/apps/xaml/dn609832.aspx

  2. Tomi Engdahl says:

    Will Windows 10 address the operating system’s biggest weakness?
    http://www.networkworld.com/article/2690354/microsoft-subnet/will-windows-10-address-the-operating-systems-biggest-weakness.html

Ever tried using a five-year-old PC to get anything done? It’s painful, because over time the OS decays.

    So the wraps are off, and no one got the name change right. Windows 10 comes with a whole lot of promises, not the least of which is that the company is listening to users and wants their feedback. So something tells me this OS will not be met with the derision of Windows 8.

    At the grand unveiling, numerous features were discussed, from the interesting (multiple desktops) to the silly (ctrl-v pasting in the DOS prompt). One of the promises made was that Windows 10 would eliminate the need for reinstalls when a new OS version came out.

    Microsoft is promising continuous, ever-evolving upgrades to the operating system so people won’t have to erase the hard drive and start over, like all current users of Windows 7 and 8 are going to have to do when 10 comes out next year.

This might not sit well with IT, because they don’t like disruption. Microsoft may push out significant updates the way it does bug fixes on Patch Tuesday, but IT might not want them immediately, or will want to test the updates first.

    The real question on my mind is whether Windows 10 will finally address a problem that has plagued pretty much every Windows OS since at least 95: the decay of the system over time. As you add and remove apps, as Windows writes more and more temporary and junk files, over time, a system just slows down.

  3. Tomi Engdahl says:

    Ambitious Windows 10 Is an Attack Disguised as a Retreat
    http://www.wired.com/2014/10/windows-10-attack/

    From the early looks of Windows 10, it’s a long-overdue concession to the fact that Windows users prefer the way things used to be.

    A video tour hosted by Windows VP Joe Belfiore is most notable for what Microsoft’s new OS is missing, as well as what’s returned from the pre-Windows 8 days. Rather than carrying on Windows 8’s strange hybrid of animated tiles and Start Menu-less desktop, Windows 10 has an interface more akin to that of Windows 7. Those colorful Metro tiles haven’t disappeared completely—they’ve been moved to the Start Menu—but they’re less in-your-face and mission-critical than they were in Windows 8.

    Because of that, consumers and businesses are likely to be ecstatic. Neither of them liked Windows 8 much. The numbers don’t lie. According to Net Applications data, about half of the computers in the world right now are running Windows 7. In second place, with about a 24 percent install base, is the never-say-die Windows XP.

  4. Tomi Engdahl says:

    Intel Leads Non-iPad Tablet Processor Market
    http://www.eetimes.com/document.asp?doc_id=1324161&

    Intel, which has been striving to get processor design wins in mobile devices, has made progress in the tablet computer market, rising to No. 2 among suppliers behind Apple in the second quarter, according to the market analysis company Strategy Analytics.

    The global market for tablet computer applications processors grew by 23% from a year earlier to reach $945 million, the firm said. For comparison, the smartphone application processor market grew 22% to reach $5.2 billion.

    Apple maintained its leading position and market share at 26%, followed by Intel with 19% and Qualcomm with 17%. Behind them came MediaTek and Samsung. This gives Intel the leading position in non-iPad tablet computers. Strategy Analytics said that this position is a difficult one to maintain, and that six companies have held it at various times.

    “The non-Apple tablet AP market leadership position continues to change hands and during Q2 2014 it was Intel’s turn,”

  5. Tomi Engdahl says:

    The magic storage formula for successful VDI? Just add SSDs
    They’re cheap, they’re plentiful… why not?
    http://www.theregister.co.uk/2014/10/02/storage_ssds/

    Every pure solid-state disk (SSD) and hybrid storage vendor on earth would like you to know how brilliant it is at handling virtual desktop infrastructure (VDI) workloads.

    VDI may be a niche, but it is a miserably difficult niche, in which storage plays a huge role. There is a lot more to making VDI work well than just throwing some SSDs at it.

    Hybrid array vendors would have you believe that the secret is in their patented algorithm for determining what data should be on the SSD and what should be moved to traditional magnetic disk.

    Pure SSD players want you to know all about their data efficiency technology, and how deduplication plus compression equals super powers.

    VDI is a small number of desktops and applications copied multiple times to serve identical copies to multiple users. As you can imagine, the result is a lot of redundant, identical data.

    You can run your VDI in such a way that each child copy of the parent maintains only a copy of the changes that are different between the two. You can run a VDI in which each user has a completely dedicated environment.

    Consider for a moment the humble hybrid hard drive. You can buy a 4TiB traditional magnetic hard drive with a seemingly paltry 8GiB of usable SSD cache built in.

    Seagate did research into the usage patterns of the average desktop user. It discovered that over the course of five days a total of 19.48GiB of data was read from the user’s disk, with 9.59GiB of that data being unique.

    In fact, Seagate found that if you could cache just 2.11GiB of data you would have cached 95 per cent of the unique data read by the average corporate desktop user across five days.

    Now, put the ideas of VDI cloning, deduplication, compression and SSD caching together and roll them around in your mind for a bit.

    You might realise that while an individual VDI instance might consist of a 20GiB operating system and a series of application disks amounting to 50GiB in total, you don’t need to move 70GiB per user to the SSD tier to make VDI fast.
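To make the cloning-plus-deduplication idea concrete, here is a hedged sketch (my own illustration, not any vendor's patented algorithm) of block-level deduplication by content hash, the mechanism that lets many identical VDI images collapse into one cached copy:

```python
import hashlib

def dedupe(blocks):
    """Store each unique block once, keyed by its SHA-256 content hash;
    each logical block becomes just a reference into the store."""
    store = {}   # content hash -> block data, stored exactly once
    refs = []    # logical layout: one hash reference per logical block
    for block in blocks:
        digest = hashlib.sha256(block).hexdigest()
        store.setdefault(digest, block)  # keep first copy, drop duplicates
        refs.append(digest)
    return store, refs

# Ten VDI clones of the same 4-block "golden image": 40 logical blocks...
golden_image = [b"boot", b"os", b"apps", b"profile"]
logical_blocks = golden_image * 10
store, refs = dedupe(logical_blocks)
# ...but only 4 unique blocks actually need to sit in the fast SSD/DRAM tier.
```

This is why 70GiB per user never needs to reach the SSD tier: the per-user data is overwhelmingly references to blocks that are already cached.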

  6. Tomi Engdahl says:

    Laying the Groundwork For Data-Driven Science
    http://news.slashdot.org/story/14/10/01/2139234/laying-the-groundwork-for-data-driven-science

    The ability to collect and analyze massive amounts of data is transforming science, industry and everyday life. But what we’ve seen so far is likely just the tip of the iceberg.

    Laying the groundwork for data-driven science
    http://www.nsf.gov/news/news_summ.jsp?cntn_id=132880&org=NSF&from=news

    NSF announces $31 million in awards to develop tools, cyberinfrastructure and best practices for data science

    One of the National Science Foundation’s (NSF) priority goals is to improve the nation’s capacity in data science by investing in the development of infrastructure, building multi-institutional partnerships to increase the number of U.S. data scientists and augmenting the usefulness and ease of using data.

    As part of that effort, NSF today announced $31 million in new funding to support 17 innovative projects under the Data Infrastructure Building Blocks (DIBBs) program.

    “Developed through extensive community input and vetting, NSF has an ambitious vision and strategy for advancing scientific discovery through data,” said Irene Qualters, division director for Advanced Cyberinfrastructure at NSF. “This vision requires a collaborative national data infrastructure that is aligned to research priorities and that is efficient, highly interoperable and anticipates emerging data policies.”

    “Our innovative architecture integrates key features of open source cloud computing software with supercomputing technology,” Fox said. “And our outreach involves ‘data analytics as a service’ with training and curricula set up in a Massive Open Online Course or MOOC.”

  7. Tomi Engdahl says:

    Is this why Microsoft named it Windows 10?
    http://www.cnet.com/news/is-this-why-microsoft-named-it-windows-10/

Seemingly everyone on the Net has a theory as to why Microsoft skipped the name “Windows 9” and jumped to 10. Here’s one explanation that actually makes sense.

    There are quite a few theories floating around out there as to why Microsoft decided to name the latest version of its flagship operating system Windows 10, skipping over Windows 9. On Tuesday, the company unveiled the name and showed off a brief demo of the OS at a press event in San Francisco. The leap from Windows 8 to Windows 10 easily stole the spotlight from any visual design and developmental changes Microsoft has baked in to the product.

    So what’s the deal? On the surface, it appears to be smart marketing. The Windows 8 brand has been mired in controversy for the last two years

    Other theories are that there are 10 major consumer releases of Windows, making this a celebration of that progress, while some feel it should have been called Windows 11 in that case.

    Yet no explanation seems to come close to matching that of a self-described Microsoft developer who goes by the name cranbourne on the social news site Reddit. The user points the finger at Microsoft’s almost 20-year-old releases that helped make the software maker a household name during the rise of the PC:

    Microsoft dev here, the internal rumours are that early testing revealed just how many third party products that had code of the form

if (version.StartsWith("Windows 9")) { /* 95 and 98 */ } else {

    and that this was the pragmatic solution to avoid that.

    “Having worked on the Windows compatibility team before, I have no difficulty believing this,” wrote user richkzad in response. There are in fact examples of this on publicly available code repositories.
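The hazard is easy to reproduce. This sketch transplants the quoted prefix check into Python (my own illustration, not actual third-party code) to show how a release named "Windows 9" would be misdetected as 95/98:

```python
def is_windows_9x(version):
    # The buggy pattern from the quoted C# snippet: intended to detect
    # Windows 95 and 98, but "Windows 9" matches any name starting that way.
    return version.startswith("Windows 9")

legacy = [is_windows_9x(v) for v in ("Windows 95", "Windows 98")]
collision = is_windows_9x("Windows 9")   # a new "Windows 9" would match too
safe = is_windows_9x("Windows 10")       # "Windows 10" dodges the prefix
```

A hypothetical "Windows 9" trips the check just like 95 and 98 do, while "Windows 10" does not, which is exactly the pragmatic escape the Reddit comment describes.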

  8. Tomi Engdahl says:

    DRAM! Speedy software upstart PernixData’s caching up fast
    Caching speed upped by several orders of magnitude
    http://www.theregister.co.uk/2014/10/02/pernixdata_rams_caching_up_fast/

    VMware IO cacher PernixData has upped the quick access ante by caching virtual machine disk IO in DRAM — much faster for data access than flash.

    The startup’s FVP software runs in the ESXi hypervisor, and it’s also added user-defined fault domains and adaptive network compression to its clustering technology.

    FVP accelerates any storage IO: file, block or object, and requires no VM or app alteration. By adding DRAM caching, it speeds up guest VM data access by several orders of magnitude.

Data access by memory lookup is on the nanosecond time scale, while flash access is in the 15-100 microsecond ballpark; disk is orders of magnitude slower again, at 4-7 milliseconds.
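Putting those quoted latency figures side by side (taking midpoints of the ranges, and ~100 ns for a memory lookup, are my assumptions) shows where the "several orders of magnitude" comes from:

```python
# Latencies from the article, converted to nanoseconds (midpoints assumed).
dram_ns = 100                       # memory lookup: nanosecond scale
flash_ns = (15 + 100) / 2 * 1_000   # 15-100 microseconds -> ~57,500 ns
disk_ns = (4 + 7) / 2 * 1_000_000   # 4-7 milliseconds -> ~5,500,000 ns

flash_vs_dram = flash_ns / dram_ns  # flash ~575x slower than a DRAM lookup
disk_vs_flash = disk_ns / flash_ns  # disk ~96x slower than flash
disk_vs_dram = disk_ns / dram_ns    # disk ~55,000x slower than DRAM
```

So a hit in a DRAM cache skips roughly three orders of magnitude versus flash and four to five versus spinning disk, which is the whole pitch for caching VM disk IO in memory.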

  9. Tomi Engdahl says:

    HP, AppliedMicro and TI Bring New ARM Servers to Retail
    by Stephen Barrett on September 30, 2014 9:19 AM EST
    http://www.anandtech.com/show/8580/hp-appliedmicro-and-ti-bring-new-arm-servers-to-retail

Yesterday HP announced retail availability of two ARM based servers, the ProLiant m400 and m800. Each is offered as a server cartridge as part of the Moonshot System. A single 4.3U Moonshot chassis can hold 45 server cartridges.

    The m800 is focused on parallel compute and DSP, while the m400 is focused on compute, memory bandwidth, IO bandwidth and features the first 64-bit ARM processor to reach retail server availability.

    Starting with the m400, HP designed in a single AppliedMicro X-Gene SoC at 2.4 GHz. AppliedMicro has been discussing the X-Gene processor for several years now, and with this announcement becomes the first vendor to achieve retail availability of a 64-bit ARMv8 SoC other than Apple.

    Marquee features of the X-Gene SoC include 8 custom 64-bit ARM cores, which at quad-issue should be higher performance than A57, quad channel DDR3 memory, and integrated PCIe 3.0 and dual 10GbE interfaces.

    The m800 is a 32-bit ARM server containing four Texas Instruments KeyStone II 66AK2H SoCs at 1.0 GHz. Each KeyStone II SoC contains four A15 CPU cores alongside eight TI C66x DSP cores and single channel DDR3 memory, for a total of 16 CPU and 32 DSP cores. IO steps back to dual GbE and PCIe 2.0 interfaces.

    It is clear from the differences in these servers that m400 and m800 target different markets.

    Each server is available with Ubuntu and IBM Informix database preinstalled

  10. Tomi Engdahl says:

    Evernote announces a new web client, unveils enterprise productivity platform ‘Work Chat’
    http://venturebeat.com/2014/10/02/evernote-announces-a-redesigned-web-client/

    The connected note-taking app company Evernote today announced a completely redesigned Web client and an enhanced enterprise-focused productivity platform called “Work Chat,” complete with new artificial intelligence features, collaboration tools, and improved presentation features.

Previously, the web client had been thought of as a backup for Evernote’s native desktop and mobile apps. The new client is available as an opt-in feature today. “It’s built to eliminate distractions,” said Evernote CEO Phil Libin, adding that the old client, while still available, will eventually go away.

    Doing away with the ‘tyranny of the inbox’ and ‘hostile’ presentations

Evernote also announced a digital workspace for teams called “Work Chat.” The idea is to replace the dreaded email inbox — something quite a few companies are already trying to do. “The inbox made sense when it was actually a physical inbox — around the time of the Civil War,” Libin said. “We are trying to break the tyranny of the inbox, that long list of messages that you’re never going to get through.”

  11. Tomi Engdahl says:

    Desktop, schmesktop: Microsoft reveals next WINDOWS SERVER
    Run it in Azure today, sysadmins, and get ready for lots of hybrid hype
    http://www.theregister.co.uk/2014/10/03/desktop_schmesktop_microsoft_reveals_next_windows_server/

    Windows 10 has hogged the limelight this week, but Microsoft has also revealed a Technical Preview of Windows Server and its System Centre control freak.

    The releases aren’t unexpected: Windows Server’s last full version emerged in 2012 and while substantial updates have landed in the years since, Redmond’s increasing ardour for Azure means a Windows Server refresh with lots of extra hybrid cloud bits sounds like a fine idea.

    And that’s more or less what Microsoft is talking up in its posts about the preview, as the new OS is described as “the core of our cloud platform vision”.

    Microsoft says the following are the biggies it’s revealing so far:

    Rolling upgrades for Hyper-V clusters to the next version of Windows Server without downtime for your applications and workloads. This includes support for mixed versions as you transition your infrastructure.
    New components for our software-defined networking stack that enable greater flexibility and control, including a network controller role to manage virtual and physical networks.
    New synchronous storage replication that enhances availability for key applications and workloads plus storage Quality of Service to deliver minimum and maximum IOPS in environments with workloads with diverse storage requirements.
    Enhanced application compatibility with OpenGL and OpenCL support.
    New scenarios to reduce the risk profile of administrators with elevated rights, including time-based access with fine-grained privileges, and new application publishing capabilities.

    The Technical Preview is offered with a “whatever you do, don’t use this in production” caveat.

  12. Tomi Engdahl says:

    ARM Servers Want a Tune Up
    http://www.eetimes.com/document.asp?doc_id=1324172&

    Users and vendors of ARM-based servers say their biggest need is better support from middleware such as compilers for scripting languages. The systems aim to use an emerging class of relatively power-efficient ARM-based SoCs to grab a slice of a server market currently dominated by Intel’s muscular x86 processors.

    Multiple chip vendors showed ARM servers up and running a variety of Linux-based operating systems and applications at the annual ARM Tech Con here. One vendor showed its ARM SoCs beating Intel servers in a live test. However, users said code for ARM-based systems has plenty of room for optimizations that promise better performance.

    “You can get stuff compiled and running, but what you can’t do is get awesome code out of compiler chain — we have to work really hard on that,” said one researcher from Sandia National Labs testing Hewlett-Packard’s Proliant m400 system announced earlier this week that uses the 64-bit X-Gene 1 SoC from Applied Micro.

    “Enterprise users will be surprised how much code they can get up and running,” said the researcher who preferred not to be named. “We can hand tune the code for tens of percent faster performance, so we know there’s much more head room,” he said.

    Sandia also found the performance of the ARM systems scale better than x86 servers on their scientific applications

    Intel’s chips typically have much greater processor performance than the ARM SoCs, but the lack of memory bandwidth and I/O means the maximum performance of the Intel chips cannot be achieved or sustained.

  13. Tomi Engdahl says:

    Symantec offers BIGGER, FASTER NetBackup appliance
    Move into EMC’s market begins with refresh of 5000 Series
    http://www.theregister.co.uk/2014/10/03/symantec_offers_bigger_faster_netbackup_appliance/

    Symantec is refreshing its NetBackup 5000 series appliance line with a new model that operates faster and holds more data.

    The firm’s 5000 Series Purpose-Built Backup Appliance (PBBA) range has expandable storage and end-to-end (both client and target sides) deduplication for physical and virtual environments.

    There is a Backup Exec equivalent to the NetBackup appliance called the 3600 Appliance.

    Symantec said it has sold more than 10,000 NetBackup appliances, and revenues from these boxes grew 35 per cent in its most recent quarter.

  14. Tomi Engdahl says:

    Internet Explorer Implements HTTP/2 Support
    http://tech.slashdot.org/story/14/10/03/129220/internet-explorer-implements-http2-support

    As part of the Windows 10 Technical Preview, Internet Explorer will introduce HTTP 2 support, along with performance improvements to the Chakra JavaScript engine, and a top-level domains parsing algorithm based on publicsuffix.org. HTTP 2 is a new standard by the Internet Engineering Task Force. Unlike HTTP 1.1, the new standard communicates metadata in binary format to significantly reduce parsing complexity.
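The binary framing is what cuts the parsing complexity: every HTTP/2 frame begins with a fixed 9-octet header, so a parser reads a handful of fixed-width fields instead of scanning free-form text lines. A hedged sketch of decoding that header in Python, following the field layout from the HTTP/2 specification (the sample bytes below are invented for illustration):

```python
import struct

def parse_frame_header(data):
    """Decode the fixed 9-octet HTTP/2 frame header:
    24-bit payload length, 8-bit type, 8-bit flags,
    then 1 reserved bit + 31-bit stream identifier."""
    hi, lo, frame_type, flags, stream = struct.unpack(">BHBBI", data[:9])
    length = (hi << 16) | lo          # reassemble the 24-bit length field
    stream_id = stream & 0x7FFFFFFF   # mask off the reserved high bit
    return length, frame_type, flags, stream_id

# An invented HEADERS frame (type 0x1), 16-byte payload, flags 0x04, stream 1:
sample = bytes([0x00, 0x00, 0x10, 0x01, 0x04, 0x00, 0x00, 0x00, 0x01])
length, ftype, flags, stream_id = parse_frame_header(sample)
```

Contrast this with HTTP 1.1, where the same metadata arrives as variable-length ASCII lines that must be scanned, split and case-folded before a request can even be routed.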

  15. Tomi Engdahl says:

    Tivi interviews devops godfather Patrick Debois

    Sometimes even a company’s bonus system can kill a devops reform of software production right at the starting blocks. How does devops succeed, and what is essential to it?

    Tivi interviewed the godfather of the big software development trend devops, Belgian Patrick Debois. He coined the term devops in 2009 and organized the first Devopsdays event in Ghent, Belgium, where the 5-year anniversary event of devops will be held at the end of October.

    Debois emphasizes the cultural and philosophical aspects of devops.

    The devops concept is surrounded by quite a lot of hype. What do you think about that?

    To be honest, I no longer follow the devops keyword on Twitter, because it is so overloaded. The term devops has lost its charm as more and more people use it. But that does not mean that devops itself has lost its power or is no longer valid. The devops concept evolves. Cloud has dozens of definitions, but that does not detract from its usefulness. The same is happening to devops. People learn and progress towards something new.

    What are your own experiences of devops in companies?

    In Belgium, companies say that they need devops and that they need to be flexible, but unfortunately mostly in the sense that they need to automate and turn infrastructure into code. I do not often see companies changing their culture of cooperation. I would say that cloud infrastructure and continuous delivery get a share of 20 to culture’s 1 – that is, I mean a 20:1 ratio. Culture is the most difficult thing to change. Judging by the tweets, I suspect the situation is the same in other countries. You can see the same at conferences: they have a lot of talks on technical issues and only a little on culture.

    If culture plays only a small role, has devops failed?

    It depends on how you define failure. If you can push a button faster, if you need fewer people to do things, that is a good thing, but it is difficult for anyone to measure success.

    Devops, agile and ITIL

    Devopsia is said to agile development to succeed. Can it be seen as a response to problems of agile development?

    Agile itself, the idea has not been a barrier, but the fact that people do not ever talk to. The same applies to ITIL, help desk, all the heavy processes. ITIL toolbox there is nothing that would prevent people from being more flexible. Hindered by traditional implementations of the company. It is truly ironic that in fact the people themselves have pushed themselves out and restrict people from different groups, which do not need to talk to each other, because it is supposedly not important. Agile does not say anything like that.

    If you think about the business, how devops practice to help it?

    Even in continuous publication, many companies could not reach what you thought of the end users or exported back to their feedback, or come across, why the response times were bad. Many companies are realizing how important it is to have a good system side, to keep the service up and get it to work very quickly in the world, with all the surf on the web and decide in seconds, which is bad, what is good.

    Now, it is not enough that you have a good website. It must be efficient, it needs to stay afloat and provide good support. All of these things have value, and they make the difference between you and your competitors. All services are used to create the time wasted if people do not trust you or think you are insecure.

    Devopsin thanks, if I need to change, I can do it much faster, better and more stable, because I do more tests. I do not need to invite people to a little matter of war convened a meeting because they already sit together to discuss the issues.

    If you find bugs in continuous publication of the flow at an early stage, it is less expensive than if they reach beyond the publishing stream. In particular, if a bug gets to production, repair can be difficult and you may lose data. You could say that the continued publication itself accelerates the pace and minimize bugs. When I make small changes and do them often, to get better, and the impact of changes is smaller and faster than I can get feedback.

    The advantages, however, do not only get the word chime in about devops. We need collaboration, problems speaking and rapid feedback.

    I ask people to be very simple question: Can you draw me a map of all the components, which are included in production enviroment. Some people draw only web server and database, some say that we have web application server and also the build server. What’s more mature devops-thinking, they become, the more things they like production. In the end, the entire production together is the beginning of the tube to measure up to.

    I call this monitoring and mapping test. It really helps to ask the developers and maintainers to draw in the production of a mental picture of the formula. Always turns out that there is a server or a path to a database or Active Directory server, where only one person is aware of. For people begin to emerge using a shared vision.
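    The mapping test can be made concrete: collect each person's drawn component list and diff it against the real production inventory; anything nobody drew is a shared blind spot. A minimal sketch in Python (all component names are hypothetical examples):

```python
# Each person draws their mental map of the production environment
# (component names here are invented for illustration).
production = {"web server", "database", "app server", "build server",
              "active directory", "payment gateway"}

drawings = {
    "developer": {"web server", "database", "app server"},
    "ops":       {"web server", "database", "active directory"},
}

# Anything in production that nobody drew is a blind spot --
# the server or path that only one person (or nobody) knows about.
drawn = set().union(*drawings.values())
blind_spots = sorted(production - drawn)
print(blind_spots)  # ['build server', 'payment gateway']
```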

    Source: http://summa.talentum.fi/article/tv/96201

    Reply
  16. Tomi Engdahl says:

    This published hack could be the beginning of the end for USB
    http://linustechtips.com/main/topic/226485-this-published-hack-could-be-the-beginning-of-the-end-for-usb/

    USB has a huge security problem that could take years to fix
    http://www.theverge.com/2014/10/2/6896095/this-published-hack-could-be-the-beginning-of-the-end-for-usb

    In July, researchers Karsten Nohl and Jakob Lell announced that they’d found a critical security flaw they called BadUSB, allowing attackers to smuggle malware on the devices effectively undetected. Even worse, there didn’t seem to be a clear fix for the attack. Anyone who plugged in a USB stick was opening themselves up to the attack, and because the bad code was residing in USB firmware, it was hard to protect against it without completely redesigning the system. The only good news was that Nohl and Lell didn’t publish the code, so the industry had some time to prepare for a world without USB.

    As of this week, that’s no longer true. In a joint talk at DerbyCon, Adam Caudill and Brandon Wilson announced they had successfully reverse-engineered BadUSB, and they didn’t share Nohl and Lell’s concerns about publishing the code. The pair has published the code on GitHub, and demonstrated various uses for it, including an attack that takes over a user’s keyboard input and turns control over to the attacker.

    According to Caudill, the motive for the release was to put pressure on manufacturers. “If the only people who can do this are those with significant budgets, the manufacturers will never do anything about it,” he told Wired’s Andy Greenberg. “You have to prove to the world that it’s practical, that anyone can do it.”

    Reply
  17. Tomi Engdahl says:

    ARMv8 Goes Embedded with Applied Micro’s HeliX SoCs
    by Ganesh T S on October 3, 2014 10:00 AM EST
    http://www.anandtech.com/show/8588/armv8-goes-embedded-with-applied-micros-helix-socs

    We covered the news of the first shipment of 64-bit ARMv8 processors in the HP Moonshot product line earlier this week. At ARM TechCon 2014, Applied Micro (APM) had a very interesting update to their 64-bit ARM v8 product line. They launched two SoC families, HeliX 1 and HeliX 2. Both of them are based on the X-Gene ARMv8 cores developed for servers, but appropriately scaled down to fit in the 8 W – 42 W TDP scenarios for the embedded market. The HeliX 1 is fabricated in a 40 nm process, while the HeliX 2 uses a 28 nm process. The latter uses the second generation X-Gene ARMv8 core

    Applied Micro has traditionally been a PowerPC house.

    APM is hoping to get HeliX into the embedded market, with focus on communication and networking, imaging, storage and industrial computing verticals. They believe ARMv8 is the architecture of the future and had a number of companies (including Cisco, Netgear, Konica Minolta, Wind River and Canonical) voicing support for their strategy.

    The two SoC product lines launched by APM yesterday were the APM887208-H1 (based on HeliX 1) and the APM887104-H2 (based on HeliX 2).

    Reply
  18. Tomi Engdahl says:

    Bring your spade: The BIG DATA Gold Rush has begun
    Most forty-niners didn’t get rich – shovel-makers did
    http://www.theregister.co.uk/2014/10/03/big_data_gold_rush_begins/

    The forty-niners rushed to California in search of gold. Some even found it. But most stayed poor and the only people who got reliably rich were people who made and sold the picks and shovels the miners needed.

    Is that the way it is with big data? Will the only people to get rich be the suppliers of big data computer software, and computer and storage hardware?

    You can ask a simpler question: over 12 months, did the extra revenue generated by the big data systems exceed the total cost of buying, installing and operating the big data systems?

    But revenue is not profit. If it costs you $500,000 to generate $600,000 extra revenue from a big data system but a $400,000 shop generated $800,000 extra revenue then the big data investment would be sub-optimal. Better to build a second shop. Obvious, innit?

    This is another question for the accountants. The cost-justification for big data systems is based on generating extra profit from extra revenue (or saving costs, which helps the profit number).
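    The comparison above is simple arithmetic; a sketch using the article's illustrative figures (hypothetical numbers, not real accounts):

```python
def revenue_multiple(extra_revenue, cost):
    """Extra revenue generated per dollar invested over 12 months."""
    return extra_revenue / cost

# Illustrative figures from the example above:
big_data = revenue_multiple(600_000, 500_000)  # 1.2x
new_shop = revenue_multiple(800_000, 400_000)  # 2.0x
assert new_shop > big_data  # the second shop is the better investment
```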

    Reply
  19. Tomi Engdahl says:

    Report: Hewlett-Packard set to split PC, printer biz from enterprise wing
    Starship rejig. You let go, Picard. Pack your bags
    http://www.theregister.co.uk/2014/10/05/hewlett_packard_to_split_in_two_says_wsj/

    Hewlett-Packard is reportedly planning a big break up of its business, by splitting its PC and printer operations from the company’s corporate hardware and services divisions.

    Reply
  20. Tomi Engdahl says:

    IBM pulls the plug on Lotus 1-2-3
    http://www.theinquirer.net/inquirer/news/2373795/ibm-pulls-the-plug-on-lotus-1-2-3?utm_source=Outbrain&utm_medium=Cpc&utm_campaign=Inquirer%252BReferral&WT.mc_is=977=obinsource

    IBM HAS DISCONTINUED support for its pioneering productivity package Lotus 1-2-3.

    A notice on the IBM website confirms that there is to be no further support for the spreadsheet, database and diagram package that was first released to market 31 years ago.

    Reply
  21. Tomi Engdahl says:

    HP Returns to Breakup Plan It Shelved Three Years Ago
    http://recode.net/2014/10/05/hp-returns-to-breakup-plan-it-shelved-three-years-ago/

    Computing giant Hewlett-Packard is close to breaking itself into two companies after CEO Meg Whitman ran out of options to turn around a business that is saddled with declining operations.

    The decision to split itself in two, which could be announced as early as Monday, follows a months-long process to explore the sale of several business units, including its PC and enterprise services unit to no avail. The belief is that by splitting in two, both could pursue sales or acquisitions with a simpler balance sheet, which was an issue that sources said had scuttled a plan to buy EMC.

    Hewlett-Packard Plans to Break in Two
    Split Would Separate PC, Printer Operation From Corporate Hardware and Services Business
    http://online.wsj.com/news/article_email/hewlett-packard-plans-to-break-in-two-1412530028-lMyQjAxMTE0OTAzNTEwNjUzWj

    Reply
  22. Tomi Engdahl says:

    Adaptive Path and the Death Rattle of the Web 2.0 Era
    http://www.wired.com/2014/10/adaptive-path-end-of-the-web-2-0-era/

    You may not have heard of them, but a little company in San Francisco called Adaptive Path has had an outsized effect on the Web we know. It’s a consulting company that focuses on user experience–one of the first to make user experience its mission. Its team included an all-star list of thinkers, a who’s-who of the Web.

    So everyone who follows Web development was more than a little weirded out yesterday when Adaptive Path announced that it was selling itself to Capital One: a banking concern that’s made an empire out of consumer credit card debt. It all happened so quickly, and so slowly.

    Adaptive Path was an anchor tenant in the Web 2.0 land grab of the early aughts. Along with Six Apart, Odeo, 37 Signals and a few others, it drove a reimagining of the Web as a place where static pages would act instead more like dynamic applications.

    Adaptive Path gave a name to an emerging development trend. They named it Ajax

    Aside from web apps, Web 2.0’s most notable feature was probably hype. All the companies trying to sell you an idea were now also effectively operating industry trade publications. Web 2.0 was a hothouse of self-promotion and knives-out backstabbing.

    Those days are over. Six Apart (Movable Type, TypePad, LiveJournal) is now SAY media, and no longer makes any kind of blogging platform. Podcasting darling Odeo morphed into Twitter after a near death experience. A few months ago, 37 Signals (Basecamp, Campfire, and kind of Ruby on Rails) changed its name to Basecamp, to focus on its first and most successful product, dropping support for everything else. And now Adaptive Path, slayer of conference decks, destroyer of established ways of doing things, is a division of Visa or something.

    Reply
  23. Tomi Engdahl says:

    What’s Up With Ello, the Anti-Facebook Social Network?
    http://mashable.com/2014/09/25/whats-the-deal-with-ello/

    By now, you’ve probably heard something about Ello, the ad-free, invite-only, independent social network that has seemingly gone viral over the last week.

    The ad-free social network has quickly — and somewhat inexplicably — gained a reputation for being the “anti-Facebook.” Which is odd, because new users tend to boast on Facebook about having scored an invite to the service.

    While it’s not clear what’s behind the site’s sudden surge in popularity (it launched in March to little fanfare), several reports have linked the rise of Ello to the recent firestorm caused by Facebook’s so-called real name policy.

    Facebook’s policy, which requires all users to go by their legal name on the social network, has been in place for some time

    Reply
  24. Tomi Engdahl says:

    Get 20 Takes on ARM Tech Con
    http://www.eetimes.com/document.asp?doc_id=1324199&

    “Remember it began here today,” said Tom Bradicich, vice president of server engineering at Hewlett-Packard, pointing to the first 64-bit ARM server cartridge the company announced it is shipping.

    Jim Ang of Sandia National Labs made a brief appearance on stage. A Sandia group that tests all alternative server architectures shared some results of its work since March on the HP ProLiant m400, which uses the Applied Micro X-Gene SoC. Ang said that so far, “Our initial testing has shown good performance scaling.”

    Applied Micro set up side-by-side live demos of its current 40nm and next-generation 28nm X-Gene SoC against Intel Ivy Bridge and Haswell Xeon processors. It claims all its chips beat all of Intel’s in the number of requests per second, latency, and throughput.

    The next generation of ARM SoCs is clearly up and running on 28nm processes, sporting more cores.

    Carol Basset, a Moonshot product manager at HP, showed her baby on the exhibit floor: the X-Gene cartridge that packs 64 Gbytes of RAM and two 10 Gbit/s Ethernet ports. The chassis holds up to 48 of the cards along with two switch boards using Mellanox Ethernet switches.

    Separately, Applied showed a small cluster-in-a-box running OpenStack open source cloud software.

    At least two other 64-bit ARM server players are on their way. On the show floor, AMD showed a server using its Seattle SoC running Java, Hadoop, and OpenStack on versions of Red Hat and SUSE Linux.

    Meanwhile, Cavium is gearing up its 28nm ARM server SoC, delivering even more custom cores than Applied’s X-Gene. But the ThunderX chip has not even taped out yet

    Clearly both the silicon and software for ARM servers are making progress. But real market impact still seems to be at least a year away.

    Dell showed a prototype server using up to two Applied X-Gene SoCs. The system packs up to 48 TBytes hard disk storage and runs Hadoop.

    Samsung grabbed headlines at the event showing a working mobile applications processor — or at least a part of one — made with its 14nm FinFET process decoding high def video on OLED and LCD screens.

    Sources at the show said TSMC and Samsung are in a neck-and-neck race to see who will be first to deliver such a process.

    Globalfoundries aims to “copy exact” the Samsung 14nm process in its Saratoga, NY fab. Separately, ARM announced a tool to help designers more easily build power grids on FinFETs.

    STMicroelectronics won a best-of-show award for its F7 chip, one of several new SoCs using ARM’s new Cortex-M7 core

    ARM showed 37 development boards already using its free Mbed OS

    Details of ARM’s Mbed code are a work in progress. For example, it uses a proprietary implementation of Transport Layer Security from a partner to be named soon. It will implement at least the general concepts of the kind of hardware-backed security used in its TrustZone

    Reply
  26. Tomi Engdahl says:

    Vanished blog posts? Enterprise gaps? Welcome to Windows 10
    The riddle that will satisfy Win 8 haters, at least
    http://www.theregister.co.uk/2014/10/03/windows_10_preview/

    Microsoft has posted early builds of Windows 10, showing its new approach to combining tablet and desktop support in one all-encompassing operating system.

    According to the company’s marketing script, this preview is for enterprises to try. A consumer preview will follow early next year, and a developer preview in April, to coincide with the Build conference in San Francisco.

    Reply
  27. Tomi Engdahl says:

    Why women leave tech: It’s the culture, not because ‘math is hard’
    http://fortune.com/2014/10/02/women-leave-tech-culture/

    Stories from 716 women who left tech show that the industry’s culture is the primary culprit, not any issues related to science education.

    Eighty-five women cited maternity leave policy as a major factor in their decision to leave their tech jobs. That’s over 10% of the women I surveyed.

    Many women said that it wasn’t motherhood alone that did in their careers. Rather, it was the lack of flexible work arrangements, the unsupportive work environment, or a salary that was inadequate to pay for childcare.

    One-hundred-ninety-two women cited discomfort working in environments that felt overtly or implicitly discriminatory as a primary factor in their decision to leave tech. That’s just over a quarter of the women surveyed. Several of them mention discrimination related to their age, race, or sexuality in addition to gender and motherhood.

    “Doesn’t matter how good you are, or even if your colleagues respect you. Eventually you get tired of being the odd duck.”

    The pipeline isn’t the problem

    It is popular to characterize the gender gap in tech in terms of a pipeline problem: not enough girls studying math and science. However, there are several indications that this may no longer be the case, at least not to the extent that it once was. High school girls and boys participate about equally in STEM electives.

    Almost everyone I spoke with said that they had enjoyed the work itself. Most mothers added that they would have happily returned to their jobs a few months after giving birth, but their companies didn’t offer maternity leave and they needed to quit in order to have their kids.

    Some women felt that their work environments were discriminatory, but most reported something milder: the simple discomfort of not fitting in in an otherwise homogenous setting.

    Women are leaving tech because they’re unhappy with the work environment, not because they have lost interest in the work.

    As cultural issues go, this is an incredibly expensive problem.

    Reply
  28. Tomi Engdahl says:

    Microsoft’s ‘RoomAlive’ transforms any room into a giant Xbox game
    Kinect brings your living room to life
    http://www.theverge.com/2014/10/5/6912979/microsoft-roomalive-research-projector-system

    When Microsoft first demonstrated its IllumiRoom research project at CES last year it generated a huge amount of attention ahead of a next-generation Xbox unveiling. A video showed off a projection system that was linked to the Xbox to extend games from a TV to nearby walls, and appeared to be more than just a concept demo. It turned out that IllumiRoom was “just research” after all, but Microsoft is back this year with IllumiRoom 2.0: RoomAlive.

    RoomAlive builds on the familiar concepts of IllumiRoom, but pushes things a lot further by extending an Xbox gaming environment to an entire living room. It’s a proof-of-concept demo, just like IllumiRoom, and it combines Kinect and projectors to create an augmented reality experience that is interactive inside a room. You can reach out and hit objects from a game, or interact with games through any surface of a room. RoomAlive tracks the position of a gamer’s head across all six Kinect sensors to render content appropriately.

    Reply
  29. Tomi Engdahl says:

    Why Military Personnel Make the Best IT Pros
    http://tech.slashdot.org/story/14/10/06/1533209/why-military-personnel-make-the-best-it-pros

    Every year, approximately 250,000 military personnel leave the service to return to civilian life. When the home front beckons, many will be looking to become IT professionals, a role that, according to the U.S. Bureau of Labor Statistics, is among the fastest growing jobs in the country. How their field skills will translate to the back office is something to ponder.

    That said, the nature of today’s military—always on the go, and heavily reliant on virtual solutions—may actually be the perfect training ground for IT. Consider that many war-fighters already are IT technicians: They need to be skilled in data management, mobile solutions, security, the ability to fix problems as they arise onsite, and more.

    Why Military Personnel Make Ideal IT Pros
    http://news.dice.com/2014/10/06/why-military-personnel-make-ideal-it-pros/

    Managing Wireless Technology and Connected Devices

    Like everyone, government agencies have come to heavily rely on mobile technology. This reliance has brought on a slew of issues, however, from maintaining security to managing multiple devices.

    Personnel used to working with everything from SATCOM terminals to iPads are ideally suited for handling these issues. Many have successfully managed wireless endpoints, networks, and security while in the field.

    Civilian IT professionals face the same challenges. The number of mobile devices that appear on a network is rising on a daily basis, and will likely get far more extreme in the near future. As such, the need to monitor security, bandwidth, passwords, and network access has never been greater. Military personnel can use their field experience to tackle each of these.

    Automating Network Management

    Field personnel obviously have a lot to deal with beyond IT (there’s a little thing called “defense,” for example). Very often, though, they find themselves having to stop and manually manage network problems as they arise.

    As tedious as this is, it turns out that this experience can prove valuable for personnel looking to transfer to civilian IT, even as network automation becomes the norm. Knowledge of which processes typically require the most attention can help IT managers configure their automated systems to prioritize project monitoring. In short, field experience can help make the automated network that much more efficient, and enhance optimization to focus on things that are most essential.

    Reply
  30. Tomi Engdahl says:

    Hewlett-Packard Set to Break Up 75-Year-Old Company
    H-P Would Separate PC, Printer Business from Corporate Hardware, Services; More Layoffs Ahead
    http://online.wsj.com/news/article_email/hewlett-packard-to-split-into-two-companies-1412592132-lMyQjAxMTE0MjA4NjcwMDY2Wj

    Hewlett-Packard Co. confirmed Monday that it planned to split the company into two parts, a move executives said was driven by the need to stay nimble to keep up with rapidly changing technology.

    H-P, which Bill Hewlett and Dave Packard famously started in their Palo Alto, Calif., garage in 1939, now will be carved up: One part, HP Inc., will consist of the company’s personal-computer and printer businesses. The other, Hewlett-Packard Enterprise, will sell computer servers, data-storage gear, software, consulting operations and other services for corporate-technology departments.

    Each of the companies will be about the same size, with more than $50 billion in annual revenue.

    H-P Chief Executive Meg Whitman said Monday that the two companies will be on very different courses. The new HP Inc. will be milked for cash, which will be earmarked for returns to stockholders. The enterprise company, which Ms. Whitman will run, will be operated for growth through a faster pace of investment in new products and through acquisitions, executives said on a conference call with analysts.

    Separately, H-P also said Monday that it had boosted the number of expected layoffs by 5,000, to 55,000, after identifying “incremental opportunities for reductions.”

    HP To Separate Into Two New Industry-Leading Public Companies
    http://www8.hp.com/us/en/hp-news/press-release.html?id=1809455#.VDOl_xZsUik

    Hewlett-Packard Enterprise will define the next generation of technology infrastructure, software and services for the New Style of IT

    Reply
  31. Tomi Engdahl says:

    Nvidia’s Maxwell graphics chips for laptops can beat some powerful gaming desktops
    http://venturebeat.com/2014/10/07/nvidias-maxwell-graphics-chips-for-laptops-can-beat-some-powerful-gaming-desktops/

    Computer gaming is about to take a big leap forward. Nvidia is launching its next-generation laptop graphics chips today, which the company claims deliver about 75 percent of the performance of the fastest desktop graphics processing units (GPUs).

    The mobile graphics chips use the new Maxwell architecture that also serves as the foundation for Nvidia’s new desktop GPUs. The Maxwell-based mobile GPUs will be sold under the Nvidia GeForce GTX 980M and 970M brand names. Maxwell chips can deliver twice the performance per watt of power consumed compared to the previous generation of chips. And if you’ve got a desktop gaming machine that is just a little bit old, chances are that the newest gaming laptops will beat it.

    Kaustubh Sanghani, the general manager of the notebook GPU business, told GamesBeat that the gaming laptop business has grown five times in the past three years. And the No. 1 thing that customers in that market want is “desktop-class performance,” he said.

    “We have nearly closed the gap,” Sanghani said.

    Reply
  32. Tomi Engdahl says:

    Symantec Said to Explore Split Into Security, Storage Cos
    http://www.bloomberg.com/news/2014-10-07/symantec-said-to-explore-split-into-security-storage-cos.html?alcmpid=breakingnews

    Symantec Corp. (SYMC) is exploring a breakup, according to people with knowledge of the matter, joining other large technology companies that are trying to make their businesses more focused and nimble.

    The Mountain View, California-based software company is in advanced talks to split up its business into two entities, with one that sells security programs and another that does data storage, said the people, who asked not to be identified because the conversations are private. An announcement may be a few weeks away, one of the people said.

    Reply
  33. Tomi Engdahl says:

    New Google+ Head David Besbris: We’re Here for the Long Haul (Q&A)
    http://recode.net/2014/10/07/new-google-head-david-besbris-were-here-for-the-long-haul-qa/

    Google+ isn’t dying anytime soon, says Google’s new head of social media David Besbris.

    Of course, you’d expect the person in charge of the social network to say so, despite the fact that it’s hard to ignore the chatter about the imminent demise of Google’s social media efforts following the departure of longtime Google+ head Vic Gundotra who unexpectedly left in April.

    Besbris assured us Google has every intention to continue investment in the division.

    Reply
  34. Tomi Engdahl says:

    GitHub Partners With DigitalOcean, Unreal Engine, Others To Give Students Free Access To Developer Tools
    http://techcrunch.com/2014/10/07/github-partners-with-digital-ocean-unreal-engine-and-others-to-give-students-free-access-to-developer-tools/

    Back in the days of shrink-wrapped software, students would often get huge discounts on expensive software packages like Adobe’s Creative Suite or Microsoft’s developer tools. But because almost every developer startup has now moved to a SaaS model, it’s become a bit harder to find student discounts.

    To help students start new software projects without breaking the bank, GitHub, Bitnami, Crowdflower, DigitalOcean, DNSimple, HackHands, Namecheap, Orchestrate, Screenhero, SendGrid, Stripe, Travis CI and Epic Games’ Unreal Engine are launching the GitHub Student Developer Pack, a new program to give students free access to their tools.

    A GitHub spokesperson told me that the company already has about 100,000 students on its free plan, but with the Developer Pack, they can now get free access to a GitHub micro account (usually $7/month) with up to five private repositories for as long as they are students.

    Reply
  35. Tomi Engdahl says:

    No tiles, no NAP – next Windows for data centre looks promising
    Now, about that consumer AV ‘protection’…
    http://www.theregister.co.uk/2014/10/08/windows_server_10_first_look/

    All eyes are on Windows 10, but Microsoft has also slipped out a technical preview for the parallel, but as yet unnamed, Windows Server.

    Whereas Windows 8 has been mired in controversy from the outset, the parallel Server 2012 and Server 2012 R2 releases (based on the same core code) have been well received.

    At first glance, this is not as big a release as Server 2012, which brought Microsoft into contention in virtualisation as well as introducing a ton of new features in storage, networking and remote desktop services. Usual health warnings apply though: this is a preview and there will be more to come.

    Reply
  36. Tomi Engdahl says:

    Gartner: Business is digitalizing rapidly

    The Internet of Things, virtualization, and the increasing digitalization of business in general were in the limelight at research firm Gartner's Symposium/ITxpo this year. The research house expects the pace of business digitalization only to accelerate.

    According to Gartner's Peter Sondergaard, 3D printers alone have become a more than billion-dollar market. Similarly, ten percent of new cars are connected to the Internet.

    The pace is accelerating: these figures will double by 2015.

    "Companies will spend more than 40 billion dollars this year on the Internet of Things and on digitizing their business. Every piece of equipment of any value will be fitted with sensors and connected to the network," Sondergaard estimates.

    Keeping the risks under control

    The challenge for CIOs is to assess the risks related to the rapid development of the Internet of Things (IoT).

    "In digital business, risks must be assessed in a different way than before," Gartner's Daryl Plummer notes.

    According to him, new business risks must be proportioned to the ability of the company and its IT management to deal with problem situations.

    "Recognizing risks is acceptable, but neglecting them is a life-and-death issue for many companies," Plummer tells Network World.

    Gartner estimates that currently about 38 percent of IT investments are made outside traditional IT procurement. By 2017 the share will be more than half.

    Source: http://www.tivi.fi/cio/gartner+liiketoiminta+digitalisoituu+nopeasti/a1018118

    Reply
  37. Tomi Engdahl says:

    Gartner: Make way for digital business, risks or die?
    http://www.networkworld.com/article/2691627/careers/gartner-make-way-for-digital-business-risks-or-die.html

    Enterprises will spend over $40 billion designing, implementing and operating the Internet of Things just this year

    While the notion of IT changing is nothing new, Gartner says the shift towards everything virtual – what it calls the Digital Business – is more intense than in years past.

    For example, in his opening keynote, Gartner’s Peter Sondergaard, senior vice president and global head of research, said that since 2013, 650 million new physical objects have come online; 3D printers became a billion-dollar market; 10% of cars became connected; and the number of Chief Data Officer and Chief Digital Officer jobs has doubled.

    By 2015, all of these items will double again, he said.

    “This year enterprises will spend over $40 billion designing, implementing and operating the Internet of Things,” Sondergaard said. “Every piece of equipment, anything of value, will have embedded sensors. This means leading asset-intensive enterprises will have over half a million IP addressable objects in 2020.”

    Reply
  38. beadboard wallpaper says:

    Simply want to say your article is as surprising. The clearness in your post is simply spectacular and
    I could assume you’re an expert on this subject.
    Well, with your permission let me grab your feed to keep updated with forthcoming posts.
    Thanks a million and please carry on the enjoyable
    work.

    Reply
  39. Leanna says:

    Excellent way of describing, and a fastidious post to get information concerning my presentation subject,
    which I am going to present in college.

    Reply
  40. Tomi Engdahl says:

    Linux systemd dev says open source is ‘SICK’, kernel community ‘awful’
    Reckons newbies should beware of hostile straight white males
    http://www.theregister.co.uk/2014/10/06/poettering_says_linux_kernel_community_is_hostil/

    Lennart Poettering, creator of the systemd system management software for Linux, says the open-source world is “quite a sick place to be in.”

    He also said the Linux development community is “awful” – and he pins the blame for that on Linux supremo Linus Torvalds.

    Poettering said Torvalds’ confrontational and often foul-mouthed management style is “not an efficient way to run a community” and that it sets an example that is followed by other kernel developers, creating a hostile environment for newcomers.

    “The Linux community is dominated by western, white, straight, males in their 30s and 40s these days,” Poettering wrote. “I perfectly fit in that pattern, and the rubbish they pour over me is awful. I can only imagine that it is much worse for members of minorities, or people from different cultural backgrounds, in particular ones where losing face is a major issue.”

    Reply
  41. Tomi Engdahl says:

    Analytics pusher Interana: Why tweak your seeks when you can scan your DRAM?
    Intros high-speed in-memory tool
    http://www.theregister.co.uk/2014/10/08/interanas_blink_of_an_eye_analytics_speed/

    Startup Interana has an analytics tool that processes data in scans rather than seeks, claiming it’s designed to be great at analysing time-based streams of event data.

    The software is a proprietary database that organises data by time and user to allow for single-pass queries with no intermediate sorts.

    It runs on clustered x86 servers and uses DRAM for working set data, with intermediate flash and a disk-based backing store from which data is streamed. This means sequential rather than slower, seek-based random I/O.

    The data consists of event data – billions of sequences of events – from sources such as clickstreams, call detail records, transactions and sensor data, and allows users to ask questions and see the answers in seconds.
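    The single-pass, sort-free query model described above can be sketched in a few lines. The event layout and query below are hypothetical illustrations, not Interana's actual format or API: because events are already stored in (user, time) order, a conversion-style question needs only one sequential scan with no intermediate sort.

```python
# Hypothetical event records, already stored in (user, time) order --
# a layout like this lets queries stream sequentially instead of seeking.
events = [
    ("alice", 1, "page_view"),
    ("alice", 2, "add_to_cart"),
    ("alice", 3, "purchase"),
    ("bob",   1, "page_view"),
    ("bob",   5, "page_view"),
]

def count_conversions(events, first, then):
    """Count users who performed `first` and later `then`,
    in a single sequential pass over time-ordered events."""
    seen_first = set()
    converted = set()
    for user, _t, action in events:  # one linear scan, no sort
        if action == first:
            seen_first.add(user)
        elif action == then and user in seen_first:
            converted.add(user)
    return len(converted)

print(count_conversions(events, "page_view", "purchase"))  # → 1
```

    The point of the sketch is the access pattern: everything the query needs arrives in storage order, which maps to sequential I/O when the working set spills from DRAM to flash or disk.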

    Reply
  42. Tomi Engdahl says:

    An EMC-HP Borg cube will totally ANNIHILATE its storage worlds
    Why overlapping kit from a merger equals a disaster in waiting
    http://www.theregister.co.uk/2014/10/08/a_merged_emc_and_hp_would_create_a_storage_nightmare/

    Suppose EMC and HP merged: the result would be an outrageous nightmare of overlapping storage products. Managers would run screaming into the darkness, and bean counters would spend millions converging products. It’s a no-go area, really.

    Companies merge to make a stronger entity, not to screw up good product lines with a competing morass of misaligned offerings. Were EMC and HP to merge or one to take over the other, the result for storage would be a disaster.

    Reply
  43. Tomi Engdahl says:

    The Empire Reboots
    http://www.vanityfair.com/business/2014/11/satya-nadella-bill-gates-steve-ballmer-microsoft

    Over the last decade, as the biggest force in tech history hurtled toward irrelevance (albeit lucratively), a few blamed Microsoft’s woes on founder Bill Gates, while most pointed to his successor as C.E.O., Steve Ballmer. Bethany McLean charts the breakdown of their relationship, the growing dissatisfaction with Ballmer, and the challenges and opportunities facing its third C.E.O., Satya Nadella, as Gates returns to the fold.

    Reply
  44. Tomi Engdahl says:

    SUSE, MariaDB and IBM team up to tame Big Data
    http://www.linuxjournal.com/content/suse-mariadb-and-ibm-team-tame-big-data

    SUSE and MariaDB (the company formerly known as SkySQL!) officially teamed up today, joining forces with IBM Power Systems, in a partnership that promises to expand the Linux application ecosystem. According to sources at SUSE, customers will now be able to run a wider variety of applications on Power8, increasing both flexibility and choice while working within existing IT infrastructure.

    Ultimately, MariaDB Enterprise will be optimized for SUSE Linux Enterprise Server 12 on IBM POWER8-based servers.

    The joint offering provides customers with several key technical benefits, including:

    Choice – through easy porting of software developed on Linux for x86 to Linux for Power on little-endian systems, SUSE Linux Enterprise Server 12 enables users to run a larger variety of applications on POWER8, including those developed in compiled or scripted languages, such as C++, Ruby, PHP or Java.
    Hypervisor support – IBM PowerKVM and SUSE Linux Enterprise Server 12, with KVM, provide virtualization support on POWER8 processors for Linux workloads.
    Scalability – SUSE Linux Enterprise Server 12 takes full advantage of POWER8’s eight threads per core, allowing a higher number of tasks to run simultaneously.
    Performance – larger L3 memory cache structures provide a greater potential for computing power at higher density, perfect for data-intensive applications.
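    The little-endian point in the list above can be illustrated with a short sketch (Python used purely for illustration; the value is made up): data written with an explicit little-endian layout has the same byte order on x86 and on little-endian Power, which is what removes the byte-swapping work from porting.

```python
import struct
import sys

# A 32-bit value packed with an explicit little-endian format ("<I")
# has the same byte layout on x86 and on little-endian POWER8 (ppc64le),
# so files and wire formats written this way port without byte swapping.
raw = struct.pack("<I", 0x01020304)

print(raw.hex())                                  # → 04030201
print(struct.unpack("<I", raw)[0] == 0x01020304)  # → True
print(sys.byteorder)  # 'little' on both x86 and ppc64le hosts
```

    Code that instead relies on native byte order (`"=I"` or raw pointer casts in C) is exactly the code that used to need auditing when moving from x86 to big-endian Power.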

    This partnership can potentially have immediate impact in the Big Data arena.

    Reply
  45. Tomi Engdahl says:

    Shouty investor Elliott makes PUBLIC CALL for VMware sale
    Billion dollar shareholder ups the EMC ante
    http://www.theregister.co.uk/2014/10/08/elliott_emphatically_ups_the_emc_ante/

    Activist investor Elliott Management, the holder of a billion dollars’ worth of Hopkinton shares, has publicly called for a VMware sale in a letter to EMC’s CEO and board.

    Its release has been nicely timed to closely follow on from HP’s decision to split itself into a printer and PC company on the one hand (HP Inc) and a servers/networking/storage/services/cloud enterprise company on the other (HPE).

    The two key points in the letter are these:

    EMC’s current structure – “the Federation” – obscures value at EMC.
    EMC should pursue pathways to recognise this value, including a separation of VMware from Core EMC and/or various M&A opportunities

    The letter goes on to say: “The reality today is that the Federation structure, which may have served EMC well years ago, no longer does.”

    Elliott points out: “Though the Federation strategy for EMC and VMware does not work and cannot be continued, the two companies can easily continue their partnership after a separation.”

    Elliott sees two alternatives: spin off VMware from EMC or have EMC merge with or be acquired by a third-party (M&A).

    Reply
  46. Tomi Engdahl says:

    Systemd Adding Its Own Console To Linux Systems
    http://linux.slashdot.org/story/14/10/08/134208/systemd-adding-its-own-console-to-linux-systems

    The next version of systemd is poised to introduce an experimental “systemd-consoled” that serves as a user-space console daemon. The consoled furthers the Linux developers’ goal of eventually deprecating the VT subsystem found within the Linux kernel in favor of a user-space driven terminal.

    Systemd 217 Will Introduce Its New “Consoled” User Console Daemon
    http://www.phoronix.com/scan.php?page=news_item&px=MTgwNzQ

    Reply
  47. Tomi Engdahl says:

    Free Windows 8 to Windows 10 Upgrade Useless, PC Makers Say
    http://news.softpedia.com/news/Free-Windows-8-to-Windows-10-Upgrade-Useless-PC-Makers-Say-461162.shtml

    Windows 10 Preview is already here and all eyes are on the feature lineup that’s going to be improved with every new testing build release, but also on pricing details that could be unveiled by Microsoft at a later time.
    Even though Microsoft hasn’t talked about the price of Windows 10 so far, there have been some voices indicating that Redmond is considering a free upgrade for Windows 8 users in order to boost adoption of its new operating system and to move more people off its modern OS, which was considered more or less a flop.

    Now it appears that PC makers aren’t quite happy with such a promo because it could significantly impact new computer sales, and some have already expressed their dissatisfaction in private circles.

    A report by Digitimes reveals that notebook component makers “take a pessimistic attitude” about Windows 10 because they expect the new operating system to have a really low impact on new PC sales.

    The free upgrade could affect new PC sales

    Some PC makers believe that by offering Windows 10 free of charge to those on Windows 8, sales of new computers would be dramatically impacted because no one would actually need to purchase new hardware.

    And while this is true, if Microsoft decides to cancel such a promo, that would also become a double-edged sword.

    Reply
  48. Tomi Engdahl says:

    How GNOME 3.14 is winning back disillusioned Linux users
    http://www.pcworld.com/article/2691192/how-gnome-3-14-is-winning-back-disillusioned-linux-users.html

    GNOME 2 was once the default desktop environment on Ubuntu and most other popular Linux distributions, from Fedora to Debian. It was a stable, simple environment. With GNOME 3 and the GNOME Shell desktop, the GNOME team made radical changes.

    Linux distributions started to bail. Ubuntu thought they could do better, so they made their own Unity desktop. In 2013, Debian switched to their more traditional Xfce desktop as their default, partly because it was a more familiar experience for GNOME 2 users. It wasn’t just Linux distributions, either—many Linux users at the time had negative reactions and looked for other desktop environments.

    Well, if you haven’t tried it in a while, GNOME 3 has improved. Performance is now good. Debian just switched back to GNOME as its default desktop, partly because its accessibility and systemd integration were better than Xfce’s, but the interface has improved enough to make those considerations possible.

    Red Hat Enterprise Linux 7 is using GNOME 3, too. But they’re using it in “Classic Mode,” where GNOME 3 behaves more like GNOME 2.

    The free CentOS distro is based on RHEL, so you’ll see GNOME 3’s Classic Mode in CentOS 7, too.

    Make no mistake: If there wasn’t a classic mode, Red Hat probably would have joined Debian in switching to Xfce, or done something even more drastic. “The last thing we want to do is disrupt our customers’ workflows,” it said.

    Classic mode is actually just a collection of officially supported “extensions” to GNOME Shell that you can install to make it behave like GNOME 2 in just a few clicks.

    Ubuntu won’t switch back to GNOME any time soon. They’re focused on their vision of a computing experience that adapts to different screen sizes

    Reply
  49. Tomi Engdahl says:

    Brown Dog snuffles the 99 percent
    http://channeleye.co.uk/brown-dog-snuffles-the-99-percent/

    A team of boffins is developing a search engine which can find all the data on the world wide web which cannot be seen by search bots.

    The engine, dubbed Brown Dog, searches the web for uncurated data and makes it accessible to scientists.

    Kenton McHenry, who along with Jong Lee lead the Image and Spatial Data Analysis division at the National Center for Supercomputing Application (NCSA) said that the information age has made it easy for anyone to create and share vast amounts of digital data, including unstructured collections of images, video and audio as well as documents and spreadsheets.

    But the ability to search and use the contents of digital data has become exponentially more difficult because digital data is often trapped in outdated, difficult-to-read file formats and because metadata–the critical data about the data, such as when and how and by whom it was produced–is nonexistent.

    McHenry and his team at NCSA have been given a $10 million, five year award from the National Science Foundation (NSF) to manage and make sense of vast amounts of digital scientific data that is currently trapped in outdated file formats.

    So far they have come up with a Data Access Proxy (DAP) which transforms unreadable files into readable ones by linking together a series of computing and translational operations behind the scenes.

    The second tool, the Data Tilling Service (DTS), lets individuals search collections of data, possibly using an existing file to discover other similar files in the data.
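    One way a proxy like the DAP could link conversion operations behind the scenes is a breadth-first search over a registry of available converters. The formats and registry below are hypothetical illustrations; NCSA has not published the DAP's internals:

```python
from collections import deque

# Hypothetical converter registry: each edge is one available
# translation operation from one file format to another.
converters = {
    "wordperfect": ["rtf"],
    "rtf": ["odt"],
    "odt": ["pdf", "txt"],
}

def conversion_path(src, dst):
    """Find a chain of converters from src to dst with BFS,
    mirroring how a proxy could link operations behind the scenes."""
    queue = deque([[src]])
    seen = {src}
    while queue:
        path = queue.popleft()
        for nxt in converters.get(path[-1], []):
            if nxt == dst:
                return path + [nxt]
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None  # no chain of converters reaches dst

print(conversion_path("wordperfect", "pdf"))
# → ['wordperfect', 'rtf', 'odt', 'pdf']
```

    BFS guarantees the shortest chain, which matters here because every extra conversion step risks losing formatting or metadata.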

    According to IDC, a research firm, up to 90 percent of big data is “dark,” meaning the contents of such files cannot be easily accessed.

    Reply
