Here is my collection of trends and predictions for the year 2014:
It seems that the PC market is not recovering in 2014. IDC is forecasting that the technology channel will buy around 34 million fewer PCs this year than last, and things aren’t going to improve any time soon (down, down, down until 2017?). There will be no let-up on any front, with desktops and portables predicted to decline in both mature and emerging markets. Perhaps the chief concern for future PC demand is a lack of reasons to replace an older system: PC usage has not moved significantly beyond consumption and productivity tasks to differentiate PCs from other devices. As a result, PC lifespans continue to increase. The Death of the Desktop article says that, sadly for the traditional desktop, it is only a matter of time before its purpose expires, and that this will inevitably happen within this decade. (I expect that it will not completely disappear.)
While the PC business slowly shrinks, the smartphone and tablet business will grow quickly. Some time in the next six months, the number of smartphones on earth will pass the number of PCs. This shouldn’t really surprise anyone: the mobile business is much bigger than the computer industry. There are now perhaps 3.5-4 billion mobile phones, replaced every two years, versus 1.7-1.8 billion PCs replaced every five years. Smartphones broke down the wall between those industries a few years ago – suddenly tech companies could sell into an industry with $1.2 trillion in annual revenue. Now you can sell more phones in a quarter than the PC industry sells in a year.
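A quick back-of-the-envelope check of those replacement-cycle numbers (using the midpoints of the ranges quoted above; these are rough estimates, not official shipment data):

```python
# Rough annual replacement volumes implied by the installed-base
# figures above (midpoints of the quoted ranges).
phones_in_use = 3.75e9       # midpoint of 3.5-4 billion phones
phone_lifetime_years = 2     # replaced every two years
pcs_in_use = 1.75e9          # midpoint of 1.7-1.8 billion PCs
pc_lifetime_years = 5        # replaced every five years

phones_per_year = phones_in_use / phone_lifetime_years
pcs_per_year = pcs_in_use / pc_lifetime_years
phones_per_quarter = phones_per_year / 4

print(phones_per_year / 1e9)            # ~1.9 billion phones a year
print(pcs_per_year / 1e6)               # ~350 million PCs a year
print(phones_per_quarter > pcs_per_year)  # more phones in a quarter than PCs in a year
```

At these rates the phone industry ships roughly five times the PC industry’s annual volume, which is why one quarter of phone sales exceeds a whole year of PC sales.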
After some years we will end up with somewhere over 3 billion smartphones in use on earth, almost double the number of PCs. There are perhaps 900 million consumer PCs on earth, and maybe 800 million corporate PCs. The consumer PCs are mostly shared and the corporate PCs locked down, and neither is really mobile. Those 3 billion smartphones will all be personal, and all mobile. Mobile browsing is set to overtake traditional desktop browsing in 2015. The smartphone revolution is changing how consumers use the Internet, and this will influence web design.
The only PC sector that seems to have some growth is the server side. The Microservers & Cloud Computing to Drive Server Growth article says that increased demand for cloud computing and high-density microserver systems has brought the server market back from a state of decline. We’re seeing fairly significant change in the server market. According to the 2014 IC Market Drivers report, server unit shipments will grow over the next several years, thanks to purchases of new, cheaper microservers. The total server IC market is projected to rise by 3% in 2014 to $14.4 billion; the multicore MPU segment for microservers and NAND flash memories for solid state drives are expected to see the best numbers.
Spinning rust and tape are DEAD. The future’s flash, cache and cloud article says that flash is the tier for primary data – the stuff christened tier 0. Data that needs to be written out to a slower-response store goes across a local network link to a cloud storage gateway, which holds the tier 1 nearline data in its cache. The Never mind software-defined HYPE, 2014 will be the year of storage FRANKENPLIANCES article says that more hype around Software-Defined-Everything will keep the marketeers and the marchitecture specialists well employed for the next twelve months, but don’t expect anything radical: the only innovation is going to be around pricing and consumption models as vendors try to maintain margins. FCoE will continue to be a side-show and FC, like tape, will soldier on happily. NAS will continue to eat away at the block storage market, and perhaps 2014 will be the year that object storage finally takes off.
The IT managers are increasingly replacing servers with SaaS article says that cloud providers are taking on a bigger share of servers as the overall market starts declining. An in-house system is no longer the default for many companies. IT managers want to cut the number of servers they manage, or at least slow the growth, and they may be succeeding. IDC expects that anywhere from 25% to 30% of all servers shipped next year will be delivered to cloud service providers, and that in three years, by 2017, nearly 45% of all servers leaving manufacturers will be bought by cloud providers. The shift will slow server sales to enterprise IT. Big cloud providers increasingly use their own designs instead of servers from the big manufacturers, and data center consolidations are eliminating servers as well. For sure, IT managers are going to be managing physical servers for years to come – but the number will be declining.
I hope that the IT business will start to grow this year as predicted. Information technology spending is set to increase next financial year, according to N Chandrasekaran, chief executive and managing director of Tata Consultancy Services (TCS), India’s largest information technology (IT) services company. IDC predicts that IT spending will increase by 5 per cent worldwide next year, to $2.14 trillion. The biggest opportunity is expected to lie in the digital space: social, mobility, cloud and analytics. The gradual recovery of the economy in Europe will restore faith in business, and companies are re-imagining their business with changing digital trends in mind.
The death of Windows XP will be in the news many times this spring, and there will be companies trying to cash in on it: Microsoft’s plan to end Windows XP support next spring has prompted IT service providers, as well as competitors, to invest in marketing their own services. HP is peddling its Connected Backup 8.8 service to customers to prevent data loss during migration. VMware is selling a cloud desktop service. Google is wooing users to switch to ChromeOS by making Chrome’s user interface familiar to wider audiences. Perhaps the most direct exploitation of XP’s demise comes from Arkoon, a subsidiary of European defence giant EADS, which promises support for XP users who do not want to, or cannot, upgrade their systems.
There will be talk on what will be coming from Microsoft next year. Microsoft is reportedly planning to launch a series of updates in 2015 that could see major revisions for the Windows, Xbox, and Windows RT platforms. Microsoft’s wave of spring 2015 updates to its various Windows-based platforms has a codename: Threshold. If all goes according to early plans, Threshold will include updates to all three OS platforms (Xbox One, Windows and Windows Phone).
Amateur programmers are becoming increasingly prevalent in the IT landscape. A new IDC study has found that of the 18.5 million software developers in the world, about 7.5 million (roughly 40 percent) are “hobbyist developers,” which is what IDC calls people who write code even though it is not their primary occupation. The boom in hobbyist programmers should cheer computer literacy advocates. IDC estimates there are almost 29 million ICT-skilled workers in the world as we enter 2014, including 11 million professional developers.
The Challenge of Cross-language Interoperability will be talked about more and more. Interfacing between languages will be increasingly important: you can no longer expect a nontrivial application to be written in a single language. With software becoming ever more complex and hardware less homogeneous, the likelihood of a single language being the correct tool for an entire program is lower than ever. The trend toward increased complexity in software shows no sign of abating, and modern hardware creates new challenges: mobile phones are starting to appear with eight cores sharing the same ISA (instruction set architecture) but running at different speeds, alongside streaming processors optimized for different workloads (DSPs, GPUs) and other specialized cores.
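As a small illustration of one common interoperability mechanism – a C foreign-function interface – here is a sketch in Python using ctypes to call into the standard C math library. The library lookup is platform-dependent, and this is just one of many possible interop approaches:

```python
import ctypes
import ctypes.util

# Locate the platform's C math library (e.g. libm.so.6 on Linux).
libm_path = ctypes.util.find_library("m")
libm = ctypes.CDLL(libm_path)

# ctypes must be told the C signature: double cos(double).
# Without this, arguments and results would be mis-marshalled.
libm.cos.argtypes = [ctypes.c_double]
libm.cos.restype = ctypes.c_double

print(libm.cos(0.0))  # 1.0
```

The boilerplate of declaring argument and result types by hand is exactly the kind of friction the article is talking about: every cross-language boundary needs an agreed calling convention and data representation.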
Yet another new USB connector type will be pushed to market. The Lightning strikes USB bosses: Next-gen ‘type C’ jacks will be reversible article says that USB is to get a new, smaller connector that, like Apple’s proprietary Lightning jack, will be reversible. Designed to support both USB 3.1 and USB 2.0, the new connector, dubbed “Type C”, will be the same size as an existing micro USB 2.0 plug.
2,130 Comments
Tomi Engdahl says:
Docker Raises $15M For Its Open-Source Platform That Helps Developers Build Apps In The Cloud
http://techcrunch.com/2014/01/21/docker-raises-15m-for-popular-open-source-platform-designed-for-developers-to-build-apps-in-the-cloud/
The shift to scale out architectures and an app-centric culture has turned out well for Docker and its lightweight open-source “container” technology designed for developers to quickly move code to the cloud.
Docker will use the funding to push toward the general availability of the Docker environment, develop commercial services that pair with the open-source technology and build a team to support the growing community.
The technology path is similar to the one VMware followed in its early days when IT managed their corporate-owned infrastructure.
The similarity to VMware in its early days and the excitement that Docker has generated made it an attractive investment,
“One of the things we learned at VMware is be as frictionless as possible,” Chen said in a phone interview today. “Docker has that ability as well.”
Docker can also be scaled from scratch: it can grow to multiple apps or be used on public or private servers, Chen said. And it can be scaled out in seconds, moved anywhere, all without having to reconfigure everything again.
“Docker is the right tech to fit the rapid updates,” Chen said.
Docker faces the challenge of making its technology easy-to-use with features that make it effective for a developer or a DevOps professional. For this new DevOps pro, Docker has to consider the management and orchestration of apps that are continuously updated using the Docker environment. For example, Docker will develop both public and private registries for developers to store their containers. It also plans to build management and orchestration tools that are needed as people and their organizations manage more and more Docker containers.
Tomi Engdahl says:
Ask Slashdot: It’s 2014 — Which New Technologies Should I Learn?
http://ask.slashdot.org/story/14/01/22/0320214/ask-slashdot-its-2014—-which-new-technologies-should-i-learn
Tomi Engdahl says:
Docker loads up $15m to push containerization into bit barns
Lightweight Linux-lovin approach blows the doors off of typical virtualization
http://www.theregister.co.uk/2014/01/22/docker_series_b_funding/
Containerization expert Docker has slurped $15m in filthy valley lucre to help it push its tech further into data centers under the leadership of a new, experienced chief executive.
The funding will give the company the money needed to grow its engineering team while also ploughing more money into sales and support, says new chief executive Ben Golub, who described Docker as having the potential for becoming a “fundamental architecture” for how new apps are designed.
He reckons the company’s technology is “fundamental enough” to become the basis of a large, publicly traded company.
He might be right: Docker uses the Linux kernel’s LXC, cgroups, and namespaces code to create lightweight software containers that make for more efficient resource usage than typical virtualization, though with the caveat that the underlying host OS is shared across the multiple apps in their secure sandboxes.
Golub intends to make money from Docker through professional services and paid-for management layers, much like Red Hat does with Linux.
“While it is true that the idea of containers have been around for several years, for the most part, their use has been constrained to large service providers,” Golub told us via email. “While Google, Rackspace, and PaaS providers like Heroku and dotCloud used containers, their use required highly specialized teams and tools. Furthermore, containers developed in one environment were not portable across different hosts or different environments.”
“I think containers are already becoming a superior substitute for virtualization when it comes to Linux workloads and scale-out applications that are being built for private and public clouds. Docker can also run in a VM (e.g. VMware or a VM on AWS) to give customers the best of both worlds when this hybrid approach is better. However, Docker isn’t a substitute for Windows workloads today, but that may change in the future,” Chen told El Reg via email.
Tomi Engdahl says:
Press
Italy puts Free Software first in public sector
http://fsfe.org/news/2014/news-20140116-01.en.html
The Italian government has made Free Software the default choice for public administrations. In a document published last Wednesday, the Italian Digital Agency issued rules saying that all government organisations in the country must consider using Free Software before buying licenses for proprietary programs.
The document, titled “Guidelines on comparative evaluation [of software]”, sets out a detailed method which public bodies must follow to decide which software to use. They are required to look for suitable Free Software programs, or choose software developed by the public sector. Only if no suitable programs of these types are available may they consider acquiring non-free software.
“There is no excuse. All public administrations must opt for Free Software or re-use whenever possible”, says Carlo Piana, FSFE’s General Counsel, who participated in the committee that advised on the guideline. “Now Free Software and re-use are the norm, proprietary software the exception. This is the most advanced affirmative action in Europe so far. I’m so proud that Italy leads the way, for once”.
“This is a great example of a simple measure that governments everywhere can take to gain control of their IT infrastructures”, says Karsten Gerloff, FSFE’s President.
Tomi Engdahl says:
In-memory MemSQL bods develop new transaction: $35m now in-pocket
Upstart slurps Silicon Valley sugardaddy cash for database expansion
http://www.theregister.co.uk/2014/01/23/memsql_funding/
The in-memory former-Facebook database jockeys at MemSQL have taken in $35m in valley lucre.
MemSQL’s in-memory database is designed to run on scale-out commodity hardware, and pulls off various tricks to cut the time its system takes to parse a query. It supports structured and semi-structured data, and admins can use SQL to query JSON data.
The company was founded by former Facebook engineers Eric Frenkiel and Nikita Shamgunov in 2011. Since then it has notched up a few prominent customers, including Comcast, Zynga, Verisign, and stock-photo emporium Shutterstock.
MemSQL is one of many new relational (and non-relational) databases vying to unseat incumbent technologies from stalwarts like IBM, Oracle, and SAP. Most of these new databases have either been built by employees of some of the early-2000s breakout consumer internet giants like Google, Facebook and Amazon, or by people trying to sell to these companies.
Tomi Engdahl says:
‘Facebook could lose 80pc of users by 2017’
Researchers at Princeton University predict that Facebook’s popularity will plummet in the coming years
http://www.telegraph.co.uk/technology/facebook/10589169/Facebook-could-lose-80pc-of-users-by-2017.html
Facebook’s growth is set to come to an abrupt halt, just as an infectious disease spreads rapidly and then suddenly dies out, according to a new report.
Researchers at Princeton University in the United States predict that Facebook will undergo a rapid decline in the coming years, losing 80pc of its peak user base between 2015 and 2017.
This conclusion has been reached by comparing the adoption and abandonment dynamics of social networks to the dynamics that govern the spread of infectious disease.
“Ideas, like diseases, have been shown to spread infectiously between people before eventually dying out, and have been successfully described with epidemiological models,” the researchers wrote in their paper.
“Ideas are spread through communicative contact between different people who share ideas with each other. Idea manifesters ultimately lose interest with the idea and no longer manifest the idea, which can be thought of as the gain of ‘immunity’ to the idea.”
The Facebook curve shows a decline in search activity starting in 2013, which corroborates reports that the social network started losing some of its younger user base in 2013.
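The Princeton researchers actually fit a modified “irSIR” model, but the classic SIR epidemic dynamics it builds on are easy to sketch numerically. The parameters below are made up purely for illustration, not taken from the paper:

```python
def sir_step(s, i, r, beta, gamma, dt):
    """One Euler step of the classic SIR model:
    susceptible -> infected (adoption), infected -> recovered (abandonment)."""
    new_infections = beta * s * i * dt
    new_recoveries = gamma * i * dt
    return (s - new_infections,
            i + new_infections - new_recoveries,
            r + new_recoveries)

# Illustrative parameters: fast spread (beta) versus slow loss of interest (gamma).
s, i, r = 0.99, 0.01, 0.0
for _ in range(2000):          # integrate to t = 200 with dt = 0.1
    s, i, r = sir_step(s, i, r, beta=0.5, gamma=0.1, dt=0.1)

# The "infected" (active user) population rises, peaks, then dies out,
# with nearly everyone ending up "recovered" (having abandoned the idea).
print(round(i, 4), round(r, 4))
```

This is the qualitative shape the paper maps onto social networks: rapid adoption, a peak, and then decay as users gain “immunity” to the idea.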
Tomi Engdahl says:
IBM gives up on x86 servers
China’s Lenovo is to buy US-based IBM’s x86 server business, along with related maintenance services, for $2.3 billion, or about 1.6 billion euros.
Lenovo is the world’s largest PC manufacturer, but as demand for PC hardware slackens, the company has been trying to find new sources of revenue in smartphones, tablets, and television sets.
The purchase of IBM’s low-end server business gives Lenovo a foothold in enterprise hardware solutions, which will improve its competitive position against Dell and HP.
The transaction includes IBM’s System x, BladeCenter and Flex System servers and switches, x86-based Flex Systems, NeXtScale and iDataPlex servers and related software, and network operations and maintenance.
IBM keeps its System z mainframes, Power Systems, Storage Systems, Power-based Flex servers, and the PureApplication and PureData appliances.
Source: Tietoviikko
http://www.tietoviikko.fi/kaikki_uutiset/ibm+luopuu+x86palvelimista/a961901
Tomi Engdahl says:
IT needs a new kind of hybrid expert
Plain software development or server administration is no longer enough; what is called for are hybrid skills that extend across the company.
The reason for the changing skill requirements is clear: new technologies demand new skills and broader knowledge.
Big data and analytics, cloud computing, mobile services and the rise of social media have swept through corporate organizations in all parts of the world, including IT departments.
“Everybody has access to the same systems and tools. The point is to be able to create something new and important with them,” he says.
To add value, IT departments must recruit suitable staff. A narrow area of expertise is no longer enough; IT professionals need to master the big picture.
“Traditional IT skills go hand in hand with new roles. IT departments need these stronger players, who produce more value for the company.”
“For example, a competent Java developer can create excellent online shopping applications if she also masters social media practices and is familiar with, say, the relations between the company and its clients,” Foote says.
This new hybrid talent can arise in many technology areas. They might be, say, cloud services architects who understand the business, computer security professionals with marketing skills, or software engineers skilled at more than just coding.
Security sold in plain language
Security-related expertise particularly interests Foote, who believes demand for security professionals will keep growing.
“After all the fuss about NSA spying, you would think that no one needs to underline the importance of information security to companies any more. But companies still do not properly understand its importance.”
“That is why there is a need for professionals who know how to explain security issues to managers in plain language,” Foote says of the information security professionals most in demand.
Source: Tietoviikko
http://www.tietoviikko.fi/cio/it+tarvitsee+uudenlaisia+hybridiosaajia/a961895
Tomi Engdahl says:
5 Hybrid IT Roles Your Business Needs to Succeed in 2014
http://www.cio.com/article/746805/5_Hybrid_IT_Roles_Your_Business_Needs_to_Succeed_in_2014
This year, the ability to simply configure and run a server or develop software in isolation won’t be enough. Employers will aggressively pursue workers with multi-dimensional talent — combinations of technology, domain, business, process and people skills.
It’s clear that the 2014 corporate agenda will be dominated by the integration of big data analytics, cloud computing, mobile technology, and social media into the enterprise. But the focus must not be on the technologies themselves. Everyone has access to the same systems and tools. The differentiator will not be the technology itself, but the business value it delivers — or doesn’t.
“The technologies are a side show to a lot of what’s really critical,” says David Foote, chief analyst at IT labor research and analyst firm Foote Partners. “It’s IT’s ability to do something meaningful with them that’s important.”
Here are five hybrid roles IT organizations will need to fill to stay competitive this year:
1. Enterprise Architects Who Get the Cloud
2. Business Analysts With Integrated Thinking
3. Security Professionals With Marketing Skills
4. Database Pros to Bring Structure to the Unstructured
5. Software Engineers That Do More Than Generate Code
Tomi Engdahl says:
It was inevitable: Lenovo stumps up $2.3bn for IBM System x server biz
7,500 staffers at Big Blue ‘expected’ to transfer over
http://www.channelregister.co.uk/2014/01/23/lenovo_buy_systemx/
IBM has offloaded its failing x86 biz to Lenovo for $2.3bn, albeit a day later than our sources had predicted.
The pair have entered into a definitive agreement – days after talks were confirmed – which includes System x, BladeCenter and Flex System blade servers and switches, x86-based Flex integrated systems, NeXtScale and iDataPlex servers, software and maintenance.
According to Gartner, IBM turned over $5.5bn worth of x86 kit in 2010 but by the end of 2012 the unit sold just $5.6bn.
Under the terms of the deal, Lenovo will cough $2.07bn in cash with the remainder in stock
This is a massive climbdown from the return that Big Blue execs wanted last spring, when it is understood they were asking for up to $6bn. The continued decline in sales at System x obviously convinced them to get shot of it for less.
IBM said it will continue to develop and evolve the Windows and Linux software lines for x86 platforms. Lenovo will start to OEM and resell Big Blue’s storage kit globally including Storwize disk and tape, General Parallel File System software, SmartCloud Entry and part of IBM’s software (Systems Director/ Platform Computing).
System x employs 7,500 people worldwide and IBM said they are “expected to be offered employment by Lenovo”.
x86 market leader HP will be watching developments very closely, having already relinquished the global PC crown to Lenovo last year.
Tomi Engdahl says:
Did object storage come of age in 2013?
Some crossed the chasm. Some turned back…
http://www.theregister.co.uk/2014/01/23/coming_of_age_object_storage_in_2013/
Object storage entered 2013 pretty much as a niche technology but exited the year headed towards mainstream status, with ViPR, Black Pearl and EVault leading the way.
Others, though, held back. As of the year end, we saw IBM doing nothing much with objects, NetApp apparently standing still with its StorageGRID, and Dell exiting the field.
Counterbalancing these three, EMC re-entered the object world with ViPR, while SpectraLogic, Seagate and EVault all entered object storage in significant and different ways. Suppliers gained funding and strengthened products and, suggesting growing respectability, IDC introduced an object storage marketscape with a supplier rating eye-candy chart.
Amazonian storage
We heard in April that the Bezos behemoth, the grand-daddy of cloud object storage, had two trillion objects in its S3 storage cloud, double the number it had a year before. It stored just 10 billion objects in October, 2007. We’re probably looking at 4 trillion now, if the recent rate of progress is extrapolated roughly.
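That extrapolation is simple exponential arithmetic (illustrative only – Amazon publishes milestones, not a growth curve, so the doubling rate is inferred from the two data points above):

```python
from math import log2

objects_oct_2007 = 10e9   # "just 10 billion objects in October, 2007"
objects_apr_2013 = 2e12   # "two trillion objects", double a year before

# At the recent rate of one doubling per year, a year on from the
# two-trillion mark gives the ~4 trillion guessed at above.
print(objects_apr_2013 * 2)   # 4e12, i.e. 4 trillion

# How many doublings did it take to get from 2007 to 2013?
d = log2(objects_apr_2013 / objects_oct_2007)
print(round(d, 1))            # ~7.6 doublings in about 5.5 years
```

Note the earlier growth was actually faster than one doubling per year (about 7.6 doublings in 5.5 years), so the 4 trillion figure is a conservative guess.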
S3 stores data as objects in buckets in a key:value scheme. The details of Amazon’s object storage implementation have not been revealed. With Amplidata, Caringo, Cleversafe, EMC, HDS and Scality all growing, it’s apparent that object technology is not just the preserve of hyper-scale web sites like S3; ordinary enterprises can use the technology too.
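Since Amazon hasn’t revealed the internals, the bucket/key:value programming model described above can only be sketched; here is a toy in-memory stand-in (not S3’s implementation, and not the real S3 API):

```python
class ToyObjectStore:
    """A toy in-memory model of the bucket/key:value interface:
    flat buckets, opaque keys, arbitrary byte values."""

    def __init__(self):
        self.buckets = {}

    def create_bucket(self, name):
        self.buckets.setdefault(name, {})

    def put_object(self, bucket, key, value):
        # Keys are flat strings; any "directory" structure is just
        # a naming convention inside the key.
        self.buckets[bucket][key] = value

    def get_object(self, bucket, key):
        return self.buckets[bucket][key]

store = ToyObjectStore()
store.create_bucket("photos")
store.put_object("photos", "2014/01/cat.jpg", b"\xff\xd8jpegdata")
print(store.get_object("photos", "2014/01/cat.jpg"))
```

The point of the model is its simplicity: no POSIX file semantics, just get/put by key, which is what lets object stores scale to trillions of objects.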
Let’s chance our arm and say that while in 2013 object storage started crossing the chasm, in 2014 it will arrive as a mainstream and accepted storage technology.
Tomi Engdahl says:
AMD Posts Solid Q413 Earnings
http://www.eetimes.com/document.asp?doc_id=1320730&
AMD on Tuesday announced its fourth-quarter 2013 earnings, exceeding its own goal. Increased sales of new game consoles by Sony and Microsoft allowed AMD’s semi-custom SoCs and graphics products to make up for the decline in its chipsets sold for notebooks.
The chipmaker brought in $1.59 billion in revenue, an increase of 9% sequentially and 28% year-over-year. AMD’s gross margin was 35%, its operating income was $135 million, and non-GAAP operating income was $91 million.
Tomi Engdahl says:
Recalculating the newsroom: The rise of the journo-coder?
http://www.journalism.co.uk/news/recalculating-the-newsroom-the-rise-of-the-journo-coder-/s2/a555646/
Through a series of interviews with journalists and developers working at the BBC and Financial Times plus some key players outside of major newsrooms, Liz Hannaford aims to find out about data journalism’s special role today.
Tomi Engdahl says:
Oracle nemesis SkySQL gins up enterprise MariaDB product
Puts database, Galera cluster, and custom manager into a $20k box
http://www.theregister.co.uk/2014/01/23/skysql_mariadb_enterprise/
Anti-Oracle software startup SkySQL has put the finishing touches on a subscription service that wraps up the MariaDB database, Galera cluster tech, and a proprietary manager, as the company tries to get more organizations to dabble in the drop-in replacement for MySQL.
Though MariaDB is a drop-in replacement for MySQL, there are signs that the tech is gaining some features that go beyond those available in version 5.6 of its Oracle-backed counterpart, such as the Parallel Slave tech that featured in the beta of the latest version of the system.
Tech aside, SkySQL has also managed to draw some big names away from MySQL: In September, El Reg broke the news that Google was migrating its internal production servers from MySQL to MariaDB and had worked with SkySQL to implement needed features in the tech. Similarly, folks at Mozilla are migrating to MariaDB, Red Hat has made it the new storage engine in RHEL, and Fedora made it the default implementation of MySQL in Fedora 19.
Tomi Engdahl says:
Apple executives on the Mac at 30: ‘The Mac keeps going forever.’
http://www.macworld.com/article/2090829/apple-executives-on-the-mac-at-30-the-mac-keeps-going-forever.html
Thirty years ago, Apple introduced the Macintosh, and we all learned why 1984 wasn’t going to be like 1984. A lot has changed in 30 years, and yet even in as fast-moving a field as technology, Apple and the Mac are still here.
Tomi Engdahl says:
CES 2014: AMD Showcases FreeSync
http://www.designnews.com/author.asp?section_id=1386&doc_id=271158&cid=nl.dn14
AMD pulled out all the stops for this year’s CES in Las Vegas, showcasing everything from 4K high-resolution displays to various software solutions such as the company’s FreeSync for implementing “dynamic refresh rates.”
AMD developed FreeSync, which blends the monitor’s refresh rate with the number of FPS being generated by the video card, much like Nvidia’s G-Sync, to combat this issue.
In a demonstration at its booth, AMD used two off-the-shelf Toshiba Satellite notebooks to show the effect, running a simple animated windmill with one using typical vsync and the other using FreeSync. The vsync version ran much as you would expect, with noticeable image tearing and roughness, while the FreeSync one produced a clean, smooth image.
The secret behind FreeSync is that it relies on technology found in almost all AMD GPU and APU chips of the last three generations, known as Vblank, which is (or was) used primarily for power saving. Using vsync before new images are ready wastes power; Vblank fixes this by setting a dynamic, variable refresh rate to compensate for varying FPS.
The one drawback of this technology is that displays have to support it in order for it to function, and there are only a handful of display manufacturers who have adopted the technology. There is a movement, however, to create a VESA standard to rectify the issue, but only time will tell if or when it may be adopted.
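The underlying problem is frame-time arithmetic: with a fixed 60 Hz refresh, any frame that takes longer than one ~16.7 ms slot forces the previous image to be shown again (or torn), while a variable-refresh display simply waits for the frame. A toy comparison with made-up render times:

```python
REFRESH_HZ = 60
SLOT_MS = 1000 / REFRESH_HZ   # ~16.67 ms per fixed refresh slot

frame_times_ms = [14, 20, 15, 25, 16, 30]  # illustrative render times only

# Fixed refresh + vsync: a frame slower than one slot misses its
# deadline, so the previous image is displayed twice (visible stutter).
repeated_frames = sum(1 for t in frame_times_ms if t > SLOT_MS)

# Variable refresh (FreeSync/G-Sync style): the display refreshes
# when each frame is ready, so none of these frames would repeat.
print(repeated_frames)  # frames that stutter under fixed 60 Hz vsync
```

Even in this short sequence half the frames miss the fixed-rate deadline, which is exactly the roughness the windmill demo made visible.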
Tomi Engdahl says:
Everything You’re Thinking About Nintendo Is Totally Wrong
http://www.wired.com/gamelife/2014/01/nintendo-mobile/
The entire internet has weighed in with what it believes is the answer to Nintendo’s financial woes: Go mobile, immediately. But the entire internet is wrong.
The conventional wisdom is wrong. It is not an inevitability that Nintendo must put its games on rival hardware or die. It may even be a bad move.
Having been at least convinced that it would be too risky for Nintendo to jettison its hardware business entirely, many analysts and commentators are now staking out what they imagine to be a more moderate and sensible position: Nintendo should put some of its games on others’ platforms. This, too, is a logical fallacy, namely argumentum ad temperantiam: the idea if one is faced with two opposing arguments, the correct position must be somewhere in the middle. But suggesting that Nintendo “dip its toe” into mobile app stores is like suggesting that a couple pondering parenthood consider getting just a little bit pregnant.
Nintendo has a reputation of being “too conservative,” which is true in some ways but not others.
But on the other hand, Nintendo 64 revolutionized 3-D videogames by introducing the analog stick, force feedback and other innovations, not to mention the way Super Mario 64 and Zelda: Ocarina of Time practically wrote the book on how to make 3-D videogames. Nintendo can be a phenomenally deep stick in the mud about some things, but also willing to go all in on crazy new ideas. Sony and Microsoft, in their entire histories in the game hardware business, have never done anything half as crazy as Wii.
Just as eventually Nintendo was forced to put its games on discs, so too will it eventually come around to the reality that it is possible to run a more open and agile digital games store without ruining the entire videogame industry in the process. If that doesn’t work to fix Wii U and 3DS, then Nintendo will likely at least release another generation of hardware and see if that can do it.
And if that doesn’t work, well, then maybe Nintendo will get out of hardware. Nothing lasts forever. But it’s likely that there will be many, many steps between now and then
Tomi Engdahl says:
Amazon’s ‘schizophrenic’ open source selfishness scares off potential talent, say insiders
Moles blame Bezos for paltry code sharing
http://www.theregister.co.uk/2014/01/22/amazon_open_source_investigation/
Amazon is one of the most technically influential companies operating today – but you wouldn’t know it, thanks to a dearth of published research papers and negligible code contributions to the open-source projects it relies on.
This, according to multiple insiders, is becoming a problem. The corporation is described as a “black hole” because improvements and fixes for the open-source software it uses rarely see the light of day. And, we’re told, that policy of secrecy comes right from the top – and it’s driving talent into the arms of its rivals.
This secretiveness, “comes from Jeff,” claimed another source. “It’s passed down in HR training and policy. It’s all very clear.”
Though a select few are permitted to give public talks, when they do, they disclose far less information about their company’s technology than their peers.
“Amazon behaves a lot like a classified military agency,” explained another ex-Amazonian
Multiple sources have speculated to us that Amazon’s secrecy comes from Jeff Bezos’ professional grounding in the financial industry, where he worked in trading systems. This field is notoriously competitive and very, very hush-hush. That may have influenced his thoughts about how open Amazon should operate, as does his role in a market where he competes with retail giants such as Walmart.
But one contact argued that a taciturn approach may not be appropriate for the advanced technology Amazon has developed for its large-scale cloud computing business division, Amazon Web Services.
“In the Amazon case, there is a particular schizophrenia between retail and technology, and the retail culture dominates,” explained the source. “Retail frugality is all about secrecy because margins are so small so you can’t betray anything – secrecy is a dominant factor in the Amazon culture.
“It’s a huge cost to the company.”
Tomi Engdahl says:
Digitalization alone is not enough; we are still too far from industrial IT solutions in terms of speed and efficiency.
What, then, is the right kind of industrial IT? One of the most important criteria for me is that a solution is developed once, and all subsequent deliveries build firmly on that time-tested solution rather than being reworked a second time.
Standard software packages have long been the public face of industrial IT, but all too often they form only part of the solution. Specialist craftsmen are still required for their configuration, deployment and operation.
It reminds me of the history of the first cars: they required a factory-trained chauffeur to drive them.
Cloud services take industrialization several steps further. While a software package must still be installed and maintained, cloud software can be purchased for use almost on a turnkey basis. At its best, its introduction succeeds with self-service instructions alone.
Another important feature I associate with industrialization is compatibility. It is in every industrial producer’s interest to make sure its product fits into an existing environment.
Integration of packaged applications and cloud services with other solutions should reach the level where it does not always have to be rebuilt from scratch. Compatibility also needs to remain stable over time.
To return to the earlier car analogy: few of today’s drivers need factory training to drive a car. An industrial IT solution must likewise be deployable without a special training program.
The expansion of digitalization into new areas is going to require more and more industrially produced IT solutions.
For my part, I want to see ready-made solutions fitted to business needs more often, so that time and money go into business development. At the same time, brand-new innovations that make use of information technology might be invented.
I think this is leading to a stronger division of IT expertise between the producers and the users of IT solutions.
The traditional understanding of IT solutions as a source of competitive advantage is going to change. A good baseline comparison is the aircraft industry and the airlines: airlines using similar equipment can produce very different services with widely different results.
Source: Tivi CIO100 “Tahdon tehdastekoista it:tä”
http://www.tietoviikko.fi/cio/blogit/CIO_100_blogi/tahdon+tehdastekoista+itta/a961933
Tomi Engdahl says:
Facebook Mocks ‘Infection’ Study, Predicts Princeton’s Demise
http://slashdot.org/story/14/01/24/0257232/facebook-mocks-infection-study-predicts-princetons-demise
“In a followup to the earlier story about Princeton researchers predicting the end of Facebook by 2017, Facebook has struck back with a post using similar statistical techniques to predict that Princeton itself may be facing irreversible decline.”
Tomi Engdahl says:
The thirteen-year-old Windows XP operating system is still widely used, even though Microsoft’s support for it ends in April. From that point on, support will be very limited and expensive.
XP machines still run automation lines and control systems, as well as SCADA terminals in factory control rooms. The system is also used in medical devices.
- Even in modern production, people do not always dare to update the machines, because at worst an update may cause interruptions in production, says Jan Mickos, lead security expert at the IT company CGI in Finland.
According to Mickos, some plants still run the even older Windows 95 operating system, whose updates stopped a long time ago.
In response to security concerns, Microsoft announced last week that Windows Defender anti-virus updates for XP will continue until July 2015. Experts agree that this does not eliminate the problems.
Although regular XP support ends in April, Microsoft will sell special tailor-made support at a large additional cost.
- We have priced the customized support so that the customer understands it is wiser to upgrade to a newer operating system. Small and medium-sized businesses and entrepreneurs will not be able to afford it, says Tom Toivonen of Microsoft Finland.
Microsoft did not disclose the price of the special support. According to the Tekniikka & Talous magazine, tailor-made support can cost big companies millions of euros a year, and it is intended to protect critical systems.
The special support is expected to be available for purchase for two to three years.
Source: Tietoviikko
http://www.tietoviikko.fi/uutisia/windows+xp+tuki+jatkuu+ndash+mutta+vain+kovalla+rahalla/a962450
Tomi Engdahl says:
Nintendo Loses Heaps of Money, Slashes Forecasts. So, What Now?
http://www.wired.com/gamelife/2014/01/nintendo-forecast/
We all knew it was coming, but Nintendo unleashed the bad-news bonanza late last night: It won’t make the 55 billion yen (about $520 million) profit it initially forecasted for this fiscal year, but instead it will lose about 25 billion yen ($240 million) due to weaker than expected sales of pretty much all of its products. It lowered this year’s sales forecast for the Wii U console from 9 million units to 2.8 million.
Reactions from the investment community were about what you’d expect.
Nintendo Mulls New Business Model After Forecasting Loss
http://www.bloomberg.com/news/2014-01-17/nintendo-forecasts-net-loss-on-stagnating-sales-of-wii-u-games.html
Tomi Engdahl says:
Apple now spends more on chips than top three PC makers combined
And together, Apple and Samsung are nearly a quarter of the market
http://www.theregister.co.uk/2014/01/25/apple_buys_more_chips_than_hp_lenovo_dell/
In case you were wondering whether the PC industry is still in a slump, the numbers are in on semiconductor purchasing in 2013 – and once again, the biggest spenders weren’t PC vendors.
Together, Apple and Samsung bought $52.5bn worth of semiconductors in 2013, a sum that represented 22 per cent of the “served available market” – a metric that excludes purchases companies make from their own internal divisions.
In other words, when you add up what the three largest PC vendors spent on semiconductors in the year, it comes to $27bn – 11 per cent less than what Apple alone spent, and just over half the total spending of Apple and Samsung combined.
Tomi Engdahl says:
Google Buys UK AI Startup Deep Mind
http://tech.slashdot.org/story/14/01/27/0622255/google-buys-uk-ai-startup-deep-mind
TechCrunch reports that Google has acquired London-based artificial intelligence firm Deep Mind.
It seems to fit well with the emphasis on AI that the company underscored with its hiring of futurist Ray Kurzweil.
Tomi Engdahl says:
Pinterest Is More Popular Than Email for Sharing Stuff Online
http://www.wired.com/business/2014/01/pinterest-more-popular-than-email/
Pinterest has become a potent force on the internet. According to a new study, it’s now one of the primary ways that people share stuff online. It even tops email.
“In a sign of how quickly social media has changed the digital landscape, consumers are now ‘pinning’ things like articles, photos and recipes to share with their friends more often than emailing links,” wrote Kurt Abrahamson of ShareThis in a blog post.
Facebook remained the most popular way to share, according to ShareThis. But Pinterest’s sharing stats are growing the fastest.
Tomi Engdahl says:
Chromebooks Take Other Mobile PCs to School
The Inexpensive Laptops Grab Fifth of U.S. Educational Purchases
http://online.wsj.com/news/articles/SB10001424052702304856504579338941198812358
Chromebooks have come from nowhere to grab nearly a fifth of U.S. school purchases of mobile computers, posing problems for Microsoft Corp. and possibly even Apple Inc.
The inexpensive laptops, which run Google Inc. software but are mostly sold by other companies, accounted for 19% of the K-12 market for mobile computers in the U.S. in 2013, according to a preliminary estimate by Futuresource Consulting. In 2012, Chromebooks represented less than 1% of the market, according to the research firm, whose estimate includes both tablets and notebook PCs but excludes desktop computers.
Mobile computers running Microsoft Windows slid from 47.5% of that market in 2012 to 28% in last year’s third quarter, said Futuresource, which isn’t yet providing comparisons for all of 2013.
For Microsoft, Chromebooks add to the already stiff competition that emerged after the iPad arrived in 2010. IDC said the software company’s share of sales to schools and higher education, which hit 77% in 2010, had dwindled to 43% by the 2013 third quarter.
Tomi Engdahl says:
Valve showers Debian Linux devs with FREE Steam games
Community contributors given unlimited access to Valve titles
http://www.theregister.co.uk/2014/01/24/valve_free_games_for_debian_developers/
Games vendor Valve has offered a surprise present to the Debian Linux community, in the form of subscriptions that give Debian project members free, unlimited access to all Valve game titles – past, present, and future – forever.
According to a message posted to a Debian mailing list on Thursday, the offer is open to all Debian Developers, the project’s name for official, registered contributors.
Valve founder Gabe Newell has long been a vocal supporter of Linux, particularly as compared to recent operating systems from his former employer, Microsoft.
In 2012, Newell described Windows 8 as “a catastrophe for everyone in the PC space” and said that Valve was working on making all 2,500 games in its Steam store available to Linux users as a “hedging strategy.”
Most recently, Valve has been working on SteamOS, a Linux-based turnkey gaming platform that will come preinstalled on Steam Machines – dedicated gaming consoles based on standard PC hardware from a variety of vendors.
Not surprisingly, perhaps, SteamOS is based on Debian. Specifically, it is a fork of the Debian 7.1 “Wheezy” stable distribution with additions and modifications
Tomi Engdahl says:
Altcoins will DESTROY the IT industry and spawn an infosec NIGHTMARE
After Bitcoin cometh the storm. And after the storm…
http://www.theregister.co.uk/2014/01/27/altcoin_gpu_market_crash_security_nightmare/
Much has been written about how Bitcoin will affect libertarian society, banks, money and government, but there are some other effects that bear consideration: what it will do to the IT industry.
If, however, that graphics card is a money-making machine it’s a perfectly legitimate expense. And graphics cards are being sold in huge quantities for mining virtual currency.
While processor development moves along at a good rate, graphics processors are a much more competitive market.
Mining bitcoins is now beyond the processor power of even the fastest of consumer graphics cards but there are plenty of wannabe currencies that can be created thanks to the super-fast processing of high-end graphics cards.
Not surprisingly, demand for the highest of high-end graphics cards has soared. Availability is poor and they are selling at a premium.
But as time goes on and alt currencies either die out, or go the way of Bitcoin and require dedicated hardware, demand for cards will slump.
A market flooded with cards that months ago were selling for thousands of pounds will see prices drop to barely hundreds of pounds
Cointerra went from “hey, let’s build a Bitcoin mining company”, through processor development, test and build, to shipped product in customers’ racks in under nine months. Traditional wisdom would have that cycle take two to three years.
One of the ASIC applications which will benefit is custom chips used to crack passwords. Mining is essentially a brute-force attack on the generating algorithm.
Bitcoin, and all the other alt-coins, is training a skillset for building password-cracking hardware that is both powerful and portable. These devices are effectively an infinite number of monkeys at an infinite number of keyboards. The implications for the security industry are significant. Suddenly, just keeping a device isolated from the internet isn’t good enough.
The opposite side of this is that equipment for hardware-level data encryption also becomes cheap and plentiful. Expect password-encoding ASICs to become a norm.
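The claim that mining is a brute-force attack on the generating algorithm is easy to see in code. Below is a minimal, purely illustrative sketch of proof-of-work search; the function name, data and difficulty are my own choices, and real Bitcoin mining uses an 80-byte block header and a vastly higher difficulty, but the principle is identical: there is no shortcut beyond trying the next guess.

```python
import hashlib

def mine(block_data: bytes, difficulty_bits: int) -> int:
    """Toy proof-of-work: find a nonce whose double-SHA-256 hash of
    (block_data + nonce) falls below a target, i.e. starts with
    `difficulty_bits` zero bits."""
    target = 1 << (256 - difficulty_bits)  # hash must be below this value
    nonce = 0
    while True:
        digest = hashlib.sha256(
            hashlib.sha256(block_data + nonce.to_bytes(8, "little")).digest()
        ).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce
        nonce += 1  # brute force: nothing smarter than trying the next guess

nonce = mine(b"example block", 16)  # ~65,000 hashes on average for 16 bits
```

Each extra difficulty bit doubles the expected number of hashes, which is exactly why the work moved from CPUs to GPUs to ASICs, and why the same silicon skillset transfers directly to password cracking.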
Tomi Engdahl says:
How flash storage raced from glory to despair in 2013
And back again. For some…
http://www.theregister.co.uk/2014/01/17/flash_business_from_glory_to_despair_in_2013/
Flash dramas abounded in 2013, from triumphant IPOs, disastrous IPO after-effects, profound strategy changes, and firms crashing out in flames.
Two CEOs went. Two companies crashed. Several were bought. All in all it was a hell of a year. Let’s start with all-flash arrays.
Here’s where a lot of the flash action was, as all-flash arrays (AFAs) went mainstream. The legacy disk drive array suppliers all went into the all-flash array business as they built product to respond to Nimbus Data, Pure Storage, SolidFire and Violin Memory arrays.
In April Switzerland-based CloudSigma offered its IaaS services using all-flash storage.
Google also offered flash storage for compute-in-the-cloud instances in December.
Some flash firms encountered significant troubles during the year.
Flash storage is hot but it’s not easy money for the startups. Flash foundry capacity is limited and will stay limited, foundries costing $10bn a pop. Crudely speaking though, flash is the new disk, disk is the new tape (for nearline and backup data) and tape is the archive. The flash is located in the same premises as the servers, sometimes inside the servers, and all three storage media devices can be located in public clouds.
But there isn’t enough flash being made and security of chip supply is coming to the fore, as is the growing presence and influence of NAND foundry operators, like Samsung and Toshiba in the business of packaging flash chips into products. We can point to the Seagate Samsung relationship here as well as the Toshiba-Violin nexus, Toshiba’s OCZ purchase, and Toshiba foundry partner SanDisk’s purchase of SMART.
There may be more of this in 2014. We might also see:
Servers adopt flash DIMMS
A PCI flash card supplier shakeout
One or two all-flash array startups collapsing or struggling to survive,
Mainstream storage vendors selling boatloads of flash arrays to their customers
Tomi Engdahl says:
Why Flash storage will be fast and furious in 2014
We look in the rear view mirror before racing down the road ahead
http://www.theregister.co.uk/2014/01/15/flash_in_2013/
Flash had a fantastic year in 2013 with an enormous number of developments.
It was a year of generally positive flash transitions, with cell geometry shrinking, all-flash arrays springing up, flash companies being bought, flash companies crashing back to earth after inflated IPOs or just crashing, and happiness spraying out like sunshine from three hybrid flash/disk array suppliers.
Trying to keep up with all this and fit it into a big picture was like trying to sip delicately from the water coming over Niagara Falls.
We’ll look at the business side of things in a later article, but right now let’s take a look at developments in flash technology.
We didn’t see triple-level cell (TLC) NAND push into enterprise applications in 2013, TLC flash being slow and having a ridiculously short write endurance level.
There was the beginning of a transition towards 1X (19nm-10nm) flash cell geometries from the 2X (29nm – 20nm) technology.
One thing that didn’t happen was the emergence of any non-volatile technology to take over from flash. It’s generally agreed that flash technology may be unable to develop usable enterprise flash storage with acceptable endurance down at 15nm and below. Phase Change Memory and varieties of Resistive RAM, such as HP’s Memristor, are still future tech, with 3D NAND taking up the capacity slack
SSDs steadily developed in the year but, really, the action was elsewhere.
There was a steady iteration of hybrid flash-disk drive technology products, using a small slug of flash cache – less than 24GB typically – and firmware to detect hot files and put them on the flash.
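The hot-file policy described above can be sketched in a few lines. This is a toy model of my own, not any vendor's actual firmware (which uses far more elaborate heat tracking), but it shows the basic idea: count accesses per block and promote a block to the small flash cache once it crosses a threshold.

```python
from collections import Counter

class HybridCache:
    """Illustrative model of a hybrid drive's caching policy:
    blocks accessed at least `threshold` times are promoted into a
    small, fixed-size flash cache; everything else stays on disk."""
    def __init__(self, flash_slots: int = 4, threshold: int = 3):
        self.flash_slots = flash_slots
        self.threshold = threshold
        self.hits = Counter()
        self.flash = set()

    def access(self, block: int) -> str:
        self.hits[block] += 1
        if block in self.flash:
            return "flash"          # fast path: served from flash cache
        if self.hits[block] >= self.threshold and len(self.flash) < self.flash_slots:
            self.flash.add(block)   # hot enough: promote to flash
        return "disk"               # slow path: served from spinning disk

cache = HybridCache()
reads = [cache.access(7) for _ in range(4)]  # block 7 heats up, then hits flash
```

With the defaults above, the first three reads of block 7 come from disk and the fourth from flash, which is why a small slug of cache is enough when workloads have strong hot spots.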
There was one other big flash technology development in the year; SMART Storage announced it had stuffed flash chips into a memory DIMM-type module, using Diablo Technologies IP, and so arrived at flash on a server’s memory bus with access latency below that of PCIe flash.
The number of PCIe flash card suppliers multiplied as everyone and their brother tried to get in on the server app flash acceleration game.
Tomi Engdahl says:
Razer’s Project Christine: A Modular PC Prototype
by Jarred Walton on January 22, 2014 6:38 PM EST
http://www.anandtech.com/show/7716/razers-project-christine-a-modular-pc-prototype
Coming courtesy of Razer, Project Christine ran away from CES with numerous awards and accolades.
First, it’s important to note that the two Christine prototypes shown at CES are apparently not functional (or at least, no longer functional after shipping?).
Anyway, the idea is that you have this modular case (tower/column) where you can plug in GPUs, HDDs/SSDs, and other devices that come in self-contained modules. Need a faster GPU? No longer do you open up your PC and unscrew the old GPU and then install the new GPU; instead, you simply pop out the old module and add a new one
Instead of air-cooling or even liquid-cooling, Razer is apparently using a non-conductive mineral oil that circulates through all of the modules (or at least those that need cooling), with the parts, as I understand it, completely submerged in the oil.
Razer indicated that there’s at least one working prototype that’s currently being used by their CEO.
So far so good, but rerouting PCI Express lanes to custom ports isn’t really all that difficult (relatively speaking).
Swapping GPUs is easy enough, as we already do that with our “modular” desktop PCs. The same applies to storage devices as well as things that might plug into USB ports. But what happens if you want to upgrade your CPU or chipset?
What would be really ground breaking would be a modular PC where you could easily swap any and all components. Maybe that’s something Razer is hoping to deliver in the future, but imagine having the center column contain a large PCI-E backplane that could be upgraded with various options. The default model might come with 24 or 32 PCI-E lanes, while higher end backplanes could boast 48, 72, or even 96 (or more!) lanes.
We could even have a design that could be upgraded to PCI-E 4.0 support in the future, and maybe something with the ability to transition between CPU platforms – so AMD, Intel, ARM, etc. That would take a lot of work and probably wouldn’t really receive much in the way of support from Intel, but it’s a nice dream.
Tomi Engdahl says:
Google Beat Facebook for DeepMind, Creates Ethics Board
https://www.theinformation.com/Google-beat-Facebook-For-DeepMind-Creates-Ethics-Board
Google, which is acquiring DeepMind Technologies, has agreed to establish an ethics board to ensure the artificial intelligence technology isn’t abused, according to two people familiar with the deal.
The unusual step of establishing the ethics committee comes as Google, which is paying more than $500 million to acquire the company, won the deal over Facebook, several people familiar with negotiations said. Facebook was in serious acquisition talks with DeepMind late last year, these people said, and it is unclear why talks fell apart.
Tomi Engdahl says:
IBM reveals radical email interface rethink
Touch-enabled, in-browser, multi-platform email coming later in 2014
http://www.theregister.co.uk/2014/01/28/ibm_reveals_radical_email_interface_rethink/
IBM has revealed a radical new user interface for email at its Connect 2014 conference in the USA.
Dubbed Mail Next, the new interface is inspired by social media and billed as a quantum leap beyond the likes of Outlook.
IBM says this interface is a scientific approach to coping with multiple incoming communications across multiple channels, using analytics to turn personal information managers into tools that help knowledge workers understand what to prioritise. Such an interface is needed, IBM suggests, because a decade or more of personal information managers and unified communications tools haven’t improved many knowledge workers’ feeling of being overwhelmed by email.
Tomi Engdahl says:
Microsoft contributes cloud server designs to the Open Compute Project
27 Jan 2014 3:20 PM
http://blogs.technet.com/b/microsoft_blog/archive/2014/01/27/microsoft-contributes-cloud-server-designs-to-the-open-compute-project.aspx
On Tuesday, I will deliver a keynote address to 3,000 attendees at the Open Compute Project (OCP) Summit in San Jose, Calif. where I will announce that Microsoft is joining the OCP, a community focused on engineering the most efficient hardware for cloud and high-scale computing via open collaboration. I will also announce that we are contributing to the OCP what we call the Microsoft cloud server specification: the designs for the most advanced server hardware in Microsoft datacenters delivering global cloud services like Windows Azure, Office 365, Bing and others. We are excited to participate in the OCP community and share our cloud innovation with the industry in order to foster more efficient datacenters and the adoption of cloud computing.
The Microsoft cloud server specification essentially provides the blueprints for the datacenter servers we have designed to deliver the world’s most diverse portfolio of cloud services. These servers are optimized for Windows Server software and built to handle the enormous availability, scalability and efficiency requirements of Windows Azure, our global cloud platform. They offer dramatic improvements over traditional enterprise server designs: up to 40 percent server cost savings, 15 percent power efficiency gains and 50 percent reduction in deployment and service times.
Microsoft and Facebook (the founder of OCP) are the only cloud service providers to publicly release these server specifications, and the depth of information Microsoft is sharing with OCP is unprecedented. As part of this effort, Microsoft Open Technologies Inc. is open sourcing the software code we created for the management of hardware operations, such as server diagnostics, power supply and fan control. We would like to help build an open source software community within OCP as well.
Tomi Engdahl says:
Reversing course, Google rejects Adobe Web publishing tech
http://news.cnet.com/8301-1023_3-57617840-93/reversing-course-google-rejects-adobe-web-publishing-tech/
CSS Regions would have helped ease magazine-like layouts on the Web. But Google’s priority now is on performance, not features, especially mobile browser performance.
It can be hard to say no to an idea with some merit — especially after already saying yes.
But that’s the position Google is in with an Adobe Systems technology for bringing more sophisticated, magazine-style layouts to Web publishing through a technology called CSS Regions. Google changed its mind after deciding that it was too complex and that it would hamper one of Google’s top 2014 priorities, making Chrome faster on mobile devices, according to Google Chrome programmer Eric Seidel.
Adobe had been working on CSS Regions for years, developing the idea as part of its effort to reincarnate Flash Player abilities as native Web standards. Adobe made progress working CSS Regions support into Google’s Blink browser engine and the Apple WebKit project from which Blink originated. But Seidel proposed working with Adobe to remove CSS Regions code from Blink.
CSS Regions supporters, unhappy with the decision, pushed back with arguments that the technology and a related one, text fragmentation, are useful, and that Google’s “draconian” decision-making style would deter others from contributing to Blink. And Adobe tried to re-reverse Google’s position.
Tomi Engdahl says:
Electronic Arts Sales Come Up Short in Console Transition
http://www.bloomberg.com/news/2014-01-28/electronic-arts-sales-fall-short-on-shift-to-new-consoles.html
Electronic Arts Inc. (EA), the No. 2 U.S. video-game maker, posted third-quarter sales that fell short of analysts’ estimates and cut its full-year revenue outlook as shoppers spend less on games for old consoles.
Tomi Engdahl says:
Google brings Chrome apps to Android and iOS, lets developers submit to Google Play and Apple’s App Store
http://thenextweb.com/google/2014/01/28/google-brings-chrome-apps-android-ios-lets-developers-submit-google-play-apples-app-store/
Google today launched Chrome apps for Android and iOS. The company is offering an early developer preview of a toolchain based on Apache Cordova, an open-source mobile development framework for building native mobile apps using HTML, CSS and JavaScript. Developers can use the tool to wrap their Chrome app with a native application shell that enables them to distribute it via Google Play and Apple’s App Store.
Today’s announcement builds on the company’s launch of Chrome apps in September that work offline by default and act like native applications on the host operating system. Those Chrome apps work on Windows, Mac, and Chrome OS, but now the company wants to bring them to the mobile world.
Chrome packaged apps are written in HTML, JavaScript, and CSS, but launch outside the browser
Tomi Engdahl says:
Lenovo to Reorganize Into Four Business Groups
New Structure to Consist of PC, Mobile, Enterprise, and ‘Ecosystem and Cloud’
http://online.wsj.com/news/article_email/SB10001424052702303277704579348492561696058-lMyQjAxMTA0MDIwODEyNDgyWj
Tomi Engdahl says:
It Begins: AMD Announces Its First ARM Based Server SoC, 64-bit/8-core Opteron A1100
by Anand Lal Shimpi on January 28, 2014 6:35 PM EST
anandtech.com/show/7724/it-begins-amd-announces-its-first-arm-based-server-soc-64bit8core-opteron-a1100
The Opteron A1100 features either 4 or 8 AMD Cortex A57 cores.
AMD will be making a reference board available to interested parties starting in March, with server and OEM announcements to come in Q4 of this year.
AMD tells me we should expect a total “solution” price somewhere around 1/10th that of a competing high-end Xeon box, but it isn’t offering specifics beyond that just yet. Given the Opteron X2150 performance/TDP comparison, I’m guessing we’re looking at a similar ~$100 price point for the SoC.
Tomi Engdahl says:
Facebook puts 10,000 Blu-ray discs in low-power storage system
http://www.pcworld.com/article/2092420/facebook-puts-10000-bluray-discs-in-lowpower-storage-system.html
If you thought Netflix and iTunes would make optical discs a thing of the past, think again. Facebook has built a storage system from 10,000 Blu-ray discs that holds a petabyte of data and is highly energy-efficient, the company said Tuesday.
It designed the system to store data that hardly ever needs to be accessed, or for so-called “cold storage.” That includes duplicates of its users’ photos and videos that Facebook keeps for backup purposes, and which it essentially wants to file away and forget.
The Blu-ray system reduces costs by 50 percent and energy use by 80 percent compared with its current cold-storage system, which uses hard disk drives, said Jay Parikh, Facebook’s vice president of infrastructure engineering, in a talk at the Open Compute summit.
Facebook cold storage efforts lead to petabyte Blu-ray system
http://www.slashgear.com/facebook-cold-storage-efforts-lead-to-petabyte-blu-ray-system-28314787/
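As a quick sanity check on the numbers reported above (my own back-of-the-envelope arithmetic, not from the articles): a petabyte spread across 10,000 discs implies 100 GB per disc, which is consistent with triple-layer BDXL Blu-ray media.

```python
# Back-of-the-envelope check of the cold-storage figures.
discs = 10_000
total_bytes = 10**15                     # one petabyte (decimal units)
per_disc_gb = total_bytes / discs / 10**9
print(per_disc_gb)                       # capacity implied per disc, in GB
```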
Tomi Engdahl says:
Microsoft Open Sources Its Internet Servers, Steps Into the Future
http://www.wired.com/wiredenterprise/2014/01/microsoft-open-compute-servers/
For nearly two years, tech insiders whispered that Microsoft was designing its own computer servers. Much like Google and Facebook and Amazon, the voices said, Microsoft was fashioning a cheaper and more efficient breed of server for use inside the massive data centers that drive its increasingly popular web services, including Bing, Windows Azure, and Office 365.
Microsoft will not only lift the veil from its secret server designs. It will ‘open source’ these designs, sharing them with the world at large.
As it released its server designs, the company also open sourced the software it built to manage the operation of these servers.
A company like Quanta is the main beneficiary of the Open Compute movement.
Thus, Dell and HP will sell machines based on Microsoft’s designs, just as they’ve backed similar designs from Facebook.
This morning, Dell revealed it has also embraced the Project’s effort to overhaul the world’s network gear. Web giants such as Facebook are moving towards “bare metal” networking switches.
The Googles and the Facebooks and the Amazons started this movement. But when Microsoft and Dell get involved, it’s proof the rest of the world is following.
Tomi Engdahl says:
AMD tries to kickstart ARM-for-servers ecosystem
Reveals 64-bit A1100 silicon
http://www.theregister.co.uk/2014/01/29/will_an_arm_be_a_legup_for_amd/
AMD today rolled the dice on a risky proposition: enthusiasm for ARM-powered servers in the data center.
Here’s the potted view of what AMD has launched:
A development platform for an upcoming ARM chip;
The Opteron A1100, a 64-bit ARM processor built on a 28nm process, which it says will go into sampling soon;
A microserver design that will be handed over to the Open Compute Project, as part of the “Group Hug”* common slot motherboard architecture.
The processors will come in four or eight core ARM Cortex-A57 variants
The development kit packages the processors into a Micro-ATX form factor, along with the necessary connectors for developers to throw memory, power and communications at it, and a basic software stack of GNU/Linux, device drivers, Apache, MySQL, PHP, and Java 7 and 8.
McIsaac said these are likely to be the target customers of greatest importance to AMD.
Tomi Engdahl says:
Facebook Attracts Server Startups
http://www.eetimes.com/document.asp?doc_id=1320785&
Servergy Inc. and Rex Computing debuted low power server architectures using Power and Adapteva processors, respectively, at the Facebook-led Open Compute Summit here. The two startups are the latest vendors leveraging the push for low power in big datacenters to disrupt Intel’s dominance in servers.
As many as a dozen companies are working on ARM server SoCs in hopes of gaining an edge over the Atom-based SoCs Intel is now shipping. Servergy and Rex both emerge from a sort of high tech left field with their options.
Tomi Engdahl says:
10 Jobs That Are Being Replaced By Machines
http://mashable.com/2014/01/26/10-jobs-replaced-by-machines/?utm_cid=mash-com-Tw-main-link
Automation may claim as many as 47% of current jobs by 2033, according to a recent Oxford University study. If you’re planning a career that spans beyond the next decade, you may want to strike the following off the list. Why?
“In the future, it’s very likely that many of today’s jobs, from cashier to teller, will be automated and the need for real people to take on these roles won’t be needed as technology will catch up and take on these responsibilities,” says Scott Dobroski, Glassdoor community expert.
“Themes people should be aware of include low-skilled jobs, such as telemarketer or typist, being likely to be replaced by automation first, whereas jobs requiring creativity or a social aspect are not as much at risk.”
Tomi Engdahl says:
IBM’s bailed out of the server market – will they dump Storwize next?
Whither IBM storage now, post-x86 server divestment?
http://www.theregister.co.uk/2014/01/29/storage_blues/
January’s not even ended yet and already we have an interesting technology market happening; IBM’s withdrawal from the x86 server market does lead to a number of questions.
IBM’s piecemeal withdrawal from the hardware market – a retreat to the highlands of the legacy enterprise market – will lead to questions across the board as to what the future is for any IBM hardware. I am not sure of the market acceptance of their converged compute/network/storage strategy in the form of PureSystems, or of their me-too ‘Block’ offering, but surely this is dead now.
The heart of IBM’s Storwize product set is x86-based servers; SVC in particular was ‘just’ an IBM server.
But will IBM formally move Storwize into the software-only product space?
Tomi Engdahl says:
Oracle releases ‘lite’ virtual Big Data appliance
Write an app on your workstation, deploy straight to appliance
http://www.theregister.co.uk/2014/01/30/oracle_releases_big_data_dev_tool/
Big Data Lite, as the tool is called, bundles Oracle Linux, Cloudera software, Hadoop and plenty of Oracle’s cloud connectors, just about replicating the environment available on Big Red’s Big Data Appliance.
The bundle is made available as a virtual machine that’s ready to run on workstation-class machines.
The idea is that developers will use the virtual machine to code big data apps.
Tomi Engdahl says:
UK picks Open Document Format for all government files
Minister Francis Maude wants to break ‘oligopoly’ of software suppliers
http://www.theregister.co.uk/2014/01/30/uk_picks_open_document_format_for_all_government_files/
The UK Government has decided that Open Document Format, the OpenOffice-derived file format, is the best choice for all government documents.
“Browser-based editing is the preferred option for collaborating on published government information. HTML (4.01 or higher e.g. HTML5) is therefore the default format for browser-based editable text. Other document formats specified in this proposal – ODF 1.1 (or higher e.g. ODF 1.2), plain text (TXT) or comma separated values (CSV)”
Tomi Engdahl says:
Cloud, schmoud, says Cisco: The IoT needs ‘FOG COMPUTING’
Borg forks Linux to create hybrid server/router ‘IOx’ operating system
http://www.theregister.co.uk/2014/01/29/cisco_forks_linux_as_iot_operating_system/
Cisco is looking to wrap more of the Internet of Things in its warm embrace, announcing that it’s going to create a Linux-plus-IOS mashup to run IoT apps at the network edge.
Designated IOx, the “fog computing” (Cisco’s term) operating system is being pitched as a way to deal with data coming from IoT devices. For example, an application running on an edge router could pre-process data coming from a network of thousands of sensors to make the data volumes manageable.
In other words, the Borg wants to turn its routers into router-plus-application-server. The owner of the putative sensor network, for example, would avoid having to park a Linux server behind the edge router to process incoming data from the network.
Since the applications would run under IOx/Linux rather than IOS, third parties and end users will be able to write their own software for Internet of Things operations.
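The “fog computing” idea above boils down to running aggregation logic next to the sensors so only summaries cross the network. Here is a minimal sketch of that kind of pre-processing in Python; the function name and windowing scheme are illustrative assumptions, not Cisco’s IOx API:

```python
from statistics import mean

def aggregate_readings(readings, window=10):
    """Collapse raw sensor readings into per-window averages.

    An edge application like this, running on or near the router,
    forwards one summary value per window instead of every raw
    sample, cutting upstream data volume by a factor of `window`.
    """
    summaries = []
    for start in range(0, len(readings), window):
        chunk = readings[start:start + window]
        summaries.append(round(mean(chunk), 2))
    return summaries

# 100 raw samples shrink to 10 summary values before leaving the edge.
raw = [20.0 + (i % 10) * 0.1 for i in range(100)]
print(aggregate_readings(raw))
```

With thousands of sensors, the same pattern (filter, average, threshold) keeps the data volumes manageable, exactly the use case the article describes for an app on an edge router.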
Tomi Engdahl says:
ARM lays down law to end Wild West of chip design: New standard for server SoCs touted
According to Hoyle, your DRAM is at 0x80000000
By Jack Clark, 29th January 2014
http://www.theregister.co.uk/2014/01/29/arm_standardization_sbsa/
Brit processor core designer ARM has forged a specification to smash through a significant barrier to the widespread adoption of its highly customizable chip architecture in data centers.
The specification aims to standardize how ARM system-on-chips (SoCs) interoperate with low-level software, and in doing so spur the creation of a broader selection of applications to run on the 64-bit ARMv8 architecture.
ARM-compatible processors represent an alternative to the traditional x86 chips sold by AMD and Intel into major data centers.
The Server Base System Architecture (SBSA) specification was announced at the Open Compute Project’s Summit.
If adopted, SBSA means that operating system makers will be able to build one image of their software for all SBSA-compatible ARM-powered servers.
The SBSA features multiple levels of standardization.
“Think of it as a specification at the SoC platform level – designed for firmware and OS vendors.”
And it’s worth noting that the rise of ARM in the data center is by no means a sure thing.
Tomi Engdahl says:
Silicon Brains That Think as Fast as a Fly Can Smell
http://slashdot.org/topic/datacenter/silicon-brains-that-think-as-fast-as-a-fly-can-smell/
Silicon that mimics brain cells, connected in the same pattern as a fly’s smell-processing center, shows speed and power in solving computing problems.
Researchers in Germany have discovered what they say is a way to get computers to do more than execute all the steps of a problem-solving calculation as fast as possible – by getting them to imitate the human brain’s habit of finding shortcuts to the right answer.
A team of scientists from Freie Universität Berlin, the Bernstein Center Berlin, and Heidelberg University have refined the idea of parallel computing into one they describe as neuromorphic computing. In their design, a whole series of processors designed as silicon neurons rather than ordinary CPUs are linked together in a network similar to the highly interconnected mesh that links nerve cells in the human brain.
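The “silicon neuron” building block these chips model can be illustrated with a leaky integrate-and-fire neuron. The sketch below is a deliberately simplified software version, assumed parameters and all; the actual German neuromorphic hardware implements far richer analog dynamics:

```python
def simulate_lif(input_current, threshold=1.0, leak=0.9, steps=50):
    """Leaky integrate-and-fire neuron: the membrane potential
    accumulates input, leaks a fraction each time step, and emits a
    spike (then resets) whenever it crosses the threshold -- the
    basic element that neuromorphic processors model in silicon."""
    potential = 0.0
    spikes = []
    for t in range(steps):
        potential = potential * leak + input_current
        if potential >= threshold:
            spikes.append(t)   # neuron fires at time step t
            potential = 0.0    # reset after firing
    return spikes

# A constant input drives the neuron to fire at a regular rate.
print(simulate_lif(input_current=0.3))
```

Networks of such units, wired like a fly’s smell-processing center, compute by spike timing rather than by executing instruction streams, which is the shortcut-taking behavior the researchers exploit.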