Electronics trends for 2015

Here is my collection of trends and predictions for the electronics industry for 2015:

The computer market, once the main driver of IC growth, appears to be approaching saturation. The communications industry is still growing (6.8%). Automotive V2X, LED lighting and smart domestic objects are set to drive semiconductor market growth through the year 2020, according to market analysis firm Gartner.

Car electronics will be hot in 2015. New cars will have more security features, smart infotainment and connectivity in them. It is also an area that smartphone companies are pushing into. The article Automotive Industry Drives Chip Demand says that through 2018, IC demand from automotive customers is expected to exhibit the strongest average annual growth, 10.8% on average. This is significantly higher than the communications industry, in second place with 6.8%. Demand drivers include safety features that are increasingly becoming mandatory, such as backup cameras or eCall, while driver-assistance systems are also becoming ubiquitous. Future drivers will include connectivity, such as vehicle-to-vehicle communications, as well as the sensors and controllers necessary for various degrees of autonomous driving.

Power electronics is a $90 billion-per-year market. The market for discrete power electronics is predicted to grow from $13 billion today to $23 billion by 2024. Silicon rules the power electronics industry, but new materials are quickly making headlines. In the power electronics community, compound semiconductors such as gallium nitride (GaN) are drawing more attention as they try to displace silicon-based power devices, which have been doing the heavy lifting for the past 30 years or so. While silicon-based devices are predicted to remain predominant with an 87% share of the market, SiC- and GaN-based components are expected to grow at annual rates of 30% and 32%, respectively. There’s no denying the cost advantages that silicon possesses.

Chip designs that enable everything from a 6 Gbit/s smartphone interface to the world’s smallest SRAM cell will be described at the International Solid State Circuits Conference (ISSCC) in February 2015. Intel will describe a Xeon processor packing 5.56 billion transistors, and AMD will disclose an integrated processor sporting a new x86 core, according to a just-released preview of the event. The annual ISSCC covers the waterfront of chip designs that enable faster speeds, longer battery life, more performance, more memory, and interesting new capabilities. There will be many presentations on first designs made in 16 and 14 nm FinFET processes at IBM, Samsung, and TSMC.

There is a push toward even smaller processes, and it seems that the next generation of lithography equipment is starting to be used. The earlier expectation was that chipmakers would use traditional immersion lithography for production of 10 nm chips, but it now seems that extreme ultraviolet (EUV) scanners, which allow scaling to 10 nm or even smaller, will be used; TSMC plans to use EUV at 7 nm, says ASML. Intel and TSMC have been injecting money into ASML to push process technology forward.

2015 promises to see initial FPGA product releases and (no doubt) a deluge of marketing claims and counter-claims. One thing is certain: 2015 will not be boring. There will be FPGA products that use processes beyond 20 nm; for example, Altera and Xilinx have committed to using TSMC's 16nm FinFET technology. The publicized (and rumored) race to get to production at 14 nm has seen time frames for initial samples move into 2015. However, with both FPGA companies reporting gross margins of close to 70 percent, it would be possible for either company to take an initial hit on margin to gain key socket wins.

It seems that hardware is becoming hot again as wearables make hardware the new software. Apple made its move when it announced the Apple Watch last quarter, going up against the likes of Google’s Android Wear and others in the burgeoning wearables area of design. Once Apple has bitten into a market, it’s somewhat a given that there’s good growth ahead and that the market is, indeed, stable enough. As we turn to 2015 and beyond, wearables become an explosive hardware design opportunity, one that is closely tied to both consumer and healthcare markets. It could pick up steam the way software did during the smartphone app explosion.

There will be more start-up activity in the hardware sector. In recent years the main focus of start-ups has been software, and hardware sector activity has been lower. The hardware sector has seen some start-up activity as many easy-to-use open hardware platforms have become available, making the development of complex devices easier and reachable for smaller companies. Crowdfunding (Kickstarter, Indiegogo, etc.) has made it possible to test whether new hardware ideas are market-worthy and to get the financing to take them to production.

EEs embrace hackathons and accelerators. Design 2.0 is bubbling up in the engineering community, injecting new energy into the profession. In many ways, it’s the new Moore’s Law. Easy-to-use open hardware development platforms have made it possible to design working hardware device prototypes within hackathons.

The article Silicon Startups Get Incubator reports that there will be new IC start-up activity, as semiconductor veterans have announced plans for an incubator dedicated to helping chip startups design their first prototypes. Keysight, Synopsys, and TSMC have signed exclusive deals to provide tools and services to the incubator. Silicon Catalyst aims to select its first batch of about 10 chip startups before April.

MEMS mics are taking over. Almost every mobile device has ditched the old-fashioned electret microphone, invented way back in 1962 at Bell Labs. Expect new piezoelectric MEMS microphones in 2015, which promise unheard-of signal-to-noise ratios (SNR) of up to 80 dB (versus 65 dB in the best current capacitive microphones). MEMS microphones are growing like gangbusters. Engineers have also found a whole bunch of applications that can use a MEMS microphone as a substitute for more specialized sensors, starting in 2015.
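To put those SNR figures in perspective, the decibel values can be converted back to linear power ratios; a quick sketch (the dB numbers are the ones quoted above):

```python
def db_to_power_ratio(db):
    """Convert a decibel figure to a linear power ratio."""
    return 10 ** (db / 10)

piezo_snr_db = 80  # claimed piezoelectric MEMS mic SNR
cap_snr_db = 65    # best current capacitive MEMS mic SNR

# The 15 dB gap corresponds to a ~31.6x power ratio, i.e. the
# piezoelectric mic's noise floor is over 30 times lower in power terms.
improvement = db_to_power_ratio(piezo_snr_db - cap_snr_db)
print(f"{improvement:.1f}x lower noise power")
```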

There will be advancements in eco-design, with activity around Europe’s Ecodesign directive. The EC’s Ecodesign Working Plan for 2015-2017 is currently in its final study stages; the plan is expected to be completed by January 2015. Chargers will be designed for lower no-load power consumption in 2015, because from February 2016 on, chargers of up to 5 watts may no longer consume more than 0.1 watts when connected with no load. The values for external power supplies are defined in the new Energy Star VI standard.
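What a 0.1-watt no-load limit means in energy terms is simple arithmetic; a sketch, assuming a charger left plugged in year-round (the 0.5 W comparison figure is an assumption for illustration, not from the directive):

```python
HOURS_PER_YEAR = 24 * 365      # 8760 h, ignoring leap years

no_load_limit_w = 0.1          # limit in force from February 2016
old_typical_no_load_w = 0.5    # assumed older charger, for comparison

# Annual energy drawn by a charger left plugged in with no load
limit_kwh = no_load_limit_w * HOURS_PER_YEAR / 1000
old_kwh = old_typical_no_load_w * HOURS_PER_YEAR / 1000
print(f"{limit_kwh:.2f} kWh vs {old_kwh:.2f} kWh per year")
```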

The LED light market keeps growing in 2015. Strategies Unlimited estimates that in 2014 LED lamps worth $7 billion (about 5.7 billion euros) were sold. By 2019, LED lamp sales will have grown to just over 12 billion euros. LED technology will replace other lighting technologies quickly. Strategies Unlimited predicts difficult times for those who do not move to LEDs: the market for all other lamp technologies will shrink 14 percent per year. The current lighting market growth is based on the proliferation of LEDs into all the different application areas.

The IoT market is growing fast in 2015. Gartner is predicting a 30 percent compound annual growth rate for the IoT chip market for the period 2013 to 2020. The move to create billions of smart, autonomously communicating objects known as the Internet of Things (IoT) is driving the need for low-power sensors, processors and communications chips. Gartner expects chips for the IoT market to grow 36% in 2015 (the IoT IC market value in 2014 was from $3.9 billion to $9 billion, depending on how you calculate it). The sales generated by the connectivity and sensor subsystems that enable this IoT will amount to $48.3 billion in 2014 and grow 19 percent in 2015 to $57.7 billion. IC Insights forecasts that web-connected things will account for 85 percent of 29.5 billion Internet connections worldwide by 2020.
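Growth rates like these compound quickly; a small sketch of what a fixed annual rate implies, using the low-end $3.9 billion 2014 estimate from the text:

```python
def project(value, annual_growth, years):
    """Compound a market value forward at a fixed annual growth rate."""
    return value * (1 + annual_growth) ** years

# Gartner's 30% CAGR applied to the low-end 2014 IoT chip estimate
iot_2014_low_b = 3.9
iot_2020_b = project(iot_2014_low_b, 0.30, 2020 - 2014)

# Sanity check on the subsystem figures: $48.3B growing 19%
# lands close to the quoted $57.7B for 2015
subsystems_2015_b = project(48.3, 0.19, 1)
print(f"IoT chips 2020 (low estimate): ${iot_2020_b:.1f}B")
print(f"Subsystems 2015: ${subsystems_2015_b:.1f}B")
```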

With the increased use of IoT, security is becoming more and more important to embedded systems and chip designers. Embedded systems face ongoing threats of penetration by persistent individuals and organizations armed with increasingly sophisticated tools. There is a push for IC makers to add on-chip security features to serve as fundamental enablers for secure systems, but this is just one part of the IoT security puzzle. The trend toward enterprise-level security lifecycle management is emerging as the most promising path to hardened security in the embedded systems underlying the explosive growth of interconnected applications. The trend toward ever more comprehensive hardware support for security continues in 2015: more and more MCUs and specialized processors now include on-chip hardware accelerators for crypto operations.

Electronics is getting smaller and smaller. Component manufacturers are continually developing new and smaller packages for components that are mere fractions of a millimeter and have board-to-component clearances of less than a mil. Components are placed extremely close together. No-lead solder is a relatively recent legislated fact of life that has necessitated new solder, new fluxes, higher temperatures, and new solder processing equipment. Tin whisker problems have also increased dramatically. You should improve device reliability via PCB cleanliness, especially if you are designing something that should last more than a few years.

Photonics will get to the circuit board levels. Progress in computer technology (and the continuation of Moore’s Law) is becoming increasingly dependent on faster data transfer between and within microchips. We keep hearing that copper has reached its speed limit, and that optics will replace copper for high-speed signals. Photonics now can run through cables, ICs, backplanes, and circuit boards. Silicon chips can now have some optical components in them using silicon photonics technologies. For more than 10 years, “silicon photonics” has attracted significant research efforts due to the potential benefits of optoelectronics integration. Using silicon as an optical medium and complementary metal-oxide semiconductor fabrication processing technology, silicon photonics allows tighter monolithic integration of many optical functions within a single device.

Enter electro-optical printed circuits, which combine copper and optical paths on the same board. Electro-optical PCBs use copper for distributing power and low-speed data, and optical paths for high-speed signals. Optical backplane connectors have been developed, as well as a technique to align the small waveguides to transceivers on the board. The next challenge is to develop waveguides on boards where tight bends don’t degrade performance to unacceptable levels.

3D printing will continue to be hot. Additive manufacturing can build complex prototypes, parts, tools, and models in various materials for a variety of uses, and is quickly expanding beyond making one-off products, for example into the space industry. The major space agencies have all taken notice of additive manufacturing as a key enabling technology, and so should you.

3D printing will bring structural electronics. With 3D printing hot in the news, and conformable, flexible, or even printed electronics fitting any shape, it is only a matter of time before electronic circuits can be laid out as part of the 3D-printing process, the electronic framework becoming an integral supporting part of any object’s mechanical structure. For example, “structural batteries” have already been implemented in racing-car aerofoils and in pure electric cars such as the Tesla.

Superconductors are heating up again. Superconductivity will be talked about again in 2015, as there were some advancements at the end of 2014. A group of international scientists working with the National Accelerator Laboratory in Menlo Park, Calif., has discovered that lasers can create conditions for superconductivity at temperatures as high as 140°F. The Massachusetts Institute of Technology (MIT) has discovered a law governing thin-film superconductors, eliminating much of the trial and error for companies that manufacture superconducting photodetectors. With MIT’s new mathematical law, new superconducting chips can be designed with the correct parameters determined ahead of time.

For more trends and predictions, you should also read the articles Hot technologies: Looking ahead to 2015 and IEEE: Top 10 technology trends for 2015.

1,206 Comments

  1. Tomi Engdahl says:

    Obscured part numbers protect IP in production, without the grind of grinding
    http://www.edn-europe.com/en/obscured-part-numbers-protect-ip-in-production-without-the-grind-of-grinding.html?cmp_id=7&news_id=10005510&vID=1326#.VMDfTS53B-s

    Companies frequently require markings on components to be removed to make it hard for rivals to easily find out what parts are used on the PCB. Contract manufacturer Escatec has developed a technique, using an existing production machine, that simplifies the task and eliminates stress to components.

    Escatec has solved this problem by using its existing CO2 laser marker that was initially installed to change the surface structure of solder resist to give a clear and permanent marking on the surface of PCBs. The marking it produces can be for machine-readable codes as well as text, graphics and logos, which show up as white areas on the PCB.

    “We found that using the laser to burn on a ‘sea of numbers’ pattern on the tops of the components made sure that none of the [manufacturers' original] markings can be read,”

    Reply
  2. Tomi Engdahl says:

    Unique MOSFETs Automatically Regulate And Balance Series-Connected Supercaps
    http://powerelectronics.com/discrete-power-semis/unique-mosfets-automatically-regulate-and-balance-series-connected-supercaps

    Two years of development have yielded a MOSFET that provides the necessary characteristics to automatically regulate and balance the leakage currents of series-connected supercaps.

    Advanced Linear Devices (ALD) has developed Supercapacitor Auto Balancing (SAB) MOSFETs that by themselves address regulation and leakage current balancing of series-connected supercaps. Without the proper supercap balancing, overcharging could cause failure or punch through that leads to unreliable performance.

    Individual supercapacitors have a 2.5 to 2.7 V maximum rating so they must be connected in series to work at higher voltages, which requires the balancing of supercap leakage currents for proper operation

    The principle behind the Supercap Auto Balancing MOSFET is basically simple. It is based on the natural threshold characteristics of a MOSFET device.

    During the production process, these MOSFETs are trimmed to operate at specific threshold voltages.

    SAB MOSFETs provide regulation of the voltage across a supercap cell by increasing its drain current exponentially across the supercap when supercap voltages increase, and by decreasing its drain current exponentially across the supercap when supercap voltages decrease.

    When a supercap in a supercap stack is charged to a voltage less than 90% of the desired voltage limit, the SAB MOSFET across the supercap turns off and there is zero leakage current contribution from the MOSFET. On the other hand, when the voltage across the supercap is over the desired voltage limit, the SAB MOSFET turns on to increase its drain currents and keep the over-voltage from rising across the supercap.
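The off-below-90%, exponential-turn-on behavior described above can be illustrated with a toy model; a sketch (the current and slope values are illustrative assumptions, not ALD device data):

```python
def sab_leakage_current(v_cell, v_limit, i_limit=1e-6, slope_mv=60.0):
    """Toy model of a SAB MOSFET's balancing current: negligible well
    below the set-point, rising exponentially as the cell voltage
    approaches and exceeds it (subthreshold-style 60 mV/decade slope).
    All parameter values are illustrative, not ALD device data."""
    decades = (v_cell - v_limit) / (slope_mv / 1000.0)
    return i_limit * 10.0 ** decades

v_limit = 2.5  # desired per-cell voltage limit

# Well below the limit the MOSFET contributes essentially nothing;
# above the limit its drain current grows fast, clamping the cell.
for v in (2.2, 2.4, 2.5, 2.6):
    print(f"{v:.1f} V -> {sab_leakage_current(v, v_limit):.3g} A")
```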

    Reply
  3. Tomi Engdahl says:

    Enabling next-generation avionics systems
    http://www.edn.com/design/analog/4438444/Enabling-next-generation-avionics-systems?elq=c05807ef59d24d78b5d5539b87c1e6c4&elqCampaignId=21288

    Recent generations of MEMs technology are now providing Highly Robust Enabling Performance to Avionics Equipment, with significant advances in size, weight, power (SWAP), and cost.

    Within the Avionics industry, and other equally demanding applications, traditional solutions based on prior-generation MEMs or other inertial technology, have a proven track record of meeting performance objectives. On the other hand, those same technologies have failed to make significant generational advancements on cost and other economies.

    A critical dilemma facing avionics equipment integrators today is to maintain performance, while also improving SWAP/Cost.

    Surveying inertial MEMs components in production today across the entire electronics industry, there are three primary and distinct pedigrees of the technology. The solutions have originated from one of these main application focuses: Military, Automotive, or Consumer. Decades-old military-origin technology is of course highly robust, but inflexible in SWAP and cost. Consumer-origin technology meets aggressive cost goals, but with notable and limiting tradeoffs in Performance and Ruggedness. On the other hand, technology originally targeted at the Automotive industry was specifically optimized to meet demanding goals on all key parameters: Performance, Ruggedness, Cost, Size, Weight, and Power.

    Reply
  4. Tomi Engdahl says:

    A simple approach to develop Spice macro models
    http://www.edn.com/design/analog/4438382/A-simple-approach-to-develop-Spice-macro-models?elq=c05807ef59d24d78b5d5539b87c1e6c4&elqCampaignId=21288

    This transistor-based approach uses (relatively) simple equations which can be modified to address the various processes on which amplifiers are designed.

    The idea is to be able to create a macro model using a few parameters from the datasheet regardless of the input or output topology. This technique is based on the assumption that most op amps have a secondary pole which is well beyond the unity gain bandwidth.

    Typically, the following parameters are needed:

    Supply voltage, open loop gain and load, unity gain bandwidth, slew rate, input common mode range, CMRR, PSRR, Vos, Ios, Ib, open loop output impedance, phase margin, broadband band noise and 1/f, supply current and short circuit current as well the open loop output impedance. For a rail to rail output, you will need the output saturation voltage (dropout voltage) as well sinking and sourcing currents. Also, you will need to specify RL, the load resistance.
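Under the single-pole assumption this technique relies on, a couple of the listed parameters pin down the rest of the small-signal model; a sketch with hypothetical datasheet values (none of the numbers below come from the article):

```python
import math

# Hypothetical op-amp datasheet values, for illustration only
a_ol = 10 ** (100 / 20)   # 100 dB open-loop gain -> 100,000 V/V
gbw_hz = 1e6              # unity-gain bandwidth, 1 MHz
slew_rate = 0.5e6         # 0.5 V/us expressed in V/s

# In a single-pole macro model, the dominant pole sits at GBW / Aol
f_dominant_hz = gbw_hz / a_ol

# Full-power bandwidth for a given output swing: SR = 2*pi*f*Vpeak
v_peak = 10.0
f_full_power_hz = slew_rate / (2 * math.pi * v_peak)

print(f"dominant pole: {f_dominant_hz:.1f} Hz")
print(f"full-power bandwidth at {v_peak} Vpk: {f_full_power_hz:.0f} Hz")
```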

    Reply
  5. Tomi Engdahl says:

    A Chimera of Standards, or the Challenges of Adapting Prototyping Standards
    http://www.edn.com/design/analog/4438443/A-Chimera-of-Standards–or-the-Challenges-of-Adapting-Prototyping-Standards?elq=c05807ef59d24d78b5d5539b87c1e6c4&elqCampaignId=21288

    Through the years there has been a proliferation of standards, and not many industries have created more than our electronics industry. Do you ever wonder why we have a standard for something that seems to be an odd number or gauge?

    Development Board Expansion Standards

    For years component manufactures have offered development systems to assist their customers with designing applications around their parts. For programmable devices such as FPGAs and microcontrollers, there are always connections for interfacing to the other components so that software development can begin along with, or before, the hardware. With time, very loose pseudo standards for these “expansion interfaces” have emerged, some more consistent than others. FPGA vendors such as Xilinx have driven some of these standards like FMC to make it as easy as possible for customers to migrate to the newest platform.

    Xilinx has also used third-party standards like the Pmod™ standard developed by Digilent, and there is a wide selection of peripherals for this interface. Microcontroller manufacturers have been somewhat slower to standardize, many utilizing their own proprietary interfaces. However, market forces like the maker movement and the popularity of the Arduino® platform are herding them towards pseudo standards too.

    The Pmod interface is a great way to mix and match peripherals with an FPGA development board.
    The type definitions make it easier to use the Pmod interface standard with a microcontroller board, but there are still challenges.

    The Arduino Pseudo Standard

    The Arduino pseudo standard is a completely different beast, a different platform developed for a different audience for different reasons. The original Arduino board simply exposed the pins of a simple microcontroller and added enough supporting devices to make it easy to program, yet still affordable for hobbyists. Because of its simple nature, the original pinout was defined by the capabilities of the microcontroller.
    As the platform evolved to support more-capable processors, this pseudo standard was fragmented with a myriad of pin-muxing combinations with, arguably, more exceptions than rules. Some issues, such as support for different I/O voltages and the inconsistency of I2C signals, were addressed in revision 3 of the UNO board. Yet, anyone pairing an Arduino board (or any of the Arduino derivatives) with a shield still needs to carefully review the compatibility.

    So we have the Pmod interface and the Arduino pseudo standard, both readily available from numerous sources.

    Is there any hope of getting a peripheral from one platform to communicate with a controller from the other? Of course anything is possible, but sometimes the cure is worse than the disease.

    Proposed Solution: Use a Serial-Controlled Crosspoint Switch

    One way to address this mapping problem is to put an array of configuration jumpers on the board. While straightforward, this is certainly neither elegant nor user friendly.

    To implement the I2C type, for example, simply configure the mux so that the SDA pin and Pmod pin 4 are enabled on channel A, and SCL and Pmod pin 3 are enabled on channel B. Channel A and B are arbitrary and can be swapped freely.
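The channel-A/channel-B routing described above can be modeled without any hardware; a sketch of a 2-channel, 16-pin crosspoint switch in plain Python (the class and its API are invented for illustration, not Maxim's driver):

```python
# Toy model of a 2-channel analog crosspoint switch (MAX14661-style):
# each of two channels (A, B) can be connected to any of 16 pins.
class CrosspointSwitch:
    def __init__(self, n_pins=16):
        self.n_pins = n_pins
        self.routes = {"A": set(), "B": set()}

    def connect(self, channel, pin):
        """Close the switch between a channel and a pin."""
        if not 1 <= pin <= self.n_pins:
            raise ValueError("pin out of range")
        self.routes[channel].add(pin)

    def routing(self):
        """Return the current channel-to-pins map."""
        return {ch: sorted(pins) for ch, pins in self.routes.items()}

# Route an I2C-type Pmod as in the text: SDA on Pmod pin 4 via
# channel A, SCL on Pmod pin 3 via channel B (A and B are arbitrary).
sw = CrosspointSwitch()
sw.connect("A", 4)   # SDA <-> Pmod pin 4
sw.connect("B", 3)   # SCL <-> Pmod pin 3
print(sw.routing())
```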

    Conclusion
    Upon inspection, there is no evidence to suggest that either the Pmod specification or Arduino pseudo standard originated from the back side of a horse. However, the legacy of issues and factors that led to their creation is encoded deeply within their DNA. While the differences between the two standards seem irreconcilable, the MAX14661 enables us to bring the mythical Arduino-Pmod chimera to life.

    Reply
  6. Tomi Engdahl says:

    Signal integrity testing with a VNA
    http://www.edn.com/electronics-blogs/designcon-central-/4438441/Signal-integrity-testing-with-a-VNA?_mc=NL_EDN_EDT_EDN_analog_20150122&cid=NL_EDN_EDT_EDN_analog_20150122&elq=c05807ef59d24d78b5d5539b87c1e6c4&elqCampaignId=21288

    Failures during product qualification testing can be time consuming to debug. Understanding simple testing methodologies go a long way towards minimizing these risks.

    With frequent use, connectors eventually wear and must be replaced and so must be considered as a consumable item.

    Why use a VNA and S-parameters? Very low-level signals can be measured more accurately with narrow bandwidths, plus they can measure very fast rise times. The 4-port, single-ended S-parameters have become a de-facto standard for describing the electrical properties of any 4-port interconnect.

    the common SI characteristics such as insertion loss or attenuation, delay, reflections, crosstalk, and differential to common mode conversion.

    Reply
  7. Tomi Engdahl says:

    EM simulation tools only go so far
    http://www.edn.com/design/test-and-measurement/4423956/EM-simulation-tools-only-go-so-far

    Tight schedules, budgets, and faster devices have made EMC software tools more attractive than ever. Radiated emissions are always a challenge, and the lower voltage levels of very fast devices have made immunity (ESD, radiated, conducted) even more important than in the past. Certainly there will be no shortage of work for EMC engineers in the near future.

    However, there is actually a shortage of trained, experienced EMC engineers. Many companies do not have a full-time EMC engineer (if they have any at all). If there is an EMC engineer, he or she might be relatively inexperienced.

    Reply
  8. Tomi Engdahl says:

    Andreessen Horowitz:
    Andreessen Horowitz explains 16 tech trends of interest including VR and machine learning
    http://a16z.com/2015/01/22/16-things/

    Reply
  9. Tomi Engdahl says:

    $2 Waves 3D Gesture Into Any Device
    http://www.eetimes.com/document.asp?doc_id=1325368&

    Any embedded device can be made to recognize 3D gestures in mid-air with the addition of the new GestIC from Microchip Technology Inc. (Chandler, Arizona). Microchip supplies all the chips, development software and know-how, it claims, to enable engineers to quickly make any embedded device smart enough to respond to commands drawn in mid-air with your bare hands.

    Microchip believes its newest GestIC chip is the most cost-effective gesture detection system available today. “We not only provide the lowest-cost entry point for easy-to-use yet advanced 3D hand gesture recognition,”

    “but by focusing our newest family member, the MGC3030, on the core gesture detection function, we make the software engineers job quick and easy too — using our free, downloadable Aurea graphical user interface (GUI) and Colibri Gesture Suite.”

    The $2 GestIC (MGC3030 in a SSOP28 package) includes a three-dimensional (3D) gesture processing unit that makes the engineers’ job easy — according to Duvenhage — by detecting a wide variety of gestures with built-in algorithms. The GestIC is inexpensive enough for smart toys, yet smart enough to control audio systems, security systems, lighting systems and any other embedded application that could work smarter with gesture control, the company claims.

    The Woodstar Development Kit (MGC3030) includes all the parts needed to prototype an application

    Reply
  10. Tomi Engdahl says:

    News & Analysis
    Motion Control Comes to Masses with TI Launchpad
    http://www.eetimes.com/document.asp?doc_id=1325405&

    Since the advent of the Arduino, a flood of low-cost development boards have become available to help speed and simplify embedded systems development. Now, motion control of 3-phase DC motors has one of its own. Based on the Texas Instruments C2000 Piccolo MCU, the InstaSPIN-MOTION Launchpad comes pre-loaded with motion control software for just $25.

    The F2806x is part of the TI Launchpad modular development board series

    The new feature in the InstaSPIN-MOTION Launchpad is the inclusion of the SpinTAC velocity and position control suite from LineStream Technologies. The suite's library includes such high-level functions as Identify, which automatically identifies the real inertia and friction in a motion-control system; Move, which produces run-time optimized motor control profiles based on desired start and stop positions, acceleration limits, and the like; and Plan, which allows state-based combining of Move commands. The suite also helps reduce design effort by providing a motor control algorithm with single-parameter tuning that automatically estimates and cancels system disturbances.

    The pre-loaded motor control libraries greatly simplify the development of motion control systems using brushless DC motors. “With brushed DC motors you just apply power and it turns,”

    “but you have to deliberately commutate brushless motors, energizing the coils at the right times.”

    Development boards such as these are part of a larger trend toward simplifying and speeding development of complex mechatronic systems while minimizing the need for special expertise. “Professionals and Makers alike want to be able to rapidly prototype solutions at lower cost,”

    Reply
  11. Tomi Engdahl says:

    Flexible OLED mobile phones are becoming more common

    Samsung Display indicates that the price of displays based on organic LED technology will drop to the level of traditional LCD screens already this year. After that, OLED panels will become common in cell phones at a fast pace. OLED enables the manufacture of devices with flexible screens.

    According to Samsung, the price difference between a traditional screen and an OLED panel was up to 10-15 dollars in 2013.

    Last year, the price difference was reduced to five dollars per screen.

    All manufacturers are now adding mobile phone-sized OLED panels to production.

    Source: http://www.etn.fi/index.php?option=com_content&view=article&id=2332:taipuisat-oled-kannykat-yleistyvat&catid=13&Itemid=101

    Reply
  12. Tomi Engdahl says:

    ARM-based MCU builds on sub-threshold technology for 10-fold power reduction
    http://www.edn.com/electronics-products/other/4438431/ARM-based-MCU-builds-on-sub-threshold-technology-for-10-fold-power-reduction

    Ambiq Micro says its Apollo MCUs redefine ‘low power’ with up to 10x reduction in energy consumption: the ARM Cortex-M4F-based MCUs are based on subthreshold voltage technology to deliver breakthrough improvement in battery life.

    Ambiq Micro, based in Austin, Texas, says it has resolved the issues surrounding placing logic built with sub-threshold technology into volume production in a standard CMOS process. Its first announced products are the Apollo family of four 32-bit ARM Cortex-M4F-based MCUs. In real-world applications, their energy consumption is typically 5 to 10 times lower than that of MCUs of comparable performance, resulting in far longer battery life in wearable electronics and other battery-powered applications. The reduction in energy consumption is achieved using Ambiq’s patented Subthreshold Power Optimized Technology (SPOT) platform.

    Ambiq Micro’s SPOT platform operates transistors at subthreshold voltages (less than 0.5V), rather than using transistors that are turned all the way “on” at 1.8V. It uses the leakage current of “off” transistors to compute in both digital and analog domains. The company’s AM08x5 and AM18x5 families of ultra low power real time clocks, launched in 2013, are based on the same platform.

    Asked if the 24 MHz clock speed was a limit, Ambiq’s spokesman says that there is scope to increase it further but that the process’ limits are in the “10s of MHz” – “this will never be a Ghz-processor, server-class platform – its natural area of application lies in the lowest power, the portable and battery-powered [space]”.

    Reply
  13. Tomi Engdahl says:

    These aren’t the droids you’re looking for: The latest evolution in electronics hand gesturing
    http://www.edn.com/electronics-products/electronic-product-reviews/other/4438470/These-aren-t-the-droids-you-re-looking-for–The-latest-evolution-in-electronics-hand-gesturing-?_mc=NL_EDN_EDT_EDN_today_20150126&cid=NL_EDN_EDT_EDN_today_20150126&elq=a4a1b8d043bc44689d296462ab1288ae&elqCampaignId=21343

    Ben Obi-Wan Kenobi uttered these words and with a wave of his hand summoned the “Force” and was able to control the minds of two Storm Troopers in Lucasfilm Ltd. and now Disney’s Star Wars.

    Well, the advancement of hand gesturing in 2015 has not reached that mind-controlling level by the “Force”, but it has advanced to using an E-field force, a design unique to Microchip, with simplified user-interface options focused on gesture detection for such things as volume control, light dimming and page-turning in e-readers.

    I am a big fan of using natural human prompts as control for the Smart Home and other Internet of Things control. Voice and hand gestures just come naturally to control our amazing new electronic design advancements. Using a Smart Phone or tablet is OK, but nothing surpasses the good old human analog control tools like voice sound and hand/body movement in my humble opinion.

    Voice, Wireless or Infrared Control?
    http://www.planetanalog.com/author.asp?section_id=3065&doc_id=563827&

    Reply
  14. Tomi Engdahl says:

    Beating the heat: Not without a heat sink
    http://www.edn.com/electronics-blogs/brians-brain/4438378/Beating-the-heat–Not-without-a-heat-sink?_mc=NL_EDN_EDT_EDN_today_20150126&cid=NL_EDN_EDT_EDN_today_20150126&elq=a4a1b8d043bc44689d296462ab1288ae&elqCampaignId=21343

    a DSL modem/router combo that came packaged complete with an attached protective clear plastic sheet … a clear plastic sheet which completely blocked the ambient air flow vents on the top of the unit. Although I (barely) noticed and removed the sheet prior to operating the device for the first time, I surmised that others might not be so lucky and might end up prematurely “cooking” their units as a result. And my suspicions were confirmed

    Reply
  15. Tomi Engdahl says:

    Exclusive: Apple supplier Foxconn to shrink workforce as sales growth stalls
    http://www.reuters.com/article/2015/01/27/us-hon-hai-labor-idUSKBN0L00Z520150127

    Taiwan’s Foxconn Technology Group, the world’s largest contract electronics manufacturer, will cut its massive workforce, the company told Reuters, as the Apple Inc (AAPL.O) supplier faces declining revenue growth and rising wages in China.

    Reply
  16. Tomi Engdahl says:

    Capacitive sensing improves wearable devices
    http://www.edn.com/design/sensors/4438469/Capacitive-sensing-improves-wearable-devices?_mc=NL_EDN_EDT_EDN_today_20150127&cid=NL_EDN_EDT_EDN_today_20150127&elq=330b070af4fc43ddb78a63ef29f6f250&elqCampaignId=21359

    Smart, connected wearable devices are definitely a trend.

    These wearable smart devices make life easier for the user. Wearable devices are usually light and small. Typical characteristics include:

    Small Displays (LED or LCD)
    Limited Space for User Interface
    Need to be Easy to Access
    Have Low Power Requirements
    Need to be Low in Cost

    Many wearable devices, however, still have clunky click buttons for a user interface.

    In comparison, a touch interface provides a more intuitive user interface. Wearables with small displays can incorporate a touchscreen for gestures or a swipe interface. Power consumption is lowered by incorporating a capacitive sensor to only turn the device on when it’s worn.

    Capacitive Proximity Sensing can be used to wake the microcontroller from sleep to light up the user interface. There are sensors in the market that can run below 2.5 µA.
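As a rough sanity check on that sub-2.5 µA figure, a quick calculation shows why proximity-wake makes coin-cell wearables practical. The 220 mAh cell capacity below is an illustrative assumption (roughly a CR2032-class coin cell), not a number from the article:

```python
# Rough battery-life estimate for an always-on capacitive proximity
# sensor drawing the quoted 2.5 uA. The 220 mAh capacity is an assumed
# coin-cell figure for illustration; self-discharge is ignored.

def battery_life_years(capacity_mah, current_ua):
    """Hours of operation from a battery, converted to years."""
    hours = capacity_mah / (current_ua / 1000.0)  # mAh divided by mA
    return hours / (24 * 365)

life = battery_life_years(220, 2.5)
print(f"{life:.1f} years")  # roughly 10 years from a coin cell
```

In other words, the proximity sensor itself is far from the life-limiting load; the display and radio dominate once the device wakes.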

    Reply
  17. Tomi Engdahl says:

    Rambus takes noise monitoring on chip
    http://www.edn.com/electronics-blogs/designcon/4438483/Rambus-takes-noise-monitoring-on-chip?_mc=NL_EDN_EDT_EDN_today_20150127&cid=NL_EDN_EDT_EDN_today_20150127&elq=330b070af4fc43ddb78a63ef29f6f250&elqCampaignId=21359

    Rambus Inc has announced the addition of an on-chip noise monitor to its suite of tools and IP cores and will be demonstrating the IP at DesignCon this week.

    The noise monitor is a compact IP block that the company says allows easy and precise noise measurements for both low-power mobile and high-performance server SoCs. Embedded on-chip, the noise monitor aims to eliminate the need to use hand-probing techniques.

    “The Noise Monitor is used to enhance customers’ ability when they are doing very high speed chips to allow them to view and characterize the noise that’s on chip,”

    “What we’ve done is built a very small circuit that can go inside the chip in various points on the power grid that allows you, through an external tool, to look at and monitor on-chip noise,” Frerro said. “It’s a very small piece of IP, 0.01mm-square, but at the same time it gives the customers a lot of power.”

    “Probing on-chip noise externally, you’re maybe up to 1 gigahertz if you are lucky,” he said, estimating the Rambus monitor can go up to 6 gigahertz.

    Reply
  18. Tomi Engdahl says:

    Semiconductor makers like new cars. The amount of electronics in cars keeps growing, which bodes well for companies supplying automotive electronics. Semiconductor sales into cars will total 31 billion dollars this year, which according to IHS is 7.5 percent more than last year. What’s more, this segment does not depend as much on semiconductor business-cycle fluctuations; it depends more on the number of new cars sold.

    According to IHS, the strongest semiconductor demand comes from hybrid-car electronics systems, telematics, and ADAS, or driver assistance systems. In the coming years their growth rate is predicted to be around 18-20 percent.

    The automotive semiconductor market has many small players. Infineon is the market leader with a 9.8 percent share, followed by Freescale at 7.4 percent, Texas Instruments at 6.4 percent, and ON Semiconductor at 3.6 percent. IHS does not report figures for Renesas, even though the Japanese company is a major supplier to carmakers.

    Electronics currently accounts for about 40 percent of the value of a high-end internal combustion engine car. In a hybrid model, electronics can account for up to 75 percent of the car’s value.

    Source: http://www.etn.fi/index.php?option=com_content&view=article&id=2339:autoihin-jatkuvasti-lisaa-elektroniikkaa&catid=13&Itemid=101

    Reply
  19. Tomi Engdahl says:

    Maxim Exits Consumer MEMS, Touch Sensors
    http://www.eetimes.com/document.asp?doc_id=1325430&

    Maxim Integrated Products Inc. has decided to get out of the consumer MEMS and consumer touch sensor markets and focus its sensor business on automotive sensors. Meanwhile Maxim is making moves into wearable equipment, mainly with the provision of power management ICs.

    Reply
  20. Tomi Engdahl says:

    Analog, Embedded Drive TI Upwards in 2014
    http://www.eetimes.com/document.asp?doc_id=1325432&

    “Revenue growth of 8 percent year-over-year was consistent with our expectations,” said Rich Templeton, TI’s chairman, president and CEO, in a statement. “Analog and Embedded Processing drove revenue growth in the quarter, and combined, they comprised 85 percent of fourth-quarter revenue.”

    Reply
  21. Tomi Engdahl says:

    Are TSMC, UMC Affected By Taiwan’s Water Shortages?
    http://www.eetimes.com/document.asp?doc_id=1325427&

    Taiwan’s leading chip foundries, Taiwan Semiconductor Manufacturing Co. (TSMC) and United Microelectronics Corp. (UMC), are unaffected so far by water shortages in Taiwan, thanks to the companies’ efforts in recent years to institute resource-reuse programs.

    Rainfall in Taiwan is at the lowest level since 1947, according to the government’s Water Resources Agency

    Reply
  22. Tomi Engdahl says:

    Touch Control Improvements Address Industrial Needs
    http://www.eetimes.com/document.asp?doc_id=1325423&

    Next-generation touch controllers from Atmel combine two projective capacitance touch sensing technologies to improve performance in sub-optimal conditions. Onscreen dirt and moisture, operation with gloved hands, and electromagnetic noise have been addressed with a combination of advanced analog and digital signal processing. Although primarily targeting consumer and automotive applications, Atmel’s new maXTouch U family also holds promise for bringing multi-touch and sliding operation to control panels in industrial settings.

    The maXTouch mXT874U is the first sampling product in this new family of touch controllers. According to the product datasheet, the controller is able to support stable 25-30mm finger hover tracking, 1.0mm passive stylus sensing, and 3.0-5.0mm glove touch. The device is also able to handle moisture on the screen while still providing reliable multi-touch detection.

    Reply
  23. Tomi Engdahl says:

    Print-On Polymer Multiplies Solar Output
    Photovoltaic cells output boosted with carbon
    http://www.eetimes.com/document.asp?doc_id=1325378&

    Scientists have demonstrated a doubling of the number of electrons produced by a carbon-based photovoltaic polymer, potentially doubling the efficiency of any solar cell. The process, called “singlet fission”, produces “identical twin” electrons from a single photon instead of the normal one, dramatically boosting the theoretical maximum output of solar cells. Instead of losing energy to heat, an extra electron is produced; the process works by applying a polymer solution to an existing solar cell.

    “One of the challenges in improving the efficiency of solar cells is that a portion of the absorbed light energy is lost as heat,” lead scientist at Brookhaven National Labs, Matt Sfeir, told EE Times. “In singlet fission, one absorbed unit of light results in two units of electricity via a multiplication process rather than resulting in one unit of electricity and heat as would occur in a conventional cell.”

    Reply
  24. Tomi Engdahl says:

    RS-485 COMMUNICATIONS MICRO PLC CARD
    http://www.eeweb.com/company-blog/maxim/rs-485-communications-micro-plc-card

    Maxim Integrated employs some of its high performance RF design products into a single system board, creating the MAXREFDES62#. This reference design is very suitable for industrial control and automation, where high system efficiency is a must. The board has a space-saver advantage and can be considered a small-solution footprint.

    Industry 4.0 marks the fourth industrial revolution, characterized by distributed, intelligent control systems. Breaking from a past with large, centralized programmable-logic controllers, Industry 4.0 allows for highly configurable, highly modular factories that accept an ever-increasing number of sensor inputs and operate at a higher output than before. The ultra-small PLC, or Micro PLC, lies at the heart of the Industry 4.0 factory, providing high performance with ultra-low power consumption, in an ultra-small package. The MAXREFDES62# is Maxim’s micro PLC RS-485 communications card.

    The entire system typically operates at less than 500mW and fits into a space roughly the size of a credit card.

    System Board 5984
    MAXREFDES62#: RS-485 Communications Micro PLC Card
    http://www.maximintegrated.com/en/design/reference-design-center/system-board/5984.html?utm_source=EEWeb&utm_medium=TechCommunity&utm_term=2014&utm_content=Content&utm_campaign=Maxim

    Reply
  25. Tomi Engdahl says:

    High-Voltage Design: Living Long and Still Prospering
    http://www.eetimes.com/author.asp?section_id=36&doc_id=1325436&

    Low-voltage design gets most of the attention these days, but there are many applications which require very-high voltages even though they do not deliver significant amounts of current to the load.

    It’s easy to think that almost “everyone” is doing low-voltage designs with power-stingy, battery-operated circuits — but that’s a simplistic and myopic perspective. There are well-known exceptions to the low-power world in applications which must deliver significant power to a load, such as a heater or motor. In those situations, using higher voltages allows use of lower currents for a given power rating
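The voltage-versus-current tradeoff is just P = V·I plus I²R loss in the wiring. A quick sketch with illustrative numbers (the 1 kW load, 0.1 Ω of wiring resistance, and the 48 V / 400 V rails are assumptions for the example, not figures from the article):

```python
# Same delivered power, two supply voltages: higher voltage means
# lower current and far lower resistive loss in the wiring.

def current_a(power_w, voltage_v):
    """Load current for a given power: I = P / V."""
    return power_w / voltage_v

def wiring_loss_w(power_w, voltage_v, wiring_ohms):
    """Resistive dissipation in the wiring: I^2 * R."""
    i = current_a(power_w, voltage_v)
    return i * i * wiring_ohms

# A 1 kW heater fed through 0.1 ohm of assumed wiring resistance:
for v in (48, 400):
    print(v, "V:", round(current_a(1000, v), 2), "A,",
          round(wiring_loss_w(1000, v, 0.1), 2), "W lost")
# At 48 V the wiring dissipates about 43 W; at 400 V, well under 1 W.
```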

    But it’s not all about higher voltages when it comes to reducing current, even though the current may still be in the tens or hundreds of amps. There are many unavoidably high-voltage situations which are also fairly low current, often under 100 mA.

    We routinely and somewhat casually rely on many materials-related technologies which enable our hi-tech advances and innovations

    Why the need for the high voltages and low currents, as seen at many exhibits at the MRS event? These systems and their instrumentation are not “power devices” in the conventional sense, and minimizing IR loss and I²R dissipation is not the primary concern. The need is simple: it’s the laws of physics. These systems require high voltages to steer electron beams, attract and accelerate particles, and change the energy state of atoms. I saw many specialty vendors whose supply product lines began at 10 kV, as well as many high-voltage supplies embedded within highly specialized analysis, fabrication, and measurement systems. Consumers also have a need for voltages in the >10 kV range, to power the magnetron in their microwave oven, and in past times, for the venerable CRT of now-obsolete television displays.

    For designers who have little or no exposure to high-voltage/low-current design, it’s a very different world. There’s little need to minimize IR loss by using heftier connectors and conductors, since voltage drop is not a primary concern at these low currents. Instead, it’s a world of thin conductors, thick insulation, safety interlocks, and mandated minimum physical spacing between conductors and anything nearby. It’s also an unforgiving world where marginal design, inadequate attention to tiny details, and microscopic cracks in insulation can have dangerous consequences for equipment and users.

    Nothing is done quickly, easily, or casually: voltage/current monitoring, probing with test equipment during debug or repair, and designing for user access must all take into account high voltage and its tendency to go through any breach in system physical integrity. Seeing a 25-kV rail cause spark-over in a poorly placed capacitor is an experience you won’t forget.

    In high-voltage designs however, that degree of freedom doesn’t exist and any change in routing must be carefully assessed to make sure it doesn’t violate appropriate design guidelines or numerous regulatory standards for placement, creepage, clearance, and safety

    Reply
  26. Tomi Engdahl says:

    Graphene: Reversible Method of Magnetic Doping Paves Way For Semiconductor Use
    http://science.slashdot.org/story/15/01/28/1641252/graphene-reversible-method-of-magnetic-doping-paves-way-for-semiconductor-use

    A team of physicists at University of California, Riverside have discovered how to induce magnetism in graphene in a way that still preserves the material’s electronic properties, which paves the way for graphene to be used as a semiconductor.

    Researchers Make Magnetic Graphene
    UC Riverside research could lead to new multi-functional electronic devices
    http://ucrtoday.ucr.edu/26810

    Graphene, a one-atom thick sheet of carbon atoms arranged in a hexagonal lattice, has many desirable properties. Magnetism alas is not one of them. Magnetism can be induced in graphene by doping it with magnetic impurities, but this doping tends to disrupt graphene’s electronic properties.

    Now a team of physicists at the University of California, Riverside has found an ingenious way to induce magnetism in graphene while also preserving graphene’s electronic properties. They have accomplished this by bringing a graphene sheet very close to a magnetic insulator – an electrical insulator with magnetic properties.

    “This is the first time that graphene has been made magnetic this way,” said Jing Shi, a professor of physics and astronomy, whose lab led the research. “The magnetic graphene acquires new electronic properties so that new quantum phenomena can arise. These properties can lead to new electronic devices that are more robust and multi-functional.”

    In their experiments, Shi and his team exposed the graphene to an external magnetic field. They found that graphene’s Hall voltage – a voltage in the perpendicular direction to the current flow – depended linearly on the magnetization of yttrium iron garnet (a phenomenon known as the anomalous Hall effect, seen in magnetic materials like iron and cobalt). This confirmed that their graphene sheet had turned magnetic.

    Reply
  27. Tomi Engdahl says:

    Deliver the power
    Lee Ritchey -January 18, 2015
    http://www.edn.com/design/test-and-measurement/4438377/Deliver-the-power

    Nine out of 10 boards that I troubleshoot today have power-delivery problems. Power delivery relates directly to signal integrity, and when you have signal-integrity problems, bits won’t reliably get to their destination. Most of the systems I work with today have Ethernet, running at 40Gbit/s over fiber and 25Gbit/s over copper. Power-delivery problems cause Ethernet links to break, requiring re-establishing the link and delivering packets again.

    When a transmitter needs to send a signal, it must energize a transmission line. That takes power, instantaneous power that must be stored in capacitors. Many data sheets tell you to place bypass capacitors around a board—as close to an IC’s power pins as possible—to deliver power and to compensate for inductance in power-delivery networks (Figure 1). The problem is that most application notes and data sheets are based on 20-year-old technology, when signal rise times were much slower, say 15ns. Rise times for, say, DDR3 memory are around 100ps.

    I’ve seen some boards that work no differently whether or not they have bypass capacitors. That’s because the capacitors are ineffective at today’s speeds.
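One way to see why the capacitors stop helping is their self-resonant frequency: above it, the part’s parasitic mounting inductance dominates and the “capacitor” looks inductive. A quick sketch with typical assumed values (100 nF capacitance, 1 nH of mounting inductance, ESR ignored; none of these figures are from the article):

```python
import math

# Bandwidth of a fast edge vs. where a bypass capacitor actually works.

def knee_frequency_hz(rise_time_s):
    """Common rule of thumb: signal bandwidth ~ 0.35 / rise time."""
    return 0.35 / rise_time_s

def self_resonant_hz(c_farads, l_henries):
    """Above this frequency the parasitic inductance dominates."""
    return 1.0 / (2 * math.pi * math.sqrt(l_henries * c_farads))

def impedance_ohms(f_hz, c_farads, l_henries):
    """Magnitude of the series L-C impedance (ESR ignored)."""
    w = 2 * math.pi * f_hz
    return abs(w * l_henries - 1.0 / (w * c_farads))

f_knee = knee_frequency_hz(100e-12)       # ~3.5 GHz for 100 ps edges
f_srf = self_resonant_hz(100e-9, 1e-9)    # ~16 MHz self-resonance
z = impedance_ohms(f_knee, 100e-9, 1e-9)  # ~22 ohms: purely inductive
print(f_knee, f_srf, z)
```

With a 3.5 GHz signal bandwidth and a capacitor that stops being capacitive around 16 MHz, the bypass network is indeed doing very little at the frequencies that matter, which is the author’s point.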

    Another major issue arises because many of today’s high-speed boards have many power rails.

    You want point-of-load converters to be close to the load for two reasons.

    One is speed: the closer they are to the load, the less inductance there is in the delivery network to cause voltage drops. The other is that it minimizes the distance that the high current needs to flow on the board. The raw supply can tolerate a higher voltage drop because we regulate the voltages at the load.

    The board I’ve described with 29 VDD rails has 22 layers, but only six are for signals. The rest are for delivering power.

    In designing such a board, place VDD and VSS layers close together, which results in the most plane capacitance, so keep them in pairs throughout the stack. Signal layers are usually single stripline layers today, in between the plane pairs.
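The plane capacitance of a closely spaced VDD/VSS pair can be estimated with a simple parallel-plate approximation; the plane size, spacing, and dielectric constant below are illustrative assumptions, not values from the article:

```python
# Parallel-plate estimate of the capacitance between a VDD/VSS plane
# pair: C = eps0 * er * A / d. Assumed example: 10 cm x 10 cm planes,
# 0.1 mm apart, FR-4-like er of 4.5.

EPS0 = 8.854e-12  # vacuum permittivity, F/m

def plane_capacitance_f(area_m2, spacing_m, er):
    return EPS0 * er * area_m2 / spacing_m

c = plane_capacitance_f(0.1 * 0.1, 0.1e-3, 4.5)
print(f"{c * 1e9:.1f} nF")  # ~4 nF of very low-inductance capacitance
```

A few nanofarads sounds small next to discrete bypass capacitors, but because it comes with almost no series inductance it stays effective at frequencies where the discretes have long since gone inductive, which is why the stackup keeps the plane pairs close together.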

    Reply
  28. Tomi Engdahl says:

    The Big IDEA – Dual Configurable Logic Design Contest
    http://www.eeweb.com/company-blog/nxp/the-big-idea-dual-configurable-logic-design-contest

    Welcome to the 2015 Big I.D.E.A., a design contest featuring NXP’s unique product line-up across the complete spectrum, including Dual Configurable Logic, Smart Analog, Mosfets and Power.

    Everyone has a chance to win thousands of dollars worth of prizes, with awards at every level of the competition

    Reply
  29. Tomi Engdahl says:

    Qualcomm Cuts Outlook, Warning Its Snapdragon 810 Dropped From a Flagship Device
    http://recode.net/2015/01/28/qualcomm-cuts-outlook-warning-its-snapdragon-810-dropped-from-a-flagship-device/

    Qualcomm on Wednesday cut its financial outlook for the current fiscal year, confirming its Snapdragon 810 chip was dropped by a large customer.

    A Qualcomm representative declined to say which phone maker has dropped the Snapdragon 810, though reports have said the chip has heat issues and that Samsung had pulled the processor from its upcoming Galaxy S6. The 810 has been announced for use in some other products, including the LG Flex 2.

    Reply
  30. Tomi Engdahl says:

    Ethernet acquisition module offers 16-bit resolution
    http://www.edn.com/electronics-products/other/4438493/Ethernet-acquisition-module-offers-16-bit-resolution?_mc=NL_EDN_EDT_EDN_today_20150128&cid=NL_EDN_EDT_EDN_today_20150128&elq=7cb8d2404532459b8cb7dd6b1e39e3ba&elqCampaignId=21374

    A multifunction data-acquisition device, the E-1608 from Measurement Computing measures eight single-ended or four differential analog inputs at a sample rate of up to 250 ksamples/s with 16-bit resolution. In addition, the module furnishes two 16-bit analog outputs, eight individually configurable I/O lines, and a 32-bit counter input.

    The E-1608 has a built-in 10/100Base-T Ethernet communication port with auto-negotiation

    Windows software options for the E-1608 include DAQami and TracerDAQ to display and log data, along with support for C, C++, C#, Visual Basic, and Visual Basic .NET. Drivers for DASYLab and LabView are also provided.
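Two back-of-envelope numbers follow from those specs. Note the ±10 V input span below is an assumption for illustration (check the datasheet for the actual ranges), and a multiplexed DAQ splits its aggregate sample rate across the channels it scans:

```python
# Resolution and per-channel throughput for a 16-bit, 250 kS/s
# multiplexed acquisition module like the E-1608.

def lsb_volts(span_v, bits):
    """Smallest resolvable step: full span divided by 2^bits codes."""
    return span_v / (2 ** bits)

def per_channel_rate_sps(total_sps, channels):
    """One shared ADC splits its aggregate rate across scanned channels."""
    return total_sps / channels

print(lsb_volts(20.0, 16))           # ~305 uV/code over an assumed +/-10 V span
print(per_channel_rate_sps(250e3, 8))  # 31.25 kS/s per channel in an 8-ch scan
```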

    Reply
  31. Tomi Engdahl says:

    Advanced fault models in small-scale CMOS technology nodes
    http://www.edn.com/electronics-blogs/day-in-the-life-of-a-chip-designer/4438467/Advanced-fault-models-in-small-scale-CMOS-technology-nodes?_mc=NL_EDN_EDT_EDN_today_20150128&cid=NL_EDN_EDT_EDN_today_20150128&elq=7cb8d2404532459b8cb7dd6b1e39e3ba&elqCampaignId=21374

    With small-scale CMOS technology nodes, the probability of physical defects occurring in the device increases. Various defects occur which cannot be detected with the help of conventional Single Stuck-at and Transition Fault models. Due to this barrier, use of advanced fault model for detection of physical defects becomes necessary.

    ATPG tools support advanced fault models that can target various classes of faults within the design. Ideally, a tool can apply test pattern sets that cover the possible faults within the design, increasing defect coverage and drastically improving test quality. Path delay, Hold time, Small delay, and Iddq are the advanced fault models supported by ATPG tools beyond the conventional Single Stuck-at and Transition fault models.
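To make the conventional Single Stuck-at model concrete, here is a toy fault simulation (not a real ATPG flow, just an illustration): a tiny netlist, y = (a AND b) OR c, is evaluated fault-free and with its internal AND node stuck at 0, and the input patterns that detect the fault are listed:

```python
from itertools import product

# Toy single stuck-at fault simulation on y = (a AND b) OR c.
# A pattern "detects" the fault when the faulty circuit's output
# differs from the fault-free output.

def circuit(a, b, c, stuck_and_0=False):
    n = 0 if stuck_and_0 else (a & b)  # internal AND node, maybe stuck-at-0
    return n | c

detecting = [p for p in product((0, 1), repeat=3)
             if circuit(*p) != circuit(*p, stuck_and_0=True)]
print(detecting)  # only a=1, b=1, c=0 propagates the fault to the output
```

The point of the more advanced models (path delay, small delay, Iddq) is precisely that many physical defects never produce a logic difference like this at slow speed, so a pattern set that covers all stuck-at faults can still miss them.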

    Reply
  32. Tomi Engdahl says:

    Moving Atomically Thin Semiconductors for Use in Flexible Devices
    http://video.techbriefs.com/video/Moving-Atomically-Thin-Semicond;Electronics-Computers

    Researchers from North Carolina State University have developed a new way to transfer thin semiconductor films, which are only one atom thick, onto arbitrary substrates – paving the way for flexible computing or photonic devices. The films, made of molybdenum disulfide (MoS2), have electronic and optical properties similar to materials already used in the semiconductor industry.

    Reply
  33. Tomi Engdahl says:

    Vista Virtual Prototyping
    http://www.mentor.com/esl/resources/overview/vista-virtual-prototyping-f1e40feb-801c-4958-b526-0690d1754c4e?contactid=1&PC=L&c=2015_01_28_embedded_technical_news

    Vista Virtual Prototyping provides an early, abstract functional model of the hardware to software engineers even before the hardware design is implemented in RTL. It can run software on embedded processor models at speeds par with board support packages, providing sufficiently fast simulation models for OS and application software validation. The Vista Virtual Prototyping solution has two distinctive components: creation of the transaction-level modeling (TLM) platform and usage of the virtual prototype.

    Reply
  34. Tomi Engdahl says:

    Product How-To: Disciplining a precision clock to GPS
    http://www.edn.com/design/military-aerospace-design/4375055/Product-How-To–Disciplining-a-precision-clock-to-GPS?&elq=bb24564432c4425fb2724a317fdd858a&elqCampaignId=21226

    Precision timing is an essential component of modern communications and navigation systems. While the signals from GPS satellites are sufficient for many applications, often these systems require a local precision timing reference. This is particularly important for applications which require higher stability and/or availability than GPS can provide. In these applications, it is often valuable to form a composite clock, which exploits both the short-term stability of the local clock and the long-term accuracy of GPS. The local ensemble is generated by a technique known as “disciplining” in which the timing output of the precision clock is compared to that of the GPS system and the frequency of the clock is gently steered into compliance with the GPS. Whether the local clock is a crystal oscillator or a high precision atomic clock, optimum disciplining requires an understanding of the noise properties of both the clock and the GPS reference source in order to avoid the risk of accidentally degrading the clock’s performance while trying to improve it.

    GPS is a valuable reference source because of its ready availability and long-term stability and accuracy. On shorter time scales, however, relatively low cost clocks, such as quartz or rubidium oscillators, have significantly better stability, as shown below in the Allan Deviation (ADEV) stability chart of Figure 1.
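The Allan deviation mentioned above has a simple non-overlapping estimator: for fractional-frequency samples y taken at interval tau0, sigma_y(tau0) = sqrt(0.5 · mean((y[i+1] − y[i])²)). A minimal sketch (the data below is a toy sequence, not from the article):

```python
import math

# Minimal non-overlapping Allan deviation at the basic sampling
# interval tau0, from fractional frequency samples.

def adev(y_samples):
    """sigma_y(tau0) = sqrt(0.5 * mean((y[i+1] - y[i])^2))"""
    diffs = [b - a for a, b in zip(y_samples, y_samples[1:])]
    return math.sqrt(0.5 * sum(d * d for d in diffs) / len(diffs))

# Toy alternating fractional-frequency data:
print(adev([0.0, 1.0, 0.0, 1.0]))  # sqrt(0.5), about 0.707
```

Unlike the ordinary standard deviation, this first-difference form stays finite for the drifting, non-stationary noise that oscillators exhibit, which is why stability charts like the Figure 1 referenced above are plotted in ADEV rather than variance.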

    Reply
  35. Tomi Engdahl says:

    Ultra-low power medical sensing devkit & IC/IP
    http://www.edn.com/design/design-tools/development-kits/4438505/Ultra-low-power-medical-sensing-devkit—IC-IP?_mc=NL_EDN_EDT_EDN_analog_20150129&cid=NL_EDN_EDT_EDN_analog_20150129&elq=7015abf414304c9e8e407f99dceb8ba5&elqCampaignId=21392

    The Holst Centre and imec have announced a devkit for ultra-low-power medical sensing applications. It builds upon their multi-sensor data acquisition chip (MUSEIC), which is also available as licensable IP.

    According to Holst,”The development kit consists of a customizable sensor layer (including 3-lead ECG, bio-impedance, accelerometer, and microphone), the MUSEIC chip, SD card storage, a Bluetooth (4.0) and Bluetooth low-energy compliant radio, and a separate ARM Cortex M4 processor.”

    the kit consumes only 10mW

    Imec and Holst Centre announce Multi-sensor Data Acquisition IC and Open Hardware Development Kit for Personal Health Monitoring
    http://www2.imec.be/be_en/press/imec-news/imec-Holst-Centre-ECG-chip-Museic-development-kit.html

    Reply
  36. Tomi Engdahl says:

    First Look: 10nm
    http://semiengineering.com/first-look-10nm/

    Problems and an early look at best practices that will be required for dealing with the next level of complexity.

    As the semiconductor industry begins grappling with mass production at 14/16nm process nodes, work is already underway at 10nm. Tools are qualified, IP is characterized, and the first test chips are being produced. It’s still too early for production, of course—perhaps three years too early—but there is enough information being collected to draw at least some impressions about just how tough this next node will be.

    So what are the big challenges at 10nm? There are several. Individually they pose challenges, and together, they pose even bigger ones. But unlike previous nodes, all of them have to be solved together.

    Multi-patterning
    The proposed insertion of commercially viable EUV lithography at 7nm has dramatically increased the anxiety level over 10nm designs. Much has been written about the travails of EUV development, which originally was expected to begin rolling out at 45nm. It’s a big problem on the manufacturing side, of course, but it’s a big problem on the design side, too. It means multiple photomasks will be required rather than one because the 193nm laser beam is too wide, and designers need to take this into account. It’s possible that some devices won’t print as cleanly as they expect, and the more irregular the shapes the higher that possibility becomes.

    Design teams will have a choice of triple patterning, self-aligned double patterning (SADP), or litho-etch-litho-etch-litho-etch (LELELE). While these techniques are well understood, each of them can cause a boatload of problems if any changes are made to original layouts.

    “Most people have given up on EUV at 10nm,”

    “The surprise was that most people didn’t really notice double patterning at 20nm. But triple patterning is a harder problem to solve because people have to do some of this at the IP level.”

    Electromigration
    Electromigration grows worse at 10nm, as well. EM is the displacement of atoms as a result of current flowing through a conductor. Until 20nm, chipmakers relied on a capping layer of silicon carbon nitride and a copper alloy to control EM—basically using a barrier to keep the atoms from moving. New designs will require new materials for capping layers, most likely cobalt compounds. But while these materials have seen some traction at older processes, they will need to be well tested at 10nm before they enter mass production.

    “Electromigration is an issue every time we talk about skinnier wires,”

    Intellectual property
    Rules are key here. Design teams have been working with an increasing number of restrictive design rules for the past three process nodes, and the number of rules will only increase. That also makes it harder to develop commercial IP that can be partially customized, though, and it takes longer to characterize and qualify what is developed.

    “The paint is still wet at 16/14nm,”

    “Implementation and manufacturing will become more of a challenge, although the principles used in 16nm remain the same at 10nm,”

    Other issues
    On the system-design side, things get more complicated in other ways, too. Questions are beginning to be asked regularly about soft errors in more densely packed chips using smaller features. Memory makers have been dealing with this for some time, but logic designers have not.

    “We don’t know if it’s a problem yet,”

    Reply
  37. Tomi Engdahl says:

    Tools And Flows In 2015
    http://semiengineering.com/tools-and-flows-in-2015/

    Sometimes predictions are interesting by what is not said. That is certainly true for this year and possibly indicates a period of change ahead.

    “EDA grows by solving new problems as discontinuities occur and design cannot proceed as usual. In semiconductors and electronics, discontinuities happen at such a fast rate that it seems almost continuous.”

    “EDA is all about solving new problems, both in established markets, and for new customers and industries,” Derrick notes. “Developing new EDA solutions for markets in transition, like automotive, aerospace, the broader transportation industry, and the IoT, will fuel the growth of the design automation industry long into the foreseeable future.”

    The Big Shift Left
    It appears as if everyone has gotten tired of the term Electronic System Level (ESL) and the industry has created a new name for it – The Big Shift Left. Chi-Ping Hsu, senior vice president, chief strategy officer for EDA and chief of staff to the CEO at Cadence, defines it as “Shift Left means that steps, once done later in the design flow, must now start earlier. Software development needs to begin early enough to contemplate hardware changes.”

    “the tasks of testing, validating and verifying systems for safety and security are critical. It can no longer be addressed by adding more development resources and as a result there is increasing demand to deploy EDA automation tools and prototyping technologies that will help shift development and testing left.”

    All of the big three EDA companies see the benefits of virtual prototyping and bringing together the hardware and software aspects of the system.

    “There will be more tools for hardware/software co-design to help manage the tradeoffs for system-on-chip design,”

    Some see SystemC as the language of choice for higher levels of abstraction, but not everyone is in agreement. “The huge number of gates in a chip design dictate that we must move to higher levels of design,”

    Others see the need for languages that span multiple levels of abstraction. “Today, SystemC is heavily used for the development of blocks that process specific algorithms,” says David Kelf, vice president of marketing for OneSpin Solutions. “Transaction-level models of the algorithmic processing components in a chip are modeled using C, with a SystemC wrapper. These are verified and then synthesized into Verilog.” Kelf feels that this is too many languages.

    “For years we’ve been hearing about the benefits of high levels of abstraction,” says Brett Cline, who was with Forte Design Systems and is now in the system-level design group of Cadence. Benefits he lists include better productivity, faster verification, and architectural exploration. “High Level Synthesis (HLS) has been a key contributor to this move to a higher level of abstraction and has delivered on these values and more.”

    Kelf also sees change happening in a bottom-up manner. “Increasingly, more control components that go with these algorithmic parts are being coded in SystemC rather than Verilog. Due to more intense event modeling required of these control sections, we will see increased use of SystemC at RTL.”

    Verification
    Bernard Murphy, chief technology officer for Atrenta, provides a summary of the changes he expects for verification: “I’ll continue to predict more scalable approaches to verification and no doubt I’ll continue to be wrong, but at some point I have to be right. So I’ll stick with the long bet: IPs have to be more completely verified, and system integration verification has to become more reliant on architectures designed for verifiability using static and constrained formal methods. System verification and validation will continue to move towards software-driven verification techniques. Also, given the limitations of validating software on proxies for silicon, expect more two-pass validation – first silicon will be used to fully debug hardware and software, and the second will be the final silicon.”

    Emulation
    One segment of verification that has been doing very well recently is emulation. Lauro Rizzatti, a verification consultant provides his view of the market. “Hardware emulation may be well over 20 years old, but the prediction that one day a software emulator will ring its death bell was never realized. It will be at the foundation of every verification strategy because no other verification tool is able to bridge the two disciplines like hardware emulation can.”

    With growing adoption, engineering teams are finding more uses for hardware emulation. “It’s not hard to imagine that 2015 could be the breakout year for the tool,” says Rizzatti.

    Formal Verification
    Another area that has seen growing acceptance in the past few years is Formal Verification. “Formal continues to be deployed in situations where preset applications work well,” says Kelf. “More advanced applications such as system security analysis and system-level verification testing will emerge.”

    Kelf also predicts that the most important development for formal in 2015 will be “a proliferation of the technology to designers where it may be used to speed up the early checking of design code before it is submitted into the verification regression environment.”

    Design and IP
    As systems get bigger, the amount of design and verification data is growing. “In 2015, design flows and tools will embrace the 3Vs of Big Data,” says Harnhua Ng, chief executive officer of Plunify Pte Ltd. “These are Volume, Velocity and Variety. This will require tools to be reworked for large server farms where hundreds of processors can work on a problem in parallel.”

    IP is clearly becoming a vital aspect of the design flow. “2015 will see an increasing focus on verification effectiveness with regard to IP reuse,”

    New markets are also placing tighter demands on IP. “The use of proven hardware IP that meets automotive safety requirements will accelerate time to market by reducing the development costs,”

    Reply
  38. Tomi Engdahl says:

    These aren’t the droids you’re looking for: The latest evolution in electronics hand gesturing
    http://www.edn.com/electronics-products/electronic-product-reviews/other/4438470/These-aren-t-the-droids-you-re-looking-for–The-latest-evolution-in-electronics-hand-gesturing-?_mc=NL_EDN_EDT_EDN_weekly_20150129&cid=NL_EDN_EDT_EDN_weekly_20150129&elq=61227c51cb2842198647f445f6d18682&elqCampaignId=21409

    Well, the advancement of hand gesturing in 2015 has not reached that mind-controlling level by the “Force”, but it has advanced to using an E-field force, a design unique to Microchip, with simplified user-interface options focused on gesture detection for such things as volume control, light dimming and page-turning in e-readers.

    I am a big fan of using natural human prompts as control for the Smart Home and other Internet of Things control. Voice and hand gestures just come naturally to control our amazing new electronic design advancements. Using a Smart Phone or tablet is OK, but nothing surpasses the good old human analog control tools like voice sound and hand/body movement in my humble opinion.

    Reply
  39. Tomi Engdahl says:

    New Generation of MOSFETs for Better Performance
    http://www.eeweb.com/company-blog/fairchild_semiconductor/new-generation-of-mosfets-for-better-performance

    This application note is both an introduction and overview of Fairchild’s new generation of Super-Junction MOSFETs, the SuperFET II and SuperFET II Easy Drive MOSFETs. The architecture of each is discussed, along with the advantages they bring to designs.

    Power MOSFET technology has developed toward higher cell density for lower on-resistance. The super-junction device, utilizing charge-balance theory, was introduced to the semiconductor industry ten years ago and set a new benchmark in the high-voltage power MOSFET market. Super-Junction (SJ) MOSFETs enable higher power-conversion efficiency. However, the extremely fast switching of super-junction MOSFETs creates unwanted side effects, such as voltage and current spikes and poor EMI performance. Given recent system trends, improving efficiency is a critical goal, and using a slow-switching device just for EMI is not an optimal solution.

    Fairchild recently added a SuperFET® II MOSFET family using the latest super-junction technology to its high-voltage power MOSFET portfolio. With this technology, Fairchild provides high performance in high-end AC-DC SMPS applications such as servers, telecom, computing, industrial power supplies, UPS/ESS, solar inverters, and lighting; as well as in consumer electronics that require high power density, system efficiency, and reliability. With the 600 V N-channel SuperFET® II MOSFET family, built on an advanced charge-balance technology, Fairchild helps designers achieve more efficient, high-performance solutions that consume less board space while improving EMI and reliability.

    Super-Junction MOSFETs based on charge-balance technology offer outstanding performance with respect to on-resistance and parasitic capacitance, which usually trade off against each other. With smaller parasitic capacitances, SJ MOSFETs have extremely fast switching characteristics and reduced switching losses. However, without dv/dt control, the drain-source voltage slew rate can reach up to 100 V/ns, which can lead to EMI problems and unstable operation related to stray parasitics in the devices or printed circuit board and to the non-linear parasitic capacitances of the SJ MOSFET in an SMPS.

    A critical control parameter in gate-drive design is the external series gate resistor (Rg). This dampens the peak drain-source voltage and prevents gate ringing caused by lead inductance and parasitic capacitances of the power MOSFET. It also slows down the rate of rise of the voltage (dv/dt) and current (di/dt) during turn-on and turn-off. Rg also affects the switching losses in MOSFETs. Controlling these losses is important as devices must achieve the highest efficiency in the target application.

    UniFET – Optimized Switch for DCM PFC
    http://www.eeweb.com/company-blog/fairchild_semiconductor/unifet-optimized-switch-for-dcm-pfc

    Reply
  40. Tomi Engdahl says:

    10 standards for effective tech standards
    http://www.edn.com/electronics-blogs/designcon-central-/4438516/10-standards-for-effective-tech-standards?_mc=NL_EDN_EDT_EDN_today_20150129&cid=NL_EDN_EDT_EDN_today_20150129&elq=9fe2efc4f23841079a3d28d6b095ed98&elqCampaignId=21403

    Bartleson’s 10 commandments for effective standards

    1. Cooperate on standards; compete on products: Bartleson calls this “the golden rule” of the standards process. She describes it as being mature enough to cooperate to create a standard, yet savvy enough to later use the standard in competing products.
    2. Use caution when mixing patents and standards: Patents and standards are contentious and powerful. Mix them with care.
    3. Know when to stop: There are things that should not be standardized.
    4. Be truly open: “‘Open’ can mean different things to different people,” Bartleson said. It does not necessarily mean free. But if a standard is truly a standard, participation in its process and access to its technologies should be open to all.
    5. Realize there is no neutral party: Recognize that “everyone who participates in a standard committee has a reason for being there,” she noted.
    6. Leverage existing organizations and proven processes: Modeling a new standard’s process off of the successful procedures of an existing group can save time and effort.
    7. Think relevance: “The biggest measure of a standard is its adoption,” Bartleson said.
    8. Recognize there is more than one way to create a standard: Different groups have different needs. Processes must be adjusted.
    9. Start with contributions, not from scratch: Do not create a situation that invites dominance by any single contributor building the standard from a sole perspective.
    10. Know that standards have technical and business aspects: This can be a big struggle, but don’t forget that there’s a business side to what happens in a standard process.

    At minimum, she said, “Respect the standards that are around you and the people who created them.”

    Reply
  41. Tomi Engdahl says:

    Configuration Tools Simplify MCU Setup
    http://www.eetimes.com/author.asp?section_id=36&doc_id=1325459&

    Graphical tools supporting developers in configuring complex MCU peripheral sets continue to evolve.

    One of the drawbacks to the increasing integration of microcontrollers (MCUs) has been an almost exponential increase in the complexity of configuring their peripherals. This has been further exacerbated by the advent of multipurpose peripherals and programmable IO assignments in pin-limited devices. Fortunately, an evolving string of configuration tools is helping simplify MCU configuration.

    In the beginning, there was only the data sheet. Developers had to study the descriptions of registers and peripheral functions in order to determine which bits to set where to configure things to their liking. But as product manuals began pushing toward hundreds of pages in length, the learning curve for figuring out how to use an MCU’s peripherals became so steep that developers became reluctant to change MCU families from one project to another. Switching simply took too much time and results were too error-prone for hard-pressed development teams to risk using a new (to them) MCU.

    MCU vendors, in an effort to grow market share by easing such barriers to new users, began offering libraries containing configuration code.

    Vendors then hit upon the idea of a graphical configuration tool that developers could use to specify the desired use and behavior of all an MCU’s peripherals through drop-down menus. The tool then generated a single documented code package developers could import into their development tool. One of the first such tools I know of is part of the DAvE development tool from Infineon, which is now in its third generation. The Texas Instruments Grace platform appeared in 2011 for the MSP430 series MCUs.

    Most other vendors now offer graphical configuration tools for their MCUs. Available products include:

    Freescale’s Processor Expert
    Microchip’s Code Configurator plug-in for MPLAB
    Simplicity Studio Configurator as well as AppBuilder from Silicon Labs
    STM32Cube from STMicroelectronics

    There is also an open-source graphical configuration tool — CoSmart — from CooCox, for ARM Cortex-M processors.

    Despite the long history of the concept, however, these graphical tools are still relatively new to most developers. Many have only been introduced in the last year or two.

    Reply
  42. Tomi Engdahl says:

    Backplanes Hit a Wall at 56G
    Talk shows need for better channels, chips
    http://www.eetimes.com/document.asp?doc_id=1325457&

    Something has to change to enable the next generation of fast computer and communications boards, but just what it will be is the subject of hot debate.

    The problem hits at the 56 Gbit/second speeds needed to drive systems that cost-effectively deliver 400 Gbit/second Ethernet. Big data centers and carrier networks are hungry for the fast interfaces now in development. But engineers are finding they just can’t drive signals at 56G across backplanes with two connectors and traces typically up to 40 inches long.

    “That’s out of the picture for now, something has to change,”

    Beyene showed results of research sending 56G signals between chips and modules, and even between boards.

    As the lengths reached 20 inches, Rambus researchers had to either adopt PAM-4 signaling or five-tap DFE equalization, both requiring more complex and expensive chips.

    Companies are understandably shy about shifting materials. The 28 Gbit/s boards made for today’s systems were some of the first to use Megtron-6. The prior 10 Gbit/s generation was among the first to move off mainstream FR4 boards, adopting Nelco 4000.

    “My sense is we may need new materials again, but many people don’t see it that way,”

    Another engineer suggested assumptions about crosstalk are a wild card in the debate over NRZ and PAM-4 signaling. NRZ has been the work horse signaling technology for many years. Experts have been saying for at least a year that NRZ will not be able to stretch to use in 56G backplanes.

    Reply
  43. Tomi Engdahl says:

    Darker Silicon
    http://semiengineering.com/darker-silicon/

    MRAMs offer nonvolatile cache to address the dark silicon dilemma. What happened to Dennard’s Law?

    For the last several decades, integrated circuit manufacturers have focused their efforts on Moore’s Law, increasing transistor density at constant cost. For much of that time, Dennard’s Law also held: As the dimensions of a device go down, so does power consumption. Smaller transistors ran faster, used less power, and cost less.

    As most readers already know, however, there was a limit. Smaller devices with thinner dielectrics and shorter channels are more prone to leakage. Indeed, leakage, negligible for much of the industry’s history and ignored in Dennard’s original paper, now approaches the same order of magnitude as the circuit’s dynamic power. Advances such as the introduction of high dielectric constant gate dielectric materials helped, but leakage-limited transistor structures are now a fact of life. Switching a transistor at a lower threshold voltage requires a thinner gate dielectric, but leakage constraints place a lower bound on dielectric thickness. As a result, while feature sizes have continued to shrink, threshold voltage has not.

    This failure of Dennard scaling has introduced the era of what designers call “dark silicon.” If the number of transistors doubles, but the power budget for the circuit as a whole stays the same — or goes down, thanks to the proliferation of mobile devices — then the available power for each transistor is cut in half. If threshold voltage stays the same, then the number of transistors that can operate at one time is also cut in half. These non-operational transistors are dark silicon, measured as a fraction of the chip’s total area.

    Calculating the power consumption of a generic chip is difficult. It depends on a wide range of factors, from dielectric thickness and process variation to the workload of different parts of the chip.

    Projections estimate the dark silicon fraction will be about one-third of total area at the 20nm technology node (including 16/14nm finFETs), increasing to as much as 80% by the 5nm node. Real products are likely to achieve better results, but power consumption clearly imposes an increasingly severe design constraint.

    At that point manufacturers may be tempted to ask why they are putting so much effort into making smaller transistors if designers aren’t planning to use them. Part of the answer is that “dark” silicon is not “useless” or “wasted” silicon. In any design, many circuit paths will be “dark” at any given moment. Some elements, such as specialized logic and cache memory, are particularly “dark-silicon friendly,” in that they contribute to overall IC performance while consuming power only in special situations.

    Reply
  44. Tomi Engdahl says:

    Chinese Walls and Back Doors
    http://www.eetimes.com/author.asp?section_id=36&doc_id=1325473&

    Qualcomm and U.S. industry are the losers as China’s antitrust regulators help build a new wall around China’s semiconductor industry.

    The need to build Chinese walls in the semiconductor industry is taking on a completely new significance as China’s antitrust regulators start to flex their muscles.

    The implications affect global companies that aim to keep their foothold in China’s semiconductor market, which became the world’s largest by 2012, accounting for 52 percent of total demand, and continues to lead industry growth.

    The key chipmakers impacted by China’s antitrust initiatives include Qualcomm of the US and MediaTek of Taiwan as well as emerging Chinese companies such as Semiconductor Manufacturing International Corp. (SMIC) and Spreadtrum Communications. The Chinese government, which has aimed to make semiconductors a pillar industry for years, is using antitrust issues to create a level playing field for domestic companies.

    The Wall Street Journal on January 27 reported that China’s central government has asked Spreadtrum to custom design “safe phone” processors for officials’ smartphones that in one to two years may replace chips from U.S. suppliers which Beijing suspects may contain back doors to aid foreign spying. To be sure, China may be justified in protecting itself from national security risks, following revelations about U.S. surveillance activities by former National Security Agency contractor Edward Snowden.

    China’s antitrust probe of Qualcomm may already be taking a toll on the world’s largest maker of smartphone chips, which today said it has cut expectations for sales and profit this year after losing semiconductor orders and facing stronger competition in China.

    Reply
  45. Tomi Engdahl says:

    What Drove CES 2015 Innovation? IP and IP Subsystems
    http://www.eetimes.com/author.asp?section_id=36&doc_id=1325454&

    How do we manage all those blocks in an age of exploding block usage?

    If you want to see what electronic design innovation is all about these days, come to the Consumer Electronics Show.

    The array of technology development showcased here in the first week of 2015 was breathtaking. The Sands was packed with almost countless wearables vendors, IoT systems houses, and 3-D printers.

    But for these guys — from a market standpoint — there’s a shakeout ahead: There are too many vendors in the wearables and IOT space making too-similar products.

    Rise of IP subsystems
    What’s enabling these systems innovations is of course IP. You’ve no doubt seen the slideware showing that the number of IP blocks in an average SoC has crested 100 and is moving quickly north. That’s 10 times the number of blocks we were designing with just a few years ago.

    This explosion in block usage is creating its own design complexity (how do we manage all those blocks?).

    “Instead of dealing with SoC design at the lowest common denominator — the discrete IP block, SoC designers now look to move up a layer of abstraction to design with system level functionality to reduce the effort and cost associated with complex SoC designs today.”

    “the start of a period in which large SIP providers will exert a concerted effort to create IP subsystems, combining many discrete IP blocks into larger, more converged IP products to offer better performance and to reduce the cost of IP integration into complex SoCs.”

    Semico forecasts the IP subsystem market will more than triple, from $108 million in 2012 to nearly $350 million in annual sales by 2017. IP providers clearly understand that delivering IP is just one piece of the puzzle and that to enable system development there needs to be a subsystems push as well.

    This trend — and the systems it enables — is going to drive much more rapid innovation in the months and years ahead.

    Reply
  46. Tomi Engdahl says:

    ChipEstimate.com Chip Planning Portal
    http://www.chipestimate.com/

    The ChipEstimate.com chip planning portal is an ecosystem comprised of over 200 of the world’s largest semiconductor design and verification IP suppliers and foundries. These companies all share in the common vision of helping the worldwide electronics design community achieve greater profitability and success. To date, a diverse global audience of over 27,000 users has joined the ChipEstimate.com community and has collectively performed over 100,000 chip estimations. ChipEstimate.com is a property of Cadence Design Systems, Inc. (NASDAQ: CDNS), the leader in global electronic-design innovation.

    Reply
  47. Tomi Engdahl says:

    Shrinking the server carbon footprint

    The European Commission wants to define the carbon footprint of data centers. A pilot program for this purpose was launched back in 2013, with the aim of harmonizing legislation across the Union.

    A product’s carbon footprint is referred to as its PEF (Product Environmental Footprint). The Commission determines the PEF category rules, and leading manufacturers produce group-specific, life-cycle-based guidelines for design and manufacturing. The goal is a more detailed and complete understanding of the environmental impact of products than the current calculation method, which is based on operating efficiency alone.

    Life-cycle assessment takes into account not only the energy consumed during operation, but also the energy and resources consumed in manufacturing, installing, dismantling and recycling the product. Life-cycle analysis gives both manufacturers and users a more accurate picture of the environmental impact of the entire system.

    The result is a holistic picture to guide decisions affecting green data centers and the selection of the IT equipment installed in them. For example, a product that is energy-efficient in operation but unreliable, or based on a less sustainable design, may prevent the development of a “greener” product line.

    This highlights the need for manufacturers and system integrators to minimize the impact of all factors, in order to ensure a balanced approach to environmental effects. Power supply architecture plays a key role in striking the right balance.

    Growing integration in servers has made it possible to pack multiple processor cores and their supporting logic into a single system-on-chip (SoC), several of which can be found on each board.

    Only two decades ago, 150 watts was a realistic maximum for a brick-class power supply. Today, quarter-brick converters occupying only 21 square centimeters of printed circuit board space can deliver up to 864 watts, and soon even a kilowatt.

    The thermal compatibility of such high-density power supplies with their environment is one of the key design factors.

    Since conduction and forced-air cooling are the most important cooling methods in servers, airflow planning is an important part of the design. Open-frame power supplies have become popular because their structure improves airflow efficiency, and they also use less metal in frames and enclosures. Their actual performance depends on operating conditions.

    Open-frame designs are much more sensitive to the direction of the airflow than closed designs.

    To support the high currents and reliability requirements of multi-core servers, power supplies must often be used in parallel N+1 configurations. Regulation is the key to parallel architectures.

    To ensure correct operation of the circuits they feed, future power supplies must hold their voltage to very tight tolerances, often less than ±30 millivolts.

    Digital control is a more flexible and more efficient approach. It can also reduce material usage by allowing cheaper passive components.

    By building these system-level requirements into advanced data center IT systems, designers can meet future stringent legal requirements based on products’ life-cycle environmental impacts.

    Source: http://www.etn.fi/index.php?option=com_content&view=article&id=2355:palvelimen-hiilijalanjalki-pienemmaksi&catid=26&Itemid=140

    Reply
  48. Tomi Engdahl says:

    Qualcomm Outlook Exposes 5 Trouble Spots
    http://www.eetimes.com/document.asp?doc_id=1325486&

    As the world’s dominant mobile chip supplier, Qualcomm knows exactly how many mobile chips have been shipped to handset vendors everywhere.

    1. Qualcomm continues to have trouble getting licensing agreements in place and collecting royalties in China.
    2. A shift in the balance of power between Apple and Samsung has emerged at the premium tier market, with Apple gaining share at the expense of Samsung.
    3. The product mix in the mobile chip market has changed, with Qualcomm selling a growing number of thin modems compared with higher-value apps processors with built-in modems.
    4. Qualcomm’s Snapdragon 810 apps processor won’t be included in Samsung’s newest smartphone.
    5. Qualcomm faces more competition in China.

    Reply
