Electronics trends for 2013

The electronics industry will hopefully start to glow again after a not-so-good 2012. It’s safe to say that 2012 was a wild ride for all of us. The global semiconductor industry demonstrated impressive resilience in 2012, despite operating in a challenging global macroeconomic environment. Many have already ratcheted back their expectations for 2013. Beyond 2012, the industry is expected to grow steadily and moderately across all regions, according to the WSTS forecast. So we should see moderate growth in 2013 and 2014. I hope this happens.

The non-volatile memory market is growing rapidly. The Underlying technologies for non-volatile memories article tells that non-volatile memory applications can be divided into standalone and embedded system solutions. Standalone applications, which tend to be driven primarily by cost, are dominated by NAND flash technology. The embedded market relies mainly on NOR flash for critical applications and NAND for less critical data storage. Planar CT NAND and 3D NAND could become commercially viable this year or within a few years. MRAM, PCRAM, and RRAM will need more time and new material innovation to become major technologies.

Multicore CPU architectures are a little like hybrid vehicles: Once seen as anomalies, both are now encountered on a regular basis and are widely accepted as possible solutions to challenging problems. Multi-core architectures will find their application but likely won’t force the extinction of single-core MCUs anytime soon. Within the embedded community, a few applications now seem to be almost exclusively multicore, but in many others multicore remains rare. There are concerns over the complexity and uncertainty about the benefits.

FPGAs as the vanishing foundation article tells that we are entering a new environment in which the FPGA has faded into the wallpaper – not because it is obsolete, but because it is both necessary and ubiquitous. After displacing most functions of ASICs, DSPs, and a few varieties of microcontrollers, it’s fair to ask if there is any realm of electronic products where use of the FPGA is not automatically assumed. Chances are, in the next few years, the very term “FPGA” might be replaced by “that ARM-based system on a chip” from Xilinx, Altera, Lattice, or another vendor.

Software and services have become the soul of consumer technology. Hardware has become increasingly commoditized into blank vessels that do little more than hold Facebook and Twitter and the App Store and Android and iOS.

Are products owned when bought? The trend in recent decades has been an increase in the dependence of the buyer on the seller.

More than 5 billion wireless connectivity chips will ship in 2013, according to market research firm ABI Research. This category includes standalone chips for Bluetooth, Wi-Fi, satellite positioning, near-field communications and ZigBee, as well as so-called “combo” chips that combine multiple standards. Broadcom is seen retaining its lead in connectivity chips. Bluetooth Smart, WiGig and NFC are all seeing increased adoption in fitness, automotive and retail applications. Combo chips are also a growing opportunity based on the popularity of smart phones, tablet computers and smart televisions.

Signal integrity issues are on the rise as both design complexity and speed increase all the time. The analog world is moving faster than ever. Learning curves are sharper, design cycles are shorter, and systems are more complex. Add to all this the multidisciplinary, analog/digital nature of today’s designs, and your job just gets more complicated.

High-speed I/O: On the road to disintegration? article tells that increases in data rates, driven by a need for higher bandwidth (10Gbps, 40Gbps, 100Gbps networking), mean that the demands on system-level and chip-to-chip interconnects are increasingly challenging design and manufacturing capabilities. For current and future high-performance designs, high-speed serial interfaces featuring equalization could well be the norm, and high levels of SoC integration may no longer be the best solution.


For a long time, the Consumer Electronics Show, which began in 1967, was the Super Bowl of new technology, but now the consumer electronics show as a concept is changing and maybe fading out in some way. The social web has replaced the trade show as a platform for showcasing and distributing products, concepts and ideas.

NFC, or near-field communications, has been around for 10 years, battling its own version of the chicken-and-egg question: Which comes first, the enabled devices or the applications? Near-field communications to go far in 2013 article expects that this is the year for NFC. NFC is going to go down many different paths, not just mobile wallet.

3-D printing was hot last year and is still hot. We will be seeing much more on this technology in 2013.

Inexpensive tablets and e-readers will find their users. Sub-$100 tablets and e-readers will offer more alternatives to pricey iPads and Kindles. Higher-performance tablets in the sub-$200 group are also selling well.

User interfaces will evolve. Capacitive sensing is integrating multiple interfaces, human-machine interfaces are entering the third dimension, and ubiquitous sensors are meeting the most natural interface: speech.

The growth of electronic systems in the automotive industry is accelerating at a furious pace. The automotive industry in the United States is steadily recovering, and nowadays electronics run pretty much everything in a vehicle. Automotive electronics system trends also impact test and measurement companies. Of course, with new technologies come new challenges: faster transport buses, more wireless applications, higher switching power, and the sheer amount and density of electronics in modern vehicles.

Next Round: GaN versus Si article tells that wide-band gap (WBG) power devices have shown up as Gallium Nitride (GaN) and Silicon Carbide (SiC). These devices provide low RDS(ON) with higher breakdown voltage.

Energy harvesting was talked about quite a lot in 2012, and I expect that it will find more and more applications this year. Four main ambient energy sources are present in our environment: mechanical energy (vibrations, deformations), thermal energy (temperature gradients or variations), radiant energy (sun, infrared, RF) and chemical energy (chemistry, biochemistry). Peel-and-stick solar cells are coming.

Wireless charging of mobile devices is gaining some popularity. The Qi wireless charging technology is becoming the industry standard, as Nokia, HTC and some other companies use it. There is a competing A4WP wireless charging standard pushed by Samsung and Qualcomm.


In recent years, ‘Low-carbon Green Growth’ has emerged as a very important issue in selling new products. LED lighting industry analysis and market forecast article tells that ‘Low-carbon Green Growth’ is a global trend, and that LED lighting is becoming the most important axis of the ‘Low-carbon Green Growth’ industry. Expectations for industry productivity and job creation are very high.

A record number of dangerous electrical products have been pulled from the market under the supervision of the Finnish Safety and Chemicals Agency. Poor design has been found in many products, especially in LED light bulbs. Almost 260 items were taken off the market, and very many of them were LED lights. Manufacturers rushed into the new technology with high enthusiasm and forgot basic electrical engineering. A CE marking is not in itself a guarantee that a product is safe.

The “higher density,” “higher dynamic” trend is also challenging traditional power distribution technologies within systems, and some new concepts are being explored today. The AC vs. DC power in data centers discussion is going strong. Redundant power supplies are required in many demanding applications.

According to IHS, global advanced meter shipments are expected to remain stable from 2012 through 2014. Smart electricity meters seen doubling by 2016 (to about 35 percent penetration). In the long term, IHS said it anticipates that the global smart meter market will depend on developing economies such as China, Brazil and India. What’s next after smart power meter? How about some power backup for the home?

Energy is going digital article claims that graphical system design changes how we manipulate, move, and store energy. What defines the transition from analog to digital and how can we tell when energy has made the jump? First, the digital control of energy, in the form of electricity, requires smart sensors. Second, digital energy systems must be networked and field reconfigurable to send data that makes continuous improvements and bug fixes possible. Third, the system must be modeled and simulated with high accuracy and speed. When an analog technology goes digital, it becomes an information technology — a software problem. The digital energy revolution is enabled by powerful software tools.

The cloud is talked about a lot, both as a design tool and as a service that connected devices connect to. The cloud means many things to many people, but irrespective of how you define it, there are opportunities for engineers to innovate. EDA companies put their hope in accelerating embedded design with cloud-enabled development platforms. They say that The Future of Design is Cloudy. M2M companies are competing to develop solutions for easily connecting embedded devices to the cloud.

Trend articles worth checking out:
13 Things That Went Obsolete In 2012
Five Technologies to Watch in 2013
Hot technologies: Looking ahead to 2013
Technology predictions for 2013
Prediction for 2013 – Technology
Slideshow: Top Technologies of 2013
10 hot consumer trends for 2013

Popular designer articles from last year that could give hints about what to expect:
Top 10 Communications Design Articles of 2012
Top 10 smart energy articles of 2012
Slideshow: The Top 10 Industrial Control Articles of 2012
Looking at Developer’s Activities – a 2012 Retrospective

626 Comments

  1. Tomi Engdahl says:

    Linear Technology has introduced a new µModule step-down regulator with the addition of a digital serial interface. It allows system designers to change the power supply parameters on the fly.

    The LTM4676 regulator can supply either two 13 amp outputs or a single 26 amp output. Linear intends it primarily for optical network devices, data centers and telecommunication routers, industrial test systems and other applications where the total cost of electricity, cooling and maintenance is significant.

    In addition to supplying power, the regulator can monitor the power and manage power supply parameters over the PMBus bus.

    Source: http://www.etn.fi/index.php?option=com_content&view=article&id=409:saada-teholahdetta-lennossa&catid=13&Itemid=101

    Reply
  2. Tomi Engdahl says:

    A sensor monitors concussions

    The hockey season in Finland began with last season’s debate about head injuries occurring on the ice. A similar debate is taking place in the United States over American football. Now the latest technology is being introduced for monitoring head injuries.

    A startup named Brain Sentry has introduced the Brain Sentry Impact Sensor – a device that can be attached to a helmet. At its heart are STMicroelectronics’ MEMS-based accelerometers, which provide accurate information about impacts to players’ helmets.

    ST’s sensor measures acceleration in all directions. In principle, football players can be monitored after each hit.

    Source: http://www.etn.fi/index.php?option=com_content&view=article&id=405:anturi-monitoroi-aivotarahdyksia&catid=13&Itemid=101

    Reply
  3. Tomi Engdahl says:

    RADIATION SNATCHED from leaky microwave ovens to power gadgets
    http://www.theregister.co.uk/2013/09/24/boffins_harvest_microwave_energy_outside_the_door/

    Microwaves pump out energy in the 2.4GHz band: the industrial, scientific and medical (ISM) radio space popularised by Wi-Fi and Bluetooth. The casing of the oven contains almost all the energy used to heat the “meal for one”, but some escapes to interfere with nearby wireless networks and that’s the energy scooped up by the boffins and their new circuit.

    The electronics in their microwave “rectenna” consists of an antenna, a diode and a capacitor, we’re told. The incoming waves induce a tiny voltage across the antenna’s terminals; the other components rectify and step up the voltage to a mighty 1.8V, enough for most gadgets.

    Domestic microwave ovens are governed by regulations that restrict their leakage to five milliwatts per square centimetre, at a distance of five centimetres

    The researchers discovered leakage was well below that across the brands they tested, but were able to harvest a good proportion of that power to run kitchen appliances including a thermometer and countdown timer.
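    The leakage limit above allows a quick sanity check on how much power could be available at best. This is a rough sketch of my own, not from the article: the antenna aperture and conversion efficiency are assumed numbers, and since measured leakage was well below the limit, real harvested power would be far smaller.

```python
# Back-of-envelope upper bound on harvestable microwave-oven leakage,
# using the regulatory limit quoted in the article (5 mW/cm^2 at 5 cm).
# The aperture and efficiency figures below are illustrative assumptions.

LEAKAGE_LIMIT_MW_PER_CM2 = 5.0   # regulatory leakage limit at 5 cm
APERTURE_CM2 = 10.0              # assumed effective antenna aperture
EFFICIENCY = 0.5                 # assumed RF-to-DC conversion efficiency

incident_mw = LEAKAGE_LIMIT_MW_PER_CM2 * APERTURE_CM2
harvested_mw = incident_mw * EFFICIENCY
print(f"Worst-case incident power: {incident_mw:.0f} mW")
print(f"Harvested power at 50% efficiency: {harvested_mw:.0f} mW")
```

    Tens of milliwatts is plenty for a kitchen timer or thermometer, which fits the researchers’ demo.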

    Reply
  4. Tomi Engdahl says:

    UAE: Japanese tech giant grows strawberries in Dubai
    http://www.bbc.co.uk/news/blogs-news-from-elsewhere-24205029

    Ailing electronics giant Sharp hopes to turn around its fortunes by growing strawberries indoors in Dubai, it seems.

    Once a global household name, the former sponsor of Manchester United is losing money so fast that commentators suggest it won’t survive another year without a huge cash injection. But its directors plan to cash in on Middle Eastern consumers’ penchant for the sweet strawberries from its native Japan, it seems. The firm’s using technology to allow the fruit to be grown from seedlings at a facility in the United Arab Emirates by artificially controlling light, temperature and humidity – thus avoiding the problems of berries spoiling during shipping – reports Tokyo’s Asahi Shimbun.

    It apparently uses LED lights to facilitate photosynthesis.

    Reply
  5. Tomi Engdahl says:

    LIB3 plans to bring contract manufacturing to the masses
    http://hackaday.com/2013/09/24/lib3-plans-to-bring-contract-manufacturing-to-the-masses/

    LIB3 is an open source hardware start-up from upstate New York. Thus far, the team has made some interesting products such as the piLED kit. However, they have big dreams for the future. LIB3 plans to become a contract assembly house specifically targeting low volume makers. To do this they have to build their own tools. LIB3’s latest project is a solder paste dispenser for surface mount components. Traditionally solder paste is applied with stencils made of stainless steel. In more recent years laser cut kapton has become a favorite for low volume production.

    Both of these systems require a stencil to be made up. LIB3 took a different approach, and modified an old CNC glue dispenser for paste.

    For their Maker Faire demo LIB3 replaced the paste with a standard felt tipped pen. Any mouse motion on the attached PC was directly translated into pen motion. The modified machine is extremely accurate – 0603 resistor pads will be no problem. The machine is also incredibly fast.

    Reply
  6. Tomi Engdahl says:

    Slideshow: Wearable Tech on the Rise
    http://www.designnews.com/author.asp?section_id=1386&doc_id=267969&cid=nl.dn14

    The trend of wearable tech is in full swing, and it doesn’t look to be slowing down any time soon. Google Glass, smart watches, and fitness trackers are probably the first things that come to mind, but there are many other products in the works that are just as, if not more, exciting.

    Reply
  7. Tomi Engdahl says:

    Red Pitaya Will Be a Boon for Budding Engineers
    http://www.designnews.com/author.asp?section_id=1386&doc_id=268005&cid=nl.dn14

    Almost every engineer wants to have their own laboratory. In fact, most people have dedicated places for their projects, engineer or not. Whatever the case may be, that place needs to be equipped to handle some serious work. For the engineers out there, this can mean housing some expensive equipment.

    I would love to have a function generator or oscilloscope to help me with projects. However, the price of these devices is not very friendly for budding engineers. This is about to change with the help of an open-source device funded through Kickstarter.

    Unlike many open-source boards we see on Kickstarter and the Internet, the Red Pitaya is not just meant to be programmed. It is meant to function as an all-purpose electrical test and measurement device. That means an included oscilloscope, waveform generator, spectrum analyzer, and PID controller. The best part is not a single line of code needs to be written out of the box for it to function.

    For a quick overview of the board, users will find two analog inputs and two analog outputs. This is where all the generation and measurements will be taking place. The analog input/outputs are going to run at 125 megasamples per second with a 14-bit resolution. In addition, the board is coming equipped with a dual core ARM Cortex processor plus a Xilinx Zynq FPGA. As a result, 16 GPIO pins can be used with the FPGA and four pairs of differential pins for serial data transfer and synchronization.

    Other connectivity options include 100 Mb Ethernet (which should be getting upgraded to 1 Gb due to the amount of funding), a USB port, JTAG connections, and the common communication protocols (SPI, I2C, and UART).
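    Those specifications allow a quick sanity check of the connectivity choice. A rough calculation of my own (not from the article) shows why even the upgraded gigabit Ethernet cannot stream the raw ADC output, so the onboard FPGA has to decimate, trigger on, or process the data first:

```python
# Raw data rate of the Red Pitaya ADC front end, using the specs above:
# 2 analog inputs x 125 megasamples/s x 14 bits per sample.

channels = 2
sample_rate = 125e6        # samples per second per channel
bits_per_sample = 14

raw_bits_per_s = channels * sample_rate * bits_per_sample
print(f"Raw ADC data rate: {raw_bits_per_s / 1e9:.2f} Gbit/s")

# Even 1 Gbit/s Ethernet carries under a third of this, which is why
# oscilloscope-style triggering and FPGA-side processing are essential.
```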

    Now that we are familiar with the hardware, there is also software needed before users can begin generating and measuring signals. The software is going to be available through a marketplace on the Internet referred to as the “Bazaar.” This is where one will go to find the applications to make this board function. They are free of charge and are capable of being accessed through any Internet connection on a computer or tablet.

    Next, there is the “Backyard.” This is going to be an open-source repository where code and tools can easily be accessed for further development or use. Programmers can develop code in HDL, C/C++, scripting languages such as Perl or Python, MATLAB, or HTML Web applications.

    The Kickstarter campaign has been successfully funded. Their goal was enormously surpassed, reaching $240,722 against the $50,000 goal.

    For a price of only $359, this is going to be an instrument that will sooner or later find its way onto many engineers’ workbenches.

    Reply
  8. Tomi Engdahl says:

    Boeing Turning Old F-16s Into Unmanned Drones
    http://tech.slashdot.org/story/13/09/24/2126230/boeing-turning-old-f-16s-into-unmanned-drones

    “Boeing has revealed that it has retrofitted retired fighter jets to turn them into drones. It said that one of the Lockheed Martin F-16s made a first flight with an empty cockpit last week.”

    Reply
  9. Tomi Engdahl says:

    Applied, Tokyo Electron Agree to $29 Billion Merger
    http://www.eetimes.com/document.asp?doc_id=1319589&

    Applied Materials Inc., the world’s largest chipmaking equipment supplier, and Tokyo Electron Ltd., ranked No. 3, have agreed to a merger that values the combined entity at about US$29 billion (about 2.8 trillion yen).

    The deal, which would bring together the leading US and Japanese vendors of chipmaking equipment, will push Europe’s ASML Lithography NV, based in Bilthoven, Netherlands, which has specialized in the scanners used for patterning ICs using optical lithography, back into a distant second place.

    Applied and Tokyo said in a joint press release that the new company would be formed as a merger of equals

    No indication was given as to what the merged company will be called. The merger has been unanimously agreed to by the boards of directors of both companies but is subject to approval of both sets of shareholders and to regulatory approval.

    The deal will produce a clear No. 1 company in the sale of a range of equipment for precision engineering and patterning for integrated circuits and displays, with about 25 percent market share.

    Reply
  10. Tomi Engdahl says:

    Intel says Internet of Things is the next IT game changer
    Quark chip means more than smart watches and wearable technology
    http://www.theinquirer.net/inquirer/news/2296940/intel-says-internet-of-things-is-the-next-it-game-changer

    SAN FRANCISCO: CHIPMAKER Intel expects the Internet of Things to be the next game changer for the IT industry, allowing firms to sift through huge quantities of data via technology such as the chip giant’s upcoming Quark low-power processor.

    Doug Fisher, corporate VP and GM of Intel’s Software and Services Group, pictured holding the tiny Quark chip, said that the firm expects to see huge enterprise demand for its Quark processor, although he declined to name specific companies being targeted as customers.

    “We’re not doing this out of our own joy,” he said during a media session at the Oracle Openworld show in San Francisco on Monday. “This will be the biggest inflection point for IT for a number of years.”

    Fisher said that while technology advancements such as virtualisation and connected computing via networks were important, they were not mission changing, whereas the Internet of Things is a sea change for IT.

    Intel unveiled its Quark ultra-small system on chip (SoC) family for wearable technology – which is said to be one fifth of the size of the firm’s current Atom processor and uses a tenth of the power – at its IDF show earlier in September.

    The wearable technology aspect – while it has garnered most of the headlines helped by launches such as the Samsung Galaxy Gear smartwatch – forms part of Intel’s plans, but is not the crux of the Quark strategy, said Fisher.

    “We’ll innovate around wearable tech. But I’m not worried about hitting every aspect of that, Quark is designed for wearable tech and sensors, for minute types of devices,” Fisher said.

    Instead, Intel is much more excited by how the broader internet will be altered by the dominance of sensors, which Fisher said would reach five billion by 2020.

    Reply
  11. Tomi Engdahl says:

    Autotestcon Day 2: PXI on the rise
    http://www.edn.com/electronics-blogs/test-cafe/4421527/Autotestcon-Day-2–PXI-on-the-rise

    In my first report from Autotestcon, I focused on instrument vendors who offer products based on large format modular standards like VXI or the emerging AXIe. Many of these also offer PXI, arguably the largest volume of all the standards. Today I’m focusing on vendors who offer PXI as their primary, or only, modular standard.

    Reply
  12. Tomi Engdahl says:

    Chip Embedded Instruments: Test and Measurement goes Embedded
    http://techonline.com/electrical-engineers/education-training/tech-papers/4421413/Chip-Embedded-Instruments-Test-and-Measurement-goes-Embedded?elq=a94a887bda21490a92f4a78a59504058&elqCampaignId=1370

    The transition towards Embedded System Access has initiated a real paradigm shift regarding validation, test, programming, and debugging of complex electronics units. Within this context, more and more instruments are directly implemented in silicon or uploaded into FPGAs as soft macros to be able to see exactly what silicon sees. New standards drive standardized access to these instruments and facilitate their use throughout the entire product life cycle. In perspective, FPGA-embedded instruments in particular promise enormous applicability. However, test systems must be capable of efficiently putting these possibilities into practice.

    Reply
  13. Tomi Engdahl says:

    Video: Shoe Inserts Harvest Energy to Power Devices
    http://www.designnews.com/author.asp?section_id=1386&doc_id=267997&cid=nl.dn14&dfpPParams=ind_184,industry_alt,industry_consumer,aid_267997&dfpLayout=blog

    Remember how startup Pavegen has designed floor tiles to harness the power of footsteps to keep the lights on at the 2012 Summer Olympics and generate energy at the Paris marathon?

    Now a company called SolePower wants to give individuals this same power with a waterproof insole that can be swapped between different pairs of shoes, allowing people to store energy from their own footsteps through patent-pending energy-harvesting technology.

    “The device is designed to capture the force that is created when your heel impacts the ground,” David Davitian, a marketing intern at SolePower, told Design News in an email. “The device uses that force to spin a small generator which produces electricity. That electricity is wired out of the insert into a battery that is inside the external battery pack, which is attached to your shoe or to your ankle via the ankle strap.”

    SolePower ultimately aims to solve the universal problem of charging devices like mobile phones, GPS, cameras, and the like where there is no or limited access to power

    SolePower is initially targeting outdoor enthusiasts

    “The long term goal is to make this technology useful for everyone. This means to be able to generate significant amounts of power with routine daily walking, such as walking to class, to the store, or exploring a local park. So our long term goal is allowing everyone to be able to create their own portable energy seamlessly.”

    Reply
  14. Tomi says:

    Tomorrow’s Connected World Ushers in a Resurgence of Analog Technology
    http://www.electronicproducts.com/Analog_Mixed_Signal_ICs/Power_Management/Tomorrow_s_Connected_World_Ushers_in_a_Resurgence_of_Analog_Technology.aspx

    Integrated analog grows in parallel to digital technology

    As the era of embedded intelligence spreads before us, analog integration returns to vogue. No longer merely a specialty of bearded men, analog integration pushes the boundaries of both infrastructure and industry to maintain pace with tomorrow’s always-connected, on-demand world.

    What is analog integration?
    Fueled by the advent of wireless communication making its way into the bowels of absolutely everything like some transient digital ooze, computational processing has spread throughout the physical network, embedding intelligence in previously untouched locations. The role of analog integration has shifted from merely including analog functions on digital chips, to data farming on an exponential level. Sound, light, vibration, and voltage are all parameters needed to keep the modern robust network connected. The world of computer processing is interfacing with the physical world more than ever before, challenging information networks to devise solutions rooted in analog systems, thereby creating the information-gathering tools needed to stay connected.

    A symbiotic relationship with digital
    The increasing complexity of information networks and consumer electronics signifies an increase in the networks’ need to interface with the physical world, advocating the symbiotic link between the digital and analog modes of expression.

    The automotive vehicle market is shifting focus to include a wider range of electric vehicles. It’s estimated that there will soon be 10 million electric vehicles on the roads, forcing manufacturers to compete on innovation and affordability.

    With the worldwide population aging, the estimated number of people over the age of 65 is slated to triple by 2050 according to the American Diabetes Association. Keeping healthcare costs low and chronic diseases in check requires innovation in the ultrasound market.

    Getting people out of hospitals will directly cut into the six-trillion-dollar yearly expense associated with hospital care. Advanced analog integration will reduce the size and cost of cutting-edge medical equipment, giving patients access to tools normally not available outside the hospital, and shift healthcare from a reactive industry toward a preventative one. Consider that the three-lead ECG for monitoring patient vitals was once the size of a refrigerator, but advances in analog integration have compacted it to the point where it can be built right into clothing.

    This trend signifies a growing interconnectivity between medical devices, giving medical practitioners greater means to precisely monitor individual patient health. So much data streaming into the cloud must be securely protected.

    Smart factories, and the automated process control systems associated with them, are quickly becoming the best practices in worldwide manufacturing. Manufacturing is transitioning from centralized system control to having the actual processing directly embedded in the machine itself, thereby optimizing the assembly line and reducing bottlenecks.

    Reply
  15. Tomi Engdahl says:

    Long live coin cells: Getting years of battery life from small batteries in low power devices
    http://www.edn.com/design/power-management/4421334/Long-Live-Coin-Cells–Getting-years-of-battery-life-from-small-batteries-in-low-power-devices

    Imagine a central alarm control system, with tiny wireless alarm sensors sprinkled throughout a home or building reporting open doors and windows, glass vibration, smoke, CO2, etc. Or an HVAC monitoring system, with the sensors sending temperature, humidity, airflow information, with a central processing unit utilizing the information to optimize energy usage. And these are just mainstream applications- how about spooky spy applications like buried perimeter sensors monitoring vibrations caused by trespassers? Or sensors embedded in roadways detecting traffic, noise, or embedded in bridges reporting structural stress?

    A large class of the sensors in these systems are “quiet” much of the time, measuring slow processes on a very low duty cycle or exceptional (interrupt-driven) basis. These sensors can operate on extremely low power, utilizing the very low power sleep states available in recent microcontrollers, awakening and consuming short bursts of power only when a measurement is made and processed, or when the radio is sending data. The microcontrollers can wake in under 20µs, turn on embedded analog and the ADC, make a measurement, provide some processing, and return to sleep mode in a matter of a few hundred microseconds. The average current consumption of such units can be less than 1 µA.

    These systems have in common the need for unobtrusive, generally hidden, often inaccessible sensors. And, the sensors are usually tucked away in places where there’s no power- so how to power them? Lately, it’s been fashionable to consider harvested energy.
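    The arithmetic behind such lifetimes is worth sketching. The numbers below are my own illustrative assumptions, not figures from the article: a nominal ~220 mAh CR2032 cell, a 0.5 µA sleep current, and a 5 mA measurement burst of 500 µs once per minute.

```python
# Coin-cell lifetime estimate for a duty-cycled wireless sensor node.
# All input values are assumed for illustration.

capacity_mah = 220.0     # assumed CR2032 capacity
sleep_ua = 0.5           # assumed sleep-mode current, microamps
active_ma = 5.0          # assumed current during a measurement burst
burst_s = 500e-6         # assumed burst duration, a few hundred us
period_s = 60.0          # one measurement per minute

# Duty-cycle the burst current into an average, in microamps
avg_ua = sleep_ua + (active_ma * 1000.0) * burst_s / period_s
hours = capacity_mah * 1000.0 / avg_ua    # uAh divided by uA gives hours

print(f"Average current: {avg_ua:.3f} uA")
print(f"Estimated lifetime: {hours / 8760:.0f} years (ignoring self-discharge)")
```

    The ideal answer comes out at decades, which is why in practice the cell’s own self-discharge and shelf life, not the electronics, end up limiting such designs to roughly ten years.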

    Reply
  16. Tomi Engdahl says:

    Stun Gun Ad Sells False Sizzle
    http://www.designnews.com/author.asp?section_id=1367&doc_id=267799&cid=nl.dn07

    I frequently see ads selling stun guns that claim unbelievably high output voltages, even as high as two million volts. This is physically impossible.

    The spark gaps on stun guns serve two purposes: first, to intimidate a potential attacker, and second, to limit the output voltage to a level that won’t destroy the generating electronics

    The spark gaps are typically 1.5 to 2 inches long. The dielectric breakdown voltage of air is about three million volts per meter.

    A two-inch gap would limit the voltage to about 152,000 volts.
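    That figure follows directly from the breakdown field quoted above; a quick check of the arithmetic:

```python
# The spark-gap length caps a stun gun's real output voltage:
# air breaks down at roughly 3 MV/m (the approximate figure above).

BREAKDOWN_V_PER_M = 3e6       # approximate dielectric strength of air
gap_m = 2.0 * 0.0254          # a 2-inch gap, converted to meters

max_voltage = BREAKDOWN_V_PER_M * gap_m
print(f"Voltage limit across a 2-inch gap: {max_voltage:,.0f} V")
# about 152,000 V - nowhere near an advertised "two million volts"
```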

    I think the best way to compare the shock power of a stun gun is to hear how loud the spark is, not just how long the spark is. The volume of the “snap” sound is dependent on the actual current flow as well as the spark length.

    Reply
  17. Tomi Engdahl says:

    Artificial Retina Is Solar Powered
    http://www.designnews.com/author.asp?section_id=1386&doc_id=268160&cid=nl.dn14

    A New Jersey company has been granted a patent to design the first solar-powered artificial retina, an invention that would preclude the current need for external hardware and give a patient the possibility of near-normal vision.

    The US Patent Office has awarded patent no. 8,433,417 to Newcyte Inc. for a carbon nanostructured artificial retinal implant that runs on harvested solar energy. Natcore Technology, which specializes in solar technology and is based in Red Bank, N.J., purchased Newcyte in 2009.

    Reply
  18. Tomi Engdahl says:

    Slideshow: Service Robots Can Do Most Anything
    http://www.designnews.com/author.asp?section_id=1386&doc_id=254000&cid=nl.dn14

    To many of us, service robots often mean robots that assist the elderly, or help with the rehabilitation of medical patients. But the range of services that robots can perform is extremely broad. Some are involved in agricultural tasks that are either dangerous or rough on humans, such as weed-pulling and harvesting crops. Others collect trash and garbage, or work in recycling to sort waste from usable, reclaimable materials.

    Reply
  19. Tomi Engdahl says:

    Researchers develop particle accelerator the size of a grain of rice
    http://natmonitor.com/2013/09/28/researchers-develop-particle-accelerator-the-size-of-a-grain-of-rice/

    Laser accelerators could lead to tiny, portable X-ray sources to advance medical care for people hurt in war.

    According to a news release from the Department of Energy’s SLAC National Accelerator Laboratory, scientists have demonstrated an “accelerator on a chip.” This technology could lead to smaller, cheaper tools for science and medicine.

    Scientists utilized a laser to accelerate electrons at a rate 10 times higher than normal technology in a nanostructured glass chip tinier than a grain of rice.

    “We still have a number of challenges before this technology becomes practical for real-world use”

    According to scientists, the miniature accelerator could equal the accelerating power of SLAC’s 2-mile-long linear accelerator in just 100 feet if operating at its maximum potential. Additionally, the “accelerator on a chip” could achieve a million more electron pulses per second.

    The first trial reached an acceleration gradient of 300 million electronvolts per meter, which is approximately 10 times the acceleration offered by the current SLAC linear accelerator.

    According to the researchers, laser accelerators could drive compact X-ray free electron lasers that are valuable devices for a wide assortment of research projects. Laser accelerators could also lead to tiny, portable X-ray sources to advance medical care for people hurt in war.

    Reply
  20. Tomi Engdahl says:

    Multiple antennas a requirement for new cars
    http://www.edn.com/electronics-blogs/automotive-innovation/4421619/Multiple-antennas-a-requirement-for-new-cars

    The new features of automotive interiors come at the cost of multiple sets of wireless information and content being delivered transparently to the driver. In the past, this meant at most two antennas – AM and FM for the entertainment system. Modern vehicles need a few more antennas, and the count is growing.

    A typical high-end car today has AM, FM, Satellite Radio, TPM, Remote Entry, Remote Start, In-Vehicle TV, DAB, GPS, Bluetooth, Collision Avoidance Radar, Parking Assist Radar and Electronic Toll Collection. Next-gen vehicles will add GSM and LTE in addition to Wi-Fi, specialized Car-To-Car Communications and additional systems for automated driver assist.

    Reply
  21. Tomi Engdahl says:

    Littelfuse intrinsically safe fuses
    http://www.edn.com/electronics-products/electronic-product-reviews/other/4421515/Littelfuse-intrinsically-safe-fuses

    In harsh industrial environments, the presence of gases, airborne dust, and petroleum products can make the surroundings highly explosive when exposed to a spark or high temperatures. Underwriters Laboratories (UL) and other standards bodies had to step in to minimize these potential hazards to protect life and property.

    Intrinsically Safe (IS) devices

    Operating electronic equipment in a hazardous area requires that certified IS devices be used.

    The PICO259-UL913 fuse design and encapsulation are certified for use in intrinsically safe apparatus for applications up to 125 Vrms (190 V peak).

    Electronic devices in hazardous areas

    Motor controllers, lighting, flow meters, communication handsets, some sensors and process control and automation products may be used in a hazardous area. While in operation, there are possibilities of small internal sparks from devices such as motor brushes, connectors or switch contacts. The energy of these sparks must be contained to avoid ignition of explosive materials in the environment.

    An intrinsically safe certified fuse must be used to limit the current under abnormal conditions so that the circuit opens without generating a spark that could cause ignition. The surface temperature of the fuse must also be kept below ignition temperatures.

    Reply
  22. Tomi Engdahl says:

    Ceramic chip capacitors are safety-certified
    http://www.edn.com/electronics-products/other/4421729/Ceramic-chip-capacitors-are-safety-certified

    Offered with C0G (NP0) and X7R dielectrics, multilayer ceramic chip capacitors in the VJ Safety series from Vishay Intertechnology provide X1/Y2 and X2 safety classifications and 250-VAC voltage ratings. The devices are optimized for EMI and AC line filtering, as well as lightning-strike and voltage-surge protection in power supplies, battery chargers, and isolators for fax machines, telephones, modems, routers, and AC equipment and appliances.

    Reply
  23. Tomi Engdahl says:

    SMD clock module integrates rechargeable battery
    Staff – September 24, 2013
    http://www.edn.com/electronics-products/other/4421642/SMD-clock-module-integrates-rechargeable-battery

    Cardinal Components’ Real Time Plus combines a real-time clock with two-wire I2C control, Enerchip solid-state battery, trickle charge/recharge circuit, power-management circuit, and 32-kHz watch crystal in a plastic 10×12-mm surface-mount package.

    Reply
  24. Tomi Engdahl says:

    Taiyo Yuden claims world’s first 330-uF MLCC
    http://www.edn.com/electronics-products/other/4421824/Taiyo-Yuden-claims-world-s-first-330-uF-MLCC

    Taiyo Yuden Co., Ltd. has released the 330 µF EIA 1210 size AMK325ABJ337MM (3.2 x 2.5 x 2.5mm) as another addition to the company’s super high-end product group of high-capacity multilayer ceramic capacitors (over 100 µF).

    The company has improved the capacitance by more than 50% in the same size capacitor as compared to Taiyo Yuden’s product AMK325ABJ227MM (200 µF). The large-capacitance 330-µF super high-end product claims to be the first of its kind in the world.

    Reply
  25. Tomi Engdahl says:

    PMBus Spec working group issues version 1.3 for review
    http://www.edn.com/electronics-blogs/power-system-management-design/4421731/PMBus-Spec-working-group-issues-version-1-3

    The PMBus Specification Working Group has released the PMBus 1.3 specification proposal for adopter review. The updated specification adds features and improves performance of PMBus 1.2 and adds AVS Bus (Part III). If the Working Group achieves its goals, the new specification will complete its review process and be approved in early 2014.

    1MHz Bus Speed

    The faster bus gives a 2.5X communication improvement that is backwards compatible with 100kHz and 400kHz PMBus devices.

    Floating Point Data Format

    Floating Point is IEEE 754 Half Precision (16 bit). When implemented, all PMBus commands will use Floating Point. The goal was to create a consistent format (all commands) that is easy to convert to data types used by programmers. Programmers will typically use 32 bit floats, and conversion from half precision to single precision is simply mapping bits, a truncation/rounding operation, or extending zeros. Floating point also supports NaN and +/-Inf. NaN indicates a slave could not deliver a meaningful value, eliminating the need to NACK a request.
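    To illustrate how cheap that widening conversion is, here is a sketch of the binary16-to-binary32 bit mapping (field widths and biases per IEEE 754; the code is illustrative and not taken from the PMBus specification):

```python
import struct

def half_to_float(h):
    """Widen a 16-bit IEEE 754 half-precision pattern to a Python float:
    copy sign, rebias the exponent (15 -> 127), extend the fraction with
    zeros. Inf/NaN and subnormals are handled as special cases."""
    sign = (h >> 15) & 0x1
    exp  = (h >> 10) & 0x1F
    frac = h & 0x3FF
    if exp == 0x1F:                       # Inf / NaN: all-ones exponent
        bits32 = (sign << 31) | (0xFF << 23) | (frac << 13)
    elif exp == 0:
        if frac == 0:                     # signed zero
            bits32 = sign << 31
        else:                             # subnormal: normalize first
            while not (frac & 0x400):
                frac <<= 1
                exp -= 1
            frac &= 0x3FF
            bits32 = (sign << 31) | ((exp + 1 + 127 - 15) << 23) | (frac << 13)
    else:                                 # normal number
        bits32 = (sign << 31) | ((exp + 127 - 15) << 23) | (frac << 13)
    return struct.unpack('<f', struct.pack('<I', bits32))[0]

print(half_to_float(0x3C00))  # 1.0
print(half_to_float(0xC000))  # -2.0
```

    Every half-precision value is exactly representable in single precision, so the widening direction needs no rounding at all.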

    Most existing firmware converts percentages (margin, fault high/low) to absolute values. Relative Voltage Thresholds allows firmware to program all output voltage related values in percent of the output voltage.

    New Part III

    The proposed specification contains a whole new Part III for AVSBus (Adaptive Voltage Scaling). The purpose of AVS is to enable an ASIC, FPGA, or Processor to change the voltage of its supply by sending commands to a POL (Point of Load Converter). In addition, AVS provides for reading back voltage and current of the POL. The AVS algorithm used is typically a proprietary control loop implemented by the system designer. The purpose of AVS is to achieve the highest performance while using the least energy. AVSBus is a communication protocol standard to enable AVS in a system.

    The AVSBus is best understood as a SPI bus without Chip Select (CS). The protocol uses Start and CRC sub-frames that allow a slave (POL) to discover the beginning and end of frames without CS.

    Reply
  26. Tomi Engdahl says:

    Board-level shields help solve EMI issues
    http://www.edn.com/electronics-products/other/4421856/Board-level-shields-help-solve-EMI-issues

    TE Connectivity has introduced shielding products that provide isolation of board-level components, minimize crosstalk, and reduce susceptibility to electromagnetic interference without impacting system speed. These stamped or drawn one- and two-piece metal cages can be used for such applications as mobile phones and tablet PCs.

    Reply
  27. Tomi Engdahl says:

    Improve power supply reliability
    http://www.edn.com/design/test-and-measurement/4421754/Improve-power-supply-reliability

    How to use Safe Operating Area plots to monitor device stress limits.

    Every switching device has a maximum voltage, current, and power specified by the device manufacturer in its technical documentation. Reliability of the power supply depends on not exceeding these limits.

    Safe Operating Area (SOA) plots help confirm operating margins.

    In oscilloscopes with long memories, it is possible to find SOA violations that occur for only a few cycles after an event, such as a short circuit or startup. Such low-duty-cycle events can be problematic. If they go undetected, they can degrade the device over time, which reduces system reliability.
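    The check itself can be sketched as a scan over sampled voltage/current pairs against the device limits. The limits and waveform below are made up for illustration, not taken from any datasheet:

```python
# Illustrative SOA check over sampled (voltage, current) pairs.
# Hypothetical device limits:
V_MAX, I_MAX, P_MAX = 100.0, 20.0, 150.0   # volts, amps, watts

def soa_violations(samples):
    """Return indices of (v, i) samples outside the safe operating area."""
    return [k for k, (v, i) in enumerate(samples)
            if v > V_MAX or i > I_MAX or v * i > P_MAX]

# A brief start-up transient pushes one sample past the power limit.
waveform = [(80.0, 1.0), (90.0, 2.0), (60.0, 2.0), (40.0, 1.5)]
print(soa_violations(waveform))  # [1]  (90 V x 2 A = 180 W > 150 W)
```

    A real scope does this over millions of points; the long memory is what lets it catch a violation that lasts only a few cycles.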

    Reply
  28. Tomi Engdahl says:

    Bad crimp, bad news
    http://www.edn.com/design/components-and-packaging/4421838/Bad-crimp–bad-news

    Given my recent blogs on how contact resistance can cause heating, and on dishwasher fires, this presentation came at just the right time.

    After several fires had occurred, Goodman engineers believed that they were caused by faulty power cords. The heating/cooling units had been in production for several years with no fires prior to Goodman switching to power cords made by a company called Tower in 2007.

    Fires appeared to start at the unit’s control board, but the boards were too badly burned for Glover to tell where the fire actually started.

    The clues came from inspecting heating/cooling units that had not started fires.

    many of the units had damaged power-cord connectors and wires

    The power cords were rated for 20A. Engineers at Everex tested the cords made by Tower with 40A and found failures. The flag connector, made by AMP (now TE Connectivity), heated and failed. Tests on power cords from the previous supplier turned up no failures. The investigations went on.

    The investigation moved to the connector, where Glover and others found that improper crimping was the root cause of the problem.

    Not enough force was applied to the connectors during power-cord manufacturing because Tower’s factory wasn’t using the specified AMP crimping machine – they were using a knockoff.

    Tower changed to AMP crimping machines and the crimps then met specifications. Problem solved.

    Reply
  29. Tomi Engdahl says:

    First Carbon Nanotube Computer Integrated With CMOS
    http://www.eetimes.com/document.asp?doc_id=1319660&

    The first working computer using carbon-nanotube transistors on a silicon wafer was recently demonstrated by researchers at Stanford University. Using what is called an “imperfection-immune design,” these researchers claim to have overcome the main obstacles facing carbon-based semiconductors, by integrating nanotubes into the complementary metal-oxide semiconductor (CMOS) design process.

    Nanometer diameter tubes of pure carbon — nanotubes — were heralded as the ideal transistor material a decade ago, because carbon nanotube transistors are higher-speed and lower-power than silicon transistors.

    “Using a combination of imperfection-immune design techniques with processing advances enabled us to overcome the challenges of using carbon nanotubes,” said Mitra. “Our entire paradigm is silicon compatible — both processing and design follow traditional CMOS flows.”

    Their approach works by first using chemical vapor deposition (CVD) to grow nanotubes side-by-side in precise arrays.

    Reply
  30. Tomi Engdahl says:

    Rocking Out With Carbon Nanotubes
    Electronics: New carbon-nanotube-based headphones generate sound through the thermoacoustic effect
    http://cen.acs.org/articles/91/web/2013/09/Rocking-Carbon-Nanotubes.html

    A new type of headphone heats up carbon nanotubes to crank out tunes. The tiny speaker doesn’t rely on moving parts and instead produces sound through the thermoacoustic effect (Nano Lett. 2013, DOI: 10.1021/nl402408j). The nanotube speaker could be manufactured at low cost in the same facilities used to make computer chips, the researchers say.

    Engineers think that the lack of moving parts would make speakers that use the thermoacoustic effect more durable than conventional ones. The problem has been that most conducting materials don’t have a thermoacoustic effect strong enough to produce sound efficiently. However, the effect is enhanced in carbon nanotubes, which are superb conductors of electricity and heat.

    The Tsinghua researchers integrated these thermoacoustic chips into a pair of earbud headphones and connected them to a computer to play music from videos and sound files. They’ve used the headphones to play music for about a year without significant signs of wear, Yang says. According to him, this is the first thermoacoustic device to be integrated with commercial electronics and used to play music.

    “We found that processing the carbon nanotube film into thin yarn arrays doesn’t weaken the thermoacoustic effect but can greatly improve the device robustness and durability,” Yang says. And the new design mounts the nanotube structures on silicon chips that are compatible with existing manufacturing methods. The thermoacoustic chips could be easily integrated into circuit boards for speakers with other electronic elements, such as control circuits, Yang says.

    Reply
  31. Tomi Engdahl says:

    Equalize data streams to 32Gbps
    http://www.edn.com/electronics-products/electronic-product-reviews/other/4421814/Equalize-data-streams-to-32Gbps

    As anyone who attends DesignCon knows, moving 4×25 Gbps (and faster) bit streams along copper traces is a design challenge. Testing receivers running at those speeds often requires equalization to open the eye diagrams. Tektronix has introduced new products and enhanced others that now form a complete test system for the electrical part of 100 Gbps Ethernet OIF-CEI-28G, 100G-KR4 and -CR4, InfiniBand QDR, FDR, and EDR, and Fibre Channel FC16 and FC32.

    Reply
  32. Tomi Engdahl says:

    Review: Right The First Time, by Lee Richey
    http://www.edn.com/electronics-blogs/the-emc-blog/4421861/Review–Right-The-First-Time–by-Lee-Richey

    One of the most common questions I receive as an EMC consultant has to do with PC board design. And, no wonder. As clock and data frequencies increase towards 10 GHz, proper PC board design becomes an imperative for proper functioning of the system. The typical “rules of thumb” we used for low frequency boards no longer seem to apply.

    So, when I ran across Lee Richey’s self-published book, Right The First Time – A Practical Handbook on High Speed PCB and System Design (Volume 1), I was intrigued. Both this book and the follow-on volume 2 (Advanced Topics) are available on his web site.

    There is a huge amount of controversy and continuing discussion regarding rules of thumb, such as rules about decoupling capacitors, right angle bends in clock traces, use of guard traces, splitting of ground planes, and so forth – many of which have been accepted without adequate proof they were even valid for the particular application. Richey launches into this early on and much of the book includes simulations and physical experiments proving or disproving many of these “urban legends”.

    The book includes theory, but is loaded with practical advice, backed up with experimental evidence, on all aspects of PC board design for high speed circuits.

    Reply
  33. Tomi Engdahl says:

    Carbon nanotubes make for easy listening
    ‘Hollow. Is it these you’re looking for?’
    http://www.theinquirer.net/inquirer/news/2297962/carbon-nanotubes-make-for-easy-listening

    A HEADPHONE with no moving parts has been demonstrated using thermoacoustics and tiny tubes called nanotubes.

    As you probably learned in Physics GCSE, most speakers work by using mechanical drivers that vibrate the air in front of them to reproduce different frequencies. But not so these, which work by passing an alternating current through a conductor, heating and cooling the air around it and causing the conductor to expand and contract.

    The main advantage of this method is that there is almost no wear and tear.

    We’ve had no comment on the sound quality of these headphones, and in fact none of the information we have even mentions it, which should immediately start alarm bells ringing.

    Ray Baughman, director of the Nanotech Institute at the University of Texas in Dallas, said that while he is impressed, there are still a few bugs to iron out, particularly high power consumption, as the efficiency of nanotubes in turning electrical energy into sound is quite poor.

    But in spite of this, if the sound quality is comparable this could one day revolutionise the way we listen to music. It wasn’t that long ago that we listened to radio through tubes.

    Reply
  34. Tomi Engdahl says:

    Package Converter Compliments Chip Obsolescence
    http://techonline.com/electrical-engineers/education-training/tech-papers/4413077/Package-Converter-Compliments-Chip-Obsolescence

    The semiconductor industry enabling today’s electronics marketplace serves multiple customer segments, such as consumer electronics, telecommunications, automotive, medical devices, military, aerospace, industrial controls, embedded computing and other industries. A component’s life cycle varies drastically from one industry to another, from 6 months to 10 years.

    Reply
  35. Tomi says:

    As wireless devices become more common, networked devices and sensors increasingly need technologies that let them draw their operating electricity directly from the environment.

    Self-powered appliances and sensors can generate their operating electricity by capturing energy directly from the environment and converting it to a suitable operating voltage. Several technical methods exist for this kind of energy extraction, and ready-made components are already available.

    Collecting energy directly from the environment is an attractive solution wherever there is no electricity grid and a battery is not practical. This is the case for future IoT (Internet of Things) networks, in which wireless devices, sensors and actuators communicate with each other over the air.

    The easiest energy to capture is often sunlight. Silicon-based solar cells have been used to produce electricity for many years.

    Recovering vibration energy

    Mechanical vibrations occur almost everywhere. Harvesting is especially attractive where strong vibrations occur, such as in machines and mechanical appliances whose operation is often monitored remotely by sensor systems.

    Mechanical energy is also readily available, for example, near roads, bridges and the structures of houses. When selecting a vibration energy converter, the vibration spectrum and intensity at the intended installation site should be identified in advance; the most suitable converter type can then be chosen for the object in question and its surroundings.

    Most acceleration transducers are based on a sensing element that resonates at a certain frequency. Their operating principle is usually electrostatic, electromagnetic or piezoelectric.
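    Such a resonant harvester is essentially a mass-spring system, which is why the site’s vibration spectrum must be measured first: the device only harvests efficiently near its natural frequency. A minimal sketch with the textbook formula (values are illustrative):

```python
import math

def resonant_frequency_hz(stiffness_n_per_m, mass_kg):
    """Natural frequency f = (1/2π)·sqrt(k/m) of a mass-spring harvester."""
    return math.sqrt(stiffness_n_per_m / mass_kg) / (2.0 * math.pi)

# e.g. a 10 g proof mass on a 1000 N/m cantilever spring
print(f"{resonant_frequency_hz(1000.0, 0.01):.1f} Hz")  # ≈ 50.3 Hz
```

    In practice the designer tunes the mass or stiffness so this frequency lands on a strong peak in the measured vibration spectrum of the machine or structure.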

    Recovering energy from temperature differences

    In many applications where sunlight or vibration energy is not readily available, solutions based on temperature differences, exploiting the thermoelectric (Seebeck) effect, can be used. A sufficient temperature difference is available in particular in buildings, HVAC systems, and mechanical machinery and engines.

    Polymers enter the game

    In recent years it has also become possible to develop polymeric photovoltaic cells, which are cheaper to manufacture than silicon cells. The flexible cell material can in fact be produced in a roll-to-roll (R2R) process.

    The solar cell efficiency record is currently held by Japan’s Sharp, whose cell converts as much as 44.4 percent of the solar radiation that hits it into electricity. The record cell is made of three different layers.

    Source: http://www.tietokone.fi/artikkeli/uutiset/kayttosahkoa_lahiymparistosta

    Reply
  36. Tomi Engdahl says:

    Boffins demo new holo storage using graphene oxide
    Busted disk? No problem
    http://www.theregister.co.uk/2013/10/03/boffins_demo_new_holo_storage_using_graphene_oxide/

    We realise at El Reg that holographic storage has been on the “real soon now” list practically forever, but it’s a topic that never loses its research fascination. Especially when, as has been demonstrated by a Swinburne University research group, the data that’s stored can be retrieved even if the disk is broken.

    That’s the promise held out by this paper in Nature (available in full). Ignoring the scintillating title – “Giant refractive-index modulation by two-photon reduction of fluorescent graphene oxides for multimode optical recording” – at least one of the characteristics of the graphene oxide-based holographic storage is data retrieval from broken media.

    “By focusing an ultrashort laser beam onto the graphene oxide polymer, the researchers created a 10-100 times increase in the refractive-index of the graphene oxide along with a decrease in its fluorescence”, the release states.

    That can be used for multimode optical recording

    “The giant refractive index of this material shows promise for merging data storage with holography for security coding,”

    Reply
  37. Tomi Engdahl says:

    TSMC Shows Path to 16nm, Beyond
    Gains Come as Costs, Complexity Rise
    http://www.eetimes.com/document.asp?doc_id=1319679&

    Taiwan Semiconductor Manufacturing Co. is making steady progress on its next two nodes, bringing advances in performance and low power. The bad news is that the latest nodes are widely expected to add less transistor density and more cost than past nodes did.

    TSMC has taped out several 20nm chips and expects to let customers start designing 16nm FinFET chips before the end of the year. By the end of 2014 it expects it will have taped out 25 20nm designs and be far along in work on 30 16nm chips.

    Company execs gave a frank and detailed rundown of their progress, especially on the 16nm node at a Silicon Valley event

    TSMC is seen as a bellwether of the chip sector and electronics generally because it is one of the world’s largest and most advanced makers of semiconductors. It puts out a whopping 1.3 million eight-inch equivalent wafers each month, some of them now down to 20nm geometries.

    Both the 20 and 16nm nodes represent key turning points.

    The 20nm node is the first to use double patterning, requiring more masks and more runs under an immersion lithography machine.

    The 16nm node represents TSMC’s first use of FinFETs, a.k.a. vertical transistors. Indeed, this node basically just adds FinFETs to the existing 20nm process, thus it provides almost no gain in packing in more transistors per area of die, although it does offer benefits in lower power or higher performance.

    TSMC execs did not mention the added costs and the reduced density in talks

    Nevertheless, TSMC execs read out testimonials about its 20nm process from a handful of key customers. Oracle said it successfully taped out its M7 server processor in TSMC’s 20nm node. Xilinx said it used the node for a PLD, and Altera said it is developing its latest serdes in the node. Smartphone chip giant Qualcomm said it is developing both 16 and 20nm design flows with TSMC.

    Reply
  38. Tomi Engdahl says:

    Auto Market Challenges Chipmakers’ Stamina
    http://www.eetimes.com/document.asp?doc_id=1319666&

    I don’t think I’m alone in wondering why any chip company these days would stay in the automotive electronics market.

    Let’s face it. Carmakers are notorious for beating up semiconductor suppliers over pennies. They demand high-standard, “automotive quality” for every IC they procure. These chips need to last longer than cars, which means a very long product life cycle is required for every automotive chip.

    Carmakers say they need innovation, and yet the typical automotive product development cycle is about five years. In most other industries, the whole world — of technology, market trends, consumer preferences, and pricing — changes within three years.

    To top it off, automotive doesn’t exactly offer either the fastest growing or the largest volume market for semiconductor companies.

    So, why stay?
    At the European Microelectronics Summit here last week, Ian Riches, director of global automotive practice at Strategy Analytics, concluded that “mainstream automotive is starting to look less attractive to some semiconductor vendors.”

    Comparing the car biz to the mobile industry, Riches pointed out that it takes more than five years to develop a new model, while the development cycle for a mobile handset is two years. A car’s lifetime is about eight years, while a smartphone lasts only 1.5 years. Perhaps, the only positive for automotive chip suppliers is that the semiconductor content in a car is higher in value. He estimates some $1,000 worth of electronics content inside a car, compared to about $130 per smartphone. But of course, the volume of global car sales totally pales when compared to mobile phones.

    The case for chip companies to stay in the automotive market has two solid arguments. The first is the strong growth in advanced safety application. The second is the continued development of hybrid electric (HEV) and electric vehicles (EV).

    In fact, automotive is a “stabilized” market, said Riches, with its overall semiconductor demand (in value) growing at a steady pace of 7 percent every year — between 2012 and 2017.

    EV hype bubble bursts
    Strategy Analytics similarly pegs semiconductor demand for HEV/EV to become as large as $2.6 billion in 2020, with its 2012-2017 CAGR at 22 percent. However, Riches acknowledged that growth in “the EV end of the HEV/EV market is less certain.”

    “The hype bubble has now burst,” he added. There have been already some significant failures in the EV market.

    But ignoring the EV trend is the worst mistake the electronics industry could make. “The main danger now is to underestimate the long-term importance of vehicle electrification,” he warned. The energy shift towards greater electrification is “beyond question,” he noted.

    Several separate issues are making the sales of hybrids and EVs less certain than the automotive industry’s early predictions. “Many electrified-vehicle customers remain highly brand-conscious,” Riches said. These consumers “will not pay prices inflated above that brand’s normal price range in order to access electrified vehicles.”

    Second, “[The HEV/EV] market is not yet rational,” Riches said. “EV is an emotional purchase.” He pointed out that the highest sales of HEV/EV are in the United States, where fuel prices are much lower than in Europe or Asia. “Consumers are not buying HEV/EV just to save gasoline,” he said. Rather, they’re making a statement.

    Cherry-picking by non-traditional players
    Not included in the discussion of automotive semiconductor demand is the car infotainment market, such as the automotive wireless segment. “Wireless technologies like Bluetooth and embedded cellular are accelerating in the car business with market revenue set to rise by 41 percent from 2012 through 2018,” according to an Automotive Infotainment Market Tracker Report from information and analytics provider IHS.

    This trend provides an opening to non-traditional automotive chip companies such as Broadcom, Qualcomm, and Nvidia. Speaking of the power of those “new entrants with consumer-market scale,” Riches observed, “they are potentially cherry-picking some high-growth areas.”

    Qualcomm also sees its role growing in automotive chips, as embedded cellular connectivity is also finding its way into the automotive market. IHS notes that 25 percent of US cars in 2012 were sold with the feature, for the most part included as standard equipment. “OEMs will increasingly want to use embedded cellular for both safety and diagnostic purposes, because built-in wireless connectivity in cars will prove more robust and reliable than using a tethered or mobile device like a smartphone,” explained Luca De Ambroggi, senior analyst for automotive infotainment at IHS.

    Strategy Analytics’ Riches counts Nvidia as one of the stronger players among the non-traditional automotive chip companies. Tesla’s ambitious goal to create an in-vehicle computing platform that rivals top-of-the-line laptop PCs is well known. Nvidia is openly courting Tesla to partner on a Tegra 3-based chipset

    It’s time to rethink
    While not every traditional automotive chip company should be chasing every new automotive electronics segment, the automotive landscape is changing, forcing chipmakers to rethink their viability in the automotive electronics market.

    Reply
  39. Tomi Engdahl says:

    Graphene Could Make Data Centers and Supercomputers More Efficient
    http://www.technologyreview.com/news/519441/graphene-could-make-data-centers-and-supercomputers-more-efficient/

    New research suggests graphene could enable highly efficient optical communication in chips for data centers and supercomputers.

    Computer chips that use light, instead of electrons, to move data between electronic components and to other chips could be essential for more efficient supercomputers and data centers. Several industrial research labs are working toward such optical interconnects that rely on germanium to turn light into ones and zeros. But recent research suggests that graphene devices could be far better and cheaper.

    An optical interconnect consists of a modulator that converts electrical signals into optical ones, and a photodetector, which does the reverse. Current iterations feature modulators made of silicon and photodetectors made of germanium. Intel recently announced plans to use such technology and begin manufacturing a product it calls “silicon photonics,” for use in data centers

    But graphene photodetectors have a good chance to equal or surpass the performance of germanium ones in several important aspects within a few years, says Dirk Englund, a professor of electrical engineering and computer science at MIT. Although graphene devices are still about an order of magnitude behind germanium in terms of capacity to generate current in response to the absorption of light, they have improved immensely in this area in just a few years.

    Graphene has a number of potential advantages over germanium

    The first graphene-based modulator was demonstrated in 2011. So this recent work suggests it might be possible to build an optical interconnect entirely out of graphene.

  40. Tomi Engdahl says:

    NIST closed, all measurements invalid
    http://www.edn.com/electronics-blogs/rowe-s-and-columns/4422114/NIST-closed–all-measurements-invalid

    The current deadlock in Washington over the Federal budget has shut down most of the U.S. Government. That includes NIST, which houses the national standards for many physical quantities, such as voltage and current, and provides calibration services. With the house at the top of the measurement chain closed, all measurements made in the U.S. are hereby invalid.

    All joking aside, what if your lab’s calibration standard is at NIST awaiting calibration? Clearly, the calibration will be delayed. Should that delay cause your calibration to go out of date, will that affect your work?

  41. Tomi Engdahl says:

    Do you have what it takes to be a prototyping super hero?
    http://blogs.synopsys.com/breakingthethreelaws/2013/09/do-you-have-what-it-takes-to-be-a-prototyping-super-hero/

    I was recently talking to a customer who found that deploying FPGA-based prototyping was a challenge. This was a customer who had only ever done simulation for verification purposes.

    The process of bringing up a prototype was not smooth; they made a couple of key mistakes, which I will share with you to help you avoid them in the future.

    #1 – ASIC Code is not FPGA friendly
    This is rule #1 from the FPGA-Based Prototyping Methodology Manual. Their code was full of ASIC-specific instances that complicated the initial bring-up. One of the problems was that the customer *thought* they could use the FPGA vendor tools for the synthesis.

    #2 Wasted time developing in-house FPGA Boards
    The customer thought that since they can design multi-million-gate ASIC SoCs, of course they can design a PCB with a couple of FPGAs on it. Sadly, this choice delayed the start of the prototyping project, as developing a PCB like this and managing clocking, configuration, and debug is not as easy as it seems. The customer spun the PCB twice before getting a platform that provided basic function.

    #3 Tried to bring up the whole SoC prototype at once
    Classic mistake. The funny thing is that within simulation the customer brings up individual design blocks, and only when each has passed its “hello world” and basic functionality tests does it get integrated into a larger SoC verification environment. Exactly the same approach should be used for FPGA-based prototyping.
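    The staged bring-up recommended above can be sketched abstractly: run each block's own self-test first, and integrate only the blocks that pass. The block names and pass/fail checks below are invented placeholders, not part of any real flow:

```python
# Staged bring-up sketch: each design block carries a self-test that must
# pass ("hello world" plus basic functionality) before the block joins the
# integrated prototype. Failing blocks are held back for debug in isolation.

def bring_up(blocks):
    """Split blocks into (ready to integrate, needs work in isolation)."""
    passed = [name for name, self_test in blocks if self_test()]
    failed = [name for name, self_test in blocks if not self_test()]
    return passed, failed

# Hypothetical blocks with trivially passing/failing self-tests.
blocks = [("uart", lambda: True), ("ddr_ctrl", lambda: True), ("usb", lambda: False)]
print(bring_up(blocks))   # -> (['uart', 'ddr_ctrl'], ['usb'])
```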

    The customer made other mistakes but the above ones were the worst offenders.

  42. Tomi Engdahl says:

    Kozio Hardware Verification Technology Spans PCBs, SoCs & FPGAs
    http://www.eetimes.com/document.asp?doc_id=1319698&

    Kozio has just announced an incredibly interesting development in its hardware verification technology. This solution is so all-embracing that it’s difficult to know quite where to start, so let’s take a moment to set the scene.

    Testing PCBs
    I remember creating test programs for printed circuit boards (PCBs) in the early 1980s. These programs ran on functional testers. You applied a test vector to the inputs and clocked it in. Then you applied the next test vector and clocked that in. Then you did the same for the next, and so it went. At some stage, data started to appear on the outputs, and your program checked these actual values against the expected results. If any discrepancies were found, the next step was to home in on the faulty component or suspect PCB track, but that’s a story for another day.

    We described by hand all the input test vectors to be applied and the resulting output vectors we expected to receive. You simply couldn’t do that for a modern circuit board, because modern electronic designs have grown so horrendously complex.
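    The tester loop described above (apply a vector, clock it in, compare the outputs against the expected results) can be sketched as follows. The function names and the toy inverting "board" are invented for illustration:

```python
# Sketch of a functional tester's vector-compare loop: drive each input
# vector, pulse the clock, then flag any bits that differ from the
# expected output vector.

def run_vectors(apply_and_clock, test_vectors, expected_outputs):
    """Apply each input vector and report (step, mismatch mask) failures."""
    failures = []
    for step, (vin, vexp) in enumerate(zip(test_vectors, expected_outputs)):
        vout = apply_and_clock(vin)       # drive inputs, clock them in
        diff = vout ^ vexp                # bitwise mismatch mask
        if diff:
            failures.append((step, diff)) # which step failed, and which bits
    return failures

# Toy 4-bit "board" that simply inverts its input, standing in for hardware.
board = lambda vin: ~vin & 0xF

print(run_vectors(board, [0b0000, 0b1010], [0b1111, 0b0101]))  # -> []
```

A real tester would go on to trace each mismatching bit back to a suspect component or track, as described above.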

    So the hardware design engineers get to design the board, the layout designers get to implement it, and manufacturing gets to build it. At some stage, the board will be handed over to the software folks to start creating firmware and developing (or porting) applications. But who gets to test the board to make sure it functions as planned?

    The hardware folks may perform some rudimentary tests, but they are typically keen to move on to the next project, and they don’t really understand all the things the application software will do while running on the board. As a result, it’s often left to the software developers to create tests to verify the board’s functionality. But this is the last thing they wish to do. All they want is to work on their software applications.

    Even worse, the software developers typically don’t understand all the nuances associated with the hardware, and they usually don’t have expertise in developing board-level test programs. A very common scenario is for the software developers to boot the board up under its operating system (OS); let’s assume it’s Linux for the purpose of this discussion. Then they start writing simple tests like reading and writing to the memory. One issue is that booting a full-up Linux operating system takes quite some time. Another issue is that even creating something as basic as a good memory test requires a lot of expertise. Writing comprehensive tests to verify all the corner conditions on things like USB interfaces can bring the strongest of us to our knees.
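    To illustrate why even a "basic" memory test takes some care, here is a minimal walking-ones test sketched over a Python bytearray standing in for real RAM. A production board test would also cover address-line faults, data retention, and more; this only shows that each bit position must be exercised individually to catch stuck bits:

```python
# Walking-ones memory test sketch: write a single 1 bit through every bit
# position of every byte, read it back, and record any mismatches. A Python
# bytearray models the memory under test; real RAM can fail where this
# model never will.

def walking_ones(mem: bytearray) -> list:
    """Return a list of (address, bit) faults found in the memory model."""
    faults = []
    for addr in range(len(mem)):
        for bit in range(8):
            pattern = 1 << bit
            mem[addr] = pattern
            if mem[addr] != pattern:      # read back and compare
                faults.append((addr, bit))
    return faults

ram = bytearray(1024)       # toy 1 KB "memory"
print(walking_ones(ram))    # -> [] (the model is perfect, unlike hardware)
```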

    This was the problem Kozio decided to attack several years ago. The full Kozio solution is far too complex to go into here; you can discover more by bouncing over to the company website. Suffice it to say that it has developed a special verification and test OS called VTOS that has a small memory footprint and is incredibly fast to load. It has also created a comprehensive suite of test routines that can verify different types of memory and different peripheral devices like USB interfaces.

  43. Tomi Engdahl says:

    Lattice MachXO3 aimed at MIPI, PCI Express, Gigabit Ethernet bridging capability
    http://www.edn.com/electronics-blogs/fpga-gurus/4421625/Lattice-MachXO3-aimed-at-MIPI–PCI-Express–Gigabit-Ethernet-bridging-capability

    While Lattice Semiconductor Corp’s main play in FPGAs has been the iCE40 architecture it gained from the acquisition of SiliconBlue Technologies Inc, the company’s own MachXO family has been an important player in the low-end programmable market, where devices often still are referred to as PLDs. The MachXO3, introduced at the end of September, deserves more scrutiny than a tossed-off reference to “just another low-power PLD.”

    Thus, MachXO3 focuses on the Mobile Industry Processor Interface (MIPI), PCI Express, and Gigabit Ethernet. Hard IP blocks are used for the latter two standards, while MIPI uses a combination of hard and soft IP for the most efficient implementation of the standard. MachXO3 combines the interface cores with a switching fabric that operates at 150 MHz.

    These bridge devices are not large FPGAs – the density ranges from 640 to 22,000 logic cells. But devices in the family are offered in high volume for less than $1.00 each, putting the MachXO3 in the price range of typical microcontrollers. Lattice might have defined a winning architecture for glue-logic consolidation.

  44. Tomi Engdahl says:

    Buying IP Is Not for the Faint of Heart
    http://www.designnews.com/author.asp?section_id=1365&doc_id=268222&cid=nl.dn14

    This month, Microsoft announced that it had reached an agreement to purchase Nokia’s handset business for $7.2 billion. Recalling that two years ago, Google purchased Motorola’s handset business for nearly twice as much ($12.5 billion), it seems like Microsoft got a steal of a deal. Or did it?

    Being on an engineering development team, working countless days and nights toward the day when your product finally launches, you always had the vision that you had designed the best device the world could ever see. The marketing launch would be glorious. But then, you’d watch the fanfare quickly fade into absolute oblivion, due to the sheer number of competitors who concurrently developed a closely related offering. That once-in-a-lifetime project for which you missed your kid’s birthday and ate cold pizza for dinner four nights a week during tool-release time turned out to be just a tiny, meaningless cog in the gigantic machine called the telecom industry. You had nearly sacrificed your wife and family for what? A crummy commodity, just like corn or pork bellies. So what was the point?

    I think Google would say the intellectual property (IP) was the point. The jewel in the $12.5 billion buyout of Motorola was its IP portfolio: Motorola had amassed a healthy library of very desirable patents over the years.

    As I read about the deal between Nokia and Microsoft, I noticed Microsoft paid a very healthy $7.2 billion, but only for the handset business. The deal did not include the ownership of Nokia’s patent and IP portfolio. Instead, Nokia agreed to license the IP for Microsoft’s use but maintain full ownership of its coveted portfolio. So was this a good buy? Microsoft’s stockholders didn’t think so, and the NAV of Microsoft fell sharply on the news.

    Microsoft’s funds would have been better spent if they just stayed in the bank. Handset businesses cost a lot of money to run. Development requires large, multi-disciplined teams of people, all drawing nice salaries, and tooling alone can cost millions of dollars for a mass production launch. The competition is fierce, and losing money is a very realistic scenario. The bank may only pay a 0.9 percent return, but it’s guaranteed, and it’s positive.

  45. Tomi Engdahl says:

    Microchip SoC Simplifies Design of Portable Products
    http://www.designnews.com/document.asp?doc_id=268215&cid=nl.dn14

    A new system-on-chip family could speed development of blood pressure monitors, lab instruments, power meters, and myriad other products by combining a 16-bit microcontroller with high-precision analog components.

    Microchip Technology Inc., manufacturer of the new product family, says its design will help product engineers minimize board-level noise problems, eliminate communication bottlenecks, and shorten time to market.

    ”The chances of getting to market in one revision are now much higher,” Jason Tollefson, senior product marketing manager for Microchip, told Design News. “This is a much simpler design.”

    Indeed, the PIC24F GC family, as it’s known, essentially includes a full analog signal chain — 16-bit analog-to-digital converter (ADC), 12-bit ADC, and a digital-to-analog converter (DAC), along with an integrated LCD driver and USB. Also included on the chip is a 16-bit PIC microcontroller. ”It’s really an analog system-on-chip that happens to have a microcontroller onboard,” Tollefson told us.

  46. Tomi Engdahl says:

    Linear Introduces Multi-source Energy-Harvesting Chip
    http://www.designnews.com/document.asp?doc_id=267708&cid=nl.dn14

    Energy harvesting is becoming an increasingly good option to power wireless sensor networks and other ultra-low-power mini and micro devices. But while some of the devices being created provide all of the power for an application, others are meant as a complement to battery power to extend its life.

    To the latter end, Linear Technology, a provider of circuits and power-management technology, has released a new multi-source energy-harvesting chip, the LTC3330, which can deliver up to 50mA of continuous output current to extend battery life when harvestable energy is available, according to the company.

    What’s more, the device is hybrid in nature, meaning it can harvest energy from more than one source: in this case, solar, piezoelectric, or magnetic sources, according to Linear. Generally, energy harvesters can only leverage one source of energy, but the trend is moving toward more versatile harvesters that can generate energy from multiple sources.
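    A back-of-the-envelope way to see the battery-life benefit: whenever harvested energy carries the load, the battery rests. The battery capacity and load current below are invented for illustration; only the up-to-50 mA output rating comes from the article:

```python
# Rough battery-life model for a harvest-assisted system: the battery only
# drains during the fraction of time the harvester cannot supply the load.
# All figures in the example calls are illustrative assumptions.

def battery_life_hours(capacity_mah: float, load_ma: float,
                       harvest_fraction: float) -> float:
    """Runtime when the harvester carries the load `harvest_fraction` of the time."""
    effective_drain = load_ma * (1.0 - harvest_fraction)
    return capacity_mah / effective_drain

print(battery_life_hours(1000, 10, 0.0))   # no harvesting: 100.0 h
print(battery_life_hours(1000, 10, 0.5))   # harvest covers half the time: 200.0 h
```

The model ignores conversion losses and self-discharge, but it shows why even intermittent harvestable energy meaningfully extends battery life.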

  47. Tomi Engdahl says:

    ARM TechCon 2013 promises small and large innovations
    http://www.edn.com/electronics-blogs/catching-waves/4421972/ARM-TechCon-2013-promises-small-and-large-innovations

    From tiny devices in the Internet of Things (IoT) to giant server farms, ARM TechCon 2013 is poised to run the gamut of multi-core processing possibilities. It takes place at the Santa Clara Convention Center from October 29-31.

  48. Tomi Engdahl says:

    What Do You Leave for Last?
    http://www.designnews.com/author.asp?section_id=1386&doc_id=268364&cid=nl.dn14

    We often talk about the part of the design that’s considered last. Nobody wants to have their subsystem as the final thought.

    The parts that come up most often in the last-to-be-considered category include the battery or power subsystem, the cooling method, and the circuit protection. Nothing should be last (but something has to be), so I suggest you take extra caution if circuit protection falls to the bottom.

    It’s time to end the excuses.

    Comment:

    We leave what we refer to as “bells and whistles” for last – those things that are nice to have if time and budget allow. Circuit protection, cooling, and powering the system are biggies that ensure a system will be both robust and reliable, and these get our attention from the early stages of design.

  49. Tomi Engdahl says:

    BITalino: A DIY Toolkit for Physiological Computing
    http://www.eetimes.com/document.asp?doc_id=1319710&

    I just heard from Hugo Silva, a researcher at the Instituto de Telecomunicações in Portugal. He is also a co-founder of PLUX: Wireless Biosignals, a company that focuses on creating technologies for the healthcare and quality-of-life markets.

    His latest project, the BITalino, is a low-cost, modular biosignal sensor kit.

    In fact, it would be interesting to monitor a host of biosignals. For example, an electrocardiography (ECG) sensor can be used to monitor the electrical activity of the heart over a period of time. An electromyography (EMG) sensor can be used to evaluate and record the electrical activity produced by skeletal muscles. An electrodermal activity (EDA) sensor can be used to monitor the electrical conductance of the skin, which varies with its moisture level.

    The monitoring, recording, analysis, and interpretation of biosignals has traditionally been focused on medical and quality-of-life applications. More recently, biosignals (combined with data from other sensors, like accelerometers and ambient light sensors) have started appearing in a wide variety of other areas, including informatics, a broad academic field encompassing computer science, information science, information technology, and human-computer interaction.

    Until recently, these types of sensors would have been horrendously expensive, and using them would have been horrendously complicated. This is the point where Silva leaps to center stage with a fanfare of trumpets, brandishing his low-cost, modular BITalino biosignal sensor kit. The current incarnation includes an ECG, an EMG, an EDA, an accelerometer, and an ambient light sensor. These are connected together and presented on a single board.

    The cool thing is that each functional block can be snapped off and used independently, or the blocks can be gathered together in different combinations.

    BITalino
    http://www.bitalino.com/

    Low-cost
    100x cheaper than your average physiological data acquisition system

    Price starting from 149 Euros

    What’s included

    1x BITalino Board or 1x Plugged
    or 1x Freestyle (you choose)
    1x 3-lead accessory (for EMG / ECG)
    1x 2-lead accessory (for EDA)
    5x Pre-gelled electrodes
    1x Li-Po Battery 320mAh
    Software for real-time data visualization and recording
    Access to programming APIs in Python, Android & other languages
    Access to the communication protocols

    Micro-Controller Unit
    up to 1000Hz sampling rate
    6 analog inputs (4@10-bit + 2@6-bit)
    4 digital inputs, and 4 digital outputs

    Programming APIs
    - Python
    - Java
    - Android
    - LabVIEW
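    As a small worked example against the 10-bit analog inputs listed above, here is a generic raw-count-to-voltage conversion. The 3.3 V reference is an assumption for illustration; BITalino's own documentation gives the actual transfer function for each sensor.

```python
# Generic ADC scaling: map a raw sample count to volts against full scale.
# Defaults assume a 10-bit converter (as in the specs above) and a 3.3 V
# reference, which is an illustrative assumption, not a BITalino spec.

def adc_to_volts(raw: int, bits: int = 10, vref: float = 3.3) -> float:
    """Scale a raw ADC count to a voltage against full scale."""
    full_scale = (1 << bits) - 1          # 1023 for a 10-bit converter
    return raw * vref / full_scale

print(round(adc_to_volts(1023), 3))   # full scale  -> 3.3
print(round(adc_to_volts(512), 3))    # near midscale -> 1.652
```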

