NI explains how to overcome emerging test challenges as the very notion of a sensor is redefined, with new technologies incorporating more sensors, and more unusual ones.
National Instruments’ (NI) experience of working with more than 35,000 customers across a wide variety of sectors, from automotive to telecommunications and aerospace to medical, gives it a vast and diverse customer base. This enables NI to identify which emerging test and measurement technologies are at the forefront of innovation.
Janusz Bryzek, an executive at Fairchild Semiconductor, says that sensor production could reach 1 trillion units per year within the next 10 years, up from today’s rate of 10 million. With that sort of growth, microelectromechanical systems (MEMS) could account for 30% of the total global semiconductor market, worth in excess of $100 billion per year compared with a value of about $11 billion in 2012.
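To put that projection in perspective, a quick calculation shows the annual growth rate it implies. This is a back-of-the-envelope sketch using only the figures quoted above (10 million units per year today, 1 trillion in 10 years):

```python
# Implied annual growth rate if sensor production rises from
# 10 million units/year to 1 trillion units/year over 10 years.
start = 10e6    # units per year today (figure quoted above)
target = 1e12   # units per year in 10 years (figure quoted above)
years = 10

growth_factor = (target / start) ** (1 / years)
print(f"Total growth: {target / start:,.0f}x")
print(f"Required annual growth: {growth_factor:.2f}x per year "
      f"(~{(growth_factor - 1) * 100:.0f}% per year)")
```

That is a 100,000-fold increase overall, requiring production to more than triple every year for a decade.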
This eruption of sensors poses clear challenges for test departments and their engineers. To make matters worse, many industries have been forced to change their fundamental understanding of what a sensor is. The old notion that sensors measure only temperature, strain, force and other basic data points is obsolete.
Greater Expectations Create Greater Challenges
Some of the clearest examples of this challenge are in high-volume, consumer-facing markets such as automotive and telecommunications, where consumers, suppliers and even legislative bodies have high product expectations. Consider the automobile: in the past, sensors monitored key data points such as engine temperature and oil pressure, but the dramatic rise in consumer and legislative demands has forced car manufacturers to significantly expand their vehicles’ electronic components and capabilities. Today, vehicles are expected to control their emissions, correct dangerous driving behaviours, receive satellite radio signals and provide entertainment and convenience to passengers. To accomplish all of this, engineers must broaden their idea of a “sensor” to include technologies such as O2 sensors for monitoring catalytic converter output, cameras for monitoring the driver’s eyes, antennas for picking up digital radio and navigation signals and displays for video and information, to list just a few.
This only scratches the surface of the new types of sensor now possible, a view supported by Tom Pierce, Vice President and General Manager for Test and Measurement at Honeywell Sensing and Control.
“In the future, people are going to put sensors in places we have never thought about,” stated Pierce. “The need for sensors is exploding and there are many more potential sensor applications than we could ever have predicted.”
Exceptional Sensor Applications
The robotics industry is a lucrative one in which a multitude of sensors is required for a machine to sense and accurately measure its environment. Pierce’s words are echoed in an extraordinary and innovative application of sensors in the UK: a robot controlled by the humble fruit fly.
An inter-disciplinary collaboration of engineers from Imperial College London, Optotune AG, ETH Zürich, ViSSee Sagl and Tufts University developed a flexible robotic device to measure and simulate flight patterns in winged insects.
All animals, including the common fruit fly, employ a profusion of rapid and precise biological sensors, controllers and actuators: desirable features for any control system. Think of a coal miner carrying a canary to warn of gas as a simple example of a biological sensor. In the robotics case, these biological components are replaced by electronic and mechanical sub-systems.
“Feedback from three linear cameras and eight proximity sensors determines the visual stimulus shown to the fly,” explained Optotune’s Chauncey Graetzel. “Flight parameters such as the fly’s wing beat frequency and amplitude control the robot’s movements.”
Whilst biological sensors may be some way from commercialisation, the mobile phone industry shows a more immediate effect of this explosion of sensors. By 2015, the average consumer mobile phone is projected to contain nearly 20 MEMS sensors, compared with two sensors in 2000. The Samsung Galaxy S4, for example, seamlessly combines a number of sensors: an accelerometer and gyroscope; geomagnetic sensor; temperature and humidity sensor; Hall effect sensor; and RGB light sensor.
Implications for Test
Capital costs of automated testers can account for more than 60% of overall test costs, so minimising hardware changes can significantly reduce overall expenditure. A dedicated test system built for mobile devices, for example, which have a typical shelf life of 18 months, faces obsolete sensors and technology with every new design. Architecting a test system that can adapt to changes occurring once or twice a year requires an agile, proactive test approach. Unlike an ad hoc approach built on dedicated box instruments that specialise in one specific measurement, a proactive test approach features modular hardware and anticipates technology changes. A modular approach minimises the sustaining costs of a tester through incremental changes instead of whole-product replacement. National Instruments advocates this modular ethos: the NI PXI platform lends itself perfectly to such applications, offering a modular embedded controller and a variety of modular instruments, from single-point DC to RF and microwave measurements. This approach contributed to a huge success in device characterisation at Qualcomm Atheros.
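The modular ethos described above can be sketched in code. The following is a hypothetical illustration, not an NI API: the idea is that the test sequence depends only on a small instrument interface, so upgrading the tester means swapping one module rather than replacing the whole system.

```python
from typing import Protocol

class MeasurementModule(Protocol):
    """Minimal interface that every plug-in instrument module satisfies."""
    name: str
    def measure(self, channel: int) -> float: ...

class DCModule:
    """Hypothetical single-point DC measurement module."""
    name = "DC voltmeter"
    def measure(self, channel: int) -> float:
        return 3.3  # stand-in reading, in volts

class RFModule:
    """Hypothetical RF module dropped into the same chassis later."""
    name = "RF power meter"
    def measure(self, channel: int) -> float:
        return -10.0  # stand-in reading, in dBm

def run_test_sequence(module: MeasurementModule, channels: range) -> dict:
    """The test logic never changes; only the module plugged in does."""
    return {ch: module.measure(ch) for ch in channels}

# Upgrading the tester is an incremental change: swap the module,
# keep the sequence, chassis and analysis code.
results_dc = run_test_sequence(DCModule(), range(4))
results_rf = run_test_sequence(RFModule(), range(4))
```

The design point is that an ad hoc tester couples the test logic to one box instrument, while the modular approach couples it only to an interface, which is what keeps redesign costs incremental.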
Qualcomm Atheros has been a prominent supplier of next-generation wireless technologies for more than two decades. Its engineers faced serious challenges in adapting high-throughput wireless technologies to meet the demands of new connected applications. As wireless standards become more complex, the number of operational modes for its MIMO transceiver devices increases exponentially. To support the 802.11ac WiFi standard, these devices require new modulation schemes, more channels, more bandwidth settings and additional spatial streams. Characterising WLAN transceivers is especially challenging when faced with thousands of independent operational gain settings.
To overcome this problem, the team swapped its traditional benchtop instruments, which captured only 40 meaningful data points per gain setting, for a modular PXI system that increased throughput by 10 times. In 2012, to reach the performance required for 802.11ac, Qualcomm saw the need to upgrade its instrumentation again. By keeping its existing modular PXI platform and swapping out the RF instruments for the latest Vector Signal Transceiver module, Qualcomm was able to achieve a staggering 300,000 meaningful data points per gain setting.
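The scale of that improvement is easy to quantify. The per-gain-setting figures below are those quoted above; the count of 1,000 gain settings is a hypothetical stand-in for the “thousands” mentioned earlier:

```python
# Data-point coverage before and after the PXI/VST upgrade,
# using the figures quoted in the article.
points_benchtop = 40      # meaningful data points per gain setting (benchtop)
points_vst = 300_000      # per gain setting after the Vector Signal Transceiver
gain_settings = 1_000     # hypothetical stand-in for "thousands"

coverage_gain = points_vst / points_benchtop
print(f"Coverage per gain setting: {coverage_gain:,.0f}x more data points")
print(f"Across {gain_settings:,} gain settings: "
      f"{points_vst * gain_settings:,} points "
      f"vs {points_benchtop * gain_settings:,} previously")
```

That is 7,500 times more characterisation data per gain setting, which is why the coverage improvement matters as much as the 200x speed gain quoted below.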
“Using the software-designed NI PXI vector signal transceiver and the NI WLAN Measurement Suite, we improved test speeds by more than 200 times compared to traditional rack-and-stack instruments while significantly improving test coverage.” – Doug Johnson, Qualcomm Atheros
A Platform Approach for Rapidly Changing Technology
If the vision of a trillion sensors per year by 2024 comes true, product complexity will grow and advance significantly faster than it does today. This trend will continue to impact test organisations as more frequent product redesigns dramatically affect the total cost of test. Companies whose test strategy features a modular approach that can accommodate the changing sensor market will reduce total cost of ownership and shorten redesign time to meet ever more stringent time-to-market demands.
View National Instruments’ four-part webcast series on demand to learn about common sensor and signal measurements. Engineers teach you the essentials of a particular measurement application, from theory to practice.
View the Sensor Fundamental Webcast Series
Learn more about NI CompactDAQ