Managing Big Data from an Analog World
November 18, 2015 | Chandran Nair, National Instruments | Estimated reading time: 3 minutes

There once was a time when hardware sampling rates, limited by the speed at which analog-to-digital conversion took place, physically restricted how much data was acquired. But advances in computing technology, including increasing microprocessor speed and hard-drive storage capacity, combined with decreasing costs for hardware and software, have provoked an unabated explosion of incoming data. Among the most interesting to the engineer and scientist is data derived from the physical world. This analog data, captured and digitized, is known as “big analog data.” It is collected from measurements of vibration, RF signals, temperature, pressure, sound, image, light, magnetism, voltage, and so on.
In the field of measurement applications, engineers and scientists collect vast amounts of data every minute. For every second that the Large Hadron Collider at the European Organization for Nuclear Research (CERN) runs an experiment, the instrument generates 40TB of data. For every 30 minutes that a Boeing jet engine runs, the system creates 10TB of operations information (Gantz, 2011).
In the age of big data, hardware is evidently no longer the limiting factor in acquisition applications; the management of acquired data is. How do we store data and make sense of it? How do we keep it secure? How do we future-proof it? These questions become compounded as systems evolve to become more complex and the amount of data required to describe those systems grows beyond comprehension. This inevitably results in longer project schedules and less efficient development. More advanced tools and smarter measurement systems will be essential to managing this explosion of data and to helping engineers make informed decisions faster.
For engineers, this means instrumentation must be smarter: sensors, measurement hardware, data buses, and application software need to work together to provide actionable data at the right time. The big data phenomenon adds new challenges to data analysis, search, integration, reporting, and system maintenance that must be met to keep pace with the exponential growth of data. And the sources of data are many. As a result, these challenges unique to big analog data have provoked three technology trends in the broad field of data acquisition.
Contextual Data Mining
The physical characteristics of some real-world phenomena prevent information from being gleaned unless acquisition rates are high enough, which makes small data sets an impossibility. Even when the characteristics of the measured phenomena would permit smaller data sets, those small sets often limit the accuracy of conclusions and predictions.
Consider a gold mine where only 20% of the gold is visible; the remaining 80% is in the dirt, where you can’t see it. Mining is required to realize the full value of the mine’s contents. This leads to the term “digital dirt,” meaning digitized data can have concealed value. Hence, data analytics and data mining are required to uncover insights that were previously hidden.
Data mining is the practice of using the contextual information saved along with data to search through and pare down large data sets into more manageable, applicable volumes. When raw data is stored alongside its original context, or “metadata,” it becomes easier to accumulate, locate, and later manipulate and understand. For example, examine a series of seemingly random integers: 5126838937. At first glance, it is impossible to make sense of this raw information. However, given context such as (512) 683-8937, the data is much easier to recognize and interpret as a phone number.
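The idea of storing raw data alongside its metadata, then using that metadata to pare a large set down to an applicable subset, can be sketched as follows. This is a minimal in-memory illustration, not any specific vendor API; all record and field names here are hypothetical.

```python
# Raw measurement values stored alongside descriptive metadata.
# Field names ("sensor", "unit", "channel", "calibrated") are illustrative.
measurements = [
    {"data": [0.12, 0.15, 0.11],
     "meta": {"sensor": "thermocouple", "unit": "degC",
              "channel": "ch0", "calibrated": "2015-06-01"}},
    {"data": [1.02, 0.98, 1.01],
     "meta": {"sensor": "accelerometer", "unit": "g",
              "channel": "ch1", "calibrated": "2015-09-15"}},
]

def find(records, **criteria):
    """Pare a large data set down to records whose metadata match all criteria."""
    return [r for r in records
            if all(r["meta"].get(k) == v for k, v in criteria.items())]

# Without context, the raw sample lists are indistinguishable;
# with metadata, a targeted query becomes trivial.
temps = find(measurements, sensor="thermocouple")
```

The same pattern scales to files on disk: as long as the context travels with the raw samples, post-processing software can filter by it instead of re-inspecting every value.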
Descriptive information about measurement data context provides the same benefits and can detail anything from sensor type, manufacturer, or calibration date for a given measurement channel to revision, designer, or model number for an overall component under test. In fact, the more context that is stored with raw data, the more effectively that data can be traced throughout the design life cycle, searched for or located, and correlated with other measurements in the future by dedicated data post-processing software.
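As a sketch of how richer context supports tracing and correlation, the snippet below groups measurement channels by the model number of the component under test. The record layout and model names are hypothetical, assumed only for illustration.

```python
from collections import defaultdict

# Hypothetical channel records: each carries context describing
# the sensor and the overall component under test.
channels = [
    {"channel": "ch0", "dut_model": "A-100", "sensor": "thermocouple"},
    {"channel": "ch1", "dut_model": "A-100", "sensor": "accelerometer"},
    {"channel": "ch2", "dut_model": "B-200", "sensor": "thermocouple"},
]

# Group channels by the component under test, so measurements taken
# at different times or stations can later be correlated per unit.
by_model = defaultdict(list)
for ch in channels:
    by_model[ch["dut_model"]].append(ch["channel"])
```

The more such fields are recorded up front (designer, revision, calibration date), the more axes along which data can later be searched and correlated.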
Editor's Note: This article originally appeared in the November 2015 issue of SMT Magazine.