Smart Factory Insights: Me and My Digital Twin
It would be wonderful to have a digital twin of myself, designed to take on all of the boring aspects of life, leaving me free to focus on what I enjoy doing. How much work would be needed to do this, given that the digital me needs to be developed, trained, and configured to make decisions and do things in the way that I would want? How can we ensure that the benefits from our digital twins outweigh the costs?
A fully functional digital twin involves more than it may initially seem. At first, we tend to think about access to information. To prepare my digital twin, I will need to compile information about myself: details of where I live, my utilities, where I bank, the cars I drive, contact lists of family, friends, and colleagues, security information, my likes and dislikes, access to social media accounts, etc. How about security? There is a great deal of trust to be considered when creating a digital twin, as there is scope for its use both for good and for evil. Unlike my physical self, my digital twin can be copied and cloned an unlimited number of times, then used by anyone for anything. Having said that, most of the information is “out there” already. It is really surprising how much personal information is willingly or unwittingly shared through internet-based services, especially social media.
Digital twin lesson number one: Create a secure environment for my digital self. As we baby-proof our homes for newborn humans, we need to baby-proof our digital environment as well.
Likewise, manufacturing digital twins must be protected with cybersecurity measures, as information includes details of production lines, machines, configurations, flows, capabilities, manpower, and other resources. This is intellectual property with commercial value.
More Than a Database
So far, my personal digital twin is just a database; I need more. As a person, I am connected to the world. Events happen, and as with most humans, I develop opinions linked to memories, food for thought that evolves as I address and solve problems and interact with other people. The manufacturing digital twin also needs to be connected. Things that we do and say are very tightly linked with the events that we experience. Without this link, my digital twin would be limited to endlessly repeating old facts from the original database, not considering any changes to the environment, and unaware of specific needs and constraints. In other words, irrelevant.
The IPC Connected Factory Exchange (CFX) is not going to help my personal digital twin, but it does fulfill the requirements of a manufacturing digital twin, providing real-time information that brings live visibility of every event and situation across the whole shop floor through a single interface and language. Wouldn’t communication be great if every human on the planet had at least one common language in which we could all communicate?
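To make the “single interface and language” idea more concrete, here is a minimal sketch in Python of how a digital twin service might consume shop-floor events expressed in one common JSON envelope. The field names, event name, and payload are illustrative assumptions only, not the actual IPC CFX message schema.

```python
import json

# A simplified, hypothetical event envelope, loosely inspired by the idea of
# one common message format for every machine on the shop floor. The field
# names here are illustrative, not the real IPC CFX schema.
SAMPLE_EVENT = json.dumps({
    "messageName": "UnitsProcessed",
    "source": "SMT-Line-1.Placer-2",
    "timestamp": "2021-04-01T08:15:30Z",
    "body": {"unitCount": 8, "result": "Passed"},
})

def handle_event(raw: str) -> None:
    """Decode one shop-floor event and update the twin's live view of the line."""
    event = json.loads(raw)
    name = event["messageName"]
    source = event["source"]
    if name == "UnitsProcessed":
        body = event["body"]
        print(f"{source}: {body['unitCount']} units processed ({body['result']})")
    else:
        print(f"{source}: unhandled event '{name}'")

if __name__ == "__main__":
    handle_event(SAMPLE_EVENT)
```

The point of the sketch is only that every machine speaks the same envelope, so one handler can keep the twin current regardless of vendor.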
With connections made, now we get to the tricky part. I would like my digital twin to not just exist, but to actually do stuff, specifically things I don’t want to have to do myself. Digital assistants can do this today but without any real intelligence beyond what they see about each of us in their database, albeit more than we are likely aware of.
For example, when I want to buy something, it is very satisfying to be able to choose the best item from all the choices offered. There are many products of the type that I need, all of which have different looks, specifications, and, of course, prices. I need the best-value item that gives me what I need in terms of performance, capability, and expected life; perhaps I want a little more on top of that to show off a little (we are only human, after all). There are likely to be several suppliers offering the same or very similar products that meet the criteria. As well as comparing prices, I should also consider the delivery cost, time, and reliability, the supplier rating, the after-sales service record, etc.
There is only so much attention span that my human brain can muster, especially when key information seems deliberately hidden. We cannot get into every detail of differentiation; I certainly have better things to do. My digital twin, however, could do it all and find the perfect solution. Millions of data exchanges across the internet cost virtually nothing, so I can be very happy that the choice “I” end up making is the very best one. For the manufacturing world, the new IPC-2551 digital twin standard provides the definition and structure for how all of this can be done, linking information and interoperability between digital twin solutions and bridging the once-separated worlds of the product, manufacturing, and lifecycle digital twin elements.
Two Types of Algorithms
In addition to all the data, however, my thought process needs to be coded into the digital twin, as there will be many conflicts, trade-offs, and compromises to consider during the decision-making process. Lower cost is good, but at the expense of quality? An algorithm, therefore, has to be applied to the digital twin data. As a software developer of more than 30 years, I find there are two types of algorithm to choose from.
My favorites are the heuristic-based algorithms, which model the thought processes of humans. Rules, often complex, are followed that determine calculations leading to a specific answer. The difference between the software and my limited biological approach is that the computer will follow all possible tracks, rather than being limited to those within my own attention span. The danger of this type of algorithm is that, unless it is written in a very clever way, it is harder to change the thought process based on new ideas or concepts. The benefit is that the results appear very quickly and are effective.
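As a rough illustration of what such a heuristic looks like in code, the sketch below scores the purchasing scenario above with fixed, hand-written rules. The offers, criteria, and weights are all invented for the example; the point is that the “thought process” lives in the rules themselves.

```python
# Hypothetical candidate offers; the weights encode my decision rules directly.
OFFERS = [
    {"name": "Offer A", "price": 250.0, "delivery_days": 2, "rating": 4.6, "expected_life_yrs": 5},
    {"name": "Offer B", "price": 199.0, "delivery_days": 10, "rating": 3.9, "expected_life_yrs": 3},
    {"name": "Offer C", "price": 280.0, "delivery_days": 1, "rating": 4.8, "expected_life_yrs": 7},
]

# Heuristic rules: lower price and faster delivery are good (negative weights),
# higher rating and longer expected life are good (positive weights).
WEIGHTS = {"price": -1.0, "delivery_days": -5.0, "rating": 20.0, "expected_life_yrs": 10.0}

def score(offer: dict) -> float:
    """Apply the fixed rules and return a single comparable score."""
    return sum(WEIGHTS[key] * offer[key] for key in WEIGHTS)

best = max(OFFERS, key=score)
print(best["name"], round(score(best), 1))
```

Changing how “I” decide means editing the weights or adding new rules, which is exactly where the rigidity of the heuristic approach shows.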
The second type of algorithm is a random mathematical model originally termed the “genetic algorithm.” The connection of facts, such as the order in which a process could be done, is laid out at random, the effectiveness measured, then the order changed and the effectiveness re-measured. How the changes are made varies, the original genetics-based idea being to divide them in the same way genes are shared from parents to a child—slice and dice, then try again, potentially billions of times. No matter how sophisticated genetic algorithms and the like become, the result takes time, increasing geometrically with the number of variables. The benefit is that, unlike the heuristic model, there are no assumptions; a solution that no one may ever have considered could be found to be the best. The downside is that it takes time—a long time—to come up with the best solution. One of my own heuristic machine-program optimization algorithms was once beaten by a genetic algorithm, which shaved a second or two off the machine’s run time. I did like to point out that the heuristic algorithm had taken five minutes, whereas the genetic algorithm was still going after five days, four days after production was supposed to have started. An “I’m bored” button then appeared to stop the genetic algorithm and take whatever had been the best result thus far.
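For contrast, here is a minimal, toy genetic-style search over the order in which a set of jobs could be run. The changeover cost matrix, population size, and generation count are arbitrary assumptions kept small so the sketch finishes in a moment; a real optimization would be far larger and, as noted above, far slower.

```python
import random

# Toy problem: find a job order that minimizes total changeover cost.
# The cost matrix is random and purely illustrative.
random.seed(42)
N_JOBS = 12
COST = [[0 if i == j else random.randint(1, 20) for j in range(N_JOBS)] for i in range(N_JOBS)]

def fitness(order):
    """The 'health' of a candidate: total changeover cost (lower is better)."""
    return sum(COST[a][b] for a, b in zip(order, order[1:]))

def crossover(p1, p2):
    """Order crossover: copy a slice from one parent, fill the rest from the other."""
    a, b = sorted(random.sample(range(N_JOBS), 2))
    child = [None] * N_JOBS
    child[a:b] = p1[a:b]
    fill = [job for job in p2 if job not in child]
    for i in range(N_JOBS):
        if child[i] is None:
            child[i] = fill.pop(0)
    return child

def mutate(order, rate=0.2):
    """Occasionally swap two jobs to keep exploring new orderings."""
    if random.random() < rate:
        i, j = random.sample(range(N_JOBS), 2)
        order[i], order[j] = order[j], order[i]
    return order

# Evolve: start from random orders, keep the better half, breed replacements.
population = [random.sample(range(N_JOBS), N_JOBS) for _ in range(40)]
for generation in range(200):
    population.sort(key=fitness)
    survivors = population[:20]
    children = [mutate(crossover(*random.sample(survivors, 2))) for _ in range(20)]
    population = survivors + children

best = min(population, key=fitness)
print("best order:", best, "cost:", fitness(best))
```

Nothing in the search encodes any opinion about what a good order looks like; it simply measures candidates and keeps breeding, which is both its strength and the reason it can run for days.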
The interesting thing about the genetic algorithm, however, is that the “health” of each potential solution discovered is qualified by a function that measures the value of the solution. The need for a defined measurement method brings similar limitations to those seen with the heuristic method; whether it can truly find new and original solutions is questionable. It is easier to change the ruleset of a genetic algorithm than the logic of a heuristic model. If a solution were able to automatically change its own ruleset, based on feedback about the real effectiveness of solutions over time, this would lead to the potential of actual “AI.” The human ability to change the constraints associated with a problem can be termed either intelligence or recklessness, depending on the influencing factors. As a digital twin can be used for good or evil, will we trust the AI to modify itself in a way that we assume is in our interest?
We see the trend of increasing amounts of data, more decisions to be made, more complexity, and more security concerns, so the next stage of intelligence may be a hybrid of the two algorithmic types, whereby simple cause-and-effect decisions are implemented as heuristic elements within an advanced genetic-algorithm type of approach. This can include building in “laws” to protect human interests.
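One way to picture that hybrid is an evaluation function in which the heuristic rules and the protective “laws” are coded directly, while an evolutionary search supplies the candidates. The sketch below is only a sketch under that assumption; every name, rule, and threshold is hypothetical.

```python
# A built-in "law" that the search is never allowed to trade away.
MAX_OPERATOR_HOURS = 8.0

def heuristic_penalties(plan: dict) -> float:
    """Simple cause-and-effect rules, evaluated cheaply for every candidate plan."""
    penalty = 0.0
    if plan["changeovers"] > 3:
        penalty += 50.0 * (plan["changeovers"] - 3)   # rule: avoid excessive changeovers
    if plan["wip_units"] > 100:
        penalty += 0.5 * (plan["wip_units"] - 100)    # rule: keep work-in-progress low
    return penalty

def hybrid_fitness(plan: dict) -> float:
    """Lower is better; breaking a 'law' makes a candidate unusable, not just costly."""
    if plan["operator_hours"] > MAX_OPERATOR_HOURS:
        return float("inf")                           # hard law, never a trade-off
    return plan["cycle_time_s"] + heuristic_penalties(plan)

candidate = {"changeovers": 5, "wip_units": 80, "operator_hours": 7.5, "cycle_time_s": 420.0}
print(hybrid_fitness(candidate))  # 420 + 100 = 520.0
```

The genetic-style search would then call a function like this to rank candidates, with the heuristics steering it quickly and the laws fencing off anything we would never accept.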
Saturation point reached; I don’t want to have to understand all of this “nerd talk,” being simply the user of my digital twin. I simply want it to do the work for me and just give me answers. Engineers and managers on the manufacturing shop floor will find their decisions augmented by the manufacturing digital twin, which provides relief from having to do all the data gathering, formatting, and mind-numbing analysis manually, resulting instead in a clear vision of what will happen should nothing be done, or if one or two things were to change, such as the introduction of a new product, material availability issues, or fluctuating customer demand.
Put away the abacus, notebook, calculator, Excel spreadsheet, or whatever tools you have historically chosen for number-crunching, and turn to the excellent digital twin solutions that exist within the modern IIoT-based MES solution. As software developers, we do the work, coding in the rules and methods, defining best practices, and creating the ontology that turns data into actionable value. Creating my personal digital twin would require far more work than I am prepared to put in, but when it comes to manufacturing software, there is an army of developers who have created, and continue to evolve, a singular solution applicable to all manufacturing environments, bringing true smart Industry 4.0 manufacturing to everyone.
If you are interested in exploring how a digital twin can benefit your factory, you can learn more here.
This column originally appeared in the April 2021 issue of SMT007 Magazine.