Hon Hai Research Institute Launches Traditional Chinese LLM With Reasoning Capabilities
March 13, 2025 | PRNewswire | Estimated reading time: 3 minutes
Hon Hai Research Institute announced today the launch of the first Traditional Chinese Large Language Model (LLM), setting another milestone in the development of Taiwan's AI technology with a more efficient and lower-cost model training method completed in just four weeks.
The institute, which is backed by Hon Hai Technology Group ("Foxconn") (TWSE:2317), the world's largest electronics manufacturer and leading technological solutions provider, said the LLM – code named FoxBrain – will be open sourced and shared publicly in the future. It was originally designed for applications used in the Group's internal systems, covering functions such as data analysis, decision support, document collaboration, mathematics, reasoning and problem solving, and code generation.
FoxBrain not only demonstrates powerful comprehension and reasoning capabilities but is also optimized for Taiwanese users' language style, showing excellent performance in mathematical and logical reasoning tests.
"In recent months, the deepening of reasoning capabilities and the efficient use of GPUs have gradually become the mainstream development in the field of AI. Our FoxBrain model adopted a very efficient training strategy, focusing on optimizing the training process rather than blindly accumulating computing power," said Dr. Yung-Hui Li, Director of the Artificial Intelligence Research Center at Hon Hai Research Institute. "Through carefully designed training methods and resource optimization, we have successfully built a local AI model with powerful reasoning capabilities."
The FoxBrain training process was powered by 120 NVIDIA H100 GPUs, scaled with NVIDIA Quantum-2 InfiniBand networking, and completed in about four weeks. Compared with reasoning models recently launched on the market, this more efficient, lower-cost training method sets a new milestone for the development of Taiwan's AI technology.
FoxBrain is based on the Meta Llama 3.1 architecture with 70B parameters. In most categories of the TMMLU+ test dataset, it outperforms Llama-3-Taiwan-70B, a model of the same scale, particularly excelling in mathematics and logical reasoning. The following are the technical specifications and training strategies for FoxBrain:
- Established data augmentation methods and quality assessment for 24 topic categories through proprietary technology, generating 98B tokens of high-quality pre-training data for Traditional Chinese
- Context window length: 128K tokens
- Utilized 120 NVIDIA H100 GPUs for training, with a total computational cost of 2,688 GPU days
- Employed multi-node parallel training architecture to ensure high performance and stability
- Used a unique Adaptive Reasoning Reflection technique to train the model in autonomous reasoning
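The published figures above can be cross-checked with simple arithmetic: dividing the stated GPU-day budget by the GPU count gives the implied wall-clock training time, which lines up with the "about four weeks" quoted elsewhere in the announcement. A minimal sketch (the overhead interpretation in the final comment is our inference, not a claim from the release):

```python
# Back-of-the-envelope check of the published training figures.
# Both inputs come directly from the press release:
TOTAL_GPU_DAYS = 2688   # total computational cost
NUM_GPUS = 120          # NVIDIA H100 GPUs used

wall_clock_days = TOTAL_GPU_DAYS / NUM_GPUS  # days of continuous training
wall_clock_weeks = wall_clock_days / 7

print(f"{wall_clock_days:.1f} days ~ {wall_clock_weeks:.1f} weeks")
# 2,688 / 120 = 22.4 days, roughly 3.2 weeks of pure compute --
# consistent with the stated "about four weeks" once non-compute
# time (data staging, checkpointing, restarts) is presumably included.
```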
In test results, FoxBrain showed comprehensive improvements in mathematics compared to the base Meta Llama 3.1 model. It achieved significant progress in mathematical tests compared to Taiwan Llama, currently the best Traditional Chinese large model, and surpassed Meta's current models of the same class in mathematical reasoning ability. While there is still a slight gap with DeepSeek's distillation model, its performance is already very close to world-leading standards.
FoxBrain's development – from data collection, cleaning and augmentation, to Continual Pre-Training, Supervised Finetuning, RLAIF, and Adaptive Reasoning Reflection – was accomplished step by step through independent research, ultimately achieving benefits approaching world-class AI models despite limited computational resources. This large language model research demonstrates that Taiwan's technology talent can compete with international counterparts in the AI model field.
Although FoxBrain was originally designed for internal group applications, in the future, the Group will continue to collaborate with technology partners to expand FoxBrain's applications, share its open-source information, and promote AI in manufacturing, supply chain management, and intelligent decision-making.
During model training, NVIDIA provided support through the Taipei-1 Supercomputer and technical consultation, enabling Hon Hai Research Institute to successfully complete the model pre-training with NVIDIA NeMo. FoxBrain will also become an important engine to drive the upgrade of Foxconn's three major platforms: Smart Manufacturing, Smart EV, and Smart City.