In a recent blog post we introduced the topic of EDA (Equipment Data Acquisition) standards testing and subdivided the domain into three parts:
Compliance testing – does the equipment adhere to the specifications described in the SEMI Standards?
Performance and stability testing – does the equipment meet the end users’ performance and availability specifications?
Equipment metadata model conformance testing – does the equipment model delivered with the interface represent the tool structure and content anticipated by the end customer?
Today’s post deals with performance and stability testing in greater detail.
In our discussions with EDA users (both OEM implementers and fab end users) about EDA testing, they all acknowledge the need for compliance testing. However, the vast majority have said, “If you can help me automate my performance testing, I would be able to save a huge amount of time.” Most thought they could reduce testing time from several weeks to just a couple of days.
Everyone has different ideas about what should be included in performance testing of their EDA software, but most agree on the basics: does the equipment meet the end users’ performance and availability specifications in terms of data sampling intervals, overall data volume transmitted, size and number of DCPs (data collection plans) supported, demands on computing/network resources, and up-time? Users also need to know whether the software will support the range of application clients expected in a production environment.
EDA users want to know the sheer volume of data that can be collected.
ISMI has reported in public forums that IC makers expect EDA to achieve data rates of 50+ variables per chamber at rates up to 10 Hz. In EDA specifications, IC makers have requested the ability to gather 1,000 to 2,000 parameters using data collection rates from 5 to 20 Hz, which at the upper end translates to 40,000 values per second.
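As a quick sanity check, the worst-case figure above works out as simple arithmetic (a back-of-the-envelope calculation, not part of any specification):

```python
# Back-of-the-envelope data-rate check for the figures above.
params_max = 2_000     # upper end of requested parameter count
rate_max_hz = 20       # upper end of requested collection rate (samples/s)

values_per_second = params_max * rate_max_hz
print(values_per_second)  # 40000 values per second at the high end
```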
These rates are easily achievable with today’s computing platform technology, but users also want to know the upper limit. In other words, at what point does the ability to collect data break down?
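One common way to find that upper limit is a simple ramp test: keep increasing the requested data rate until the delivered throughput stops keeping up. The sketch below is illustrative only; the `measure_throughput` callable is a stand-in for a real EDA client driving the equipment interface, and here it is simulated:

```python
def find_breaking_point(measure_throughput, start_rate=1_000, factor=2, max_rate=10_000_000):
    """Double the requested rate (values/s) until the interface can no longer keep up.

    measure_throughput(requested) -> values/s actually delivered.
    Returns the last requested rate that was sustained within tolerance.
    """
    rate = start_rate
    sustained = 0
    while rate <= max_rate:
        delivered = measure_throughput(rate)
        if delivered < 0.95 * rate:   # more than 5% shortfall counts as "broken down"
            break
        sustained = rate
        rate *= factor
    return sustained

# Simulated interface that saturates at 500,000 values/s (stand-in for a real tool).
simulated = lambda requested: min(requested, 500_000)
print(find_breaking_point(simulated))
```

A geometric ramp like this finds the rough breaking point quickly; a finer linear sweep around that point can then pin it down precisely.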
EDA users want to verify not only that data arrives at the specified rates, but also that the values and timestamps received at those rates are accurate.
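The timestamp side of that check is straightforward to automate: compare consecutive sample timestamps against the nominal collection interval. A minimal sketch, using synthetic data in place of a real trace report:

```python
def check_sample_intervals(timestamps, nominal_hz, tolerance=0.01):
    """Flag intervals that deviate from the nominal rate.

    timestamps: sample times in seconds; nominal_hz: requested collection rate.
    Returns indices whose preceding interval is outside nominal * (1 +/- tolerance).
    """
    nominal = 1.0 / nominal_hz
    bad = []
    for i in range(1, len(timestamps)):
        delta = timestamps[i] - timestamps[i - 1]
        if abs(delta - nominal) > tolerance * nominal:
            bad.append(i)
    return bad

# 10 Hz samples with one late report; the late sample flags both adjacent intervals.
ts = [0.0, 0.1, 0.2, 0.35, 0.4, 0.5]
print(check_sample_intervals(ts, nominal_hz=10))
```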
EDA users want to know how different data collection rates and volumes will affect system resources. Will memory usage be too high? How will different collection rates affect CPU usage? Is the network bandwidth sufficient for gathering the required data at the required speeds while still maintaining high data quality?
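The bandwidth question, at least, can be bounded with simple arithmetic before any testing starts. The per-value payload size below is an illustrative assumption (the on-the-wire cost of EDA's XML/SOAP encoding varies by implementation), not a measured figure:

```python
values_per_second = 40_000   # worst-case rate from the figures above
bytes_per_value = 50         # ASSUMED average wire cost per value, incl. XML overhead
bits_per_byte = 8

required_mbps = values_per_second * bytes_per_value * bits_per_byte / 1e6
print(required_mbps)  # 16.0 Mbit/s under these assumptions
```

Even under generous overhead assumptions the raw bandwidth is modest by modern network standards, which is why the harder questions tend to be CPU and memory behavior under sustained load.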
In a previous post, we mentioned that Cimetrix has automated the EDA compliance evaluation procedures, and we are now designing the performance testing components of this tester. The ISMI EDA Evaluation Method describes some basic performance tests in Appendix B; these will be included in the feature set of the EDA tester now under development. In addition, we are designing ways to help evaluate the data volume, data quality, and resource usage of a given piece of equipment.
If you want to know more about EDA testing, discuss your specific needs, or provide input on what you would like to see included in performance testing, contact Cimetrix for a demonstration of this exciting new capability!