Semiconductor Industry News, Trends, Technology, and SEMI Standards Updates

Derek Lindsey: Product Manager

Derek Lindsey has been an employee of Cimetrix for over 22 years. For 21 of those years, Derek was a member of the product development team and was involved in the development of several Cimetrix products. He also has extensive experience helping Cimetrix customers implement their tool control solutions using our products. He recently joined the product management team to add technical expertise to product development. Derek has a bachelor’s degree in computer science from Brigham Young University.

Recent Posts

Designing Recipes in CCF

Posted by Derek Lindsey: Product Manager

Jan 24, 2017 11:00:00 AM

Anyone above a certain age will be able to tell you what you get when you combine two all-beef patties, special sauce, lettuce, cheese, pickles, onions – on a sesame seed bun. There are many who would argue that what sets a Big Mac apart from other burgers – and has made it one of the best-selling products of all time – is the special sauce.

In a March 2016 blog post, Cimetrix listed eleven points to be taken into consideration when starting an equipment control application using CIMControlFramework (CCF). One of the things to consider is how you want to provide process and path information through the tool using recipes. This blog post delves a little deeper into the recipe aspect of equipment control applications.

In CCF, recipes are either process recipes or sequence recipes.

A process recipe contains the instructions to be carried out by a particular process module. These instructions can range from temperature settings to the types of gas to flow. The most important aspect of any tool control application is allowing the tool manufacturer to do what they do best: perform their process better than anyone else in the world. The process recipe is where tool manufacturers add their special sauce to the wafer. CCF provides a sample process recipe implementation as well as a very simple process recipe editor. Since recipes are generally custom for each tool manufacturer, CCF application developers usually want to customize the contents of the process recipe.
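
To make this concrete, here is a minimal sketch of what a custom process recipe might look like in C#. The type and property names are illustrative only; the actual CCF recipe classes will differ.

```csharp
using System.Collections.Generic;

// Hypothetical process recipe types; names are illustrative, not the CCF API.
public class ProcessStep
{
    public string Name { get; set; }
    public double TemperatureC { get; set; }     // chamber temperature setpoint
    public string GasType { get; set; }          // e.g., "N2" or "Ar"
    public double GasFlowSccm { get; set; }      // gas flow setpoint in sccm
    public double DurationSeconds { get; set; }  // how long to hold this step
}

public class ProcessRecipe
{
    public string Name { get; set; }
    public List<ProcessStep> Steps { get; } = new List<ProcessStep>();
}
```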

If the processing of material is the special sauce, the rest of the application, moving the wafer through the tool, is a necessary evil. To assist in moving material through the tool, CCF also provides a sequence recipe. A sequence recipe determines which process recipes to run, the modules at which to run them, and the order in which to run them. CCF provides a sample sequence recipe editor that can be used as-is for creating sequence recipes or customized for each tool manufacturer’s needs.
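
A sequence recipe can be thought of as an ordered list of module/recipe pairs. The sketch below is illustrative only and does not reflect the actual CCF sequence recipe classes.

```csharp
using System.Collections.Generic;

// Hypothetical sequence recipe: an ordered list of (module, process recipe) pairs.
public class SequenceStep
{
    public string ModuleName { get; set; }         // e.g., "Aligner", "PM1", "Cooler"
    public string ProcessRecipeName { get; set; }  // process recipe to run at that module
}

public class SequenceRecipe
{
    public string Name { get; set; }
    // Steps execute in order, e.g., Aligner -> PM1 -> Cooler -> back to the carrier.
    public List<SequenceStep> Steps { get; } = new List<SequenceStep>();
}
```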

Both process and sequence recipes can be created on the tool or downloaded from a factory host. CCF provides a handler that receives recipes from the host and stores them in the Recipe Server. Regardless of where the recipes are created, CCF’s Recipe Server stores the recipes locally and passes them to the scheduler when a job is to be run. The Recipe Server allows recipes to be stored as Engineering recipes while they are being finalized. They can then be promoted to Production recipes for use in a production environment.
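
The lifecycle described above might be captured by an interface like the following. This is a sketch of the concept only; the method names are hypothetical and not the actual Recipe Server API.

```csharp
// Hypothetical Recipe Server surface illustrating the Engineering -> Production flow.
public enum RecipeState { Engineering, Production }

public interface IRecipeStore
{
    // Store a recipe created on the tool or downloaded from the factory host.
    void Store(string name, byte[] body, RecipeState state);

    // Promote a finalized Engineering recipe for production use.
    void Promote(string name);

    // Retrieve the recipe to hand to the scheduler when a job starts.
    byte[] Get(string name, RecipeState state);
}
```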

By making use of recipes in CCF, you can ensure that your special sauce is applied to material processing to help make your tool one of the best-selling in history.

 

Topics: CIMControlFramework

Create a Scheduler Using CCF

Posted by Derek Lindsey: Product Manager

Dec 14, 2016 11:30:00 AM

How do you eat a whale? One bite at a time. 

In a March 2016 blog post entitled CIMControlFramework Work Breakdown, Cimetrix listed eleven points to be taken into consideration when starting an equipment control application using CIMControlFramework (CCF). One of the things to consider is how you want to control the material moving through the tool – or scheduling of material. This blog post delves a little deeper into the scheduling aspect of equipment control applications.

Doctoral dissertations and entire books have been written on scheduler theory. Because of the sheer volume of information available regarding scheduling and scheduling theory, the topic can come across as a little (or a lot) intimidating. CCF aims to take the scare factor out of scheduling and allow equipment control application developers to fully control the movement of material through their tool.

CCF provides a framework for a reactive, rule-based scheduler. As the application developer, guided by your customers’ needs, you decide which decisions are important when creating your scheduler. One of the first things to do when developing a scheduler is to ask questions that help you determine the scheduling rules. Some questions you may ask:

  • What is the most important thing I am attempting to accomplish with my scheduler?
    • Is throughput the most important?
    • Is path predictability most important?
  • How can I ensure that when I pick up material, there is a destination available to place it?
  • If two components need a robot to take action (pick up or place material), which action takes precedence?
  • Do the process chambers need to be prepared before receiving material to process?
  • What is the wafer flow scenario (is there a specific order that material must follow)?
    • Does the material need to be aligned before processing?
    • Does the material need to be acclimatized before processing?
    • Does the material need to be cooled before returning to the carrier?
  • Are there any maintenance tasks that have to be performed periodically in the tool that have to be accounted for in scheduling?

This is by no means an exhaustive list of questions, but these are the types of questions that need to be answered when creating your scheduler. 

The scheduling framework provided by CCF allows you to translate these rules into C# conditional statements. No proprietary scripting languages need to be learned. No specific configuration training is required. C# developers can use industry-standard IDEs to put these rules into scheduling practice.
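
For example, two of the questions above might reduce to conditionals like these. The class and member names are hypothetical; the actual rule hooks come from the CCF scheduling framework.

```csharp
// A minimal sketch of reactive scheduling rules expressed as plain C# conditionals.
public class SchedulerRules
{
    // Rule: never pick material unless a destination is available to place it.
    public bool CanPickFromLoadPort(bool robotIdle, bool alignerEmpty)
    {
        return robotIdle && alignerEmpty;
    }

    // Rule: when two moves compete for the robot, unload a finished process
    // module first so the chamber frees up for the next wafer (throughput first).
    public string NextRobotAction(bool processModuleDone, bool waferWaiting)
    {
        if (processModuleDone) return "UnloadProcessModule";
        if (waferWaiting) return "LoadProcessModule";
        return "Idle";
    }
}
```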

Once the scheduling rules are determined, it can still be intimidating to know how to start creating your scheduler. Cimetrix provides a few basic, as well as more advanced, scheduling labs that fully explain how to translate your rules into a functional scheduler. These labs can be completed in as little as a few hours. The labs explain the scheduling theory used by Cimetrix and allow users to create functional schedulers in a short amount of time. Many CCF users can create a working scheduler in one week.

Cimetrix also provides complete working examples of atmospheric and vacuum schedulers as part of CCF. Another lab provided by Cimetrix clearly describes how to start with one of these working schedulers and modify it to suit your scheduling needs.

The CCF scheduling framework allows the software to be hardware agnostic. In other words, it is not tightly coupled to device drivers. This allows tool manufacturers to change out hardware without having to make scheduler changes to support the new hardware.
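
The decoupling works roughly like this: the scheduler talks only to a device abstraction, and each driver implements that abstraction. The interface below is illustrative, not the actual CCF Equipment layer.

```csharp
// Sketch of hardware-agnostic scheduling: the scheduler depends on an
// abstraction, and vendor-specific drivers implement it.
public interface ITransferRobot
{
    void Pick(string station, int slot);
    void Place(string station, int slot);
}

// Swapping robot vendors means writing a new ITransferRobot implementation;
// the scheduler code itself does not change.
public class VendorXRobot : ITransferRobot
{
    public void Pick(string station, int slot)  { /* vendor-specific commands */ }
    public void Place(string station, int slot) { /* vendor-specific commands */ }
}
```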

Although scheduling may seem like an intimidating whale to eat, CCF helps break the tasks down into bite-sized chunks.

 

Topics: CIMControlFramework

EDA Metadata Conformance Testing

Posted by Derek Lindsey: Product Manager

Nov 15, 2016 11:00:00 AM

In a recent blog posting we introduced the topic of EDA (Equipment Data Acquisition) standards testing and sub-divided the domain into three parts:

  • Compliance testing – does the equipment adhere to the specifications described in the SEMI Standards?

  • Performance and stability testing – does the equipment meet the end users’ performance and availability specifications?

  • Equipment metadata model conformance testing – does the equipment model delivered with the interface represent the tool structure and content anticipated by the end customer?

Today’s post deals with the equipment metadata model conformance testing in greater detail.

The impetus for the metadata conformance requirement is SEMI Standard E164 – Specification for EDA Common Metadata. Although this standard is not part of the original core suite of EDA standards, it is now being required by GLOBALFOUNDRIES and a number of other major semiconductor manufacturers on EDA-enabled equipment.

The purpose of the standard “is to promote commonality among implementations by defining common representations and conventions of equipment metadata based on SEMI E125.” (Section 1.1 of E164)

In other words, conformance to E164 requires a consistent implementation of E125. All state machines required by the GEM300 standards must be implemented and use the same names for required events, parameters, state names and transitions. It requires that all process modules implement the E157 Module Processing state machine using specified names. As a result, E164 ensures a high level of implementation commonality across all equipment types. This commonality enables better automation of data collection processes across the fab, driving major increases in engineering efficiency. In summary, E164 is to EDA what GEM was to SECS-II.

Currently, the only E164 conformance tester is the Metadata Conformance Analyzer (MCA), which was commissioned by Sematech and implemented by NIST (the National Institute of Standards and Technology). In our discussions with potential users of an EDA test tool, most clients agree that the sooner a replacement for MCA is available, the happier they will be.

In a previous post, we mentioned that Cimetrix has automated the EDA compliance evaluation procedures. We are also in the process of designing the performance testing components of this tester. The plan is to also create an E164 conformance tester that will replace MCA.

If you want to know more about EDA testing, discuss your specific needs, or provide input on what you would like to see included in E164 conformance testing, contact Cimetrix for a demonstration of this exciting new capability!

 

Topics: Interface A, EDA

EDA Performance Testing

Posted by Derek Lindsey: Product Manager

Nov 1, 2016 1:00:00 PM

In a recent blog posting we introduced the topic of EDA (Equipment Data Acquisition) standards testing and sub-divided the domain into three parts:

  • Compliance testing – does the equipment adhere to the specifications described in the SEMI Standards?

  • Performance and stability testing – does the equipment meet the end users’ performance and availability specifications?

  • Equipment metadata model conformance testing – does the equipment model delivered with the interface represent the tool structure and content anticipated by the end customer?

Today’s post deals with performance and stability testing in greater detail.

In our discussions with EDA users (both OEM implementers and fab end users) about EDA testing, they all acknowledge the need for compliance testing. However, the vast majority have said, “If you can help me automate my performance testing, I would be able to save a huge amount of time.” Most thought they could reduce testing time from several weeks to just a couple of days.

Everyone has different ideas about what should be included in performance testing of their EDA software. Most agree, however, that they need to test whether the equipment meets the end users’ performance and availability specifications in terms of data sampling intervals, overall data volume transmitted, size and number of DCPs (data collection plans) supported, demands on computing and network resources, and up-time. They also need to know whether the software will support the range of application clients expected in a production environment.

Data Volume

EDA users want to know the sheer volume of data that can be collected.

ISMI has reported in public forums that IC makers expect EDA to achieve data rates of 50+ variables per chamber at rates up to 10 Hz. In EDA specifications, IC makers have requested the ability to gather 1,000 to 2,000 parameters at data collection rates from 5 to 20 Hz, which at the upper end translates to 40,000 values per second (2,000 parameters × 20 Hz).

These rates are easily achievable with today’s computing platform technology, but users also want to know the upper limit. In other words, at what point does the ability to collect data break down?

Data Quality

EDA users want to know that data arrives at the specified rates and that the values and timestamps received at those rates are accurate.

Resource Usage

EDA users want to know how different data collection rates and volumes will affect system resources. Will memory usage be too high? How will different collection rates affect CPU usage? Is the network bandwidth sufficient to gather the required data at the required speeds while still maintaining high data quality?

In a previous post, we mentioned that Cimetrix has automated the EDA compliance evaluation procedures. We are also in the process of designing the performance testing components of this tester. The ISMI EDA Evaluation Method discusses some basic performance testing in Appendix B; performing these tests will be included in the feature set of the EDA tester now under development. In addition, we are designing ways to help evaluate the data volume, data quality, and resource usage of a given piece of equipment.

If you want to know more about EDA testing, discuss your specific needs, or provide input on what you would like to see included in performance testing, contact Cimetrix for a demonstration of this exciting new capability!

 

Topics: Interface A, EDA

XP is Dead, It’s Time to Move On

Posted by Derek Lindsey: Product Manager

May 19, 2016 1:00:00 PM

When my daughter turned one year old, she got a very soft blanket as a birthday present. She loved that blanket and would take it everywhere with her. She couldn’t/wouldn’t go to sleep at night without it. When she got old enough to talk, she called it her special blanket or “spesh.” Needless to say, after many years of toting that blanket around, it started to wear out – in fact, it started getting downright nasty. She adamantly refused to part with it even though it was just a rag with little redeeming value.

A couple of years ago, Microsoft made the following announcement: “After 12 years, support for Windows XP ended April 8, 2014. There will be no more security updates or technical support for the Windows XP operating system. It is very important that customers and partners migrate to a modern operating system.”

In the immortal words of Dr. Leonard “Bones” McCoy from Star Trek, “It’s dead, Jim!”

Many arguments have been proffered on both sides as to why users should stay with or move away from XP. Windows XP was first introduced in 2001. That makes the operating system 15 years old — an eternity in computer years. The main argument I see for upgrading from XP is that it is impossible to keep up with advances to the .NET framework and remain on the old operating system. By staying with XP, you are missing out on new features and technologies. These features include taking advantage of better hardware integration for improved system performance and being able to use 64-bit applications and memory space.

Since Microsoft no longer supports XP and no longer provides security updates for the OS, staying with XP is a security risk. Any security holes that have been discovered since Microsoft withdrew support have been ruthlessly targeted.

To come full circle, my daughter finally did give up the little rag that she had left of the blanket. I don’t remember what ultimately made her give it up. She is now 18 and a few months ago, we came across that small piece of her special little blanket that we had stored away. The rag brought back good memories, but we were both glad it had been retired. Isn’t it time to do the same with XP?

Topics: Microsoft, Software

CIMControlFramework Dynamic Model Creation

Posted by Derek Lindsey: Product Manager

Apr 14, 2016 1:00:00 PM

Have you ever watched one of those cooking shows where the chef spends a lot of time whipping up the ingredients to some elaborate dish, and, when it comes time to put the dish in the oven to bake, there is already a finished one in there? If only the real world worked that way. Sometimes it would be nice to be able to go to the oven and have a delicious meal already waiting for you.

The Cimetrix CIMControlFramework™ (CCF) product is unique among Cimetrix products in that it not only provides source code, but also combines several other Cimetrix products (CIMConnect, CIM300, and CIMPortal™ Plus) and takes full advantage of all the features provided by each product.

One of the features of CIMPortal Plus that CCF uses is the concept of an equipment model. The equipment model describes the data that your equipment provides through Interface A. The tool hierarchy is modeled along with all of the parameters, events, and exceptions published by the tool. It used to be that CCF users had to manually create the tool hierarchy in their base equipment model. CCF would then populate the model with the parameters, events, and exceptions. If the tool hierarchy changed, the base model had to be modified, which made changing the tool configuration much more difficult.

Starting with the CCF 4.0 release, CCF installs a base equipment model that is common to all equipment. Generally, CCF users will not need to modify the base model. CCF takes advantage of the modeling API provided by CIMPortal Plus to dynamically add hierarchy nodes to the base model depending on the components that are created in CCF. This new feature makes it easy to change the configuration of the CCF tool because the user does not have to modify the base model and redeploy the package to be able to run CCF.
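
Conceptually, the dynamic build-out looks something like the sketch below. The interface and method names are hypothetical stand-ins, not the actual CIMPortal Plus modeling API.

```csharp
// Illustrative sketch: as CCF components are constructed, matching nodes
// are added under the common base model.
public interface IEquipmentModel
{
    void AddNode(string parentPath, string nodeName);
    void AddParameter(string nodePath, string parameterName, System.Type valueType);
}

public static class ModelBuilder
{
    // Called for each process module created in CCF at startup.
    public static void AddProcessModule(IEquipmentModel model, string pmName)
    {
        model.AddNode("Equipment/Modules", pmName);
        model.AddParameter("Equipment/Modules/" + pmName, "Temperature", typeof(double));
    }
}
```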

The dynamically created model is also compliant with the SEMI E164 Common Metadata standard. This compliance is possible because of the dynamic nature of model creation. The required elements of E164 are added to the equipment model dynamically during the startup of Tool Supervisor.

Having a dynamically created Interface A model that exactly matches your equipment structure and is guaranteed to be E164-compliant without having to do any extra work is similar to going to the oven and finding a delicious dish already cooked and waiting for you.

Topics: EDA, CIMControlFramework, Product Information, Software

CIMControlFramework Work Breakdown

Posted by Derek Lindsey: Product Manager

Mar 15, 2016 1:00:00 PM

“A journey of a thousand miles begins with a single step.” – Lao Tzu

“Watch out for that first step Mac, it’s a lulu!” – Bugs Bunny

These quotes by the great philosophers Lao Tzu and Bugs Bunny have more in common than would appear at first glance. At the beginning of a journey, you face the unknown. There is excitement that it could be a great journey, but the unknown may also make that first step the hardest to take. If you haven’t put in the preparation for a successful journey, that first step might be a lulu.

Similarly, when starting a new equipment control application, there is excitement about the great end product, but also some uncertainty about the best place to start. CIMControlFramework (CCF) offers a great training program to get you started and many building blocks to help you create a first-class equipment control application. Even with these great starting tools, many users still have the question, “Where do I go from here?”

The first step is to create a work breakdown of what it takes to create a successful equipment control application. There will obviously be tasks that are unique to each equipment control application, but most applications have some common tasks or epic user stories that have to be completed during the project. The order in which these stories are completed may depend on milestones and expectations for when they are accomplished, but they generally all need to be completed during the project.

  • Integrate Devices – CCF provides an Equipment layer with abstractions of the most commonly used devices. Integrating these devices into CCF only requires implementing the abstract interface (see the sketch after this list).

  • Material Movement Through the Tool – CCF provides a flexible scheduler with complete working examples of different types of scheduling that could be done.

  • Implement the Process Module – CCF provides a process module interface that allows the rest of CCF to communicate with your process module – your secret sauce.

  • Create an Operator Interface (OI) – CCF provides an OI framework that allows commands to be sent and updates to be made. It provides some default screens that use this framework and also allows custom OI screens to be inserted and used.

  • Simulation – CCF provides a simulator that can be used in place of real hardware. The simulator can be used to deliver and remove material, perform robot moves, and perform simulated I/O. This is invaluable for continuing development before the hardware is ready or when developers have limited tool time.

  • Recipes (Process Recipes and Execution) – CCF provides a recipe manager for passing recipes through the tool. The default recipe can be used or custom recipes can be added.

  • I/O – CCF provides ASCII serial drivers and other common I/O providers. Custom I/O providers can also be included in CCF.

  • Data Collection and Storage – Deciding up front what data to store and what storage medium to use is recommended.

  • Factory Automation – CCF provides a fully integrated GEM, GEM300, and EDA implementation.

  • Diagnostics and Testing – The CCF logging package is a fantastic tool for debugging your application both on the tool and remotely.

  • Errors and Recovery – CCF provides an Alarms package for signaling of and recovery from error conditions.
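
As referenced in the Integrate Devices item above, device integration follows an implement-the-abstraction pattern. The base class and names below are illustrative, not the actual CCF Equipment layer API.

```csharp
// Minimal sketch of integrating a device by implementing an abstract interface.
public abstract class Gauge
{
    public abstract double ReadPressure();   // device-specific read
}

public class VendorYGauge : Gauge
{
    // A real implementation would talk to the hardware, for example over
    // one of the ASCII serial drivers mentioned above.
    public override double ReadPressure()
    {
        return 1.0e-6;   // placeholder value for the sketch
    }
}
```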

By going through CCF training and creating a work breakdown of the tasks that need to be done for your equipment control application, you can ensure that your first step will be the foundation of a successful journey.

Topics: CIMControlFramework, Product Information, Software

Software Interfaces and API Method Signatures Should Remain Consistent During a Product's Lifecycle

Posted by Derek Lindsey: Product Manager

Jan 28, 2016 1:07:00 PM

I recently read The Martian by Andy Weir. Since this information comes out on the first page of the book, I don’t think I’m spoiling too much to say that it is the story of an astronaut, Mark Watney, who is lost in a dust storm during a mission to Mars. He is presumed dead by his crewmates and abandoned on the planet. Of course he is not dead, and he has to use training, skill, ingenuity, and luck to survive long enough to be rescued. Several times throughout the adventure, he has to connect life-supporting utilities, tanks, airlocks, and vehicles together using the connecting valves supplied on each component. Watney says, “I’ve said this many times before, but: Hurray for standardized valve systems!” This is obviously a work of fiction, but what would have happened if he had tried to attach a holding tank to the ascent vehicle and the valves had changed between versions?

Software customers should have the same expectation as Mark Watney: the valves don’t change during the mission. In the case of software, we aren’t talking about physical valves, but about software interfaces and API method signatures. In a real sense, the consistency of these software signatures is as mission critical as the standardized valve connections were for the astronaut in The Martian. Changing the method signatures, at the very least, requires users of the software to rebuild their applications. Often, such changes require software users to requalify their entire tool. This places an undue burden on the users of the software. Software users should be able to reasonably expect that the interfaces and API remain constant through the life of the mission (i.e., within a major version of the software, including minor releases and patches).

A side note on this topic: if Cimetrix product management determines that a piece of software has a bug or does not conform to the SEMI standards on which our products are based, changes will be made to correct the problem. Similarly, if NASA determined that one of their connectors did not conform to the spec, they would immediately resolve the issue for the item that was out of spec.
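
One common way to honor this guarantee is to add overloads rather than alter existing signatures. The class and method below are hypothetical, purely to illustrate the pattern.

```csharp
// Sketch: evolving an API without breaking existing callers.
public class RecipeService
{
    // Original signature: left untouched for the life of the product version,
    // so existing callers keep compiling and behaving the same way.
    public void Upload(string name)
    {
        Upload(name, validate: true);
    }

    // New capability arrives as an overload rather than a signature change.
    public void Upload(string name, bool validate)
    {
        // ... store the recipe, optionally validating it first ...
    }
}
```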

The Cimetrix release versioning process (see our January 14, 2016 blog) allows Cimetrix personnel and Cimetrix software users to be aware of what backward compatibility guarantees are made for a specific version of Cimetrix software.

We would like our software users to be able to say, “Hurray for compatible software versions!”

Topics: Semiconductor Industry, Software
