Wednesday, October 7, 2009
Joy of doing ASIC verification!!!
Verification is often treated as the step-child of design. A decade back, verification was considered a less critical task than design by some companies, and freshers were often pushed into verification. It's not surprising, then, that most verification engineers want to be designers. But now verification is a more lucrative career option than design, and many experienced people hold on to verification without moving over to design. It is generally estimated that 70% of the ASIC design cycle is spent on functional verification, and the ratio of verification engineers to design engineers is approximately 3:1. Job switching is easier for verification engineers than for designers, provided they have the right skill set. Advancements in verification also happen at a much faster rate than in design.
Earlier, the verification job was looked down upon, as it is the design that gets taped out and moves into mass production, not the testbench. But verification requires a lot more effort and skill: for example, to test a 100-line state machine we need to develop a testbench of at least 500 lines of code and draft a test plan that covers all the possible scenarios. VIP development companies get their revenue from their testbenches, which are licensed and shipped as products.
A reasonably experienced person will know that building a reusable system-level verification environment and verifying the design without any post-silicon bugs is more difficult than adding glue logic to the design.
Do you still believe verification is a less critical task and requires less expertise than design?
Sunday, September 27, 2009
Scoreboard architecture !!!
How would you implement the following requirements when designing a scoreboard?
1) The scoreboard should handle packets that are transformed by the DUT (the expected output packet is derived from the input packet).
2) The scoreboard should be able to handle packet drops.
Just extend the VMM data stream scoreboard and implement a few virtual methods such as transform(), quick_compare() and compare(). Use the expect_with_losses() method for requirement (2); requirement (1) can be implemented easily with the transform() method.
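To make the two ideas concrete, here is a minimal, VMM-free sketch: deriving expected packets from input packets (the role transform() plays in the VMM scoreboard) and tolerating a bounded number of dropped packets (the role expect_with_losses() plays). All names here (pkt, my_scoreboard, from_input, from_output, max_losses) are invented for illustration; this is not the vmm_sb_ds API.

// Toy sketch only -- NOT the vmm_sb_ds API.
class pkt;
  bit [7:0] payload;
  function bit compare(pkt other);            // plays the role of compare()/quick_compare()
    return (payload == other.payload);
  endfunction
endclass

class my_scoreboard;
  pkt expected_q[$];                          // expected packets, in order
  int max_losses = 4;                         // assumed drop budget for this example
  int losses     = 0;

  // Requirement (1): derive the expected packet from an input packet
  // (the virtual transform() method plays this role in the VMM scoreboard).
  virtual function pkt transform(pkt in_pkt);
    pkt exp = new();
    exp.payload = in_pkt.payload;             // identity here; override for real DUT behaviour
    return exp;
  endfunction

  function void from_input(pkt in_pkt);
    expected_q.push_back(transform(in_pkt));
  endfunction

  // Requirement (2): match an observed output packet, skipping over dropped ones
  // (expect_with_losses() plays this role in the VMM scoreboard).
  function void from_output(pkt out_pkt);
    while (expected_q.size() > 0) begin
      pkt exp = expected_q.pop_front();
      if (exp.compare(out_pkt)) return;       // matched
      losses++;                               // assume the head-of-queue packet was dropped
      if (losses > max_losses)
        $error("too many dropped packets (%0d)", losses);
    end
    $error("output packet 0x%0h has no matching expectation", out_pkt.payload);
  endfunction
endclass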
VMM has much more robust features to offer; it is a good idea to check out the features available in the VMM scoreboard before coding your own. I believe it would save a lot of time.
Sunday, September 20, 2009
How do you identify a good functional verification engineer?
Evaluation based on product success:
The answer looks straightforward: at the end of the emulation effort, chip tape-out and chip production, if there are no functional bugs and the design works as expected, then obviously the person who verified the design is a good verification engineer.
This statement has a rider, though; the result above can be produced under three circumstances:
1) Good designer, bad verification engineer & very low bug rate.
2) Bad designer, good verification engineer & very high bug rate.
3) Re-used design which is silicon proven & no bugs.
If your chip taped out successfully without any functional issues under scenario 2, then you have identified a good verification engineer.
Evaluation based on process success:
You wrote a verification environment and found a lot of design bugs; now you need to verify a design enhancement that requires changes to that environment. The effort required to make the changes depends on how reusable the code you wrote earlier is. If you are able to add the enhancement in a short span of time, with few code changes, you are on track to be identified as a good verification engineer.
Evaluation when the design success is not immediately visible:
This type of scenario is seen in VIP development, where the product is verified internally and then released for customers to use.
The only way to identify a good verification engineer in this scenario is to compare the “customer bug rate over a fixed period, say 12 months” against the “internal bug rate during development”. If there are no customer bugs on the feature for a long period of time, you have identified a good verification engineer.
Evaluation based on knowledge of language /methodology /protocol:
This is the method most people use to evaluate a verification engineer when hiring into a new organization. If the person has good coding experience from projects, he will have a very good command of verification languages and will be well versed in their intricacies. Asking a person to write code for a given scenario tests both his knowledge of the language and his problem-solving skills. Testing a person's knowledge of a protocol tells us how well he has understood the protocol and used that knowledge in verifying the design.
Evaluation based on moving with technology:
This is a very important aspect in today's industry. Verification methodology and tool improvements come from the EDA vendors at a very fast rate, helping verification engineers reduce the time spent on verification. A good verification engineer will definitely keep himself updated on the new aspects of functional verification, which is a good sign when identifying one.
I have also come across some senseless interviewers evaluating verification hires on their knowledge of digital electronics and CMOS. Does a verification engineer use digital design or CMOS for architecting or writing his testbench?
Saturday, September 12, 2009
Verification effectiveness!!!
“Functional verification takes 70% of the chip design cycle.” Writing test plans, writing a reusable verification environment, writing assertions for the design, debugging RTL failures, attaining code coverage and functional coverage goals, and gate-level simulation and debug are some of the common activities a functional verification engineer goes through in the project life cycle before tape-out. The verification engineer's work increases sharply if the design under test has many bugs, which means a lot of RTL debug effort. The metric on which a verification engineer is evaluated is “how many bugs were hit during functional verification” vs. “bugs hit during emulation/post-silicon validation”. Even a single post-silicon functional bug indicates ineffectiveness in the functional verification.
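As a small illustration of the assertion-writing part of that list, a check for a simple request/acknowledge handshake might look like the sketch below. The signals (clk, rst_n, req, ack) and the 4-cycle bound are hypothetical, not taken from any particular design.

// Hypothetical handshake check: every req must see an ack within 1 to 4 cycles.
module handshake_checks (input logic clk, rst_n, req, ack);
  property p_req_gets_ack;
    @(posedge clk) disable iff (!rst_n)
      req |-> ##[1:4] ack;
  endproperty

  a_req_gets_ack : assert property (p_req_gets_ack)
    else $error("ack not seen within 4 cycles of req");
endmodule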
If you are a verification engineer who has a hard time meeting schedules and works long hours in the office (say more than 8 hours a day) to meet deadlines, the following are some effective ways I have used to meet deadlines without compromising on work quality or on working hours.
1) Micro-schedule your tasks with effort estimates and get approval on the timeline from your manager.
2) Whenever the scope of the work increases or decreases, re-work the effort estimates and keep your manager updated.
3) Prioritize your tasks and complete them one after the other.
4) Whenever you write a testbench, make sure it is reusable. This minimizes your work at a later point in time.
5) Always try to reuse the module-level verification environment at the system level. The main effort you then need to put in is the integration effort from module level to system level.
6) Always build a constrained-random verification environment to test your design; most bugs are easily caught by random stimulus. Write a directed test case only if it is absolutely required to hit a functional coverage or code coverage hole (see the sketch after this list).
7) Always move with the market: try learning and using new technology that reduces the overall verification effort. For example, adopting a proven methodology like VMM or OVM might be tough initially, but in the long run it will reduce your time to verification.
8) Always file a bug when you hit a design issue. This is very important because it is the main metric on which a verification engineer gets evaluated, and it lets top-level management see the effectiveness of the verification and any schedule slips caused by design issues.
9) Always keep your manager updated on the status of your tasks so that he is in a position to evaluate your bandwidth for future tasks.
10) Never compromise on testing a DUT feature due to lack of testbench support; this may lead to emulation/post-silicon bugs.
11) Keep the code coverage analysis towards the end of the project, after your functional coverage goals are met.
12) When you are stuck with a problem, do not keep hammering at it continuously; this increases the stress level and you end up spending more time circling the issue. Take a break from the issue and come back with a fresh mind.
13) Understand only the part of the code base relevant to the enhancement or modification; spending time on understanding the overall code base has diminishing returns.
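As a sketch of point 6, the snippet below shows a constrained-random packet class sampled into a covergroup, plus a directed-style constraint override that would only be written if coverage reports a hole. All names (pkt, pkt_cg, small_mcast_pkt) are invented for illustration.

class pkt;
  rand bit [7:0] len;
  rand bit       is_mcast;
  constraint c_len { len inside {[1:255]}; }
endclass

// Directed-style override, written only if coverage reports a hole
// (e.g. small multicast packets were never generated by the random runs).
class small_mcast_pkt extends pkt;
  constraint c_hole { len <= 16; is_mcast == 1; }
endclass

module tb;
  covergroup pkt_cg with function sample(pkt p);
    len_cp      : coverpoint p.len { bins small = {[1:16]}; bins big = {[17:255]}; }
    mcast_cp    : coverpoint p.is_mcast;
    len_x_mcast : cross len_cp, mcast_cp;
  endgroup

  pkt_cg cg = new();

  initial begin
    pkt p = new();
    repeat (1000) begin
      void'(p.randomize());              // random stimulus fills most bins on its own
      cg.sample(p);
    end
    $display("functional coverage = %0.2f%%", cg.get_inst_coverage());
  end
endmodule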
Whether a functional verification engineer gets rewarded for his verification efforts is really a question mark, and largely depends on the company you work for.
Sunday, August 9, 2009
Six reasons why you should use SystemVerilog for verification !!!
1) SystemVerilog is an IEEE standard supported by multiple vendors, so your code is portable across simulators. You are not tied to a single vendor, which is the case if you are using an HVL like VERA/NTB/Specman.
2) Free, open-source standard verification methodologies like VMM & OVM are available for use with SystemVerilog.
3) Simulation speed will improve if you are an HVL user currently using VERA/Specman for verification.
4) Most VIP vendors support SystemVerilog, so building a verification environment for an SoC will not be an issue.
5) SystemVerilog interoperability layers are available, so you can reuse your VERA/NTB code from SystemVerilog; a similar arrangement is available for Specman users too.
6) SystemVerilog supports most of the constructs supported by the HVLs (VERA/NTB/Specman), so migration to SystemVerilog will not be an issue for HVL users (see the short sketch after this list).
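As a short illustration of point 6, the sketch below uses SystemVerilog equivalents of common HVL testbench constructs: a constrained-random transaction class, a mailbox channel, and concurrent generator/driver processes. The names (bus_txn, gen2drv) are illustrative only, not from any real VIP.

// Constrained-random transaction, mailbox and concurrent processes in plain SystemVerilog.
class bus_txn;
  rand bit [31:0] addr;
  rand bit [31:0] data;
  constraint c_align { addr[1:0] == 2'b00; }   // keep addresses word-aligned
endclass

module tb;
  mailbox #(bus_txn) gen2drv = new();

  initial begin
    fork
      begin : generator                        // randomizes transactions and sends them on
        repeat (10) begin
          bus_txn t = new();
          void'(t.randomize());
          gen2drv.put(t);
        end
      end
      begin : driver                           // consumes transactions; a real bench would drive pins here
        bus_txn t;
        repeat (10) begin
          gen2drv.get(t);
          $display("driving addr=%h data=%h", t.addr, t.data);
        end
      end
    join
  end
endmodule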
Sunday, July 19, 2009
Learning curve for a verification engineer !!!
For a verification engineer, which of the following work environments gives the maximum learning opportunity?
1) IP verification
2) SOC verification
3) Verification IP development
4) Verification consultancy
I will try to evaluate each of these work environments.
According to me, the skills you acquire from your day-to-day activities at your workplace should match the requirements of the industry and should be portable across companies. For example, assume you are doing assembly-level verification of a processor design using that company's internal tools: the methodology and tool knowledge is limited to that particular company and is not portable between companies, so the skill you acquired is not marketable and the learning curve is minimal.
We can find ourselves in the above scenario in the verification consultancy work environment, where we have very little control over the nature of the job and implementation flexibility is also minimal. One good thing about this environment is that you get your hands dirty on different types of projects and rarely find yourself stuck with the same project. On a comparative scale, the verification consultancy work environment gives a verification engineer a moderate learning curve.
Verification IP development requirements and processes are different from RTL development and verification. In this kind of work environment there is a very good learning curve on new verification methodologies, and we can improve our knowledge of different languages, as a VIP is developed in a single language but controlled through different languages like VERA/NTB/Verilog/SystemVerilog/C. You can gain good protocol knowledge by developing the VIP and stay updated on developments in the protocol. One of the drawbacks of the VIP development environment is that while the initial learning curve is steep, after a few years most of your work will be just bug fixes and occasional VIP enhancements. On a comparative scale, the VIP development work environment gives a verification engineer a moderate learning curve.
The SoC verification work environment is different: as the verification is done on proven design IP, the finer points of the protocol are generally ignored. One good thing in SoC verification is that we end up working on different types of protocols and interfaces. On a comparative scale, the SoC verification work environment gives a verification engineer a good learning curve, provided he works on different interfaces every project.
In the case of the IP verification work environment, your learning curve on the protocol will be good; the test plan and implementation touch the finer points of protocol verification. On a comparative scale, the IP verification work environment gives a verification engineer a good learning curve, provided his company has migrated to SystemVerilog or HVL-based verification.
The best scenario is to have at least a few years of experience in each of these work environments, so that you gain good exposure to verification methodologies, verification tools, languages & different protocols.
Thursday, April 30, 2009
VMM Planner
So what is VMM Planner? VMM Planner is a tool that can automatically annotate functional coverage and code coverage results from regression runs and present the data in XLS or XML format. It associates the test plan with the test results automatically, and it can be used to manage the verification effort for any project. The basic requirement for using VMM Planner is a complete test plan mapped to a functional coverage model. Once we have the plan as an XML or HVP document, we can automatically annotate it with test results using the HVP commands. Some user-provided metrics, like bug count and test pass/fail count, can be supplied to the planner through the userdata command. We can use VMM Planner to report verification status to top-level management.