Tuesday, December 22, 2009

Enabling Faster ABV – new initiatives

Assertion Based Verification has certainly been one of the most debated topics over the last half-decade. So much so that one of the past DVCons was so full of SystemVerilog & PSL papers on ABV that someone commented it had become an “ABV conference” rather than DVCon (was it DVCon 2005? 2006?).

Even then, the adoption rate has been slower than expected – something many surveys and EDA folks agree on. A relatively new EDA vendor is addressing it with dedicated tools, see: http://www.zocalo-tech.com/index.php 

It is not clear how easy it will be or how much ROI users will see, but it is an interesting development, I must say.

With the SystemVerilog 2009 LRM adding the checker construct, we predict that the adoption of OVL-like libraries should improve dramatically. We explained this thoroughly in our SVA Handbook 2nd edition, see: http://www.cvcblr.com/blog__resources 

We have a DVCon 2010 paper on this very topic, see: http://dvcon.org/events/eventdetails.aspx?id=108-3 

Let’s see how fast the checker construct gets implemented by EDA tools; we used VCS for our new book: http://www.systemverilog.us/sva_info.html

Hope 2010 brings ABV more and more onto RTL engineers’ desktops!

What is there in a number? No, it is not numerology – rather EDA marketing fun!

For those of us who have been following EDA marketing over the years, it is no surprise that there are dedicated marketing professionals within the big EDA companies focussing on conveying the message – or confusing the ecosystem if needed (unfortunately). We have several anecdotes, starting with “VHDL is dead” back in 2003 (http://www.eetimes.eu/uk/17408257) – and guess what, last month we had a full-house “Advanced VHDL TB class” (http://www.cvcblr.com/blog/?p=86), with another one being scheduled in Jan 2010. I don’t intend to blame any single entity/individual for this; rather, this is how it works, and those of us who have seen it for years understand it.

Another classical case was IEEE-1850 PSL – it is alive and kicking, having become part of the recent VHDL standard as well. Though there has not been much development on PSL itself, it is expected to stay much longer than some folks have predicted. Need proof? Name an EDA vendor without support for PSL – Mentor, Cadence, Synopsys, Aldec – all have it. It would be foolish to claim that all of these marketing teams got their predictions wrong – if PSL were to be short-lived, why has every EDA vendor invested in it?

Fast-forwarding to the present day, SystemVerilog VMM 1.2 was recently released (http://www.cvcblr.com/blog/?p=91) after a relatively longer incubation/Beta period than usual. And almost instantaneously we find Tom’s analysis at http://tinyurl.com/vmm12-20-ment – true, VMM 1.2 has lots and lots of new stuff, and even the old features have newer implementations (parameterized versions of channels etc. – maybe they were in VMM 1.1* as well?). But for the user community I believe this is a good thing – we are slowly seeing signs of convergence towards a CBCL becoming reality. Yes, today VMM can run on 3 EDA tools and so can OVM. But how well do they interoperate? Ask Ashish from Nokia Bangalore; he will tell you the horror stories he has had over the last year or so.

Recently the Accellera VIP-TSC established an interop kit; we saw it during the recent SVUG here in Bangalore – see www.svug.org for archives.

More recently (after the SVUG Bangalore event), the VIP-TSC has proposed a new name for this CBCL – “UVM” (no, not URM, rather UVM – fortunately this name has been spared so far by vendors). How this will shape up will be known in the coming days, weeks and months, if not years!

But it is clear that it will contain contributions from VMM & OVM and hopefully will run on all tools too. Having closely observed both OVM and VMM (1.2 included), there is an easier migration path from OVM to VMM 1.2 if needed, and vice versa; in fact we present that as a handout to our regular training attendees, who take up one methodology during training and pick up the other on the go!

With VMM 1.2 (or 2.0, as per Tom) having concepts similar to OVM, the creation of UVM should be a lot simpler – we hope. Let’s see.

BTW, OVM 2.1 is around the corner – should it be re-numbered? Anyone? Vaastu? Numerologists? Mentor is arranging a private webinar for its valued partners on OVM 2.1 updates, so we should see another blog post soon.

To me it is clear that the individual development efforts/bug fixes to both OVM & VMM will continue at least until UVM 1.0 (??) emerges. By then will we see VMM 1.4? OVM 2.5? Anybody’s guess!

Enough on numbering! Let’s start the convergence; hope 2010 is a lucky number for SystemVerilog enthusiasts, as UVM should see its birth! Maybe Santa is granting UVM as a gift to SystemVerilog professionals :-)

More on UVM as we hear..

Thursday, December 17, 2009

SystemVerilog code automation from Puneet

Good news for all those Emacs + SystemVerilog users. Puneet has just now released his SV Snippet for Emacs, see:

http://coverification.org/2009/12/17/systemverilog-snippets-for-emacs/

Will certainly try it out ASAP. Good start Puneet, keep it up. Thanks for sharing it!

Wednesday, December 16, 2009

Breakdown of Verification effort – Debug, Debug & more Debug..

Interesting analysis of how Verification effort is being spent across industry:

http://tinyurl.com/dbg-it-man

(See the pie-chart, Figure 2). It goes very much in line with what we have been hearing from customers, competitors and also from our own experience. So DEBUG is THE area to automate within Functional Verification. I’m a little surprised to see 15% spent on ENV – perhaps that is the case for modern SystemVerilog/VMM/OVM setups, but then again that is for the initial period, I suppose. My belief is that if you reuse VIPs, leverage previous code and hire the right candidates, ENV creation can be handled within 10%. Testcase development is shown as 18%; it is not clear whether some of it spreads into the coverage bucket (another 15%) – there is a strong correlation between the two anyway. I believe this is where technologies like Breker’s Trek (http://www.cvcblr.com/blog/?p=89) become interesting.

 

On the debug front – the good old Novas/SpringSoft is still the leader with Siloti, Verdi and Debussy. Though I’m a little disappointed with their SystemVerilog solutions – personally I would have liked more innovation in that space from these debug GURUs. They do have “log/transaction display”, but I am sure more is in the pipeline. A new company, http://www.vennsa.com/product.html, is showing up at places; it will be interesting to hear if anyone locally is using it. It would be worth getting some true success stories to see what exactly it automates.

Staying on debug – I personally believe a lot of this automation originates in-house at customer sites. For instance, during our Ethernet switch/router verification monster, we created several scripts, plots etc. to do intelligent failure analysis (http://www.iec.org/pubs/print/verification_toc.html). Also, our recent work with a local SAN customer resulted in visualizing AVL trees from a running simulation. See: http://www.cvcblr.com/blog__resources and http://www.snug-universal.org/asia/india09_V1_Abstract.pdf

And then we had this SystemVerilog memory blow-up debug case, http://www.cvcblr.com/blog/?p=29 – so for now Debug continues to fascinate us the most!

Drop me a note if you would like to explore how you can automate your debug challenges.

Happy Debugging!

Tuesday, December 15, 2009

Formal Verification – Model Checking case study from SUN & Jasper – excellent read, to refer..

 

In case you missed it: http://chipdesignmag.com/display.php?articleId=3723

I mentioned this during the PSL session of our recent Advanced VHDL TB class (http://www.cvcblr.com/blog/?p=86) and attendees were very interested. Today I got a mail back from Chandramohan asking for the link; I sent it to him and read the paper once again (must admit, not in full depth at the PCI-e level). Overall an excellent paper, perhaps a strong candidate for a DVCon Best Paper award – real design bugs/scenarios listed... truly worth reading.

Such a nice paper didn’t need to include the following take on simulation:

Simulation, the alternative, brute force approach, ends up wasting resources and introduces additional risk. Even for cases where you think you understand the full state-space, it requires huge effort to develop a test strategy, e.g. complex test scenario with nested loops etc. Manual effort and test are required. Simulation cycles are long and regression test after modifications is slow. Furthermore, the designer generally has to edit down the simulation and remove certain combinations, without absolute knowledge of whether these are important or not. It is hit-or-miss because no design or verification engineer can enumerate all of these combinations.

With due respect to the authors – they seem to be pushing simulation too far into a corner. They seem to forget the value of intelligent stimulus generation, the adoption of functional coverage, sequences, virtual sequences and, even better, the all-new Trek (www.brekersystems.com) – their examples do contain similar PCIe stuff and it is quite powerful too. So let’s not write off simulation. Agreed – if and when formal works it is a great technology, but not at the cost of simulation..

VMM 1.2 is out…finally

OpenSource VMM 1.2 is finally out, see vmmcentral.org – we have been telling many of our training attendees “it is coming, it is coming”... now it is HERE!!

 

One of the greatest challenges we face is when our previous SystemVerilog/VMM attendees attend our newer classes (for an upgrade, to learn another methodology etc.) – they get very confused about the VMM channel (old way) vs. the new TLM way. The put/get definitions were simple, elegant and ready to use for first-timers in VMM 1.0*. True, TLM adds a lot of value, but existing users are finding it hard. This is where folks like CVC fit in, I suppose, so no complaints..

 

Enjoy and welcome the TLM way!

Sunday, December 13, 2009

VMMing of a VHDL-C based Environment, anyone?

Recently @VGuild, Mike asked:

 

Does anyone use ModelSim's FLI for verification? What are the pros and cons of this?
I've been considering adopting SystemVerilog for writing test environments (we code our designs in VHDL and use PSL for assertions and functional coverage) but, from what I can gather, instead of SV I might as well just use ModelSim's FLI and write sophisticated testbenches in C. As an engineer, I am already very familiar with C, and so learning another language for verification (SV) is not desirable.
I suppose SV is more portable to other tools, rather than relying on ModelSim's FLI. And I suppose SV is supported by frameworks such as OVM. Other than that, why not use C/C++ as your verification language with the FLI?

Though the entire EDA marketing machinery is strongly biased towards SystemVerilog, let’s realize that there is a sizeable population using VHDL, C etc. A few pointers for those unconvinced:

So what’s the solution for VHDL users looking at high-end verification stuff? Is SystemVerilog THE only way? We at CVC believe SystemVerilog is “a way”, not necessarily THE ONLY way. For instance, PSL becoming part of VHDL makes it a stronger candidate than SVA for VHDL users (yes, even with the recent SVA-09 features included, http://www.systemverilog.us/sva_info.html – PSL’s LTL is long proven and better supported than SVA-09). I hear of more momentum towards PSL recently from local VHDL users.

So coming back to Mike’s topic – a few suggestions:

  • No single-size-fits-all solution
  • FLI is a choice if, for the foreseeable future, Modelsim is what your employer buys. But there is the question of portability (given that there are strong contenders, and pricing pressure – did we not hear of a BIG EDA vendor slashing prices like crazy, much like Magma style, but for verification?).
  • I highly recommend looking at VHPI rather than FLI, as it is an IEEE standard and well supported by tools like VCSMX, IUS and Aldec (Riviera for sure, Active-HDL too I guess – anyone to confirm??)
  • For SystemVerilog-like features – explore www.trusster.com for TEAL/TRUSS – akin to VMM/OVM without all the bells and whistles, but it provides a baseline and is FREE!! It can even run with Icarus for Verilog, hurray!!

So choose the right tool for the right job..

Breker’s Trek @DAC and CVC’s engagement so far..

Another piece partly covered in Cooley’s report, but for those interested in full details (more technical updates coming in soon)..

Here are my (and my team’s – they are looking at it closely during an eval) observations on Breker's Trek tool. What we really like about this tool is that it is an add-on to any existing methodology/environment (at least we are looking at Verilog, SystemVerilog, VMM & OVM for now). Their marketing is also quite good in saying they solve the last 20% of the problem (which usually is the pain point), though that needs to be proven (our eval is still at an early stage). The BNF syntax looks interesting and for the uninitiated it may take a while, but it is certainly no big deal. We can appreciate the value such a tool brings in for testcase generation.

However, they claim to be eliminating the need for complex checkers – this is something we are still wary about and would like to delve into deeply during the eval. In our view the checker part is hard and will be hard even with Trek. Our view of this feature of Trek is that it is an ability to correlate the testcase and coverage to the checking mechanism – hopefully at a higher level of abstraction. If this can be achieved we would be glad. The coverage results annotation and reachability analysis part is really promising, as it presents the test coverage at a higher level of abstraction than traditional SV. In the SV world one needs to code complex covergroups, code/generate tests, correlate them and then view lower-level coverage data (GUI/HTML/TEXT) to extract the same kind of information.

Thanks

Shalini, CVC Pvt Ltd

Our NuSym updates from DAC and around..

Some of you might have seen our report of DAC from John Cooley. Here is our full version of the NuSym report for those interested. Trek to follow (with more updates after the DAC report was sent out)..

We at CVC have been tracking Nusym's technology for a while. I visited their booth & demo, and here are our (mine combined with my CTO's inputs) comments/impressions. While the generation of additional tests/filling of holes is a critical piece of its features, I believe the coverage analysis feature is not as well publicized/well understood.

For our customers who are serious about coverage, the analysis of coverage holes has been one of the biggest pains. The sad part is that no major EDA vendor is really adding features to enhance that, so with Nusym addressing that problem, it is certainly very useful. The techniques they showed in their slides/demo are not truly path-breaking but simple ones that help avoid wasting time on unreachable coverage holes. It is that simplicity that made me very interested in their stuff. But it is unclear if the tool can go beyond the "static analysis" part and offer more sophisticated means to analyze/exclude coverage holes.

While the major EDA vendors claim to address this challenge, much is yet to be done: with, say, minor changes to the RTL, the whole analysis becomes invalid and has to be repeated – largely manually. It will be great if Nusym can address that.

The other not-so-much-spoken-about feature is their "replay" technique – perhaps it is still maturing, but sure enough it is one of those very useful techniques for regression runs.

Thanks and Regards,

Shalini Pandey

CVC Pvt Ltd

Tuesday, December 8, 2009

What are your painpoints with SystemVerilog ABV adoption?

While there is so much talk about ABV in the market, adoption is still far less than the buzz would suggest! Harry Foster from Mentor tries to find some rationale in his new blog post at:

http://blogs.mentor.com/verificationhorizons/blog/2009/12/06/abv-and-people-from-missouri/#comment-6

 

Here is what we from CVC feel about it (also added as comments in that blog).

>> What are the obstacles you see to adoption?

The major one I hear from RTL folks often is the verbosity associated (as of SystemVerilog 2005) with using OVL-like libraries. Existing users of 0-in checkerware in particular are so pampered by its ease of use and the value it adds – though their management may have the extra $$ as a concern – that it is hard for them to appreciate the need to type-type-type the mundane “clock, reset” stuff! It was all being “inferred” so far, and suddenly along comes a standard language/implementation such as SVA that takes them back in history! Refer to AMD’s excellent presentation at the OVL TC for proof! True, the new (very new, I must say) “checker” construct along with $inferred* takes care of it (sigh… it lacks $inferred_enable). We cover these in our recently published SVA Handbook 2nd edition (http://www.systemverilog.us/sva_info.html) and also in an upcoming DVCon 2010 paper.
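To make the verbosity point concrete, here is a minimal sketch of our own (illustrative names only – not code from the handbook or the AMD presentation) showing how a 1800-2009 checker with $inferred_clock/$inferred_disable defaults removes the “clock, reset” typing that a 2005-style OVL/SVA instantiation forces on the RTL designer:

[cpp]
// Hypothetical reusable checker – port and label names are ours
checker cvc_handshake_chk (logic req, logic ack,
                           event clk = $inferred_clock,
                           logic rst = $inferred_disable);
  a_req_gets_ack : assert property (@clk disable iff (rst) req |=> ack);
endchecker : cvc_handshake_chk
[/cpp]

At the instantiation site the designer connects only req and ack; the clock and the disable condition are inferred from the surrounding context (an enclosing procedural block or the instantiating module’s defaults), which is close to the 0-in style of inference users are pampered by.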

Cheers
Srini
http://www.cvcblr.com

 

What do you have to say? Please comment – your views will hopefully help shape the future SystemVerilog standard!

Adv VHDL Testbench training - Aldec-South Asia begins with a BANG!

For those who missed it, see:

http://www.aldec.in/Company/News.aspx?newsid=34678573-19e6-45a3-99d5-9d5b6accda6c 

This is a significant move, I would say, as it reinforces a few facts:

  • Industry is slowly recovering (Hurray!!)
  • India/South Asia is gaining more and more importance as a wide customer base – apart from the major EDA vendors, others are setting up their own centres, driving investments etc.
  • India as such provides a vibrant FPGA market and there is plenty for EDA vendors to tap into!

Recently Aldec-SA conducted a 2-day seminar on “Creating efficient Testbenches using VHDL”. CVC, being VHDL & verification experts, delivered this seminar.


 

We got very good feedback from this event, here is a sample:

**** Straight from customer ************

  Hello Sir,

I am Ramesh R Nair, working in Continental Automotive as an ASIC verification engineer as part of my internship programme of M.Tech(VLSI).

I have attended your  training class on test bench writing last week(ALDEC).

Although we are writing a lot of test benches some utilities were unnoticed.. you bring those things to light.

So it  was very helpful and i shared it with my team members.

Thank you and Congratulations.

Best Regards

Ramesh R Nair

Continental Automotive Components

Bangalore

*****************************

It is always great to hear feedback from customers and it gets better if it is a positive one :-)

Wednesday, December 2, 2009

What’s wrong with the present ABV promotion?

If you have not heard of the buzzword “ABV” (and assuming you are a VLSI front-end engineer, of course) you must be living in a different world, I must say (no pun intended) – with so much marketing around SystemVerilog Assertions, PSL, OVL etc., it is hard to have missed it.

Despite that, there are some folks who say the adoption is not as high as predicted – I heard it from Adam Sherer earlier this week here in Bangalore and now read it at: http://www.edadesignline.com/showArticle.jhtml?articleID=221901260 

 

Well, I for one don’t believe this is fully true – at least in India/AsiaPac – CVC has done well with ABV: we have developed PSL-based MIP (Monitor IP) for Taiwan customers, got paid, and delivered several customer trainings on it. Though the recent focus has been more on OVM/VMM/VSV, SVA is still making money, I must say. It is entering the FPGA domain well – see the recent Modelsim DE release, and Active-HDL has supported ABV in the FPGA domain for a long time now. And just today we introduced PSL to a large customer base (a VHDL house) and it was well received among engineers.

 

But – yet I agree to some extent, there are challenges with it that prevent it from becoming “mainstream”. For one, we don’t have good tool support to verify assertions standalone. There was an early start with “Assertion Studio” (http://www.systemverilog.org/pdf/AT_HDL_Symposium.pdf) but it is no longer to be found! That is to say – I have no RTL, no TB, just written SVA/PSL – can I visualize/verify them standalone? Formal tools can in principle do it – I have seen it with Magellan whilst at SNPS, but only internally. Not sure whether IFV or Jasper can do it. Even if they do, it is too expensive an option, I guess!

Secondly – I believe there is a lack of good “reference” material – common “templates”. We tried addressing it in our SVA book via a dictionary, but it being a big book, I am not sure how many used that part. We didn’t hear much from customers on that.

 

While doing trainings I always felt this would be a perfect fit for an animation-based demo/training; we tried prototyping it with a publishing house, but the effort dried up due to lack of commitments/funds. Also – all said and done, the language features as they exist are INADEQUATE to express temporal behaviors intuitively. This is even with the SVA-09 features. If I were to re-design a language for this (read: if I had all the time and money needed for it), I would develop it from scratch, driven by user needs, rather than letting language/tool-imposed restrictions dominate the definition. Just a sample:

“Variable delays” are not allowed in SystemVerilog Assertions – give me a break… (No, I don’t need a work-around; I can send you one if needed, drop me an email.)
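For readers who would rather not email, here is one commonly quoted style of work-around – a sketch only, under the assumption that the delay is at least 1; the signals clk, start, done and latency are hypothetical:

[cpp]
// Sketch: emulate "##dly" where dly is a run-time value (assumes dly >= 1)
sequence var_delay (int dly);
  int cnt;
  (1, cnt = dly) ##1 (cnt > 0, cnt--) [*1:$] ##0 (cnt == 0);
endsequence

// Hypothetical use: 'done' must appear 'latency' cycles after 'start'
a_var_dly : assert property (@(posedge clk)
              start |-> var_delay(latency) ##0 done);
[/cpp]

It works, but compare it with a hypothetical built-in “##[dly]” and you see why I call the language inadequate for expressing such behaviors intuitively.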

Tuesday, December 1, 2009

Sub $5000 high-end Mixed HDL simulator - VHDL+Verilog+SV-Design

 

http://edageek.com/2009/11/16/vhdl-verilog-xilinx-secureip/

Not bad news at all – given that the industry is showing signs of recovery, such offerings are GREAT indeed. During the downturn several mergers, IP accumulations and consolidations happened, which might have led to a mix of languages in new SoCs. Usually the cost of ownership of a full-fledged mixed-HDL simulator (from any of the 3 big EDA vendors) is high (some say a “fortune”, though I disagree). This Riviera offering is certainly encouraging indeed.

But is this a “sign-off” tool? Anyone?

And BTW – in DAC they announced $1995 package for Active-HDL with similar support, so it is real!

http://www.aldec.com/Company/News.aspx?newsid=c86c2ee8-5490-4eae-b61e-a7c0aaf7396c

Hardware Emulation becoming more and more affordable

Read: http://www.your-story.org/eve%E2%80%99s-latest-emulator-offers-the-lowest-cost-of-ownership-in-the-industry-62042/

 

With the so-called “penny-per-gate” pricing – sure, it is a marketing gimmick – it is becoming more and more viable to explore low-cost emulation. We still see our customers continuing to rely on their own, home-grown FPGA boards, but with such innovative business models that may be changing soon..

 

Good job Eve folks – I wonder if they allow sharing “across customers” – say we host one Zebu server at CVC and allow several customers to log-in and pay-per-use!

Sunday, November 29, 2009

Update on IEEE 1800-2009 standard, fresh from the oven!

As you all may know by now, IEEE 1800-2009 was recently approved.
There were many updates to the SystemVerilog core and to Assertions, plus the addition of the checker – a new type of entity in which several assertions and related verification code can be defined, much like a module/interface. In addition, a checker can be instantiated inline in procedural code, unlike a module.
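As a small illustration (our own sketch, not an excerpt from the LRM; the checker, signal and module names are made up), a checker can be instantiated from inside an always block, with the clock inferred from that procedural context:

[cpp]
checker data_stable_chk (logic valid, logic [7:0] data,
                         event clk = $inferred_clock);
  a_stable : assert property (@clk valid |=> $stable(data));
endchecker

module dut_wrap (input logic clk, valid, input logic [7:0] data);
  always @(posedge clk) begin
    // procedural ("inlined") checker instantiation - new in 1800-2009;
    // clk is picked up from the enclosing always block
    data_stable_chk u_chk (valid, data);
  end
endmodule
[/cpp]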

The immediate next step will be to get real users exposed to the power of the new constructs. We expect tool vendors to start adopting this new version, probably sooner than we may think, as some vendors were actively implementing the new features while the LRM was being refined. Already at least 2 major EDA vendors have released support for varying subsets of constructs from this new LRM. Ping your EDA support contact for updates!


As far as book support goes, we're pleased to announce the release of our SystemVerilog Assertions Handbook, 2nd Edition, which includes the IEEE 1800-2009 updates.
For more information, see http://systemverilog.us/sva2_toc_preface.pdf
http://systemverilog.us/sva_handbook2_cover.jpg


SystemVerilog Assertions Handbook, 2nd Edition is an excellent reference for learning the basics of the assertion language. Syntax summaries along side examples help in learning the syntax. There are many examples with graphical representations that demonstrate the concepts. Basic rules are listed, often with quotes from the standard, and then explained. The book goes beyond the standard to demonstrate many subtleties that produce unexpected results and poor performance, and flags the pitfalls to avoid. It is a great refresher for experienced users and for those looking to understand what is new in the SVA language for the IEEE 1800-2009 release. Additional chapters present methodology and application perspectives. This book is co-authored by:
Ben Cohen, Srinivasan Venkataramanan, Ajeetha Kumari, and Lisa Piper

Tuesday, November 24, 2009

Training on “Protocol Verification using SystemVerilog Assertions”

 

December is usually a time of holidays and a relatively light workload. Given the challenging job scenario, this is also the best time to hone your skills and face the New Year with new skills – explore new job avenues, segments etc.

CVC is announcing a week-long certificate course on standard protocol verification. At the end of this course you will have developed a MIP (Monitor IP) for a standard protocol based on SVA. Assertions are very powerful for capturing temporal behavior. Broadly, the course covers the following topics:

  • ABV Introduction
  • SystemVerilog Assertions (SVA)
  • Project – develop a real life Protocol Monitor IP (MIP) with SVA

Course contents:  http://www.cvcblr.com/trng_profiles/CVC_LG_SVA_profile.pdf

Topic                        Duration
SystemVerilog Assertions     2.0 days
Project                      3.0 days

Schedule

Tentative: 2nd week of December, 2009

For exact schedule visit http://www.cvcblr.com/blog/ or contact us.

Contact

Send an email to: training@cvcblr.com and/or cvc.training@gmail.com for more details, cost etc. Or call us at: +91-9620209226/+91-80-42134156

Please include the following details in your email:

Name:

Company Name:

Contact Email ID:

Contact Number:

Make best use of your Dec holidays: Verification Fest (VFest)

 

December is usually a time of holidays and a relatively light workload. Given the challenging job scenario, this is also the best time to hone your skills and face the New Year with new skills – explore new job avenues, segments etc.

CVC is launching its highly successful 2-week certificate course on Functional Verification using SystemVerilog, with a project in one of the following domains:

· Networking

· Communication

· Image Processing

VFest also focuses on the language aspects of SV in depth. Broadly, it covers the following topics:

Topic                              Duration
SystemVerilog Basics               0.5 day
Verification using SystemVerilog   2.5 days
Mini Project                       2.0 days
Verification Methodology           2.0 days
Project                            3.0 days

Schedule

Tentative: 1st week of December, 2009

For exact schedule visit http://www.cvcblr.com/blog/ or contact us.

Contact

Send an email to: training@cvcblr.com and/or cvc.training@gmail.com for more details, cost etc. Or call us at: +91-9620209226/+91-80-42134156

Please include the following details in your email:

Name:

Company Name:

Contact Email ID:

Contact Number:

SystemVerilog tip: watch out for enum with randc

Recently an interesting question was raised by a SystemVerilog user on randc usage with enum. To illustrate, consider the following code:

[cpp]
typedef enum {red, green, blue, yellow, white} house_color_type;

class c;
  randc house_color_type enum_0;
endclass
[/cpp]

Spot anything wrong above? Perhaps not? As it goes with randc, an implementation needs to remember all values generated so far before recycling, so it does consume extra memory. The SV LRM says:

To reduce memory requirements, implementations may impose a limit on the maximum size of a randc variable, but it shall be no less than 8 bits.

By default an enum base type is int – i.e. 32 bits – hence blindly allowing randc on it is a real challenge for tools, though some advanced tools/versions (Questa 6.5a for instance) allow it. This default int choice is not something I like much – the implementation should have been clever enough to choose an appropriately sized vector, but then we know the LRM committee is often biased towards implementers. No pun intended, just MHO.

Anyway, coming back to the question, there is a very useful tip here (a “Moral of the story is..” – a day-to-day phrase in a typical schoolboy’s father’s life, something I thoroughly enjoy, thanks to my Anirudh Pradyumnan): model your enum size while declaring it. As in:

[cpp]
// Default base type is int (32 bits) - heavy for randc:
// typedef enum {red, green, blue, yellow, white} house_color_type;

// Better: size the enum explicitly while declaring it
typedef enum bit [2:0] {red, green, blue, yellow, white} house_color_type_BETTER;
[/cpp]
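For completeness, here is a minimal, hypothetical usage sketch (the enum, class and testbench names are ours) showing a sized enum working happily with randc:

[cpp]
typedef enum bit [2:0] {RED, GREEN, BLUE, YELLOW, WHITE} house_color_e;

class colors_c;
  randc house_color_e house;   // 3-bit base type keeps randc bookkeeping cheap
endclass

module tb;
  colors_c c = new();
  initial begin
    repeat (5) begin
      if (!c.randomize()) $error("randomize failed");
      $display("house = %s", c.house.name());
    end
  end
endmodule
[/cpp]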

ASIC Design Verification for FPGA designers

 

…Step up to the ASIC world with SystemVerilog, Assertions & Testbench

CVC (www.cvcblr.com) is announcing a new session of its 10-day course on “FPGA-2-ASIC_DV-with SystemVerilog” - a step-by-step approach to introduce modern-day Design & Verification challenges & solutions to FPGA designers. It is structured as follows:

  • Basic Session
    • Comprehensive Functional Verification (CFV)
    • SystemVerilog basics (SVB)
  • Advanced Session
    • ABV Introduction
    • SystemVerilog Assertions (SVA)
    • Project – develop a real life Protocol IP (PIP) with SVA
    • Verification Using SystemVerilog (VSV)

Course contents: 

http://www.cvcblr.com/trng_profiles/CVC_LG_SVA_profile.pdf

http://www.cvcblr.com/trng_profiles/CVC_LG_VSV_profile.pdf

Topic                                                                      Duration
Comprehensive Functional Verification (including UNIX usage, EDA tools)   1.5 days
SystemVerilog basics                                                       1 day
Project                                                                    0.5 days
SystemVerilog Assertions                                                   2 days
SystemVerilog Testbench                                                    2 days
Project                                                                    3.0 days

All the course contents and agenda can be found at http://www.cvcblr.com/program_offering. It is meticulously prepared with the typical background of FPGA designers in mind. Having transformed several FPGA designers into ASIC design-verification engineers at CVC, we fully understand the challenges involved, the skills needed etc. The course is structured in a balanced manner, with theory and lab sessions tightly interleaved in a way that helps in mastering the topics learned so far in the course.

Schedule:

Dec 1st week at Bangalore

To attend this class, confirm your registration by sending an email to training@cvcblr.com

Ph: +91-9620209226, +91-80-42134156

Please include the following details in your email:

Name:

Company Name:

Contact Email ID:

Contact Number:

Sunday, November 8, 2009

SV: implication constraint and its implication/effect

SystemVerilog has a nice implication constraint feature to guard constraint expressions with an applicability condition. Last week during our SystemVerilog + methodology workshop one of the attendees faced an interesting issue. She was creating a mini-VIP for APB as part of our SystemVerilog 10-day workshop (see details at: http://www.cvcblr.com/trng_profiles/CVC_VSV_WK_profile.pdf ).

She wrote APB scenario code intended to create a sequence of transactions with varying address, kind etc. Here is a code snippet:

constraint cst_xactn_kind{
       if(this.scenario_kind == this.sc_id)
       this.length == 10;
       foreach (items[i])
        {
          (i==0) -> items[i].apb_op_kind == APB_WR;items[i].addr == 'b01; items[i].wdata == 'd11;


               (i==1) -> items[i].apb_op_kind == APB_WR;items[i].addr == 'b11; items[i].wdata == 'd12;
                       }
     }

Spot anything wrong in the above code? Perhaps not to the unsuspecting, naked eye. The code’s intention was to keep:

0th transaction KIND == WRITE, address == 01, data == 11;

1st transaction KIND == WRITE, address == 3, data == 12;

Read the code again – it seems to imply just that, doesn’t it? Let’s run it.

Here is what Questa says:

###########################################################################                    
#                     WELCOME !!!
#                      APB PROJECT USING VMM
#                     DONE BY PRIYA @ CVC
#                     DATE:21stOctober2009
############################################################################
# Normal[NOTE] on APB_PROGRAM(0) at                    0:
#     APB PROJECT:       Start of APB Random test!    
# ****************************************************************************
# Normal[NOTE] on APB_ENV(0) at              0.00 ns:
#     APB PROJECT: Sim shall run for 10 number of transactions
# Normal[NOTE] on APB_ENV(0) at              0.00 ns:
#                     Reset!!!!!!!!!               
# Normal[NOTE] on APB_ENV(0) at            230.00 ns:
#                    Reset Release!
# ****************************************************************************
# *FATAL*[FAILURE] on APB Generator Scenario Generator(APB_GENERATOR) at            730.00 ns:
#     Cannot randomize scenario descriptor #0

Puzzled? What is wrong? A review by the code author herself a few times didn’t reveal anything wrong (bias towards one’s own code?).

Time to seek expert assistance.. Questa has a simple flag to bring up the solver debugger: vsim -solvefaildebug. Let’s try that now..

 

# ../tb_src_scenario/apb_scenario_gen.sv(1): randomize() failed due to conflicts between the following constraints:
#     ../tb_src_scenario/apb_scenario_gen.sv(25): the_scenario.cst_xactn_kind { (the_scenario.items[0].addr == 32'h00000001); }
#     ../tb_src_scenario/apb_scenario_gen.sv(1): the_scenario.repetition { (the_scenario.repeated == 32'h00000000); }
#     ../tb_src_scenario/apb_scenario_gen.sv(25): the_scenario.cst_xactn_kind { (the_scenario.items[0].apb_op_kind == APB_WR); }
#     ../tb_src_scenario/apb_scenario_gen.sv(26): the_scenario.cst_xactn_kind { (the_scenario.items[0].addr == 32'h00000003); }
#     ../tb_src_scenario/apb_scenario_gen.sv(26): the_scenario.cst_xactn_kind { (the_scenario.items[0].wdata == 32'h0000000c); }
#     ../tb_src_scenario/apb_scenario_gen.sv(26): the_scenario.cst_xactn_kind { (the_scenario.items[1].apb_op_kind == APB_WR); }
#     ../tb_src_scenario/apb_scenario_gen.sv(26): the_scenario.cst_xactn_kind { (the_scenario.items[1].addr == 32'h00000003); }
#     ../tb_src_scenario/apb_scenario_gen.sv(26): the_scenario.cst_xactn_kind { (the_scenario.items[1].wdata == 32'h0000000c); }
#     ../tb_src_scenario/apb_scenario_gen.sv(26): the_scenario.cst_xactn_kind { (the_scenario.items[2].addr == 32'h00000003); }
#     ../tb_src_scenario/apb_scenario_gen.sv(26): the_scenario.cst_xactn_kind { (the_scenario.items[2].wdata == 32'h0000000c); }
#     ../tb_src_scenario/apb_scenario_gen.sv(26): the_scenario.cst_xactn_kind { (the_scenario.items[3].addr == 32'h00000003); }
#     ../tb_src_scenario/apb_scenario_gen.sv(26): the_scenario.cst_xactn_kind { (the_scenario.items[3].wdata == 32'h0000000c); }
#     ../tb_src_scenario/apb_scenario_gen.sv(26): the_scenario.cst_xactn_kind { (the_scenario.items[4].addr == 32'h00000003); }
#     ../tb_src_scenario/apb_scenario_gen.sv(26): the_scenario.cst_xactn_kind { (the_scenario.items[4].wdata == 32'h0000000c); }
#     ../tb_src_scenario/apb_scenario_gen.sv(26): the_scenario.cst_xactn_kind { (the_scenario.items[5].addr == 32'h00000003); }
#     ../tb_src_scenario/apb_scenario_gen.sv(26): the_scenario.cst_xactn_kind { (the_scenario.items[5].wdata == 32'h0000000c); }

Smell something wrong? Why is the constraint on addr and data getting applied to scenario items 2, 3, 4, 5 etc. – beyond the items 0 and 1 that the “implication” was supposed to guard? Look at the constraint code again:

          (i==0) -> items[i].apb_op_kind == APB_WR;items[i].addr == 'b01; items[i].wdata == 'd11;

Found it? Not yet? The devil lies in the details – here, in that SEMICOLON “ ; ”. A semicolon in Verilog/SV denotes the END of a statement and the beginning of the next one. Hence the effect of the “implication” ENDS with the variable “kind” alone here – it doesn’t affect addr and data, so the implication is invisible to them. At line 25, addr == 1; at line 26, addr == 3; hence the contradiction!

The fix is to use && so that the guard applies to all 3 variables – kind && addr && data.

  Instead of:

(i==0) -> items[i].apb_op_kind == APB_WR;items[i].addr == 'b01; items[i].wdata == 'd11;

Use:

   (i==0) -> (items[i].apb_op_kind == APB_WR) && (items[i].addr == 'b01) && (items[i].wdata == 'd11);
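An equivalent and arguably cleaner fix (a sketch in the spirit of her snippet, not the attendee’s actual code) is to let the implication guard a brace-delimited constraint set, inside which the semicolons are harmless:

[cpp]
constraint cst_xactn_kind {
  foreach (items[i]) {
    (i == 0) -> { items[i].apb_op_kind == APB_WR;
                  items[i].addr        == 'b01;
                  items[i].wdata       == 'd11; }
    (i == 1) -> { items[i].apb_op_kind == APB_WR;
                  items[i].addr        == 'b11;
                  items[i].wdata       == 'd12; }
  }
}
[/cpp]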

Moral of the debug session: you need to be careful while using implication constraints with more than a single variable :-)

Tuesday, July 7, 2009

Certificate course on SystemVerilog Assertions …Language + Lab + Mini-project

CVC is announcing a new session of its popular 2-day certificate course on SystemVerilog Assertions (ABV_SVA), covering SystemVerilog Assertions in depth. Broadly, it covers the following topics:

  • ABV Introduction
  • SystemVerilog Assertions (SVA)
  • Project – develop a real life Protocol IP (PIP) with SVA

Course contents: http://www.cvcblr.com/trng_profiles/CVC_LG_SVA_profile.pdf

Duration

Here is a detailed breakdown of the course with durations. Note that we have a “mini project” tightly embedded in the course that helps in mastering the topics learned so far. This is on top of the regular labs that are part of the training.

Topic                      Duration   Start     End
SystemVerilog Assertions   1.5 days   July 13   July 14
Mini Project II            0.5 day    July 14   July 14

Schedule:

July 13, 14 at Bangalore

To attend this class, confirm your registration by sending an email to training @ cvcblr.com

Ph: +91-9916176014, +91-80-42134156

Please include the following details in your email:

Name:

Company Name:

Contact Email ID:

Contact Number:

Saturday, March 14, 2009

CCD, My read of Certess technology and positioning

 

With due respect to the technology behind Certess’s tool, I have some discomfort with the way it is being positioned – at least in the article below:

http://www.edadesignline.com/howto/215600203;jsessionid=TP12OA3IF1X3UQSNDLOSKHSCJUNN2JVN?pgno=2

Before I talk about my discomfort, let me state the positives: not very often do we get to read such a well-written, all-encompassing technical article. Kudos to Mark Hampton – he touches on every aspect of functional verification, which is not so common in an EDA product “promotional” article, which is what this article may unfortunately be characterized as (IMHO). Having said that, I personally believe Certess should position the technology “alongside” existing ones rather than challenging/trying to replace time-tested, well-adopted techniques such as code coverage, functional coverage etc. Not that I differ from his views on the shortcomings of these technologies; rather, I go by what Pradip Thakcker said at DVM 08 (http://vlsi-india.org/vsi/activities/2008/dvm-blr-apr08/program.html):

“Code coverage and functional coverage are useful techniques with their own strengths and weaknesses. Rather than worrying about their weaknesses, focus on the positives and use them today.” – Pradip, during his “Holistic Verification: Myth or The Magic Bullet?” talk

I will be very glad if Certess focuses on their real strength of exposing the lack of checkers in a verification environment rather than trying to “eat” into the well-established market of code/functional coverage tools. Another rationale: both coverage and qualification are compute-intensive, and given the amount of EDA investment that has gone into stabilizing and optimizing these features, it would be irrational to try and replace them with “functional qualification” (no offense meant – I have great respect for Mark, given his excellent article and of course the product). With SpringSoft acquiring Certess, hopefully their customer base/reach increases, and that will throw up more success stories in the coming months/quarters. So good times ahead!

ITG, CCD & ACC - Emerging Verification technologies

Well, it is not the overly hyped *V here – such as CRV, CDV, ABV – we at CVC (www.noveldv.com) consider those yesterday’s ones, making room for next-generation ones such as:

  • ACC - Automatic Coverage Closure 
  • ITG - Intelligent Test Generation (such as Graph based)
  • CCD - Covered & Checked implies Done (such as Certess/SpringSoft)

Out of these, let me spend more time on the last two, as ACC has already been discussed for a while now (at least more than the other two).

ITG - Intelligent Test Generation (such as Graph based)

ITG - is still in its early days. Two tools seem to be addressing this as of today:

  1. inFact from Mentor is one big name.
  2. The other one that is very promising is Breker Systems, with a very high-profile team behind it. These folks know what they are talking about – their CTO Adnan holds 15 patents in test case generation and synthesis.

We at CVC are yet to get our hands dirty with these tools, but they are certainly worth watching indeed! From our early analysis, this technology will help more and more system-level tests be captured easily by raising the level of abstraction of testcase specification. This will be fun indeed!

CCD - Covered & Checked implies Done 

Coming to the other category, CCD (yet to find a better name) – this is a topic that has been haunting us for at least a decade now. Ever since I started using functional coverage (early 2000s), we have had this problem of “I got it covered, but did I get it checked too?”. During an Ethernet monster switch/router verification at Intel we hit this problem at least half a dozen times, and those corridor discussions still ring in my ears. The design (read: RTL) manager (Sutapa Chandra) made fun of us, asking “are we taping out RTL or testbench?”, as we seemed to be finding missing checkers every now and then. Most of these situations were cases of bugs going undetected at block/cluster level and later getting caught (luckily) at full-chip level – then we would do a rigorous review of our block-level env and find that we indeed had coverage points for those scenarios, just not enough checkers! Shame, but true. A technology such as Certess’s testbench qualification was exactly what was needed! A very detailed read of the Certess technology is at: http://www.edadesignline.com/howto/215600203;jsessionid=TP12OA3IF1X3UQSNDLOSKHSCJUNN2JVN?pgno=2

Friday, March 6, 2009

OVM rule checker for free!

A process/methodology is only as good as its adoption/compliance. And in verification, since there are so many different ways of “getting it done”, it is hard to stay in sync across teams/groups/projects/companies etc. This is where methodology comes into play (be it VMM, OVM, eRM etc.). But then, how can I be sure that I follow all the guidelines?

 

Consider this – we got an email from a user as recently as last week about our VMM book code (www.systemverilog.us), pointing out a missing “is_valid” method in a data model. The book was published back in 2006, I believe. That just goes to show that a “compliance checker” is a very handy tool for creating highly reusable verification code. But who will check for compliance?

 

Thankfully, here is an answer for OVM and that too for no cost!

http://www.veriez.com/dvcon_2009_pr.htm This is indeed an interesting development and a welcome one during tight economic times. Make sure you download and use it!

 

Cheers,

Ajeetha www.noveldv.com

Wednesday, February 18, 2009

Automatic Coverage Closure – my perspective

Recently, EDA tools have been emerging in the area of “Automatic Coverage Closure” that promise a new level of automation in the CDV/MDV/any_other_buzzword_Driven_Verification process. A significant name in this arena is nuSym, a relatively new EDA player. There have been a few good reviews about them @ Deepchip.com:

http://deepchip.com/items/0479-05.html

http://deepchip.com/items/0473-06.html

http://deepchip.com/items/dvcon07-06.html

And another one @ SiliconIndia:

http://www.siliconindia.com/magazine/articledesc.php?articleid=DPEW289996212

 

And very recently on VerifGuild:

http://www.verificationguild.com/modules.php?name=Forums&file=viewtopic&t=3102

I like Gopi’s post/comment because I have the same opinion about CRV (Constrained Random Verification) – it catches scenarios/bugs that you didn’t envision, either via constraints or coverage (or otherwise). Now if we fool ourselves by chasing “only the existing/identified coverage holes”, we fall into a trap. This is in line with what Sundaresan Kumbakonam of BRCM (need his profile? See: http://vlsi-india.org/vsi/activities/dvw05_blr/index.html) once shared with me:

 

Quoting Sundaresan:

 I don’t believe much in the idea of “writing functional coverage model” and then tweaking a constraint here-or-there, or writing a “directed test” for it to fill the hole.

Coming back to my view, I believe some redundancy via randomness/CRV is actually good. In my past verification cycles I have seen design errors exposed by “repeated patterns” – no big deal, is it?

So where exactly do these ACC tools fit?

Referring back to:

http://www.verificationguild.com/modules.php?name=Forums&file=viewtopic&t=3102

>> whether these tools are only used to reach last few % of coverage goal which is hard to reach ?

I would differ here; they should be very useful somewhere during the middle phase – neither too early, nor too late. Too early – perhaps we don’t have the full RTL and/or the functional coverage model. Too late – perhaps our focus should be more on “checking” than on coverage alone (as Nagesh pointed out on VerifGuild). I would add that during those last minutes, coverage should be taken for “granted” – meaning it is a *must*, not a *nice to have* – and the focus should be on looking for any failures.

To me a reasonable flow with these ACC tools would be:

  • Run with CRV, measure code coverage. Add checkers
  • Add functional coverage, and use CRV again to hit it, treating the coverage points as “potential trouble spots” rather than “actual scenarios” themselves. In the few cases where the scenario description is easy to capture using functional coverage syntax, this is great. IMHO the existing coverage syntax is a little too verbose and unusable to a large extent for solid, easy-to-use coverage specification; the SV syntax overhead of coverage is just too much for me (see the small covergroup sketch after this list). IEEE 1647 “e” fares slightly better, but that’s a different story altogether. I’m still on the lookout for a higher-level coverage specification language.. (matter for another blog post anyway).
  • Once the RTL and the coverage model are reasonably stable, use ACC regularly as a “sanity” test on every interim RTL release. I believe ACC has HUGE potential here – if we can optimize the tests needed to release interim RTL versions, we save quality time and enable faster turnaround.
  • Towards the end, enable “plain CRV” (without the ACC bias) and look for “trouble free regression for XX days”.
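As an illustration of the verbosity point in the second bullet above, here is a small, self-contained, hypothetical covergroup (all names are ours) – even a simple kind-vs-address cross needs a fair amount of boilerplate:

[cpp]
module cov_demo;
  typedef enum bit {APB_RD, APB_WR} apb_kind_e;

  bit        clk;
  apb_kind_e op_kind;
  bit [3:0]  addr;

  covergroup apb_cg @(posedge clk);
    cp_kind     : coverpoint op_kind;
    cp_addr     : coverpoint addr { bins low = {[0:7]}; bins high = {[8:15]}; }
    kind_x_addr : cross cp_kind, cp_addr;
  endgroup

  apb_cg cg = new();
endmodule
[/cpp]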

 

And while speaking to a friend of mine here a while back, he was dead against the idea of using these ACC tools merely for stimulus. He likes the idea of ACC if it can be used to fill:

  • Fill functional cov holes
  • Code coverage holes
  • Assertion cov misses/holes

A tough ask, but it looks like nuSym can handle that – at least based on the early reviews so far. Also, reading their whitepaper on “intelligent verification”, they do path tracing that enables them to systematically target code coverage without getting into the formal world – a cool idea indeed! Kudos to the nuSym folks (some of them my ex-colleagues, BTW).

And on applying these ACC tools for the poor, non-CRV/CDV folks – there is light at the end of the tunnel, if you read nuSym’s paper. We at CVC also have ideas on how to use this for highly configurable IP verification with plain-vanilla Verilog/task-based TBs. We need to prototype it before we can discuss it in detail, though.

 

Anyway, a good topic for an otherwise downturn mood.

 

More to follow.

Srini

P.S. Sorry for the “random” rambling, after all we are talking of “random verification” :-)

Wednesday, February 11, 2009

Certificate course on Functional Verification …basics to ASIC verification using SystemVerilog

CVC is about to launch a 10-day certificate course on Functional Verification covering SystemVerilog in depth. Broadly it covers the following topics:

  • Comprehensive introduction to Functional Verification (CFV)
  • SystemVerilog basics (SVB)
  • SystemVerilog Assertion (SVA)
  • Verification Using SystemVerilog (VSV)
  • Verification Methodology (VM)

Duration

Here is a detailed breakdown of the course with durations. Note that we have several “mini projects” tightly embedded in the course that help in mastering the topics learned so far. This is on top of the regular labs that are part of the training. The detailed breakup of topics and labs is covered in the next sections of this proposal.

Topic                                   Duration
Comprehensive Functional Verification   1.5 days
Mini Project I                          0.5 day
SystemVerilog Basics                    0.5 day
SystemVerilog Assertions                1.5 days
Mini Project II                         0.5 day
Verification using SystemVerilog        2.0 days
Mini Project III                        0.5 day
Verification Methodology                2.0 days
Project IV                              1.0 day

Schedule

Tentative: Feb 09-Mar09

Contact

Send an email to: cvc.training@gmail.com and/or training@noveldv.com for more details, cost etc. Or call us at: +91-9916176014

Monday, February 9, 2009

When will SV Interface be really useful?

The idea of adding the interface construct to the SV language has proven to be a short-sighted one, with all its ugly ramifications on the RTL side (refer to the good papers on this from Jonathan @Doulos if you need proof). Add to it the fact that not all RTL synthesis (+ FPGA) tools, linters and equivalence checkers fully support it yet!

At least on the verification front it has been proving good. However, a significant drawback has been the lack of proper debug support for it. I wish EDA took debug seriously. It affects productivity so much that any language-level gain we get is nullified by weak or missing support for these new constructs.

For instance, look at an SV interface as a simple "wire bundle" – do we have debuggers that handle it at that level of abstraction?
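To be concrete, here is a minimal sketch of what such a "wire bundle" looks like (an illustrative APB-ish interface; all names are ours) – the debug question is whether the tool can show u_apb as one object rather than six loose nets:

[cpp]
interface apb_if (input logic pclk);
  logic        psel, penable, pwrite;
  logic [31:0] paddr, pwdata, prdata;
endinterface

module top;
  logic pclk;
  apb_if u_apb (pclk);   // one handle bundling all the APB wires
endmodule
[/cpp]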

Luckily Verdi seems to be doing it (leading the way as ever before), see:

http://newsletter.springsoft.com/.docs/pg/10768

Happy debugging!

Ajeetha, CVC
www.noveldv.com

Thursday, February 5, 2009

Excellent case study on automatic Coverage closure – nuSym & QCOM

An absolute *must read* for all those CDV/CRV fans (FWIW: CDV – Coverage Driven Verification, CRV – Constrained Random Verification):

 

http://www.deepchip.com/items/0479-05.html

 

A live case study from Jim @QCOM. It has good details about the setup, the work done and the results. Looks like nuSym does deliver on the kind of promises/claims it makes – good going indeed! Based on the last 2 such results (both @deepchip.com) I have a few observations:

 

1. Both of them were using Vera rather than SystemVerilog. While the technology should be language-independent, it will be good to get an SV case study out as well.

2. I’m not very clear on why and how nuSym can replace a core “simulator” – there is just a lot more to a “simulator” than “coverage closure/intelligence” – what about debug, stability, memory footprint, gate-level/ASIC sign-off, dumping, Debussy-like integration etc.? I fully appreciate the smartness in random generation – it is time EDA folks did that in the so-called modern “Verification platforms”. But I fail to see how a point tool like nuSym can “replace” a simulator; instead it should augment it and bring the bigwig EDA vendors’ pricing down to a reasonable bargain :-)

 

More notes as we read/re-read that article.

 

Anyway thanks Jim for sharing those wonderful details!

 

Cheers

Srini

CTO @CVC www.noveldv.com

Tuesday, February 3, 2009

SVA challenges in creating, debugging and verifying assertions

During our SVA class today there were frequent questions/comments on:

 

  • How to write assertions easily without becoming a language guru?
  • How to ensure that the assertions we write are correct to start with? (Not syntax wise, rather functionally)
  • How to visualize assertions/attempts/threads easily?
  • Can we create assertions automatically from a timing diagram/dump file?
  • How to debug assertions? What sort of automation is available?
  • Given a dump file, can we explore a set of assertions about that design (without having to rerun the simulations)?
  • Can we verify assertions in isolation – i.e. even before RTL and/or TB is ready?

Sure, a boatload of questions – some answers are available, some are not as of today. CVC will address some of these in a seminar in one of the coming weeks. (Stay tuned to this site for that news.)

Here are some answers:

Q: How to write assertions easily without becoming a language guru?

  • Leverage assertion libraries such as OVL, the VMM SVA library, QVL, IAL etc.

Q: How to ensure that the assertions we write are correct to start with? (Not syntax wise, rather functionally)

  • Not an easy thing, but again use pre-verified assertion lib elements (see prev Q)

Q: How to visualize assertions/attempts/threads easily?

  • Know your tools: SpringSoft (formerly Novas) has a great product called Verdi that can present a “Temporal Flow View” and a thread view. It is so amazing and intuitive that you will stay hooked on it for a long, long time to come, really speaking. The idea of temporal annotation is not new – we spoke about it in our PSL book; see below a snapshot:

 

[Snapshot from our PSL book: temporal annotation of signal values along an assertion evaluation]

 

The core idea is to annotate signal values at the “appropriate time”, not just at the current time (the latter is what most debuggers do, except for Verdi AFAIK).

 

Consider a code like:

 

sequence s1;

  a ##1 b ##2 c;

endsequence

mul_attempts : assert property (@(posedge clk) start |=> s1);

 

Below is a screenshot of how Verdi can display it.

 

[Screenshot: Verdi’s Temporal Flow View of the failing attempt, with per-signal time annotations]

 

2 key/novel ideas here to appreciate:

 

1. The threads are nicely displayed as “horizontal lines” – this is exactly how the PPT in our training explains threads, BTW!

2. The temporal annotation of the sequence/property with values & time stamps. For the failure at 350 ns (assuming a 20 ns clock period), it shows the values of (a ##1 b ##2 c):

    • “c” @350 ns
    • “b” @310 ns
    • “a” @290 ns
    • “start” @270 ns

This is simply superb, hats off to Verdi!

Q: How to debug assertions? What sort of automation is available?

  • Refer to prev Q, almost every vendor provides some automation.

Q: Can we create assertions automatically from a timing diagram/dump file?

 

Q: Given a dump file, can we explore a set of assertions about that design (without having to rerun the simulations)?

  • VCS can do this
  • Springsoft/Verdi can do this
  • Veritools can do this too!

Q: Can we verify assertions in isolation – i.e. even before RTL and/or TB is ready?

  • Strictly speaking this is an ideal job for formal verification tools. We believe some tools such as Magellan (Synopsys) already do this. We will update more here – stay tuned (for the 3rd time in this post…)

Sunday, February 1, 2009

Week long fest on Verification Using SystemVerilog - Bangalore

 

Quick facts
When: Feb 2 to Feb 6 2009

Cost: Rs. 5000 /- per day

Contact: cvc.training @ gmail.com, +91-9916176014, +91-80-42134156

What’s SystemVerilog?
IEEE 1800 SystemVerilog is the de-facto language for digital system verification (and design). Almost every ASIC team is either using it or planning to use it in the next project! It is a major extension to Verilog-2001, adding significant new features for verification and design. These range from simple enhancements to existing constructs and new language constructs, to the inclusion of complete Object-Oriented programming features.

What’s a week long fest?
A week-long fest on SystemVerilog for Verification is aimed at introducing SystemVerilog in its full capacity, covering basics, assertions, testbench features and ending with methodology. By the end of this fest the essential features of SystemVerilog will have been covered, enabling you to develop complex testbenches using advanced techniques such as OOP, Constrained Random Verification and Coverage Driven Verification. It is aimed at novice SV users, and hence the language is taught with a target DUT in the picture. The goal is to put SV to use for real-life verification rather than to dissect the nitty-gritty of the language from a semantics/gotchas perspective.

Who should attend?
Practicing design and verification engineers with tight project schedules are ideal attendees. DV managers will equally find it useful, as they can grasp the scope of SV. Though it is strongly recommended to attend the whole 5-day fest, some may choose just the assertions/testbench/methodology portions and be present on those days accordingly. Please call us for more details.

Tools used

  • Questa/Modelsim (Mentor)
  • VCS (Synopsys)
  • Riviera (Aldec) - optional

What’s the cost?
The basic cost of this course is Rs. 5,000 /- + ST (12.36 %) per day per attendee.

Terms & Conditions
· In general we require that the fee is paid in 100% prior to the start of the training.
· For large corporates with a larger number of attendees, to account for their internal processes, we do allow an exception to the above rule; however we charge an additional 25% of the training cost per attendee in such cases. In case the fee is paid after the training, the payment should be made within 1 week after the training is delivered. Any additional delay shall be charged at 10% per additional day.
· Any "offer" price mentioned in the course announcement is applicable only for individual attendees and not for corporate.

Cancellation Policy
Course tuition is fully refundable up to one week before the class starts. Cancellations within a week (2-7 days) of the class start date will incur a 50% cancellation fee. Those who cancel fewer than 2 days prior to the class will be billed for the full amount of the tuition. A no-show will be treated as cancellation and no refund shall be given. For genuine cases of absence, we can provide a training token that the trainee can avail in one of the future training classes subject to space availability.

How do I register for a class?
To attend this class, confirm your registration by sending an email to cvc.training @ gmail.com. +91-9916176014, +91-80-42134156

Please include the following details in your email:
Name:

Company Name:

Official Email ID:

Contact Number:

Trainer Profile

Srinivasan Venkataramanan, CTO

http://www.linkedin.com/in/svenka3

  • Over 12 years of experience in VLSI Design & Verification
  • Co-authored leading books in the Verification domain.
  • Worked at Philips, Intel, Synopsys in various capacities.
  • Presented papers, tutorials in various conferences, publications and avenues.
  • Conducted workshops and trainings on PSL, SVA, SV, VMM, E, ABV, CDV and OOP for Verification
  • Holds M.Tech in VLSI Design from prestigious IIT, Delhi.

Ajeetha Kumari, CEO & MD
· Has 8+ years of experience in Verification
· Co-authored leading books in the Verification domain.
· Presented papers, tutorials in various conferences, publications and avenues.
· Worked with all leading-edge simulators and formal verification (Model Checking) tools.
· Conducted workshops and trainings on PSL, SVA, SV, VMM, E, ABV, CDV and OOP for Verification
· Holds M.S.E.E. from prestigious IIT, Madras.