Author Topic: How does industry design digital and analog ICs?  (Read 6788 times)


Offline JoeNTopic starter

  • Frequent Contributor
  • **
  • Posts: 991
  • Country: us
  • We Buy Trannies By The Truckload
How does industry design digital and analog ICs?
« on: August 13, 2015, 05:17:28 am »
I am told that there is standard software for designing digital ICs that is somewhat like FPGA software: it just renders the HDL into output that can be mapped onto "standard cell" ICs for mass production.  ASICs are made this way.  Are all digital ICs made this way, from ARM cores through Intel and Nvidia behemoth chips, or are some digital ICs designed with processes other than "standard cell"?  What processes and software are used?  Anything you can suggest I read on this?

Now, how are analog ICs designed?  Is there software for that?  What I have been able to read makes me think it is simply voodoo.  How does Analog Devices design something like the AD5791 20-bit 1 ppm DAC, their new AD7177 32-bit ADC, or their AD9219 5.7 GHz high-speed DAC?  Is it just iteration after iteration, working out the kinks until it is right, or is there actually some rigorous computer-assisted design approach that can help design these sorts of circuits?

Is RF analog the same deal, or is that special in and of itself?  How is that done?  How does Hittite design 60+ GHz RF parts?  I bet even the people who can help me understand analog design may say this is actually voodoo.

I'm not going into IC design, but it would be interesting to know how these companies approach solving these, to me, incredibly difficult problems.  Heck, I would like to know how they approach solving the design of lowly regulators and op-amps as well.
« Last Edit: August 13, 2015, 05:22:15 am by JoeN »
Have You Been Triggered Today?
 

Offline tec5c

  • Frequent Contributor
  • **
  • Posts: 423
  • Country: au
Re: How does industry design digital and analog ICs?
« Reply #1 on: August 13, 2015, 03:04:38 pm »
This would make a great episode for The Signal Path. We just need to convince Shahriar to do it   :D
 

Offline rfeecs

  • Frequent Contributor
  • **
  • Posts: 807
  • Country: us
Re: How does industry design digital and analog ICs?
« Reply #2 on: August 13, 2015, 05:10:49 pm »
These days an engineering run to fabricate an IC including the mask set costs hundreds of thousands of dollars.  So yes, everything is simulated before taping out a design.

Pretty much the industry standard software package is from Cadence: https://www.cadence.com/en/default.aspx

For that 60GHz design, the most popular package is Keysight ADS:  http://www.keysight.com/en/pc-1297113/advanced-design-system-ads?cc=US&lc=eng

At those frequencies you pretty much have to do a full EM (electromagnetic) simulation that models the EM fields and waves for your given geometry.  That is built into the simulator as well.  The layout is done by hand, because every transmission line bend and junction has an effect at microwave / millimeter-wave frequencies.
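To get a feel for why every bend matters at these frequencies, here is a rough back-of-envelope sketch in Python. The effective permittivity of ~7 is an assumed, plausible value for microstrip on GaAs, not a measured one; the point is only the order of magnitude:

```python
# Rough guided-wavelength estimate at millimeter-wave frequencies.
# Illustrative only: assumes a quasi-TEM line with an effective relative
# permittivity of ~7 (a plausible figure for microstrip on GaAs).
import math

C0 = 299_792_458.0  # speed of light in vacuum, m/s

def guided_wavelength_mm(f_hz, eps_eff=7.0):
    """Wavelength on the line, in millimetres."""
    return C0 / (f_hz * math.sqrt(eps_eff)) * 1e3

lam = guided_wavelength_mm(60e9)  # ~1.9 mm at 60 GHz
quarter_wave = lam / 4            # ~0.47 mm
```

A quarter-wavelength of under half a millimetre is comparable to ordinary layout features, which is why bends and junctions cannot be ignored and the EM solver earns its keep.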

Unfortunately, device models and simulators aren't perfect, so more than one iteration is often required, especially for the more challenging designs.  So there is still some trial and error.
 

Offline free_electron

  • Super Contributor
  • ***
  • Posts: 8521
  • Country: us
    • SiliconValleyGarage
Re: How does industry design digital and analog ICs?
« Reply #3 on: August 13, 2015, 06:19:02 pm »
Mask sets are in the million-dollar range now ...

The IC world is very tightly knit, with tools from Cadence, Mentor, and Magma.

Digital is almost all Synopsys synthesis pushed into place-and-route tools from Cadence and/or Mentor.
Analog design is almost exclusively done in ELDO, which is a real simulator that hits 99.99% accuracy.
These tools interact and extract lots of data from the layout, like parasitics, that are not available to PCB-level designs.



Professional Electron Wrangler.
Any comments, or points of view expressed, are my own and not endorsed , induced or compensated by my employer(s).
 

Offline rfeecs

  • Frequent Contributor
  • **
  • Posts: 807
  • Country: us
Re: How does industry design digital and analog ICs?
« Reply #4 on: August 13, 2015, 10:10:45 pm »
Mask sets are in the million-dollar range now ...

Depends on the process:

For analog and most RF you can use a cheaper process.  Surprisingly, for microwave stuff the mask set for most GaAs processes is less than $50k, because the process is simpler and only requires about 12 layers.
 

Offline rfbroadband

  • Supporter
  • ****
  • Posts: 186
  • Country: us
Re: How does industry design digital and analog ICs?
« Reply #5 on: August 16, 2015, 12:09:00 am »
Well, first you need a "good" EDA setup.  Cadence, Synopsys, and Mentor effectively have a monopoly.  They have all agreed that one can no longer purchase perpetual licenses for their tools; a company can only "lease" a tool for a certain time frame.  If you are talking about a highly integrated SoC on 45 nm or smaller, you can easily spend millions of dollars in CAD lease fees per year, depending on the size of the team.  As an example, a DRC/LVS license can easily cost several hundred thousand dollars a year depending on the technology.

I can't resist pointing out that CAD vendors (of course) take advantage of this.  You have a quad-core processor in your server and you want to use all the cores for your simulation... good luck.  The default license allows only 1 or 2 cores.  Want to use the other cores as well to speed up the simulation?  No problem, just spend another $50k/year (and don't forget to multiply this by the number of designers in your team, because everyone has to have this capability...)

From a design point of view, you have to simulate more or less everything.  The cost of a mask set (even a shuttle run) is so high that you have no choice but to simulate every parameter that is important to you.  This also means you have to learn to test the PDK models that the foundry provides.  Some larger companies have their own modeling teams because they don't trust the foundry models.  Compare this to PCB design, where you can spin a board in a few days for a few hundred or thousand dollars: in chip design you may pay $1M for a mask set after a team of 10 people has worked for 9 months on a highly integrated chip, and then wait 2-3 months for the chip to be in the lab for evaluation.
The goal is the same for everyone: time to market!  For PCB design it may not always be worth creating a very complicated model that takes two weeks to develop when you can figure things out by experimenting on the board.  Chip design?  No choice, period.  If you don't have a model for something, you need to create one, and you had better figure out a way to validate that what you simulate makes sense... otherwise you wait a few months again to see whether your fix is working.

Chip design tools offer very nice parasitic extraction, where you can find every fF of parasitic capacitance that may hurt you, coupling between layout traces, etc.  At higher frequencies you start to extract the parasitic inductance of a metal trace as well... the list of things to consider is very long.
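As a toy illustration of the femtofarad-scale numbers involved, here is a parallel-plate estimate of a trace's capacitance to substrate and its RC product. The dimensions and sheet resistance are made-up but plausible values, not any foundry's PDK data, and fringing fields (which a real extractor accounts for) are ignored:

```python
# Back-of-envelope "parasitic extraction": parallel-plate capacitance of a
# metal trace over oxide, plus its RC product.  All numbers are
# illustrative assumptions, not real process data.
EPS0 = 8.854e-12     # vacuum permittivity, F/m
EPS_R_SIO2 = 3.9     # relative permittivity of SiO2

def plate_cap_fF(length_um, width_um, t_ox_um):
    """Parallel-plate C in femtofarads (ignores fringing, which can add 50%+)."""
    area_m2 = (length_um * 1e-6) * (width_um * 1e-6)
    return EPS0 * EPS_R_SIO2 * area_m2 / (t_ox_um * 1e-6) * 1e15

# A 100 um x 0.5 um trace over 1 um of oxide:
c_fF = plate_cap_fF(100, 0.5, 1.0)    # a couple of femtofarads
# With an assumed sheet resistance of 0.05 ohm/sq, the trace is 200 squares:
r_ohm = 0.05 * (100 / 0.5)            # 10 ohms
tau_ps = r_ohm * c_fF * 1e-15 * 1e12  # RC in picoseconds -- tiny for one
                                      # trace, but it adds up across a net
```

A single short trace contributes almost nothing, which is exactly why the extractor has to sum thousands of such contributions per net before the delay numbers mean anything.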

For RF design, companies do EM simulation as well.  A good example is on-chip inductors: how close can we move the ground metal to an inductor before its performance is impacted?  What happens if we have multiple inductors; how far apart do we space them in the layout?  How do the vias in inductors degrade the SRF and the Q?
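To make the SRF and Q trade-off concrete, here is a sketch using a very simple inductor model (series R, shunt C). The 1 nH / 50 fF / 3 ohm values are invented but typical in magnitude; a real design would get these from EM simulation, not formulas:

```python
# Why parasitic capacitance matters for on-chip inductors: the self-resonant
# frequency (SRF) and low-frequency Q of a simple model.  Component values
# are assumptions chosen only to be plausible in magnitude.
import math

def srf_ghz(l_nh, c_ff):
    """SRF = 1/(2*pi*sqrt(L*C)), returned in GHz."""
    return 1.0 / (2 * math.pi * math.sqrt(l_nh * 1e-9 * c_ff * 1e-15)) / 1e9

def q_factor(f_ghz, l_nh, r_ohm):
    """Series-model Q = omega*L/R (valid well below the SRF)."""
    return 2 * math.pi * f_ghz * 1e9 * l_nh * 1e-9 / r_ohm

# A 1 nH spiral with 50 fF of winding/substrate capacitance:
f_res = srf_ghz(1.0, 50.0)      # ~22.5 GHz
q_5g = q_factor(5.0, 1.0, 3.0)  # ~10 at 5 GHz with 3 ohms of series R
# Extra vias or nearby ground metal raise C, pulling the SRF down --
# exactly the effect the EM simulation is there to quantify.
```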

Packaging:
At RF you may use 3D EM simulators and create 3D models of the bondwires that connect your die to the package pins.  At a minimum, you estimate the length of a bondwire and the capacitive and inductive coupling between pins...

ESD: you have to create models for your ESD structures.  How large do the ESD diodes need to be to guarantee HBM ESD specs?  What do you do if the parasitic capacitance of an ESD structure kills the performance of the RF input of your circuit?
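For scale, the Human Body Model (HBM) is standardly represented as a 100 pF capacitor discharging through 1.5 kohm into the pin. A quick sketch of the peak current and decay time an ESD clamp has to survive:

```python
# HBM discharge network: 100 pF through 1.5 kohm (the standard HBM values).
C_HBM = 100e-12  # farads
R_HBM = 1500.0   # ohms

def hbm_peak_current_a(v_volts):
    """Ideal peak current of the HBM discharge, I = V/R."""
    return v_volts / R_HBM

def hbm_tau_ns():
    """RC decay time constant, in nanoseconds."""
    return R_HBM * C_HBM * 1e9

i_peak = hbm_peak_current_a(2000.0)  # ~1.3 A for a 2 kV HBM zap
tau = hbm_tau_ns()                   # 150 ns
# Sizing the ESD diodes to carry ~1.3 A sets their area, and hence the
# parasitic capacitance the RF designer then has to absorb into the match.
```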

Marketing:
That is often overlooked.  For highly integrated ICs you had better make sure you really understand what the customer or the market needs... you don't want to spend a year and millions of dollars of R&D just to have your customer tell you: sorry, nope, too late, I moved on to the next gen.  Or worse: the product works, the customer likes it, but the price in the market fell drastically in the meantime and you can't make any money selling the chip...

In general, nothing is black magic. Even 60GHz RF or highly integrated digital SOCs.

Having said that, I still hope PCB designers will start using simulation tools on a regular basis and try to design circuits like chip designers do, because you can learn lots of things even if the simulation is not perfect.

For a hobbyist most tools are cost-prohibitive, but you can learn a LOT using free tools like LTspice and others.



 

Offline JoeNTopic starter

  • Frequent Contributor
  • **
  • Posts: 991
  • Country: us
  • We Buy Trannies By The Truckload
Re: How does industry design digital and analog ICs?
« Reply #6 on: August 16, 2015, 12:25:40 am »
That was a great explanation and I thank you and everyone else who replied.  Another question, probably an easy one compared to the generality of the original question: how do people learn these hyper-complex EDA tools?  Is that something taught as electives for BS EE or MS EE degrees, perhaps just at certain universities, or is it taught only by the EDA tool manufacturers themselves via documentation and professional training?  Beyond design and simulation, how are user-defined models made?  Is that done by the EE or by a scientist with a solid-state physics background?  If not voodoo, it seems like very specialized work for very intelligent and educated people.
Have You Been Triggered Today?
 

Offline rfbroadband

  • Supporter
  • ****
  • Posts: 186
  • Country: us
Re: How does industry design digital and analog ICs?
« Reply #7 on: August 16, 2015, 12:40:36 am »
The use of these tools is taught at universities when you study IC design for a BS, MS, or PhD.  The EDA vendors provide these tools to universities for free:
a) the industry requests that graduates have the skills to use these tools when they earn their degree
b) EDA vendors certainly have an interest in having lots of students familiar with their tools when they enter the industry...

One could write a book on how models are made, but especially in analog IC design, the better your understanding of physics, the better an IC designer you will become.  I may create a post on how models are created, since I spent a few years working in that area, if people would find it interesting.

If you as a hobbyist are interested in this topic, download the free LTspice and simulate every circuit you want to build; you will learn a ton of stuff.  You are not under time-to-market pressure, and even if you "waste" a few days messing around with models, the experience of simulating, building it, learning from it, and tweaking the simulation to get closer to what you measure is priceless.
 

Offline djacobow

  • Super Contributor
  • ***
  • Posts: 1160
  • Country: us
  • takin' it apart since the 70's
Re: How does industry design digital and analog ICs?
« Reply #8 on: August 16, 2015, 04:07:52 am »
Let me stand up for the EDA companies a little bit on their pricing.  For one, a perpetual license isn't really all that valuable: every year you're at a new technology node, and last year's tools rarely cut it.  Sure, perhaps for a digital simulator like NCsim you could get along with what you had, but for, say, parasitic extraction, or even physical-aware synthesis and layout... eh, not so much.

Also, it should be noted that the EDA companies are not exactly raking in megabucks.  It's actually a pretty crappy business.  You have tools that are absolutely essential to the semiconductor industry, with major ongoing engineering investment by teams of people who understand semiconductor design even better than the typical target users, and you are pulling in something like $100k/seat-year in licensing, less at the big customers.  A lot of those users are going to make many, many times that on chip sales.  It should also be noted that EDA vendors often provide special pricing to startups.

Keep in mind also that chip starts just keep going down.  The investment to build new tools is the same, maybe even greater than ever, but the number of users is diminishing.

It's not that you can't make money, it's just that it's not like they're running away with the bank.

[ disclosure: yeah, I worked in EDA as an FAE at Quickturn which became Cadence. ]

 

Offline c4757p

  • Super Contributor
  • ***
  • Posts: 7799
  • Country: us
  • adieu
Re: How does industry design digital and analog ICs?
« Reply #9 on: August 16, 2015, 04:11:45 am »
Agreed - the one thing that gets me about the pricing though is this:

I can't resist pointing out that CAD vendors (of course) take advantage of this.  You have a quad-core processor in your server and you want to use all the cores for your simulation... good luck.  The default license allows only 1 or 2 cores.  Want to use the other cores as well to speed up the simulation?  No problem, just spend another $50k/year (and don't forget to multiply this by the number of designers in your team, because everyone has to have this capability...)

With that many zeros on the price there is no excuse for things like locking out cores. That sort of bullshit ought to disappear the moment you leave the consumer grade of software :palm:

I'm amazed that people are willing to spend that much money and still receive artificially crippled software.
No longer active here - try the IRC channel if you just can't be without me :)
 

Offline djacobow

  • Super Contributor
  • ***
  • Posts: 1160
  • Country: us
  • takin' it apart since the 70's
Re: How does industry design digital and analog ICs?
« Reply #10 on: August 16, 2015, 04:18:14 am »
These days an engineering run to fabricate an IC including the mask set costs hundreds of thousands of dollars.  So yes, everything is simulated before taping out a design.

Pretty much the industry standard software package is from Cadence: https://www.cadence.com/en/default.aspx


I assume you're talking analog. For digital, I think Synopsys is really king, with Cadence second, and Mentor far behind. But the reality is that people mix and match for the best tools. Nobody has a suite that is the best at everything.
 

Offline rfbroadband

  • Supporter
  • ****
  • Posts: 186
  • Country: us
Re: How does industry design digital and analog ICs?
« Reply #11 on: August 16, 2015, 05:25:19 am »
A final comment on EDA tool pricing: I was involved for years in negotiating prices with EDA vendors for chip design tools, and while I understand that investment is needed to develop these tools, the EDA companies go crazy with prices and license keys.

Since this is a hobbyist forum I will provide a PCB design tool example.  Take Cadence Allegro, one of THE layout tools in the industry: imagine you design a 12-layer board, you are done, and you want to print each layer into a PDF file for documentation purposes.  Nope, disabled.  You need an additional license called Allegro Design Publisher (>$10k) to print individual layers into a PDF file.  Even my sales guy was embarrassed and just said to go download a free Gerber viewer and generate the PDF files for each layer automatically.  Seriously, $10k to print a layout view into a PDF file?

You go back to the IC design world and everything requires a separate/additional license key:
- want to determine temperature dependence in vias after you already paid $200k for the extraction license? Sure, for another $100k
- your office is on the west coast and you have a remote employee who wants to use your tool on the east coast? Sure, you need a different license which costs x% more (for the same tool)


The list is endless, and everybody does it: Synopsys, Cadence, and Mentor.

In regard to perpetual licenses: if you are a company like Maxim or Linear, you can design e.g. LDOs forever in mature technologies, and perpetual licenses make a lot of sense.

Anyway, this is how the game is played.

Positive mindset :-) : suddenly $5k for Altium does not sound that expensive anymore... I bet it generates PDF files as well.
 

Offline tec5c

  • Frequent Contributor
  • **
  • Posts: 423
  • Country: au
Re: How does industry design digital and analog ICs?
« Reply #12 on: August 17, 2015, 04:21:38 pm »
the use of these tools is taught at universities when you study IC design for BS, MS or PhD.

Anyone have experience with these programs from their BS degree? I tend to think it's beyond the scope of a BS degree and more aimed at MS.
 

Offline djacobow

  • Super Contributor
  • ***
  • Posts: 1160
  • Country: us
  • takin' it apart since the 70's
Re: How does industry design digital and analog ICs?
« Reply #13 on: August 18, 2015, 04:48:02 pm »

Anyone have experience with these programs from their BS degree? I tend to think it's beyond the scope of a BS degree and more aimed at MS.

Ancient history, but my school had the Mentor suite (anyone remember "Falcon Framework"?) when I was an undergrad in the early '90s, and we used it for some small simulation projects and one larger project that ultimately targeted FPGAs.  I remember the tools being incredibly clunky and slow, even on a sexy SPARC 10.

When I got my first job in industry, which was in VLSI design, my very big semicon employer literally laughed at my experience with MGC.  They had better tools: about as clunky, but without pointless/hopeless GUIs, and definitely faster.  Other companies I worked for used Cadence and Synopsys, and in the rest of my career I have never run an MGC tool for any reason.

I don't know if semiconductor companies hire newly minted BSEEs anymore, or what they expect of them.  The one I joined expected us to know the fundamentals and was more than willing to train on tools and methodologies.  My experience at school pretty much taught me that "tools exist."

If I were TEACHING undergraduates, I would teach them not how to run EDA tools, but how to export from them, write scripts to transform the data, and then import it into another tool!  That's actually a useful skill that will serve you your entire life.  There will always be incompatible file formats, and supposedly compatible formats with subtle incompatibilities, and the engineer who doesn't blink and can just write a script to deal with the problem is in a much better situation than the one who throws his hands up and complains to the EDA vendor.

 

Offline mathias

  • Regular Contributor
  • *
  • Posts: 59
  • Country: 00
Re: How does industry design digital and analog ICs?
« Reply #14 on: August 19, 2015, 06:12:06 pm »
Other companies I worked for used Cadence and Synopsys, and in the rest of my career, I have never run an MGC tool for any reason.
Isn't Calibre a tool from Mentor? That's still much used for DRC/LVS/etc.

Completely agree that scripting and data manipulation/extraction is an extremely useful SKILL (TM) ;) .
 

Offline coppice

  • Super Contributor
  • ***
  • Posts: 8966
  • Country: gb
Re: How does industry design digital and analog ICs?
« Reply #15 on: August 19, 2015, 06:16:03 pm »
I don't know if semiconductor companies hire newly minted BSEEs anymore, and what they expect of them.
Most semiconductor companies won't look at your CV these days if the only thing on it is a BSEE.
« Last Edit: August 19, 2015, 06:36:40 pm by coppice »
 

Offline djacobow

  • Super Contributor
  • ***
  • Posts: 1160
  • Country: us
  • takin' it apart since the 70's
Re: How does industry design digital and analog ICs?
« Reply #16 on: August 20, 2015, 09:11:00 pm »
Isn't Calibre a tool from Mentor? That's still much used for DRC/LVS/etc.

I never did much of my own artwork, so I didn't have much experience with LVS.  It was the mask designer's job to match the circuit he was given.  I imagine it must be much different for people working a bit closer to the metal, analog folks, etc.

I did design memories for a while, but even then it was an iterative thing between me and a mask designer to get the cells, columns, rows, row drivers, column drivers, and sense amps right.  He took care of the DRC/LVS.  There was some ERC built into our schematic tools.
 

