Author Topic: Do we see the end of the SI era?  (Read 2463 times)


Offline max-bitTopic starter

  • Frequent Contributor
  • **
  • Posts: 675
  • Country: pl
Do we see the end of the SI era?
« on: May 03, 2017, 05:17:20 pm »
Over the last two or three years we have been watching the slow breakdown of Moore's law.
This is evidenced by the computing power of supercomputers, whose rate of increase has slowed.
Intel has big problems moving to 10 nm (pushing that technology into 2018). In addition, Intel canceled this year's IDF conference, where every year it introduced new solutions... Of course, Intel has already forgotten its Tick-Tock :) and now it's tick, tock, tock, tock?
Where is the boundary? Almost certainly <5 nm will no longer be possible. G. Moore himself mentioned the end of the silicon era back in 2006.


 

Offline Kleinstein

  • Super Contributor
  • ***
  • Posts: 14727
  • Country: de
Re: Do we see the end of the SI era?
« Reply #1 on: May 03, 2017, 06:18:57 pm »
For computers there is no real alternative to silicon. There are transistors based on GaN and similar materials that can be faster, but these tend to be much larger and less reliable - more suited to a small circuit for a fiber-optic I/O part or a single RF power amplifier than to large-scale logic or memory. An interesting question is what comes when flash memory hits its limits; that limit is closer than the one for pure logic transistors. Flash-based SSDs made a big difference for everyday computers, but here the technological limits are in sight.

So Moore's law may finally stop working and progress will likely be slower, but most computers will still be silicon-based CMOS. Anyway, for the last decade or so CPU designers have been a little behind on what to do with all the transistors, except for ever more cache. So even if transistors don't get much smaller and faster anymore, there can still be advances in logic design and more powerful CPUs. We already see GPUs being more efficient at number crunching than classical CPUs (like x86).

Besides the limit on getting ever smaller, there is another area where things tend to change: in past decades, and especially in the early years, the useful size of chips also increased, so more and more could be implemented on one chip, which in turn made things faster. At the current size, however, there is no longer such a pronounced optimum chip size, so there is no longer a large advantage in making the chip area bigger.

Personally I am a little skeptical about quantum computing. It might work for a few niches, but not for general-purpose tasks. It is a little like analog computing - putting more data in a single gate requires high resolution, which makes things slow, and there is a limit to how many binary bits one can get out of one qubit.
 

Offline JoeN

  • Frequent Contributor
  • **
  • Posts: 991
  • Country: us
  • We Buy Trannies By The Truckload
Re: Do we see the end of the SI era?
« Reply #2 on: May 03, 2017, 06:21:29 pm »
I think the failure of the United States to adopt the metric system will not cause the end of the SI era.  It has been adopted by most countries and is the only system used in the sciences.
Have You Been Triggered Today?
 
The following users thanked this post: kony, evb149, tooki, Moe, abraxa

Offline rdl

  • Super Contributor
  • ***
  • Posts: 3667
  • Country: us
Re: Do we see the end of the SI era?
« Reply #3 on: May 03, 2017, 06:38:37 pm »
I was going to say that Sports Illustrated will be around as long as they continue to publish the annual swimsuit issue. Then I actually read the first post...
 
The following users thanked this post: tooki

Offline max-bitTopic starter

  • Frequent Contributor
  • **
  • Posts: 675
  • Country: pl
Re: Do we see the end of the SI era?
« Reply #4 on: May 03, 2017, 07:09:46 pm »
1. Quantum computing - real fiction. In general, physical laws forbid the existence of quantum computers.
2. Massively parallel computing - cool, but at what cost?
3. AI... hmmm.
4. Better fabrication technologies - of course, but 5 nm is the limit.
5. 3D - good for memory (although there are limits on the number of layers), but not for CPUs; they would cook :)

I first heard about gallium arsenide chips in the 1980s; unfortunately the materials are expensive, so they are mostly used in the military. Unless someone can afford a $5000 processor.

There is also graphene technology, but here too nobody has been able to build much with it yet.

 

Offline Messtechniker

  • Frequent Contributor
  • **
  • Posts: 817
  • Country: de
  • Old analog audio hand - No voodoo.
Re: Do we see the end of the SI era?
« Reply #5 on: May 03, 2017, 07:28:12 pm »
More hand-optimized machine-code snippets would certainly prolong the life of
current silicon-chip computers tremendously.
Yes, I know this is difficult in practice.
Agilent 34465A, Siglent SDG 2042X, Hameg HMO1022, R&S HMC 8043, Peaktech 2025A, Voltcraft VC 940, M-Audio Audiophile 192, R&S Psophometer UPGR, 3 Transistor Testers, DL4JAL Transistor Curve Tracer, UT622E LCR meter, UT216C AC/DC Clamp Meter
 

Offline eugenenine

  • Frequent Contributor
  • **
  • Posts: 865
  • Country: us
Re: Do we see the end of the SI era?
« Reply #6 on: May 03, 2017, 08:07:38 pm »
Ohh, you mean Si, not SI?  I thought you meant that pesky metric system was finally going away :)
 
The following users thanked this post: tooki

