Author Topic: MCU with FPGA vs. SoC FPGA  (Read 30632 times)


Offline asmi

  • Super Contributor
  • ***
  • Posts: 2839
  • Country: ca
Re: MCU with FPGA vs. SoC FPGA
« Reply #50 on: July 12, 2023, 05:35:10 pm »
There is no single definition of fine grained parallelism, especially since parallelism can be defined and expressed at many many levels and in many many ways.
Of course there is. Fine grained parallelism is defined as I stated in the post above.

But don't ignore the wood and concentrate on one clump of trees. That isn't enlightening.
That difference is crucial to understanding the fundamental difference between FPGAs and microcontrollers.

For many people it is more enlightening to consider the relationship between a specification/algorithm, how it can be implemented in hardware and/or software, and the deep equivalence between hardware and software. More useful, too :)
There is absolutely NO equivalence between hardware and software; they differ in a very fundamental way, which is why stuff like XMOS is nothing but a crutch for software developers who can't do hardware. Anyone saying otherwise simply doesn't understand that difference, and that's why they embrace crutches like bit-banging, which is totally and fundamentally different from how things happen inside an FPGA (or any hardware for that matter).
As someone who came into hardware from the software world, I didn't find it easy to fully understand this fundamental difference, but like everything, understanding comes with practice.

Offline tggzzz

  • Super Contributor
  • ***
  • Posts: 20770
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: MCU with FPGA vs. SoC FPGA
« Reply #51 on: July 12, 2023, 08:54:47 pm »
But don't ignore the wood and concentrate on one clump of trees. That isn't enlightening.
That difference is crucial to understanding the fundamental difference between FPGAs and microcontrollers.

Of course there are differences; that's not the point.

The interesting point is the similarities in the specifications and designs, even if there are differences in the implementations.

Quote
For many people it is more enlightening to consider the relationship between a specification/algorithm, how it can be implemented in hardware and/or software, and the deep equivalence between hardware and software. More useful, too :)
There is absolutely NO equivalence between hardware and software; they differ in a very fundamental way, which is why stuff like XMOS is nothing but a crutch for software developers who can't do hardware. Anyone saying otherwise simply doesn't understand that difference, and that's why they embrace crutches like bit-banging, which is totally and fundamentally different from how things happen inside an FPGA (or any hardware for that matter).
As someone who came into hardware from the software world, I didn't find it easy to fully understand this fundamental difference, but like everything, understanding comes with practice.

I don't think you know as much as you think you know.

Please define your boundary between hardware and software. Start with something we all know and use: a system with an Intel x86 processor and some memory. Where does the software stop and the hardware start?

Background: some of the things I've designed and implemented include low-noise analogue electronics (including "DSP" circuits without ADCs/DACs :) ), semi-custom digital ICs, designing and implementing an application specific processor, through to life-critical software/hardware systems, cellular system RF modelling/instrumentation/measurement, and telecom server systems. I have a solid feel for where the boundaries aren't.
There are lies, damned lies, statistics - and ADC/DAC specs.
Glider pilot's aphorism: "there is no substitute for span". Retort: "There is a substitute: skill+imagination. But you can buy span".
Having fun doing more, with less
 

Offline nctnico

  • Super Contributor
  • ***
  • Posts: 28111
  • Country: nl
    • NCT Developments
Re: MCU with FPGA vs. SoC FPGA
« Reply #52 on: July 12, 2023, 09:16:37 pm »
I have to agree with tggzzz here. What is the difference between a programmable state machine working according to instructions read from a flash memory and a bunch of programmable logic elements + flipflops working according to programmable connections read from a flash memory?

The only real difference is that the programmable state machine (AKA CPU) offers a much more constrained interface and thus is easier to manage compared to a collection of logic that needs to be pieced together one by one. In the end only the level of abstraction is different. And more interestingly: a lot of effort has been put into 'defining' (for lack of a better word) programmable logic functions by using high level programming languages.
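To make that concrete, here is a minimal sketch of such a programmable state machine reading its behaviour from memory. The four-instruction "ISA" is made up purely for illustration, not any real processor:

Code: [Select]
/* A made-up four-instruction machine: the fixed circuit below (fetch/decode/execute)
   never changes; only the contents of the "flash" array do. */
#include <stdio.h>

enum { OP_LOAD, OP_ADD, OP_OUT, OP_HALT };

int main(void)
{
    /* the "flash memory": instructions that define what the machine does */
    const int program[] = { OP_LOAD, 5, OP_ADD, 3, OP_OUT, OP_HALT };
    int pc = 0, acc = 0;                 /* the state: program counter + accumulator */

    for (;;) {
        switch (program[pc++]) {         /* fetch + decode */
        case OP_LOAD: acc  = program[pc++]; break;
        case OP_ADD:  acc += program[pc++]; break;
        case OP_OUT:  printf("%d\n", acc);  break;
        case OP_HALT: return 0;
        }
    }
}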

Or more practically: if I have a bunch of software engineers and a problem that lends itself to being solved in software somehow (even if it comes down to bit-banging signals using parallel processors), I'd go for a software solution instead of trying to turn software engineers into programmable logic designers.
« Last Edit: July 12, 2023, 09:22:48 pm by nctnico »
There are small lies, big lies and then there is what is on the screen of your oscilloscope.
 

Offline tggzzz

  • Super Contributor
  • ***
  • Posts: 20770
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: MCU with FPGA vs. SoC FPGA
« Reply #53 on: July 12, 2023, 09:30:57 pm »
I have to agree with tggzzz here. What is the difference between a programmable state machine working according to instructions read from a flash memory and a bunch of programmable logic elements + flipflops working according to programmable connections read from a flash memory?

The only real difference is that the programmable state machine (AKA CPU) offers a much more constrained interface and thus is easier to manage compared to a collection of logic that needs to be pieced together one by one. In the end only the level of abstraction is different. And more interestingly: a lot of effort has been put into 'defining' (for lack of a better word) programmable logic functions by using high level programming languages.

Or more practically: if I have a bunch of software engineers and a problem that lends itself to being solved in software somehow, I'd go for a software solution instead of trying to turn software engineers into programmable logic designers.

Yup :) (Plus it is much deeper and more pervasive than that good example, but we'll see what ASMI has to say :) )

Hardware engineers can make the transition to software more easily than the reverse.

Parallel thinking is one example.

Another example is to grok the triumphant recent software concept of "Inversion of Control", and understand the hardware equivalent. When you finally work out the key concepts behind IoC and why they really are useful, you realise IoC is just a fancy name for creating components that can be included in a hierarchical composition. You know, like a memory chip and a counter/timer can be wired together on a PCB, multiple PCBs and a PSU can be wired together in a crate, etc etc.
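For the software-minded, a minimal C sketch of that idea; the component names and the wiring are made up for illustration, not any real IoC framework:

Code: [Select]
/* A "counter" component with an output event, and an "LED" component wired to
   it by a composition root - the software analogue of soldering two chips together. */
#include <stdio.h>

typedef struct {
    int count, period;
    void (*on_rollover)(void *ctx);   /* the output "pin" */
    void *ctx;
} Counter;

static void counter_tick(Counter *c)
{
    if (++c->count == c->period) {
        c->count = 0;
        if (c->on_rollover)
            c->on_rollover(c->ctx);   /* control is "inverted": the component calls out */
    }
}

static void led_toggle(void *ctx)
{
    int *state = ctx;
    *state ^= 1;
    printf("LED %s\n", *state ? "on" : "off");
}

int main(void)
{
    int led_state = 0;
    Counter c = { 0, 3, led_toggle, &led_state };   /* the "PCB": wiring done here */
    for (int i = 0; i < 9; i++)
        counter_tick(&c);
    return 0;
}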

Traditional software engineers are hung up thinking in terms of composition of algorithms and composition of data. Systems engineers, not so much.
« Last Edit: July 12, 2023, 09:35:44 pm by tggzzz »
There are lies, damned lies, statistics - and ADC/DAC specs.
Glider pilot's aphorism: "there is no substitute for span". Retort: "There is a substitute: skill+imagination. But you can buy span".
Having fun doing more, with less
 

Offline asmi

  • Super Contributor
  • ***
  • Posts: 2839
  • Country: ca
Re: MCU with FPGA vs. SoC FPGA
« Reply #54 on: July 12, 2023, 10:37:25 pm »
I don't think you know as much as you think you know.
Don't worry, I know more than enough.

Please define your boundary between hardware and software. Start with something we all know and use: a system with an Intel x86 processor and some memory. Where does the software stop and the hardware start?
Hardware works the way it does because of the way it's interconnected; software always uses the same circuit, so the hardware doesn't change from one operation to another.

Background: some of the things I've designed and implemented include low-noise analogue electronics (including "DSP" circuits without ADCs/DACs :) ), semi-custom digital ICs, designing and implementing an application specific processor, through to life-critical software/hardware systems, cellular system RF modelling/instrumentation/measurement, and telecom server systems. I have a solid feel for where the boundaries aren't.
And yet apparently you are still missing the crucial difference.

Offline asmi

  • Super Contributor
  • ***
  • Posts: 2839
  • Country: ca
Re: MCU with FPGA vs. SoC FPGA
« Reply #55 on: July 12, 2023, 10:38:14 pm »
I have to agree with tggzzz here. What is the difference between a programmable state machine working according to instructions read from a flash memory and a bunch of programmable logic elements + flipflops working according to programmable connections read from a flash memory?
The difference begins and ends with the circuit.

Offline nctnico

  • Super Contributor
  • ***
  • Posts: 28111
  • Country: nl
    • NCT Developments
Re: MCU with FPGA vs. SoC FPGA
« Reply #56 on: July 12, 2023, 10:47:45 pm »
I have to agree with tggzzz here. What is the difference between a programmable state machine working according to instructions read from a flash memory and a bunch of programmable logic elements + flipflops working according to programmable connections read from a flash memory?
The difference begins and ends with the circuit.
You can always try and see differences and then get totally worked up about how things are not equal. Try looking at it a different way: a screw and a nail are completely different and yet they perform the same function: keeping materials together. Which one is best to use is highly debatable in many cases.
There are small lies, big lies and then there is what is on the screen of your oscilloscope.
 

Offline asmi

  • Super Contributor
  • ***
  • Posts: 2839
  • Country: ca
Re: MCU with FPGA vs. SoC FPGA
« Reply #57 on: July 12, 2023, 11:09:53 pm »
You can always try and see differences and then get totally worked up about how things are not equal.
That difference has important implications for FPGA design, and it hinders a lot of folks who are coming into the FPGA world from the software world. And the fact that HDLs visually look similar to software languages doesn't help to bridge this gap.

Try looking at it a different way: a screw and a nail are completely different and yet they perform the same function: keeping materials together. Which one is best to use is highly debatable in many cases.
Mechanical folks might have a problem with that statement :-DD

Offline tggzzz

  • Super Contributor
  • ***
  • Posts: 20770
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: MCU with FPGA vs. SoC FPGA
« Reply #58 on: July 12, 2023, 11:12:47 pm »
I don't think you know as much as you think you know.
Don't worry, I know more than enough.

Good. Use that to describe what is and isn't hardware in an x86 processor.

That should be easy enough.

Quote
Please define your boundary between hardware and software. Start with something we all know and use: a system with an Intel x86 processor and some memory. Where does the software stop and the hardware start?
Hardware works the way it does because of the way it's interconnected; software always uses the same circuit, so the hardware doesn't change from one operation to another.

I don't understand what you are attempting to say. However, looking at "the hardware doesn't change from one operation to another".

If that's the case then some FPGAs aren't hardware. Do you really mean that?

Example: the Xilinx Partial Reconfiguration. "Partial reconfiguration is a technique that allows replacing the logic of some parts of the FPGA, while its other parts are working normally. This consists of feeding the FPGA with a bitstream, exactly like the initial bitstream that programs its functionality on powerup. However the bitstream for Partial Reconfiguration doesn't cause the FPGA to halt. Instead, it works on specific logic elements, and updates the memory cells that control their behavior. It's a hot replacement of specific logic blocks." https://www.01signal.com/vendor-specific/xilinx/partial-reconfiguration/part1-introduction/

That, of course, is exactly equivalent to what happens when an operating system loads and runs an application program.

Quote
Background: some of the things I've designed and implemented include low-noise analogue electronics (including "DSP" circuits without ADCs/DACs :) ), semi-custom digital ICs, designing and implementing an application specific processor, through to life-critical software/hardware systems, cellular system RF modelling/instrumentation/measurement, and telecom server systems. I have a solid feel for where the boundaries aren't.
And yet apparently you are still missing the crucial difference.

There are many differences, just as there are many differences between design strategies, differences between computer languages - and screws and nails.

You are missing the crucial similarities.
You are overestimating the differences.
There are lies, damned lies, statistics - and ADC/DAC specs.
Glider pilot's aphorism: "there is no substitute for span". Retort: "There is a substitute: skill+imagination. But you can buy span".
Having fun doing more, with less
 

Offline tggzzz

  • Super Contributor
  • ***
  • Posts: 20770
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: MCU with FPGA vs. SoC FPGA
« Reply #59 on: July 12, 2023, 11:23:55 pm »
You can always try and see differences and then get totally worked up about how things are not equal.
That difference has important implications for FPGA design, and it hinders a lot of folks who are coming into the FPGA world from the software world. And the fact that HDLs visually look similar to software languages doesn't help to bridge this gap.

The same observation can be made about different software programming paradigms, and even languages.

Fundamentally language syntax is trivial; language semantics is far more significant.

Simple example: "7 - 7 - 7" gives different results in different languages. In one language, "7 - 7 - 7" is numerically equal to "7 - 7 - 7 - 7 - 7".

Other examples might include the radically different semantics of FSM languages, logic programming languages, constraint satisfaction languages, etc.
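A concrete rendering of that example; the APL values in the comments follow from APL's right-to-left evaluation rule:

Code: [Select]
/* Same characters, different semantics: C groups '-' left to right. */
#include <stdio.h>

int main(void)
{
    printf("%d\n", 7 - 7 - 7);              /* C: (7-7)-7 = -7 */
    printf("%d\n", 7 - 7 - 7 - 7 - 7);      /* C: ((((7-7)-7)-7)-7) = -21 */
    /* In APL both expressions evaluate right to left and equal 7:
       7-(7-7) = 7   and   7-(7-(7-(7-7))) = 7 */
    return 0;
}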

Quote
Try looking at it a different way: a screw and a nail are completely different and yet they perform the same function: keeping materials together. Which one is best to use is highly debatable in many cases.
Mechanical folks might have a problem with that statement :-DD

The competent ones would understand nctnico's point.

Trivial example: frequently screws are inserted with a hammer - and then screwed up for the last quarter turn.
There are lies, damned lies, statistics - and ADC/DAC specs.
Glider pilot's aphorism: "there is no substitute for span". Retort: "There is a substitute: skill+imagination. But you can buy span".
Having fun doing more, with less
 

Offline asmi

  • Super Contributor
  • ***
  • Posts: 2839
  • Country: ca
Re: MCU with FPGA vs. SoC FPGA
« Reply #60 on: July 13, 2023, 01:55:55 am »
Good. Use that to describe what is and isn't hardware in an x86 processor.

That should be easy enough.
The circuit is hardware. Everything else is software. And yes, I'm aware of micro-ops and all that jazz. That's still software and not hardware.

I don't understand what you are attempting to say. However, looking at "the hardware doesn't change from one operation to another".

If that's the case then some FPGAs aren't hardware. Do you really mean that?

Example: the Xilinx Partial Reconfiguration. "Partial reconfiguration is a technique that allows replacing the logic of some parts of the FPGA, while its other parts are working normally. This consists of feeding the FPGA with a bitstream, exactly like the initial bitstream that programs its functionality on powerup. However the bitstream for Partial Reconfiguration doesn't cause the FPGA to halt. Instead, it works on specific logic elements, and updates the memory cells that control their behavior. It's a hot replacement of specific logic blocks." https://www.01signal.com/vendor-specific/xilinx/partial-reconfiguration/part1-introduction/
When PR is occurring, the circuit is not functional, so at that stage it's neither. But once it's completed, it behaves like a regular circuit.

That, of course, is exactly equivalent to what happens when an operating system loads and runs an application program.
Absolutely NOT. An OS task switch doesn't change the hardware. That is an example of coarse-grained parallelism, where command streams are chopped into chunks and each chunk is executed sequentially; but if the chunks are made small enough and observed over long enough intervals, it appears as if those commands are being executed in parallel.
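To illustrate what I mean, a minimal sketch (the tasks are made up) of one fixed circuit time-slicing several command streams:

Code: [Select]
/* Coarse-grained parallelism: the same hardware runs each chunk strictly in
   sequence, but over enough iterations the streams appear concurrent. */
#include <stdio.h>

static void task_blink(void) { puts("blink: toggle LED"); }
static void task_uart(void)  { puts("uart:  poll for a byte"); }
static void task_adc(void)   { puts("adc:   take a sample"); }

int main(void)
{
    void (*tasks[])(void) = { task_blink, task_uart, task_adc };

    for (int slice = 0; slice < 6; slice++)   /* round-robin time slices */
        tasks[slice % 3]();
    return 0;
}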

There are many differences, just as there are many differences between design strategies, differences between computer languages - and screws and nails.

You are missing the crucial similarities.
You are overestimating the differences.
Similarities are superficial, while differences are fundamental.

Offline asmi

  • Super Contributor
  • ***
  • Posts: 2839
  • Country: ca
Re: MCU with FPGA vs. SoC FPGA
« Reply #61 on: July 13, 2023, 01:57:03 am »
Trivial example: frequently screws are inserted with a hammer - and then screwed up for the last quarter turn.
Now I see why you love XMOS so much - driving screws with a hammer also seems to be your thing :palm:
« Last Edit: July 13, 2023, 02:13:12 pm by asmi »
 

Offline tggzzz

  • Super Contributor
  • ***
  • Posts: 20770
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: MCU with FPGA vs. SoC FPGA
« Reply #62 on: July 13, 2023, 10:10:12 am »
Good. Use that to describe what is and isn't hardware in an x86 processor.

That should be easy enough.
The circuit is hardware. Everything else is software. And yes, I'm aware of micro-ops and all that jazz. That's still software and not hardware.

So an Intel processor is software? Most people would disagree with you on that point.

Quote
I don't understand what you are attempting to say. However, looking at "the hardware doesn't change from one operation to another".

If that's the case then some FPGAs aren't hardware. Do you really mean that?

Example: the Xilinx Partial Reconfiguration. "Partial reconfiguration is a technique that allows replacing the logic of some parts of the FPGA, while its other parts are working normally. This consists of feeding the FPGA with a bitstream, exactly like the initial bitstream that programs its functionality on powerup. However the bitstream for Partial Reconfiguration doesn't cause the FPGA to halt. Instead, it works on specific logic elements, and updates the memory cells that control their behavior. It's a hot replacement of specific logic blocks." https://www.01signal.com/vendor-specific/xilinx/partial-reconfiguration/part1-introduction/
When PR is occurring, the circuit is not functional, so at that stage it's neither. But once it's completed, it behaves like a regular circuit.

That, of course, is exactly equivalent to what happens when an operating system loads and runs an application program.
Absolutely NOT. An OS task switch doesn't change the hardware. That is an example of coarse-grained parallelism, where command streams are chopped into chunks and each chunk is executed sequentially; but if the chunks are made small enough and observed over long enough intervals, it appears as if those commands are being executed in parallel.

The FPGA continues to operate during partial reconfiguration.

Task switching is not the point; it occurs after the program has been loaded and is running.

CP/M and MS-DOS load and run a single application at a time; there is no task switching per se.

Quote
There are many differences, just as there are many differences between design strategies, differences between computer languages - and screws and nails.

You are missing the crucial similarities.
You are overestimating the differences.
Similarities are superficial, while differences are fundamental.

You've got that the wrong way round.

The fundamental similarities are easily understood when identical functionality is implemented using either hardware or software or a combination of the two. Frequently the fundamental behaviour remains fixed while the exact partitioning changes over time as the superficial differences change[1], but also according to different constraints[2].

[1] during development and/or product lifetime
[2] especially cost and size and performance
There are lies, damned lies, statistics - and ADC/DAC specs.
Glider pilot's aphorism: "there is no substitute for span". Retort: "There is a substitute: skill+imagination. But you can buy span".
Having fun doing more, with less
 

Offline tggzzz

  • Super Contributor
  • ***
  • Posts: 20770
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: MCU with FPGA vs. SoC FPGA
« Reply #63 on: July 13, 2023, 10:13:23 am »
You can always try and see differences and then get totally worked up about how things are not equal.
That difference has important implications for FPGA design, and it hinders a lot of folks who are coming into the FPGA world from the software world. And the fact that HDLs visually look similar to software languages doesn't help to bridge this gap.

The same observation can be made about different software programming paradigms, and even languages.

Fundamentally language syntax is trivial; language semantics is far more significant.

Simple example: "7 - 7 - 7" gives different results in different languages. In one language, "7 - 7 - 7" is numerically equal to "7 - 7 - 7 - 7 - 7".

Other examples might include the radically different semantics of FSM languages, logic programming languages, constraint satisfaction languages, etc.

Since you, asmi, have chosen not to comment on that, do you realise your contention has limited validity in the context of trying to draw a single unambiguous boundary between hardware and software?
There are lies, damned lies, statistics - and ADC/DAC specs.
Glider pilot's aphorism: "there is no substitute for span". Retort: "There is a substitute: skill+imagination. But you can buy span".
Having fun doing more, with less
 

Offline asmi

  • Super Contributor
  • ***
  • Posts: 2839
  • Country: ca
Re: MCU with FPGA vs. SoC FPGA
« Reply #64 on: July 13, 2023, 02:45:55 pm »
So an Intel processor is software? Most people would disagree with you on that point.
Don't be so dense. The hardware of the CPU is hardware; the firmware (microcode) is software.

The FPGA continues to operate during partial reconfiguration.
But the part being reconfigured does not.

Task switching is not the point; it occurs after the program has been loaded and is running.
Not sure what this has to do with anything.

CP/M and MS-DOS load and run a single application at a time; there is no task switching per se.
But it can "multitask" if the application is coded as such, and there was also the concept of "resident" code (I'm old enough to remember that).

You've got that the wrong way round.
No, you've got it the wrong way around.

The fundamental similarities are easily understood when identical functionality is implemented using either hardware or software or a combination of the two. Frequently the fundamental behaviour remains fixed while the exact partitioning changes over time as the superficial differences change[1], but also according to different constraints[2].

[1] during development and/or product lifetime
[2] especially cost and size and performance
Software and hardware implementations are never identical; there are always fundamental differences, even if they sometimes appear subtle to someone who doesn't understand hardware. For example, a logic gate reacts immediately (ignoring propagation delay) to a change of an input signal, while software can only "react" at fixed intervals of time. That is a fundamental difference.
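To illustrate the software side of that contrast, a sketch with made-up stand-ins for a real GPIO read and delay (read_input() and sleep_ms() are stubs, not any real API):

Code: [Select]
/* A polled loop can only observe its input at discrete instants; anything that
   happens between two polls is invisible. A combinational gate tracks its
   inputs continuously, limited only by propagation delay. */
#include <stdbool.h>
#include <stdio.h>

static int t = 0;
static bool read_input(void) { return (++t / 5) & 1; }  /* stub: toggles every 5 polls */
static void sleep_ms(int ms) { (void)ms; }               /* stub: stands in for a real delay */

int main(void)
{
    bool last = read_input();
    for (int i = 0; i < 20; i++) {
        sleep_ms(1);                         /* the fixed polling interval */
        bool now = read_input();
        if (now != last)
            printf("edge seen at poll %d (up to one interval late)\n", i);
        last = now;
    }
    return 0;
}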

Offline tggzzz

  • Super Contributor
  • ***
  • Posts: 20770
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: MCU with FPGA vs. SoC FPGA
« Reply #65 on: July 13, 2023, 03:23:47 pm »
You are choosing to snip so much context that the points I make are being obscured in favour of the points you would like to make.

Not going to fall for that debating technique!

So an Intel processor is software? Most people would disagree with you on that point.
Don't be so dense. The hardware of the CPU is hardware; the firmware (microcode) is software.

So which are the electrons stored in a transistor's gate: hardware or software?

<omitted points where the context was removed>

Quote
The fundamental similarities are easily understood when identical functionality is implemented using either hardware or software or a combination of the two. Frequently the fundamental behaviour remains fixed while the exact partitioning changes over time as the superficial differences change[1], but also according to different constraints[2].

[1] during development and/or product lifetime
[2] especially cost and size and performance
Software and hardware implementations are never identical; there are always fundamental differences, even if they sometimes appear subtle to someone who doesn't understand hardware.

Insufficient distinction.

Take some functionality and implement it using radically different software paradigms. The resulting implementations will not be identical and will have fundamental differences.

Hardware is just another step on the continuum between formal mathematical expressions and particles and waves.

Quote
For example, a logic gate reacts immediately (ignoring propagation delay) to a change of an input signal, while software can only "react" at fixed intervals of time. That is a fundamental difference.

Not true, of course - for several reasons related to the intractability of asynchronous behaviour of systems at various conceptual levels.

Almost all practical hardware reacts only at fixed intervals of time, due to the intractability of creating designs where the ordering of events is undefined. If you don't understand why, then take time to understand when and why it is necessary to insert "bridging terms" into logic implementations expressed in the form of Karnaugh maps.
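For anyone who hasn't met them: a worked example of such a bridging (consensus) term. F = A'B + AC has a static-1 hazard when A falls with B = C = 1; adding the consensus term BC covers that transition without changing the logic function, as this quick check confirms:

Code: [Select]
/* Enumerate the truth table: the minimal cover and the bridged cover are
   identical as logic functions. The extra BC product exists only to hold the
   output at 1 while A is mid-transition with B = C = 1, which is where the
   un-bridged form can glitch. */
#include <stdio.h>

int main(void)
{
    for (int a = 0; a <= 1; a++)
        for (int b = 0; b <= 1; b++)
            for (int c = 0; c <= 1; c++) {
                int f_min    = (!a & b) | (a & c);            /* F = A'B + AC      */
                int f_bridge = (!a & b) | (a & c) | (b & c);  /* F = A'B + AC + BC */
                printf("A=%d B=%d C=%d  F=%d  F+BC=%d\n", a, b, c, f_min, f_bridge);
            }
    return 0;
}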

There are deep reasons why distributed software systems have similar problems. If you don't understand why, then take time to understand Leslie Lamport's seminal works and their consequences.
« Last Edit: July 13, 2023, 03:26:13 pm by tggzzz »
There are lies, damned lies, statistics - and ADC/DAC specs.
Glider pilot's aphorism: "there is no substitute for span". Retort: "There is a substitute: skill+imagination. But you can buy span".
Having fun doing more, with less
 

Offline asmi

  • Super Contributor
  • ***
  • Posts: 2839
  • Country: ca
Re: MCU with FPGA vs. SoC FPGA
« Reply #66 on: July 13, 2023, 05:08:27 pm »
You are choosing to snip so much context that the points I make are being obscured in favour of the points you would like to make.

Not going to fall for that debating technique!
I've skipped irrelevant details to focus on what's important, so that readers won't have to suffer through your brain dumps.

So which are the electrons stored in a transistor's gate: hardware or software?
I've answered your question with the best clarity I can manage. If you can't understand this, that's on you.

Insufficient distinction.

Take some functionality and implement it using radically different software paradigms. The resulting implementations will not be identical and will have fundamental differences.

Hardware is just another step on the continuum between formal mathematical expressions and particles and waves.
There is no continuum between hardware and software. They might solve the same ultimate problem, but the way they do it is fundamentally different.

Not true, of course - for several reasons related to the intractability of asynchronous behaviour of systems at various conceptual levels.
Of course it is true. Anybody can see it by taking a logic gate, feeding in a signal and observing the output, and then comparing that to a software implementation of the same logic.

Almost all practical hardware reacts only at fixed intervals of time, due to the intractability of creating designs where the ordering of events is undefined. If you don't understand why, then take time to understand when and why it is necessary to insert "bridging terms" into logic implementations expressed in the form of Karnaugh maps.
It becomes more and more clear to me that you simply don't understand what hardware is, hence your constant pitching of XMOS. You are too stuck in the software world to see that there is a whole other world, which is actually what provides you with something to run your software on.

Offline coppice

  • Super Contributor
  • ***
  • Posts: 9566
  • Country: gb
Re: MCU with FPGA vs. SoC FPGA
« Reply #67 on: July 13, 2023, 05:19:47 pm »
Almost all practical hardware reacts only at fixed intervals of time, due to the intractability of creating designs where the ordering of events is undefined. If you don't understand why, then take time to understand when and why it is necessary to insert "bridging terms" into logic implementations expressed in the form of Karnaugh maps.
They are not intractable. They are just difficult. There are, for example, fully asynchronous DSPs. I'm not clear whether they show any cost performance benefits over more conventional designs. What bothers many people about them is you just don't know how much latency there will be between input and output, as it varies from sample to sample of the device. As long as you design your system for the specified worst case of the device it shouldn't be an issue.
 

Offline nctnico

  • Super Contributor
  • ***
  • Posts: 28111
  • Country: nl
    • NCT Developments
Re: MCU with FPGA vs. SoC FPGA
« Reply #68 on: July 13, 2023, 07:06:39 pm »
You can always try and see differences and then get totally worked up about how things are not equal.
That difference has important implications for FPGA design, and it hinders a lot of folks who are coming into the FPGA world from the software world. And the fact that HDLs visually look similar to software languages doesn't help to bridge this gap.
IMHO your view is way too narrow and really focussed on the edge of an FPGA where signals go in & out. But that is just a very tiny part of doing digital logic design. Back when I was doing my EE study I also took several classes on digital IC logic design. Compared to software development, that was/is highly abstract. Didn't even involve controlling anything real (unlike software). Just running simulation after simulation and doing analysis of test vector coverage. And this is true for a lot of logic design work. When I design a new part for a complicated piece of logic in an FPGA, I start out with simulations and once the logic does what I need it to do, I add it somewhere to the rest of the design. However, I really can't see how this workflow is any different compared to developing a new software module in C (which also involves providing stimuli and checking output aimed to maximise test coverage).

So yes, HDLs look like software development tools because they are software development tools.

The biggest hurdle for me when starting digital logic design and HDL was the fact that in an HDL all statements are executed at the same time, and any form of sequencing must be implemented through a state machine. This is likely the case for all software people entering into digital logic design.

Interaction with hardware is a different subject where software people have trouble because they don't know tools like an oscilloscope / logic analyser. But the hurdle they need to get over is equal whether they are using an FPGA, microcontroller or embedded system. It is connecting the dots between something they define in software and something that happens in hardware and vice versa. And then learn to understand the criteria surrounding the border between hardware and software (with timing as a recurring topic).
« Last Edit: July 13, 2023, 07:26:22 pm by nctnico »
There are small lies, big lies and then there is what is on the screen of your oscilloscope.
 

Offline tggzzz

  • Super Contributor
  • ***
  • Posts: 20770
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: MCU with FPGA vs. SoC FPGA
« Reply #69 on: July 13, 2023, 07:22:18 pm »
You are choosing to snip so much context that the points I make are being obscured in favour of the points you would like to make.

Not going to fall for that debating technique!
I've skipped irrelevant details to focus on what's important, so that readers won't have to suffer through your brain dumps.

You think they are irrelevant, but that's because you don't have sufficient experience in analogue and digital systems.

We know that since you have stated that you have only just started moving into hardware.

Quote
It becomes more and more clear to me that you simply don't understand what hardware is, hence your constant pitching of XMOS. You are too stuck in the software world to see that there is a whole other world, which is actually what provides you with something to run your software on.

I first designed and implemented hardware systems over 60 years ago; some included logic gates made from individual transistors and resistors.
I designed and implemented an Altair 8080 class computer in 1976 (it had 128 bytes of RAM, all I could afford).
I've been designing and implementing hardware and software systems professionally for 45 years.

For clarity, how many of these technologies have you used when designing and implementing systems:
  • low-noise analogue electronics controlled by logic gates
  • semi-custom digital ICs, where operation cannot be changed after manufacture - a mistake cost 3 months' delay and a year's salary
  • programmable logic based on PLAs and GALs
  • programmable logic based on FPGAs
  • application specific processor to discriminate between differing pulse streams (to replace a pure logic-gate implementation)
  • life support hardware+software systems (and many similar systems)
  • RF modelling, then instrumentation and processing of live cellular systems
  • and I'll omit the pure software, so as not to confuse you

So I do have a clue.

What similar evidence can you cite to demonstrate that you have a clue?
There are lies, damned lies, statistics - and ADC/DAC specs.
Glider pilot's aphorism: "there is no substitute for span". Retort: "There is a substitute: skill+imagination. But you can buy span".
Having fun doing more, with less
 

Offline tggzzz

  • Super Contributor
  • ***
  • Posts: 20770
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: MCU with FPGA vs. SoC FPGA
« Reply #70 on: July 13, 2023, 07:37:28 pm »
Almost all practical hardware reacts only at fixed intervals of time, due to the intractability of creating designs where the ordering of events is undefined. If you don't understand why, then take time to understand when and why it is necessary to insert "bridging terms" into logic implementations expressed in the form of Karnaugh maps.
They are not intractable. They are just difficult.

I'm not convinced it is a good use of our time to discuss the boundary between "very difficult" and "intractable" :)

Asynchronous logic has so many potential advantages (especially high speed and low power) that people have come up with many many design concepts[1] over the decades. They have all come to nothing because they are so difficult to design and test, and don't offer sufficient advantages over conventional clocked synchronous designs.

Perhaps it is worth revisiting the technology, now that device geometry has hit the brick wall and leakage currents are so significant. Similar considerations are rubbing people's noses in the problems with C, and leading to languages like xC, Rust and Go.

[1] one of the more interesting was Steve Furber's group's AMULET1, an asynchronous ARM processor 30 years ago

Quote
There are, for example, fully asynchronous DSPs. I'm not clear whether they show any cost performance benefits over more conventional designs. What bothers many people about them is you just don't know how much latency there will be between input and output, as it varies from sample to sample of the device. As long as you design your system for the specified worst case of the device it shouldn't be an issue.

Specified worst case is an "interesting" concept in asynchronous systems, where metastability rears its ugly head.

Not knowing the timing is a problem during manufacturing test and product certification.

Do you have a reference to the DSP you mention? I'm curious to find out more.
There are lies, damned lies, statistics - and ADC/DAC specs.
Glider pilot's aphorism: "there is no substitute for span". Retort: "There is a substitute: skill+imagination. But you can buy span".
Having fun doing more, with less
 

Offline tggzzz

  • Super Contributor
  • ***
  • Posts: 20770
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: MCU with FPGA vs. SoC FPGA
« Reply #71 on: July 13, 2023, 07:50:57 pm »
You can always try and see differences and then get totally worked up about how things are not equal.
That difference has important implications for FPGA design, and it hinders a lot of folks who are coming into the FPGA world from the software world. And the fact that HDLs visually look similar to software languages doesn't help to bridge this gap.
IMHO your view is way too narrow and really focussed on the edge of an FPGA where signals go in & out. But that is just a very tiny part of doing digital logic design. Back when I was doing my EE study I also took several classes on digital IC logic design. Compared to software development, that was/is highly abstract. Didn't even involve controlling anything real (unlike software). Just running simulation after simulation and doing analysis of test vector coverage. And this is true for a lot of logic design work. When I design a new part for a complicated piece of logic in an FPGA, I start out with simulations and once the logic does what I need it to do, I add it somewhere to the rest of the design. However, I really can't see how this workflow is any different compared to developing a new software module in C (which also involves providing stimuli and checking output aimed to maximise test coverage).

So yes, HDLs look like software development tools because they are software development tools.

Just so.

There's an additional step that can be added to "help" ASMI.
His contention appears to be that a function defined by doped silicon plus deposited metal (e.g. a 7400), which cannot be changed after manufacture, is hardware (or "circuit" as he calls it).
His contention appears to be that a function defined by HDLs/HLLs, stored as charges in FETs' gates or in flip-flops (e.g. memory or FPGA), which can be changed after manufacture, is software.

Semi-custom ICs, which I implemented 40 (gulp) years ago, had their function defined in HiLo (a precursor of Verilog and VHDL) and implemented in metallisation. No changes were possible after manufacture; each manufacturing iteration cost 3 months (at best) and a year's salary.

I wonder whether ASMI would class those ICs as software or "circuits".
There are lies, damned lies, statistics - and ADC/DAC specs.
Glider pilot's aphorism: "there is no substitute for span". Retort: "There is a substitute: skill+imagination. But you can buy span".
Having fun doing more, with less
 

Offline gnuarm

  • Super Contributor
  • ***
  • Posts: 2247
  • Country: pr
Re: MCU with FPGA vs. SoC FPGA
« Reply #72 on: July 13, 2023, 07:57:19 pm »
They are not intractable. They are just difficult. There are, for example, fully asynchronous DSPs. I'm not clear whether they show any cost performance benefits over more conventional designs. What bothers many people about them is you just don't know how much latency there will be between input and output, as it varies from sample to sample of the device. As long as you design your system for the specified worst case of the device it shouldn't be an issue.

"Asynchronous" means different things to different people.  I learned asynchronous state machine design, which used no FFs.  In essence, the feedback paths, created latches in the logic, but still, there was no clock or enable. 

I've also studied asynchronous processors, which have no free-running master clock.  They do have FFs with clock inputs, but the clocks are generated locally, and are stopped when there is no data to process.  The clock is often generated with a variable delay, corresponding to the timing of the particular circuit processing the data.  The GreenArrays GA144 has 144 such processors, each running at 700 MIPS peak!

The speed advantage of the async processor is in being able to take advantage of portions of the design that are faster than the remainder of the logic.  In a properly clocked design, the entire circuit runs at the speed of the slowest circuit. 

It also achieves speed gains from PVT (Process, Voltage and Temperature).  However, these gains cannot be counted on, other than perhaps running at a higher or lower voltage to trade off speed and power.
Rick C.  --  Puerto Rico is not a country... It's part of the USA
  - Get 1,000 miles of free Supercharging
  - Tesla referral code - https://ts.la/richard11209
 

Offline tggzzz

  • Super Contributor
  • ***
  • Posts: 20770
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: MCU with FPGA vs. SoC FPGA
« Reply #73 on: July 13, 2023, 08:26:48 pm »
"Asynchronous" means different things to different people. 

Just so.

In addition to the cases you mentioned, "asynchronous" has analogous meaning in software systems, especially distributed systems.

Programming those is, in practice, difficult and error prone, largely - I believe - because softies can't think of things occurring in parallel in systems which cannot have a unique concept of time. (For those that doubt that, I refer you to Leslie Lamport's seminal and well-known works!)

One widely used and successful set of design strategies appears in telecom and LAN specifications. There the specification and implementation are in the form of multiple independent FSMs, each with its own internal state, communicating via events. Often the spec is written in the SDL language. Sometimes the events/states are implemented in logic gates, sometimes in packets, CPUs and memory.

For ASMI's benefit...

In token ring LANs (and probably others) sequences of voltage transitions at the PHY level are pattern matched by an FSM implemented in a few gates and interpreted as a packet representing the token.

The arrival of such a packet causes a logic signal to change, which in turn changes the state of some FSMs implemented in hardware and some FSMs implemented in software.

Other transition sequences are interpreted as the beginning and end of information packets, and the information is passed from the PHY level to other FSMs implemented in hardware or software at various levels of the networking stack.

The choice between hardware and software implementation is somewhat arbitrary, and different manufacturers make different choices.
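For concreteness, a minimal C sketch of the software end of that choice (the symbol pattern, names and stream are made up, not any real token-ring MAC): an FSM that watches a symbol stream and raises an event when the token pattern has been seen. The same state/event structure could equally well be a handful of gates and flip-flops.

Code: [Select]
/* Pattern-match a made-up "token" sequence (J, K, 0) in a symbol stream and
   signal the FSM one level up by calling an event handler. */
#include <stdio.h>

typedef enum { IDLE, GOT_J, GOT_JK } State;

static State step(State s, char sym, void (*token_seen)(void))
{
    switch (s) {
    case IDLE:   return (sym == 'J') ? GOT_J  : IDLE;
    case GOT_J:  return (sym == 'K') ? GOT_JK : IDLE;
    case GOT_JK: if (sym == '0') token_seen();   /* event out to the next layer */
                 return IDLE;
    }
    return IDLE;
}

static void on_token(void) { puts("token arrived"); }

int main(void)
{
    const char *stream = "101JK0110JK01";   /* illustrative symbol stream */
    State s = IDLE;
    for (const char *p = stream; *p; p++)
        s = step(s, *p, on_token);
    return 0;
}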
« Last Edit: July 13, 2023, 08:38:27 pm by tggzzz »
There are lies, damned lies, statistics - and ADC/DAC specs.
Glider pilot's aphorism: "there is no substitute for span". Retort: "There is a substitute: skill+imagination. But you can buy span".
Having fun doing more, with less
 

Online langwadt

  • Super Contributor
  • ***
  • Posts: 4778
  • Country: dk
Re: MCU with FPGA vs. SoC FPGA
« Reply #74 on: July 13, 2023, 08:37:15 pm »
They are not intractable. They are just difficult. There are, for example, fully asynchronous DSPs. I'm not clear whether they show any cost performance benefits over more conventional designs. What bothers many people about them is you just don't know how much latency there will be between input and output, as it varies from sample to sample of the device. As long as you design your system for the specified worst case of the device it shouldn't be an issue.

"Asynchronous" means different things to different people.  I learned asynchronous state machine design, which used no FFs.  In essence, the feedback paths, created latches in the logic, but still, there was no clock or enable. 

I've also studied asynchronous processors, which have no free-running master clock.  They do have FFs with clock inputs, but the clocks are generated locally, and are stopped when there is no data to process.  The clock is often generated with a variable delay, corresponding to the timing of the particular circuit processing the data.  The GreenArrays GA144 has 144 such processors, each running at 700 MIPS peak!

The speed advantage of the async processor is in being able to take advantage of portions of the design that are faster than the remainder of the logic.  In a properly clocked design, the entire circuit runs at the speed of the slowest circuit. 

It also achieves speed gains from PVT (Process, Voltage and Temperature).  However, these gains cannot be counted on, other than perhaps running at a higher or lower voltage to trade off speed and power.

and it is much harder to analyse, because it is no longer a matter of just checking that the path from one FF to another is shorter than the clock period
 

