• Neural Networks (MNIST inference) on the “3-cent” Microcontroller

    From D. Ray@d@ray to comp.misc,comp.ai.philosophy,alt.microcontrollers,comp.arch.embedded,alt.microcontrollers.8bit on Mon Oct 21 20:06:28 2024
    From Newsgroup: comp.arch.embedded

    Buoyed by the surprisingly good performance of neural networks with quantization-aware training on the CH32V003, I wondered how far this can be pushed. How much can we compress a neural network while still achieving
    good test accuracy on the MNIST dataset? When it comes to absolutely
    low-end microcontrollers, there is hardly a more compelling target than the Padauk 8-bit microcontrollers. These are microcontrollers optimized for the simplest and lowest-cost applications there are. The smallest device of the portfolio, the PMS150C, sports 1024 13-bit words of one-time-programmable
    memory and 64 bytes of RAM, more than an order of magnitude smaller than
    the CH32V003. In addition, it has a proprietary accumulator-based 8-bit architecture, as opposed to a much more powerful RISC-V instruction set.

    Is it possible to implement an MNIST inference engine that can classify handwritten digits on a PMS150C as well?

    <https://cpldcpu.wordpress.com/2024/05/02/machine-learning-mnist-inference-on-the-3-cent-microcontroller/>

    <https://archive.md/DzqzL>
    --- Synchronet 3.20a-Linux NewsLink 1.114
  • From Don Y@blockedofcourse@foo.invalid to comp.arch.embedded on Mon Oct 21 15:09:10 2024
    From Newsgroup: comp.arch.embedded

    On 10/21/2024 1:06 PM, D. Ray wrote:
    > Is it possible to implement an MNIST inference engine that can classify handwritten digits on a PMS150C as well?

    Wouldn't it be smarter to come up with an approach that *can*
    rather than trying to force some approach to "fit"?


    --- Synchronet 3.20a-Linux NewsLink 1.114
  • From George Neuner@gneuner2@comcast.net to comp.arch.embedded on Tue Oct 22 15:39:42 2024
    From Newsgroup: comp.arch.embedded

    On Mon, 21 Oct 24 20:06:28 UTC, D. Ray <d@ray> wrote:

    > Buoyed by the surprisingly good performance of neural networks with
    > quantization-aware training on the CH32V003, I wondered how far this can be
    > pushed. How much can we compress a neural network while still achieving
    > good test accuracy on the MNIST dataset? When it comes to absolutely
    > low-end microcontrollers, there is hardly a more compelling target than the
    > Padauk 8-bit microcontrollers. These are microcontrollers optimized for the
    > simplest and lowest-cost applications there are. The smallest device of the
    > portfolio, the PMS150C, sports 1024 13-bit words of one-time-programmable
    > memory and 64 bytes of RAM, more than an order of magnitude smaller than
    > the CH32V003. In addition, it has a proprietary accumulator-based 8-bit
    > architecture, as opposed to a much more powerful RISC-V instruction set.
    >
    > Is it possible to implement an MNIST inference engine that can classify
    > handwritten digits on a PMS150C as well?
    >
    > <https://cpldcpu.wordpress.com/2024/05/02/machine-learning-mnist-inference-on-the-3-cent-microcontroller/>
    >
    > <https://archive.md/DzqzL>


    Depends on whether you mean implementing /their/ recognizer, or just implementing a recognizer that could be trained using their data set.

    Any 8-bitter can easily handle the computations ... FP is not required
    - fixed point fractions will do fine. The issue is how much memory is
    needed and what your target chip brings to the party.
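    The fixed-point point can be sketched in a few lines of C. This is my own illustration (the Q0.7 format and the `dot_q7` name are not from the blog post): weights and activations are signed bytes representing value/128, and a neuron's dot product needs only integer multiplies, adds, and a shift.

```c
#include <stdint.h>

/* Sketch: neuron dot product in Q0.7 fixed point, where an int8_t w
 * represents the real value w/128.  The 32-bit accumulator would be
 * synthesized from 8-bit operations on a Padauk part, but no floating
 * point is needed anywhere. */
static int8_t dot_q7(const int8_t *w, const int8_t *x, int n)
{
    int32_t acc = 0;
    for (int i = 0; i < n; i++)
        acc += (int32_t)w[i] * x[i];   /* Q0.7 * Q0.7 -> Q0.14 */
    acc >>= 7;                         /* rescale back to Q0.7 */
    if (acc > 127)  acc = 127;         /* saturate on overflow */
    if (acc < -128) acc = -128;
    return (int8_t)acc;
}
```

    For example, two weights of 0.5 (64) against two inputs of 0.5 give 0.5 (64) back, exactly, with no rounding library in sight. The real cost on a 64-byte-RAM part is, as noted, storage for weights and activations, not the arithmetic itself.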

    --- Synchronet 3.20a-Linux NewsLink 1.114
  • From olcott@NoOne@NoWhere.com to comp.misc,comp.ai.philosophy,alt.microcontrollers,comp.arch.embedded,alt.microcontrollers.8bit on Sat Oct 26 20:43:01 2024
    From Newsgroup: comp.arch.embedded

    On 10/21/2024 3:06 PM, D. Ray wrote:
    > Buoyed by the surprisingly good performance of neural networks with
    > quantization-aware training on the CH32V003, I wondered how far this can be
    > pushed. How much can we compress a neural network while still achieving
    > good test accuracy on the MNIST dataset? When it comes to absolutely
    > low-end microcontrollers, there is hardly a more compelling target than the
    > Padauk 8-bit microcontrollers. These are microcontrollers optimized for the
    > simplest and lowest-cost applications there are. The smallest device of the
    > portfolio, the PMS150C, sports 1024 13-bit words of one-time-programmable
    > memory and 64 bytes of RAM, more than an order of magnitude smaller than
    > the CH32V003. In addition, it has a proprietary accumulator-based 8-bit
    > architecture, as opposed to a much more powerful RISC-V instruction set.
    >
    > Is it possible to implement an MNIST inference engine that can classify
    > handwritten digits on a PMS150C as well?
    >
    > <https://cpldcpu.wordpress.com/2024/05/02/machine-learning-mnist-inference-on-the-3-cent-microcontroller/>
    >
    > <https://archive.md/DzqzL>

    test to see if this posts or I should dump this paid provider.
    --
    Copyright 2024 Olcott

    "Talent hits a target no one else can hit;
    Genius hits a target no one else can see."
    Arthur Schopenhauer
    --- Synchronet 3.20a-Linux NewsLink 1.114
  • From George Neuner@gneuner2@comcast.net to comp.arch.embedded on Sun Oct 27 16:41:31 2024
    From Newsgroup: comp.arch.embedded

    On Sat, 26 Oct 2024 20:43:01 -0500, olcott <NoOne@NoWhere.com> wrote:


    > test to see if this posts or I should dump this paid provider.


    Eternal September is a good, no-cost Usenet provider.

    http://www.eternal-september.org/
    --- Synchronet 3.20a-Linux NewsLink 1.114
  • From D. Ray@d@ray to alt.microcontrollers,comp.misc,comp.ai.philosophy,comp.arch.embedded,alt.microcontrollers.8bit on Mon Oct 28 15:42:41 2024
    From Newsgroup: comp.arch.embedded

    olcott <NoOne@NoWhere.com> wrote:
    > On 10/21/2024 3:06 PM, D. Ray wrote:
    >> Buoyed by the surprisingly good performance of neural networks with
    >> quantization-aware training on the CH32V003, I wondered how far this can be
    >> pushed. How much can we compress a neural network while still achieving
    >> good test accuracy on the MNIST dataset? When it comes to absolutely
    >> low-end microcontrollers, there is hardly a more compelling target than the
    >> Padauk 8-bit microcontrollers. These are microcontrollers optimized for the
    >> simplest and lowest-cost applications there are. The smallest device of the
    >> portfolio, the PMS150C, sports 1024 13-bit words of one-time-programmable
    >> memory and 64 bytes of RAM, more than an order of magnitude smaller than
    >> the CH32V003. In addition, it has a proprietary accumulator-based 8-bit
    >> architecture, as opposed to a much more powerful RISC-V instruction set.
    >>
    >> Is it possible to implement an MNIST inference engine that can classify
    >> handwritten digits on a PMS150C as well?
    >>
    >> <https://cpldcpu.wordpress.com/2024/05/02/machine-learning-mnist-inference-on-the-3-cent-microcontroller/>
    >>
    >> <https://archive.md/DzqzL>
    >
    > test to see if this posts or I should dump this paid provider.

    It worked.

    --- Synchronet 3.20a-Linux NewsLink 1.114
  • From D. Ray@d@ray to comp.arch.embedded on Mon Oct 28 15:42:42 2024
    From Newsgroup: comp.arch.embedded

    George Neuner <gneuner2@comcast.net> wrote:

    > Depends on whether you mean

    Perhaps you misunderstood me. I’m not the author; I just posted the beginning
    of a blog post and provided the link to the rest of it because it seemed interesting. The reason I didn’t post the whole thing is that there are quite a few illustrations.

    Blog post ends with:

    “It is indeed possible to implement MNIST inference with good accuracy
    using one of the cheapest and simplest microcontrollers on the market. A
    lot of memory footprint and processing overhead is usually spent on implementing flexible inference engines that can accommodate a wide range
    of operators and model structures. Cutting this overhead away and reducing
    the functionality to its core allows for astonishing simplification at this very low end.

    This hack demonstrates that there truly is no fundamental lower limit to applying machine learning and edge inference. However, the feasibility of implementing useful applications at this level is somewhat doubtful.”
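    The "cutting this overhead away" idea from the quoted conclusion can be sketched as a single fully-connected layer with shapes fixed at compile time: no model parser, no operator dispatch, just one loop over packed low-bit weights. Everything here (the 2-bit packing, the sign convention, the `fc_2bit` name, and the layer sizes) is my own hypothetical illustration, not necessarily what the blog author actually implemented.

```c
#include <stdint.h>

/* Sketch of an overhead-free inference core: one hardcoded
 * fully-connected layer with 2-bit signed weights packed four per
 * byte.  Two-bit two's complement maps the fields 0..3 to the real
 * weights 0, 1, -2, -1.  With these ranges the int16_t accumulator
 * cannot overflow (|acc| <= 64 * 2 * 128 = 16384). */
#define N_IN  64
#define N_OUT 10

static void fc_2bit(const uint8_t packed[N_IN * N_OUT / 4],
                    const int8_t in[N_IN], int16_t out[N_OUT])
{
    for (int o = 0; o < N_OUT; o++) {
        int16_t acc = 0;
        for (int i = 0; i < N_IN; i++) {
            int idx = o * N_IN + i;
            int w2 = (packed[idx / 4] >> ((idx % 4) * 2)) & 3;
            int w  = (w2 & 2) ? w2 - 4 : w2;   /* sign-extend 2 bits */
            acc += (int16_t)(w * in[i]);
        }
        out[o] = acc;
    }
}
```

    At this weight width the whole 64x10 layer costs 160 bytes of packed storage, which is the kind of budget arithmetic that makes a 1 KWord OTP part conceivable at all; a flexible runtime with shape metadata and operator tables would dwarf the model itself.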
    --- Synchronet 3.20a-Linux NewsLink 1.114
  • From David Brown@david.brown@hesbynett.no to comp.arch.embedded on Mon Oct 28 17:50:12 2024
    From Newsgroup: comp.arch.embedded

    On 28/10/2024 16:42, D. Ray wrote:
    > George Neuner <gneuner2@comcast.net> wrote:
    >
    >> Depends on whether you mean
    >
    > Perhaps you misunderstood me. I’m not the author; I just posted the
    > beginning of a blog post and provided the link to the rest of it because
    > it seemed interesting. The reason I didn’t post the whole thing is that
    > there are quite a few illustrations.
    >
    > Blog post ends with:
    >
    > “It is indeed possible to implement MNIST inference with good accuracy
    > using one of the cheapest and simplest microcontrollers on the market. A
    > lot of memory footprint and processing overhead is usually spent on
    > implementing flexible inference engines that can accommodate a wide range
    > of operators and model structures. Cutting this overhead away and reducing
    > the functionality to its core allows for astonishing simplification at
    > this very low end.
    >
    > This hack demonstrates that there truly is no fundamental lower limit to
    > applying machine learning and edge inference. However, the feasibility of
    > implementing useful applications at this level is somewhat doubtful.”

    It's fine to quote from a blog post or other such sources, as long as
    you make it clear that this is what you are doing (and you are not
    quoting so much that it becomes copyright infringement). Your first post in
    this thread was formatted in a way that made it look like your own
    original words, written for the Usenet post - but
    apparently that was not the case. Remember, no one reading Usenet is
    going to click on random links in a post - we need a very good reason to
    do so. So please, next time write some introductory or explanatory text yourself and make the whole thing clearer.

    I think it is quite cool to hear that it is possible to do something
    like this on these 3-cent microcontrollers, but I would not expect
    anyone to use them in practice.

    --- Synchronet 3.20a-Linux NewsLink 1.114