Show HN: I used AI to recreate a $4000 piece of audio hardware as a plugin

Hi Hacker News,

This is definitely out of my comfort zone. I've never programmed DSP before, but I was able to use Claude Code to help me build this with CMajor.

I just wanted to show you guys because I'm super proud of it. It's a 100% faithful recreation based on the schematics, patents, and ROMs that were found online.

So please watch the video and tell me what you think

https://youtu.be/auOlZXI1VxA

The reason why I think this is relevant is because I've been a programmer for 25 years and AI scares the shit out of me.

I'm not a programmer anymore. I'm something else now. I don't know what it is but it's multi-disciplinary, and it doesn't involve writing code myself--for better or worse!

Thanks!

132 points | by johnwheeler 2 days ago

18 comments

  • franky47 13 hours ago
    I used to do that exact job 10 years ago (without AI, obviously). I figure that career would be very different now.

    There was something exciting about sleuthing out how those old machines worked: we used a black box approach, sending in test samples, recording the output, and comparing against the digital algorithm’s output. Trial and error, slowly building a sense of what sort of filter or harmonics could bend a waveform one way or another.

    I feel like some of this is going to be lost to prompting, the same way hand-tool woodworking has been lost to power tools.
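
    The black-box workflow described above can be sketched roughly like this; the sweep signal and helper names are illustrative, not from any real toolchain:

```python
import numpy as np

def magnitude_spectrum_db(x: np.ndarray) -> np.ndarray:
    """Windowed magnitude spectrum in dB, with a small floor to avoid log(0)."""
    spectrum = np.abs(np.fft.rfft(x * np.hanning(len(x))))
    return 20 * np.log10(spectrum + 1e-12)

def spectral_error_db(hardware_out: np.ndarray, model_out: np.ndarray) -> float:
    """Mean absolute difference between the two spectra, in dB."""
    return float(np.mean(np.abs(
        magnitude_spectrum_db(hardware_out) - magnitude_spectrum_db(model_out))))

# Probe with a rough logarithmic sine sweep (a common test signal).
sample_rate = 48_000
t = np.arange(sample_rate) / sample_rate
sweep = np.sin(2 * np.pi * 20 * (1000 / 20) ** t * t)

# Here "hardware_out" would really be a recording of the physical unit;
# using the same signal twice just demonstrates the comparison.
print(spectral_error_db(sweep, sweep))  # identical signals -> 0.0
```

    In practice you would run many such probes (sweeps, impulses, noise bursts) and iterate on the model until the error stops shrinking.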

    • ceva 5 hours ago
      I think you will like this talk https://youtu.be/XM_q5T7wTpQ?si=Nyb4lZEZjsjCCGBg
    • mycall 11 hours ago
      It will be the future for sure, software as a tool for everyone.

      While there is something lost in prompting, people will always seek out first-principles so they can understand what they are commanding and controlling, especially as old machines become new machines with new capabilities not even imaginable before due to the old software complexity wall.

      • johnwheeler 10 hours ago
        It's exactly as you say: software as a tool for everyone. It's hard for programmers like me to accept because I've spent so much time, read so many books, and worked so hard perfecting my craft.

        But smart programmers will realize the world doesn't care about any of that at all.

        • mycall 8 hours ago
          We will remain programmers, but at a higher abstraction where code is just the glue for our means and imagination.
          • AstroBen 8 hours ago
            As a hobby I suppose. There's a very real chance there won't be enough paid work available for that
            • airspresso 6 hours ago
              You’re ignoring Jevons paradox. Everyone, both people and companies, will be making exponentially more software with these tools. Software that needs to get created, debugged, and updated to realize its intention. That’s what our time will be spent on as programmers.
              • AstroBen 5 hours ago
                Do you have any evidence that the demand for developers is largely price elastic?

                People are already struggling to find work from oversupply of talent and not enough demand

                • pixl97 1 hour ago
                  At the same time the ability to write software is exploding, we are watching large entities in the market consolidate and small businesses end up on the down side of the K-shaped economy. Programmers' demand and pay should go down as supply increases, just like for everyone else in an economy.
            • Bombthecat 8 hours ago
              Exactly! There aren't enough woodworking jobs for thousands upon thousands of workers. Of course you have people handcrafting things and people demanding handcrafted things. Programming will be the same.

              Way, way fewer developers.

              • moregrist 6 hours ago
                This doesn’t seem right to me. Carpentry still seems like a pretty solid line of work with a lot of jobs. I know at least one guy who moved from EE work to contracting because he could make a lot more money that way.

                Granted, a lot of it is framing houses and remodeling. And there are fewer jobs in hardwood furniture these days. But a lot of that is because US furniture manufacturing moved to China 20 years ago, not because of power tools.

                If anything, the advent of power tools in the 40s/50s made single family homes more affordable and increased construction demand.

    • rmnclmnt 13 hours ago
      I wonder if we could have released the *stressor in a few months, then...
      • franky47 12 hours ago
        I’d love to see someone try.

        Though using AI to build the devtools we used for signal analysis would have been helpful.

  • hebejebelus 13 hours ago
    I was hoping that the video was a walkthrough of your process - do you think you might share that at some point?

    > I'm not a programmer anymore. I'm something else now. I don't know what it is but it's multi-disciplinary, and it doesn't involve writing code myself--for better or worse!

    Yes, I agree. I think the role of software developer is going to evolve into much more of an administrative, managerial role, dealing more with working with whatever organisation you're in than actually typing code. Honestly I think it probably was always heading in this direction but it's definitely quite a step change. Wrote about it a little incoherently on my blog just this morning: https://redfloatplane.lol/blog/11-2025-the-year-i-didnt-writ...

    • askonomm 10 hours ago
      As someone who works at a place where we do a lot of code analysis and also research AI's effect on code quality, if you do not even so much as look at your code anymore, I do not believe you are creating maintainable, quality software. Maybe you don't need to, or care to, but it's definitely not what's sustainable in long-term product companies.

      AI is a force multiplier: it makes bad worse, and it _can_ make good better. You need even more engineering discipline than before to make sure it's the latter and not the former. Even with chained code-quality MCPs and a whole bunch of instructions in AGENTS.md, there's often a need to intervene and course-correct, because AI can ignore AGENTS.md, and because code that passes quality checks does not always sit on solid architecture.

      That being said, I do agree our job is changing from merely writing code, to more of a managerial title, like you've said. But, there's a new limit - your ability to review the output, and you most definitely should review the output if you care about long-term sustainable, quality software.

      • williamcotton 1 hour ago
        > if you care about long-term sustainable, quality software

        If software becomes cheaper to make, it amortizes at a higher rate, i.e., it becomes less valuable at a faster clip. This means more ephemeral software with a shorter shelf life. What exactly is wrong with a world where software is borderline disposable?

        I’ve been using Photoshop since the 90s, and without having watched the features expand over the years, I don’t think the tool would be useful to someone without a lot of experience.

        This being said, short-lived and highly targeted, less feature-full software for image creation and manipulation catered to the individual and specific to an immediate task seems advantageous.

        Dynamism applied not to the code but to the products themselves.

        Or something like that.

      • hebejebelus 9 hours ago
        Yes, I didn't do a great job of managing my language in that post (I blame flu-brain). In the case where _someone_ is going to be reading the code I output, I do review it and act more as the pilot-not-flying rather than as a passenger. For personal code (as opposed to code for a client), which is the majority of stuff that I've written since Opus 4.5 released, that's not been the case.

        I'll update the post to reflect the reality, thanks for calling it out.

        I completely agree with your comment. I think the ability to review code, architecture, abstractions matters more than the actual writing of the code - in fact this has really always been the case, it's just clearer now that everyone has a lackey to do the typing for them.

      • agentifysh 6 hours ago
        6 months ago I agreed with your statement

        but AI being solely a force multiplier is not accurate; it is an intelligence multiplier. There are significantly better ways now to apply skills and taste with less worry about technical debt. AI coding agents have gotten to the point that they virtually remove ALL effort barriers, even for paying off technical debt.

        While it is still important to pay attention to the direction in which your code is being generated, the old fears and caution we attributed to previous iterations of AI codegen are largely being eroded, and this trend will continue to the point where our "specialty" will no longer matter.

        I'm already seeing small businesses that laid off their teams and the business owner is generating code themselves. The ability to defend the thinning moat of not only software but virtually all white collar jobs is getting tougher.

    • lifetimerubyist 8 hours ago
      Instead of becoming a people manager you're just a bot manager. Same roles, different underlings.
  • Blackthorn 13 hours ago
    How can you say it's a 100% faithful recreation if you've never programmed DSP before?
    • gbraad 7 hours ago
      Standard AI response. Similar to "production-ready", "according to industry standards" or "common practices" to justify an action or indicate it is done, without even compiling or running the code, let alone understanding the output. An AI can't hear, and even worse, can't relate to this. Ask it to create a diode ladder filter, and it will boast it created a "physically correct analog representation" while outputting clean and pure signals...
      • Archit3ch 5 hours ago
        For context, I'm working on a proper SPICE component-level Diode Ladder.

        I tried this for laughs with Gemini 3 Pro. It spit out the same ZDF implementation that is on countless GitHub repos, originating from the 2nd Pirkle FX book (2019).

        • gbraad 3 hours ago
          Ha! Textbook... Literally.

          Since there is an Ursa Major project on GitHub, made by an owner of the hardware who also reimplemented it as a plugin based on observation, I wonder how much was regurgitated by the AI agent.

    • utopiah 12 hours ago
      Indeed, the same questions came up a few days ago when somebody shared a "generated" NES emulator. We need this answered when sharing, otherwise we can't compare.
      • Xmd5a 11 hours ago
        I’m not claiming a 100% faithful physical recreation in the strict scientific sense.

        If you look at my other comment in this thread, my project is about designing proprioceptive touch sensors (robot skin) using a soft-body simulator largely built with the help of an AI. At this stage, absolute physical accuracy isn’t really the point. By design, the system already includes a neural model in the loop (via EIT), so the notion of "accuracy" is ultimately evaluated through that learned representation rather than against raw physical equations alone.

        What I need instead is a model that is faithful to my constraints: very cheap, easily accessible materials, with properties that are usually considered undesirable for sensing: instability, high hysteresis, low gauge factor. My bet is that these constraints can be compensated for by a more circular system design, where the geometry of the sensor is optimized to work with them.

        Bridging the gap to reality is intentionally simple: 3D-print whatever geometry the simulator converges to, run the same strain/stress tests on the physical samples, and use that data to fine-tune the sensor model.

        Since everything is ultimately interpreted through a neural network, some physical imprecision upstream may actually be acceptable, or even beneficial, if it makes the eventual transfer and fine-tuning on real-world data easier.

        • utopiah 10 hours ago
          Well I'm glad you find new ways to progress on whatever you find interesting.

          This honestly though does not help me estimate whether what you claim it to be is what it is. I'm not necessarily the audience for either project, but my point remains:

          - when somebody claims to recreate something, regardless of why and how, it helps to understand how close they actually got.

          It's not negative criticism, by the way. I'm not implying you did not recreate the DSP faithfully enough (or the other person the NES). I'm only saying that for onlookers, people like me who could be potentially interested, who do NOT have a good understanding of the process nor of the original object recreated, it is impossible to evaluate.

          • Xmd5a 8 hours ago
            Oh. just to be clear first, I’m not the OP. Sorry for the confusion.

            I do understand your point, and I think it’s a fair one: when someone claims to "recreate" something, it really helps readers to know how close the result is to the original, especially for people who don’t already understand the domain.

            I was mostly reacting to the idea that faithfulness always has to be the primary axis of evaluation. In practice, only a subset of users actually care about 100% fidelity. For example with DSP plugins or NES emulators, many people ultimately judge them by how they sound or feel, especially when the original artifact is aesthetic in nature.

            My own case is a bit different, but related. Even though I’m working on a sensor, having a perfectly accurate physical model of the material is secondary to my actual goal. What I’m trying to produce is an end result composed of a printable geometry, a neural model to interpret it, and calibration procedures. The physics simulator is merely a tool, not a claim.

            In fact, if I want the design to transfer well from simulation to reality, it probably makes more sense to intentionally train the model across multiple variations of the physics rather than betting everything on a single "accurate" simulator. That way, when confronted with the real world, adaptation becomes easier rather than harder.

            So I fully agree that clarity about "how close" matters when that’s the claim. I’m just suggesting that in some projects, closeness to the original isn’t always the most informative metric.

            One reason I find my case illuminating is that it makes the "what metric are we optimizing?" question very explicit.

            Sure, I can report proxy metrics (e.g. prediction error between simulated vs measured deformation fields, contact localization error, force/pressure estimation error, sensitivity/resolution, robustness across hysteresis/creep and repeated cycles). Those are useful for debugging.

            But the real metric is functional: can this cheap, printable sensor + model enable dexterous manipulation without vision – tasks where humans rely heavily on touch/proprioception, like closing a zipper or handling thin, finicky objects – without needing $500/sq-inch "microscope-like" tactile sensors (GelSight being the canonical example)?

            If it gets anywhere close to that capability with commodity materials, then the project is a success, even if no single simulator configuration is "the" ground truth.

            What could OP’s next move be? Designing and building their own circuit. Likewise, someone who built a NES emulator might eventually try designing their own console. It doesn’t feel that far-fetched.

            • utopiah 8 hours ago
              Ah that makes more sense, I couldn't make the connection!

              So, on "So I fully agree that clarity about "how close" matters when that’s the claim. I’m just suggesting that in some projects, closeness to the original isn’t always the most informative metric.": it reminds me of https://en.wikipedia.org/wiki/Goodhart%27s_law

              That being said, as OP titled it "I used AI to recreate X", I would still argue that the audience now has the expectation that whatever OP created, regardless of why and how, should be relatively close to X. If people are experts on X then they can probably figure out quite quickly if it is "close enough" for them, but for others it's very hard.

      • le-mark 12 hours ago
        At some point the LLM ingested a few open source NES emulators and many articles on their architecture. So I question the LLM creativity involved with these types of examples. Probably also for DSPs.
        • steveBK123 12 hours ago
          Right, the amount of hallucinated response data I see at work using any of these leading models is pretty staggering. So anytime I see one of these “AI created a 100% faithful ___” type posts that does not have detailed testing information, I laugh. Without that, this is v0 and only about 5% of the effort.
        • utopiah 11 hours ago
          > i question the llm creativity involved with these types examples.

          Indeed, but to be fair I'm not sure anybody claimed much "creativity", only that it worked... but that itself is still problematic. What does it even mean to claim it managed to implement an alternative if we don't have an easy way to verify?

    • glimshe 12 hours ago
      Perhaps a subjective evaluation based on how it sounds.
      • steveBK123 12 hours ago
        It’s bold to call it 100% faithful without some rigorous test harness though, isn’t it?
    • johnwheeler 10 hours ago
      I had the hardware for both units and used them extensively, so I'm 100% familiar with how they sound.

      And I'm not doing it based off of my ears. I know the algorithm, have the exact coefficients, and there was no guesswork except for the potentiometer curves and parts of the room algorithm that I'm still working out, which is a completely separate component of the reverb.
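
      The potentiometer-curve guesswork mentioned here is a well-known sticking point. One common starting shape, purely as an assumed placeholder rather than anything recovered from the hardware, is an exponential approximation of an audio (log) taper:

```python
import math

def audio_taper(x: float, b: float = 3.0) -> float:
    """Approximate audio (log) taper: slow at first, fast near the top.

    x is the normalized knob position in [0, 1]; b is an assumed shape
    parameter, not a value measured from the original unit."""
    return (math.exp(b * x) - 1.0) / (math.exp(b) - 1.0)

for knob in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(f"{knob:.2f} -> {audio_taper(knob):.3f}")
```

      Fitting `b` against measurements of the real pots would turn this guess into data.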

      But when I put it up for sale, I'll make sure to go into detail about all that so people who buy it know what they're getting.

      • vunderba 9 hours ago
        Can you sell it, or would you have to do some renaming in order to get around trademark/etc ?

        Consider reaching out to Audiority - I know they have some virtual recreations of Space Station hardware.

        https://www.audiority.com/shop/space-station-um282

        • johnwheeler 9 hours ago
          Luckily the trademark is public domain!
          • psobot 6 hours ago
            Are the ROMs, though? (Not trying to be combative; I've had to deal with this a lot when developing emulation-based plugins.)
      • wrl 8 hours ago
        Are you also going to go into detail about the use of AI to generate the code?
      • indigodaddy 7 hours ago
        Sell it?
    • huflungdung 13 hours ago
      [dead]
    • steveBK123 13 hours ago
      Dude it’s AI just trust him
      • bigfishrunning 12 hours ago
        He included "100% faithful" in the prompt!
        • steveBK123 12 hours ago
          “You are an elite DSP programmer who never makes mistakes..”
    • baq 12 hours ago
      Maybe the OP has the hardware and can compare the sound both subjectively and objectively? Does it have to be 100% exact copy to be called the same? (Individual electronic components are never the same btw)
      • Blackthorn 12 hours ago
        The OP didn't clarify. But if there's a claim of 100% faithful recreation, I'd expect something to back it up, like time- and frequency-domain comparisons of input and output with different test signals. Or at least something. But there isn't anything.

        The video claims: "It utilizes the actual DSP characteristics of the original to bring that specific sound back to life." The author admits they have never programmed DSP. So how are they verifying this claim?
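
        For reference, the simplest such harness is a "null test": subtract the candidate's output from a hardware capture and report the residual level. A hypothetical sketch (none of these names or numbers come from the project under discussion):

```python
import numpy as np

def null_test_db(reference: np.ndarray, candidate: np.ndarray) -> float:
    """RMS of the difference signal relative to the reference, in dB.

    More negative means closer; -inf would be a bit-exact match."""
    residual = reference - candidate
    rms_ref = np.sqrt(np.mean(reference ** 2))
    rms_res = np.sqrt(np.mean(residual ** 2))
    if rms_res == 0.0:
        return float("-inf")
    return float(20 * np.log10(rms_res / rms_ref))

rng = np.random.default_rng(0)
reference = rng.standard_normal(48_000)                      # stand-in for a hardware capture
candidate = reference + 1e-4 * rng.standard_normal(48_000)   # near-identical model output

print(null_test_db(reference, candidate))  # roughly -80 dB
```

        The same residual can also be inspected per frequency band to localize where a model diverges.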

        • johnwheeler 10 hours ago
          Well it's a new project so give it some time. I feel confident that I'm not lying so I can make that claim.

          Also its target market is not a technical crowd but people who make music. I'm optimizing more for what they want to see (which are sound demos) rather than what a programmer would want to see.

      • _DeadFred_ 5 hours ago
        That might make it 100% faithful for OP's use cases, but not necessarily anyone else's.
  • Xmd5a 11 hours ago
    Very nice work. I’m curious: what kinds of projects are you guys currently working on that genuinely push you out of your comfort zone?

    I had a small epiphany a couple of weeks ago while thinking about robot skin design: using conductive 3D-printed structures whose electrical properties change under strain, combined with electrical impulses, a handful of electrodes, a machine-learning model to interpret the measurements, and computational design to optimize the printed geometry.

    While digging into the literature, I realized that what I was trying to do already has a name: proprioception via electrical impedance tomography. It turns out the field is very active right now.

    https://www.cam.ac.uk/stories/robotic-skin

    That realization led me to build a Bergström–Boyce nonlinear viscoelastic parallel rheological simulator using Taichi. This is far outside my comfort zone. I’m just a regular programmer with no formal background in physics (apart from some past exposure to Newton-Raphson).

    Interestingly, my main contribution hasn’t been the math. It’s been providing basic, common-sense guidance to my LLM. For example, I had to explicitly tell it which parameters were fixed by experimental data and which ones were meant to be inferred. In another case, the agent assumed that all the red curves in the paper I'm working with referred to the same sample, when they actually correspond to different conducting NinjaFlex specimens under strain.

    Correcting those kinds of assumptions, rather than fixing equations, was what allowed me to reproduce the results I was seeking. I now have an analytical, physics-grounded model that fits the published data. Mullins effect: modeled. Next up: creep.
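
    The fixed-vs-inferred split described above can be shown with a toy model, a plain exponential decay rather than the actual Bergström-Boyce equations; every name and number here is made up for illustration:

```python
import numpy as np

# "Experimental" data generated from a known ground truth for the demo.
x = np.linspace(0.0, 2.0, 50)
a_fixed = 2.0    # treated as fixed by experimental data (assumed known)
k_true = 1.5     # the parameter we pretend not to know
y = a_fixed * np.exp(-k_true * x)

# Infer only k: with a_fixed pinned, log(y / a_fixed) = -k * x is linear
# in k, so ordinary least squares recovers it directly.
k_est = -float(np.linalg.lstsq(x[:, None], np.log(y / a_fixed), rcond=None)[0][0])
print(round(k_est, 6))  # recovers 1.5
```

    Telling the solver which parameters are pinned is exactly the kind of common-sense guidance described in the comment above.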

    We’ll see how far this goes. I’ll probably never produce anything publishable, patentable, or industrial-grade. But I might end up building a very cheap (and hopefully not that inaccurate), printable proprioceptive sensor, with a structure optimized so it can be interpreted by much smaller neural networks than those used in the Cambridge paper.

    If that works, the gesture will have been worth it.

    • brcmthrowaway 7 hours ago
      I would put more effort into algotrading to make $$$ for yourself
  • LatencyKills 13 hours ago
    This is fantastic. I’m currently building a combustion engine simulator doing exactly what you did. In fact, I found a number of research papers, had Claude implement the included algorithms, and then incorporated them into the project.

    What I have now is similar to https://youtu.be/nXrEX6j-Mws?si=XdPA48jymWcapQ-8 but I haven’t implemented a cohesive UI yet.

    • johnwheeler 9 hours ago
      Right on, that's awesome! I think I'm doing more what you did vs. the other way around. Looks like you're pretty established. How long did it take to build your YouTube channel to what it is? What's that process been like?
  • dubeye 13 hours ago
    Awesome. In 2025 I made a few apps for my small business that I had spent hours trawling the web looking for, and I have little coding skills.

    Sometimes it feels like I'm living in a different world, reading the scepticism on here about AI.

    I'm sure there are enterprise cases where it doesn't make sense, but for your everyday business owner it's amazing what can be done.

    Maybe it's a failure of imagination, but I can't imagine a world where this doesn't impact enterprise in short order.

    • hatefulheart 13 hours ago
      With all due respect you are living in a different world. Not in a bad way, it’s just you haven’t experienced what maintenance on a large complicated code base is like.
      • williamcotton 49 minutes ago
        Maybe the problem is large complicated codebases?
      • mirsadm 7 hours ago
        The worst part of the new wave of vibe coders is their confidence.
      • HPsquared 12 hours ago
        I think there will be a transition period.
      • kaffekaka 13 hours ago
        Different worlds yes, but they both exist.
        • hatefulheart 13 hours ago
          Sure, where one is ignorant of the other. That’s not a pro.
          • kaffekaka 12 hours ago
            Small business owners not being aware of maintenance hell in large org codebases, yes, is that a problem?

            I work for a large org and maintenance hell is my job, so I see both sides I think.

            • hatefulheart 12 hours ago
              I’m a small business owner and solo developer on that business. Let’s just say I’d rather know the costs of my choices upfront. I’m sure there is not one small business owner in tech who would turn their nose up at that.
              • dubeye 10 hours ago
                Good points. It works both ways though: you are splitting your time between two worlds, and don't have a fully clear view of the costs of bad choices in a small business.

                To know this you'd need to know what processes these businesses have been using for the past decade to run real full-time businesses with full-time staff. For example, you don't know just how bad the prior systems were that the self-built systems replaced.

                With all due respect, you don't have all the info to make the calculation about my world, just as I don't have it for yours.

                The same tool that helped me build our systems is not going to be the same tool that helps you maintain your large code base. But my point is that I'm on the front line of change, and my guess is it's not going to be limited to my size of business. I don't know what your tool will look like, but I'd bet it's coming.

              • kaffekaka 9 hours ago
                Fair enough, that is a good point.
          • cyberge99 12 hours ago
            A pro is someone who makes money doing their profession.
            • hatefulheart 11 hours ago
              Pro as in pros and cons, not as in professional.
    • johnwheeler 10 hours ago
      No, you're absolutely right. One of the things I'm starting to see (I wrote another Hacker News post about this) is that more people are starting to come out talking about all the mistakes AI is making, even as it gets better. Then you've got people like Karpathy talking about how drastically the landscape is shifting.

      I've been doing this for 25 years and I can tell you that the AI is a better coder than me, but I know how to use it. I review the code that it puts out and it's better. I'm assuming the developers that are having a hard time with it are just not as experienced with it.

      If you think your job is going to stay programmer, I just don't see it. I think you need to start providing value and using coding as just a means to do that, more so than coding being valuable in itself. It's just not as valuable anymore.

      • pengaru 4 hours ago
        > you're absolutely right.
  • gbraad 7 hours ago
    Isn't that like the Ursa Major Stargate 323 Reverb? Greybox audio released code for this about a year ago: https://github.com/greyboxaudio/SG-323
    • pera 6 hours ago
      Thanks for mentioning this project, I have been looking for a good reverb plugin for Linux for a while now and this sounds great.
      • gbraad 4 hours ago
        There might be a plugin based on Freeverb, which is also a good-sounding one. I have it as a logue unit, so I can't recommend one immediately. At least I know Greybox's is based on actual device comparison, as he owns one and has been doing this for 5 years sans AI.
  • utopiah 12 hours ago
    I'm not in the domain, even though I did dabble with DAWs and tinker with a PGB-1 and its open source firmware, but how far would you say CMajor helped? I feel like solely picking the right tool, be it framework, paradigm, etc., can make or break a project.

    Consequently, for me to better understand how special this is, I'd appreciate knowing (especially since I don't see a link to the code itself) how one goes from e.g. https://cmajor.dev/docs/GettingStarted#creating-your-first-p... to a working DSP.

  • alexjplant 2 hours ago
    Nice! Earlier this week I discovered another enterprising engineer working on a digital sim of the Mesa Mark IIC+ preamp using a discrete component modeling approach [1]. Pretty cool stuff coming out in the digital audio production space these days.

    [1] https://www.youtube.com/watch?v=GcdyOtO5Id0

  • jvanderbot 12 hours ago
    On your "Scares the shit out of me" comment.

    Use AI like a CNC machinist uses a mill. You're still in the loop, but break it into manageable "passes" with testing touchpoints. These touchpoints allow you to understand what's going on. Nothing wrong with letting AI oneshot something, but it's more fun and less ennui to jump in and look around and exercise some control here and there. And, on larger systems, this is basically required. (for now, perhaps).

    This is how I do it now: https://jodavaho.io/posts/ai-useage-2025.html

    • vardalab 8 hours ago
      Exactly this! I am a retired EE just messing around with AI in my homelab datacenter, and that has been my approach as well. Amazing force multiplier: I can finally create more or less what I want in software, based on first principles and basic systems-engineering approaches, just by guiding AI. I have used Golang, Ansible, Terraform, and TypeScript, languages I never had time to learn, and now I can create working tools/solutions for whatever my need is at the moment.

      The other day my STT subscription app became too laggy, so I asked Claude to spin up an endpoint on one of my GPU boxes, create a proxy server, intercept transcript-cleanup calls, create traces in Langfuse, set up a prompt eval framework, etc. We make a plan, iterate on it, I usually get Codex or Gemini in on the call as well, and in a couple of hours I have a good-enough solution for my personal needs. This would probably have been a weekend-or-more project before.

      The skepticism here does remind me a bit of when I learned how to use hand tools for woodworking. Ultimately it was nice to be able to make mortises by hand with a chisel, but damn if using Festool Dominoes is not that much more productive.
  • newyankee 7 hours ago
    A similar approach might actually help build cheap, and decent hearing aids too
  • KellyCriterion 13 hours ago
    Great achievement!

    Regarding your own titling: you are now some type of "platform operator/manager" of these agents :-))

  • gus_massa 1 day ago
    Did you recreate the UI only, or also the internal circuits? Does it produce a similar distortion?
    • johnwheeler 19 hours ago
      I recreated the UI and the internal circuits, but it's a hundred percent DSP. The SST206 is a recreation of the SST282 (in DSP); he expanded the bandwidth from 7 kHz to 22 kHz, so it doesn't produce distortion, but it can get dark like the original. The SST206 isn't grungy like the original, so it lacks some of that character, but it makes up for it in delay time.
  • drcongo 13 hours ago
    Cmajor, for anyone wondering: https://github.com/cmajor-lang/cmajor
    • reactordev 13 hours ago
      All I’ve ever known was JUCE. This looks nice!

      *edit* well duh, it’s the same guy!

  • atentaten 13 hours ago
    Nice! Which DAW are you using in the video?
  • tomhow 14 hours ago
    [under-the-rug stub]

    [see https://news.ycombinator.com/item?id=45988611 for explanation]

    • deaux 13 hours ago
      18 days late, but we needed one of those here: https://news.ycombinator.com/item?id=46290617
    • BenjaminHas 1 day ago
      This is really impressive! I love how you combined AI assistance with schematics, patents, and ROMs to recreate it.
      • johnwheeler 1 day ago
        Thank you so much. Yeah, it's really cool. I also bought myself a copy of Max/MSP, and I'm recreating it visually so I can tweak all the parameters and really understand what's going on; the AI did all of the theory for me, but I have all the numbers, and when you tweak them it sounds totally different. So yeah, I'm excited. This is what I want to do with my life: build electronic music gear. I'm not saying AI won't be able to do that one day, but I just can't do web programming anymore. I've been doing it too long and I just don't like it anymore.
        • chunkmonke99 1 day ago
          Wait you used Claude Code to recreate patents and schematics? Are the schematics for this easily available somewhere? Was Claude just able to one-shot this?
          • johnwheeler 19 hours ago
            I used Claude more as a learning tool in this context. It's kind of funny: I actually got the idea because I heard that in China they're basically replacing teachers with AI, whereas in the United States we're trying to get AI out of our school systems. So I went into it with that mindset: instead of having Claude do the whole thing, I had it teach me how to do it so I understand it. I'm still learning, trying to recreate things in Max so I can have a lot more control and really play with it. I'm learning that reverb creation is a real craft.

            It's not able to one-shot it yet, but I'm sure that's coming sometime this year. I did the UI a hundred percent by myself, and I went in there, tweaked things, tried to rebuild them, and tried to understand how reverb works, etc. I also did a lot of the software licensing myself, just because I have experience with that.

            • chunkmonke99 2 hours ago
              I am not seeing any evidence of China "replacing teachers with AI" anywhere (did some googling/geminiing). Are there any sources on this? Seems like they are trying to introduce students to GenAI/ML principles and creating "AI literacy guidelines" without just "replacing teachers with AI". Their current guidelines outright prohibit the use of AI to replace teachers' responsibilities.

              What is the point of asking it to teach you something to "understand it" if Claude can just do it for you? This is the real question everyone should be asking beyond just employment (employment will definitely change in the coming months, no doubt). I would pivot away from programming personally.

    • RiverStone 1 day ago
      Nicely done!!
    • ndgold 1 day ago
      Instant ROI
    • ushdbz 1 day ago
      [flagged]
    • ushdbz 1 day ago
      [flagged]
    • ushdbz 1 day ago
      [flagged]
  • synthded 12 hours ago
    [dead]
  • agentifysh 6 hours ago
    We are all glorified QA testers with a software architect title now. Sure, we set the structure of what we want, but the AI does everything else, and most of our time is now spent testing and complaining to the AI.

    Pretty soon AI will do the QA portion as well. It will generate any piece of software, even games, for a cool $200/month from a vendor of choice: Microsoft (OpenAI) or Google.

    Companies will stop paying for SaaS or complex ERP software; they will just generate their own, which only the AI knows how to maintain, run, and add features to.

    It's ironic that software developers are the most enthusiastic about automating their jobs out of existence. No union, no laws that interfere with free market forces.