Creating an SDCC-based development environment to replace Dynamic C?

I’ve thought a bit about the possibility of creating an alternative development environment for the Rabbits and about which steps would need to be taken to get there. These are my thoughts so far: http://www.colecovision.eu/stuff/Rabbitplan.pdf
Assuming you would want to keep developing for Rabbits (e.g. to upgrade existing Rabbit products with new firmware, etc.), do you think this plan makes sense? Are there any obvious (to you, for your use case) gaps?

It’s an interesting project, but it sounds like a lot of effort for limited gain.

You may not be aware that in late 2024 Digi International announced “end of life” (EOL) status for ALL Rabbit products. Customers had until the end of January 2025 to commit to a last-time buy (LTB) and shipping of products is scheduled to complete by the end of 2025. The LTB includes Rabbit 2000, 3000, 4000, and 6000 ICs, limiting the possibility of a third party developing new hardware based on those CPUs.

Having a modern toolchain is interesting, but I feel that most of the value of Dynamic C came from its Open Source libraries. Due to the many quirks (and limitations) of the Dynamic C compiler, I think it would be extremely difficult to create a version of SDCC that could leverage that existing codebase. Seeing a modern TCP/IP stack and mbedTLS running on a Rabbit 6000 using far pointers for everything is an exciting idea, but I think it would be an enormous undertaking for what’s soon to be a “retro” platform.

I first started using Rabbit products around 2001/2002. I created and sold a product based on the RCM2200, wrote code for companies using Rabbit hardware in their designs, worked on Rabbit products while an employee of Digi International, and have been the sole engineer responsible for supporting the Rabbit products since 2015. It certainly feels like the end of an era.


While the new toolchain is most likely too late to save the Rabbits in any way, the current situation is also a bit of a chicken-and-egg problem: as long as Dynamic C is the only viable option for Rabbit development, and Digi keeps Dynamic C to itself, no one else would want to make Rabbits. Maybe the situation would have been different if there had been an alternative toolchain a few years ago.
Anyway, I think by now we are well enough prepared for phase I that most of it will be done in less than a year. On the other hand, phases II and III might or might not happen.

Softools created an alternative toolchain over 20 years ago. I’m developing on it right now for a legacy product that needed some upgrades. It produces fantastic code, transparently handles the extended memory in the form of far pointers, and is very, very fast. Unfortunately, the Rabbit people could not see the value in it. It is still available, but with very, very limited support. However, it is so mature that for most projects no support is really needed.
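To give a rough idea of what “transparently handles the extended memory” means (a conceptual sketch only; the actual keyword spellings and code generation differ between Softools, Dynamic C, and SDCC): on the Rabbit 2000/3000, code sees a 64 KB logical address space, and the 8 KB window at 0xE000–0xFFFF is mapped into the 1 MB physical space through the XPC register, with physical = logical + XPC * 0x1000. A far pointer carries the full 20-bit physical address, and the compiler emits the window arithmetic and bank-register writes for you, roughly like this hand-written equivalent (set_xpc() is a hypothetical stand-in for the bank-register write the compiler would emit):

```c
#include <stdint.h>

/* Hypothetical intrinsic standing in for the XPC register write
 * that the compiler would emit. */
extern void set_xpc(uint8_t xpc);

/* What a far-pointer read boils down to on a Rabbit 2000/3000:
 * split the 20-bit physical address into an XPC bank value and a
 * 16-bit logical address inside the 0xE000 window, then read through
 * the window. (Addresses already in the root segment below 0xE000
 * would be handled differently by a real compiler.) */
static uint8_t read_far_byte(uint32_t phys)   /* phys: 20-bit physical address */
{
    uint8_t  xpc     = (uint8_t)((phys >> 12) - 0x0E);        /* bank value    */
    uint16_t logical = (uint16_t)(0xE000 | (phys & 0x0FFF));  /* window offset */

    set_xpc(xpc);                                     /* remap the XPC window... */
    return *(volatile uint8_t *)(uintptr_t)logical;   /* ...and read through it  */
}
```

Writing that juggling by hand for every xmem access (and restoring XPC afterwards) gets old quickly, which is why compiler-level far-pointer support is so convenient.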

Yes, it feels like the people who were in charge of Rabbit Semi never realized the importance of a well-optimizing Standard C compiler and considered Dynamic C the only way (even going as far as assuming that the value of the Rabbits was in Dynamic C).
But Softools wouldn’t be good enough today, either: it supports ISO C90 and bits of ISO C99, but nothing beyond that, and it targets the Rabbit 2000 and 3000 only. And I’d have to check how its optimizations compare with the latest Dynamic C 9 and SDCC releases; it might have been good at code generation compared to Dynamic C 9 long ago, but that wasn’t hard back then.

Softools was definitely a better compiler for those products. But it was difficult to compete with the free tools that ZWorld/Rabbit/Digi provided with their hardware.

I was on the team that added ANSI C90 support to Dynamic C 10, focusing on the Standard C Library and contributing some improvements to the compiler’s code generation. But it still had the legacy of a compiler codebase that may have gone back to the late 1980s, and of libraries written with proprietary extensions. Even the idea of a “project” with multiple C files had limited support.

There are so many embedded platforms available now that there really isn’t a place for the Rabbit ecosystem. It can’t compete with modern compilers, inexpensive 32-bit ARM CPUs, and Open Source libraries, but I think it was a great platform for its time: inexpensive hardware with an easy-to-use IDE, source code to the libraries, and sample programs that provided a kick start to developing a product.

Indeed, there are a lot of embedded platforms, though there were more before. 32-bit ARM is definitely strong, and I don’t see how the Rabbit 5000 or 6000 could compete with it, especially since both would require a lot of work in terms of drivers (relatively complex hardware), and since AFAIK they contain some Digi IP, so future Rabbit 5000 or 6000 devices would have to be made by Digi or under a license from Digi (similar to how Rochester has revived a lot of old µCs, though AFAIK no Digi ones, by getting licenses and masks).
The Rabbit 2000, 3000 and 4000 could be in a better position if someone managed to make cheap ones. Since the hardware is simple, it would also be less effort to write the drivers.
And despite ARM, there are still a lot of 8/16-bit architectures alive. Not as many as before, but they still exist.