Embedded Development Predictions for 2020-2029
Greetings,
I am stealing the idea for this thread from a discussion on Reddit. I am curious to read the predictions from the EmbeddedRelated community.
What do you think the next 10 years have in store in terms of Embedded Development? Will there be any significant change in which programming languages are used to develop embedded systems, will RISC-V gain traction, etc.?
Looking forward to reading your thoughts!
Thanks.
Stephane
I must generally predict more of the same, where "same" means:
- Mostly C code based projects (55-70%) (sigh)
- Fragmented, as it has been, because each product, business domain, technology domain, etc., has its own unique needs, from performance to NRE to recurring costs.
- Long live ARM. Long live MSP430. Long live PIC. etc.
- RTOS environment will continue to be fragmented. I'll be mildly bold and predict further shifting to FreeRTOS, especially with the impact of Amazon.
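To put a face on that FreeRTOS prediction, part of its appeal is how little ceremony it demands. Below is a minimal sketch of an entry point, assuming a working port and heap configuration; toggle_led() is a hypothetical board-specific helper, not a FreeRTOS call.

```cpp
// Minimal FreeRTOS entry-point sketch. toggle_led() is a hypothetical
// board-specific GPIO helper; everything else is the standard FreeRTOS API.
#include "FreeRTOS.h"
#include "task.h"

extern "C" void toggle_led(void);  // hypothetical BSP function

static void blinkTask(void *params)
{
    (void)params;
    for (;;) {
        toggle_led();
        vTaskDelay(pdMS_TO_TICKS(500));  // block this task for 500 ms
    }
}

int main(void)
{
    xTaskCreate(blinkTask, "blink", configMINIMAL_STACK_SIZE,
                nullptr, tskIDLE_PRIORITY + 1, nullptr);
    vTaskStartScheduler();  // does not return if the scheduler starts
    for (;;) {}             // only reached if there was not enough heap
}
```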
What I would LIKE to see:
- More modern C++ (C++14 or newer). Maybe MicroPython in some projects.
- More pub/sub frameworks based on the active object concept, like QP (see the sketch after this list).
- More unit testing and preference for TDD. Please!
- Greater focus on code quality and architecture. Please!
- Some minor up-front planning wouldn't hurt.
- A realization that faux-agile mentality is hurting embedded software and firmware projects.
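On the active object wish: for anyone who hasn't met the pattern, here is a rough C++14 sketch of the core idea, using std::thread and std::function for brevity. A real embedded version (QP included) would use static event objects and RTOS queues rather than the heap, but the shape is the same: one private thread, one event queue, handlers that run to completion.

```cpp
// Rough sketch of the active-object pattern: the object owns a private
// thread and an event queue, and the rest of the system interacts with
// it only by posting events.
#include <condition_variable>
#include <functional>
#include <mutex>
#include <queue>
#include <thread>

class ActiveObject {
public:
    ActiveObject() : worker_([this] { run(); }) {}

    ~ActiveObject() {
        post([this] { running_ = false; });  // "poison pill" stop event
        worker_.join();
    }

    // Called from any thread: enqueue an event for the worker to process.
    void post(std::function<void()> event) {
        {
            std::lock_guard<std::mutex> lock(mutex_);
            queue_.push(std::move(event));
        }
        cv_.notify_one();
    }

private:
    void run() {
        while (running_) {
            std::function<void()> event;
            {
                std::unique_lock<std::mutex> lock(mutex_);
                cv_.wait(lock, [this] { return !queue_.empty(); });
                event = std::move(queue_.front());
                queue_.pop();
            }
            event();  // events run to completion, one at a time
        }
    }

    std::queue<std::function<void()>> queue_;
    std::mutex mutex_;
    std::condition_variable cv_;
    bool running_ = true;  // touched only on the worker thread
    std::thread worker_;   // declared last so it starts after the rest
};
```

The payoff is that handlers never touch shared state from more than one thread, so they need no locks of their own.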
Hope everyone has a great New Year. And may we all be fruitful and prosperous!
Matthew
https://covemountainsoftware.com/consulting/
Good question -- I think one way to look at this is to consider what we expect to be coming our way application and technology-wise, and then "work back" to what we will need to achieve this.
I personally think that the biggest game-changer is going to be the deployment of mixed-reality (MR) with artificial intelligence (AI), where MR encompasses things like augmented reality (AR) and diminished reality (DR). See also my columns "What the FAQ are AI, ANNs, ML, DL, and DNNs?" (https://www.clivemaxfield.com/fundamentals-ai-anns...) and "What the FAQ are VR, MR, AR, DR, AV, and HR?" (https://www.clivemaxfield.com/fundamentals-vr-mr-a...).
I do expect RISC-V to gain traction -- but I also expect to see standard processors like this coupled with the widespread deployment of AI-specific chips. With regard to languages, C and C++ aren't going to go away, and Python will continue its slow crawl into appropriate systems, but I do expect to see MR and AI tools that allow these applications to be developed at a higher level of abstraction, thereby making these technologies available to a wider audience.
Think of how we used to write in assembly language, then C came along. Moving to the higher level of abstraction allowed programmers to more easily capture larger programs and experiment with different solutions. Using existing frameworks like TensorFlow is the AI equivalent of working at the assembly level. There was a company called Bonsai that had tools at a higher level of abstraction, but they were quickly snapped up by a bigger company (Microsoft, as I recall).
C will still be the primary language. There really is no other language for the task.
The size of logic is pretty close to the limit of physics (Moore's Law has basically bottomed out).
Quantum computing is getting closer && money is being spent on research. If they can do ambient temperature versions, they might become embedded.
Chips with multiple small cores. Think of FPGA-style chips with arrays of small MPUs, some heterogeneous and some identical, for parallel processing. The tech already exists.
GPUs will be integrated with traditional MPUs - this is pretty obvious.
Hardware solutions for internet communications with cyber security built-in.
Some embedded medical devices (pacemakers, insulin pumps, etc) will be hacked and some people will die. This will force cyber security for embedded medical devices.
Tools will get better. High-level design with auto-code generators. Better ways of proving programs do what they should. Better testing methods, including TDD. Better code checkers for dangerous usage. Simulators looking for edge cases && race conditions.
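On the TDD point: the loop is cheap enough to show in a few lines. Plain asserts stand in here for an embedded-friendly harness like Unity or CppUTest, and the function under test is invented for the example.

```cpp
// TDD in miniature: the tests below were (notionally) written first,
// then sat_add_u16() was written to make them pass.
#include <cassert>
#include <cstdint>

// Function under test: saturating add for a 16-bit accumulator.
std::uint16_t sat_add_u16(std::uint16_t a, std::uint16_t b)
{
    std::uint32_t sum = std::uint32_t(a) + b;  // widen to avoid wrap
    return sum > 0xFFFF ? 0xFFFF : std::uint16_t(sum);
}

int main()
{
    assert(sat_add_u16(1, 2) == 3);            // normal path
    assert(sat_add_u16(0xFFFF, 1) == 0xFFFF);  // must saturate, not wrap
    assert(sat_add_u16(0x8000, 0x8000) == 0xFFFF);
    return 0;  // all tests passed
}
```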
IoT will continue to grow. Industrial use will increase. Home automation will increase. There will be a push-back on privacy issues (people will become more concerned, hopefully.)
VR/AR/MR will become ubiquitous in many areas, driven by dedicated embedded systems && advances in manufacturing techniques. This will cause a number of changes in many fields, both personal and industrial. Tools will become simpler. Track the number of applications - the knee of the curve is pretty close.
More ARM-style IP (let others make the chips).
Services that fabricate "roll-your-own" chips will become available. Basically, the same model as a 3D printing service.
An increase in the number of people doing their own hardware, using tools like KiCad and Chinese board houses. This can be taught in junior high and high schools.
My prediction (or plea) would be to advance the state-of-the-art for space computing. Increasing the robustness and performance of computing systems we launch is required for the kinds of capabilities we're trying to implement.
Until recently, satellites have had minimal requirements for on-board processing. A lot of data is just collected and routed to the ground for processing. In order to accomplish things like autonomous guidance and robotic control, we need processors, GPUs, FPGAs, etc. with modern capabilities that can withstand the radiation environment.
Hopefully the larger presence of commercial companies will increase demand for these products.
Easy predictions:
- C will continue to dominate in microcontrollers (bare metal or RTOS), and C++ will gain in microprocessors running embedded Linux.
- Multi-core microprocessors will become less costly and more common.
- 'PC' level module integration such as built-in 1G Ethernet, USB 3, WiFi, etc. requiring nothing but pins routed to connectors will become more common on lower-cost embedded processors (along with attendant security problems).
- More onboard RAM, FLASH etc. Possibly new technologies such as MRAM or FRAM, and integrated SSD in microprocessors.
One Reddit poster postulated "10% FPGAs or hybrid uC/FPGAs get incorporated into many embedded products". Looking forward, I'd love to see more Zynq-like parts available at lower cost, and GPIO ports replaced with PLD ports; there's nothing stopping the foundries from making such parts. RISC-V would also be more likely to happen if parts became more programmable. However, I don't really see that taking off unless a ubiquitous gcc-style, SystemC-like toolchain becomes available. Would be nice, though.
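For readers who haven't seen it, "SystemC-like" means describing hardware as ordinary C++ classes and simulating it with a stock compiler. A minimal counter module in actual SystemC looks something like this (sketched from memory, so treat the details as approximate):

```cpp
// Minimal SystemC module: an 8-bit counter that increments on each
// rising clock edge, plus a tiny testbench in sc_main().
#include <systemc.h>

SC_MODULE(Counter) {
    sc_in<bool>        clk;
    sc_out<sc_uint<8>> count;

    void tick() { count.write(count.read() + 1); }

    SC_CTOR(Counter) {
        SC_METHOD(tick);
        sensitive << clk.pos();  // trigger on the rising edge
    }
};

int sc_main(int, char*[]) {
    sc_clock clock("clock", 10, SC_NS);  // 10 ns period
    sc_signal<sc_uint<8>> count;
    Counter counter("counter");
    counter.clk(clock);
    counter.count(count);
    sc_start(100, SC_NS);                // simulate 100 ns
    return 0;
}
```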
I could see Python, Rust, et al. becoming more common in the larger integrated microprocessors running embedded Linux, because the toolchains and run-time environments are rather easily ported/cross-compiled. However, few other languages are as efficient at twiddling bits as C, and I don't see anyone replacing the massive existing C toolchain anytime soon on microcontrollers.
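That bit-twiddling point, concretely: the idiom below is what C (and C-style C++) makes essentially free. The base address and register offset are invented for illustration; real values come from the part's datasheet.

```cpp
// Memory-mapped register access, the bread and butter of C-style embedded
// code. The address 0x40020014 is made up for this example.
#include <cstdint>

static volatile std::uint32_t* const GPIO_ODR =
    reinterpret_cast<volatile std::uint32_t*>(0x40020014u);  // hypothetical output data register

inline void led_on(unsigned pin)     { *GPIO_ODR |=  (1u << pin); }
inline void led_off(unsigned pin)    { *GPIO_ODR &= ~(1u << pin); }
inline void led_toggle(unsigned pin) { *GPIO_ODR ^=  (1u << pin); }
```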
Looking back a decade, the biggest surprise to me was the rise of the Raspberry Pi and other $35 microprocessor modules. It had a significant impact both on existing embedded processor module vendors and on enabling hobbyists in the embedded space. A considerable amount of innovation enabled by the Raspberry Pi trickled into the 'certified' embedded space, and the latest generation of Pi modules is being used for everything from telephony PBXs to compute clusters. I expect to see a lot more innovation from that space creeping into 'corporate embedded'.