An article published this week in EDN about LTE baseband modems perfectly underscores how software is becoming much more important in the development of advanced nanometer SoCs. The article was written by Pascal Herczog of Cognovo, and it details the rise in computation required to implement successively complex baseband modems for cellular phone handsets. A graph in the article shows that computational requirements for cellular baseband modems have gone from essentially zero GOPS (giga operations per second) for GSM phones to 0.2 GOPS for GPRS phones to 5 GOPS for HSPA phones to 50 GOPS for LTE phones operating at 150 Mbps data rates. The next step up, LTE-A, will require an estimated 300 GOPS.
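To get a feel for what those figures imply, here is a back-of-envelope sketch. The per-core throughput figure is an assumption for illustration only (a conventional DSP issuing 8 operations per cycle at 1 GHz, hence roughly 8 GOPS); the GOPS requirements are the ones from the article's graph:

```python
import math

# Assumed, for illustration: a conventional DSP issuing 8 ops/cycle at 1 GHz
dsp_gops = 8.0

# GOPS requirements per cellular generation, from the article's graph
requirements = {
    "GPRS": 0.2,
    "HSPA": 5,
    "LTE (150 Mbps)": 50,
    "LTE-A": 300,
}

for standard, gops in requirements.items():
    # Minimum number of such DSP cores needed to hit the requirement
    cores = math.ceil(gops / dsp_gops)
    print(f"{standard}: {gops} GOPS -> at least {cores} core(s)")
```

Under that assumption, LTE already needs on the order of seven such cores and LTE-A closer to forty, which is why the architectural discussion below turns to vector and multicore approaches rather than a single faster DSP.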
And why do you need all of those GOPS in an LTE baseband design? To run baseband DSP software, of course, which implements high-bit-rate OFDM (orthogonal frequency division multiplexing) signal-processing algorithms that involve multiple high-throughput FFT/iFFT, FIR, and MIMO-equalization operations. A brief moment of contemplation will tell you that one conventional DSP will not deliver 50 or 300 GOPS. We’re not going to run DSPs at 50 or 300 GHz in the near or mid term. So you get choices. Herczog’s article advocates a two-processor IP block that consists of a VLIW vector signal processor, sort of a DSP on steroids, paired with a general-purpose ARM Cortex processor. There are alternative approaches from other vendors that harness half a dozen tailored processors into a multicore IP block.
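The FFT/iFFT pair sits at the heart of that OFDM workload. A minimal sketch of the idea, using NumPy and an arbitrary 64-subcarrier size (real LTE uses FFT sizes up to 2048, plus cyclic prefixes, channel equalization, and MIMO processing that this toy loopback omits):

```python
import numpy as np

N = 64  # number of subcarriers (assumed, for illustration)
rng = np.random.default_rng(0)

# Random QPSK symbols, one per subcarrier
bits = rng.integers(0, 2, size=(N, 2))
tx_symbols = ((2 * bits[:, 0] - 1) + 1j * (2 * bits[:, 1] - 1)) / np.sqrt(2)

# Transmitter: the iFFT maps frequency-domain symbols onto one
# time-domain OFDM symbol
time_signal = np.fft.ifft(tx_symbols)

# Receiver: the FFT recovers the per-subcarrier symbols
rx_symbols = np.fft.fft(time_signal)

# True for this noiseless loopback; a real channel would need equalization
print(np.allclose(rx_symbols, tx_symbols))
```

Every transmitted and received OFDM symbol triggers transforms like these, at symbol rate, across all component carriers and antennas, which is where the tens to hundreds of GOPS go.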
Whichever way you go, the common thread is going to be software, and lots of it, running through your baseband instruction mill. The cellular standards are always in flux, and casting today’s baseband algorithms in hardware is a sure way to guarantee a silicon respin next year. Far better to implement a soft modem built from processors tailored to the specific tasks in the baseband modem, says Herczog, and I think you’d be foolish to disagree. With a soft baseband modem, changes to baseband algorithms trigger only firmware changes and you avoid the need to change the hardware design, at least up to a point. Eventually, any processor-based design will run out of gas (no matter how conservative the approach) and then you will need new silicon. But software-driven SoC design gives you a significant buffer zone and protection against the fates.
If this all sounds heretical, if you think that hardware alone is always the solution, then consider this. When they first came out, MP3 players used hardware decoders rather than processors because the hardware decoders used fewer gates and less energy. Eventually, MP3 players evolved into personal media players that had to handle one or two dozen different digital-audio compression standards. It doesn’t make sense to build a dozen or two individual hardware audio decoders into an SoC. A tailored audio processor running the appropriate software codecs becomes the obvious choice, and that’s where we are today. Cellular baseband hardware design is now headed in exactly the same direction, for exactly the same reasons.
This example yet again underscores the significant change taking place in SoC design: software development is now just as important as the hardware design, and just as costly, or perhaps even more so. With that kind of weighting, it’s no wonder that the EDA360 vision broadly embraces software development as an integral part of the SoC design process.