Efficient DSP System Development with Algorithm Standards
Author: Steve Blonstein, Technical Director, Software Development Systems
As applications of digital signal processors (DSPs) expand, demand for component-oriented software modules continues to grow. Off-the-shelf algorithms supplied by third parties meet this demand at the most basic level: they let system vendors integrate more capable systems faster and at lower cost, without having to design major software functions themselves. Third-party algorithms therefore play an important role in DSP system development.
For the component-software approach to work smoothly, the components' code must be guaranteed to be interoperable, consistent, and portable. DSP vendors recognized these requirements and established algorithm standards governing the interfaces between algorithms and applications. These standards are not designed to guarantee efficient implementation; selecting the best algorithm for code size, features, and performance remains the job of the system integrator. What a standard provides is a set of rules ensuring that algorithms can work together, making them easier to evaluate and then integrate into a system environment.
Origins of algorithm standards
The demand for algorithm standards emerged in the mid-1990s, when more powerful DSPs appeared that could support multiple channels of one algorithm, or multiple algorithms, on a single device. Where an early DSP might serve only as a speech coder, a DSP from the TI TMS320C5000 platform can handle the entire digital processing chain of a cellular phone, including speech coding, audio equalization, and echo cancellation. DSPs from the TMS320C6000 platform can power DSL line cards, video servers, and other systems that demand extremely high multichannel performance from one device.
Alongside this higher performance, many new signal-processing standards emerged, including JPEG, MPEG, teleconferencing, wireless telephony, and improved modem and fax standards. Developers began building dynamic DSP-based systems whose tasks interact and change, rather than static systems with fixed functions. System code size also grew dramatically, often to accommodate the complexity of the new multifunction systems.
DSP system developers have always been short of programmers with rich experience and deep knowledge of signal processing. Now these developers were integrating ever more complex systems, while newcomers to the field were attempting their first DSP designs. A growing number of system integrators looked for ways to build increasingly complex devices without designing all the software from scratch. Fortunately, developers with proven software technology recognized the new market opportunity and began selling their intellectual property, including algorithms, as third parties. A system integrator would simply purchase "black box" object code from a third party and drop it into the system, saving valuable development time. At least, that was the theory.
In practice, however, things were not so straightforward. Third-party developers often made assumptions about how the DSP would be used in order to keep their algorithms as compact and as fast as possible. An algorithm might therefore claim all of memory, disable interrupts for long periods, and take complete control of the core. Worse, system integrators had no way to learn these assumptions in advance, because there was no uniform method for specifying an algorithm's resource requirements and performance effects.
Clearly, under such assumptions, two or more algorithms could not coexist in a multifunction system. These problems can be hard enough to untangle even with the source code in hand; system developers integrating object code were powerless to change the algorithms at all. And when the algorithms came from different third parties, as was often the case, integrators faced incompatibilities and the inevitable finger-pointing.
By the end of the 1990s it was obvious that DSP development would stall unless rules of behavior were established for algorithms. DSP vendors therefore began publishing such rules, codified as standards that third-party software developers must follow to guarantee algorithm compatibility. Although these standards are proprietary, they share the same goals and many of the same rules. They have not become industry-wide standards, partly because some rules reflect hardware details specific to individual manufacturers, and partly because, to keep pace with DSP development, vendors had to respond to immediate demand rather than hand the problem to a lengthy industry standardization process.
A model standard
One of the first such standards was TI's TMS320 DSP Algorithm Standard, also known as XDAIS. TI introduced it as a basic element of its eXpressDSP software strategy, alongside a real-time kernel, an integrated development environment (IDE), and a third-party network, signaling that algorithm standardization would play a key role in DSP software development. The TMS320 DSP Algorithm Standard is representative of DSP algorithm standards in general; in fact, it became the model for several standards introduced later.
XDAIS builds on the layered software architecture of the TMS320 DSP. Figure 1 shows how a DSP system is organized so that the algorithm proper is separated from I/O functions and from the underlying kernel run-time environment. Figure 2 shows the sequence of events required for an algorithm to run in the eXpressDSP environment.
XDAIS algorithm rules
The XDAIS rules fall into four groups, backed by a basic verification mechanism that ensures compliance with the standard.
Common-sense programming rules. This group of rules promotes the portability, predictability, and ease of use of algorithms. Most DSP systems run in a C environment, so an algorithm's top-level interface must be callable from C. An algorithm must not disturb the run-time state of the application, and its code must be reentrant so that it can run in a preemptive environment and support multiple channels. Memory shared between multiple instances, such as global variables, must be protected. All code references must be fully relocatable, and hard-coded memory addresses are forbidden, since they would interfere with other code. Because resources vary from system to system, algorithms may not access peripherals directly.
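The reentrancy rule above can be sketched in a few lines of C. This is a hypothetical one-pole smoothing filter, not code from the standard: the point is that all state lives in a per-instance object supplied by the caller, so the same code can serve many channels and survive preemption, with no static or global variables and no hard-coded addresses.

```c
/* Hypothetical smoothing filter illustrating the reentrancy rule:
 * every piece of state is held in a per-instance object, never in
 * a global, so two channels can run the same code concurrently. */
typedef struct SMOOTH_Obj {
    float history;  /* per-channel state               */
    float alpha;    /* smoothing coefficient, 0..1     */
} SMOOTH_Obj;

/* Reentrant: touches only the instance it is handed. */
float SMOOTH_apply(SMOOTH_Obj *inst, float sample)
{
    float out = inst->alpha * sample
              + (1.0f - inst->alpha) * inst->history;
    inst->history = out;
    return out;
}
```

Because the function has no hidden state, running a second instance cannot corrupt the first, which is exactly the property a preemptive multichannel framework relies on.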
Eliminating arbitrary choices. Where a job could be done in any of several ways but only one can be allowed, the standard specifies which (much as traffic regulations specify driving on the left or the right). To avoid naming conflicts, symbol names must follow the conventions of DSP/BIOS, the real-time kernel used with the TMS320 DSP. To avoid conflicts when code is ported to a different operating environment, an algorithm must be packaged in files that follow uniform naming rules. External references must resolve to approved sources, such as the C run-time support library or other eXpressDSP-compliant modules. Algorithm instances must be created and deleted through specified procedures, and they must be independently relocatable. On the C6000 platform, an algorithm must support at least little-endian byte order, and preferably both byte orders, to give system developers a choice.
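A small sketch shows the flavor of the naming and lifecycle rules. The module name "G711ENC" and vendor name "ACME" are invented for illustration; the idea is that every external symbol carries a module/vendor prefix, so object code from several third parties links without symbol clashes, and instances are created and deleted only through the published entry points.

```c
#include <stdlib.h>

/* Hypothetical vendor module: all external symbols share the
 * G711ENC_ACME_ prefix, avoiding link-time collisions with other
 * vendors' modules. */
typedef struct G711ENC_ACME_Obj {
    int frameLen;   /* samples per frame for this instance */
} G711ENC_ACME_Obj;

/* Instances come into being only via the published create call... */
G711ENC_ACME_Obj *G711ENC_ACME_create(int frameLen)
{
    G711ENC_ACME_Obj *inst = malloc(sizeof *inst);
    if (inst != NULL)
        inst->frameLen = frameLen;
    return inst;
}

/* ...and are disposed of only via the matching delete call. */
void G711ENC_ACME_delete(G711ENC_ACME_Obj *inst)
{
    free(inst);
}
```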
Resource management. Because algorithms are greedy and resources must be shared, this group of rules is the heart of the standard. Every algorithm now presents a mandatory memory-management interface, and all algorithms must negotiate for memory, either statically at design time or dynamically at run time. The rule covers external and internal memory, as well as DMA channels and other peripherals. The application, acting as the control framework, collects all the memory requests and then assigns memory to the algorithms. An algorithm may not get everything it asks for, but the framework can arbitrate sensibly among competing requests and optimize the division of system resources.
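The request-then-grant idea can be sketched as follows. The names here are illustrative, not the actual XDAIS interfaces: the algorithm never allocates memory itself; it fills in a table describing what it needs, and the application framework performs the actual allocation from whatever pools it manages.

```c
#include <stdlib.h>

/* Simplified sketch of memory negotiation between an algorithm and
 * the application framework (illustrative names, not the real API). */
typedef enum { MEM_INTERNAL, MEM_EXTERNAL } MemSpace;

typedef struct MemRec {
    size_t   size;   /* bytes requested            */
    MemSpace space;  /* preferred memory placement */
    void    *base;   /* filled in by the framework */
} MemRec;

/* Algorithm side: describe requirements, allocate nothing. */
int ALG_request(MemRec tab[])
{
    tab[0].size = 256;  tab[0].space = MEM_INTERNAL;  /* scratch */
    tab[1].size = 4096; tab[1].space = MEM_EXTERNAL;  /* history */
    return 2;           /* number of records used */
}

/* Framework side: arbitrate and grant the requests.  malloc stands
 * in for whatever pool allocators a real framework would use. */
int FRAMEWORK_grant(MemRec tab[], int n)
{
    for (int i = 0; i < n; i++) {
        tab[i].base = malloc(tab[i].size);
        if (tab[i].base == NULL)
            return -1;  /* request could not be satisfied */
    }
    return 0;
}
```

Splitting the interface this way is what lets the framework see every algorithm's demands side by side and trade them off before committing any memory.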
Uniform characterization. This group of rules helps system integrators measure algorithms and evaluate how they will fit into a system. Every compliant algorithm must publish its worst-case interrupt latency, its typical and worst-case execution times, and its program, heap, static, and stack memory requirements. An algorithm vendor can no longer, for example, conceal an interrupt latency that would let the algorithm monopolize the core for several seconds; latency must now be determined by a specified method and stated in the algorithm's technical documentation.
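The kind of uniform data sheet these rules require might look like the struct below; the field names and the comparison helper are invented for illustration, but they show how a fixed format lets an integrator compare candidate algorithms field by field against a system budget.

```c
/* Illustrative resource profile of the kind the characterization
 * rules require every compliant algorithm to publish. */
typedef struct AlgProfile {
    unsigned worstCaseInterruptLatency;  /* cycles with interrupts off */
    unsigned typicalCyclesPerFrame;      /* typical execution time     */
    unsigned worstCaseCyclesPerFrame;    /* worst-case execution time  */
    unsigned programMemBytes;            /* code size                  */
    unsigned stackBytes;                 /* stack requirement          */
    unsigned heapBytes;                  /* heap requirement           */
    unsigned staticBytes;                /* static data requirement    */
} AlgProfile;

/* With numbers in a fixed format, screening becomes mechanical,
 * e.g. rejecting any algorithm that exceeds the latency budget. */
int ALG_fitsLatencyBudget(const AlgProfile *p, unsigned budgetCycles)
{
    return p->worstCaseInterruptLatency <= budgetCycles;
}
```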
Verified eXpressDSP compliance. An algorithm developer cannot simply claim compliance with the TMS320 DSP Algorithm Standard; the developer must run TI's XDAIS compliance test tool to verify that the code follows the rules, and must also agree in writing that the standard was observed during development. Once these requirements are met, a third party may declare the algorithm eXpressDSP-compliant and use the logo shown in Figure 3 in its advertising. The compliance tool is available to third parties and DSP customers alike, so they can check software as they develop it, and system integrators can use it to make sure purchased code has not been modified since earning the eXpressDSP-compliant designation.
Development of XDAIS
When XDAIS was launched five years ago, it had fewer than 30 rules; today it has 46. The growth reflects the standard's continuing evolution, which has proceeded in a deliberate and controlled way, for the following reasons:
New hardware features. Some rules were added to cover advances in silicon. For example, as advanced DMA functions were integrated on chip, XDAIS added new rules covering the allocation of DMA channels. In the future, XDAIS may also include rules for using hardware accelerators as shared resources.
Performance optimization. The DMA rules have since been revised to improve performance, and they illustrate another area of change in XDAIS: with the major conflicts resolved by the early rules, some of the newer guidelines aim instead to help developers get the most out of their systems.
New application areas. The initial XDAIS guidelines were aimed mainly at single-function DSPs running streaming applications such as voice, audio, and video. Today's multifunction systems, however, must often handle bursty data as well, such as IP packets or the frame-like structures of more complex modem standards. The kernel and system requirements of these applications sometimes differ from those of streaming applications, so the XDAIS rules must accommodate both kinds of data throughput.
One thing that has not changed is the need to keep overhead low. Experience shows that DSP customers and third parties will accept a performance and memory penalty of no more than one to two percent. That would be a small overhead for a general-purpose microprocessor, which can drive control tasks through interrupts and is not tightly constrained by memory efficiency. On a DSP, however, every MIP of performance is usually critical, so TI has worked hard to keep the XDAIS overhead within these limits.
Acceptance of algorithm standards
Although we had long advised third parties on software practices, some of them were initially skeptical that an algorithm standard could benefit them. Many third parties regarded algorithm development as entirely their own business and saw DSP-vendor participation as an intrusion. Bringing existing algorithms into line with a new standard also inevitably meant some rework, which third parties resisted as unnecessary, and there were objections to the overhead penalty associated with the standard.
In contrast to the third parties, DSP system integrators welcomed the standard almost immediately. Some large DSP developers had already been working to establish rules of their own, and the arrival of a vendor standard saved them the effort. System integrators also recognized that the small overhead of an algorithm standard spares them enormous amounts of time and trouble, savings that far outweigh the memory and performance trade-offs they must accept.
Once integrators became familiar with the standards, they began to demand compliant algorithms, so that even the most reluctant third parties had to follow suit. To answer the objection about extra development work, tools have emerged to help third parties develop compliant algorithms; the Hyperception component wizard shown in Figure 4 is one example, helping developers create XDAIS algorithms.
Today the standards are widely accepted, and even the most reluctant algorithm developers agree that standardization has greatly expanded the business opportunities for selling software. Designing to a standard also minimizes support needs, saving third parties money. The TMS320 DSP Algorithm Standard shows how successful such a standard can be: 110 third-party developers now offer eXpressDSP-compliant algorithms, and the number keeps growing. Other DSP manufacturers have also recognized the need for algorithm standards and offer similar ones for their own platforms and third-party algorithms. Because these standards address the same basic issues of interoperable programming, their rules resemble in many respects those of the first published standard, the TMS320 DSP Algorithm Standard.
An emerging industry
DSP algorithm standards have in fact given rise to an unprecedented international industry. Today a system integrator in one region can purchase a DSP algorithm over the web from a third party in another region; as long as the algorithm is certified as standard-compliant, the system developer knows the code will behave properly within the application framework. For DSP system integrators, compliant algorithms simplify the evaluation and integration of third-party object code, streamlining development and shortening time to market.
With algorithms finally under control, the industry is asking about the advantages of standardizing other software components, such as libraries, drivers, kernels, and communication stacks. Even as they refine their existing algorithm standards, DSP vendors are considering extending standardization to these components.
As the DSP industry continues to develop the component-software model, the value of algorithm standards becomes ever more obvious. A standard provides a set of rules which, by design, ensure that components from different vendors can interoperate in any application. Code portability and reusability are enhanced, algorithms become more straightforward to measure and evaluate, and they are easier to integrate into a system. Overall system development becomes faster and more flexible, bringing more robust and cheaper products to end users.