Linux gcc attribute

Source: Internet
Author: User
Tags: deprecated, emit

__attribute__ ((error ("message"))) Declares that calling the marked function is an error.
__attribute__ ((warning ("message"))) Declares that calling the marked function is suspect and should emit a warning.
__attribute__ ((deprecated)) Declares that using the marked function, type, or variable is deprecated and will emit a warning.
__attribute__ ((const)) Declares that the marked function is a pure function that only examines its arguments and returns a value, without examining or changing anything else.
__attribute__ ((pure)) Declares that the marked function is a pure function with no side effects (although it may examine global state).
__attribute__ ((nonnull (N1, ...))) Declares that the specified arguments (one-based indexes, or all pointer arguments if no indexes are listed) must only be passed non-null pointers.
__attribute__ ((noreturn)) Declares that the marked function will not return (although it may throw).
__attribute__ ((hot)) Hints that the marked function is "hot" and should be optimized more aggressively and/or placed near other "hot" functions (for cache locality).
__attribute__ ((cold)) Hints that the marked function is "cold" and should be optimized for size, predicted as unlikely for branch prediction, and/or placed near other "cold" functions (so hot functions get improved cache locality).
__attribute__ ((warn_unused_result)) Declares that the function's return value is important and the compiler should warn if it is ignored.
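
A minimal sketch showing several of these attributes in use; the function names are invented for illustration, and with GCC the indicated warnings appear at compile time:

    #include <stdio.h>
    #include <stdlib.h>

    __attribute__ ((deprecated))
    int old_api(void) { return 0; }              /* every call site gets a warning */

    __attribute__ ((const))
    int square(int x) { return x * x; }          /* reads nothing but its argument */

    __attribute__ ((nonnull (1)))
    size_t my_strlen(const char *s)              /* argument 1 must not be NULL */
    {
        size_t n = 0;
        while (s[n]) n++;
        return n;
    }

    __attribute__ ((noreturn))
    void fatal(const char *msg)                  /* never returns to the caller */
    {
        fprintf(stderr, "%s\n", msg);
        exit(EXIT_FAILURE);
    }

    __attribute__ ((warn_unused_result))
    int must_check(void) { return -1; }          /* warns if the result is discarded */

    int main(void)
    {
        old_api();                               /* -Wdeprecated-declarations warning */
        must_check();                            /* -Wunused-result warning */
        printf("%d %zu\n", square(3), my_strlen("abc"));
        return 0;
    }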

Reference: http://blog.csdn.net/lcw_202/article/details/6226217

__attribute__ ((hot)) | __attribute__ ((cold))

__attribute__ ((hot))
This extension is placed before a function to indicate that the function will be called frequently, so that the compiler and linker optimize it more aggressively and/or place it in a block together with other equally hot functions, which improves cache locality.
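
As an illustration (a sketch with invented names), a small helper that runs on every iteration of an inner loop might be marked hot; GCC then optimizes it more aggressively and typically groups it with other hot code in the final image:

    /* Hypothetical inner-loop helper marked hot. */
    __attribute__ ((hot)) static int
    checksum_step(int acc, unsigned char byte)
    {
        return (acc * 31) ^ byte;                /* executed once per input byte */
    }

    int checksum(const unsigned char *buf, unsigned long len)
    {
        int acc = 0;
        for (unsigned long i = 0; i < len; i++)
            acc = checksum_step(acc, buf[i]);
        return acc;
    }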

__attribute__ ((cold))

Hints that the marked function is "cold" and should be optimized for size, predicted as unlikely for branch prediction, and/or placed near other "cold" functions (so hot functions can have improved cache locality).
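
For example (a sketch with invented names), error-handling paths are a natural fit for cold; GCC optimizes such a function for size and also treats branches that lead into it as unlikely:

    #include <stdio.h>
    #include <stdlib.h>

    /* Hypothetical error path marked cold (and noreturn): optimized for size,
       kept away from hot code, and branches into it are treated as unlikely. */
    __attribute__ ((cold, noreturn)) static void
    die(const char *msg)
    {
        fprintf(stderr, "fatal: %s\n", msg);
        exit(EXIT_FAILURE);
    }

    int parse_positive(int value)
    {
        if (value <= 0)              /* predicted not taken, because die() is cold */
            die("value must be positive");
        return value;
    }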
This extension is placed before a function to indicate that the function is rarely called, so that the branch prediction mechanism does not prefetch it, and so that it is placed together with other equally unpopular (cold) functions; cold code then tends to stay out of the cache, leaving the cache for hotter instructions.

Branch prediction: the concepts of first-level cache, second-level cache, and instruction prefetching are introduced at http://www.groad.net/bbs/read.php?tid-1455.html. Although multi-level caching helps speed up the execution of program logic, it still has trouble with programs that "jump around". If a program takes many different logical branches, it is almost impossible for the caches to keep up with every branch, and the result is that instructions and data end up being fetched from main memory at the last minute.

To solve this problem, processors on the IA-32 platform introduced the concept of branch prediction.

Branch prediction uses specialized algorithms to try to predict which instructions will be needed next at a branch in the program.

Specialized statistical algorithms and analyses are used to determine the most likely execution path through the instructions, and the instructions on that path are prefetched and loaded into the cache.

Three branch prediction techniques (Pentium 4):
        Dynamic data flow analysis
        Deep branch prediction
        Speculative execution

Deep branch prediction enables the processor to decode instructions across multiple branches in the program. Statistical algorithms are again used to predict the most likely execution path through each branch. Although this technique helps, it is not foolproof and does not completely solve the jumping problem.

Dynamic data flow analysis performs real-time statistical analysis of the data flowing through the processor. Instructions that the program flow is predicted to pass through, but that the instruction pointer has not yet reached, are handed to the out-of-order execution engine. In addition, while the processor is waiting for the data needed by one instruction, any other instruction that is ready to execute is processed.

Speculative execution lets the processor determine which instructions beyond a branch are not immediately required but are likely to be needed later ("long-distance" instructions), and process them ahead of time, again using the out-of-order execution engine.
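
The practical effect of branch prediction can be seen with a small experiment: a branch whose outcome follows a regular pattern is predicted well, while the same branch over random data is mispredicted often and runs noticeably slower. A rough sketch (exact timings depend on the machine, and an optimizing compiler may replace the branch with branchless code, so the difference is clearest at low optimization levels):

    #include <stdio.h>
    #include <stdlib.h>
    #include <time.h>

    #define N 10000000

    /* Sum only the "large" elements; the if () is the branch under test. */
    static long sum_large(const int *a, int n)
    {
        long s = 0;
        for (int i = 0; i < n; i++)
            if (a[i] >= 128)         /* easy or hard to predict, depending on the data */
                s += a[i];
        return s;
    }

    int main(void)
    {
        int *data = malloc(N * sizeof *data);
        if (!data)
            return 1;

        srand(42);
        for (int i = 0; i < N; i++)
            data[i] = rand() % 256;  /* random values: branch is hard to predict */
        clock_t t0 = clock();
        long r1 = sum_large(data, N);
        clock_t t1 = clock();

        for (int i = 0; i < N; i++)
            data[i] = i % 256;       /* regular pattern: branch is easy to predict */
        clock_t t2 = clock();
        long r2 = sum_large(data, N);
        clock_t t3 = clock();

        printf("random : sum=%ld time=%.3fs\n", r1, (double)(t1 - t0) / CLOCKS_PER_SEC);
        printf("pattern: sum=%ld time=%.3fs\n", r2, (double)(t3 - t2) / CLOCKS_PER_SEC);
        free(data);
        return 0;
    }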
