Hi.
My code contains various computations, some of which use constants defined as fractions. For example:
#define CONST_1 (9 / 2)
void Func1(void)
{
var1 = var2 * CONST_1 + 30;
}
Does the compiler compute the CONST_1 fraction once and use the result as the multiplication constant, or does the CPU execute the full computation (i.e. multiplying by 9 and dividing by 2) every time the function is called?
Thanks.
Neither 😉
The #define makes a text replacement only. So your line
var1 = var2 * CONST_1 + 30;
becomes
var1 = var2 * (9 / 2) + 30;
The compiler then "sees" that (9 / 2) is a constant expression and folds it at compile time. Note that this is integer division, so it evaluates to 4, not 4.5, and the generated code is equivalent to
var1 = var2 * 4 + 30;
Bob