diff: refactor COLOR_DIFF from a flag into an int

This lets us store more than just a bit flag for whether we
want color; we can also store whether we want automatic
colors. That is useful for moving the automatic-color
decision closer to the point of use.

This mostly just involves replacing DIFF_OPT_* calls with
direct manipulation of the new int field. The biggest
exception is that calls to DIFF_OPT_TST must now check
"o->use_color > 0", which lets an "unknown" value (i.e., the
default of -1) stay at "no color". In the previous code, a
value of -1 was not propagated at all.
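
As a rough sketch of the convention (illustrative only, not
code from this commit), the field now holds a tri-state
value, and a consumer tests it like this:

	/*
	 * Illustrative only: use_color is -1 (unset/auto), 0 (off),
	 * or 1 (on). Testing "> 0" makes the unset default behave
	 * as "no color".
	 */
	static const char *diff_colored(int use_color, const char *color)
	{
		return use_color > 0 ? color : "";
	}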

Signed-off-by: Jeff King <peff@peff.net>
Signed-off-by: Junio C Hamano <gitster@pobox.com>
commit f1c9626105 (parent 2e6c012e10)
Author: Jeff King <peff@peff.net>
Date: 2011-08-17 22:03:12 -07:00
Committed by: Junio C Hamano
7 changed files with 29 additions and 34 deletions


@@ -31,7 +31,7 @@ static char decoration_colors[][COLOR_MAXLEN] = {
 static const char *decorate_get_color(int decorate_use_color, enum decoration_type ix)
 {
-	if (decorate_use_color)
+	if (decorate_use_color > 0)
 		return decoration_colors[ix];
 	return "";
 }
@@ -77,7 +77,7 @@ int parse_decorate_color_config(const char *var, const int ofs, const char *value)
  * for showing the commit sha1, use the same check for --decorate
  */
 #define decorate_get_color_opt(o, ix) \
-	decorate_get_color(DIFF_OPT_TST((o), COLOR_DIFF), ix)
+	decorate_get_color((o)->use_color, ix)
 
 static void add_name_decoration(enum decoration_type type, const char *name, struct object *obj)
 {
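
With the raw int stored in the options struct, the macro now
passes the tri-state value through unchanged, and a consumer
can defer the "auto" decision until output actually happens.
A hedged sketch of that idea (this helper is an illustration,
not code from this commit; git's later want_color() helper
plays this role):

	#include <unistd.h>

	/*
	 * Illustrative only: resolve a tri-state color setting at
	 * the point of use. -1 means "auto", so decide based on
	 * whether stdout is a terminal; 0 and 1 pass through.
	 */
	static int resolve_auto_color(int use_color)
	{
		if (use_color < 0)
			return isatty(1);
		return use_color;
	}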