A detailed comparison of typedef and #define in the C language

Mondo Technology Updated on 2024-02-12

My nephew spent the winter break studying data structures ahead of the course he will take in the second semester of his freshman year. While working through the material, he noticed that the C keyword typedef is used everywhere.

Whatever the data structure, whether linked lists, stacks, queues, or anything else, the node and element types are wrapped with typedef for convenience, which makes the code both easier to write and easier to understand.

But since he had only just finished his first-semester C course, typedef was still fairly unfamiliar to him. In the past two days he told me on WeChat that he had made a discovery: define can also rename types, and #define myint int seems to have the same effect as typedef int myint;.

I felt it was necessary to talk him through this systematically and correct the misconception early, hence this article. define and typedef look very similar, but they are fundamentally different.

The main use of define is to define constants, with an emphasis on specific values. For example:

#define true 1

#define false 0

#define pi 3.14

#define null 0

Typedef can only be used to define a new alias for an existing data type. For example:

typedef int INT;

typedef int myint;

typedef int integer;

But define can also be used to define aliases for a data type, for example:

#define INT int

#define myint int

#define integer int

But typedef cannot define constants (specific values). For example:

typedef 1 true;//error

typedef 0 false;//error

typedef 3.14 pi;//error

typedef 0 null;//error

This is certainly wrong.

When we use typedef for type definitions, the compiler checks our work during compilation, reducing the probability of bugs in the program. For example:

typedef int INT;

typedef unsigned int INT;

typedef integer myint;

During the compilation phase the compiler will tell us that INT is defined twice with conflicting types, and that the type integer does not exist. define, on the other hand, does no such type checking; it simply substitutes text (at most you get a "macro redefined" warning). For example:

#define INT int

#define INT unsigned int

#define myint integer

#define float int

Before compilation, in the preprocessing phase, the substitution is simply performed: text replacement and nothing more. No type checking is done, so errors the compiler cannot see slip through. The code compiles, but the program misbehaves when it runs, and those are always the most dangerous bugs.

define can also create a name for an expression, which typedef cannot do at all; typedef only names types. For example:

#define max(x,y) (((x)>(y))?(x):(y))

typedef (((x)>(y))?(x):(y)) max(x,y); //error!

typedef can define new aliases for custom and compound data types, which define cannot do reliably. For example:

typedef struct _point {
    int x;
    int y;
} point;

point pt = {1, 2};

There is no problem here: the compiler treats point as a genuine alias for struct _point. Now try the macro route:

#define point struct _point

point point = {1, 2};

The preprocessor replaces every occurrence of point, producing struct _point struct _point = {1, 2}; which is patently wrong.

typedef aliases obey C's normal scoping rules, while define "aliases" have no block scope at all. For example:

void func1() {

#define MYINT int /* in effect from this line to the end of the file, or until #undef */

typedef float myfloat; /* scoped to func1 only */

}

void func2() {

MYINT c = 3; /* fine: the preprocessor ignores the braces */

myfloat d = 4.0f; /* error: myfloat is not visible outside func1 */

}

A typedef written inside a function body is local to that function, and one written at file scope is visible from its point of declaration to the end of the file. A define knows nothing about braces: it takes effect from the line that defines it and stays in effect until a matching #undef or the end of the translation unit, even across function boundaries.

define also handles compound types such as pointers badly. Because it is pure text substitution, the result can silently differ from what we intended. For example:

#define pint int*

pint x,y;

After preprocessing, this becomes:

int* x,y;

Clearly not our intention: only x is a pointer, while y is a plain int. Here is what we actually want:

typedef int* pint;

pint x,y;

The compiler treats this as equivalent to:

int* x;

int* y;

The handling is correct.

Another example: typedef int array[3];

array array1,array2;

In the compilation phase this is, in effect, "replaced" with:

int array1[3];

int array2[3];

You can't handle it with define.

In fact, there is much more worth saying about define and typedef, especially the many common "pitfalls" of define, such as macro arguments inside nested expressions, which beginners find particularly easy to get wrong. I will publish articles on those later.

These past two days I have also been preparing a series of articles on how to easily work with the Windows console in C, so stay tuned.

Duan Yu, written in Hefei, February 11.
