There are several sorts of "macro" systems:
Macro programming systems were first created in the 1960s as a way to ease production of assembly language programs.
They allowed the gentle programmer to combine "raw" assembly language with "pseudo operators" that the macro processor would then expand and resolve, transforming it all into "raw" assembly language. For instance, rather than having to assemble loops by hand, you might use the LOOP/ENDLOOP pseudo operations that would expand into a combination of tests, jumps, and labels, eliminating a lot of "bookkeeping."
The common perception that assembly language code is almost impossible to work with largely results from people only being exposed to "raw" assembler, and realizing how much bookkeeping there is to do. With a powerful macro assembler to help, the task need not be so daunting.
There are also systems that provide a shorthand whereby a set of commands in an interactive system could be associated with a "macro."
Such systems include spreadsheets, where keyboard commands could be stored in cells and executed in sequence, and text editors like Emacs, where one may assemble complex operations together into "macros" that may be executed multiple times.
Indeed, EMACS stands for "Editor MACroS," and was originally constructed as a set of macros in the powerful-but-arcane TECO macro language.
A third approach involves "substitution systems" where macros control some parametric scheme for rewriting text.
Languages like m4 are among the best understood examples of this approach. "Rewriting" may be used as an implementation scheme for the first approach, and is a powerful scheme for code transformation.
For the most part, the "keyboard shorthand" approach has expanded over time into schemes to access some sort of embedded scripting language; at that point it no longer really represents macro programming.
Implementations of macro programming systems include:
Many of the more sophisticated control structures in Lisp are constructed via macros that generate combinations of the simpler control structures, essentially by rewriting trees of Lisp code into a new derived form.
Sophisticated users of Common Lisp often create macros to represent customized control and data structures.
Scheme offers a template-based R5RS "hygienic macros" facility. There are two particularly notable aspects of the hygienic macro approach:
In being "hygienic," the macro system automatically generates private symbols to bind to the macro's parameters. This avoids the problem of "variable capture," where components of the macro may unexpectedly share names with variables in your application. This is also described as "referential transparency."
The other interesting thing about R5RS macros is that they use a declarative pattern matching language rather than having the macro use Lisp code to search for what it should replace.
There is a port of this macro approach to Common Lisp; see Scheme Macros for Common Lisp.
cpp, the C preprocessor
It is invoked on a text file, and will copy input to output, expanding macros as it goes along.
M4 is often used as a front end for compilers for other languages, much like cpp, but is also used in autoconf, and has been known to be used to generate material for HTML web pages. (See Current Uses of M4 for examples of this.)
The TeX document processor is a classical application of macrology, where macros expand themselves, ultimately into TeX primitives that generate DVI output.
groff provides a macro rewriting system for the construction of extensions to control document layout. It is not considered particularly easy to use.
Sendmail uses a macro rewriting system to rewrite email addresses.
TRAC is an interpretive, recursive, string-based, macroprocessing language with no compile step.
It is used to build syntax extensions or revisions much as one might use Lisp macros to build extended control structures.
A tool for generating program files with repeated text and varied substitutions. It is typically used to maintain code for processing program options, where each option appears variously as a character to be parsed, code to process the option, variables to store data about the option, and documentation for the option.
ML/I was a macro system initially implemented in the mid-1960s on PDP-7 computers, and later on many others. (See The ML/I Macro Processor (CACM, Volume 10))
It included a scheme for bootstrapping ML/I to a different machine by mapping a machine-independent logic, called L, to a target system's machine language.
This idea was later revisited (not necessarily conscious of ML/I) by Dr. C. H. Ting, who built a version of FORTH called eForth which set up a "core" set of FORTH words that were to be implemented in assembly language, with the rest of the language to be built on top of that.
STAGE2 was a macro processor dating back to 1969, by W.M. Waite, used as a "Mobile Programming System."
Its approach involves text transformations, a bit like m4. A notable distinction between it and m4 is that STAGE2 supported multi-line and nested constructs.
STAGE2 may still be in use for data conversions by one Chris Greaves...
gema is a general macro processor conceived as an extension to STAGE2; it bears some resemblance to awk.
Like STAGE2, a distinctive feature of gema is that its constructs are allowed to span lines as well as to nest.
BLISS is a systems programming language somewhat similar to C; Digital used it to write VMS.
Thanks are in order to Parzival Herzog for pointing out some of the older history.
A Taxonomy of meta-programming systems
In a meta-programming system meta-programs manipulate object-programs. Meta-programs may construct object-programs, combine object-program fragments into larger object-programs, observe the structure and other properties of object-programs, and execute object-programs to obtain their values.
Examples of use of this, with differing behaviour models, would include:
Macro rewriting languages like M4 or the C preprocessor, where strings are transformed to generate other strings;
Lisp, where an input stream (perhaps a list) is transformed into another list before being compiled;
MetaML, where programs are transformed into a parse tree that may be further transformed into a further-annotated parse tree.
Transformations for ML-like code would include partial evaluation and annotation based on type analysis.
An essay on the notion that you shouldn't just drop everything into /dev/null and start over, but rather consider scrubbing the code.
The author had some "creaky" ASP code for a bug tracker web application. Rather than rewriting the whole thing, he went through and did relatively minor restructurings of this and that.
No change involved introducing actual new features.
Every time code was checked into the revision control system, the package was expected to continue to work properly.
All that was done was to apply logical transformations: textual changes not expected to alter code behaviour.
The result was much cleaner code that was well understood again. After the "scrubbing," it made sense to start looking at enhancements, and they could be made more readily as evolutionary changes, rather than having the new Generalissimo imprison the old one and establish a new regime of ugly code.