Lisp (programming language)
| Lisp | |
|---|---|
| Paradigm | Multi-paradigm: functional, procedural, reflective, meta |
| Designed by | John McCarthy |
| Developer | Steve Russell, Timothy P. Hart, Mike Levin |
| First appeared | 1960 |
| Typing discipline | Dynamic, strong |
| Influenced by | Information Processing Language (IPL) |
Lisp (historically LISP, an abbreviation of "list processing") is a family of programming languages with a long history and a distinctive, fully parenthesized prefix notation.[3] Originally specified in the late 1950s, it is the second-oldest high-level programming language still in common use, after Fortran.[4][5] Lisp has changed since its early days, and many dialects have existed over its history. Today, the best-known general-purpose Lisp dialects are Common Lisp, Scheme, Racket, and Clojure.[6][7][8]
Lisp was originally created as a practical mathematical notation for computer programs, influenced by (though not originally derived from)[9] the notation of Alonzo Church's lambda calculus. It quickly became a favored programming language for artificial intelligence (AI) research.[10] As one of the earliest programming languages, Lisp pioneered many ideas in computer science, including tree data structures, automatic storage management, dynamic typing, conditionals, higher-order functions, recursion, the self-hosting compiler,[11] and the read–eval–print loop.[12]
The name LISP derives from "LISt Processor".[13] Linked lists are one of Lisp's major data structures, and Lisp source code is made of lists. Thus, Lisp programs can manipulate source code as a data structure, giving rise to the macro systems that allow programmers to create new syntax or new domain-specific languages embedded in Lisp.
The interchangeability of code and data gives Lisp its instantly recognizable syntax. All program code is written as s-expressions, or parenthesized lists. A function call or syntactic form is written as a list with the function or operator's name first, and the arguments following; for instance, a function f that takes three arguments would be called as (f arg1 arg2 arg3).
History
John McCarthy began developing Lisp in 1958 while he was at the Massachusetts Institute of Technology (MIT). He was motivated by a desire to create an AI programming language that would work on the IBM 704, as he believed that "IBM looked like a good bet to pursue Artificial Intelligence research vigorously."[14] He was inspired by Information Processing Language, which was also based on list processing, but did not use it because it was designed for different hardware and he found an algebraic language more appealing.[14] Due to these factors, he consulted on the design of the Fortran List Processing Language, which was implemented as a Fortran library. However, he was dissatisfied with it because it did not support recursion or a modern if-then-else statement (which was a new concept when Lisp was first introduced).[note 1][14]
McCarthy's original notation used bracketed "M-expressions" that would be translated into S-expressions. As an example, the M-expression car[cons[A,B]] is equivalent to the S-expression (car (cons A B)). Once Lisp was implemented, programmers rapidly chose to use S-expressions, and M-expressions were abandoned.[14] M-expressions surfaced again in short-lived efforts such as MLisp[15] by Horace Enea and CGOL by Vaughan Pratt.
Lisp was first implemented by Steve Russell on an IBM 704 computer using punched cards.[16] Russell was working for McCarthy at the time and realized (to McCarthy's surprise) that the Lisp eval function could be implemented in machine code.
According to McCarthy[17]
Steve Russell said, look, why don't I program this eval ... and I said to him, ho, ho, you're confusing theory with practice, this eval is intended for reading, not for computing. But he went ahead and did it. That is, he compiled the eval in my paper into IBM 704 machine code, fixing bugs, and then advertised this as a Lisp interpreter, which it certainly was. So at that point Lisp had essentially the form that it has today ...
The result was a working Lisp interpreter which could be used to run Lisp programs, or more properly, "evaluate Lisp expressions".
Two assembly language macros for the IBM 704 became the primitive operations for decomposing lists: car (Contents of the Address part of Register number) and cdr (Contents of the Decrement part of Register number),[18] where "register" refers to registers of the computer's central processing unit (CPU). Lisp dialects still use car and cdr (/kɑːr/ and /ˈkʊdər/) for the operations that return the first item in a list and the rest of the list, respectively.
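As a quick illustration of these two operations, a minimal Common Lisp sketch:

```lisp
;; car returns the first element of a list; cdr returns the rest.
(car '(a b c))  ; => A
(cdr '(a b c))  ; => (B C)
;; Compositions such as cadr ("car of the cdr") reach later elements.
(cadr '(a b c)) ; => B
```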
McCarthy published Lisp's design in a paper in Communications of the ACM on April 1, 1960, entitled "Recursive Functions of Symbolic Expressions and Their Computation by Machine, Part I"[19].[20] He showed that with a few simple operators and a notation for anonymous functions borrowed from Church, one can build a Turing-complete language for algorithms.
The first complete Lisp compiler, written in Lisp, was implemented in 1962 by Tim Hart and Mike Levin at MIT, and could be compiled by simply having an existing LISP interpreter interpret the compiler code, producing machine code output able to be executed at a 40-fold improvement in speed over that of the interpreter.[21] This compiler introduced the Lisp model of incremental compilation, in which compiled and interpreted functions can intermix freely. The language used in Hart and Levin's memo is much closer to modern Lisp style than McCarthy's earlier code.
Garbage collection routines were developed by MIT graduate student Daniel Edwards, prior to 1962.[22]
During the 1980s and 1990s, a great effort was made to unify the work on new Lisp dialects (mostly successors to Maclisp such as ZetaLisp and NIL (New Implementation of Lisp)) into a single language. The new language, Common Lisp, was somewhat compatible with the dialects it replaced (the book Common Lisp the Language notes the compatibility of various constructs). In 1994, ANSI published the Common Lisp standard, "ANSI X3.226-1994 Information Technology Programming Language Common Lisp".
Timeline
Timeline of major dialects, 1958–2020 (the original chart's year spans are not reproducible here; dialects are listed in order of first appearance):

- LISP 1, 1.5, LISP 2 (abandoned)
- Maclisp
- Interlisp
- MDL
- Lisp Machine Lisp
- Scheme (later standardized as R5RS, R6RS, and R7RS small)
- NIL
- ZIL (Zork Implementation Language)
- Franz Lisp
- muLisp
- Common Lisp (later the ANSI standard)
- Le Lisp
- MIT Scheme
- XLISP
- T
- Chez Scheme
- Emacs Lisp
- AutoLISP
- PicoLisp
- Gambit
- EuLisp
- ISLISP
- OpenLisp
- PLT Scheme (later Racket)
- newLISP
- GNU Guile
- Visual LISP
- Clojure
- Arc
- LFE
- Hy
Connection to artificial intelligence
Since inception, Lisp was closely connected with the artificial intelligence research community, especially on PDP-10[23] systems. Lisp was used as the implementation of the language Micro Planner, which was used in the famous AI system SHRDLU. In the 1970s, as AI research spawned commercial offshoots, the performance of existing Lisp systems became a growing issue, as programmers needed to be familiar with the performance ramifications of the various techniques and choices involved in the implementation of Lisp.[24]
Genealogy and variants
Over its sixty-year history, Lisp has spawned many variations on the core theme of an S-expression language. Some of these variations have been standardized and implemented by different groups with different priorities (for example, both Common Lisp and Scheme have multiple implementations). However, in other cases a software project defines a Lisp without a standard and there is no clear distinction between the dialect and the implementation (for example, Clojure and Emacs Lisp fall into this category).
Differences between dialects (and/or implementations) may be quite visible—for instance, Common Lisp uses the keyword defun to name a function, but Scheme uses define.[25] Within a dialect that is standardized, conforming implementations support the same core language but differ in extensions and libraries. This can also produce quite visible departures from the base language; for instance, Guile (an implementation of Scheme) uses define* to create functions with default arguments and/or keyword arguments, neither of which is standardized.
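To make the defun/define contrast concrete, a minimal sketch (the function name square is illustrative):

```lisp
;; Common Lisp names a function with defun:
(defun square (x) (* x x))
;; The Scheme equivalent would be: (define (square x) (* x x))
(square 5) ; => 25
```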
Historically significant dialects
- LISP 1[26] – First implementation.
- LISP 1.5[27] – First widely distributed version, developed by McCarthy and others at MIT. So named because it contained several improvements on the original "LISP 1" interpreter, but was not a major restructuring as the planned LISP 2 would be.
- Stanford LISP 1.6[28] – A successor to LISP 1.5 developed at the Stanford AI Lab, and widely distributed to PDP-10 systems running the TOPS-10 operating system. It was rendered obsolete by Maclisp and InterLisp.
- Maclisp[29] – developed for MIT's Project MAC, MACLISP is a direct descendant of LISP 1.5. It ran on the PDP-10 and Multics systems. MACLISP would later come to be called Maclisp, and is often referred to as MacLisp. The "MAC" in MACLISP is unrelated to Apple's Macintosh or McCarthy.
- Interlisp[30] – developed at BBN Technologies for PDP-10 systems running the TENEX operating system, later adopted as a "West coast" Lisp for the Xerox Lisp machines as InterLisp-D. A small version called "InterLISP 65" was published for the MOS Technology 6502-based Atari 8-bit computers. Maclisp and InterLisp were strong competitors.
- Franz Lisp – originally a University of California, Berkeley project; later developed by Franz Inc. The name is a humorous deformation of the name "Franz Liszt", and does not refer to Allegro Common Lisp, the Common Lisp dialect sold by Franz Inc. in more recent years.
- muLISP – initially developed by Albert D. Rich and David Stoutemyer for small microcomputer systems. Commercially available in 1979, it ran on CP/M systems with only 64 KB of RAM and was later ported to MS-DOS. Development of the MS-DOS version ended in 1995. The mathematical software Derive was written in muLISP for MS-DOS and later for Windows, until 2007.
- XLISP, which AutoLISP was based on.
- Standard Lisp and Portable Standard Lisp were widely used and ported, especially with the Computer Algebra System REDUCE.
- ZetaLisp, also termed Lisp Machine Lisp – used on the Lisp machines, direct descendant of Maclisp. ZetaLisp had a big influence on Common Lisp.
- LeLisp is a French Lisp dialect. One of the first Interface Builders (called SOS Interface[31]) was written in LeLisp.
- Scheme (1975).[32]
- Common Lisp (1984), as described by Common Lisp the Language – a consolidation of several divergent attempts (ZetaLisp, Spice Lisp, NIL, and S-1 Lisp) to create successor dialects[33] to Maclisp, with substantive influences from the Scheme dialect as well. This version of Common Lisp was available for wide-ranging platforms and was accepted by many as a de facto standard[34] until the publication of ANSI Common Lisp (ANSI X3.226-1994). Among the most widespread sub-dialects of Common Lisp are Steel Bank Common Lisp (SBCL), CMU Common Lisp (CMU-CL), Clozure OpenMCL (not to be confused with Clojure!), GNU CLisp, and later versions of Franz Lisp; all of them adhere to the later ANSI CL standard (see below).
- Dylan was in its first version a mix of Scheme with the Common Lisp Object System.
- EuLisp – attempt to develop a new efficient and cleaned-up Lisp.
- ISLISP – attempt to develop a new efficient and cleaned-up Lisp. Standardized as ISO/IEC 13816:1997[35] and later revised as ISO/IEC 13816:2007:[36] Information technology – Programming languages, their environments and system software interfaces – Programming language ISLISP.
- IEEE Scheme – IEEE standard, 1178–1990 (R1995).
- ANSI Common Lisp – an American National Standards Institute (ANSI) standard for Common Lisp, created by subcommittee X3J13, chartered[37] to begin with Common Lisp: The Language as a base document and to work through a public consensus process to find solutions to shared issues of portability of programs and compatibility of Common Lisp implementations. Although formally an American standard, ANSI Common Lisp has been implemented, sold, and used, and has remained influential, worldwide.
- ACL2 or "A Computational Logic for Applicative Common Lisp", an applicative (side-effect free) variant of Common Lisp. ACL2 is both a programming language that can model computer systems and a tool for proving properties of those models.
- Clojure, a recent dialect of Lisp which compiles to the Java virtual machine and has a particular focus on concurrency.
- Game Oriented Assembly Lisp (or GOAL) is a video game programming language developed by Andy Gavin at Naughty Dog. It was written using Allegro Common Lisp and used in the development of the entire Jak and Daxter series of games developed by Naughty Dog.
2000 to present
After having declined somewhat in the 1990s, Lisp has experienced a resurgence of interest after 2000. Most new activity has been focused around implementations of Common Lisp, Scheme, Emacs Lisp, Clojure, and Racket, and includes development of new portable libraries and applications.
Many new Lisp programmers were inspired by writers such as Paul Graham and Eric S. Raymond to pursue a language others considered antiquated. New Lisp programmers often describe the language as an eye-opening experience and claim to be substantially more productive than in other languages.[38] This increase in awareness may be contrasted to the "AI winter" and Lisp's brief gain in the mid-1990s.[39]
As of 2010[update], there were eleven actively maintained Common Lisp implementations.[40]
The open source community has created new supporting infrastructure. CLiki is a wiki that collects Common Lisp related information, and the Common Lisp directory lists resources. #lisp is a popular IRC channel that allows the sharing and commenting of code snippets (with support from lisppaste, an IRC bot written in Lisp). Planet Lisp[41] collects the contents of various Lisp-related blogs, LispForum[42] hosts discussion of Lisp topics, Lispjobs[43] is a service for announcing job offers, and Weekly Lisp News provides a weekly news service. Common-lisp.net is a hosting site for open source Common Lisp projects, and Quicklisp[44] is a library manager for Common Lisp.
Fifty years of Lisp (1958–2008) was celebrated at LISP50@OOPSLA.[45] There are regular local user meetings in Boston, Vancouver, and Hamburg. Other events include the European Common Lisp Meeting, the European Lisp Symposium and an International Lisp Conference.
The Scheme community actively maintains over twenty implementations. Several significant new implementations (Chicken, Gambit, Gauche, Ikarus, Larceny, Ypsilon) were developed in the 2000s. The Revised⁵ Report on the Algorithmic Language Scheme[46] (R5RS) standard was widely accepted in the Scheme community. The Scheme Requests for Implementation process has created many quasi-standard libraries and extensions for Scheme. User communities of individual Scheme implementations continue to grow. A new language standardization process started in 2003 and led to the R6RS Scheme standard in 2007. Academic use of Scheme for teaching computer science has declined somewhat. Some universities are no longer using Scheme in their introductory computer science courses;[47][48] MIT now uses Python instead of Scheme for its undergraduate computer science program and MITx massive open online course.[49][50]
There are several new dialects of Lisp: Arc, Hy, Nu, Liskell, and LFE (Lisp Flavored Erlang). The parser for Julia is implemented in Femtolisp, a dialect of Scheme (Julia is inspired by Scheme, which in turn is a Lisp dialect).
In October 2019, Paul Graham released a specification for Bel, "a new dialect of Lisp."
Major dialects
Common Lisp and Scheme represent two major streams of Lisp development. These languages embody significantly different design choices.
Common Lisp is a successor to Maclisp. The primary influences were Lisp Machine Lisp, Maclisp, NIL, S-1 Lisp, Spice Lisp, and Scheme.[51] It has many of the features of Lisp Machine Lisp (a large Lisp dialect used to program Lisp Machines), but was designed to be efficiently implementable on any personal computer or workstation. Common Lisp is a general-purpose programming language and thus has a large language standard including many built-in data types, functions, macros and other language elements, and an object system (Common Lisp Object System). Common Lisp also borrowed certain features from Scheme such as lexical scoping and lexical closures. Common Lisp implementations are available for targeting different platforms such as the LLVM,[52] the Java virtual machine,[53] x86-64, PowerPC, Alpha, ARM, Motorola 68000, and MIPS,[54] and operating systems such as Windows, macOS, Linux, Solaris, FreeBSD, NetBSD, OpenBSD, Dragonfly BSD, and Heroku.[55]
Scheme is a statically scoped and properly tail-recursive dialect of the Lisp programming language invented by Guy L. Steele, Jr. and Gerald Jay Sussman. It was designed to have exceptionally clear and simple semantics and few different ways to form expressions. Designed about a decade earlier than Common Lisp, Scheme is a more minimalist design. It has a much smaller set of standard features but with certain implementation features (such as tail-call optimization and full continuations) not specified in Common Lisp. A wide variety of programming paradigms, including imperative, functional, and message passing styles, find convenient expression in Scheme. Scheme continues to evolve with a series of standards (the Revisedⁿ Report on the Algorithmic Language Scheme) and a series of Scheme Requests for Implementation.
Clojure is a dialect of Lisp that targets mainly the Java virtual machine, as well as the Common Language Runtime (CLR), the Python VM, the Ruby VM YARV, and compilation to JavaScript. It is designed to be a pragmatic general-purpose language. Clojure draws considerable influence from Haskell and places a very strong emphasis on immutability.[56] Clojure provides access to Java frameworks and libraries, with optional type hints and type inference, so that calls to Java can avoid reflection and enable fast primitive operations. Clojure is not designed to be backwards compatible with other Lisp dialects.[57]
Further, Lisp dialects are used as scripting languages in many applications, with the best-known being Emacs Lisp in the Emacs editor, AutoLISP and later Visual Lisp in AutoCAD, Nyquist in Audacity, and Scheme in LilyPond. The potential small size of a useful Scheme interpreter makes it particularly popular for embedded scripting. Examples include SIOD and TinyScheme, both of which have been successfully embedded in the GIMP image processor under the generic name "Script-fu".[58] LIBREP, a Lisp interpreter by John Harper originally based on the Emacs Lisp language, has been embedded in the Sawfish window manager.[59]
Standardized dialects
Lisp has officially standardized dialects: R6RS Scheme, R7RS Scheme, IEEE Scheme,[60] ANSI Common Lisp and ISO ISLISP.
Language innovations
Paul Graham identifies nine important aspects of Lisp that distinguished it from existing languages like Fortran:[61]
- Conditionals not limited to goto
- First-class functions
- Recursion
- Treating variables uniformly as pointers, leaving types to values
- Garbage collection
- Programs made entirely of expressions with no statements
- The symbol data type, distinct from the string data type
- Notation for code made of trees of symbols (using many parentheses)
- Full language available at load time, compile time, and run time
Lisp was the first language where the structure of program code is represented faithfully and directly in a standard data structure—a quality much later dubbed "homoiconicity". Thus, Lisp functions can be manipulated, altered or even created within a Lisp program without lower-level manipulations. This is generally considered one of the main advantages of the language with regard to its expressive power, and makes the language suitable for syntactic macros and meta-circular evaluation.
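A minimal sketch of this code-as-data property (the variable name expr is illustrative):

```lisp
;; A Lisp expression is itself a list that can be inspected as data...
(defparameter expr '(+ 1 2))
(car expr)  ; => +  (the operator, as a symbol)
;; ...and handed back to the evaluator as code:
(eval expr) ; => 3
```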
A conditional using an if–then–else syntax was invented by McCarthy for a chess program written in Fortran. He proposed its inclusion in ALGOL, but it was not made part of the Algol 58 specification. For Lisp, McCarthy used the more general cond-structure.[62] Algol 60 took up if–then–else and popularized it.
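The cond structure generalizes if–then–else to any number of test/result clauses, tried in order; a sketch (the function name sign is illustrative):

```lisp
(defun sign (n)
  (cond ((< n 0) -1)   ; the first clause whose test is true supplies the value
        ((> n 0)  1)
        (t        0))) ; t serves as the "else" clause
```

Here (sign -7) evaluates to -1, and (sign 0) falls through to the final clause.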
Lisp deeply influenced Alan Kay, the leader of the research team that developed Smalltalk at Xerox PARC; and in turn Lisp was influenced by Smalltalk, with later dialects adopting object-oriented programming features (classes with inheritance, encapsulated instances, message passing, etc.) in the 1970s. The Flavors object system introduced the concept of multiple inheritance and the mixin. The Common Lisp Object System provides multiple inheritance, multimethods with multiple dispatch, and first-class generic functions, yielding a flexible and powerful form of dynamic dispatch. It has served as the template for many subsequent Lisp (including Scheme) object systems, which are often implemented via a metaobject protocol, a reflective meta-circular design in which the object system is defined in terms of itself: Lisp was only the second language after Smalltalk (and is still one of the very few languages) to possess such a metaobject system. Many years later, Alan Kay suggested that as a result of the confluence of these features, only Smalltalk and Lisp could be regarded as properly conceived object-oriented programming systems.[63]
Lisp introduced the concept of automatic garbage collection, in which the system walks the heap looking for unused memory. Progress in modern sophisticated garbage collection algorithms such as generational garbage collection was stimulated by its use in Lisp.[64]
Edsger W. Dijkstra in his 1972 Turing Award lecture said,
With a few very basic principles at its foundation, it [LISP] has shown a remarkable stability. Besides that, LISP has been the carrier for a considerable number of in a sense our most sophisticated computer applications. LISP has jokingly been described as "the most intelligent way to misuse a computer". I think that description a great compliment because it transmits the full flavour of liberation: it has assisted a number of our most gifted fellow humans in thinking previously impossible thoughts.[65]
Largely because of its resource requirements with respect to early computing hardware (including early microprocessors), Lisp did not become as popular outside of the AI community as Fortran and the ALGOL-descended C language. Because of its suitability to complex and dynamic applications, Lisp enjoyed some resurgence of popular interest in the 2010s.[66]
Syntax and semantics
- This article's examples are written in Common Lisp (though most are also valid in Scheme).
Symbolic expressions (S-expressions)
Lisp is an expression-oriented language. Unlike most other languages, no distinction is made between "expressions" and "statements"; all code and data are written as expressions. When an expression is evaluated, it produces a value (possibly multiple values), which can then be embedded into other expressions. Each value can be any data type.
McCarthy's 1958 paper introduced two types of syntax: Symbolic expressions (S-expressions, sexps), which mirror the internal representation of code and data; and Meta expressions (M-expressions), which express functions of S-expressions. M-expressions never found favor, and almost all Lisps today use S-expressions to manipulate both code and data.
The use of parentheses is Lisp's most immediately obvious difference from other programming language families. As a result, students have long given Lisp nicknames such as Lost In Stupid Parentheses, or Lots of Irritating Superfluous Parentheses.[67] However, the S-expression syntax is also responsible for much of Lisp's power: the syntax is simple and consistent, which facilitates manipulation by computer. However, the syntax of Lisp is not limited to traditional parentheses notation. It can be extended to include alternative notations. For example, XMLisp is a Common Lisp extension that employs the metaobject protocol to integrate S-expressions with the Extensible Markup Language (XML).
The reliance on expressions gives the language great flexibility. Because Lisp functions are written as lists, they can be processed exactly like data. This allows easy writing of programs which manipulate other programs (metaprogramming). Many Lisp dialects exploit this feature using macro systems, which enables extension of the language almost without limit.
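As a sketch of how a macro manipulates code as a list (the macro unless-zero is purely illustrative, not a standard operator):

```lisp
;; The backquote template builds the list (if (zerop n) nil (progn ...))
;; as data; the compiler then treats that list as code.
(defmacro unless-zero (n &body body)
  `(if (zerop ,n) nil (progn ,@body)))

(unless-zero 5 (* 2 3)) ; => 6
(unless-zero 0 (* 2 3)) ; => NIL
```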
Lists
A Lisp list is written with its elements separated by whitespace, and surrounded by parentheses. For example, (1 2 foo) is a list whose elements are the three atoms 1, 2, and foo. These values are implicitly typed: they are respectively two integers and a Lisp-specific data type called a "symbol", and do not have to be declared as such.
The empty list () is also represented as the special atom nil. This is the only entity in Lisp which is both an atom and a list.
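This dual nature can be checked directly in Common Lisp:

```lisp
(atom '())    ; => T -- the empty list is an atom...
(listp '())   ; => T -- ...and also a list
(eq '() 'nil) ; => T -- () and the symbol nil denote the same object
```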
Expressions are written as lists, using prefix notation. The first element in the list is the name of a function, the name of a macro, a lambda expression or the name of a "special operator" (see below). The remainder of the list are the arguments. For example, the function list returns its arguments as a list, so the expression
(list 1 2 (quote foo))
evaluates to the list (1 2 foo). The "quote" before the foo in the preceding example is a "special operator" which returns its argument without evaluating it. Any unquoted expressions are recursively evaluated before the enclosing expression is evaluated. For example,
(list 1 2 (list 3 4))
evaluates to the list (1 2 (3 4)). The third argument is a list; lists can be nested.
Operators
Arithmetic operators are treated similarly. The expression
(+ 1 2 3 4)
evaluates to 10. The equivalent under infix notation would be "1 + 2 + 3 + 4".
Lisp has no notion of operators as implemented in ALGOL-derived languages. Arithmetic operators in Lisp are variadic functions (or n-ary), able to take any number of arguments. A C-style '++' increment operator is sometimes implemented under the name incf giving syntax
(incf x)
equivalent to (setq x (+ x 1)), returning the new value of x.
"Special operators" (sometimes called "special forms") provide Lisp's control structure. For example, the special operator if takes three arguments. If the first argument is non-nil, it evaluates to the second argument; otherwise, it evaluates to the third argument. Thus, the expression
(if nil
(list 1 2 "foo")
(list 3 4 "bar"))
evaluates to (3 4 "bar"). Of course, this would be more useful if a non-trivial expression had been substituted in place of nil.
Lisp also provides logical operators and, or and not. The and and or operators do short-circuit evaluation and will return their first nil and non-nil argument respectively.
(or (and "zero" nil "never") "James" 'task 'time)
will evaluate to "James".
Lambda expressions and function definition
Another special operator, lambda, is used to bind variables to values which are then evaluated within an expression. This operator is also used to create functions: the arguments to lambda are a list of arguments, and the expression or expressions to which the function evaluates (the returned value is the value of the last expression that is evaluated). The expression
(lambda (arg) (+ arg 1))
evaluates to a function that, when applied, takes one argument, binds it to arg and returns the number one greater than that argument. Lambda expressions are treated no differently from named functions; they are invoked the same way. Therefore, the expression
((lambda (arg) (+ arg 1)) 5)
evaluates to 6. Here, we're doing a function application: we execute the anonymous function by passing to it the value 5.
Named functions are created by storing a lambda expression in a symbol using the defun macro.
(defun foo (a b c d) (+ a b c d))
(defun f (a) b...) defines a new function named f in the global environment. It is conceptually similar to the expression:
(setf (fdefinition 'f) #'(lambda (a) (block f b...)))
where setf is a macro used to set the value of the first argument fdefinition 'f to a new function object. fdefinition is a global function definition for the function named f. #' is an abbreviation for function special operator, returning a function object.
Atoms
In the original LISP there were two fundamental data types: atoms and lists. A list was a finite ordered sequence of elements, where each element is either an atom or a list, and an atom was a number or a symbol. A symbol was essentially a unique named item, written as an alphanumeric string in source code, and used either as a variable name or as a data item in symbolic processing. For example, the list (FOO (BAR 1) 2) contains three elements: the symbol FOO, the list (BAR 1), and the number 2.
The essential difference between atoms and lists was that atoms were immutable and unique. Two atoms that appeared in different places in source code but were written in exactly the same way represented the same object,[citation needed] whereas each list was a separate object that could be altered independently of other lists and could be distinguished from other lists by comparison operators.
As more data types were introduced in later Lisp dialects, and programming styles evolved, the concept of an atom lost importance.[citation needed] Many dialects still retained the predicate atom for legacy compatibility,[citation needed] defining it true for any object which is not a cons.
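In Common Lisp, for instance, atom is true of exactly those objects that are not conses:

```lisp
(atom 'foo)   ; => T   -- symbols are atoms
(atom 3)      ; => T   -- so are numbers
(atom "text") ; => T   -- and strings, under this legacy definition
(atom '(1 2)) ; => NIL -- a cons (and hence a non-empty list) is not an atom
```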
Conses and lists
A Lisp list is implemented as a singly linked list.[68] Each cell of this list is called a cons (in Scheme, a pair) and is composed of two pointers, called the car and cdr. These are respectively equivalent to the data and next fields discussed in the article linked list.
Of the many data structures that can be built out of cons cells, one of the most basic is called a proper list. A proper list is either the special nil (empty list) symbol, or a cons in which the car points to a datum (which may be another cons structure, such as a list), and the cdr points to another proper list.
If a given cons is taken to be the head of a linked list, then its car points to the first element of the list, and its cdr points to the rest of the list. For this reason, the car and cdr functions are also called first and rest when referring to conses which are part of a linked list (rather than, say, a tree).
Thus, a Lisp list is not an atomic object, as an instance of a container class in C++ or Java would be. A list is nothing more than an aggregate of linked conses. A variable that refers to a given list is simply a pointer to the first cons in the list. Traversal of a list can be done by cdring down the list; that is, taking successive cdrs to visit each cons of the list; or by using any of several higher-order functions to map a function over a list.
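Both traversal styles can be sketched as follows (count-elements is an illustrative name, not a standard function):

```lisp
;; "Cdring down" the list: recurse on successive cdrs until nil.
(defun count-elements (lst)
  (if (null lst)
      0
      (+ 1 (count-elements (cdr lst)))))

(count-elements '(a b c)) ; => 3

;; Or map a function over each element with a higher-order function:
(mapcar #'1+ '(1 2 3))    ; => (2 3 4)
```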
Because conses and lists are so universal in Lisp systems, it is a common misconception that they are Lisp's only data structures. In fact, all but the most simplistic Lisps have other data structures, such as vectors (arrays), hash tables, structures, and so forth.
S-expressions represent lists
Parenthesized S-expressions represent linked list structures. There are several ways to represent the same list as an S-expression. A cons can be written in dotted-pair notation as (a . b), where a is the car and b the cdr. A longer proper list might be written (a . (b . (c . (d . nil)))) in dotted-pair notation. This is conventionally abbreviated as (a b c d) in list notation. An improper list[69] may be written in a combination of the two – as (a b c . d) for the list of three conses whose last cdr is d (i.e., the list (a . (b . (c . d))) in fully specified form).
List-processing procedures
Lisp provides many built-in procedures for accessing and controlling lists. Lists can be created directly with the list procedure, which takes any number of arguments, and returns the list of these arguments.
(list 1 2 'a 3)
;Output: (1 2 a 3)
(list 1 '(2 3) 4)
;Output: (1 (2 3) 4)
Because of the way that lists are constructed from cons pairs, the cons procedure can be used to add an element to the front of a list. Note that cons is asymmetric in how it handles its two arguments, a consequence of how lists are built.
(cons 1 '(2 3))
;Output: (1 2 3)
(cons '(1 2) '(3 4))
;Output: ((1 2) 3 4)
The append procedure appends two (or more) lists to one another. Because Lisp lists are linked lists, appending takes time proportional to the total length of the lists being copied: every list argument but the last must be copied, while the final list is shared with the result.
(append '(1 2) '(3 4))
;Output: (1 2 3 4)
(append '(1 2 3) '() '(a) '(5 6))
;Output: (1 2 3 a 5 6)
Shared structure
Lisp lists, being simple linked lists, can share structure with one another. That is to say, two lists can have the same tail, or final sequence of conses. For instance, after the execution of the following Common Lisp code:
(setf foo (list 'a 'b 'c))
(setf bar (cons 'x (cdr foo)))
the lists foo and bar are (a b c) and (x b c) respectively. However, the tail (b c) is the same structure in both lists. It is not a copy; the cons cells pointing to b and c are in the same memory locations for both lists.
Sharing structure rather than copying can give a dramatic performance improvement. However, this technique can interact in undesired ways with functions that alter lists passed to them as arguments. Altering one list, such as by replacing the c with a goose, will affect the other:
(setf (third foo) 'goose)
This changes foo to (a b goose), but thereby also changes bar to (x b goose) – a possibly unexpected result. This can be a source of bugs, and functions which alter their arguments are documented as destructive for this very reason.
Aficionados of functional programming avoid destructive functions. In the Scheme dialect, which favors the functional style, the names of destructive functions are marked with a cautionary exclamation point, or "bang"—such as set-car! (read set car bang), which replaces the car of a cons. In the Common Lisp dialect, destructive functions are commonplace; the equivalent of set-car! is named rplaca for "replace car". This function is rarely seen, however, as Common Lisp includes a special facility, setf, to make it easier to define and use destructive functions. A frequent style in Common Lisp is to write code functionally (without destructive calls) when prototyping, then to add destructive calls as an optimization where it is safe to do so.
Self-evaluating forms and quoting
Lisp evaluates expressions which are entered by the user. Symbols and lists evaluate to some other (usually, simpler) expression – for instance, a symbol evaluates to the value of the variable it names; (+ 2 3) evaluates to 5. However, most other forms evaluate to themselves: if entering 5 into Lisp, it returns 5.
Any expression can also be marked to prevent it from being evaluated (as is necessary for symbols and lists). This is the role of the quote special operator, or its abbreviation ' (one quotation mark). For instance, usually if entering the symbol foo, it returns the value of the corresponding variable (or an error, if there is no such variable). To refer to the literal symbol, enter (quote foo) or, usually, 'foo.
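A few Common Lisp interactions make the distinction concrete (the variable foo is illustrative):

```lisp
(defparameter foo 42)  ; define a variable named by the symbol FOO

foo          ; evaluates to 42, the variable's value
(quote foo)  ; evaluates to the symbol FOO itself, unevaluated
'foo         ; same as (quote foo)
(+ 1 2)      ; evaluates to 3
'(+ 1 2)     ; evaluates to the three-element list (+ 1 2), not to 3
```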
Both Common Lisp and Scheme also support the backquote operator (termed quasiquote in Scheme), entered with the ` character (Backtick). This is almost the same as the plain quote, except it allows expressions to be evaluated and their values interpolated into a quoted list with the comma , unquote and comma-at ,@ splice operators. If the variable snue has the value (bar baz) then `(foo ,snue) evaluates to (foo (bar baz)), while `(foo ,@snue) evaluates to (foo bar baz). The backquote is most often used in defining macro expansions.[70][71]
Self-evaluating forms and quoted forms are Lisp's equivalent of literals. It may be possible to modify the values of (mutable) literals in program code. For instance, if a function returns a quoted form, and the code that calls the function modifies the form, this may alter the behavior of the function on subsequent invocations.
(defun should-be-constant ()
  '(one two three))

(let ((stuff (should-be-constant)))
  (setf (third stuff) 'bizarre))  ; bad!

(should-be-constant)  ; returns (one two bizarre)
Modifying a quoted form like this is generally considered bad style, and is defined by ANSI Common Lisp as erroneous (resulting in "undefined" behavior in compiled files, because the file-compiler can coalesce similar constants, put them in write-protected memory, etc.).
Lisp's formalization of quotation has been noted by Douglas Hofstadter (in Gödel, Escher, Bach) and others as an example of the philosophical idea of self-reference.
Scope and closure
The Lisp family splits over the use of dynamic or static (a.k.a. lexical) scope. Clojure, Common Lisp and Scheme make use of static scoping by default, while newLISP, PicoLisp and the embedded languages in Emacs and AutoCAD use dynamic scoping. Since version 24.1, Emacs uses both dynamic and lexical scoping.
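The difference can be illustrated in Common Lisp, where ordinary variables are lexically scoped but variables proclaimed special (conventionally named with "earmuffs", like *dyn* here) are dynamically scoped:

```lisp
(defvar *dyn* 1)  ; defvar proclaims *dyn* special (dynamically scoped)

(defun read-dyn () *dyn*)

(let ((*dyn* 2))  ; rebinds *dyn* dynamically for the extent of the let
  (read-dyn))     ; => 2: the callee sees the caller's binding

(read-dyn)        ; => 1: the outer binding is restored on exit

;; By contrast, n below is lexically scoped: the returned closure
;; captures the binding from its defining environment.
(defun make-adder (n)
  (lambda (x) (+ x n)))

(funcall (make-adder 10) 5)  ; => 15
```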
List structure of program code; exploitation by macros and compilers
A fundamental distinction between Lisp and other languages is that in Lisp, the textual representation of a program is simply a human-readable description of the same internal data structures (linked lists, symbols, numbers, characters, etc.) as would be used by the underlying Lisp system.
Lisp uses this to implement a very powerful macro system. Like other macro languages such as the one defined by the C preprocessor (the macro preprocessor for the C, Objective-C and C++ programming languages), a macro returns code that can then be compiled. However, unlike C preprocessor macros, the macros are Lisp functions and so can exploit the full power of Lisp.
Further, because Lisp code has the same structure as lists, macros can be built with any of the list-processing functions in the language. In short, anything that Lisp can do to a data structure, Lisp macros can do to code. In contrast, in most other languages, the parser's output is purely internal to the language implementation and cannot be manipulated by the programmer.
This feature makes it easy to develop efficient languages within languages. For example, the Common Lisp Object System can be implemented cleanly as a language extension using macros. This means that if an application needs a different inheritance mechanism, it can use a different object system. This is in stark contrast to most other languages; for example, Java does not support multiple inheritance and there is no reasonable way to add it.
In simplistic Lisp implementations, this list structure is directly interpreted to run the program; a function is literally a piece of list structure which is traversed by the interpreter in executing it. However, most substantial Lisp systems also include a compiler. The compiler translates list structure into machine code or bytecode for execution. This code can run as fast as code compiled in conventional languages such as C.
Macros expand before the compilation step, and thus offer some interesting options. If a program needs a precomputed table, then a macro might create the table at compile time, so the compiler need only output the table and need not call code to create the table at run time. Some Lisp implementations even have a mechanism, eval-when, that allows code to be present during compile time (when a macro would need it), but not present in the emitted module.[72]
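As a sketch of the compile-time-table idea (the macro name and table contents are illustrative), a macro can compute a table of squares during macro expansion, so that only the finished vector, and no table-building loop, appears in the compiled code:

```lisp
(defmacro squares-table (n)
  ;; This computation runs at macro-expansion (i.e., compile) time.
  (let ((table (make-array n)))
    (dotimes (i n)
      (setf (aref table i) (* i i)))
    ;; The expansion is just the literal vector; no run-time loop remains.
    `(quote ,table)))

(aref (squares-table 10) 7)  ; => 49, looked up in the precomputed table
```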
Evaluation and the read–eval–print loop
Lisp languages are often used with an interactive command line, which may be combined with an integrated development environment (IDE). The user types in expressions at the command line, or directs the IDE to transmit them to the Lisp system. Lisp reads the entered expressions, evaluates them, and prints the result. For this reason, the Lisp command line is called a read–eval–print loop (REPL).
The basic operation of the REPL is as follows. This is a simplistic description which omits many elements of a real Lisp, such as quoting and macros.
The read function accepts textual S-expressions as input, and parses them into an internal data structure. For instance, if you type the text (+ 1 2) at the prompt, read translates this into a linked list with three elements: the symbol +, the number 1, and the number 2. It so happens that this list is also a valid piece of Lisp code; that is, it can be evaluated. This is because the car of the list names a function—the addition operation.
The text foo will be read as a single symbol, 123 will be read as the number one hundred twenty-three, and "123" will be read as the string "123".
The eval function evaluates the data, returning zero or more other Lisp data as a result. Evaluation does not have to mean interpretation; some Lisp systems compile every expression to native machine code. It is simple, however, to describe evaluation as interpretation: To evaluate a list whose car names a function, eval first evaluates each of the arguments given in its cdr, then applies the function to the arguments. In this case, the function is addition, and applying it to the argument list (1 2) yields the answer 3. This is the result of the evaluation.
The symbol foo evaluates to the value of the symbol foo. Data like the string "123" evaluates to the same string. The list (quote (1 2 3)) evaluates to the list (1 2 3).
It is the job of the print function to represent output to the user. For a simple result such as 3 this is trivial. An expression which evaluated to a piece of list structure would require that print traverse the list and print it out as an S-expression.
To implement a Lisp REPL, it is necessary only to implement these three functions and an infinite-loop function. (Naturally, the implementation of eval will be complex, since it must also implement all special operators like if or lambda.) This done, a basic REPL is one line of code: (loop (print (eval (read)))).
The Lisp REPL typically also provides input editing, an input history, error handling and an interface to the debugger.
Lisp is usually evaluated eagerly. In Common Lisp, arguments are evaluated in applicative order ('leftmost innermost'), while Scheme leaves the order of argument evaluation unspecified, leaving room for optimization by a compiler.
Control structures
Lisp originally had very few control structures, but many more were added during the language's evolution. (Lisp's original conditional operator, cond, is the precursor to later if-then-else structures.)
Programmers in the Scheme dialect often express loops using tail recursion. Scheme's prominence in academic computer science has led some students to believe that tail recursion is the only, or the most common, way to write iteration in Lisp, but this is incorrect. All widely used Lisp dialects have imperative-style iteration constructs, from Scheme's do loop to Common Lisp's complex loop expressions. What makes this an objective rather than a merely stylistic matter is that Scheme's language definition makes specific requirements for the handling of tail calls; tail recursion is generally encouraged in Scheme precisely because the practice is expressly supported by the standard. By contrast, ANSI Common Lisp does not require[73] the optimization commonly termed tail call elimination. Casually substituting tail-recursive style for more traditional iteration constructs (such as do, dolist or loop) is therefore discouraged[74] in Common Lisp not just as a matter of stylistic preference, but potentially one of efficiency (since an apparent tail call in Common Lisp may not compile as a simple jump) and program correctness (since tail recursion may increase stack use in Common Lisp, risking stack overflow).
Some Lisp control structures are special operators, equivalent to other languages' syntactic keywords. Expressions using these operators have the same surface appearance as function calls, but differ in that the arguments are not necessarily evaluated—or, in the case of an iteration expression, may be evaluated more than once.
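For example, if is a special operator: it has the surface appearance of a function call, but evaluates only the branch selected by the test, as this Common Lisp sketch shows:

```lisp
;; Only one branch of an if is ever evaluated.
(if (> 3 0)
    (print "positive")        ; evaluated, since the test is true
    (error "never reached"))  ; not evaluated: no error is signaled
```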
In contrast to most other major programming languages, Lisp allows implementing control structures using the language. Several control structures are implemented as Lisp macros, and can even be macro-expanded by the programmer who wants to know how they work.
Both Common Lisp and Scheme have operators for non-local control flow. The differences in these operators are some of the deepest differences between the two dialects. Scheme supports re-entrant continuations using the call/cc procedure, which allows a program to save (and later restore) a particular place in execution. Common Lisp does not support re-entrant continuations, but does support several ways of handling escape continuations.
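In Common Lisp, an escape continuation can be expressed with block and return-from, which transfer control outward but, unlike Scheme's call/cc, cannot re-enter the abandoned computation. A small sketch (the function name is illustrative):

```lisp
(defun find-first-negative (list)
  (block found
    (dolist (x list)
      (when (minusp x)
        (return-from found x)))  ; escape immediately with the answer
    nil))                        ; reached only if no negative was found

(find-first-negative '(3 1 -4 1 -5))  ; => -4
```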
Often, the same algorithm can be expressed in Lisp in either an imperative or a functional style. As noted above, Scheme tends to favor the functional style, using tail recursion and continuations to express control flow. However, imperative style is still quite possible. The style preferred by many Common Lisp programmers may seem more familiar to programmers used to structured languages such as C, while that preferred by Schemers more closely resembles pure-functional languages such as Haskell.
Because of Lisp's early heritage in list processing, it has a wide array of higher-order functions relating to iteration over sequences. In many cases where an explicit loop would be needed in other languages (like a for loop in C) in Lisp the same task can be accomplished with a higher-order function. (The same is true of many functional programming languages.)
A good example is a function which in Scheme is called map and in Common Lisp is called mapcar. Given a function and one or more lists, mapcar applies the function successively to the lists' elements in order, collecting the results in a new list:
(mapcar #'+ '(1 2 3 4 5) '(10 20 30 40 50))
This applies the + function to each corresponding pair of list elements, yielding the result (11 22 33 44 55).
Examples
Here are examples of Common Lisp code.
The basic "Hello, World!" program:
(print "Hello, World!")
Lisp syntax lends itself naturally to recursion. Mathematical problems such as the enumeration of recursively defined sets are simple to express in this notation. For example, to evaluate a number's factorial:
(defun factorial (n)
  (if (zerop n) 1
      (* n (factorial (1- n)))))
An alternative implementation takes less stack space than the previous version if the underlying Lisp system optimizes tail recursion:
(defun factorial (n &optional (acc 1))
  (if (zerop n) acc
      (factorial (1- n) (* acc n))))
Contrast the examples above with an iterative version which uses Common Lisp's loop macro:
(defun factorial (n)
  (loop for i from 1 to n
        for fac = 1 then (* fac i)
        finally (return fac)))
The following function reverses a list. (Lisp's built-in reverse function does the same thing.)
(defun -reverse (list)
  (let ((return-value))
    (dolist (e list) (push e return-value))
    return-value))
Object systems
Various object systems and models have been built on top of, alongside, or into Lisp, including
- The Common Lisp Object System, CLOS, is an integral part of ANSI Common Lisp. CLOS descended from New Flavors and CommonLOOPS. ANSI Common Lisp was the first standardized object-oriented programming language (1994, ANSI X3J13).
- ObjectLisp[75] or Object Lisp, used by Lisp Machines Incorporated and early versions of Macintosh Common Lisp
- LOOPS (Lisp Object-Oriented Programming System) and the later CommonLoops
- Flavors, built at MIT, and its descendant New Flavors (developed by Symbolics).
- KR (short for Knowledge Representation), a constraints-based object system developed to aid the writing of Garnet, a GUI library for Common Lisp.
- Knowledge Engineering Environment (KEE) used an object system named UNITS and integrated it with an inference engine[76] and a truth maintenance system (ATMS).
Operating systems
Several operating systems, including language-based systems, are based on Lisp (use Lisp features, conventions, methods, data structures, etc.), or are written in Lisp,[77] including:
Genera, renamed Open Genera,[78] by Symbolics; Medley, written in Interlisp, originally a family of graphical operating systems that ran on Xerox's later Star workstations;[79][80] Mezzano;[81] Interim;[82][83] ChrysaLisp,[84] by developers of Tao Systems' TAOS;[85] and also the Guix System for GNU/Linux.
See also
Footnotes
References
- ^ "Introduction". The Julia Manual. Read the Docs. Archived from the original on 2016-04-08. Retrieved 2016-12-10.
- ^ "Wolfram Language Q&A". Wolfram Research. Retrieved 2016-12-10.
- ^ Edwin D. Reilly (2003). Milestones in computer science and information technology. Greenwood Publishing Group. pp. 156–157. ISBN 978-1-57356-521-9.
- ^ "SICP: Foreword". Archived from the original on 2001-07-27.
Lisp is a survivor, having been in use for about a quarter of a century. Among the active programming languages only Fortran has had a longer life.
- ^ "Conclusions". Archived from the original on 2014-04-03. Retrieved 2014-06-04.
- ^ Steele, Guy L. (1990). Common Lisp: the language (2nd ed.). Bedford, MA: Digital Press. ISBN 1-55558-041-6. OCLC 20631879.
- ^ Felleisen, Matthias; Findler, Robert; Flatt, Matthew; Krishnamurthi, Shriram; Barzilay, Eli; McCarthy, Jay; Tobin-Hochstadt, Sam (2015). ""The Racket Manifesto"" (PDF).
- ^ "Clojure - Differences with other Lisps". clojure.org. Retrieved 2022-10-27.
- ^ Steele, Guy Lewis; Sussman, Gerald Jay (May 1978). "The Art of the Interpreter, or the Modularity Complex (Parts Zero, One, and Two), Part Zero, P. 4". MIT Libraries. hdl:1721.1/6094. Retrieved 2020-08-01.
- ^ Hofstadter, Douglas R. (1999) [1979], Gödel, Escher, Bach: An Eternal Golden Braid (Twentieth Anniversary Edition), Basic Books, p. 292, ISBN 0-465-02656-7,
One of the most important and fascinating of all computer languages is LISP (standing for "List Processing"), which was invented by John McCarthy around the time Algol was invented. Subsequently, LISP has enjoyed great popularity with workers in Artificial Intelligence.
- ^ Paul Graham. "Revenge of the Nerds". Retrieved 2013-03-14.
- ^ Chisnall, David (2011-01-12). Influential Programming Languages, Part 4: Lisp.
- ^ Jones, Robin; Maynard, Clive; Stewart, Ian (December 6, 2012). The Art of Lisp Programming. Springer Science & Business Media. p. 2. ISBN 9781447117193.
- ^ a b c d McCarthy, John; Wexelblat, Richard L. (1978). History of programming languages. Association for Computing Machinery. pp. 173–183. ISBN 0127450408.
- ^ Smith, David Canfield. MLISP Users Manual (PDF). Retrieved 2006-10-13.
- ^ McCarthy, John (12 February 1979). "History of Lisp: Artificial Intelligence Laboratory" (PDF).
- ^ Stoyan, Herbert (1984-08-06). Early LISP history (1956–1959). LFP '84: Proceedings of the 1984 ACM Symposium on LISP and functional programming. Association for Computing Machinery. p. 307. doi:10.1145/800055.802047.
- ^ McCarthy, John. "LISP prehistory - Summer 1956 through Summer 1958". Retrieved 2010-03-14.
- ^ McCarthy, John (1960). "Recursive functions of symbolic expressions and their computation by machine, Part I". Communications of the ACM. 3 (4). Association for computer machinery: 184–195. doi:10.1145/367177.367199. Retrieved 28 February 2025.
- ^ McCarthy, John. "Recursive Functions of Symbolic Expressions and Their Computation by Machine, Part I". Archived from the original on 2013-10-04. Retrieved 2006-10-13.
- ^ Hart, Tim; Levin, Mike. "AI Memo 39-The new compiler" (PDF). Archived from the original (PDF) on 2017-07-06. Retrieved 2019-03-18.
- ^ McCarthy, John; Abrahams, Paul W.; Edwards, Daniel J.; Hart, Timothy P.; Levin, Michael I. (1985) [1962]. LISP 1.5 Programmer's Manual (PDF). 15th printing (2nd ed.). p. Preface.
- ^ The 36-bit word size of the PDP-6/PDP-10 was influenced by the usefulness of having two Lisp 18-bit pointers in a single word. Peter J. Hurley (18 October 1990). "The History of TOPS or Life in the Fast ACs". Newsgroup: alt.folklore.computers. Usenet: 84950@tut.cis.ohio-state.edu.
The PDP-6 project started in early 1963, as a 24-bit machine. It grew to 36 bits for LISP, a design goal.
- ^ Steele, Guy L.; Gabriel, Richard P. (January 1996), Bergin, Thomas J.; Gibson, Richard G. (eds.), "The evolution of Lisp", History of programming languages---II, New York, NY, US: ACM, pp. 233–330, doi:10.1145/234286.1057818, ISBN 978-0-201-89502-5, retrieved 2022-07-25
- ^ Common Lisp:
(defun f (x) x)
Scheme: (define f (lambda (x) x)) or (define (f x) x)
- ^ McCarthy, J.; Brayton, R.; Edwards, D.; Fox, P.; Hodes, L.; Luckham, D.; Maling, K.; Park, D.; Russell, S. (March 1960). LISP I Programmers Manual (PDF). Boston: Artificial Intelligence Group, M.I.T. Computation Center and Research Laboratory. Archived from the original (PDF) on 2010-07-17. Accessed May 11, 2010.
- ^ McCarthy, John; Abrahams, Paul W.; Edwards, Daniel J.; Hart, Timothy P.; Levin, Michael I. (1985) [1962]. LISP 1.5 Programmer's Manual (PDF) (2nd ed.). MIT Press. ISBN 0-262-13011-4.
- ^ Quam, Lynn H.; Diffle, Whitfield. Stanford LISP 1.6 Manual (PDF).
- ^ "Maclisp Reference Manual". March 3, 1979. Archived from the original on 2007-12-14.
- ^ Teitelman, Warren (1974). InterLisp Reference Manual (PDF). Archived from the original (PDF) on 2006-06-02. Retrieved 2006-08-19.
- ^ Outils de generation d'interfaces : etat de l'art et classification by H. El Mrabet
- ^ Gerald Jay Sussman & Guy Lewis Steele Jr. (December 1975). "Scheme: An Interpreter for Extended Lambda Calculus" (PDF). MIT AI Lab. AIM-349. Retrieved 23 December 2021.
- ^ Steele, Guy L. Jr. (1990). "Purpose". Common Lisp the Language (2nd ed.). Digital Press. ISBN 0-13-152414-3.
- ^ Kantrowitz, Mark; Margolin, Barry (20 February 1996). "History: Where did Lisp come from?". FAQ: Lisp Frequently Asked Questions 2/7.
- ^ "ISO/IEC 13816:1997". Iso.org. 2007-10-01. Retrieved 2013-11-15.
- ^ "ISO/IEC 13816:2007". Iso.org. 2013-10-30. Retrieved 2013-11-15.
- ^ "X3J13 Charter".
- ^ "The Road To Lisp Survey". Archived from the original on 2006-10-04. Retrieved 2006-10-13.
- ^ "Trends for the Future". Faqs.org. Archived from the original on 2013-06-03. Retrieved 2013-11-15.
- ^ Weinreb, Daniel. "Common Lisp Implementations: A Survey". Archived from the original on 2012-04-21. Retrieved 4 April 2012.
- ^ "Planet Lisp". Retrieved 2023-10-12.
- ^ "LispForum". Retrieved 2023-10-12.
- ^ "Lispjobs". Retrieved 2023-10-12.
- ^ "Quicklisp". Retrieved 2023-10-12.
- ^ "LISP50@OOPSLA". Lisp50.org. Retrieved 2013-11-15.
- ^ Documents: Standards: R5RS. schemers.org (2012-01-11). Retrieved on 2013-07-17.
- ^ "Why MIT now uses python instead of scheme for its undergraduate CS program". cemerick.com. March 24, 2009. Archived from the original on September 17, 2010. Retrieved November 10, 2013.
- ^ Broder, Evan (January 8, 2008). "The End of an Era". mitadmissions.org. Retrieved November 10, 2013.
- ^ "MIT EECS Undergraduate Programs". www.eecs.mit.edu. MIT Electrical Engineering & Computer Science. Retrieved 31 December 2018.
- ^ "MITx introductory Python course hits 1.2 million enrollments". MIT EECS. MIT Electrical Engineering & Computer Science. Retrieved 31 December 2018.
- ^ Chapter 1.1.2, History, ANSI CL Standard
- ^ [1] Clasp is a Common Lisp implementation that interoperates with C++ and uses LLVM for just-in-time compilation (JIT) to native code.
- ^ [2] "Armed Bear Common Lisp (ABCL) is a full implementation of the Common Lisp language featuring both an interpreter and a compiler, running in the JVM"
- ^ [3] Archived 2018-06-22 at the Wayback Machine Common Lisp Implementations: A Survey
- ^ [4] Comparison of actively developed Common Lisp implementations
- ^ An In-Depth Look at Clojure Collections, Retrieved 2012-06-24
- ^ "Clojure rational". Retrieved 27 August 2019.
Clojure is a Lisp not constrained by backwards compatibility
- ^ Script-fu In GIMP 2.4, Retrieved 2009-10-29
- ^ librep at Sawfish Wikia, retrieved 2009-10-29
- ^ "IEEE Scheme". IEEE 1178-1990 - IEEE Standard for the Scheme Programming Language. Retrieved 27 August 2019.
- ^ Paul Graham (May 2002). "What Made Lisp Different".
- ^ "LISP prehistory - Summer 1956 through Summer 1958".
I invented conditional expressions in connection with a set of chess legal move routines I wrote in FORTRAN for the IBM 704 at M.I.T. during 1957–58 ... A paper defining conditional expressions and proposing their use in Algol was sent to the Communications of the ACM but was arbitrarily demoted to a letter to the editor, because it was very short.
- ^ "Meaning of 'Object-Oriented Programming' According to Dr. Alan Kay". 2003-07-23.
I didn't understand the monster LISP idea of tangible metalanguage then, but got kind of close with ideas about extensible languages ... The second phase of this was to finally understand LISP and then using this understanding to make much nicer and smaller and more powerful and more late bound understructures ... OOP to me means only messaging, local retention and protection and hiding of state-process, and extreme late-binding of all things. It can be done in Smalltalk and in LISP. There are possibly other systems in which this is possible, but I'm not aware of them.
- ^ Lieberman, Henry; Hewitt, Carl (June 1983), "A Real-Time Garbage Collector Based on the Lifetimes of Objects", Communications of the ACM, 26 (6): 419–429, CiteSeerX 10.1.1.4.8633, doi:10.1145/358141.358147, hdl:1721.1/6335, S2CID 14161480
- ^ Edsger W. Dijkstra (1972), The Humble Programmer (EWD 340) (ACM Turing Award lecture).
- ^ "A Look at Clojure and the Lisp Resurgence".
- ^ "The Jargon File - Lisp". Retrieved 2006-10-13.
- ^ Sebesta, Robert W. (2012). ""2.4 Functional Programming: LISP";"6.9 List Types";"15.4 The First Functional Programming Language: LISP"". Concepts of Programming Languages (print) (10th ed.). Boston, MA, US: Addison-Wesley. pp. 47–52, 281–284, 677–680. ISBN 978-0-13-139531-2.
- ^ NB: a so-called "dotted list" is only one kind of "improper list". The other kind is the "circular list", in which the cons cells form a loop. Such structure is typically written using #n=(...) to label a cons cell that has multiple references, with #n# referring back to it. For instance, (#1=(a b) . #1#) would normally be printed as ((a b) a b) (without circular structure printing enabled), but the notation makes the reuse of the cons cell explicit. #1=(a . #1#) cannot normally be printed because it is circular: the cdr of the cons cell labeled #1= is that cell itself, though (a...) is sometimes displayed.
- ^ "CSE 341: Scheme: Quote, Quasiquote, and Metaprogramming". University of Washington Computer Science & Engineering. Winter 2004. Retrieved 2013-11-15.
- ^ Bawden, Alan. "Quasiquotation in Lisp" (PDF). Archived from the original (PDF) on 2013-06-03.
- ^ "Time of Evaluation (Common Lisp Extensions)". GNU. Retrieved on 2013-07-17.
- ^ 3.2.2.3 Semantic Constraints in Common Lisp HyperSpec
- ^ 4.3. Control Abstraction (Recursion vs. Iteration) in Tutorial on Good Lisp Programming Style by Kent Pitman and Peter Norvig, August, 1993.
- ^ pg 17 of Bobrow 1986
- ^ Veitch, p 108, 1988
- ^ Proven, Liam (29 March 2022). "The wild world of non-C operating systems". The Register. Retrieved 2024-04-04.
- ^ "Symbolics Open Genera 2.0". GitHub Internet Archive. 7 January 2020. Retrieved 2022-02-02.
- ^ "Interlisp.org Project". Interlisp.org. 15 March 2022. Retrieved 2022-02-02.
- ^ "Interlisp Medley". GitHub. March 2022. Retrieved 2022-02-02.
- ^ froggey (1 August 2021). "Mezzano". GitHub. Retrieved 2022-02-02.
- ^ Hartmann, Lukas F. (10 September 2015). "Interim". Interim-os. Retrieved 2022-02-02.
- ^ Hartmann, Lukas F. (11 June 2021). "Interim". GitHub. Retrieved 2022-02-02.
- ^ Hinsley, Chris (23 February 2022). "ChrysaLisp". GitHub. Retrieved 2022-02-02.
- ^ Smith, Tony (21 August 2013). "UK micro pioneer Chris Shelton: The mind behind the Nascom 1". The Register. Retrieved 2022-02-02.
Further reading
- McCarthy, John (1979-02-12). "The implementation of Lisp". History of Lisp. Stanford University. Retrieved 2008-10-17.
- Steele, Jr., Guy L.; Richard P. Gabriel (1993). The evolution of Lisp (PDF). The second ACM SIGPLAN conference on History of programming languages. New York, NY: ACM. pp. 231–270. ISBN 0-89791-570-4. Archived from the original (PDF) on 2006-10-12. Retrieved 2008-10-17.
- Veitch, Jim (1998). "A history and description of CLOS". In Salus, Peter H. (ed.). Handbook of programming languages. Vol. IV, Functional and logic programming languages (1st ed.). Indianapolis, IN: Macmillan Technical Publishing. pp. 107–158. ISBN 1-57870-011-6.
- Abelson, Harold; Sussman, Gerald Jay; Sussman, Julie (1996). Structure and Interpretation of Computer Programs (2nd ed.). MIT Press. ISBN 0-262-01153-0.
- My Lisp Experiences and the Development of GNU Emacs, transcript of Richard Stallman's speech, 28 October 2002, at the International Lisp Conference
- Graham, Paul (2004). Hackers & Painters. Big Ideas from the Computer Age. O'Reilly. ISBN 0-596-00662-4.
- Berkeley, Edmund C.; Bobrow, Daniel G., eds. (March 1964). The Programming Language LISP: Its Operation and Applications (PDF). Cambridge, Massachusetts: MIT Press.
- Article largely based on the LISP - A Simple Introduction chapter: Berkeley, Edmund C. (September 1964). "The Programming Language Lisp: An Introduction and Appraisal". Computers and Automation: 16-23.
- Weissman, Clark (1967). LISP 1.5 Primer (PDF). Belmont, California: Dickenson Publishing Company Inc.
External links
History
- History of Lisp – John McCarthy's history of 12 February 1979
- Lisp History – Herbert Stoyan's history compiled from the documents (acknowledged by McCarthy as more complete than his own, see: McCarthy's history links)
- History of LISP at the Computer History Museum
- Bell, Adam Gordon (2 May 2022). LISP in Space, with Ron Garret. CoRecursive (podcast, transcript, photos). about the use of LISP software on NASA robots.
- Cassel, David (22 May 2022). "NASA Programmer Remembers Debugging Lisp in Deep Space". The New Stack.
Associations and meetings
- Association of Lisp Users
- European Common Lisp Meeting
- European Lisp Symposium
- International Lisp Conference
Books and tutorials
- Casting SPELs in Lisp, a comic-book style introductory tutorial
- On Lisp, a free book by Paul Graham
- Practical Common Lisp, freeware edition by Peter Seibel
- Lisp for the web
- Land of Lisp
- Let over Lambda
Interviews
- Oral history interview with John McCarthy at Charles Babbage Institute, University of Minnesota, Minneapolis. McCarthy discusses his role in the development of time-sharing at the Massachusetts Institute of Technology. He also describes his work in artificial intelligence (AI) funded by the Advanced Research Projects Agency, including logic-based AI (LISP) and robotics.
- Interview with Richard P. Gabriel (Podcast)
Resources
History
Origins in the 1950s
Lisp originated in 1958 when John McCarthy, an assistant professor at the Massachusetts Institute of Technology (MIT), developed it as a programming language to formalize elements of lambda calculus for artificial intelligence (AI) problem-solving. McCarthy's work was part of MIT's Artificial Intelligence Project, aimed at creating tools for symbolic computation that could support early AI experiments, such as the proposed Advice Taker system for processing declarative and imperative sentences with common-sense reasoning.[1][5] The key motivations for Lisp stemmed from the need to simulate mathematical notation in computing, particularly for list processing tasks central to AI research. This design was inspired by the Information Processing Language (IPL), developed earlier for the 1956 Dartmouth AI summer project, though McCarthy sought a more general and machine-independent approach than IPL's hardware-specific implementation on the JOHNNIAC computer. Lisp emphasized computing with symbolic expressions rather than numbers, using lists as the primary data structure to represent complex hierarchies of information efficiently.[1] The initial implementation of Lisp occurred on the IBM 704 computer, where McCarthy and his team hand-compiled functions into assembly language, relying entirely on recursive functions for control flow without support for loops or arrays. This recursive paradigm allowed elegant definitions of algorithms, such as computing factorials or greatest common divisors, directly mirroring mathematical recursion while leveraging list structures for data representation. McCarthy detailed these concepts in his seminal 1960 paper, "Recursive Functions of Symbolic Expressions and Their Computation by Machine, Part I," which formally defined Lisp's syntax and evaluation model.[5][1] One early challenge in this implementation was managing dynamic memory allocation for the growing list structures, leading McCarthy to invent garbage collection in 1959. 
This automatic process reclaimed unused storage cells when the free-storage list was exhausted, preventing manual memory management errors and enabling Lisp's flexible handling of symbolic data without explicit deallocation. The garbage collector, though initially slow (taking several seconds per run), added thousands of registers to the system's capacity on the IBM 704.[1][5]
Early Development and AI Connections
Following the implementation of Lisp 1.5, significant expansion occurred at the MIT Artificial Intelligence Laboratory in the mid-1960s, where Maclisp was developed starting in 1966 as the primary dialect for PDP-6 and PDP-10 systems.[6] This dialect addressed the needs of AI research by providing enhanced support for interactive programming and symbolic computation on these DEC minicomputers, which became central to the lab's work.[7] A key milestone preceding this was the first complete Lisp compiler, written in Lisp itself by Timothy Hart and Michael Levin at MIT in 1962, which enabled more efficient execution and bootstrapping of the language.[1] Lisp's interactive read-eval-print loop (REPL), demonstrated in a prototype time-sharing environment on the IBM 704 in 1960 or 1961, profoundly influenced the development of interrupts and time-sharing systems.[1] This setup allowed real-time input processing via interrupts, highlighting Lisp's suitability for exploratory AI development and inspiring broader adoption of interactive computing paradigms.[1] A prominent AI application was SHRDLU, a natural language processing system created by Terry Winograd in 1970 at MIT, which used Lisp to enable a virtual robot to understand and execute commands in a blocks world, demonstrating early successes in comprehension and manipulation.[8][9] Early variants emerged to support networked AI efforts, including BBN Lisp in 1967, developed by Bolt, Beranek and Newman on the SDS 940 under an ARPA contract to distribute Lisp systems for AI research across ARPANET sites.[10] That same year, Interlisp (initially 940 Lisp) introduced advanced debugging tools such as tracing function calls, pretty-printing code, and breakpoints for inspecting parameters, facilitating complex AI program development.[10][11] In the 1970s, garbage collection saw refinements like incremental reference counting in Xerox Lisp machines and spaghetti stacks in BBN Lisp/Interlisp, which optimized memory 
management for large-scale symbolic processing without full stops.[12] The concept of dedicated Lisp machines also emerged in the early 1970s at MIT, with initial hardware prototypes by 1974 incorporating specialized support for list processing and garbage collection to boost AI efficiency.[12] During the 1970s, Lisp implementations focused on efficiency for AI theorem proving, exemplified by the Pure Lisp Theorem Prover (PLTP) developed by Robert Boyer and J Strother Moore from 1971 to 1973, which automated proofs about recursive functions using Lisp's inductive logic in seconds to minutes on contemporary hardware.[13] This work emphasized structural sharing and heuristic simplification to handle theorems like list reversal and sorting, establishing Lisp's role in scalable automated reasoning for AI.[13]
Dialect Evolution Through the 20th Century
In the 1980s, the Lisp ecosystem experienced significant fragmentation as various dialects proliferated to meet specialized needs in artificial intelligence research and development. Major implementations included ZetaLisp, developed for Symbolics Lisp machines, which extended earlier Lisp-Machine Lisp with features like the SETF macro for generalized assignment, the Flavors object system, and enhanced support for complex data structures such as multi-dimensional arrays.[7] Lisp Machines, Inc. (LMI), founded in 1979 to commercialize MIT's CADR design, pursued a dialect based on Maclisp, emphasizing portability and efficiency for AI applications on their Lambda machines.[7] Meanwhile, Scheme, originally conceived in 1975 by Gerald Sussman and Guy Steele for its minimalist design and lexical scoping, gained traction in the 1980s as a lightweight alternative focused on functional programming and teaching, distinguishing itself from more feature-rich dialects through its emphasis on tail recursion and first-class continuations.[7] This divergence, driven by hardware-specific optimizations and institutional preferences, prompted concerns from funding agencies like DARPA about interoperability, setting the stage for unification efforts.[7] Standardization initiatives emerged to address this fragmentation, beginning with Common Lisp in 1981 when representatives from MIT, Carnegie Mellon University (CMU), and Lisp machine vendors convened to design a unified dialect balancing expressiveness and portability.[14] The first draft, known as the "Swiss Cheese Edition," appeared in summer 1981, followed by a more complete version by early 1983, culminating in Guy Steele's Common Lisp: The Language (CLtL1) in 1984, which defined core semantics including dynamic typing, packages, and condition handling.[7] The ANSI X3J13 committee, formed in 1986, refined these through iterative drafts, incorporating the Common Lisp Object System (CLOS) in 1988 and finalizing the ANSI X3.226-1994 
standard in 1994 after extensive community input.[15] For Scheme, the Revised Fourth Report (R4RS), completed in spring 1988 and published in 1991, formalized portable features like hygienic macros and a core numeric tower, serving as the basis for the IEEE P1178 standard ratified that year.[16] The Revised Fifth Report (R5RS), released in 1998, extended this with multiple return values, dynamic-wind for exception safety, and refined exception handling, promoting wider adoption in educational and embedded contexts.[16] The late 1980s marked the decline of dedicated Lisp hardware, as the rise of affordable personal computers like those from Apple and IBM eroded the market for specialized machines. By 1987, general-purpose hardware had advanced sufficiently under Moore's Law to run Lisp interpreters and compilers effectively, rendering expensive Lisp machines obsolete for most applications and contributing to the second AI winter.[17] Companies such as Symbolics and LMI faced bankruptcy or pivoted away from hardware, with production ceasing by the early 1990s.[7] Amid these shifts, key events underscored the field's maturation: the Lisp and Symbolic Computation journal launched its first issue in 1988, providing a dedicated forum for research on dialects, macros, and symbolic processing.[18] In the 1990s, open-source efforts gained momentum, exemplified by CMU Common Lisp (CMUCL), which originated as Spice Lisp in 1980 at Carnegie Mellon for the SPICE multiprocessor, was ported to Unix workstations like the IBM RT PC by 1984, and was renamed in 1985.[19] CMUCL's compiler, rewritten in 1985 as the Python system, supported multiple architectures and influenced Common Lisp conformance; it evolved into Steel Bank Common Lisp (SBCL) via a 1999 fork, emphasizing maintainability and native compilation for broader accessibility.[20]
Developments from 2000 to 2025
In the early 2000s, Common Lisp experienced a revival through the development of Steel Bank Common Lisp (SBCL), a high-performance open-source implementation forked from CMU Common Lisp in 1999 and achieving significant stability by 2002 with enhanced native compilation and cross-platform support.[20] This implementation emphasized optimizing for modern hardware, including generational garbage collection and optimizing native-code compilation, making it suitable for production systems. Complementing SBCL, Quicklisp emerged in 2010 as a centralized package manager, simplifying library installation and dependency resolution across Common Lisp environments by hosting over 1,500 libraries in a single repository.[21] In the early 2000s, Paul Graham contributed significantly to Lisp's popularity through his influential essays. In "Beating the Averages" (2001), he described using Common Lisp to build Viaweb, an early web-based application later acquired by Yahoo!, portraying Lisp as a "secret weapon" for rapid prototyping and development due to its powerful macro system and flexibility. In "If Lisp is So Great" (2003), Graham explored Lisp's technical strengths, such as its homoiconicity and ability to treat code as data, while discussing reasons for its limited mainstream adoption despite these advantages. These essays helped revive interest in Lisp among startup founders, web developers, and the hacker community, influencing its perception in practical software engineering contexts.[22][23] Parallel to Common Lisp's resurgence, new dialects gained prominence for specific domains. Clojure, released in 2007 by Rich Hickey, integrated Lisp principles with the Java Virtual Machine (JVM), prioritizing immutable data structures and software transactional memory to address concurrency challenges in multi-threaded applications.
Similarly, Racket, renamed from PLT Scheme in 2010, evolved in the 2010s as a versatile platform for education and scripting, offering modular language extensions and a robust ecosystem for teaching programming concepts through domain-specific languages.[24] Lisp's historical ties to artificial intelligence persisted into the 21st century, building on its use in early neural network research, such as Yann LeCun's 1989 implementation of backpropagation in Lisp for convolutional networks.[25] In the 2020s, this legacy informed discussions on integrating Lisp with modern AI tools; for instance, the European Lisp Symposium (ELS) 2025 in Zürich featured talks on leveraging Common Lisp for large language models (LLMs) and processing big data volumes with flexible abstractions.[26] These presentations highlighted Lisp's symbolic manipulation strengths in hybrid AI systems combining neural and symbolic approaches.[27] The Lisp community sustained momentum through recurring events like the International Lisp Conference (ILC), which from 2000 onward facilitated advancements in language implementations and applications, with editions in 2002 and 2005 focusing on practical deployments.[28] By 2025, symposium discussions emphasized Lisp's long-term stability for AI projects, citing its unchanged ANSI standard since 1994 and mature implementations as advantages for maintaining codebases over decades in evolving AI landscapes.[29] Recent tooling advancements further bolstered Lisp's utility. Roswell, a Common Lisp environment manager, saw updates through 2023–2025, including improved implementation switching and script distribution features via its GitHub repository, aiding developers in reproducible setups. 
Additionally, integrations with WebAssembly enabled web deployment; by 2025, projects like Web Embeddable Common Lisp allowed running Common Lisp code natively in browsers through compiled modules, supporting interactive applications without JVM dependencies.[30] Lisp found niche growth in specialized areas. In game development, the GOAL dialect (originally created by Naughty Dog in the late 1990s using Allegro Common Lisp) received modern extensions through the open-goal project, porting games like Jak and Daxter to PC while preserving Lisp's high-level expressiveness for low-level engine scripting.[31] For embedded systems, Lisp variants like MakerLisp (2019) and LambLisp (2025) targeted real-time control, offering lightweight interpreters embeddable in C++ environments for IoT edge devices and robotics.[32][33]
Dialects and Implementations
Major Historical Dialects
Maclisp, developed in the mid-1960s at MIT's Project MAC for the DEC PDP-6 and PDP-10 computers, became a foundational dialect for AI research due to its extensions supporting advanced symbolic computation.[34] It introduced arbitrary-precision integer arithmetic and syntax extensibility through parse tables and macros, enabling flexible language customization.[34] For AI applications, Maclisp incorporated streams for input/output operations, which facilitated portable data handling across systems, and a 1973 compiler that generated efficient numerical code comparable to FORTRAN, driven by needs in projects like MACSYMA.[34] These features, including support for embedded languages such as PLANNER and CONNIVER, made Maclisp central to early AI development through the 1970s and 1980s.[34] Interlisp, originating in 1966 at Bolt, Beranek and Newman and evolving through the 1970s and 1980s, emphasized interactive programming environments tailored for AI experimentation.[35] Its key innovation was the DWIM (Do What I Mean) facility, an error-correction tool that analyzed and fixed common user mistakes during debugging by inferring intended actions, significantly enhancing productivity in exploratory coding.[35] Interlisp provided comprehensive interactive tools, such as an editor and the BREAK package, allowing programmers to manipulate code as symbolic expressions and enabling programs to analyze or modify other programs dynamically.[35] These capabilities, pioneered under Warren Teitelman's influence, positioned Interlisp as a leader in user-friendly AI systems until the 1980s.[35] ZetaLisp, developed in the 1970s for Symbolics Lisp Machines, optimized Lisp for high-performance AI and CAD applications on dedicated hardware.[36] It featured incremental compilation, automatic memory management, and hardware-supported type checking and garbage collection, enabling efficient interactive development.[36] A standout contribution was the Flavors object system, an early
message-passing framework with multiple inheritance and generic functions, which allowed non-hierarchical object structures and influenced subsequent Lisp object-oriented designs.[36] Implemented on Symbolics machines like the 3600 and 3670 from the late 1970s through the 1980s, ZetaLisp integrated with the Genera environment for advanced windowing and debugging, solidifying its role in professional AI programming.[36] T, a dialect of Scheme developed in the early 1980s at Yale but rooted in 1970s research, prioritized lexical scoping and continuations for expressive control flow.[37] It implemented full first-class continuations through optimized tail recursion and a CATCH mechanism for non-local exits, allowing programmers to capture and manipulate control contexts dynamically.[37] Building on Steele and Sussman's Scheme prototypes, T tested efficient implementation on conventional architectures like the VAX and MC68000, with portable interpreters and compilers that maintained compatibility between interpreted and compiled code.[37] This focus on continuations made T influential among the minimalist, continuation-oriented dialects that shaped the transition toward modern Lisp variants.[37] The proliferation of dialects like Maclisp, Interlisp, ZetaLisp, and T from the 1960s through the 1980s, each optimized for specific hardware such as the PDP-10 or Lisp Machines, resulted in incompatible features and syntax, severely limiting code portability across systems.[34] Hardware dependencies, like the PDP-10's 36-bit words or TENEX OS specifics, compounded fragmentation, making it challenging for AI researchers to share programs.[34] These portability issues motivated the formation of the Common Lisp effort in 1981, which unified key elements from these dialects into a standardized, portable language to resolve divergences and support broader adoption.[34]
Standardized Dialects
The ANSI Common Lisp standard, formally known as ANSI INCITS 226-1994 (reaffirmed in 1999), defines a comprehensive dialect of Lisp designed to promote portability of programs across diverse data processing systems.[38] This nearly 1,100-page specification outlines over 1,000 functions, macros, and variables, providing a robust foundation for general-purpose programming.[39] Key elements include the condition system, which enables advanced error handling through signaling conditions and establishing handlers without immediate stack unwinding, allowing for restarts and interactive debugging. Additionally, it incorporates the Common Lisp Object System (CLOS), an integrated object-oriented framework supporting multiple inheritance, multimethods, and dynamic class redefinition.[40] Compliance with the ANSI Common Lisp standard is verified through test suites such as the RT regression testing library, originally developed at MIT and extended for the GNU Common Lisp (GCL) ANSI test suite, which encompasses over 20,000 individual tests covering core language features and libraries.[41][42] Prominent implementations adhering to this standard include Steel Bank Common Lisp (SBCL), a high-performance native-code compiler, and GNU CLISP, which supports both interpreted and compiled execution modes.[43] In contrast, Scheme's standardization emphasizes minimalism and elegance, with the Revised Fifth Report on Scheme (R5RS), published in 1998, defining a core language that is statically (lexically) scoped and requires proper tail-call optimization to support efficient recursion without stack overflow.[44] This 60-page report focuses on a small set of primitives, first-class procedures, and continuations, making it ideal for teaching and exploratory programming while ensuring portability.[44] The subsequent Revised Sixth Report on Scheme (R6RS), ratified in 2007, expands on R5RS by introducing a modular library system, Unicode support, exception handling, and enhanced data 
structures like bytevectors, while maintaining lexical scoping and tail-call requirements but adding phases for compile-time and run-time separation.[45] Scheme implementations compliant with these standards include GNU Guile, an extensible library for embedding Scheme in applications, and Chez Scheme, a high-performance compiler with full R6RS support and optimizations for production use.[46][47] The primary distinction between these standards lies in their philosophical approaches: ANSI Common Lisp offers a feature-rich environment suited for large-scale systems development, with extensive built-in facilities like CLOS and the condition system, whereas R5RS and R6RS prioritize simplicity and a minimal core, facilitating easier implementation and use in educational contexts, though at the cost of requiring more external libraries for advanced functionality.[39][44][45]
Modern Dialects and Implementations
In the 21st century, Lisp dialects have adapted to modern computing environments, emphasizing interoperability with mainstream platforms, enhanced concurrency models, and performance optimizations. Clojure, first released in 2007 by Rich Hickey, is a functional Lisp dialect designed for the Java Virtual Machine (JVM), featuring persistent immutable data structures as core types to facilitate safe concurrent programming.[48][49] It incorporates software transactional memory (STM) for handling concurrency, allowing atomic updates to shared state without traditional locks, and interoperates seamlessly with Java libraries through direct access to JVM classes and methods.[48][49] Racket, evolving from PLT Scheme and renamed in 2010, serves as a multi-paradigm platform for language-oriented programming, supporting libraries for domains such as documentation (Scribble), testing (RackUnit), and graphics (packages like pict and slideshow).[50] Its ecosystem includes DrRacket, an integrated development environment that promotes educational use by providing interactive teaching languages and visualization tools for beginners.[50] Racket's design facilitates the creation of domain-specific languages through its powerful macro system, making it suitable for both research and pedagogy.[50] For Common Lisp, Steel Bank Common Lisp (SBCL), forked from CMU Common Lisp in 1999 and actively developed since the early 2000s, stands out as a high-performance implementation with an optimizing native code compiler that generates machine code for multiple architectures. It employs advanced type inference to enable optimizations like dead code elimination and precise garbage collection, often achieving speeds comparable to C in numerical computations.[51] SBCL supports features such as native threads on Unix-like systems and foreign function interfaces for C libraries, broadening its applicability in systems programming.
Recent advancements include the integration of Chez Scheme into Racket in 2019, where Racket was rebuilt atop Chez's efficient compiler to leverage its nanopass intermediate representation for faster execution and compilation times, passing the full Racket test suite while maintaining compatibility.[52] Additionally, Wisp, introduced in 2015 (SRFI-119) as an indentation-sensitive preprocessor for Scheme and other Lisps, transforms whitespace-based syntax into standard S-expressions, aiming to improve readability while preserving homoiconicity and macro expressiveness.[53] Cross-platform portability has advanced through WebAssembly (Wasm) support in various Lisp implementations during the 2020s, enabling browser-based execution; for instance, projects like uLisp and Medley Interlisp have compiled to Wasm for embedded and web environments, allowing Lisp code to run efficiently in sandboxes without native plugins.[54]
Core Language Features
Syntax: Symbolic Expressions and Lists
Lisp's syntax revolves around symbolic expressions, or S-expressions, which serve as the fundamental units for both data and code representation. An S-expression is defined recursively: it is either an atomic symbol—such as a string of capital letters and digits—or a compound expression formed by an ordered pair of two S-expressions, denoted as (e1 · e2), where e1 and e2 are themselves S-expressions.[5] In practice, this structure uses parentheses to enclose lists, making expressions like (add 1 2) a typical form, where "add" is an atomic symbol and 1 and 2 are numeric atoms.[5] At the core of Lisp's list structure is the cons cell, a primitive data constructor that builds linked lists by pairing a head element (the car) with a tail (the cdr), represented as cons[e1; e2] or (e1 · e2).[5] Lists are chains of cons cells terminating in the empty list, denoted as NIL, an atomic symbol that represents both the empty list and false in logical contexts.[5] For instance, the list (A B C) abbreviates to (A · (B · (C · NIL))), forming a proper list that ends in NIL; improper lists, by contrast, terminate in a non-list value, such as (A · B), which is a cons cell rather than a true list.[5] Lisp employs a prefix notation for its reader syntax, where operators precede their arguments without infix operators or precedence rules, ensuring uniform parsing of all expressions as nested lists.[5] To treat an S-expression literally without evaluation, Lisp provides quoting, originally expressed as (QUOTE E) to yield E unchanged, later abbreviated in many dialects as 'E.[5] This mechanism allows direct manipulation of symbolic structures. 
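The structures described above can be observed directly at a REPL; a brief Common Lisp sketch:

```lisp
;; A list is a chain of cons cells ending in NIL.
(cons 'a (cons 'b (cons 'c nil)))  ; => (A B C)

;; CAR and CDR retrieve the head and tail of a cons cell.
(car '(a b c))   ; => A
(cdr '(a b c))   ; => (B C)

;; A dotted pair is an "improper" list: its CDR is not a list.
(cons 'a 'b)     ; => (A . B)

;; QUOTE suppresses evaluation, yielding the S-expression itself.
(quote (add 1 2))  ; => (ADD 1 2)
'(add 1 2)         ; => (ADD 1 2), the abbreviated form
```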
A defining feature of Lisp's syntax is its homoiconicity, where the code itself is represented as S-expressions identical to data lists, enabling seamless programmatic inspection and transformation of program structure.[5] For example, a function definition like (LABEL F (LAMBDA (X) (CONS X X))) is an S-expression that can be processed as a list, with its elements accessible via standard list operations.[5] This uniformity underpins Lisp's flexibility in symbolic computation, distinguishing it from languages with separate syntactic forms for code and data.
Semantics: Evaluation and Quoting
Lisp's interactive development environment is centered around the read-eval-print loop (REPL), a cycle that reads user input as S-expressions, evaluates them, and prints the results, enabling rapid prototyping and experimentation.[55] This loop, formalized in early Lisp systems like LISP 1.5, processes expressions sequentially until termination, with evaluation occurring in a specified environment.[55] The core evaluation semantics, defined in the original Lisp design, operate via the eval function, which takes an expression and an environment (a list associating symbols with values) and returns the expression's value.[56] Atoms are handled directly: numbers yield themselves, while symbols are looked up in the environment to retrieve their bound values, or an error occurs if unbound.[56] For lists (non-atomic S-expressions), evaluation first applies eval to the operator (the first element); if it is a special form, special rules apply; otherwise, the remaining elements (arguments) are evaluated via evlis, and the operator is applied to those values using apply.[56][55]
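The eval/apply cycle can be made concrete with a toy evaluator for a tiny subset of the language (numbers, symbols, quote, if, and calls to built-in functions). This is an illustrative Common Lisp sketch in the LISP 1.5 style, not the evaluator of any particular implementation; toy-eval is a hypothetical name:

```lisp
(defun toy-eval (expr env)
  "Evaluate EXPR in ENV, an association list of (symbol . value) pairs."
  (cond ((numberp expr) expr)                     ; numbers are self-evaluating
        ((symbolp expr) (cdr (assoc expr env)))   ; symbols are looked up in ENV
        ((eq (first expr) 'quote)                 ; QUOTE returns its argument
         (second expr))                           ; unevaluated
        ((eq (first expr) 'if)                    ; IF evaluates only one branch
         (if (toy-eval (second expr) env)
             (toy-eval (third expr) env)
             (toy-eval (fourth expr) env)))
        (t (apply (symbol-function (first expr))  ; ordinary call: evaluate the
                  (mapcar (lambda (a)             ; arguments (an evlis), then
                            (toy-eval a env))     ; apply the operator
                          (rest expr))))))

(toy-eval '(if (> x 1) (+ x 2) 0) '((x . 5)))   ; => 7
```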
Special forms like quote, lambda, and if (or its predecessor cond) bypass standard function application to enable conditional execution, function creation, and unevaluated structures.[56] The quote form, (quote <datum>) or abbreviated ' <datum>, returns its argument unevaluated as a literal S-expression, preventing recursive evaluation of lists or symbols.[56] Lambda constructs a procedure from parameters and body without evaluating the body immediately, binding parameters dynamically upon application.[56] If evaluates its predicate; if true, it evaluates and returns the consequent, skipping the alternate; if false, it evaluates the alternate or returns a default.[55]
Quasiquotation extends quoting for templating, introduced informally in the 1970s and standardized later, using backquote syntax `<template> to quote a structure while allowing unquoting with comma (, <expr>) to insert evaluated subexpressions and splicing with comma-at (,@ <expr>) to insert lists inline.[57] For example, `(+ ,x ,@y) evaluates to a list like (+ 1 2 3) if x is 1 and y is (2 3), facilitating code generation.[57]
Lisp dialects vary in environment models: early systems and Common Lisp's special variables use dynamic scoping, where bindings are resolved at runtime based on the current call stack's association list, allowing outer bindings to affect inner functions unexpectedly.[55] In contrast, Scheme employs lexical scoping, where variable bindings are determined by the static program structure, ensuring a function captures the environment in which it was defined, promoting referential transparency.[58] Common Lisp supports both, with lexical as default for non-special variables and dynamic for declared specials.
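The contrast between the two regimes can be demonstrated in Common Lisp, which supports both; a sketch with illustrative names:

```lisp
;; Lexical scoping: the function captures the binding in force
;; where it was defined, not where it is called.
(let ((x 1))
  (defun read-lexical () x))
(let ((x 2))
  (read-lexical))     ; => 1

;; Dynamic scoping: DEFVAR declares *y* special, so the most
;; recent binding on the call stack is visible instead.
(defvar *y* 1)
(defun read-dynamic () *y*)
(let ((*y* 2))
  (read-dynamic))     ; => 2
```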
Functions, Lambdas, and Closures
Lisp embodies a functional programming paradigm where functions are first-class objects, meaning they can be passed as arguments to other functions, returned as results, and assigned to variables, a design directly inspired by the lambda calculus as formalized in John McCarthy's foundational work.[5] This approach allows for concise expression of computations through composition and abstraction, distinguishing Lisp from imperative languages of its era that treated functions as second-class entities.[5] Anonymous functions in Lisp are defined using lambda expressions, which take the form (lambda (parameters) body), where parameters specifies the arguments and body contains the expressions to evaluate. This syntax, rooted in Church's lambda calculus and adapted by McCarthy for symbolic computation, enables the creation of functions without names, facilitating higher-order programming.[5] For instance, a simple lambda to compute the square of a number is (lambda (x) (* x x)), which can be immediately applied or stored. Named functions, in contrast, are typically defined using the defun macro in dialects like Common Lisp, as (defun name (parameters) body), which expands to a lambda expression and establishes a global binding.[59] Function application occurs via built-in operators such as funcall, which invokes a function with explicit arguments, or apply, which spreads a list of arguments to the function, supporting dynamic invocation essential for meta-programming.
A key feature enabled by lambda expressions is the formation of closures, where a function captures and retains the lexical environment in which it was defined, allowing access to non-local variables even after the defining scope has exited.[60] In Common Lisp, which adopts lexical scoping influenced by Scheme, evaluating (function (lambda (x) ...)) or a lambda form produces such a closure, preserving bindings from the surrounding context.[61] This mechanism supports higher-order functions that generate customized closures, such as a counter: (let ((count 0)) (lambda () (incf count))), which maintains its internal state across invocations.[60]
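Such a counter is conventionally produced by a generator function so that each call yields independent state; a minimal Common Lisp sketch (make-counter is an illustrative name):

```lisp
(defun make-counter ()
  ;; COUNT is captured by the returned closure; each call to
  ;; MAKE-COUNTER creates a fresh, independent binding.
  (let ((count 0))
    (lambda () (incf count))))

(defparameter *c* (make-counter))
(funcall *c*)            ; => 1
(funcall *c*)            ; => 2
(funcall (make-counter)) ; => 1, a separate counter
```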
Recursion serves as the primary control mechanism in Lisp for iterative processes, particularly in list manipulation, eschewing explicit loops in favor of self-referential function calls for elegance and alignment with mathematical definitions.[5] For example, the classic factorial function is defined recursively as (defun factorial (n) (if (<= n 1) 1 (* n (factorial (- n 1))))). In Scheme, a standardized dialect, implementations must support proper tail recursion, ensuring that tail calls—where the recursive invocation is the last operation—do not consume additional stack space, enabling efficient unbounded recursion akin to iteration.
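An accumulator-passing variant makes the recursive call a tail call. In Scheme this is guaranteed to run in constant stack space; many Common Lisp compilers perform the same optimization, though the ANSI standard does not require it. A sketch (factorial-iter is an illustrative name):

```lisp
;; The recursive call is the last operation, so the stack frame
;; can be reused by a tail-call-optimizing implementation.
(defun factorial-iter (n &optional (acc 1))
  (if (<= n 1)
      acc
      (factorial-iter (- n 1) (* n acc))))

(factorial-iter 5)   ; => 120
```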
Higher-order functions further exemplify Lisp's functional strengths, accepting other functions as arguments to process collections like lists. The mapcar function applies a given function to each element of a list (or multiple lists), returning a new list of results, as in (mapcar (lambda (x) (* x 2)) '(1 2 3)) yielding (2 4 6).[62] Equivalents for filtering, such as remove-if-not, retain elements satisfying a predicate: (remove-if-not (lambda (x) (evenp x)) '(1 2 3 4)) produces (2 4). These utilities, integral to list processing, underscore Lisp's homoiconic nature where functions operate seamlessly on symbolic data structures.[62]
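These utilities compose naturally into pipelines; for example, keeping the even numbers, doubling them, and summing the result with reduce:

```lisp
;; Filter, transform, then fold: (2 4) -> (4 8) -> 12.
(reduce #'+
        (mapcar (lambda (x) (* x 2))
                (remove-if-not #'evenp '(1 2 3 4 5))))
;; => 12
```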
Control Structures and Macros
Lisp's control structures support conditional execution and iteration through macros that integrate seamlessly with its list-based syntax. The cond macro in Common Lisp evaluates a series of clauses, each consisting of a test form followed by zero or more consequence forms; it returns the results of the first successful clause's consequences or nil if none succeed.[63] Scheme's cond operates analogously, treating an else clause specially if its test is the symbol else. For dispatching on discrete values, Common Lisp's case macro compares a key form against a list of clause keys using eql, executing the matching clause's body; variants ccase and ecase signal errors if no match occurs.[64] Scheme provides a similar case that uses eqv? for comparisons and defaults to an else clause.
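Both forms can be sketched in Common Lisp as follows (classify and day-type are illustrative names):

```lisp
;; cond returns the value of the first clause whose test succeeds;
;; a final t clause acts as the default.
(defun classify (n)
  (cond ((< n 0) 'negative)
        ((= n 0) 'zero)
        (t       'positive)))

;; case compares the key against each clause's keys with eql.
(defun day-type (day)
  (case day
    ((saturday sunday) 'weekend)
    (otherwise         'weekday)))

(classify -3)        ; => NEGATIVE
(day-type 'sunday)   ; => WEEKEND
```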
Iteration in Lisp emphasizes flexibility over rigid loops. Common Lisp's loop macro offers an extensible domain-specific language for complex iterations, supporting variable initialization, stepping, termination tests, accumulation (e.g., sum, collect), and nesting across collections like lists or numbers. For simpler imperative loops, do binds variables with initial values, executes a body until an end-test succeeds, and applies step forms after each iteration; do* evaluates bindings and steps sequentially rather than in parallel.[65] Scheme's do mirrors this structure, initializing variables, testing for termination before body execution, and updating via step expressions, with the final value from a result expression upon exit.
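The two Common Lisp iteration styles might be contrasted by collecting the squares of the even numbers from 1 to 10 (a sketch):

```lisp
;; loop: declarative iteration with accumulation clauses.
(loop for i from 1 to 10
      when (evenp i)
        collect (* i i))          ; => (4 16 36 64 100)

;; do: explicit parallel bindings, an end-test, and step forms.
(do ((i 1 (1+ i))
     (acc '() (if (evenp i) (cons (* i i) acc) acc)))
    ((> i 10) (nreverse acc)))    ; => (4 16 36 64 100)
```

Note that do updates its bindings in parallel, so each step form sees the previous iteration's values; do* would instead evaluate them sequentially.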
Lisp's macro system empowers users to create custom special forms that expand at compile time, effectively extending the language's syntax without runtime overhead. In Common Lisp, defmacro defines a macro as a lambda-like form that takes arguments and produces code to replace invocations, evaluated during compilation or interpretation. For example, the when macro implements conditional execution without an else branch:
(defmacro when (condition &body body)
`(if ,condition (progn ,@body)))
This expands (when (> x 0) (print x) (incf x)) to (if (> x 0) (progn (print x) (incf x))), splicing the body forms into a progn for multiple statements. Macros perform this expansion before runtime, enabling optimizations and new abstractions like domain-specific control flows.
Macro authoring relies on tools to build and manipulate code templates safely. Common Lisp's backquote (`) creates quasi-quoted lists that preserve structure, while comma (,) unquotes expressions for evaluation within the template, and ,@ splices lists; this combination simplifies generating code with dynamic parts.[66] To prevent variable capture—where a macro's temporary variables shadow user-defined ones—gensym generates unique symbols (e.g., #:G1234) uninterned in any package, ensuring no clashes during expansion. For instance, a macro defining a local let-binding must use gensym for its internal variables to avoid capturing external bindings with the same name. In contrast, Scheme's define-syntax with syntax-rules produces hygienic macros that automatically rename bound identifiers to avoid unintended captures, preserving lexical scoping without manual intervention. This hygiene in Scheme contrasts with Common Lisp's non-hygienic approach, where explicit techniques like gensym are required for safety, though Scheme's system limits some low-level manipulations possible in Common Lisp.[67]
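Capture avoidance with gensym can be illustrated with a small swap macro (purely illustrative; Common Lisp's built-in rotatef already does this job):

```lisp
;; Without gensym, a hard-coded temporary named tmp could capture
;; a user variable that happens to share that name.
(defmacro swap (a b)
  (let ((tmp (gensym)))          ; a fresh, uninterned symbol
    `(let ((,tmp ,a))
       (setf ,a ,b)
       (setf ,b ,tmp))))

(let ((x 1) (y 2))
  (swap x y)
  (list x y))   ; => (2 1)
```

Because tmp is bound to a gensym, the expansion works even in a scope where the user already has a variable named tmp.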
Advanced Paradigms
Object Systems in Lisp
Lisp's object-oriented capabilities evolved significantly from early experimental systems in the 1970s to standardized frameworks in the 1980s and beyond. Initial developments occurred within the MIT Lisp Machine environment, where ZetaLisp incorporated Flavors, an object system introduced in 1979 that supported multiple inheritance and message-passing mechanisms influenced by AI research tools like Planner and Conniver.[7][68] Flavors, developed by David A. Moon and Howard Cannon, integrated into programming environments for tasks such as window systems and emphasized non-hierarchical object structures.[7] By the mid-1980s, as Common Lisp standardization efforts advanced under the X3J13 committee formed in 1986, these ideas merged with Xerox PARC's CommonLoops to form the Common Lisp Object System (CLOS), finalized in the ANSI Common Lisp standard in 1994.[69][7] CLOS, developed primarily in the 1980s through collaborative efforts involving Gregor Kiczales, Daniel G. Bobrow, and others, provides a robust object-oriented extension to Common Lisp centered on classes, methods, and generic functions.[70] Classes are defined using the defclass macro, which specifies slots—named storage units for instance data—with options for allocation (local to instances or shared across them) and automatic generation of accessor methods.[70] Inheritance supports multiple superclasses, resolved via a class precedence list to handle conflicts, with all classes forming a directed acyclic graph rooted at the universal superclass t.[70] Generic functions serve as the core dispatch mechanism, allowing behavior to vary based on the classes of all arguments through multiple dispatch, rather than single-argument method calls typical in other systems.[70] Methods, defined with defmethod, specialize on parameter classes or quoted objects and can employ qualifiers like :before, :after, and :around for method combination strategies that compose behaviors flexibly, such as short-circuiting or wrapping primary methods.[70]
This design integrates seamlessly with Lisp's functional paradigm, treating objects, classes, generic functions, and methods as first-class entities that can be manipulated like any Lisp data.[71] Generic functions extend Common Lisp's procedural abstraction, enabling polymorphism across multiple arguments while preserving closures and higher-order functions; for instance, methods can capture lexical environments, blending object state with functional composition.[71] CLOS also aligns with Lisp's type hierarchy, where built-in types like float (with subtypes such as single-float and double-float) coexist with user-defined classes, supporting runtime type selection without disrupting existing code.[71]
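A minimal CLOS sketch (shape, circle, rectangle, and area are illustrative names) showing defclass, a generic function, and class-specialized methods:

```lisp
(defclass shape () ())

(defclass circle (shape)
  ((radius :initarg :radius :accessor radius)))

(defclass rectangle (shape)
  ((width  :initarg :width  :accessor width)
   (height :initarg :height :accessor height)))

;; area dispatches on the class of its argument.
(defgeneric area (s))
(defmethod area ((s circle))    (* pi (radius s) (radius s)))
(defmethod area ((s rectangle)) (* (width s) (height s)))

(area (make-instance 'rectangle :width 3 :height 4))   ; => 12
```

Methods specializing on the classes of several arguments at once would give the multiple dispatch described above.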
In contrast, Scheme dialects offer less standardized object support, relying on libraries or extensions rather than built-in systems. SRFI-9, a Scheme Request for Implementation finalized in 2000, introduces record types via define-record-type, which define structured data with constructors, predicates, accessors, and optional modifiers, enabling object-like encapsulation without full inheritance or dispatch.[72] These records provide distinct identities separate from core Scheme types, serving as a foundation for ad-hoc object-oriented patterns in implementations like Guile or Racket, though more advanced systems often build atop them using closures for methods.[72]
Metaprogramming and Code as Data
Lisp's homoiconicity, a core feature originating from its design, enables the representation of programs as data structures—specifically, S-expressions—which are lists that serve as both source code and manipulable objects. This allows developers to inspect and modify the abstract syntax tree (AST) directly, as the AST is structurally identical to the list-based code representation. At runtime, functions like eval can execute dynamically generated lists as code, while at compile-time, macros facilitate structural transformations of these lists for optimization or extension.[73][74]
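The code-as-data relationship can be seen directly at the REPL; this sketch builds a form as an ordinary list, rewrites it with list operations, and then executes it:

```lisp
;; A program fragment built as a plain list:
(defvar *form* (list '+ 1 2 3))
*form*          ; => (+ 1 2 3)

;; Rewriting the "AST" with ordinary list operations:
(setf *form* (cons '* (rest *form*)))
*form*          ; => (* 1 2 3)

;; Executing the transformed list as code:
(eval *form*)   ; => 6
```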
Reader macros extend this capability by customizing the Lisp reader's parsing of input streams, associating special characters with functions that transform raw input into Lisp objects before standard evaluation. For instance, the #| character sequence initiates a multi-line comment that the reader skips until a matching |#, effectively ignoring the enclosed text during parsing. This mechanism supports metaprogramming by allowing the creation of domain-specific notations embedded within Lisp code, such as custom infix operators or string interpolation, without altering the core language syntax.[75][76]
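A classic reader-macro sketch makes square brackets read as list construction (illustrative only; globally modifying the readtable like this is rarely advisable in real code):

```lisp
;; After these definitions, the reader turns [1 2 (+ 1 2)]
;; into (list 1 2 (+ 1 2)), which evaluates to (1 2 3).
(set-macro-character #\[
  (lambda (stream char)
    (declare (ignore char))
    (cons 'list (read-delimited-list #\] stream t))))

;; Make #\] behave like a closing parenthesis so that
;; read-delimited-list stops at it.
(set-macro-character #\] (get-macro-character #\)))
```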
The eval function further empowers metaprogramming by evaluating arbitrary forms in the current environment, facilitating the implementation of domain-specific languages (DSLs) through programmatic code generation and execution. Developers can construct lists representing DSL syntax—leveraging Lisp's list manipulation primitives like cons and append—and then invoke eval to interpret them as executable Lisp code, enabling embedded languages for configuration, querying, or scripting within applications. This approach contrasts with static compilation in other languages, as it supports runtime DSL evolution while maintaining full integration with the host Lisp environment.
Compiler macros provide compile-time metaprogramming by offering optional expansions for function calls, allowing selective optimizations such as inlining small functions or constant folding to reduce runtime overhead. Unlike ordinary macros, compiler macros are invoked only if they enhance efficiency, and they can produce side effects during compilation, such as logging or conditional code generation based on declarations. This enables advanced code transformations, like specializing arithmetic operations for known types, directly on the AST lists at compile time.[77]
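A compiler-macro sketch (square is an illustrative function; the gensym-bound temporary keeps the argument from being evaluated twice in the expansion):

```lisp
(defun square (x) (* x x))

;; The compiler may substitute this expansion for calls to square,
;; avoiding the overhead of a full function call.
(define-compiler-macro square (x)
  (let ((g (gensym)))
    `(let ((,g ,x))
       (* ,g ,g))))

;; (square (compute y)) may then compile roughly as
;; (let ((#:g (compute y))) (* #:g #:g)).
```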
Dialects exhibit variations in metaprogramming power: Common Lisp grants unrestricted access to the full macro system for direct AST manipulation, permitting non-hygienic expansions that can intentionally capture identifiers for advanced effects. In contrast, Scheme enforces hygiene in its syntax-rules macros to prevent accidental variable capture, ensuring that macro-introduced identifiers do not interfere with the surrounding lexical scope, though this constrains flexibility compared to Common Lisp's approach. These differences reflect trade-offs between safety and expressive power in code-as-data manipulation.[78]
Concurrency and Parallelism
Lisp dialects have evolved to support concurrency and parallelism through libraries and language features that address shared-state synchronization and task distribution, often building on the language's functional and dynamic nature. These mechanisms enable multi-threaded execution while mitigating risks like race conditions, though implementations vary by dialect due to differences in runtime environments and standards. In Common Lisp, concurrency is primarily facilitated by the Bordeaux-Threads library, a portable interface providing primitives such as threads, mutexes, condition variables, and semaphores for shared-state synchronization across implementations like SBCL and Clozure CL.[79] For parallelism, the lparallel library offers high-level constructs including task farms, kernel submissions, and promise-based futures, allowing efficient distribution of computations over thread pools without direct thread management.[80] Clojure emphasizes immutable data and software transactional memory (STM) for safe concurrent state management. 
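Before turning to Clojure's model, the Common Lisp primitives just mentioned can be sketched as follows, assuming Bordeaux-Threads has been loaded (e.g., via Quicklisp) under its conventional bt: nickname:

```lisp
(defvar *lock* (bt:make-lock))
(defvar *counter* 0)

(defun worker ()
  (dotimes (i 1000)
    ;; with-lock-held serializes access to the shared counter,
    ;; preventing lost updates from concurrent increments.
    (bt:with-lock-held (*lock*)
      (incf *counter*))))

;; Spawn four threads and wait for them all to finish.
(let ((threads (loop repeat 4 collect (bt:make-thread #'worker))))
  (mapc #'bt:join-thread threads))

*counter*   ; => 4000
```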
Atoms provide uncoordinated, synchronous updates to single identities via compare-and-set operations, ensuring atomicity without locks.[81] Refs enable coordinated, synchronous changes across multiple identities through transactions, while agents support asynchronous, independent updates to individual locations, queuing actions for sequential execution by a thread pool.[82] Complementing these, the core.async library introduces channels for communicating sequential processes, inspired by Go's model, enabling non-blocking asynchronous communication and multiplexing via go blocks that park rather than block threads.[83] Scheme supports concurrency via SRFI-18, a standard specifying multithreading with threads, mutexes, condition variables, and time-based operations, allowing implementations to provide portable thread creation and synchronization.[84] In Gambit-C, an implementation of Scheme, lightweight green threads are managed entirely by the runtime, enabling cooperative multitasking with millions of threads on a single OS thread, suitable for I/O-bound or server applications without relying on host OS threading.[85] A key challenge in Lisp concurrency arises from garbage collection (GC), which can introduce stop-the-world pauses that disrupt real-time or latency-sensitive applications, particularly in generational collectors like those in SBCL where major collections may halt all threads for tens of milliseconds.[86] Some dialects and libraries explore actor models for message-passing concurrency to avoid shared mutable state; for instance, actor systems built on Common Lisp using Bordeaux-Threads isolate state within actors, reducing synchronization overhead.[87] In the 2020s, advances in WebAssembly (Wasm) integration have enabled Lisp implementations to leverage browser and edge computing environments with emerging threading support, such as shared memory and atomic operations in Wasm's threads proposal, facilitating distributed parallelism in dialects like 
those targeting Wasm runtimes for serverless applications.[88]
Applications and Use Cases
Role in Artificial Intelligence
Lisp played a pivotal role in the inception of artificial intelligence research, providing a foundation for symbolic computation that influenced early AI programs. Although the Logic Theorist, developed in 1956 by Allen Newell, Herbert A. Simon, and Cliff Shaw using the Information Processing Language (IPL), predated Lisp and demonstrated automated theorem proving as a search problem, its concepts of heuristic search and symbolic manipulation directly inspired subsequent AI efforts.[89] John McCarthy's development of Lisp in 1958 was motivated by the need for a language supporting recursion and list processing to formalize such algorithms, making it the de facto tool for AI experimentation.[1] A landmark early application was ELIZA, Joseph Weizenbaum's 1966 natural language processing program simulating a psychotherapist, originally implemented in MAD-SLIP on MIT's MAC system with later ports to Lisp; it showcased pattern-based dialogue generation.[90] In the 1980s, Lisp dominated the expert systems boom, leveraging specialized Lisp machines for efficient symbolic processing and garbage collection. These systems, such as OPS5—a production rule language for rule-based reasoning—were implemented in Lisp interpreters and used for applications like diagnostic and planning tasks in AI.[91] Similarly, Intellicorp's Knowledge Engineering Environment (KEE), a frame-based tool for building knowledge bases, ran on Lisp machines like the Symbolics 3670, enabling graphical modeling of rules, objects, and inference engines for commercial expert systems.[92] Lisp machines facilitated rapid prototyping and deployment, with hardware optimizations for list operations supporting the era's optimism around knowledge representation that preceded the AI winter.[93] Lisp's strength in symbolic reasoning stems from its homoiconic nature, where code and data are interchangeable lists, facilitating pattern matching and planning algorithms central to AI.
The LISP70 system, for instance, introduced pattern-directed computation via rewrite rules, allowing flexible symbolic manipulation for tasks like theorem proving and natural language understanding.[94] In AI planning, Lisp enabled representations of states and actions as nested lists, with pattern matching to unify goals and preconditions, as exemplified in early planners like STRIPS, which influenced hierarchical task networks and partial-order planning.[95] In the 2020s, Lisp continues to contribute to symbolic AI, particularly in hybrid systems integrating large language models (LLMs) for enhanced reasoning. Architectures like persistent Lisp REPLs allow LLMs to dynamically generate and execute Lisp code for symbolic tasks, bridging neural pattern recognition with logical inference.[96] Libraries such as CLML provide machine learning tools in Common Lisp, supporting statistical methods like clustering and neural networks alongside symbolic extensions for interpretable AI.[97] The European Lisp Symposium 2025 highlighted these trends, featuring a keynote on Lisp's relevance in the AI era, a paper on deep learning in Common Lisp using frameworks like MGL, and a round table on Lisp's AI applications, underscoring its role in semantic processing and open reasoning paradigms.[26]
Operating Systems and Embedded Applications
Lisp machines, developed in the late 1970s and 1980s by organizations such as MIT's Artificial Intelligence Laboratory and companies including Symbolics and Lisp Machines Incorporated (LMI), featured operating systems entirely implemented in Lisp dialects like ZetaLISP. These systems, such as Symbolics' Genera released around 1982, integrated the operating system, utilities, and programming environment into a cohesive Lisp-based framework, supporting multiple independent processes within a single address space via an event-driven scheduler.[98][99] Genera pioneered innovations like a unified virtual memory space where functions and data were treated as structured Lisp objects, enabling automatic garbage collection for storage management and hardware-assisted type checking for reliability.[99] Its Generic Network System provided seamless protocol-agnostic networking, allowing uniform file transfers and communication across diverse systems like Chaosnet, DECnet, and TCP/IP without requiring user-level protocol expertise.[99] LMI's operating system, a derivative of MIT's earlier Lisp machine OS, similarly emphasized extensibility and integration, powering machines like the LMI Lambda for AI research and development. In modern contexts, Lisp continues to influence operating systems and embedded applications through dialects suited to constrained environments. For instance, the Nyxt web browser, implemented in Common Lisp, demonstrates Lisp's role in system-level software by providing a programmable, extensible environment that integrates low-level browser operations with high-level scripting.[100] In robotics, ROSlisp serves as a client library for the Robot Operating System (ROS), enabling Common Lisp nodes for real-time control and perception in embedded robotic systems, such as those using ARM processors.[101] Real-time capabilities have been enhanced in Lisp implementations for embedded use. 
Clojure's clojure-rt compiler targets deterministic execution on real-time Java virtual machines compliant with the Real-Time Specification for Java (RTSJ), supporting applications requiring predictable response times.[102] Similarly, Steel Bank Common Lisp (SBCL) allows garbage collection tuning via parameters like generation sizes and allocation limits to minimize pause times in real-time scenarios, leveraging its generational collector for embedded systems with periodic GC invocations.[103] Examples from the 1980s, such as the LMI-based systems, paved the way for contemporary efforts like Mezzano, a 64-bit Common Lisp OS designed for modern hardware. Lisp dialects for the 2020s, such as uLisp and MakerLisp, target IoT and embedded devices on microcontrollers like AVR and ARM, offering compact implementations with Lisp-1 semantics for resource-limited environments.[104][32] These enable rapid development of firmware for sensors and edge devices, with uLisp supporting platforms like ESP32 for wireless IoT applications.[105] A key advantage of Lisp in operating systems and embedded applications is its dynamic typing, which facilitates rapid prototyping and runtime adaptability in memory-constrained settings, allowing developers to modify code and data structures incrementally without recompilation.[106] This trait, combined with code-as-data principles, supports extensible kernels and real-time tuning, as seen in SBCL's configurable GC for low-latency embedded tasks.[103]
Influence on Modern Software and Languages
Lisp's introduction of automatic garbage collection in 1959 by John McCarthy revolutionized memory management, eliminating the need for manual allocation and deallocation that plagued earlier languages. This innovation, first implemented in Lisp to handle dynamic list structures, directly influenced modern languages such as Java and Python, where garbage collection became a core feature for safe and efficient runtime environments. By automating the reclamation of unused memory, Lisp's approach reduced common errors like memory leaks and dangling pointers, enabling developers to focus on logic rather than low-level details. Lisp's emphasis on functional programming paradigms, including higher-order functions and immutable data, has shaped features in contemporary languages like Scala and Rust. Scala's support for functional constructs, such as pattern matching and currying, draws from Lisp's foundational role in promoting pure functions and recursion over imperative loops. Similarly, Rust incorporates functional elements like closures and iterators, inspired by Lisp's treatment of functions as first-class citizens, which enhances code safety and composability in systems programming. These influences underscore Lisp's contribution to blending functional purity with practical performance needs in hybrid languages. Specific dialects and concepts from Lisp have permeated specialized domains. Emacs Lisp, a dialect tailored for extensibility, powers the Emacs text editor, allowing users to customize and extend its functionality through programmable macros and scripts, a model that has influenced interactive development environments. Julia's metaprogramming capabilities, including macros that manipulate abstract syntax trees, explicitly inherit Lisp's homoiconic design where code is treated as data, enabling domain-specific language creation without external tools. 
The read-eval-print loop (REPL), a hallmark of Lisp for interactive development, inspired the interactive computing paradigm in Jupyter notebooks, facilitating exploratory data analysis and prototyping in languages like Python. Rust's procedural macros, which allow arbitrary code generation at compile time, are inspired by Lisp-family languages like Scheme, providing hygienic metaprogramming to extend syntax safely while avoiding common pitfalls like variable capture. In 2025, Lisp's stability and expressiveness continue to influence AI tools and frameworks; for instance, its historical dominance in symbolic AI informs chain-of-thought reasoning in libraries like LangChain, where dynamic code generation mirrors Lisp's code-as-data philosophy for building adaptive agents. Lisp-family languages have influenced the design of nearly every major modern programming language through concepts like dynamic typing and recursion.
Examples
Basic Syntax Examples
Lisp's core syntax revolves around S-expressions, which are either atomic elements like numbers or symbols, or parenthesized lists that represent function calls or data structures. A fundamental aspect is the prefix notation for expressions, where the operator precedes its arguments. For instance, the expression (+ 1 2 3) evaluates to 6 in both Common Lisp and Scheme, demonstrating arithmetic operations as the first element of the list followed by operands.
Variable binding is typically achieved using the let special form, which introduces local variables within its body. In Common Lisp, (let ((x 10) (y 20)) (+ x y)) binds x to 10 and y to 20, then evaluates to 30. Scheme uses a similar construct, such as (let ((x 10) (y 20)) (+ x y)), yielding the same result, though Scheme requires all bindings to be specified before the body.
List manipulation forms the backbone of Lisp data structures. The cons function constructs lists by prepending an element to an existing list; for example, (cons 'a '(b c)) produces the list (a b c). Accessors like car and cdr retrieve the first element and the rest of the list, respectively: (car '(a b c)) returns a, while (cdr '(a b c)) returns (b c). These operations are identical in both Common Lisp and Scheme.
Quoting preserves expressions as literal data rather than evaluating them. The expression '(+ 1 2) yields the list (+ 1 2) as a data structure, which can later be evaluated using the eval function: (eval '(+ 1 2)) returns 3. This code-as-data principle is central to Lisp and works consistently across dialects.
In a Read-Eval-Print Loop (REPL), Lisp interactively processes input and displays results. For example, entering (+ 1 2) at the prompt outputs 3, while (cons 'a '(b c)) displays (A B C) in Common Lisp (using uppercase by default) but (a b c) in Scheme (preserving case). This difference in printing conventions highlights minor dialect variations while keeping core evaluation uniform.
Functional Programming Example
Lisp supports functional programming paradigms through its emphasis on first-class functions, recursion, and immutable data structures, enabling the composition of programs as transformations on data without side effects.[107] A classic demonstration is the recursive computation of the factorial function, which avoids iterative loops by breaking down the problem into smaller subproblems. In Common Lisp, the factorial of a non-negative integer $ n $ can be defined recursively as follows:
(defun fact (n)
(if (<= n 1)
1
(* n (fact (- n 1)))))
This function returns 1 for the base case where $ n \leq 1 $, and otherwise multiplies $ n $ by the factorial of $ n-1 $. For instance, (fact 5) evaluates to 120 by unfolding the recursion: $ 5 \times (4 \times (3 \times (2 \times 1))) $.[107]
Higher-order functions like mapcar apply a given function to each element of a list, producing a new list of results and preserving immutability. The mapcar function takes a function and one or more lists, applying the function to the corresponding elements.[62] For example:
(mapcar #'sqrt '(1 4 9))
This yields (1.0 2.0 3.0), as sqrt is applied element-wise to the input list without modifying the original (in Common Lisp, sqrt returns floating-point results even for perfect squares).[108]
Scheme, a dialect of Lisp, extends functional techniques with continuations via call-with-current-continuation (abbreviated call/cc), allowing non-local exits for control flow. This enables structured escapes from computations, such as early termination in searches.[109] A representative example finds the first negative number in a list and exits immediately:
(call-with-current-continuation
(lambda (exit)
(for-each (lambda (x)
(if (negative? x)
(exit x)))
'(54 0 37 -3 245 19))))
This evaluates to -3, abandoning the rest of the loop upon encountering the negative value.[109]
To maintain immutability, Lisp provides functions like copy-list, which creates a shallow copy of a list's structure while sharing elements.[110] For a list lst bound to (1 (2 3)), (setq clst (copy-list lst)) produces a new list clst that is equal to lst but distinct under eq, so modifications to the top-level structure of one do not affect the other; nested sublists such as (2 3), however, remain shared between the copies. This supports pure functional styles by avoiding unintended mutations of list structure.[110]
Many Lisp implementations optimize tail-recursive functions, where the recursive call is the last operation, by reusing the current stack frame instead of allocating a new one—effectively turning recursion into iteration.[111] In LispWorks, for example, self-tail-recursive functions like a tail-optimized factorial are compiled as efficiently as loops, preventing stack overflow for deep recursions.[111]
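An accumulator-passing variant of the earlier factorial makes the recursive call the final operation, so implementations that optimize tail calls can run it in constant stack space (fact-tail is an illustrative name):

```lisp
(defun fact-tail (n &optional (acc 1))
  (if (<= n 1)
      acc
      ;; This call is in tail position: nothing remains to do
      ;; in the current frame after it returns.
      (fact-tail (- n 1) (* n acc))))

(fact-tail 5)   ; => 120
```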
Macro Usage Example
One common practical use of macros in Lisp is to define conditional execution constructs that simplify code readability without the overhead of runtime checks. For instance, the when macro provides a concise way to execute a sequence of forms only if a condition is true, avoiding the need to explicitly write an if with a nil else branch. This macro is defined as follows:
(defmacro when (condition &rest body)
`(if ,condition (progn ,@body)))
Here, the macro takes a condition and a variable number of body forms, expanding them into an if form where the body is wrapped in progn if the condition holds.[112]
To illustrate the expansion process, consider the usage (when (> x 10) (print 'big-value) (incf x)). During macro expansion, which occurs at compile time or load time, this form is transformed into (if (> x 10) (progn (print 'big-value) (incf x))). The expansion trace can be observed using the macroexpand-1 function, which applies a single level of expansion: (macroexpand-1 '(when (> x 10) (print 'big-value) (incf x))) yields the if form shown, confirming that the macro generates efficient, direct code without introducing unnecessary runtime evaluation of the body when the condition is false. This expansion happens before the code is compiled or interpreted, ensuring the generated if is treated as ordinary Lisp code.[112][113]
Quasiquotation, or backquote, plays a central role in such macro definitions by allowing selective unquoting of parts of the template form. In the when macro, the backquote `(if ,condition (progn ,@body)) creates a quoted list structure for the if form, where ,condition splices in the evaluated condition form and ,@body splices the body forms as a list into the progn. This enables variable insertion while preserving the literal structure of the code template, as seen in expansions like (list 1 2 ,(+ 1 3) ,@'(4 5)) which becomes (list 1 2 4 4 5). Backquote is detailed further in the control structures section.[112]
For more complex iteration, macros can define custom loop variants tailored to specific needs, such as iterating over prime numbers. A simple for-each-like macro for this purpose, do-primes, might be defined to loop over primes in a range:
(defun is-prime (n)
(loop for i from 2 to (isqrt n)
when (zerop (mod n i)) return nil
finally (return t)))
(defun primep (n)
(and (>= n 2) (is-prime n)))
(defmacro do-primes ((var start end) &body body)
(let ((n (gensym))
(endv (gensym)))
`(do ((,n ,start (1+ ,n))
(,endv ,end))
((> ,n ,endv))
(when (primep ,n)
(let ((,var ,n))
,@body)))))
This macro expands a form like (do-primes (p 0 10) (format t "~d " p)) into a do loop that increments a counter, checks each value with primep, and executes the body only for primes, effectively providing a for-each iteration over the primes in a range. The use of gensym ensures safe expansion by generating unique symbols (e.g., #:G1234) for loop variables like n and endv, preventing unintended variable capture if the macro is used within a lexical scope where similarly named variables exist. For example, expanding the do-primes form introduces fresh symbols that do not conflict with outer bindings, preserving the meaning of surrounding code even though Common Lisp macros are not hygienic by default.[112]
Macros differ fundamentally from functions in their evaluation timing: macros execute at compile time to generate code, whereas functions receive data at runtime. In the when example, the macro body runs during expansion to produce the if form, allowing compile-time decisions that optimize the final code, such as avoiding any evaluation of the else branch (which is absent). This compile-time evaluation contrasts with a hypothetical when function, which would evaluate all arguments at runtime, potentially executing the body even if the condition is false before discarding the result—leading to inefficiencies or side effects. Such timing enables macros to perform static analysis or code generation that functions cannot, as verified through expansion traces showing the generated code's structure before runtime.[112]
