List of programming languages by type
This is a list of notable programming languages, grouped by notable language attribute. As a language can have multiple attributes, the same language can be in multiple groupings.
Agent-oriented programming languages
Agent-oriented programming allows the developer to build, extend and use software agents, which are abstractions of objects that can message other agents.
Array languages
Array programming (also termed vector or multidimensional) languages generalize operations on scalars to apply transparently to vectors, matrices, and higher-dimensional arrays.
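The transparent scalar-to-array generalization described above can be sketched in plain Python with a toy vector type (illustrative only; the `Vec` class and its broadcasting rule are assumptions for the sketch, not drawn from any listed language):

```python
# Toy array type whose scalar operators apply elementwise,
# as array languages do transparently.
class Vec:
    def __init__(self, *xs):
        self.xs = list(xs)

    def _zip(self, other, op):
        # Broadcast a scalar, or pair up two vectors of equal length.
        ys = other.xs if isinstance(other, Vec) else [other] * len(self.xs)
        return Vec(*(op(a, b) for a, b in zip(self.xs, ys)))

    def __add__(self, other):
        return self._zip(other, lambda a, b: a + b)

    def __mul__(self, other):
        return self._zip(other, lambda a, b: a * b)

v = Vec(1, 2, 3) * 2 + Vec(10, 20, 30)   # elementwise, no explicit loop
```

In a true array language such as APL or J, this lifting of scalar operators is built into the language rather than supplied by a user-defined class.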
Aspect-oriented programming languages
Aspect-oriented programming enables developers to add new functionality to code, known as "advice", without modifying that code itself; rather, it uses a pointcut to implement the advice into code blocks.
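The advice/pointcut idea can be approximated in Python with a decorator standing in for the weaver (a minimal sketch; `advice`, `transfer`, and the logging behavior are assumptions for illustration, not part of any AOP framework):

```python
# A decorator plays the role of a pointcut: logging "advice" is
# woven around the function without editing its body.
import functools

log = []

def advice(fn):
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        log.append(f"before {fn.__name__}")   # advice before the join point
        result = fn(*args, **kwargs)
        log.append(f"after {fn.__name__}")    # advice after the join point
        return result
    return wrapper

@advice
def transfer(amount):
    return amount

transfer(100)
```

Dedicated AOP languages such as AspectJ let a pointcut select many join points at once by pattern, rather than decorating each function by hand.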
Assembly languages
Assembly languages directly correspond to a machine language (see below), so machine code instructions appear in a form understandable by humans, although there may not be a one-to-one mapping between an individual statement and an individual instruction. Assembly languages let programmers use symbolic addresses, which the assembler converts to absolute or relocatable addresses. Most assemblers also support macros and symbolic constants.
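The conversion of symbolic addresses to absolute ones can be sketched as a toy two-pass assembler in Python (the instruction set and label syntax here are invented for illustration):

```python
# Toy two-pass "assembler": pass 1 records label addresses,
# pass 2 replaces symbolic labels with absolute addresses.
program = ["start:", "LOAD 1", "JMP end", "ADD 2", "end:", "HALT"]

# Pass 1: map each label to the address of the next instruction.
labels, addr = {}, 0
for line in program:
    if line.endswith(":"):
        labels[line[:-1]] = addr
    else:
        addr += 1

# Pass 2: emit instructions with labels resolved to numbers.
code = []
for line in program:
    if line.endswith(":"):
        continue
    op, *arg = line.split()
    if arg and arg[0] in labels:
        arg = [str(labels[arg[0]])]
    code.append(" ".join([op] + arg))
```

Real assemblers additionally handle relocatable addresses, macros, and symbolic constants, but the two-pass label resolution above is the core idea.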
Authoring languages
An authoring language is a programming language designed for use by a non-computer expert to easily create tutorials, websites, and other interactive computer programs.
Command-line interface languages
Command-line interface (CLI) languages are also called batch languages or job control languages. Examples:
- 4DOS (shell for IBM PCs)
- 4OS2 (shell for IBM PCs)
- Batch files for DOS and Windows
- COMMAND.COM command language for DOS and pre-Windows NT Windows
- cmd.exe command language for Windows NT
- sh (standard Unix shell, by Stephen R. Bourne) and compatibles
- CLIST (MVS Command List)
- CMS EXEC
- csh (C shell, by Bill Joy) and compatibles
- tcsh
- Hamilton C shell (a C shell for Windows)
- DIGITAL Command Language CLI for OpenVMS
- EXEC 2
- Expect (a Unix automation and test tool)
- fish (a Unix shell)
- Nushell (a cross-platform shell)
- PowerShell (.NET-based CLI)
- rc (shell for Plan 9)
- Rexx
- TACL (programming language)
- zsh (a Unix shell)
Compiled languages
These are languages typically processed by compilers, though theoretically any language can be compiled or interpreted.
- ArkTS
- ActionScript
- Ada (multi-purpose language)
- ALGOL 58
- ALGOL 60 (influential design)
- SMALL (a Machine ALGOL)
- ALGOL 68
- Ballerina → bytecode runtime
- BASIC (including the first version of Dartmouth BASIC)
- BCPL
- C (widely used procedural language)
- C++ (multiparadigm language derived from C)
- C# (into CIL runtime)
- Ceylon (into JVM bytecode)
- CHILL
- CLIPPER 5.3 (DOS-based)
- CLEO for Leo computers
- Clojure (into JVM bytecode)
- COBOL
- Cobra
- Common Lisp
- Crystal
- Curl
- D (from a reengineering of C++)
- DASL → Java, JS, JSP, Flex .war
- Delphi (Borland's Object Pascal development system)
- DIBOL (a Digital COBOL)
- Dylan
- Eiffel (developed by Bertrand Meyer)
- Elm
- Emacs Lisp
- Emerald
- Erlang
- Factor
- Fortran (first compiled by IBM's John Backus)
- GAUSS
- Go
- Gosu (into JVM bytecode)
- Groovy (into JVM bytecode)
- Haskell
- Harbour
- HolyC
- Inform (usually story files for Glulx or Z-code)
- Java (usually to JVM bytecode; also to machine code)
- JOVIAL
- Julia (on the fly to machine code)
- Kotlin (Kotlin/Native uses LLVM to produce binaries)
- LabVIEW
- Mercury
- Mesa
- Nemerle (into intermediate language bytecode)
- Nim
- Objective-C
- P
- Pascal (most implementations)
- PL/I
- Plus
- Pony
- Python (to intermediate VM bytecode)
- RPG (Report Program Generator)
- Red
- Rust
- Scala (into JVM bytecode)
- Scheme (e.g. Gambit)
- SequenceL – purely functional, parallelizing and race-free
- Simula (object-oriented superset of ALGOL 60)
- Smalltalk (platform-independent VM bytecode)
- Swift
- ML
- Standard ML (SML)
- OCaml
- F# (into CIL, generates runtime)
- Turing
- V (Vlang)
- Vala (GObject type system)
- Visual Basic (CIL JIT runtime)
- Visual FoxPro
- Visual Prolog
- Xojo
- Zig
Concatenative programming languages
A concatenative programming language is a point-free computer programming language in which all expressions denote functions, and the juxtaposition of expressions denotes function composition.
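Juxtaposition-as-composition can be sketched in Python by modelling each word as a function from stack to stack (a minimal sketch; the words `push`, `dup`, and `add` mimic Forth-style vocabulary but are defined here purely for illustration):

```python
# In a concatenative language a program is a sequence of words,
# each a function from stack to stack; running the sequence is
# function composition.
def push(n):
    return lambda stack: stack + [n]

def add(stack):
    return stack[:-2] + [stack[-2] + stack[-1]]

def dup(stack):
    return stack + [stack[-1]]

def run(*words):
    stack = []
    for word in words:          # composing the words left to right
        stack = word(stack)
    return stack

result = run(push(3), dup, add)   # like the Forth program "3 dup +"
```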
Concurrent languages
Message passing languages provide language constructs for concurrency. The predominant paradigm for concurrency in mainstream languages such as Java is shared memory concurrency. Concurrent languages that make use of message passing have generally been inspired by process calculi such as communicating sequential processes (CSP) or the π-calculus.
- Ada – multi-purpose language
- Alef – concurrent language with threads and message passing, used for systems programming in early versions of Plan 9 from Bell Labs
- Ateji PX – an extension of the Java language for parallelism
- Ballerina – a language designed for implementing and orchestrating micro-services; provides a message-based, parallel-first concurrency model
- C++ (since C++11)
- ChucK – domain specific programming language for audio, precise control over concurrency and timing
- Cilk – a concurrent C
- Cω – C Omega, a research language extending C#, uses asynchronous communication
- Clojure – a dialect of Lisp for the Java virtual machine
- Chapel
- Co-array Fortran
- Concurrent Pascal (by Per Brinch Hansen)
- Curry
- E – uses promises, ensures deadlocks cannot occur
- Eiffel (through the SCOOP mechanism, Simple Concurrent Object-Oriented Computation)
- Elixir (runs on the Erlang VM)
- Emerald – uses threads and monitors
- Erlang – uses asynchronous message passing with nothing shared
- Gambit Scheme – using the Termite library
- Gleam (runs on the Erlang VM)
- Go
- Haskell – supports concurrent, distributed, and parallel programming across multiple machines
- Java
- Julia
- Joule – dataflow language, communicates by message passing
- LabVIEW
- Limbo – relative of Alef, used for systems programming in Inferno (operating system)
- MultiLisp – Scheme variant extended to support parallelism
- OCaml
- occam – influenced heavily by Communicating Sequential Processes (CSP)
- occam-π – a modern variant of occam, which incorporates ideas from Milner's π-calculus
- Orc
- Oz – multiparadigm language; supports shared-state and message-passing concurrency, and futures; the Mozart Programming System is a cross-platform Oz implementation
- P
- Pony
- Pict – essentially an executable implementation of Milner's π-calculus
- Python – uses thread-based parallelism and process-based parallelism[4]
- Raku[5]
- Rust
- Scala – implements Erlang-style actors on the JVM
- SequenceL – purely functional, automatically parallelizing and race-free
- SR – research language
- V (Vlang)
- Unified Parallel C
- XProc – XML processing language, enabling concurrency
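The share-nothing, CSP-style message passing described above can be sketched with Python's standard threads and a queue acting as the channel (a minimal sketch under the assumption of a single worker and a `None` sentinel; real message-passing languages build the channel into the language):

```python
# Share-nothing message passing: the worker and the main thread
# communicate only by sending values over a channel (a Queue).
import queue
import threading

channel = queue.Queue()
results = []

def worker():
    while True:
        msg = channel.get()     # block until a message arrives
        if msg is None:         # sentinel: no more work
            break
        results.append(msg * msg)

t = threading.Thread(target=worker)
t.start()
for n in (1, 2, 3):
    channel.put(n)              # communicate by sending, not by sharing
channel.put(None)
t.join()
```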
Constraint programming languages
A constraint programming language is a declarative programming language where relationships between variables are expressed as constraints. Execution proceeds by attempting to find values for the variables which satisfy all declared constraints.
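The constraints-then-search execution model can be sketched in Python with a brute-force solver standing in for a real constraint engine (the `solve` helper and its dictionary-of-domains interface are assumptions for illustration):

```python
# Constraints declare relationships; a generic search finds values
# that satisfy them. Brute force stands in for real propagation.
from itertools import product

def solve(domains, constraints):
    names = list(domains)
    for values in product(*domains.values()):
        env = dict(zip(names, values))
        if all(c(env) for c in constraints):
            return env
    return None

solution = solve(
    {"x": range(10), "y": range(10)},
    [lambda e: e["x"] + e["y"] == 7,      # declared, not computed step by step
     lambda e: e["x"] - e["y"] == 3],
)
```

Real constraint languages prune the search space with propagation rather than enumerating every combination, but the declarative shape of the problem statement is the same.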
Contract languages
Design by contract (or contract programming) is programming using defined preconditions, postconditions, and invariants.
- Ada (since Ada 2012)
- Ciao
- Clojure
- Cobra
- C++ (since C++26)
- D[7]
- Dafny
- Eiffel
- Fortress
- Kotlin
- Mercury
- Oxygene (formerly Chrome and Delphi Prism[8])
- Racket (including higher order contracts, and emphasizing that contract violations must blame the guilty party and must do so with an accurate explanation[9])
- Sather
- Scala[10][11]
- SPARK (via static analysis of Ada programs)
- Vala
- Vienna Development Method (VDM)
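The precondition/postcondition discipline above can be sketched in Python with a decorator (a minimal, illustrative sketch; the `contract` decorator is invented here and is far simpler than the contract machinery in Eiffel or Racket):

```python
# Contract programming sketch: decorators check a precondition
# before the call and a postcondition on the result.
def contract(pre, post):
    def deco(fn):
        def wrapper(*args):
            assert pre(*args), "precondition violated"
            result = fn(*args)
            assert post(result), "postcondition violated"
            return result
        return wrapper
    return deco

@contract(pre=lambda x: x >= 0, post=lambda r: r >= 0)
def square_root(x):
    return x ** 0.5

ok = square_root(9.0)
try:
    square_root(-1.0)           # violates the precondition
    violated = False
except AssertionError:
    violated = True
```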
Curly bracket languages
A curly bracket or curly brace language has syntax that defines a block as the statements between curly brackets, a.k.a. braces, {}. This syntax originated with BCPL (1966), and was popularized by C. Many curly bracket languages descend from or are strongly influenced by C. Examples:
- ABCL/c+
- Alef
- AWK
- ArkTS
- B
- bc
- BCPL
- Ballerina
- C – developed circa 1970 at Bell Labs
- C++
- C#
- Ceylon
- Chapel
- ChucK – audio programming language
- Cilk – concurrent C for multithreaded parallel programming
- Cyclone – a safer C variant
- D
- Dart
- DASL – based on Java
- E
- ECMAScript
- GLSL
- Go
- HLSL
- Java
- Limbo
- LPC
- MEL
- Nemerle (curly braces optional)[12]
- Objective-C
- PCASTL
- Perl
- PHP
- Pico
- Pike
- PowerShell
- R
- Raku
- Rust
- S-Lang
- Scala (curly-braces optional)
- sed
- Solidity[13]
- SuperCollider
- Swift
- UnrealScript
- V (Vlang)
- Yorick
- YASS
- Zig
Dataflow languages
Dataflow programming languages rely on a (usually visual) representation of the flow of data to specify the program. They are frequently used for reacting to discrete events or for processing streams of data. Examples of dataflow languages include:
Data-oriented languages
Data-oriented languages provide powerful ways of searching and manipulating relations, described as entity–relationship tables, that map one set of things onto other sets. Examples of data-oriented languages include:
- Associative Programming Language
- Clarion
- Clipper
- dBase (a relational database access language)
- Gremlin
- MUMPS (an ANSI standard general-purpose language with specializations for database work)
- Caché ObjectScript (a proprietary superset of MUMPS)
- RETRIEVE
- RDQL
- SPARQL
- SQL
- Visual FoxPro – a native RDBMS engine, object-oriented, RAD
- Wolfram Mathematica (Wolfram language)
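The relation-searching style shared by several of these languages can be demonstrated with SQL through Python's built-in sqlite3 module (the `person` table and its rows are invented sample data):

```python
# Data-oriented style: declare a relation as a table, then query it
# with SQL rather than iterating over records by hand.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE person (name TEXT, city TEXT)")
db.executemany(
    "INSERT INTO person VALUES (?, ?)",
    [("Ada", "London"), ("Alan", "London"), ("Grace", "New York")],
)
rows = db.execute(
    "SELECT name FROM person WHERE city = ? ORDER BY name", ("London",)
).fetchall()
```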
Decision table languages
Decision tables can be used as an aid to clarifying the logic before writing a program in any language, but in the 1960s a number of languages were developed where the main logic is expressed directly in the form of a decision table, including:
Declarative languages
Declarative languages express the logic of a computation without describing its control flow in detail. Declarative programming stands in contrast to imperative programming, where control flow is specified by explicit sequences of commands (imperatives). (Pure) functional and logic-based programming languages are also declarative, and constitute the major subcategories of the declarative category. This section lists additional examples not in those subcategories.
- Analytica
- Ant (combines declarative and imperative programming)
- Curry
- Cypher
- Datalog
- Distributed Application Specification Language (DASL) (combines declarative and imperative programming)
- ECL
- Gremlin
- Inform (combines declarative and imperative programming)
- Lustre
- Mercury
- Metafont
- MetaPost
- Modelica
- Nix
- Prolog
- QML
- Oz
- RDQL
- SequenceL – purely functional, automatically parallelizing and race-free
- SPARQL
- SQL (only DQL, not DDL, DCL, or DML)
- Soufflé
- VHDL (supports declarative programming, imperative programming, and functional programming)
- Wolfram Mathematica (Wolfram language)
- WOQL (TerminusDB)
- xBase
- XSL Transformations
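The contrast with imperative control flow can be illustrated in Python, which supports both styles (the task, summing the squares of even numbers, is an invented example):

```python
# Imperative vs. declarative: same result, different emphasis.
data = [3, 1, 4, 1, 5, 9, 2, 6]

# Imperative: spell out the loop and the mutation.
imperative_total = 0
for n in data:
    if n % 2 == 0:
        imperative_total += n * n

# Declarative: state what is wanted, not the steps.
declarative_total = sum(n * n for n in data if n % 2 == 0)
```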
Embeddable languages
[edit]In source code
Source embeddable languages embed small pieces of executable code inside a piece of free-form text, often a web page.
Client-side embedded languages are limited by the abilities of the browser or intended client. They aim to provide dynamism to web pages without the need to recontact the server.
Server-side embedded languages are much more flexible, since almost any language can be built into a server. The aim of having fragments of server-side code embedded in a web page is to generate additional markup dynamically; the code itself disappears when the page is served, to be replaced by its output.
Server side
- PHP
- VBScript
- Tcl – server-side in NaviServer and an essential component in electronics industry systems
The above examples are particularly dedicated to this purpose. Many other languages, such as Erlang, Scala, Perl, Ring, and Ruby, can be adapted (for instance, by being made into Apache modules).
Client side
- ActionScript
- JavaScript (aka ECMAScript or JScript)
- VBScript (Windows only)
In object code
A wide variety of dynamic or scripting languages can be embedded in compiled executable code. Basically, object code for the language's interpreter needs to be linked into the executable. Source code fragments for the embedded language can then be passed to an evaluation function as strings. Application control languages can be implemented this way, if the source code is input by the user. Languages with small interpreters are preferred.
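The pass-source-fragments-as-strings mechanism can be sketched with Python's own `eval` standing in for the linked-in interpreter (the `run_user_script` helper and the variable names are assumptions for illustration; a real host application would sandbox far more carefully):

```python
# Host application passes embedded-language source fragments, as
# strings, to an evaluation function against a controlled environment.
host_variables = {"width": 4, "height": 5}

def run_user_script(fragment, env):
    # Evaluate the fragment with builtins disabled and only the
    # host-supplied names visible.
    return eval(fragment, {"__builtins__": {}}, dict(env))

area = run_user_script("width * height", host_variables)
```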
Educational programming languages
Languages developed primarily for teaching and learning programming.
Esoteric languages
An esoteric programming language is a programming language designed as a test of the boundaries of computer programming language design, as a proof of concept, or as a joke.
Extension languages
Extension programming languages are languages embedded into another program and used to harness its features in extension scripts.
- AutoLISP (specific to AutoCAD)
- BeanShell
- CAL
- C/AL (C/SIDE)
- Guile
- Emacs Lisp
- JavaScript and some dialects, e.g., JScript
- Lua (embedded in many games)
- OpenCL (extension of C and C++ to use the GPU and parallel extensions of the CPU)
- OptimJ (extension of Java with language support for writing optimization models and powerful abstractions for bulk data processing)
- Perl
- Pike
- PowerShell
- Python (embedded in Maya, Blender, and other 3-D animation packages)
- Rexx
- Ring
- Ruby (Google SketchUp)
- S-Lang
- SQL
- Squirrel
- Tcl
- Vim script (vim)
- Visual Basic for Applications (VBA)
Fourth-generation languages
Fourth-generation programming languages are high-level programming languages built around database systems. They are generally used in commercial environments.
- 1C:Enterprise programming language
- ABAP
- CorVision
- CSC's GraphTalk
- CA-IDEAL (Interactive Development Environment for an Application Life) for use with CA-DATACOM/DB
- Easytrieve report generator (now CA-Easytrieve Plus)
- FOCUS
- IBM Informix-4GL
- LINC 4GL
- LiveCode (Not based on a database; still, the goal is to work at a higher level of abstraction than 3GLs.)
- MAPPER (Unisys/Sperry) – now part of BIS
- MARK-IV (Sterling/Informatics) now VISION:BUILDER of CA
- NATURAL
- Progress 4GL
- PV-Wave
- RETRIEVE
- SAS
- SQL
- Ubercode (VHLL, or Very-High-Level Language)
- Uniface
- Visual DataFlex
- Visual FoxPro
- xBase
Functional languages
Functional programming languages define programs and subroutines as mathematical functions and treat them as first-class. Many so-called functional languages are "impure", containing imperative features. Many functional languages are tied to mathematical calculation tools. Functional languages include:
Pure
Impure
- APL
- ATS
- CAL
- C++ (since C++11)
- C#
- VB.NET
- Ceylon
- Curl
- D
- Dart
- ECMAScript
- Erlang
- Fexl
- Flix
- G (used in LabVIEW)
- Groovy
- Hop
- Java (since version 8)
- Julia
- Kotlin
- Lisp
- ML
- Standard ML (SML)
- OCaml
- F#
- Nemerle
- Nim
- Opal
- OPS5
- Perl
- PHP
- PL/pgSQL
- Python
- Q (equational programming language)
- R
- Rebol
- Red
- Ring
- Ruby
- REFAL
- Rust
- Scala
- Swift
- Spreadsheets
- V (Vlang)
- Tcl
- Wolfram Mathematica (Wolfram language)
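First-class, composable functions — the trait shared by every language above — can be sketched in Python (the `compose` helper and the little pipeline are invented for illustration):

```python
# Functions as first-class values: passed, returned, and composed.
from functools import reduce

def compose(f, g):
    return lambda x: f(g(x))

inc = lambda x: x + 1
double = lambda x: x * 2

pipeline = compose(inc, double)          # pipeline(x) == inc(double(x))
total = reduce(lambda a, b: a + b, map(pipeline, [1, 2, 3]), 0)
```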
Hardware description languages
In electronics, a hardware description language (HDL) is a specialized computer language used to describe the structure, design, and operation of electronic circuits, most commonly digital logic circuits. The two most widely used and well-supported HDLs in industry are Verilog and VHDL. Hardware description languages include:
HDLs for analog circuit design
- Verilog-AMS (Verilog for Analog and Mixed-Signal)
- VHDL-AMS (VHDL with Analog/Mixed-Signal extension)
HDLs for digital circuit design
Imperative languages
Imperative programming languages may be multi-paradigm and appear in other classifications. Here is a list of programming languages that follow the imperative paradigm:
- Ada
- ALGOL 58
- ALGOL 60 (very influential language design)
- ALGOL 68
- BASIC
- C
- C++
- C#
- Ceylon
- CHILL
- COBOL
- D
- Dart
- ECMAScript
- FORTRAN
- GAUSS
- Go
- Groovy
- Icon
- Java
- Julia
- Lua
- MATLAB
- Machine languages
- Modula-2, Modula-3
- MUMPS
- Nim
- OCaml
- Oberon
- Object Pascal
- Open Object Rexx (ooRexx)
- Open Programming Language (OPL)
- OpenEdge Advanced Business Language (ABL)
- Pascal
- Perl
- PHP
- PL/I
- PL/S
- PowerShell
- PROSE
- Python
- Raku
- Rexx
- Ring
- Ruby
- Rust
- SETL
- Speakeasy
- Swift
- Tcl
- V (Vlang)
- Wolfram Mathematica (Wolfram language)
Interactive mode languages
Interactive mode languages, often provided as a read–eval–print loop (REPL), act as a kind of shell: expressions or statements can be entered one at a time, and the result of their evaluation is seen immediately.
- APL
- BASIC (some dialects)
- Clojure
- Common Lisp
- Dart (with Observatory or Dartium's developer tools)
- ECMAScript
- Erlang
- Elixir (with iex)
- F#
- Fril
- GAUSS
- Groovy
- Guile
- Haskell (with the GHCi or Hugs interpreter)
- IDL
- J
- Java (since version 9)
- Julia
- Lua
- MUMPS (an ANSI standard general-purpose language)
- Maple
- MATLAB
- ML
- Nim (with INim)
- OCaml
- Perl
- PHP
- Pike
- PostScript
- PowerShell (.NET-based CLI)
- Prolog
- Python
- PROSE
- R
- Raku
- Rebol
- Red
- Rexx
- Ring
- Ruby (with IRB)
- Scala
- Scheme
- Smalltalk (anywhere in a Smalltalk environment)
- S-Lang (with the S-Lang shell, slsh)
- Speakeasy
- Swift
- Tcl (with the Tcl shell, tclsh)
- Unix shell
- Visual FoxPro
- Wolfram Mathematica (Wolfram language)
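The read–eval–print cycle itself can be sketched in a few lines of Python; input is scripted here so the loop runs non-interactively (the example inputs are invented):

```python
# Minimal read-eval-print loop: each entry is evaluated as soon as
# it is read and its result collected immediately.
scripted_input = ["1 + 1", "x = 10", "x * 3"]
outputs = []
env = {}

for line in scripted_input:             # "read"
    try:
        result = eval(line, env)        # "eval" an expression...
    except SyntaxError:
        exec(line, env)                 # ...or execute a statement
        result = None
    if result is not None:              # "print"
        outputs.append(result)
```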
Interpreted languages
Interpreted languages are programming languages in which programs may be executed from source code form, by an interpreter. Theoretically, any language can be compiled or interpreted, so the term interpreted language generally refers to languages that are usually interpreted rather than compiled.
- Ant
- APL
- AutoHotkey scripting language
- AutoIt scripting language
- BASIC (some dialects)
- Programming Language for Business (PL/B, formerly DATABUS, later versions added optional compiling)
- Eiffel (via Melting Ice Technology in EiffelStudio)
- Emacs Lisp
- FOCAL
- GameMaker Language
- Groovy
- J
- jq
- Java bytecode
- Julia (compiled on the fly to machine code, by default, interpreting also available)
- JavaScript
- Lisp (early versions, pre-1962, and some experimental ones; production Lisp systems are compilers, but many of them still provide an interpreter if needed)
- LPC
- Lua
- MUMPS (an ANSI standard general-purpose language)
- Maple
- MATLAB
- OCaml
- Pascal (early implementations)
- PCASTL
- Perl
- PHP
- PostScript
- PowerShell
- PROSE
- Python
- Rexx
- R
- Raku
- Rebol
- Red
- Ring
- Ruby
- S-Lang
- Seed7
- Speakeasy
- Standard ML (SML)
- Spin
- Tcl
- Tea
- TorqueScript
- VBScript
- Windows PowerShell – .NET-based CLI
- Some scripting languages (see below)
- Wolfram Mathematica (Wolfram language)
Iterative languages
Iterative languages are built around, or offer, generators.
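A generator yields values one at a time on demand, which is the construct this category is built around. A minimal Python example:

```python
# A generator: conceptually infinite, consumed lazily.
def fibonacci():
    a, b = 0, 1
    while True:
        yield a
        a, b = b, a + b

gen = fibonacci()
first_six = [next(gen) for _ in range(6)]
```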
Languages by memory management type
Garbage collected languages
Garbage Collection (GC) is a form of automatic memory management. The garbage collector attempts to reclaim memory that was allocated by the program but is no longer used.
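The reclaim-when-unreachable behavior can be observed from CPython's gc module (a minimal sketch; the `Node` class is invented, and the outcome relies on CPython's cycle collector):

```python
# A reference cycle keeps objects alive past their last external
# reference until the cycle collector runs.
import gc
import weakref

class Node:
    pass

a, b = Node(), Node()
a.partner, b.partner = b, a     # reference cycle
probe = weakref.ref(a)          # observe the object without owning it

del a, b                        # cycle is now unreachable from outside
gc.collect()                    # collector reclaims the cycle
collected = probe() is None
```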
Languages with manual memory management
- ALGOL 68
- Assembly (various)
- BLISS
- C
- C++
- Component Pascal
- Forth
- Fortran
- FreeBASIC
- Modula-2
- Oberon
- Pascal
- PL/I
- Zig
Languages with optional manual memory management
- Ada[a]
- Blitz BASIC[18][19][20]
- COBOL[21][22][23]
- D[24]
- Nim[25]
- Objective-C[26]
- Objective-C++
- PostScript[b]
- Rust[28]
- V[29]
- Vala[30]
Some programming languages without the inherent ability to manually manage memory, like Cython,[31] Swift,[c] and Scala[32] (Scala Native only), are able to import or call functions like malloc and free from C through a foreign function interface.
Languages with deterministic memory management
Languages with automatic reference counting (ARC)
List-based languages – LISPs
List-based languages are a type of data-structured language that are based on the list data structure.
Little languages
Little languages[35] serve a specialized problem domain.
- awk – used for text file manipulation.
- sed – parses and transforms text
- SQL – has only a few keywords and not all the constructs needed for a full programming language[d] – many database management systems extend SQL with additional constructs as a stored procedure language
- XPL – a language designed for, although not limited to, compiler writing
Logic-based languages
Logic-based languages specify a set of attributes that a solution must have, rather than a set of steps to obtain a solution.
Notable languages following this programming paradigm include:
- ALF
- Alma-0
- Curry
- Datalog
- Fril
- Flix (a functional programming language with first-class Datalog constraints)
- Janus
- λProlog (a logic programming language featuring polymorphic typing, modular programming, and higher-order programming)
- Oz (and the cross-platform Mozart Programming System)
- Prolog (formulates data and the program evaluation mechanism as a special form of mathematical logic called Horn logic and a general proving mechanism called logical resolution)
- Mercury (based on Prolog)
- Visual Prolog (object-oriented Prolog extension)
- ROOP
- Soufflé
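The declare-facts-then-query style can be sketched in Python with a toy fact base (a deliberately tiny illustration; the `facts` tuples and the `grandparent` rule are invented, and real logic languages use unification and resolution rather than exhaustive checks):

```python
# Logic-style querying sketch: facts are declared, and a query asks
# which bindings satisfy them, not how to search.
facts = {
    ("parent", "alice", "bob"),
    ("parent", "bob", "carol"),
}

def grandparent(x, z):
    # x is a grandparent of z if some y satisfies both parent facts.
    return any(
        ("parent", x, y) in facts and ("parent", y, z) in facts
        for (_, _, y) in facts
    )

answer = grandparent("alice", "carol")
```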
Machine languages
Machine languages are directly executable by a computer's CPU. They are typically formulated as bit patterns, usually represented in octal or hexadecimal. Each bit pattern causes the circuits in the CPU to execute one of the fundamental operations of the hardware. The activation of specific electrical inputs (e.g., CPU package pins for microprocessors), and logical settings for CPU state values, control the processor's computation. Individual machine languages are specific to a family of processors; machine-language code for one family of processors cannot run directly on processors in another family unless the processors in question have additional hardware to support it (for example, DEC VAX processors included a PDP-11 compatibility mode). They are (essentially) always defined by the CPU developer, not by third parties.[e] The symbolic version, the processor's assembly language, is also defined by the developer, in most cases. Some commonly used machine code instruction sets are:
- RISC-V
- ARM
- DEC:
- Intel 8008, 8080 and 8085
- x86:
- 16-bit x86, first used in the Intel 8086
- Intel 8086 and 8088 (the latter was used in the first and early IBM PC)
- Intel 80186
- Intel 80286 (the first x86 processor with protected mode, used in the IBM PC AT)
- IA-32, introduced in the 80386
- x86-64 – The original specification was created by AMD. There are vendor variants, but they're essentially the same:
- Burroughs Corporation
- IBM[f]
- MIPS
- Motorola 6800 (8-bit)
- Motorola 68000 series (CPUs used in early Macintosh and early Sun computers)
- MOS Technology 65xx (8-bit)
- 6502 (CPU for NES, VIC-20, BBC Micro, Apple II, and Atari 8-bit computers)
- 6510 (CPU for Commodore 64)
- Western Design Center 65816/65802 (CPU for Apple IIGS and (variant) Super Nintendo Entertainment System)
- National Semiconductor NS320xx
- POWER, first used in the IBM RS/6000
- PowerPC – used in Power Macintosh and in many game consoles, particularly of the seventh generation.
- Power ISA – an evolution of PowerPC.
- Sun Microsystems (now Oracle) SPARC
- UNIVAC[f]
- MCST Elbrus 2000
Macro languages
[edit]Textual substitution macro languages
Macro languages transform one source code file into another. A "macro" is essentially a short piece of text that expands into a longer one (not to be confused with hygienic macros), possibly with parameter substitution. They are often used to preprocess source code. Preprocessors can also supply facilities like file inclusion.
Macro languages may be restricted to acting on specially labeled code regions (prefixed with a # in the case of the C preprocessor). Alternatively, they may not, but in this case it is still often undesirable to (for instance) expand a macro embedded in a string literal, so they still need a rudimentary awareness of syntax. That being the case, they are often still applicable to more than one language. Contrast with source-embeddable languages like PHP, which are fully featured.
- C preprocessor
- m4 (originally from AT&T, bundled with Unix)
- ML/I (general-purpose macro processor)
- TTM (developed at the California Institute of Technology)
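Textual substitution with parameters can be sketched in a few lines of Python (a toy expander; the `SQUARE` macro and the regex-based matching are invented for illustration and ignore the syntax-awareness issues noted above):

```python
# Toy textual-substitution macro expander: a macro name expands
# into longer text, with parameter substitution.
import re

macros = {"SQUARE": "(({0}) * ({0}))"}

def expand(source):
    def replace(match):
        name, arg = match.group(1), match.group(2)
        if name in macros:
            return macros[name].format(arg)
        return match.group(0)       # leave non-macro calls untouched
    return re.sub(r"(\w+)\(([^()]*)\)", replace, source)

expanded = expand("y = SQUARE(x + 1)")
```

The parenthesized expansion mirrors why C macros wrap their arguments: without the parentheses, `SQUARE(x + 1)` would expand to the wrong expression.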
Application macro languages
Scripting languages such as Tcl and ECMAScript (ActionScript, ECMAScript for XML, JavaScript, JScript) have been embedded into applications. These are sometimes called "macro languages", although in a somewhat different sense to textual-substitution macros like m4.
Metaprogramming languages
Metaprogramming is the writing of programs that write or manipulate other programs (including themselves) as their data, or that do at compile time part of the work otherwise done at run time. In many cases, this allows programmers to get more done in the same amount of time than they could by writing all the code manually.
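Treating program structure as data can be sketched with Python's `type()` building a class at run time (the `make_point_class` factory and its field list are invented for illustration):

```python
# Metaprogramming sketch: a program constructs a class at run time,
# treating code structure as ordinary data.
fields = ["x", "y"]

def make_point_class(field_names):
    def __init__(self, *values):
        for name, value in zip(field_names, values):
            setattr(self, name, value)
    # type(name, bases, namespace) builds a class object dynamically.
    return type("Point", (object,), {"__init__": __init__, "fields": field_names})

Point = make_point_class(fields)
p = Point(3, 4)
```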
Modular languages
Modular programming is a programming paradigm of organising functions and symbols into independent modules.
- Ada
- ALGOL
- BlitzMax
- C (via Clang extensions)
- C++ (via C++ modules)
- C#
- Clojure
- COBOL
- Common Lisp
- D
- Dart
- eC
- Erlang
- Elixir
- Elm
- F
- F#
- Fortran
- Go
- Haskell
- IBM/360 Assembler
- IBM System/38 and AS/400 Control Language (CL)
- IBM RPG
- Java (via Java packages and Java modules)
- JavaScript[38]
- Julia
- MATLAB
- ML
- Modula, Modula-2, Modula-3
- Morpho
- NEWP
- Oberon, Oberon-2
- Objective-C
- OCaml
- Pascal derivatives
- Perl
- PHP
- PL/I
- PureBasic
- Python
- R
- Ruby[39]
- Rust
- Visual Basic (.NET)
- WebDNA
Multiparadigm languages
Multiparadigm languages support more than one programming paradigm. They allow a program to use more than one programming style. The goal is to allow programmers to use the best tool for a job, admitting that no one paradigm solves all problems in the easiest or most efficient way.
- 1C:Enterprise programming language (generic, imperative, object-oriented, prototype-based, functional)
- Ada (concurrent, distributed, generic (template metaprogramming), imperative, object-oriented (class-based))
- ALF (functional, logic)
- Alma-0 (constraint, imperative, logic)
- APL (functional, imperative, object-oriented (class-based))
- BETA (functional, imperative, object-oriented (class-based))
- C++ (generic, imperative, object-oriented (class-based), functional, metaprogramming)
- C# (generic, imperative, object-oriented (class-based), functional, declarative)
- Ceylon (generic, imperative, object-oriented (class-based), functional, declarative)
- ChucK (imperative, object-oriented, time-based, concurrent, on-the-fly)
- Cobra (generic, imperative, object-oriented (class-based), functional, contractual)
- Common Lisp (functional, imperative, object-oriented (class-based), aspect-oriented (user may add further paradigms, e.g., logic))
- Curl (functional, imperative, object-oriented (class-based), metaprogramming)
- Curry (concurrent, functional, logic)
- D (generic, imperative, functional, object-oriented (class-based), metaprogramming)
- Dart (generic, imperative, functional, object-oriented (class-based))
- Delphi Object Pascal (generic, imperative, object-oriented (class-based), metaprogramming)
- Dylan (functional, object-oriented (class-based))
- ECMAScript (functional, imperative, object-oriented (prototype-based))
- Eiffel (imperative, object-oriented (class-based), generic, functional (agents), concurrent (SCOOP))
- F# (functional, generic, object-oriented (class-based), language-oriented)
- Fantom (functional, object-oriented (class-based))
- Go (imperative, procedural),
- Groovy (functional, object-oriented (class-based), imperative, procedural)
- Harbour
- Hop
- J (functional, imperative, object-oriented (class-based))
- Java (generic, imperative, object-oriented (class-based), functional)
- Julia (imperative, multiple dispatch ("object-oriented"), functional, metaprogramming)
- LabVIEW (visual, dataflow, concurrent, modular, functional, object-oriented, scripting)
- Lua (functional, imperative, object-oriented (prototype-based))
- Mercury (functional, logical, object-oriented)
- Metaobject protocols (object-oriented (class-based, prototype-based))
- Nemerle (functional, object-oriented (class-based), imperative, metaprogramming)
- Objective-C (imperative, object-oriented (class-based), reflective)
- OCaml (functional, imperative, object-oriented (class-based), modular)
- Oz (functional (evaluation: eager, lazy), logic, constraint, imperative, object-oriented (class-based), concurrent, distributed), and Mozart Programming System cross-platform Oz
- Object Pascal (imperative, object-oriented (class-based))
- Perl (imperative, functional (can't be purely functional), object-oriented, class-oriented, aspect-oriented (through modules))
- PHP (imperative, object-oriented, functional (can't be purely functional))
- Pike (interpreted, general-purpose, high-level, cross-platform, dynamic programming language)
- Prograph (dataflow, object-oriented (class-based), visual)
- Python (functional, compiled, interpreted, object-oriented (class-based), imperative, metaprogramming, extension, impure, interactive mode, iterative, reflective, scripting)
- R (array, interpreted, impure, interactive mode, list-based, object-oriented prototype-based, scripting)
- Racket (functional, imperative, object-oriented (class-based) and can be extended by the user)
- Raku (concurrent, concatenative, functional, generic, imperative, metaprogramming, object-oriented, pipelines, reactive, reflective; constraint and distributed via libraries)
- Rebol (functional, imperative, object-oriented (prototype-based), metaprogramming (dialected))
- Red (functional, imperative, object-oriented (prototype-based), metaprogramming (dialected))
- ROOP (imperative, logic, object-oriented (class-based), rule-based)
- Ring (imperative, functional, object-oriented (class-based), metaprogramming, declarative, natural)
- Ruby (imperative, functional, object-oriented (class-based), metaprogramming)
- Rust (concurrent, functional, imperative, object-oriented, generic, metaprogramming, compiled)
- Scala (functional, object-oriented)
- Seed7 (imperative, object-oriented, generic)
- SISAL (concurrent, dataflow, functional)
- Spreadsheets (functional, visual)
- Swift (protocol-oriented, object-oriented, functional, imperative, block-structured)
- Tcl (functional, imperative, object-oriented (class-based))
- Tea (functional, imperative, object-oriented (class-based))
- V (Vlang) (functional, imperative, procedural, structured, concurrent)
- Windows PowerShell (functional, imperative, pipeline, object-oriented (class-based))
- Wolfram Mathematica (Wolfram language)
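Mixing paradigms within one language can be illustrated in Python, itself listed above, by solving one invented task (summing squares) three ways:

```python
# One task, three styles a multiparadigm language can mix freely.
data = [1, 2, 3, 4]

# Imperative: explicit loop and mutation.
total_imp = 0
for n in data:
    total_imp += n * n

# Functional: higher-order functions, no mutation.
total_fun = sum(map(lambda n: n * n, data))

# Object-oriented: state and behavior bundled in a class.
class SquareSummer:
    def __init__(self, items):
        self.items = items
    def total(self):
        return sum(n * n for n in self.items)

total_oo = SquareSummer(data).total()
```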
Numerical analysis
Several general-purpose programming languages, such as C and Python, are also used for technical computing; this list focuses on languages almost exclusively used for technical computing.
Non-English-based languages
- Chinese BASIC (Chinese)
- Fjölnir (Icelandic)
- Language Symbolique d'Enseignement (French)
- Rapira (Russian)
- ezhil (Tamil)
Object-oriented class-based languages
[edit]Class-based object-oriented programming languages support objects defined by their class. Class definitions include member data. Message passing is a key concept, if not the main concept, in object-oriented languages.
Polymorphic functions parameterized by the class of some of their arguments are typically called methods. In languages with single dispatch, classes typically also include method definitions. In languages with multiple dispatch, methods are defined by generic functions. There are exceptions where single dispatch methods are generic functions (e.g. Bigloo's object system).
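The contrast between single dispatch (the method is chosen by the receiver's class alone) and generic-function dispatch can be sketched in Python, which supports both styles; the class and function names below are illustrative, not from any particular language's standard library.

```python
from functools import singledispatch

# Single dispatch: the method is selected by the class of the receiver.
class Shape:
    def area(self):
        raise NotImplementedError

class Square(Shape):
    def __init__(self, side):
        self.side = side
    def area(self):
        return self.side ** 2

# Generic-function style: the function lives outside any class, and
# functools.singledispatch selects an implementation by the type of
# the first argument (true multiple dispatch would consider them all).
@singledispatch
def describe(obj):
    return "unknown"

@describe.register
def _(obj: Square):
    return f"square with area {obj.area()}"

@describe.register
def _(obj: int):
    return f"integer {obj}"

print(describe(Square(3)))  # square with area 9
print(describe(4))          # integer 4
```

Languages with multiple dispatch, such as Julia or CLOS, generalize this by selecting on the runtime types of every argument rather than just the first.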
Single dispatch
[edit]- ActionScript 3.0
- Actor
- Ada 95 and Ada 2005 (multi-purpose language)
- APL
- BETA
- C++
- C#
- Ceylon
- Dart
- Oxygene (formerly named Chrome)
- ChucK
- Cobra
- ColdFusion
- Curl
- D
- Distributed Application Specification Language (DASL)
- Delphi Object Pascal
- E
- GNU E
- Eiffel
- Fortran 2003
- Fortress
- Gambas
- Game Maker Language
- Harbour
- J
- Java
- LabVIEW
- Lua
- Modula-2 (data abstraction, information hiding, strong typing, full modularity)
- Modula-3 (added more object-oriented features to Modula-2)
- Nemerle
- NetRexx
- Oberon-2 (full object-orientation equivalence in an original, strongly typed, Wirthian manner)
- Object Pascal
- Object REXX
- Objective-C (a superset of C adding a Smalltalk derived object model and message passing syntax)
- OCaml
- OpenEdge Advanced Business Language (ABL)
- Oz, Mozart Programming System
- Perl 5
- PHP
- Pike
- Prograph
- Python (interpreted language, optionally object-oriented)
- Revolution (programmer does not get to pick the objects)
- Ring
- Ruby
- Scala
- Speakeasy
- Simula (first object-oriented language, developed by Ole-Johan Dahl and Kristen Nygaard)
- Smalltalk (pure object-orientation, developed at Xerox PARC)
- SPIN
- SuperCollider
- VBScript (Microsoft scripting language for Windows Script Host and classic ASP)
- Visual DataFlex
- Visual FoxPro
- Visual Prolog
- X++
- Xojo
- XOTcl
Object-oriented prototype-based languages
[edit]Prototype-based languages are object-oriented languages where the distinction between classes and instances has been removed:
- 1C:Enterprise programming language
- Actor-Based Concurrent Language (ABCL, ABCL/1, ABCL/R, ABCL/R2, ABCL/c+)
- Agora
- Cecil
- ECMAScript
- ActionScript
- ECMAScript for XML
- JavaScript (first named Mocha, then LiveScript)
- JScript
- Etoys in Squeak
- Io
- Lua
- MOO
- NewtonScript
- Obliq
- R
- Rebol
- Red
- Self (first prototype-based language, derived from Smalltalk)
- TADS
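The delegation mechanism at the heart of these languages can be sketched even in class-based Python; this is a minimal illustration, with all names (`Proto`, `clone`) invented for the example, not taken from any listed language.

```python
class Proto:
    """A toy prototype object: lookups that fail locally are delegated
    up the prototype chain, as in Self or JavaScript."""
    def __init__(self, proto=None, **slots):
        self.proto = proto
        self.__dict__.update(slots)

    def __getattr__(self, name):
        # Called only when normal lookup fails; walk the prototype chain.
        if self.proto is not None:
            return getattr(self.proto, name)
        raise AttributeError(name)

    def clone(self, **overrides):
        # New objects are not instances of a class; they simply
        # delegate to an existing object and override some slots.
        return Proto(proto=self, **overrides)

point = Proto(x=0, y=0)
moved = point.clone(x=5)
print(moved.x, moved.y)  # 5 0  (x overridden, y delegated to the prototype)
```

Note that there is no class hierarchy here: `moved` is defined entirely in terms of the existing object `point`, which is the defining trait of prototype-based systems.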
Off-side rule languages
[edit]Off-side rule languages denote blocks of code by their indentation.
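Python is the best-known off-side rule language; a small sketch shows how block extent is carried by indentation alone, with no braces or `end` keywords.

```python
# In an off-side-rule language, indentation is the block delimiter.
def classify(n):
    if n < 0:
        return "negative"   # indented: belongs to the 'if' block
    return "non-negative"   # dedented: the 'if' block has ended

print(classify(-3))  # negative
print(classify(7))   # non-negative
```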
Procedural languages
[edit]Procedural programming languages are based on the concept of the unit and scope (the data viewing range) of an executable code statement. A procedural program is composed of one or more units or modules, either user coded or provided in a code library; each module is composed of one or more procedures, also called a function, routine, subroutine, or method, depending on the language. Examples of procedural languages include:
- Ada (multi-purpose language)
- ALGOL 58
- ALGOL 60 (very influential language design)
- SMALL Machine ALGOL Like Language
- ALGOL 68
- Alma-0
- BASIC (versions before about 1990 especially lack most modularity)
- BCPL
- BLISS
- C
- C++
- C# (similar to Java/C++)
- Ceylon
- CHILL
- ChucK (C/Java-like syntax, with new syntax elements for time and parallelism)
- COBOL
- Cobra
- ColdFusion
- CPL (Combined Programming Language)
- Curl
- D
- Distributed Application Specification Language (DASL) (combines declarative and imperative programming)
- ECMAScript
- ActionScript
- ECMAScript for XML
- JavaScript (first named Mocha, then LiveScript)
- JScript
- Source
- Eiffel
- Forth
- Fortran (better modularity in later Standards)
- GAUSS
- Go
- Harbour
- HyperTalk
- Java
- JOVIAL
- Julia
- Language H
- Lasso
- Modula-2 (fundamentally based on modules)
- MATLAB
- Mesa
- MUMPS (first release was more modular than other languages of the time; the standard has become even more modular since then)
- Nemerle
- Nim
- Oberon, Oberon-2 (improved, smaller, faster, safer follow-ons for Modula-2)
- OCaml
- Occam
- Oriel
- Pascal (successor to ALGOL 60, predecessor of Modula-2)
- Free Pascal (FPC)
- Object Pascal, Delphi
- PCASTL
- Perl
- Pike
- PL/C
- PL/I (large general-purpose language, originally for IBM mainframes)
- Plus
- PowerShell
- PROSE
- Python
- R
- Raku
- Rapira
- RPG
- Rust
- S-Lang
- VBScript
- Visual Basic
- Visual FoxPro
- Wolfram Mathematica (Wolfram language)
- Microsoft Dynamics AX (X++)
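The modular decomposition described above, in which a program is built from procedures that communicate through parameters and return values, can be sketched in Python (itself listed above); the procedure names are illustrative.

```python
# Procedural style: top-down decomposition into reusable procedures,
# each with its own local scope, exchanging data via parameters.
def mean(values):
    return sum(values) / len(values)

def variance(values):
    m = mean(values)                        # one procedure reuses another
    return sum((v - m) ** 2 for v in values) / len(values)

def report(values):
    # The high-level procedure delegates the details to the ones above.
    return f"mean={mean(values):.1f} variance={variance(values):.1f}"

print(report([2, 4, 4, 4, 5, 5, 7, 9]))  # mean=5.0 variance=4.0
```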
Query languages
[edit]Reflective languages
[edit]Reflective programming languages let programs examine and possibly modify their high-level structure at runtime or compile-time. This is most common in high-level virtual machine programming languages like Smalltalk, and less common in lower-level programming languages like C. Languages and platforms supporting reflection:
- Befunge
- C++ (since C++26)
- Ceylon
- Charm
- ChucK
- CLI
- Cobra
- Component Pascal BlackBox Component Builder
- Curl
- Cypher
- Delphi Object Pascal
- ECMAScript
- Emacs Lisp
- Eiffel
- Harbour
- Julia
- JVM
- Lisp
- Lua
- Maude system
- Oberon-2 – ETH Oberon System
- Objective-C
- PCASTL
- Perl
- PHP
- Pico
- Poplog
- PowerShell
- Prolog
- Python
- Raku[41]
- Rebol
- Red
- Ring
- Ruby
- Rust (with third-party libraries)[42]
- Smalltalk (pure object-orientation, originally from Xerox PARC)
- SNOBOL
- Tcl
- Wolfram Mathematica (Wolfram language)
- XOTcl
- X++
- Xojo
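As a concrete illustration of the runtime examination and modification described above, Python (listed above) exposes reflection through built-ins such as `getattr`/`setattr` and the `inspect` module; the `Greeter` class is invented for the example.

```python
import inspect

class Greeter:
    def greet(self, name):
        return f"hello, {name}"

g = Greeter()

# Examine structure at runtime: list the object's bound methods.
methods = [n for n, _ in inspect.getmembers(g, inspect.ismethod)]
print(methods)  # ['greet']

# Invoke a method located by name rather than written in the source.
print(getattr(g, "greet")("world"))  # hello, world

# Modify structure at runtime: attach a new method to the class.
setattr(Greeter, "farewell", lambda self, name: f"bye, {name}")
print(g.farewell("world"))  # bye, world
```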
Rule-based languages
[edit]Rule-based languages instantiate rules when activated by conditions in a set of data. Of all possible activations, some set is selected and the statements belonging to those rules execute. Rule-based languages include:[citation needed]
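The activation cycle described above, where rules fire when their conditions match the current data and execution repeats until nothing new can be derived, can be sketched as a toy forward-chaining engine in Python; all rule and fact names are invented for the example.

```python
# A toy rule-based engine: each rule pairs a condition over the working
# set of facts with the facts it asserts when it fires.
rules = [
    (lambda f: "rain" in f,                            {"wet_ground"}),
    (lambda f: "wet_ground" in f,                      {"slippery"}),
    (lambda f: "freezing" in f and "wet_ground" in f,  {"icy"}),
]

def run(facts):
    facts = set(facts)
    changed = True
    while changed:                      # repeat until a fixed point
        changed = False
        for condition, consequences in rules:
            if condition(facts) and not consequences <= facts:
                facts |= consequences   # the rule "fires"
                changed = True
    return facts

print(sorted(run({"rain", "freezing"})))
# ['freezing', 'icy', 'rain', 'slippery', 'wet_ground']
```

Production systems such as CLIPS or Drools follow this match-select-act cycle, usually with far more sophisticated conflict-resolution strategies for choosing which activations to execute.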
Scripting languages
[edit]- AngelScript
- AppleScript
- AutoHotKey
- AutoIt
- AWK
- bc
- BeanShell
- C (via Tiny C Compiler)
- Ch (Embeddable C/C++ interpreter)
- CLI
- CLIST
- ColdFusion
- ECMAScript
- ActionScript
- ECMAScript for XML
- JavaScript (first named Mocha, then LiveScript)
- JScript
- Source
- Emacs Lisp
- CMS EXEC
- EXEC 2
- Game Maker Language (GML)
- GDScript
- Io
- JASS
- Julia (compiled on the fly to machine code by default; interpretation also available)
- JVM
- Lasso
- Lua
- MAXScript
- MEL
- Oriel
- Pascal Script
- Perl
- PHP (intended for Web servers)
- Python
- R
- Raku
- Rebol
- Red
- Rexx
- Object REXX (OREXX, OOREXX)
- Revolution
- Ring
- Ruby
- RUNCOM (scripting language for running CTSS programs)
- S-Lang
- sed
- Smalltalk
- Squirrel
- Tea
- Tcl
- TorqueScript
- VBScript
- Many shell command languages have powerful scripting abilities:
- sh and compatibles
- DIGITAL Command Language (DCL) on VMS
- PowerShell (.NET-based CLI)
Stack-based languages
[edit]Stack-based languages are a type of data-structured language that are based on the stack data structure.
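The evaluation model shared by stack languages such as Forth and PostScript, in which operands are pushed and each operator pops its arguments and pushes its result, can be sketched with a small postfix (RPN) evaluator in Python; the function name is illustrative.

```python
# Stack-based evaluation of postfix expressions.
def eval_rpn(tokens):
    stack = []
    ops = {"+": lambda a, b: a + b,
           "-": lambda a, b: a - b,
           "*": lambda a, b: a * b}
    for tok in tokens:
        if tok in ops:
            b = stack.pop()          # top of stack is the second operand
            a = stack.pop()
            stack.append(ops[tok](a, b))
        else:
            stack.append(int(tok))
    return stack.pop()

# "3 4 + 2 *" is postfix for (3 + 4) * 2
print(eval_rpn("3 4 + 2 *".split()))  # 14
```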
Synchronous languages
[edit]Synchronous programming languages are optimized for programming reactive systems, systems that are often interrupted and must respond quickly. Many such systems are also called realtime systems, and are used often in embedded systems.
Examples:
Shading languages
[edit]A shading language is a graphics programming language adapted to programming shader effects. Such languages usually provide special data types, like "color" and "normal". Due to the variety of target markets for 3D computer graphics, different shading languages have been developed for real-time and offline rendering.
Real-time rendering
[edit]They provide both higher hardware abstraction and a more flexible programming model than previous paradigms which hardcoded transformation and shading equations. This gives the programmer greater control over the rendering process and delivers richer content at lower overhead.
- Adobe Graphics Assembly Language (AGAL)[44]
- ARB assembly language (ARB assembly)
- OpenGL Shading Language (GLSL or glslang)
- High-Level Shading Language (HLSL) or DirectX Shader Assembly Language
- PlayStation Shader Language (PSSL)
- Metal Shading Language (MSL)
- Cg
Offline rendering
[edit]Shading languages used in offline rendering aim for maximum image quality rather than speed. Processing such shaders is time-consuming, and the computational power required to produce photorealistic results can be expensive.
- RenderMan Shading Language (RSL)
- Open Shading Language (OSL)
Syntax-handling languages
[edit]These languages assist with generating lexical analyzers and parsers for context-free grammars.
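The lexer-plus-parser pipeline these tools generate can be sketched by hand for a tiny context-free grammar; this is an illustrative recursive-descent sketch in Python, not the output of any particular generator.

```python
import re

# A tiny lexer and recursive-descent parser for the grammar
#   expr ::= term (('+' | '-') term)*
#   term ::= NUMBER | '(' expr ')'
TOKEN = re.compile(r"\s*(?:(\d+)|(.))")

def tokenize(text):
    # Each match is either a number or a single-character symbol.
    return [num or sym for num, sym in TOKEN.findall(text)]

def parse_expr(tokens):
    value = parse_term(tokens)
    while tokens and tokens[0] in "+-":
        op = tokens.pop(0)
        rhs = parse_term(tokens)
        value = value + rhs if op == "+" else value - rhs
    return value

def parse_term(tokens):
    tok = tokens.pop(0)
    if tok == "(":
        value = parse_expr(tokens)
        tokens.pop(0)                # consume the closing ')'
        return value
    return int(tok)

print(parse_expr(tokenize("1 + (2 - 3) + 10")))  # 10
```

Generators such as Lex/Yacc or ANTLR automate exactly these two stages, deriving the tokenizer from regular expressions and the parser from the grammar rules.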
System languages
[edit]A system programming language is designed for low-level tasks such as memory management and task management. The term usually refers to languages used for systems programming: writing system software, which often requires different development approaches than application software.
System software is computer software designed to operate and control computer hardware, and provide a platform to run application software. System software includes software categories such as operating systems, utility software, device drivers, compilers, and linkers. Examples of system languages include:
| Language | Originator | First appeared | Influenced by | Used for |
|---|---|---|---|---|
| ESPOL | Burroughs Corporation | 1961 | ALGOL 60 | MCP |
| PL/I | IBM, SHARE | 1964 | ALGOL 60, FORTRAN, some COBOL | Multics |
| PL360 | Niklaus Wirth | 1968 | ALGOL 60 | ALGOL W |
| C | Dennis Ritchie | 1969 | BCPL | Most operating system kernels, including Windows NT and most Unix-like systems |
| PL/S | IBM | 196x | PL/I | OS/360 |
| BLISS | Carnegie Mellon University | 1970 | ALGOL-PL/I[46] | VMS (portions) |
| PL/8 | IBM | 197x | PL/I | AIX |
| PL/MP and PL/MI | IBM | 197x | PL/I | CPF, OS/400 |
| PL-6 | Honeywell, Inc. | 197x | PL/I | CP-6 |
| SYMPL | CDC | 197x | JOVIAL | NOS subsystems, most compilers, FSE editor |
| C++ | Bjarne Stroustrup | 1979 | C, Simula | See C++ Applications[47] |
| Ada | Jean Ichbiah, S. Tucker Taft | 1983 | ALGOL 68, Pascal, C++, Java, Eiffel | Embedded systems, OS kernels, compilers, games, simulations, CubeSat, air traffic control, and avionics |
| D | Digital Mars | 2001 | C++ | Multiple domains[48] |
| Nim | Andreas Rumpf | 2008 | Ada, Modula-3, Lisp, C++, Object Pascal, Python, Oberon | OS kernels, compilers, games |
| Rust | Mozilla Research[49] | 2010 | C++, Haskell, Erlang, Ruby | Servo layout engine, RedoxOS |
| Swift | Apple Inc. | 2014 | C, Objective-C, Rust | macOS, iOS app development[h] |
| Zig | Andrew Kelley | 2016 | C, C++, LLVM IR, Go, Rust, JavaScript | As a replacement for C |
| V (Vlang) | Alexander Medvednikov | 2019 | C, Go, Oberon-2, Rust, Swift, Kotlin | Vinix OS, OS kernels, compilers, games |
Transformation languages
[edit]Transformation languages transform (translate) source code specified in one formal language into a defined destination format. They are most commonly used in intermediate components of larger systems to adapt internal results for input into a succeeding processing routine.
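A miniature source-to-source transformation can be sketched with Python's standard `ast` module: the tree of an input program is rewritten and emitted as new source. The transformation chosen here (turning every `+` into `*`) is purely illustrative.

```python
import ast

class SwapAddToMult(ast.NodeTransformer):
    """Rewrite every binary '+' in the tree into '*'."""
    def visit_BinOp(self, node):
        self.generic_visit(node)      # transform nested expressions first
        if isinstance(node.op, ast.Add):
            node.op = ast.Mult()
        return node

source = "result = (1 + 2) + 3"
tree = ast.fix_missing_locations(SwapAddToMult().visit(ast.parse(source)))

print(ast.unparse(tree))              # the transformed source text

# The transformed tree is itself executable, like any translator output.
ns = {}
exec(compile(tree, "<transformed>", "exec"), ns)
print(ns["result"])  # 6
```

Full transformation languages such as XSLT or TXL generalize this pattern: parse to a tree, apply rewrite rules, and serialize to the destination format.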
Visual languages
[edit]Visual programming languages let users specify programs in a two-(or more)-dimensional way, instead of as one-dimensional text strings, via graphic layouts of various types. Some dataflow programming languages are also visual languages.
Wirth languages
[edit]Computer scientist Niklaus Wirth designed and implemented several influential languages.
- ALGOL W
- Euler
- Modula
- Oberon (Oberon, Oberon-07, Oberon-2)
- Pascal
- Object Pascal (umbrella name for Delphi, Free Pascal, Oxygene, others)
XML-based languages
[edit]These are languages based on or that operate on XML.
- Ant
- Cω
- ECMAScript for XML
- Extensible Application Markup Language (XAML)
- MXML
- LZX
- XPath
- XQuery
- XProc
- eXtensible Stylesheet Language Transformations (XSLT)
See also
[edit]Notes
[edit]- ^ Some Ada implementations include a garbage collector,[17] though the language specification does not require its inclusion.[citation needed]
- ^ Developers initially had to manually reclaim memory using the save and restore operators. PostScript Level 2 introduced a garbage collector, but its usage is optional.[27]
- ^ On Apple platforms, these functions are imported from the C standard library (which is imported from Foundation, AppKit or UIKit); on Linux, the developer needs to import Glibc, and ucrt on Windows.[citation needed]
- ^ The objects of SQL are collections of database records, called tables. A full programming language can specify algorithms, irrespective of runtime. Thus an algorithm can be considered to generate usable results. In contrast, SQL can only select records that are limited to the current collection, the data at hand in the system, rather than produce a statement of the correctness of the result.
- ^ A notable exception would be the Soviet/Russian 1801 series CPU, which originally used their own domestic ISA, but were later redesigned to be PDP-11 compatible as a policy decision.
- ^ a b Submodels are not listed, only base models.
- ^ The concept of an object with traditional single-dispatch OO semantics is not present in Julia, which instead uses the more general multiple dispatch on different types at runtime.
- ^ Swift uses automatic reference counting.
References
[edit]- ^ "Operators". Retrieved 2024-05-13.
- ^ "wrap".
- ^ ""Aspects in Raku"".
- ^ Documentation » The Python Standard Library » Concurrent Execution
- ^ "Channels and other mechanisms".
- ^ "ProblemSolver".
- ^ Bright, Walter (2014-11-01). "D Programming Language, Contract Programming". Digital Mars. Retrieved 2014-11-10.
- ^ Hodges, Nick. "Write Cleaner, Higher Quality Code with Class Contracts in Delphi Prism". Embarcadero Technologies. Archived from the original on 26 April 2021. Retrieved 20 January 2016.
- ^ Findler, Felleisen Contracts for Higher-Order Functions
- ^ "Scala Standard Library Docs - Assertions". EPFL. Retrieved 2019-05-24.
- ^ Strong typing as another "contract enforcing" in Scala, see discussion at scala-lang.org/.
- ^ a b "Indentation based syntax · rsdn/nemerle Wiki". GitHub. Retrieved 2022-03-18.
- ^ "Solidity: Solidity 0.8.11 documentation".
- ^ "Iterator".
- ^ "std::pointer_safety - cppreference.com". en.cppreference.com. Retrieved 2024-12-09.
- ^ JF Bastien; Alisdair Meredith (2021-04-16). "Removing Garbage Collection Support".
- ^ "Conservative Garbage Collection for GNAT". Florian Weimer's Home Page. Retrieved 2025-03-12.
- ^ "Memory Management · BlitzMax". Retrieved 2023-07-14.
- ^ "Pointers · BlitzMax". Retrieved 2023-07-14.
- ^ "BRL.Blitz · BlitzMax". Retrieved 2023-07-14.
- ^ "Using Pointers in an ILE COBOL Program - IBM Documentation". IBM. June 2012. Retrieved 2023-07-14.
- ^ "HEAP - IBM Documentation". IBM. Retrieved 2023-07-14.
- ^ "SOM-based OO COBOL language elements that are changed - IBM Documentation". IBM. Retrieved 2023-07-14.
- ^ "Garbage Collection". D Programming Language. Retrieved 2022-03-18.
- ^ "Nim's Memory Management". Retrieved 2022-03-18.
- ^ "About Memory Management". Apple Developer. Retrieved 2025-03-12.
- ^ Adobe (February 1999). PostScript Language Reference, third edition (PDF). Addison-Wesley Publishing Company. pp. 56–65.
- ^ "alloc::rc - Rust". Retrieved 2025-03-12.
- ^ "V Documentation". Retrieved 2025-03-12.
- ^ "Projects/Vala/ReferenceHandling - GNOME Wiki!". Archived from the original on 2024-01-21. Retrieved 2022-03-21.
- ^ "Memory Allocation — Cython 3.0.0.dev0 documentation". Retrieved 2023-07-14.
- ^ "Native code interoperability – Scala Native 0.4.14 documentation". Retrieved 2023-07-05.
- ^ "Understanding Ownership - The Rust Programming Language". doc.rust-lang.org.
- ^ "Smart Pointers - The Rust Programming Language". doc.rust-lang.org.
- ^ Bentley, Jon (August 1986). "Little Languages". Communications of the ACM. 29 (8): 711–721. From his Programming Pearls column.
- ^ "Meta-programming: What, why and how". 2011-12-14.
- ^ "Procedural Macros for Generating Code from Attributes". doc.rust-lang.org.
- ^ ECMAScript® 2015 Language Specification, 15.2 Modules
- ^ "class Module - Documentation for Ruby 3.5".
- ^ "Classes and Roles".
- ^ "Meta-object protocol (MOP)".
- ^ "bevy_reflect - Rust". docs.rs. 30 May 2025.
- ^ "JBang". jbang.dev. Retrieved 11 September 2025.
- ^ Scabia, Marco. "What is AGAL". Adobe Developer Connection. Adobe. Retrieved 8 May 2018.
- ^ "Grammars".
- ^ Wulf, W.A.; Russell, D.B.; Haberman, A.N. (December 1971). "BLISS: A Language for Systems Programming". Communications of the ACM. 14 (12): 780–790. CiteSeerX 10.1.1.691.9765. doi:10.1145/362919.362936. S2CID 9564255.
- ^ "C++ Applications".
- ^ "Organizations using the D Language". D Programming Language.
- ^ "Mozilla Research". 1 January 2014.
Core Paradigms
Imperative languages
Imperative programming is a programming paradigm in which programs are constructed as a sequence of commands or statements that explicitly describe how to perform computations by modifying the program's state step by step.[21] This approach focuses on specifying the precise control flow and actions needed to achieve a result, contrasting with paradigms that abstract away such details.[1] In imperative languages, the programmer directly manages the sequence of operations, making it closely aligned with the underlying hardware execution model.[22] Key characteristics of imperative languages include the use of variables to store and update state, assignment statements to change variable values, control structures such as loops and conditional branches to direct execution flow, and procedures or subroutines to encapsulate reusable code blocks.[22] These elements enable programmers to mimic the sequential processing of von Neumann architectures, where instructions are fetched, executed, and alter memory contents iteratively.[23] Imperative programs typically emphasize mutable data and side effects, allowing for efficient manipulation of complex data structures and direct interaction with system resources like input/output devices.[24] The imperative paradigm originated in the 1950s as computing shifted toward high-level languages that abstracted machine code while retaining explicit control.[25] FORTRAN, developed by IBM and released in 1957, was the first widely used imperative language, designed primarily for scientific and engineering computations involving numerical analysis.[26] ALGOL 60, introduced in 1960 by an international committee, further standardized imperative constructs like block structures and recursion, influencing nearly all subsequent languages in this paradigm.[27] Prominent examples of imperative languages include C, created in 1972 by Dennis Ritchie at Bell Labs for systems programming on Unix, valued for its efficiency and low-level access to 
hardware.[28] Pascal, designed by Niklaus Wirth around 1970 and published in 1971, emphasized structured programming with strong typing and modular design to promote teachable, reliable code.[29] Ada, standardized in 1983 by the U.S. Department of Defense, incorporates imperative features with built-in support for concurrency and error handling, targeting safety-critical applications in aerospace and defense.[30] Imperative languages continue to dominate general-purpose computing due to their natural mapping to the von Neumann model prevalent in most processors, enabling high performance and broad applicability.[23]
Declarative languages
Declarative programming is a paradigm in which programs specify the desired outcome or logic of computation without detailing the step-by-step control flow or implementation mechanics, leaving the execution details to an underlying system or engine.[31] This approach contrasts with imperative programming by emphasizing descriptions of "what" should be achieved rather than "how," enabling higher-level abstractions that simplify complex problem-solving.[32] Key characteristics of declarative languages include the use of rules, expressions, or constraints to define relationships and goals, often leveraging built-in optimizers or interpreters to handle efficiency and order of operations.[33] These languages promote immutability and domain-specific notations, reducing the need for programmers to manage low-level details like loops or memory allocation.[33] Subtypes encompass logic programming, which uses formal logic for inference; functional styles, focusing on mathematical functions without side effects; and query-based approaches for data retrieval, though each shares the core principle of outcome specification over procedural steps.[34] Prominent examples illustrate the paradigm's versatility. Prolog, developed in 1972 by Alain Colmerauer and Philippe Roussel at the University of Marseille as a tool for natural language processing and automated theorem proving, pioneered logic-based declarative programming through its use of Horn clauses and backtracking inference.[35] SQL, created in 1974 by Donald D. Chamberlin and Raymond F. 
Boyce at IBM to query relational databases, exemplifies declarative data manipulation by allowing users to express selection criteria without specifying access paths.[36] Declarative languages reduce errors in complex domains like artificial intelligence and databases by abstracting implementation details, allowing focus on problem logic while engines optimize performance.[37] This abstraction facilitates maintainability and scalability, as seen in Prolog's role in AI inference and SQL's dominance in data management. Note that declarative approaches overlap with logic-based paradigms, such as those explored in dedicated sections on logic programming.[38]
Functional Programming
Pure functional languages
Pure functional languages adhere strictly to functional programming principles, treating functions as mathematical mappings from inputs to outputs without any side effects, such as modifying external state or performing input/output operations within function bodies. This enforces immutability, where data cannot be altered after creation, and referential transparency, meaning that expressions can be replaced by their values without changing the program's behavior.[39][40] Key characteristics of these languages include support for higher-order functions, which treat functions as first-class citizens that can be passed as arguments, returned from other functions, or stored in data structures; recursion as the fundamental mechanism for loops and iteration, avoiding mutable counters; and lazy evaluation in many implementations, where computations are deferred until their results are actually required, potentially improving efficiency by avoiding unnecessary work.[41][42] The roots of pure functional programming trace back to Alonzo Church's lambda calculus, a formal system developed in the 1930s to model computation through function abstraction and application. Practical implementations began emerging in the 1970s, with languages like SASL (1975) and KRC (1981) pioneering non-strict evaluation and pure expressions, building on earlier influences such as Landin's ISWIM (1966).[43][44][45] Prominent examples include Haskell, a standardized lazy functional language released in 1990 by a committee led by Simon Peyton Jones, which popularized pure functional programming through its comprehensive type system and module support. 
Miranda, designed by David Turner between 1983 and 1986, is a pure, polymorphic, higher-order language that influenced Haskell and emphasized non-strict semantics for concise expression of complex algorithms.[46][47] The strict purity of these languages enables significant advantages, such as straightforward parallelism due to the absence of shared mutable state, which eliminates race conditions and simplifies concurrent programming, and enhanced formal verification, where Haskell's advanced type system, including dependent types and monads, allows encoding and proving program properties mathematically.[48][49]
Impure functional languages
Impure functional programming refers to a paradigm that maintains the core principles of functional programming, such as higher-order functions, immutability where possible, and compositionality, while permitting side effects like input/output operations, mutable state, and exceptions to enhance practicality.[50] This approach allows functions to interact with the external world without strictly adhering to referential transparency, where the same inputs always produce the same outputs in isolation.[51] Key characteristics of impure functional languages include a blend of pure functions for core logic with imperative constructs for tasks requiring state management, often integrated to improve performance, enable interoperability with existing systems, or handle real-world concurrency.[52] These languages typically provide mechanisms like mutable variables or monads to encapsulate impurities, allowing developers to minimize side effects while retaining flexibility.[53] Unlike purely functional languages that prohibit side effects to ensure predictability, impure variants prioritize applicability in production environments.[54] The development of impure functional languages traces back to the 1970s, with early examples emerging from theorem-proving systems at the University of Edinburgh, where the need for efficient state handling drove the inclusion of imperative features alongside functional ones.[55] This evolution continued into the 2000s, as languages sought to bridge theoretical functional ideals with practical demands of industry, such as integration with object-oriented ecosystems and support for concurrent programming.[56] Prominent examples include ML, first implemented in 1973 as the Meta Language for the LCF theorem prover, which combines strong static typing and pattern matching with mutable references for state.[55] Scala, released in 2004, runs on the JVM and offers functional constructs like immutable collections alongside mutable options and 
imperative loops for seamless Java interoperability.[52] F#, introduced in 2005 by Microsoft Research for the .NET platform, supports functional patterns such as currying and recursion while incorporating imperative elements like mutable variables and async workflows.[53] Clojure, a Lisp dialect launched in 2007, emphasizes immutable data structures and pure functions but includes software transactional memory for safe mutable state and concurrency.[57] A unique aspect of impurity in early languages like ML is its influence on modern hybrids, as it demonstrated efficient state handling through references and exceptions without sacrificing functional expressiveness, paving the way for scalable applications in diverse domains.[55]
Object-Oriented Programming
Class-based object-oriented languages
Class-based object-oriented programming (OOP) is a paradigm in which objects are instantiated from classes that act as templates or blueprints, defining the data attributes and methods that encapsulate state and behavior. This approach emphasizes core principles including inheritance, which allows subclasses to reuse and extend the properties of parent classes; encapsulation, which bundles data and operations while restricting direct access to promote modularity; and polymorphism, enabling objects of different classes to be treated uniformly through a common interface.[58][59] The paradigm originated with Simula 67, developed in 1967 by Ole-Johan Dahl and Kristen Nygaard at the Norwegian Computing Center, marking it as the first language to introduce classes and objects for simulation purposes while laying the foundation for modern OOP through its support for hierarchical structures and dynamic behavior.[60][61] Key characteristics of class-based OOP include support for single inheritance, where a class derives from one superclass to maintain a clear hierarchy, or multiple inheritance, permitting derivation from multiple superclasses to combine features, though the latter can introduce complexities like the diamond problem in resolution. Many such languages incorporate static typing, where variable types are declared and checked at compile time to catch errors early and optimize performance.[59][62] Prominent examples illustrate the paradigm's evolution and applications. 
Java, released in 1995 by Sun Microsystems, emphasizes platform independence via the Java Virtual Machine, enabling "write once, run anywhere" for enterprise and web development.[63] C++, introduced in 1985 by Bjarne Stroustrup at Bell Labs as an extension of C, adds OOP features like classes while retaining low-level control for systems programming.[64] C#, launched by Microsoft in 2000 alongside the .NET Framework, integrates seamlessly with Windows ecosystems for building robust applications with strong type safety.[65] Smalltalk, developed starting in 1972 at Xerox PARC under Alan Kay's influence, pioneered pure OOP by treating everything as an object and emphasizing message-passing for dynamic, reflective systems.[66] A distinctive variant appears in Julia, released in 2012, which extends class-based OOP with multiple dispatch to select methods based on the types of all arguments, enhancing flexibility for scientific computing.[67]
Prototype-based object-oriented languages
Prototype-based object-oriented programming is a paradigm in which objects are created and manipulated directly, serving as prototypes from which new objects inherit properties and behaviors through mechanisms such as delegation or cloning, rather than relying on predefined classes.[68] This approach emphasizes the direct reuse and modification of existing objects, allowing for more fluid object creation without the need for class hierarchies.[69] Key characteristics of prototype-based languages include dynamic inheritance, where objects can change their prototypes at runtime, the absence of fixed classes to enforce structure, and a focus on sharing behaviors through prototype delegation or copying.[70] These features enable developers to extend or alter object behaviors on the fly, promoting flexibility in exploratory programming environments.[71] Unlike class-based systems, prototypes allow for incremental customization, where new objects start as shallow copies or delegates of existing ones, inheriting slots or methods as needed.[72] The paradigm was introduced in the 1980s with the development of the Self language at Xerox PARC, designed from 1985 to 1995 as a pure prototype-based system to simplify object-oriented programming by eliminating classes and variables in favor of slots and delegation.[70] Self's innovations, such as cloning prototypes for object creation, directly influenced subsequent languages, including JavaScript, which adopted prototypal inheritance as a core mechanism.[73] Prominent examples include JavaScript, released in 1995, which uses prototypal inheritance where objects delegate to prototype objects for property resolution, though later versions added syntactic sugar for class-like declarations without altering the underlying prototype model.[73][74] Lua, first released in 1993, implements prototype-based OOP through tables acting as objects and metatables for inheritance via the __index metamethod, enabling delegation without explicit
classes.[75][76] Io, introduced in 2002, integrates prototype-based OOP with an actor-model for concurrency, where all values are objects that inherit via message passing and cloning from prototypes like Object.[77][78]
Prototypes provide greater flexibility for runtime-modifiable structures compared to rigid class definitions, allowing objects to evolve dynamically and supporting rapid prototyping in domains like scripting and user interfaces.[68][71]
Other Paradigms
Procedural languages
Procedural programming is a subset of imperative programming that emphasizes organizing code into reusable procedures or subroutines to promote modularity and code reuse.[79] These procedures encapsulate specific tasks, allowing programs to be broken down into manageable, hierarchical components that can be invoked as needed.[80] Key characteristics of procedural languages include top-down design, where complex problems are decomposed into simpler subproblems solved by individual procedures; parameter passing mechanisms, such as pass-by-value or pass-by-reference, to exchange data between procedures; and scope management, which controls variable visibility within blocks or procedures to prevent unintended interactions.[81] This approach facilitates structured control flow, often using sequence, selection, and iteration constructs to guide execution.[82] The historical roots of procedural programming trace back to the late 1950s, with ALGOL 60, released in 1960, playing a pivotal role in standardizing procedural concepts such as block structure, recursion, and procedure calls with formal parameter passing.[27] ALGOL 60's design influenced subsequent languages by providing a rigorous, machine-independent syntax for expressing algorithms procedurally.[83] Prominent examples include Fortran, which evolved from its 1957 origins to incorporate subroutines and procedural elements in Fortran II (1958), enabling modular scientific computing; BASIC, introduced in 1964 at Dartmouth College, which supported simple procedures via GOSUB statements for beginner-friendly programming; and Modula-2, developed by Niklaus Wirth in 1978 as an extension of Pascal, which introduced modules for enhanced abstraction and separate compilation of procedures.[84][85][86] Procedural languages laid the groundwork for structured programming by promoting the avoidance of unstructured jumps like the goto statement, an idea advanced by Edsger Dijkstra in his 1968 letter "Go To Statement Considered 
Harmful," which argued for clearer, more maintainable code through procedural discipline.[87]
Logic-based languages
Logic-based programming languages, also known as logic programming languages, represent a declarative paradigm in which programs are formulated as collections of logical statements, typically in the form of Horn clauses, and execution proceeds through automated theorem proving via logical inference.[16] In this approach, computation is achieved by deriving facts from given axioms and rules using resolution, where the system's inference engine searches for proofs that satisfy queries posed to the program.[88] This contrasts with imperative paradigms by focusing on what the program should accomplish rather than how to achieve it step by step.[89] The foundations of logic programming trace back to the resolution theorem proving method introduced by J. A. Robinson in 1965, which provided a complete and mechanically verifiable inference rule for first-order logic, enabling automated deduction.[88] This work laid the groundwork for practical implementations, culminating in the development of Prolog in 1972 by Alain Colmerauer and Philippe Roussel at the University of Marseille, as part of efforts to model natural language processing through logical rules.[90] Prolog's early versions emphasized backtracking search and unification, making it suitable for non-deterministic problem solving. 
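The Horn-clause and backtracking machinery described above can be illustrated with a deliberately tiny sketch in Python; the rule set and the prove helper below are invented for illustration, and real Prolog systems add full unification over terms with variables:

```python
# Horn clauses: a head implied by a (possibly empty) list of body atoms.
# A propositional sketch only; Prolog also unifies terms containing variables.
RULES = [
    ("mortal", ["human"]),        # mortal <- human
    ("human",  ["greek"]),        # human  <- greek
    ("greek",  []),               # greek. (a fact: empty body)
    ("winged", ["horse", "myth"]),
]

def prove(goal, depth=0):
    """Backward chaining with backtracking: try each clause whose head
    matches the goal and recursively prove every atom in its body."""
    if depth > 20:                # crude guard against cyclic rule sets
        return False
    for head, body in RULES:
        if head == goal and all(prove(b, depth + 1) for b in body):
            return True
    return False

print(prove("mortal"))  # True: mortal <- human <- greek (a fact)
print(prove("winged"))  # False: "myth" is not derivable from any rule
```

Posing `prove("mortal")` corresponds to a query; the loop over clauses plays the role of the inference engine's search for a proof.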
Building on this, Datalog emerged in the early 1980s as a query language subset of Prolog, optimized for deductive databases and relational data manipulation without support for complex terms or negation.[91] Later, Mercury was introduced in 1995 by researchers at the University of Melbourne, incorporating strong static typing, modes, and compilation to lower-level target languages like C, aiming to address Prolog's performance limitations for large-scale applications.[92] Dialects of Prolog, such as SWI-Prolog, continue to evolve with enhancements for modern computing needs.[93] Key characteristics of logic-based languages include the use of Horn clauses, which are implications of the form Head ← Body₁ ∧ Body₂ ∧ … ∧ Bodyₙ, restricting programs to definite clauses for decidable inference; the unification algorithm, which matches variables and terms to bind values during resolution; and non-deterministic backtracking, which explores alternative computation paths when a branch fails to yield a solution.[89] These features enable concise expression of search problems, such as parsing or planning, through recursive rules and facts.[94] Logic-based languages have excelled in artificial intelligence, particularly for knowledge representation and inference in expert systems, where rule-based reasoning mimics human decision-making processes, as demonstrated in early systems like MYCIN from the 1970s that influenced the adoption of logical formalisms for medical diagnosis.[95]
Concurrent languages
Concurrent programming languages are designed to support the execution of multiple threads or processes simultaneously, enabling parallelism while providing mechanisms for synchronization to coordinate their interactions and avoid conflicts such as race conditions.[96] These languages incorporate primitives that allow developers to express concurrent behaviors explicitly, facilitating the development of scalable and responsive systems, particularly in domains requiring high throughput or real-time performance.[97] The foundations of concurrent programming languages trace back to the 1970s, with significant influence from formal models like Communicating Sequential Processes (CSP), introduced by Tony Hoare in 1978, which emphasized communication between independent processes as a core primitive for concurrency.[98] This model inspired subsequent language designs by promoting message-passing over shared state to ensure safe parallel execution. Key characteristics of these languages include support for various concurrency models, such as the actor model where independent entities communicate via asynchronous messages, channels for typed message passing between processes, lightweight threads for efficient multiplexing, and shared memory models augmented with locks or atomic operations for synchronization.[98] Prominent examples illustrate these principles in practice. 
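The channel-based CSP style described above can be sketched in Python using threads and a thread-safe queue standing in for a typed channel (an illustrative sketch, not any particular language's concurrency API):

```python
import threading
import queue

# CSP-flavored concurrency: independent workers communicate over a
# channel (here a thread-safe queue) instead of sharing mutable state.
def producer(ch):
    for i in range(5):
        ch.put(i * i)       # send a message down the channel
    ch.put(None)            # sentinel: no more messages

def consumer(ch, results):
    while True:
        msg = ch.get()      # receive blocks until a message arrives
        if msg is None:
            break
        results.append(msg)

channel = queue.Queue()
results = []
t1 = threading.Thread(target=producer, args=(channel,))
t2 = threading.Thread(target=consumer, args=(channel, results))
t1.start(); t2.start()
t1.join(); t2.join()
print(results)  # [0, 1, 4, 9, 16]
```

Because the only shared object is the channel, no explicit lock is needed; this is the same design pressure that motivates message passing in Erlang processes and Go's goroutines with channels.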
Erlang, developed in 1986 at Ericsson for telecommunications systems, employs the actor model through lightweight processes and message passing, enabling fault-tolerant distributed applications that can handle millions of concurrent connections without shared mutable state.[99] Go, released publicly in 2009 by Google, introduces goroutines as inexpensive threads and channels inspired by CSP for safe communication, simplifying concurrent programming for server-side applications and making it easier to achieve performance comparable to low-level threading without the complexity of manual synchronization.[100] Occam, created in 1983 by David May at Inmos, directly implements CSP concepts with channels and parallel process composition, targeting hardware like transputers for embedded concurrent systems. A notable early integration of concurrency primitives appears in Ada, standardized in 1983, where tasking features provided rendezvous-based synchronization for multiple tasks, specifically addressing the needs of real-time systems in avionics and defense applications during the 1980s by ensuring predictable timing and resource management.[101] These languages demonstrate how explicit concurrency support, often orthogonal to other paradigms like imperative programming, enables robust handling of parallelism in demanding environments.[102]
Execution Models
Compiled languages
Compiled languages are programming languages in which the source code is translated by a compiler into machine code or an intermediate form that can be executed directly by the hardware, typically before runtime.[103] This process involves ahead-of-time (AOT) compilation, where the compiler performs lexical analysis, parsing, optimization passes to improve code efficiency, and code generation to produce object files.[104] These object files are then linked with libraries and other modules to form an executable binary, resolving references to external functions and data.[105] The development of compiled languages traces back to the late 1950s, with the first optimizing compiler created for FORTRAN by John Backus and his team at IBM in 1957, marking a shift from assembly coding to higher-level abstractions for scientific computing.[106] This innovation enabled automatic generation of efficient machine code from mathematical expressions, significantly boosting programmer productivity on early computers like the IBM 704.[107] Prominent examples include C, whose compilers, such as the GNU Compiler Collection (GCC), translate portable systems code into native executables across platforms.[108] Rust, initially announced by Mozilla in 2010, emphasizes safe systems programming through its compiler, which performs rigorous checks during compilation.[109] Swift, introduced by Apple in 2014 for iOS and macOS development, compiles to optimized binaries tailored to the Apple ecosystem, supporting modern features like optionals and protocol-oriented programming.[110] A key advantage of compilation is the ability to conduct static analysis at compile time, which detects and prevents potential runtime errors before execution; for instance, Rust's borrow checker enforces memory safety rules by analyzing ownership and borrowing patterns, eliminating common vulnerabilities like null pointer dereferences without runtime overhead.[111]
Interpreted languages
Interpreted programming languages are those in which the source code is executed directly by an interpreter at runtime, without the need for prior compilation into machine code. This process involves the interpreter reading the code line by line or in smaller units, translating and executing it immediately, which contrasts with compilation's ahead-of-time approach. Key characteristics of interpreted languages include frequent support for dynamic typing, where variable types are determined during execution rather than at compile time, facilitating rapid prototyping and easier debugging through immediate feedback on errors. They also offer high portability, as the interpreter can be implemented on various platforms, allowing the same code to run across different hardware without recompilation. However, this runtime execution often results in slower performance compared to compiled languages due to the overhead of interpretation. The historical roots of interpreted languages trace back to Lisp, developed in 1958 by John McCarthy as an early tool for artificial intelligence research, where its interpreter enabled interactive experimentation with symbolic expressions. Lisp's design emphasized list processing and recursion, influencing subsequent interpreted systems by demonstrating the feasibility of runtime evaluation for complex computations. Prominent examples include Python, first released in 1991 with its reference implementation CPython using an interpreter to execute bytecode, making it popular for scripting and data analysis due to its readability and extensive libraries. Ruby, introduced in 1995 by Yukihiro Matsumoto, employs an interpreter for dynamic scripting, emphasizing developer productivity and object-oriented features, which has driven its use in web development frameworks like Ruby on Rails. 
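The runtime, statement-at-a-time execution model described above can be sketched with a toy interpreter in Python; the four-instruction mini-language below is invented purely for illustration:

```python
# A miniature interpreter: each line of "source" is read, translated,
# and executed immediately, with no ahead-of-time compilation step.
source = """\
set x 3
set y 4
add z x y
print z
"""

env = {}       # dynamic typing: names are bound only at runtime
output = []

for line in source.splitlines():
    op, *args = line.split()
    if op == "set":                  # set NAME VALUE
        env[args[0]] = int(args[1])
    elif op == "add":                # add DEST A B
        env[args[0]] = env[args[1]] + env[args[2]]
    elif op == "print":              # print NAME
        output.append(env[args[0]])

print(output)  # [7]
```

An error on line three would only surface when that line is reached, which is exactly the immediate-feedback, runtime-error behavior interpreted languages exhibit.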
JavaScript, originally created in 1995 for client-side web scripting, was initially executed via interpreters in browsers, enabling dynamic content updates without page reloads. A unique aspect of interpretation is the ability to hot-swap code during execution, allowing modifications without restarting the program, which has been particularly vital for web development in JavaScript to support real-time interactivity.
Interactive mode languages
Interactive mode languages support immediate execution and feedback in environments designed for exploratory programming, typically through a Read-Eval-Print Loop (REPL). A REPL provides a simple, iterative interface where the system reads user input as code, evaluates it, prints the output, and repeats the process, facilitating rapid prototyping and experimentation.[112][113] Key characteristics of these languages include incremental evaluation, allowing code to be executed in small segments with instant results, and seamless integration with scripting workflows for dynamic development. This mode is particularly valuable in domains requiring quick iteration, such as data analysis and scientific computing, where users can test hypotheses in real time without compiling entire programs.[112][114] The historical roots of interactive computing trace back to Dartmouth BASIC, developed in 1964 by John G. Kemeny and Thomas E. Kurtz at Dartmouth College, which leveraged time-sharing systems to enable individualized, real-time user interaction with computers.[115] This innovation democratized access to computing, shifting from batch processing to responsive sessions that influenced subsequent language designs. 
Prominent examples include Python, whose standard REPL is augmented by IPython for advanced features like command history and tab completion, and Jupyter notebooks for multimedia-rich interactivity.[114][116] R, created in 1993 by Ross Ihaka and Robert Gentleman at the University of Auckland, offers a console-based REPL tailored for statistical computing, supporting immediate data manipulation and visualization.[117] Similarly, Mathematica, launched in 1988 by Wolfram Research, pioneered a notebook-style interface for interactive symbolic and numerical computations.[118] The rise of interactive modes has profoundly impacted data science by enabling reproducible, collaborative workflows; Jupyter, spun off from IPython in 2014, has standardized the notebook format as a cornerstone for such practices.[116][119] While often relying on interpretation for execution, this mode emphasizes user-driven, session-based exploration beyond mere code running.[113]
Memory Management Types
Garbage collected languages
Garbage collection (GC) is a form of automatic memory management in which a runtime system identifies and reclaims memory occupied by objects that are no longer referenced by the program, preventing memory leaks and reducing the risk of dangling pointers.[120] This process contrasts with manual memory management by eliminating the need for explicit deallocation calls, allowing developers to focus on higher-level logic while the runtime handles resource cleanup.[121] The concept of garbage collection originated in the late 1950s, with John McCarthy introducing it in the Lisp programming language in 1959 as a mechanism to automatically reclaim unused memory during program execution.[122] It gained widespread adoption and popularization in the 1990s through Java, released in 1995, whose Java Virtual Machine (JVM) integrated GC as a core feature, making automatic memory management a standard in enterprise and web development.[123] Key characteristics of GC include algorithms like mark-and-sweep, where the runtime first marks all reachable objects starting from roots (such as stack variables) and then sweeps through the heap to free unmarked objects.[124] Generational GC further optimizes this by dividing the heap into generations based on object age—younger generations for short-lived objects collected frequently, and older ones for long-lived objects collected less often—to improve efficiency and reduce overhead.[125] However, traditional GC implementations often involve "stop-the-world" pauses, during which the application halts to allow the collector to run, potentially impacting performance in latency-sensitive applications.[125] Prominent examples of garbage-collected languages include Java, which employs generational GC in the JVM to manage object lifecycles across young and old generations.[121] Python primarily uses reference counting for immediate deallocation of unreferenced objects, augmented by a generational GC to detect and break cyclic references that 
reference counting alone cannot handle.[126] Go implements a concurrent tri-color mark-and-sweep algorithm, which colors objects as white (unvisited), gray (to visit), or black (visited) to enable collection without fully halting the application, minimizing pauses.[127] While GC simplifies development by abstracting memory management, it can introduce unpredictable latency due to collection cycles; innovations in Go's GC, such as the concurrent tri-color approach introduced in 2015 and further optimizations in 2021, aim to keep pauses under 100 microseconds in typical workloads.[127][128]
Manual memory management languages
Manual memory management requires programmers to explicitly allocate and deallocate memory for dynamic objects, typically via standard library functions such as malloc and free in C or the new and delete operators in C++ for heap storage.[129] This approach contrasts with automatic methods by placing full responsibility on the developer to track and release resources, often through pointers that reference allocated blocks.[130]
Key characteristics include providing fine-grained control over memory layout and timing, which can yield superior performance and lower overhead in resource-constrained scenarios compared to runtime-managed alternatives.[131] However, it introduces substantial risks, such as memory leaks when allocations are not freed, leading to gradual resource exhaustion, and dangling pointers when deallocation occurs before all references are cleared, potentially causing undefined behavior or crashes.[132]
Historically, manual memory management gained prominence through the C programming language, developed by Dennis Ritchie at Bell Labs from 1971 to 1973 as an evolution of the B language for implementing the Unix operating system, where explicit control was essential for efficiency in early computing environments.[133]
Examples of languages employing manual memory management include C, which relies on its standard library for core allocation primitives; C++, which builds on C's model but introduces RAII to automate cleanup via destructors at scope exit, reducing error-prone manual calls; and Rust, which uses an ownership and borrowing system enforced at compile time to ensure safe manual handling, restricting direct pointer manipulation to unsafe blocks.[129][134][135]
This paradigm excels in embedded systems, where manual control guarantees deterministic timing and avoids the unpredictable pauses of automatic reclamation, enabling deployment on hardware with severe memory and power constraints.[131] Rust's borrow checker, a core feature since the language's inaugural 0.1 release in 2012, further innovates by delivering memory safety guarantees through static analysis, eliminating common pitfalls like data races without incurring garbage collection overhead.
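Although Python itself is garbage-collected, the explicit allocate/use/free discipline described above can be demonstrated through ctypes by calling the C runtime directly; this sketch assumes a POSIX system, where `ctypes.CDLL(None)` exposes the C library already loaded into the process:

```python
import ctypes

# Manual heap management by calling libc directly (POSIX assumption:
# CDLL(None) reaches the process's C runtime; Windows differs).
libc = ctypes.CDLL(None)
libc.malloc.restype = ctypes.c_void_p
libc.malloc.argtypes = [ctypes.c_size_t]
libc.free.argtypes = [ctypes.c_void_p]

buf = libc.malloc(16)             # explicit allocation: 16 raw bytes
ctypes.memmove(buf, b"hello", 5)  # write through the returned pointer
data = ctypes.string_at(buf, 5)   # read the bytes back
libc.free(buf)                    # explicit deallocation: omitting this
                                  # call is precisely a memory leak, and
                                  # touching buf afterwards is a dangling
                                  # pointer with undefined behavior
print(data)  # b'hello'
```

Nothing tracks `buf` after `free` returns, which is the trade-off the section describes: full control over timing and layout, with the burden of correctness entirely on the programmer.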
Domain-Specific Languages
Query languages
Query languages are specialized programming languages designed for retrieving, manipulating, and managing data within databases and information systems, particularly emphasizing operations such as data selection, joins, and aggregation.[136] These languages enable users to specify desired data outcomes without detailing the procedural steps for retrieval, making them essential for interacting with structured or semi-structured data stores.[137] Key characteristics of query languages include their declarative syntax, which allows users to describe what data is needed rather than how to compute it; support for set-based operations that treat data as collections or relations for efficient processing; and reliance on query planners or optimizers to automatically generate efficient execution plans, often using techniques like indexing and cost-based analysis to minimize resource use.[136][138] This declarative nature aligns with broader declarative programming paradigms, where the focus is on the logic of the query rather than implementation details.[136] The historical development of query languages traces back to the 1960s with efforts like the CODASYL Data Base Task Group, which defined the network data model and influenced early database query approaches, paving the way for Edgar F. 
Codd's relational model introduced in 1970.[138] This relational foundation led to the creation of Structured Query Language (SQL), which was standardized by the American National Standards Institute (ANSI) in 1986 and by the International Organization for Standardization (ISO) in 1987 as ISO/IEC 9075.[138] The standard has been revised multiple times, with the latest version, SQL:2023, published in 2023, introducing features like enhanced JSON support and property graph queries.[139] Prominent examples include SQL, the ISO-standardized language for relational databases that supports declarative queries for selection, projection, and joins across tables.[139] Cypher, developed in 2011 for Neo4j graph databases, uses pattern-matching syntax to perform graph-specific queries like traversal and relationship filtering.[140] Another is Multidimensional Expressions (MDX), introduced in 1997 as part of Microsoft's OLE DB for OLAP specification, which enables queries on multidimensional data cubes for analytical processing, including slicing, dicing, and aggregations over hierarchies. A notable evolution beyond traditional relational models is seen in NoSQL query approaches, such as MongoDB's JSON-like query syntax introduced with its 2009 release, which allows flexible, document-based retrieval and manipulation in distributed, schema-less environments, or Apache Cassandra's Cassandra Query Language (CQL), introduced in 2010 for scalable NoSQL databases.[141][142]
Hardware description languages
Hardware description languages (HDLs) are specialized computer languages designed to model the structure and behavior of digital and analog electronic circuits at various levels of abstraction, from behavioral descriptions to register-transfer levels and gate-level implementations.[143] These languages enable engineers to specify hardware designs textually, facilitating simulation, verification, and synthesis into physical implementations such as ASICs or FPGAs.[144] Key characteristics of HDLs include support for concurrent signal assignments, which model parallel hardware operations; explicit timing models to capture delays and synchronization; and synthesis capabilities that translate high-level descriptions into optimized gate-level netlists.[145] The development of HDLs emerged in the 1980s to address the growing complexity of integrated circuit design. VHDL (VHSIC Hardware Description Language) was initiated in 1981 by the U.S. Department of Defense under the Very High Speed Integrated Circuit (VHSIC) program and standardized as IEEE 1076 in 1987, becoming a key DoD standard for reusable digital designs.[146] Verilog, introduced commercially in 1984 by Gateway Design Automation as a proprietary simulation language, gained widespread adoption and was standardized as IEEE 1364 in 1995.[147] Prominent examples include VHDL, which supports both digital and mixed-signal (analog-digital) modeling through extensions like VHDL-AMS; Verilog and its extension SystemVerilog (standardized as IEEE 1800 in 2005), which enhance simulation and synthesis with object-oriented features and advanced verification constructs; Chisel, an open-source HDL developed at UC Berkeley in 2012 as a Scala-embedded domain-specific language for generating highly parameterized hardware via high-level synthesis; and SpinalHDL, an open-source HDL introduced in 2015 that uses Scala for more expressive and scalable hardware designs.[148][149][150][151] HDLs are essential for programming 
field-programmable gate arrays (FPGAs), allowing reconfigurable hardware deployment. Bluespec, originating from MIT research and commercialized in 2003, introduces guarded atomic actions—concise rules with enabling conditions—to improve modularity and formal verification in hardware designs.[152]
Constraint programming languages
Constraint programming languages enable the declarative specification of problems involving variables with finite domains and relations (constraints) between them, where solutions are found through systematic search or constraint propagation techniques.[153] These languages focus on combinatorial optimization and satisfaction problems, allowing users to model constraints without prescribing algorithmic steps for resolution.[154] Key characteristics include the use of backtracking search to explore solution spaces by incrementally assigning values to variables and retracting inconsistent choices, combined with constraint propagation to prune impossible values from variable domains early in the process.[155] Arc consistency, a common propagation method, ensures that for every value in a variable's domain, there exists a compatible value in the domains of related variables, reducing search effort.[156] These languages often integrate with logic programming paradigms, extending unification with domain-specific constraint solvers for numeric or symbolic domains.[153] The roots of constraint programming trace to artificial intelligence planning efforts in the 1970s, where early constraint satisfaction techniques addressed scene labeling and resource allocation.[157] Significant advancement occurred with the development of CHIP (Constraint Handling in Prolog) in 1983, an early constraint logic programming system that combined Prolog's logic with arithmetic and Boolean constraint solving.[158] Prominent examples include MiniZinc, introduced in 2007 as a high-level modeling language that compiles to flat constraint representations for various solvers, facilitating portable problem descriptions.[159] Oz, conceived in 1991, is a multiparadigm language that embeds constraint programming within functional, imperative, and concurrent features, supporting distributed constraint solving.[160] Gecode, released in 2005, is an open-source C++ library providing efficient propagators 
and search engines for custom constraint applications.[161] ECLiPSe, originating in 1989 at the European Computer-Industry Research Centre, pioneered commercial constraint toolkits by extending Prolog with modular solvers for scheduling and optimization. Another example is Google's OR-Tools, released in 2010, which includes a CP-SAT solver for advanced constraint programming and optimization, widely used in industry as of 2025.[162][163] Constraint programming languages find extensive use in scheduling applications, such as job-shop and resource allocation, where propagation efficiently handles temporal and precedence constraints to generate feasible timetables.[164]
Specialized and Niche Languages
Esoteric languages
Esoteric programming languages, also known as esolangs, are computer programming languages intentionally designed not for practical software development but to explore unconventional ideas, serve as thought experiments, art pieces, or jokes within the programming community.[165] These languages challenge traditional notions of usability and clarity by emphasizing difficulty, humor, or theoretical abstraction over efficiency or readability.[166] Key characteristics of esoteric languages include minimalistic or obfuscated syntax, non-standard computational paradigms, and features that deliberately hinder programming, such as self-modifying code or unusual data models. For instance, many employ Turing-complete mechanisms in bizarre forms, like tape-based memory manipulation, to demonstrate computational universality while prioritizing whimsy over practicality.[166] This focus on extremity often results in languages that are theoretically powerful but extremely challenging to use, fostering creativity in language design boundaries.[167] The history of esoteric languages traces back to the early 1970s, with INTERCAL (Compiler Language With No Pronounceable Acronym) created in 1972 by Donald R. Woods and James M. 
Lyon at Princeton University as a satirical parody of contemporary programming languages like FORTRAN.[168] The genre gained momentum in the 1990s with the rise of personal computing and online sharing; Brainfuck, invented in 1993 by Urban Müller, exemplifies this era with its extreme minimalism, consisting of just eight commands to manipulate an infinite tape of cells, originally aimed at creating the smallest possible compiler for the Amiga OS.[169] The esoteric languages community expanded significantly in the 2000s through internet forums and archives, enabling collaborative experimentation and a shared appreciation for these unconventional designs.[170] Notable examples include Malbolge, developed in 1998 by Ben Olmstead, which employs a trinary (base-3) machine model and code obfuscation techniques—such as crazy operations that modify the program itself—to make writing valid programs extraordinarily difficult, with the first "Hello, World!" program emerging only after two years via automated search.[167] Befunge, created in 1993 by Chris Pressey, introduces a two-dimensional execution model where the instruction pointer moves on a grid, allowing programs to alter their own flow in non-linear paths, which has inspired code golfing contests and variants exploring multidimensional computation.[171] These languages, while overlapping with educational tools in demonstrating core concepts like Turing completeness, prioritize artistic challenge over pedagogical accessibility.[166]
Educational programming languages
Educational programming languages are designed to introduce programming concepts to beginners, particularly children and novice learners, through simplified syntax that emphasizes fundamental ideas such as variables, loops, conditionals, and procedures without the complexities of professional languages. These languages prioritize accessibility by incorporating visual elements like drag-and-drop blocks or graphical interfaces, which reduce syntax errors and encourage experimentation, while maintaining a limited scope to focus on core computational thinking rather than advanced features. Key characteristics include error-friendly design that provides immediate feedback and forgiving mechanisms to build confidence, as well as support for creative expression to engage users in meaningful projects.[172][173] The historical roots of educational programming languages trace back to Logo, developed in 1967 by Seymour Papert, Wallace Feurzeig, and Cynthia Solomon at the MIT Artificial Intelligence Laboratory as a tool for mathematical and computational education. Logo introduced turtle graphics, where users command a virtual "turtle" to draw shapes and patterns on screen, providing immediate visual feedback to illustrate concepts like sequencing and iteration in an engaging way suitable for schoolchildren. This approach revolutionized early computing education by promoting constructivist learning, where students actively build knowledge through hands-on exploration.[174][175] Subsequent examples built on these foundations to further enhance accessibility. Scratch, launched in 2007 by the MIT Media Lab, is a block-based visual language that allows users to create interactive stories, games, and animations by snapping colorful blocks together, fostering creativity and sharing within an online community. 
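The turtle-graphics idea described above reduces to a small state machine; below is a minimal Logo-style turtle sketched in Python, tracking only position and heading, with no on-screen drawing (unlike real Logo or Python's turtle module):

```python
import math

# A Logo-style "turtle" reduced to its essential state: a position
# and a heading, updated by forward/right commands.
class Turtle:
    def __init__(self):
        self.x, self.y, self.heading = 0.0, 0.0, 0.0  # start facing east

    def forward(self, dist):
        self.x += dist * math.cos(math.radians(self.heading))
        self.y += dist * math.sin(math.radians(self.heading))

    def right(self, angle):
        self.heading = (self.heading - angle) % 360

t = Turtle()
for _ in range(4):        # the classic first Logo lesson: draw a square
    t.forward(100)
    t.right(90)

# Four sides and four 90-degree turns bring the turtle home again,
# illustrating sequencing and iteration with immediate feedback.
print(abs(t.x) < 1e-9 and abs(t.y) < 1e-9)  # True: back at the origin
```

In a real Logo or Scratch environment, each `forward` also leaves a visible line, which is what gives learners the immediate visual feedback the section describes.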
Alice, originating in 1995 at Carnegie Mellon University under Randy Pausch's leadership, uses a drag-and-drop interface for 3D animations, enabling beginners to program virtual worlds and narratives to learn object-oriented principles without text-based coding. Blockly, released by Google in 2012 and maintained by the Raspberry Pi Foundation since November 2025, extends this model as a web-based library for building custom visual editors that generate code in languages like JavaScript or Python, powering tools like MIT App Inventor and influencing no-code platforms by enabling intuitive, browser-accessible learning experiences for millions.[176][177][178][179]
Scripting languages
Scripting languages are high-level programming languages designed primarily for automating tasks, such as gluing together existing system components, manipulating files, or handling web-related operations, often without the need for compilation.[180] These languages emphasize ease of use in integrating disparate software tools and performing quick, ad-hoc computations, making them ideal for system administration, data processing, and web scripting.[181] Key characteristics of scripting languages include their interpreted execution model, which allows for rapid prototyping and immediate feedback during development, as code is processed line-by-line at runtime rather than compiled beforehand.[182] They are typically cross-platform, enabling scripts to run on multiple operating systems with minimal modifications, and support dynamic typing to facilitate quick variable handling and reduce boilerplate code.[183] This interpreted nature contributes to their versatility in automating repetitive tasks efficiently.[184] The history of scripting languages traces back to the late 1970s with the development of awk in 1977 at AT&T Bell Laboratories, initially created as a tool for text processing and pattern scanning in Unix environments.[185] A significant advancement came in 1987 with the release of Perl by Larry Wall, designed as a Unix scripting language to simplify report processing and system administration tasks by combining features from awk, sed, and other utilities.[186] Prominent examples include Perl, renowned for its powerful text-processing capabilities through regular expressions and string manipulation, which made it a staple for system administrators handling log files and data extraction.[186] PHP, first developed in 1994 by Rasmus Lerdorf as a set of CGI binaries for personal web page tools and publicly released in 1995, became widely adopted for server-side web development, embedding scripts directly into HTML to generate dynamic content.[187] Bash, released 
in 1989 by Brian Fox for the GNU Project, serves as a shell scripting language that extends command-line interactions, allowing users to automate sequences of Unix commands for file management and process control.[188] A notable evolution in scripting occurred with the introduction of Node.js in 2009 by Ryan Dahl, which brought JavaScript—traditionally a client-side language—into server-side scripting through its event-driven, non-blocking I/O model, facilitating scalable web applications and full-stack development with a single language.[189]
Syntax and Structure Categories
Curly bracket languages
Curly bracket languages, also known as C-style languages, constitute a syntactic family in programming where curly braces ({}) are used to delimit code blocks, functions, and scopes, providing explicit boundaries for structured control flow and variable visibility. This approach contrasts with keyword-based or indentation-sensitive delimitation in other language families, allowing for flexible code organization within nested structures. The syntax emphasizes readability through visual separation of logical units, though it requires careful matching of opening and closing braces to avoid errors.[190][191]
The progenitor of this family is the C programming language, developed in the early 1970s by Dennis Ritchie at Bell Laboratories to support the implementation of the Unix operating system on resource-constrained hardware. Evolving from earlier languages like BCPL and B, C adopted curly braces for block delimitation to enable compact yet expressive code suitable for systems programming, marking a shift toward portable, low-level control without sacrificing abstraction. By 1978, the publication of "The C Programming Language" by Brian Kernighan and Ritchie standardized this syntax, cementing its influence.[190][191]
Key characteristics of curly bracket languages include semicolon (;) termination of statements to separate executable units, algebraic notation for expressions (e.g., a + b * c), and a defined operator precedence hierarchy that dictates evaluation order without mandatory parentheses for most operations. For instance, multiplicative operators like * and / bind tighter than additive ones like + and -, promoting concise mathematical expressions while requiring awareness of associativity rules. These features facilitate imperative programming paradigms, where sequential execution and mutable state are central, though many languages in this family have evolved to support object-oriented or functional elements.[191]
Prominent examples include C itself, which remains foundational for systems and embedded software; Java, which inherits C's syntax for platform-independent applications; JavaScript, widely used for web development with dynamic typing; and PHP, designed for server-side scripting in dynamic web content. This family's syntax has profoundly shaped modern programming, with over 70 languages adopting similar conventions due to C's ubiquity in education and industry.[190]
While curly bracket languages enable concise and powerful expression through explicit scoping, they often spark debates over stylistic conventions, such as brace placement—exemplified by the K&R style (where the opening brace follows the control statement on the same line) versus the Allman style (where it appears on a new line for vertical alignment). These variations, while functionally equivalent, influence code readability and maintainability in collaborative environments.[192][193]
Off-side rule languages
Off-side rule languages employ a syntactic convention where indentation, rather than explicit delimiters such as braces or end keywords, delineates the scope and hierarchy of code blocks. This approach, known as the off-side rule, treats leading whitespace as semantically significant, ensuring that the visual alignment of code directly reflects its logical structure.[194] The rule originates from Peter Landin's 1966 description of ISWIM, a conceptual language where any token positioned to the left of the first token on the preceding line signals the end of a block, promoting a layout that mirrors mathematical notation.[194] The off-side rule gained prominence in subsequent languages, beginning with its adoption in the ALGOL 68 report published in 1968, which integrated indentation sensitivity alongside reversed keywords for block closure to enhance readability in procedural code.[195] It was further popularized in the early 1990s through Python, released in 1991 by Guido van Rossum, who drew inspiration from the ABC language's use of indentation to enforce clean, outline-like structure without punctuation overhead.[196] Haskell's 1990 language report introduced a refined "layout rule" variant, allowing implicit semicolons and braces based on column positions relative to a reference indent, which integrates seamlessly with its lazy evaluation model to produce concise functional expressions.[46] Key characteristics of off-side rule languages include an emphasis on readability through enforced visual hierarchy, where inconsistent indentation triggers parse errors to maintain discipline.[197] This design reduces visual clutter from delimiters, fostering code that resembles structured prose or outlines, but it demands precise whitespace handling by parsers, often tracking a "reference column" to detect off-side tokens that close blocks.[198] Unlike delimiter-based systems, these languages prioritize human intuition in layout, though they can complicate automated 
formatting if indentation levels vary.[197] Prominent examples illustrate the rule's versatility across paradigms. Python mandates consistent indentation (typically four spaces) for blocks in control structures and functions, eliminating the need for braces and enabling succinct scripts.[199] Haskell's layout rule applies selectively to contexts like let bindings and do notation, combining with lazy evaluation to yield elegant, indentation-driven functional code without explicit terminators.[46] In data serialization, YAML (first specified in 2001) uses indentation to denote nested mappings and sequences, treating increased whitespace as scope entry and alignment or reduction as boundaries, which supports human-readable configuration files.[200]
Visual languages
Visual programming languages (VPLs) are languages that enable users to construct programs through graphical notations, such as blocks, diagrams, icons, or flowcharts, rather than textual syntax. These languages leverage visual elements to represent code constructs like variables, loops, and functions, allowing programmers to manipulate them spatially on a canvas.[201] This approach draws on human cognitive strengths in spatial reasoning and pattern recognition, making complex logic more intuitive to design and debug.[202] Key characteristics of visual languages include drag-and-drop interfaces for assembling components, which minimize syntax errors common in text-based coding, and support for direct manipulation of visual artifacts to simulate program execution. These features make VPLs particularly suitable for non-programmers, domain experts in fields like engineering or design, and rapid prototyping scenarios where iteration speed is critical. By replacing linear text with multidimensional representations, VPLs facilitate parallel expression of control flow, data dependencies, and user interfaces in a single view.[201] The historical roots of visual programming trace back to early experiments in human-computer interaction, with GRAIL (Graphical Input Language), developed by RAND Corporation in 1969, serving as one of the first systems where users drew flowcharts and icons on a tablet to generate code for graphical displays.[203] A landmark advancement came in 1986 with LabVIEW (Laboratory Virtual Instrument Engineering Workbench), created by National Instruments, which introduced dataflow-based graphical programming for test and measurement applications in engineering; programs are built by wiring functional nodes together to represent signal processing and control logic.[204] LabVIEW's G programming language uses icons and wires to model execution as parallel data propagation, significantly reducing development time for hardware-integrated systems.[205] Prominent modern examples illustrate the versatility of visual languages across domains. Scratch, developed by the MIT Media Lab and released in 2007, employs interlocking puzzle-like blocks to teach computational thinking, where users drag categories like motion, sound, and control to build interactive stories, games, and animations without typing code.[206] Node-RED, launched in 2013 by IBM's Emerging Technology Services, is a flow-based tool for Internet of Things (IoT) applications, allowing users to connect nodes visually to integrate hardware devices, APIs, and services through event-driven wiring.[207] In game development, Unreal Engine's Blueprints system, introduced with Unreal Engine 4 in 2014, provides node-based visual scripting that integrates seamlessly with C++ code, enabling designers to prototype gameplay mechanics like AI behaviors and user interactions without writing textual code.[208] This hybrid capability has made Blueprints a staple for iterative content creation in professional studios.
Historical and Low-Level Languages
Assembly languages
Assembly languages are low-level programming languages that provide a symbolic representation of a processor's machine code instructions, using mnemonics to stand in for binary opcodes and allowing programmers to write code that maps nearly one-to-one with the underlying hardware operations. This direct correspondence enables precise control over the computer's execution but requires knowledge of the target architecture's instruction set.[209] Key characteristics of assembly languages include their strong dependence on specific processor architectures, such as the arrangement of CPU registers for temporary data storage and manipulation, and instructions that facilitate direct access to memory locations for loading, storing, or modifying data. Unlike higher-level languages, assembly code typically involves explicit management of registers—small, high-speed storage units within the CPU—and memory addressing modes to optimize performance by minimizing data movement between memory and registers. These features make assembly languages ideal for tasks requiring fine-grained hardware interaction, though they demand architecture-specific adaptations for portability.[210][211][212] The origins of assembly languages trace back to the late 1940s, with one of the earliest implementations appearing in 1949 for the EDSAC computer at the University of Cambridge, where an assembler using single-letter mnemonics simplified programming the machine's initial orders. By the 1950s, assemblers had become widespread across early computers, evolving from manual binary coding to automated tools that translated mnemonic-based source code into executable machine instructions, significantly improving programmer productivity.[213][209] Prominent examples include x86 assembly, which uses Intel syntax and dominates desktop and server computing due to its complex instruction set architecture (CISC) supporting backward compatibility. 
ARM assembly, with its reduced instruction set computing (RISC) design and efficient power usage, is prevalent in mobile devices and embedded systems. MIPS assembly serves as an educational staple for teaching RISC principles, featuring simple, fixed-length instructions that highlight clean hardware-software mapping.[214][215][216] A distinctive application of assembly languages is their integration via inline assembly in languages like C++, where developers embed short assembly snippets directly within higher-level code to achieve performance optimizations, particularly in resource-constrained embedded systems. This technique allows mixing low-level precision for critical sections, such as interrupt handling or signal processing, while leveraging the compiler's management of the surrounding code. Assembly languages thus serve as a human-readable abstraction over machine code, bridging binary instructions with programmable control.[217][218]
Machine languages
Machine languages consist of raw binary code that is directly executed by a computer's central processing unit (CPU) without interpretation or translation by additional software layers. These languages represent the lowest level of programming, where instructions are encoded as sequences of bits that the hardware fetches, decodes, and executes to perform computations.[219] Key characteristics of machine languages include the use of opcodes, which specify the operation to be performed, and operands, which provide the data or addresses involved in that operation. They are inherently dependent on the underlying instruction set architecture (ISA), such as x86 or ARM, meaning the same binary sequence may execute different behaviors—or fail entirely—on different processor architectures. For instance, in the x86 ISA originating from the Intel 8086, instructions are variable-length, typically one to several bytes, with the opcode occupying the initial bits followed by operand specifiers.[220][219][221] Historically, machine languages were the primary means of programming early computers, as seen with the ENIAC, completed in 1945, which relied on physical switches and patch cables to set binary instructions for its operations. High-level programming languages did not emerge until the 1950s, with Fortran developed between 1954 and 1957 as one of the first.[222][223] Representative examples illustrate the binary nature of machine languages across architectures. For the Intel 8086, the instruction to move the immediate value 42 (decimal) into the AL register is encoded as the binary 10110000 00101010 (hex B0 2A), where the first byte is the opcode for MOV to AL and the second byte is the operand. In ARM architecture, the instruction to add the contents of R1 and R2 and store the result in R0 is 1110 0000 1000 0001 0000 0000 0000 0010 (hex E0810002), with bits allocated for condition codes, opcode, and register operands in a fixed 32-bit format. 
For the PDP-11, the instruction to clear register R0 is 0000101000000000 in binary (octal 005000, hex 0A00), using a 16-bit word where the opcode specifies the clear operation and the operand selects the register. Machine code offers the ultimate performance by executing directly on hardware but is highly architecture-specific, which severely limits code portability across different systems.[224][220][225][219]
System languages
System programming languages are designed for developing low-level software components, such as operating system kernels, device drivers, and other performance-critical systems that directly interface with hardware. These languages prioritize efficiency and control over hardware resources, enabling the construction of foundational software that manages computing infrastructure and provides services to higher-level applications.[226][227] Key characteristics of system programming languages include support for direct hardware access through features like pointers and manual memory management, a minimal runtime environment to reduce overhead, and mechanisms to address portability across different architectures. They often emphasize compile-time optimizations and low-level abstractions to achieve high performance while maintaining sufficient safety and expressiveness for complex system tasks.[133][228] Historically, the development of system programming languages traces back to the late 1960s at Bell Labs, where Ken Thompson created the B language in 1969 as a simplified derivative of BCPL for implementing early versions of the Unix operating system on PDP-7 hardware. This evolved into the C language in the early 1970s, led by Dennis Ritchie, which became the standard for Unix kernel development due to its balance of portability and efficiency.[229][133] Prominent examples include C, widely used for the Unix kernel and its derivatives, providing low-level control and direct memory manipulation for optimal hardware interaction. 
Rust, developed starting in 2006 by Graydon Hoare and sponsored by Mozilla from 2009, introduces memory safety through ownership and borrowing semantics without a garbage collector, making it suitable for safe systems programming in kernels and embedded applications.[133][228][109] D, developed starting in 2001 by Walter Bright at Digital Mars as a successor to C++, combines C-like syntax with modern features like contract programming and garbage collection options for building reliable system software. Additionally, Zig, created in 2016 by Andrew Kelley, emphasizes explicit control flow and seamless cross-compilation without hidden dependencies, facilitating development of embedded operating systems and portable low-level code.[230][231]
Embeddable and Extension Languages
Embeddable languages
Embeddable languages are programming languages whose interpreters or compilers are designed to be integrated directly into host applications, providing mechanisms for extensibility, customization, and runtime scripting without requiring recompilation of the host code.[232] These languages typically feature a compact implementation that allows developers to embed the language's runtime environment within larger systems written in languages like C or C++.[233] Key characteristics of embeddable languages include the provision of a C API for bidirectional communication between the host application and the embedded scripts, lightweight parsers to reduce memory and performance overhead, and facilities for configuration scripting that enable dynamic behavior modification.[233] For instance, the API often exposes host functions to scripts while allowing scripts to register callbacks or extend core functionality, ensuring tight integration and efficient resource use.[232] In historical context, Tcl was created in 1988 by John Ousterhout at the University of California, Berkeley, originally as a tool command language for integrated circuit design applications, evolving to support embedding in various systems including graphical user interfaces via the Tk toolkit.[234] Lua followed in 1993, developed at the Pontifical Catholic University of Rio de Janeiro (PUC-Rio) in Brazil by a team at Tecgraf, aimed at extending software applications to address increasing customization needs in embedded scenarios.[233][235] Representative examples illustrate their practical application: Lua is embedded in Redis, where its interpreter enables server-side scripting for atomic, multi-key operations that minimize network roundtrips.[236] Tcl/Tk facilitates the integration of cross-platform GUI elements into host programs, allowing developers to script interactive interfaces dynamically. 
Squirrel, released in 2003 by Alberto Demichelis, functions as an embeddable scripting language optimized for real-time environments like video game engines, with its virtual machine adding minimal overhead to executables.[237][238] Embedding these languages uniquely enables the creation of domain-specific extensions tailored to an application's needs; for example, Lua's integration into World of Warcraft since the game's 2004 launch has empowered users to develop add-ons for UI customization using Lua scripts and XML, supporting a vast ecosystem of community-driven enhancements while enforcing security constraints to protect core gameplay.[239] This approach underscores how embeddable languages bridge general-purpose programming with specialized runtime adaptability.[233]
Extension languages
Extension languages are programming languages specifically designed to enable users to add functionality to existing software applications at build time or runtime, often through scripts or plugins that customize behavior without altering the core codebase.[240] These languages typically provide mechanisms for integrating user-defined extensions seamlessly into the host application, supporting tasks like automation, UI modifications, and feature augmentation.[241] Key characteristics of extension languages include bindings to the host application's application programming interfaces (APIs), which allow scripts to access and manipulate internal components, and modular loading capabilities that permit dynamic incorporation of extensions during execution. This dynamic nature often emphasizes ease of use for end-users, with features like procedural control structures combined with data manipulation tools to create reusable modules.[240] Such design prioritizes extensibility, enabling software to evolve through community contributions rather than vendor-only updates. The historical roots of extension languages trace back to efforts in editor customization, exemplified by Emacs Lisp developed in 1976 for the Emacs text editor at MIT's AI Lab.[242] This Lisp dialect allowed users to redefine commands and add macros incrementally, making Emacs highly adaptable and influencing subsequent extensible systems.[242] Prominent examples include Emacs Lisp, which powers macros and plugins in GNU Emacs for advanced editor customization. 
AutoLISP, introduced in January 1986 with AutoCAD version 2.18, extends computer-aided design software by enabling Lisp scripts for automating drawing tasks and custom commands.[243] JavaScript functions as an extension language for web browsers, where it drives extensions via content scripts and APIs to modify page interactions and add features like ad blockers or productivity tools.[244] Additionally, Vimscript, which debuted in 1991 alongside the Vim editor, facilitates plugin development for text editing; it evolved into Vim9 script with Vim 9.0 in 2022, offering compiled execution for up to 100 times faster performance in extensions.[245]
Macro languages
Macro languages provide syntax and mechanisms for metaprogramming, allowing developers to define macros that transform source code into other source code during compilation or preprocessing, thereby automating repetitive patterns and extending language expressiveness.[246] These languages focus on compile-time code generation, distinct from runtime execution, and are integral to systems where custom syntax or boilerplate reduction is needed.[247] The roots of macro languages trace back to the late 1950s with the development of Lisp by John McCarthy, where macros enabled early forms of code expansion to support symbolic computation and list processing.[248] By 1973, the C programming language introduced a dedicated preprocessor with directives like #define, facilitating textual substitution to handle portability and conditional compilation in Unix development.[133] This evolution marked a shift from Lisp's symbolic approach to more procedural, text-based macro systems in systems programming. Key characteristics of macro languages include hygienic macros, which automatically rename bound variables during expansion to avoid unintended capture by surrounding code, thus preventing subtle scoping errors.[249] They often employ pattern matching to identify code structures for transformation and Lisp-style quoting to treat code as data, enabling precise manipulation without immediate evaluation. 
Subtypes include textual macros, which perform simple string replacements, and application-level macros, which operate on abstract syntax trees for more sophisticated refactoring.[246] Prominent examples include Scheme macros, defined via define-syntax for hygienic expansions that support domain-specific syntax extensions.[250] The C preprocessor uses #define for token-based substitutions, such as creating constants or inline functions, though it lacks hygiene and can lead to name clashes.[251] Rust's declarative macros, introduced around version 0.6 in 2012, use macro_rules! for pattern-based code generation, balancing safety with expressiveness in a systems language.[252] In Scala, macros—added in version 2.10—facilitate the creation of domain-specific languages (DSLs) by leveraging the host language's rich type system, often blurring the line between host code and generated output for concise APIs.[253]
Miscellaneous Types
Array languages
Array programming languages, also known as vector or multidimensional programming languages, are designed such that operations are applied element-wise to entire arrays or vectors without the need for explicit loops or indexing, treating arrays as first-class data types.[254] This paradigm enables concise expression of computations on multi-dimensional data structures, where functions operate uniformly across array elements to produce results of compatible shapes. A key characteristic of array languages is their support for implicit parallelism, as operations on arrays can be executed concurrently across elements without programmer-specified threading, leveraging the inherent structure of the data for efficient vectorization.[255] They also emphasize rank and dimensional awareness, where the shape (dimensions and size) of arrays determines the behavior of operations, ensuring automatic broadcasting and reshaping as needed.[256] Many array languages adopt APL-like notation, using terse symbols or operators to denote array manipulations, which prioritizes mathematical expressiveness over verbose syntax.[257] The foundational array language, APL, was developed by Kenneth E. 
Iverson starting in the late 1950s as a mathematical notation and first implemented as a programming language between 1962 and 1966 at IBM, revolutionizing array-based mathematics by providing a notation that directly mirrored linear algebra and multidimensional operations.[258] Iverson's work, formalized in his 1962 book A Programming Language, shifted programming from sequential control flow to declarative array transformations, influencing subsequent languages in scientific and data processing domains.[257] Prominent examples include APL itself, which uses specialized vector notation with symbols like +/ for sum reduction across arrays; J, introduced in 1990 by Iverson and Roger Hui as an ASCII-based successor to APL to enable portable implementation without custom keyboards; and NumPy, a Python library released in version 1.0 in 2006 that embeds array programming capabilities, allowing Python code to mimic array language semantics through functions like numpy.add(a, b) for element-wise addition.[259][254] Another example is K, developed by Arthur Whitney in 1993 as a minimalist array language derived from APL, which emphasizes nested lists for array representation.[260]
Array languages accelerate scientific computing by enabling high-performance numerical operations on large datasets without low-level optimization, and K's design has notably influenced high-speed database query engines, such as kdb+, a column-oriented system used in financial time-series analysis.[255][260]
