Bash (Unix shell)
| Original author | Brian Fox |
|---|---|
| Developer | Chet Ramey |
| Initial release | 8 June 1989 |
| Stable release | 5.3[1] |
| Written in | C |
| Platform | GNU |
| Available in | Multilingual (gettext) |
| Type | Unix shell, command language |
| Website | www |
Bash (short for "Bourne Again SHell") is an interactive command interpreter and programming language developed for Unix-like operating systems.
It is designed as a free software alternative to the Bourne shell, sh, and other proprietary Unix shells.[7] Bash has gained widespread adoption and is commonly used as the default login shell for numerous Linux distributions.[8]
Created in 1989 by Brian Fox for the GNU Project, it is supported by the Free Software Foundation.[9]
It also supports the execution of commands from files, known as shell scripts, facilitating automation.
The Bash command syntax is a superset of the Bourne shell's syntax, from which all basic features of the Bash syntax were copied. As a result, Bash can execute the vast majority of Bourne shell scripts without modification. Some other ideas were borrowed from the C shell, its enhanced variant tcsh, and the Korn shell. Bash is available on nearly all modern operating systems, making it a versatile tool in various computing environments.
Definitions
ASCII, strings and numbers
The input language to the shell shall be first recognized at the character level.
— "POSIX 1003.1-2024, 2.10.1 Shell Grammar Lexical Conventions". The Open Group Base Specifications Issue 8, IEEE Std 1003.1-2024. The Open Group. Retrieved 25 August 2025.
$ printf '<newline>: <%b>\n' $'\n'
<newline>: <
>
$ printf '<tab>: <%b>\n' $'\t'
<tab>: < >
$ printf '<space>: <%s>\n' " "
<space>: < >
$ printf '<NUL>: <%b>\n' $'\0'
<NUL>: <>
Any series of characters is called a "string," or sometimes a "string literal." In Unix-like operating systems, all characters, printable and non-printing, except for a few such as the null character and forward slash /, can be used in filenames. In addition, all strings are case-sensitive.[10]
Bash, like many other programming languages, uses zero-based numbering.
Control+key combinations
The Control+key functionality is provided by GNU Readline and is available in interactive mode only. Certain keypress combinations let a user invoke tab completion and search the command history.
- Tab ↹ - Activate tab completion
- ↑ - Scroll up (i.e., backward) in the command history
- ↓ - Scroll down (i.e., forward) in the command history
- Ctrl+r - Search the command history
Some keypress combinations also allow a user to operate the terminal emulator in order to move the cursor within the terminal window and to control the emulator program. By default, these keypress combinations in Bash mirror those of Emacs.[11]
Default keybindings for control codes include:
- Ctrl+f - Move the cursor one character to the right
- Ctrl+b - Move the cursor one character to the left
- Alt+f - Move the cursor one word to the right
- Alt+b - Move the cursor one word to the left
- Ctrl+a - Move the cursor to the beginning of the current commandline
- Ctrl+c - Cancel the current command and present a new prompt
- Ctrl+d - Close the current Bash instance, possibly also closing the terminal emulator
- Ctrl+e - Move the cursor to the end of the current commandline
- Ctrl+q - Resume terminal output (XON); buffered keypresses are then processed
- Ctrl+s - Pause terminal output (XOFF)
- Ctrl+w - Remove one word to the left of the cursor
- Ctrl+z - Suspend the foreground process
Vi keybindings are also available and can be enabled by running set -o vi.[12][13]
Syntax
When Bash reads a full command line, the complete string is broken down into tokens. Tokens are identified and separated from one another by metacharacters.
As of Bash 5.3, the 10 metacharacters are the space, tab, and newline, as well as the following characters: |&;()<>
"Blanks" are composed entirely of unquoted metacharacters, "operators" each contain at least one unquoted metacharacter and "words" may not include any unquoted metacharacters.
In practice, Bash breaks a full command string down into tokens (or groups of tokens) that contain metacharacters and tokens that do not contain any metacharacters, called "words." From there it further breaks words down into more specific, meaningful pieces such as command names and variable assignment statements.
The two blanks are space and tab.
Operators
Control operators perform a control function. They can be either a newline or one of the following: ||, &&, &, ;, ;;, ;&, ;;&, |, |&, (, or ).
Redirection operators redirect the input or output streams. They include <, >, &>, <<, and <<<.
Words
A word is a sequence of (non-meta-) characters treated as a single unit by the shell. A reserved word is a kind of word that has a special meaning to the shell.[14] A name is a kind of word separate from reserved words. A name consists solely of letters, digits and underscores, and must begin with a letter or an underscore; it may not begin with a digit. Names, also called identifiers, may be used for naming variables and functions.
Sixteen of the twenty-two reserved words, some of which are symbols rather than words, are as follows:
‘!’ ‘[[’ ‘{’ ‘]]’ ‘}’ case in esac for do done if then elif else fi ...
Names may only contain the characters ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789_.
In the following example of a full command string, metacharacters have a comma placed above them, ,, reserved words have a caret placed beneath them, ^, and other tokens have a backtick placed also beneath them, `.
$ #  ,    ,   ,,    ,       ,,
$ if echo foo; then bar=abc; fi
$ # ^^ ```` ``` ^^^^ ``````` ^^
Subshells
[edit]A "subshell" is an additional instance of the shell which has been initialized by a current instance of the shell. When a "parent" shell creates a subshell, or a "child" shell, an exact copy of the parent's environment information is re-created and becomes the environment of the subshell.
In Bash, in non-arithmetic contexts, one can force the use of a subshell by enclosing a full command string in single parentheses.
$ echo foo
foo
$ ( echo foo )
foo
$
For this simple case, the preceding two commands are equivalent; however, use of subshells can have certain unexpected side effects. There are numerous different forms of syntax which can cause the initialization of a subshell.[clarification needed]
Expansion
Data structures
Bash offers variables and arrays as data structures, and though there are numerous kinds of each of these available, the data structures are relatively simple compared to other languages like C or Java.[15] All data is stored in memory as a string.
Beginning a word with a dollar character signifies that the word is the name of a variable or array. Surrounding the dollar / variable name syntax in double quotes is always advised. This practice shields the value(s) held by the parameter(s) from unwanted side effects such as word splitting and pathname expansion.
Wrapping the variable name in curly brackets {} is recommended for readability and for consistency between variables and arrays. When writing variables, curly braces are optional and square brackets would be a syntax error. The parameter names are always on the left side of the equals sign and values are always on the right.
Variables
A variable is assigned to using the syntax name=value
To use a variable, the syntax $name is used, or ${name}, which expands to the value assigned to the variable.
The latter syntax must be used for certain names to prevent unwanted side effects.
For example, $10 will be parsed as ${1}0, so using ${10} means it will be parsed as intended.
Positional parameters, usually passed to a Bash script, are denoted by numbered variables starting from $1; the related special parameter $0 holds the name of the shell or script.
Special parameters are signified by punctuation characters.[15]
For example, $@ expands to a list of the first through last positional parameters, "individually requoted, separated by spaces."
Environment variables are signified by all capital letters.
Environment variables include UNIX variables like LESS_SIGUSR1, and Bourne shell variables such as $HOME.[15]
Scripting variables are signified by all lower case letters or CamelCase.
Arrays
Arrays are data structures which hold multiple values.[16] When an array element is referenced, a set of square brackets containing the index is placed at the end of the variable name, inside the curly braces. When expanding arrays, the curly braces and square brackets are required.
An array is assigned using the syntax name=( one or more elements ).
It is expanded using ${quux[@]} or ${quux[*]} or ${quux[1]}, depending on the use case.
Each kind of parameter is distinguished by a specific naming convention.[15]
Since Bash 4.0[17], Bash also supports associative arrays.
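The following transcript is a minimal sketch of indexed and associative array assignment and expansion; the array names fruits and capitals are hypothetical:
$ fruits=( apple banana cherry )
$ printf '<%s>\n' "${fruits[0]}"
<apple>
$ printf '<%s>\n' "${fruits[@]}"
<apple>
<banana>
<cherry>
$ declare -A capitals
$ capitals[France]=Paris
$ printf '<%s>\n' "${capitals[France]}"
<Paris>
$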
In this article, examples of variables from this section include ${foo}, PID, PWD, EUID, $$, ${quux} and ${zork} .
Execution
[edit]"Execution" of a given program occurs when a user (or some other program) asks the operating system to act upon the instructions contained in the given program.
By default, Bash reads user code one line at a time, interprets any newline or semi-colon character ; as the end of the current command, and executes commands in sequence.
If an interactive command extends beyond the width of the terminal emulator, it's usually possible to keep typing and the command will wrap around.
To extend a command beyond a newline onto an additional line, it's necessary that the final character of the first line be an unescaped backslash, \, which signals "line continuation."
Bash always finishes parsing and executing one full commandline before moving on to and beginning with the parsing of the next commandline.
$ foo=aa bar=bb quux=cc zork=dd; set -o xtrace
$ : "${foo}"; : "${bar}"
+ : aa
+ : bb
$ : "${quux}" \
> : "${zork}"
+ : cc : dd
$
The first word of a commandline is known as the "command position."
By Unix convention, the first word of the commandline is always some kind of a command, and the rest of the words in the commandline string are either options for the command, arguments for the options, or some kind of input upon which the command will operate.
"Options" are also called "flags," "switches," or, more formally, "operators."
When Bash attempts to locate a command for execution, it searches the directories listed in the $PATH variable; the current working directory is searched only if it appears in $PATH.[18]
$ # [COMMAND POSITION] [OPTION] [ARGUMENTS]
$ # ,--^ ,------------^ ,----^
$ declare -p USER BASH_VERSION
declare -x USER="liveuser"
declare -- BASH_VERSION="5.2.37(1)-release"
$
Users and PS1
A user account can be created for either a human or a programmatic user. In Unix-like operating systems, there are two kinds of users: "privileged" and "regular." A privileged user, such as "root," is allowed to do anything whatsoever on the machine. Unprivileged users are limited in various ways.
When an interactive shell session waits for user input, by default it prints a particular string of characters to the screen.
In Bash, the value of this waiting-string is held in the shell variable $PS1.
For regular users, a common default value for $PS1 is the dollar character, $.[a]
For the superuser, a common default value is the hash character, #.
$ sudo --login --user root
[sudo] password for liveuser:
# vim /home/liveuser/names.txt
# exit
$ grep -e bob ./names.txt
grep: ./names.txt: Permission denied
Modes
Programming paradigm
Although most users think of the shell as an interactive command interpreter, it is really a programming language in which each statement runs a command. Because it must satisfy both the interactive and programming aspects of command execution, it is a strange language, shaped as much by history as by design.
— Kernighan, Brian W.; Pike, Rob (1984). The UNIX Programming Environment. Englewood Cliffs: Prentice-Hall. ISBN 0-13-937699-2.
Bash was written in C. A modular style can be approximated through good style and careful design.[19] It is often used in an imperative or procedural style.
Interactive and non-interactive modes
As a command processor, Bash can operate in two modes: interactive or non-interactive. In interactive mode, commands are usually read from a terminal emulator. In non-interactive mode, which facilitates automation, commands are usually read from named files known today as shell scripts. When executed as a standalone command at the command-line interface (CLI), by default Bash opens a new shell in interactive mode.
Scripts
Shell scripts are text files that contain code, often commands, intended to be read and acted upon by some particular interpreter in a batch process in a non-interactive mode and without any further user interaction. Interpreted scripts are programs that do not require their source code to be compiled: all of the relevant source code is contained within the script. There are many programs which can serve as a script interpreter: Perl, AWK, etc. Interpreted scripts are most often written for Unix shells.
The first line of any executable shell script begins with something called a shebang: literally the hash character (#) and the bang character (!) side by side.
$ cat ./example.sh
#! /bin/env bash
echo foo
exit
$
If a script is intended to be run by a user as a stand-alone program on the commandline, then it is referred to as an "executable." By convention, the filenames of executable Unix shell scripts are identified by the suffix .sh. The "execute" bit can be enabled on a shell script with the utility chmod:
$ ls -l ./example.sh
-rw-r--r--. 1 liveuser liveuser 32 Aug 3 22:33 example.sh
$ ./example.sh
bash: ./example.sh: Permission denied
$ chmod 0744 ./example.sh
$ ls -l ./example.sh
-rwxr--r--. 1 liveuser liveuser 32 Aug 3 22:33 example.sh
$ ./example.sh
foo
$
The source builtin
With the source command, or its synonym ., Bash reads and executes shell commands from any text file given by name.[20]
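A brief sketch of source reading variable definitions into the current shell; the file name vars.sh and the variable greeting are hypothetical:
$ printf 'greeting=hello\n' > ./vars.sh
$ source ./vars.sh
$ printf '<%s>\n' "${greeting}"
<hello>
$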
Login and non-login shells
Bash can be executed as a login shell, or "session leader," in both interactive and non-interactive modes via the --login option.
"Logging in" requires user authentication.
For this reason, only one login shell exists per user session.
In GNU/Linux, a user's login shell is identified in the /etc/passwd file.
$ awk -F ':' '$1 ~ /root/' /etc/passwd
root:x:0:0:Super User:/root:/bin/bash
When a human user initiates a login session, this procedure often occurs in a graphical user interface (GUI). When a user opens a terminal emulator, the emulator executes a non-login instance of the user's login shell.
Logging out of a shell session from within a terminal emulator can be accomplished with the exit command or, by default in Bash, pressing Ctrl+d.
Startup source files
When Bash starts, it uses source to execute commands in a variety of dotfiles (see lists below).[21]
These dotfiles, unlike shell scripts, typically have neither the execute permission enabled nor a hash-bang.
By default Bash will source a somewhat different set of files, and in a different sequence, depending on:[22]
- How bash is called
  - interactively or non-interactively, or invoked with the name sh
- Which options are used
  - --login, --rcfile, --norc, --posix
- Which environment variables are defined
  - BASH_ENV, ENV, and
- Which files exist
  - /etc/profile, ~/.bash_profile, ~/.bash_login, ~/.profile, ~/.bash_logout, and ~/.bashrc, among others.
Of course, any startup file can also execute commands from any other file. Startup files can affect shell behavior, terminal emulators, the X window system and the window manager.
POSIX mode
The POSIX IEEE 1003.1 standard specifies a common set of definitions that any shell system application (bash, dash, zsh, etc.) may conform to.
Any shell user script (./myscript.sh) written in conformance with POSIX guidelines should be executable by any shell system application that has implemented the POSIX specification.
As a result, there can be a reasonable expectation that POSIX-compliant scripts can be executed with success on any Unix or Unix-like operating systems which implements the POSIX standard (Linux, OpenBSD, Oracle Linux, HP-UX, etc.).
These scripts are considered "portable" as they are and without any further modifications.
The portion of POSIX that applies to shells and command line utilities is a subset of a larger group of POSIX standards that further specify how terminals and terminal emulators ought to function in order to also be considered portable.
When Bash is operating in POSIX mode, fewer features are available but the resulting code can be executed on a greater variety of operating systems.
To enable POSIX mode at the initialization of an interactive shell, Bash can be executed as either sh, bash --posix or bash -o posix.[23]
To cause a script to be initialized in POSIX mode, one would use either the hash-bang #! /bin/env sh or the less portable #!/bin/sh.
When an instance of Bash is operating in POSIX mode, the environment variable $POSIXLY_CORRECT is defined, and the value of the environment variable $SHELLOPTS includes the string posix.
$ declare -p POSIXLY_CORRECT
bash: declare: POSIXLY_CORRECT: not found
$ sh
$ declare -p POSIXLY_CORRECT
declare -- POSIXLY_CORRECT="y"
$
The full list of features available in Bash which are not specified by POSIX is considerable.[24] Here is a partial list:
- Any arrays other than the array of positional parameters, $@, are not POSIX
- The double bracket extended test construct, [[...]], is not POSIX
  - [...] and test are POSIX
- One of the double-parentheses arithmetic-evaluation syntaxes, ((...)), is not POSIX
  - $((...)) is POSIX
- Brace expansion, kernel{,-headers}, is not POSIX
- Dynamic scoping of parameters and the local builtin are not POSIX
- Process substitution, <(...), is not POSIX
- Certain string-manipulation operations in Parameter Expansions are not POSIX
- Most Bash builtin commands are not POSIX
  - The command enable -s prints the list of Bourne Special Builtins, which are POSIX
    $ enable -s | wc --lines
    16
    $ enable | wc --lines
    61
  - The enable builtin itself is not POSIX
  - In Bash, in non-POSIX mode, the . and source builtins are synonymous
    - The . (i.e., 'dot') builtin is POSIX, however
    - The source builtin is not POSIX
- The $EPOCHSECONDS and $EPOCHREALTIME shell variables are not POSIX
System commands which are available in modern Unix-like operating systems, and which are also specified by POSIX may have fewer option flags or fewer relevant environment variables available under POSIX.
Because of these and other differences, modern (version 5) Bash shell scripts are rarely runnable "as-is" under the Bourne or legacy Korn shell interpreters. Scripting with portability in mind is becoming less common as GNU/Linux becomes more widespread.[23][25]
Code that is valid syntax in Bash and yet is not specified by POSIX is called a "bashism." The program checkbashisms can be used to make sure that a script can be executed in Debian Linux without any portability errors.[26] Vidar Holen's shellcheck is another static linter, written in Haskell, which can parse script syntax for compatibility with any or all of bash, dash, ksh, and Bourne sh.[27] The syntax requirements for each shell are a little different. For example, Debian's policy allows some extensions in their scripts (as they are in the dash shell),[25] while a script intending to support pre-POSIX Bourne shells, like autoconf's configure, is even more limited in the features it can use.[28]
Other modes
Restricted mode
A restricted shell is used to set up an environment more controlled than the standard shell. A restricted shell behaves identically to bash with the exception that numerous actions are disallowed or not performed, including:
- Changing directories with the cd builtin.
- Setting or unsetting the values of the $SHELL, $PATH, $HISTFILE, $ENV, or $BASH_ENV variables.
- Specifying command names containing slashes on the CLI.
- Using absolute pathnames as arguments to the ., history, or hash -p commands.
- Specifying a path search with . -p or command -p.
- Importing function definitions and parsing the value of $SHELLOPTS from the shell environment at startup.
- Redirecting output using the >, >|, <>, >&, &>, and >> redirection operators.
- Using the exec builtin to replace the shell with another command.
- Altering shell builtins.
Once restricted mode is enabled, it cannot be disabled. These restrictions are enforced after any startup files are read, and they do not apply to shell scripts. Restricted mode is rarely used.
Privileged mode
In Bash, "privileged mode" is a rarely used option inherited[citation needed] from the SVR4.2 UNIX System V shell (circa 1992).[29] It can be enabled with set -p and disabled with set +p.[30] When privileged mode is enabled, the $SHELLOPTS shell variable includes the string "privileged."
Extended debugging mode
Enabled via bash --debugger at invocation or via shopt -s extdebug during either interactive or non-interactive modes. It uses a separate program called bashdb.[31] extdebug is not available in POSIX mode. See documentation for more information. See also § Debugging.
Compatibility modes
Bash-4.0 introduced the concept of a shell compatibility level, specified as a set of options to the shopt builtin (compat31, compat32, compat40, compat41, and so on). There is only one current compatibility level – each option is mutually exclusive. The compatibility level is intended to allow users to select behavior from previous versions that is incompatible with newer versions while they migrate scripts to use current features and behavior. It's intended to be a temporary solution.[32]
— Bash Reference Manual, 6.12 Shell Compatibility Mode
Observability
The xtrace option
When xtrace is enabled, simple debugging content is printed to the terminal.
It can be enabled with set -o xtrace or set -x, and disabled with set +o xtrace, set +x or set -.
These options are also accepted at the commandline and at hash-bangs: #!/bin/bash -x, etc.
$ bash -x
$ echo $(( 2 + 2 ))
+ echo 4
4
$ set -- 1 2 3
$ printf '<%s>\n' "$@"
+ printf '<%s>\n' 1 2 3
<1>
<2>
<3>
$
The xtrace shell setting is specified by POSIX. See also § Debugging.
The verbose option
The verbose option prints strings to the terminal as they are read, and before any expansions are performed. It is rarely used.[33]
Comments
Comments can be a valuable way of clarifying information or explaining a script or source file to someone else who might not be familiar with the scripter's intentions or context.
Standard comments in Bash are denoted with a hash character: #. Any text to the right of the hash to the end of the line will be ignored. Inline comments are allowed, but hash comments will not print during debugging. See also: § xtrace.
Comments denoted with a colon character, :, originated with the Thompson shell. Any arguments given to the colon builtin, :, are expanded but otherwise ignored. Inline comments are not possible, but colon comments will print during debugging, with any parameters already expanded.[34]
$ # Define foo
$ foo=bar # An inline hash comment occurs on the same line as a command
$ set -x
$ # A regular comment (no output)
$ : "${foo}"
+ : bar
$
Exit codes
When bash executes commands, exit status codes, also called "return codes," are produced which can offer some insight into the manner in which a program ceased running.
The value of the most recently captured exit code is held within the shell parameter, 'question mark:' $?.
In non-arithmetic contexts, (i.e., most of the time) the numerical or "Boolean" value of "true" is zero (0), and the value of "false" is one (1).
When a system command has executed, the intended meaning of its exit status can most often be found in its man page; usually a zero indicates success and a nonzero exit status indicates some kind of failure condition or partial success.
ping is a well known command with three meaningful exit codes: 0, 1, and 2.
In Bash, within arithmetic contexts, the numerical truth values are reversed: "true" is one and "false" is zero.
An arithmetic context can usually be identified by the syntax ((...)) or $((...)).
If an arithmetic statement evaluates to the integer zero, then the statement is considered "false," and the exit code is one.
If the statement evaluates to any number other than zero, the arithmetic statement is "true" and the exit code is zero.
Not all Linux/UNIX commands provide meaningful exit codes beyond zero and one, and there is no standard system for definitions of exit codes in Linux.
$ true; echo "$?" # Exit code means "true"
0
$ false; echo "$?"; echo # Exit code means "false"
1

$ bash -c 'exit 99'; printf 'exit-code: %d\n\n' "$?"
exit-code: 99

$ (( 1 - 1 )); printf '%d\n' "$?" # The arithmetic value 0 is "false", so the exit code is 1
1
$ (( 1 + 1 )); printf '%d\n' "$?" # ...and a non-zero value is "true", so the exit code is 0
0
Job control
The Bash shell has two modes of execution for commands: batch (synchronous) and concurrent (asynchronous). To execute commands in batch mode (i.e., in sequence) they must be separated by the character ;, or placed on separate lines:
$ command1; command2
$ command3
$
In this example, when command1 is finished, command2 is executed, and when command2 has completed, command3 will execute. Placing the symbol & at the end of a command causes command1 to be executed in the background: the process runs in the background while control returns immediately to the shell, allowing continued execution of commands.
$ command1 &
$
Or to have a concurrent execution of command1 and command2, they must be executed in the Bash shell in the following way:
$ command1 & command2
$
In this case command1 is executed in the background (the & symbol), returning immediate control to the shell, which executes command2 in the foreground. A process can be stopped and control returned to bash by typing Ctrl+z while the process is running in the foreground.[35] A list of all processes, both in the background and stopped, can be displayed by running jobs:
$ jobs
[1]- Running command1 &
$
In the output, the number in brackets refers to the job id. The plus sign signifies the default process for bg and fg. The text "Running" and "Stopped" refer to the process state. The last string is the command that started the process.
The state of a process can be changed using various commands. The fg command brings a process to the foreground, while bg sets a stopped process running in the background. bg and fg can take a job id as their first argument, to specify the process to act on. Without one, they use the default process, identified by a plus sign in the output of jobs. The kill command can be used to end a process prematurely, by sending it a signal. The job id must be specified after a percent sign:
$ sleep 100 &
[1] 4904
$ kill %1
$ jobs
[1]+ Terminated sleep 100
$
Job control, also known as "Monitor mode," is enabled by default in interactive shells, and can be disabled with set +m.
Signals
Signaling is a means of inter-process communication (IPC). Sometimes a commandline process may seem to freeze in the middle of execution. In these instances it may become necessary to identify which process may be blocked and to manually end the offending process.
At an interactive terminal, it is usually sufficient to press Ctrl-c to end the current foreground process and return control back to the user prompt, or to press Ctrl-z to suspend it. Occasionally attempting to suspend a process will succeed when attempts to cancel a process appear unresponsive. In other cases it may be necessary to use the kill program to send an IPC signal. In this example, we use the kill command from a second terminal screen to terminate the process with PID 4331.
$ tty # Terminal one
/dev/pts/0
$ whoami
liveuser
$ sleep 1000 # Command hangs
$ tty # Terminal two
/dev/pts/1
$ whoami
liveuser
$ ps aux | grep -e sleep -e PID
USER PID %CPU %MEM VSZ RSS TTY STAT START TIME COMMAND
liveuser 4331 0.0 0.0 230336 2312 pts/1 S+ 11:19 0:00 sleep 1000
liveuser 4333 0.0 0.0 231248 2516 pts/0 S+ 11:19 0:00 grep --color=auto -e sleep -e PID
$ kill 4331
$ ps aux | grep -e sleep -e PID # The sleep process has ended
USER PID %CPU %MEM VSZ RSS TTY STAT START TIME COMMAND
liveuser 4333 0.0 0.0 231248 2516 pts/0 S+ 11:19 0:00 grep --color=auto -e sleep -e PID
$
$ tty # Terminal one again
/dev/pts/0
$ whoami
liveuser
$ sleep 1000
Terminated
$
In Unix-like operating systems, a user is allowed to instruct the kernel to send a signal to a process that is owned by the user. A regular user may not send a signal to a privileged process. Signals can be sent to a process using the kill builtin or using the system binary of the same name.
$ whoami
liveuser
$ ps aux | awk '$2 ~ /\<1\>/' # Let's view some info on the init process, PID 1.
root 1 0.0 0.2 37140 20440 ? Ss 04:44 0:18 /usr/lib/systemd/systemd --switched-root --system --deserialize=53 rhgb
$ kill -s SIGKILL 1
bash: kill: (1) - Operation not permitted
$ type -a kill
kill is a shell builtin
kill is /usr/bin/kill
$ /usr/bin/kill -s SIGKILL 1
kill: sending signal to 1 failed: Operation not permitted
$
The most commonly used signals can be viewed with kill -L | head -n 4.
Each IPC signal is associated with a signal number, but exit codes and signal codes are two different things.
While sending a process an IPC signal of 9 (a "KILL" signal) will almost certainly terminate the process immediately, it will most likely not result in the process returning an exit code of 9.
By default in Bash, builtin kill sends a TERM ("terminate") signal. It's common for commandline utilities to respond to a SIGTERM by shutting down and exiting cleanly. (TERM and SIGTERM are the same; the SIG- prefix to all signal names can be omitted.) The Ctrl-c keypress sequence in Bash sends a SIGINT, interrupt signal, to the foreground process. The Ctrl-z keypress sequence sends SIGTSTP, the terminal stop signal.[36] When a process receives a SIGKILL, the process terminates immediately and messily. It is recommended to use SIGKILL only as a last resort.[37] The SIGKILL signal cannot be blocked or handled.
Processes can "catch" and "handle" IPC signals they receive.
A user can use the kill builtin to "send" an IPC signal to another process.
That target process can set up a mechanism, some plan beforehand, for how to respond whenever any particular signal might be received, or "caught."
The way a target program responds is referred to as how the program "handles" receiving the signal.
In the man pages one can see how some system commands will print out certain information to the terminal when they receive a particular signal: for example, the dd command prints I/O statistics when it receives a SIGUSR1 signal.[38]
When bash is interactive, in the absence of any traps, it ignores SIGTERM (so that kill 0 does not kill an interactive shell), and catches and handles SIGINT (so that the wait builtin is interruptible). When bash receives SIGINT, it breaks out of any executing loops. In all cases, bash ignores SIGQUIT. If job control is in effect, bash ignores SIGTTIN, SIGTTOU, and SIGTSTP.[39]
— bash(1)
By default Bash shell scripts receive and respond to any and all IPC signals sent to them, however, Bash scripts can utilize the trap builtin to catch and handle signals.[40]
$ cat ./trap-example.sh
#! /usr/bin/env bash
trap umask EXIT
echo bar
exit 0
$ chmod 0700 trap-example.sh
$ ./trap-example.sh
bar
0077
$
There are a few trap conditions which look like signals but are only available from within Bash: ERR, RETURN and DEBUG (EXIT is also specified by POSIX). These conditions can be useful in debugging, and they can only be caught and handled by the trap builtin; they cannot be sent with kill. See also § Debugging.
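As a minimal sketch of one of these conditions, the DEBUG trap runs its argument before each simple command; the file name debug-trap.sh and the message text are hypothetical, and BASH_COMMAND holds the command about to be executed:
$ cat ./debug-trap.sh
#! /usr/bin/env bash
trap 'printf "+ about to run: %s\n" "$BASH_COMMAND"' DEBUG
echo foo
echo bar
$ bash ./debug-trap.sh
+ about to run: echo foo
foo
+ about to run: echo bar
bar
$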
Values of parameters
There are many different implementations of echo. Some have the -e option, and some don't.[41]
The list of options is not uniform across implementations, though echo and printf are both specified by POSIX.
If a scripter wishes to know the precise value of a string contained by a variable, then the most consistent way of doing so is to use printf.
For any string containing any character other than the null character (which cannot be stored in a variable), including digits, the format specifier is %s.[citation needed]
$ foo=abc bar=123
$ printf '<%s>\n' "${foo}" "${bar}"
<abc>
<123>
$
For digits only, the format specifier is %d.
$ printf '<%d>\n' "${foo}" "${bar}"
bash: printf: abc: invalid number
<0>
<123>
$
With printf, a newline is never included in the output unless the scripter includes a newline in the format string. In the example below, where a newline has been omitted from the format string, the value of PS1 is printed on the same line as the output of the previous command.
$ printf '<%s>' "${foo}" "${bar}"
<abc><123>$
Another very consistent method is to use declare -p.
The output of declare -p can be reused as input.
However, not all variables and parameters can be printed using declare -p, for example, the values of the Special Parameters.
The Special Parameter hash, "$#", reports how many Positional Parameters are currently defined.
$ declare -p foo bar
declare -- foo="abc"
declare -- bar="123"
$ declare -p "$#"
bash: declare: 0: not found
$
For a full string of input at an interactive shell...
$ declare -p #
...the hash character would be interpreted by Bash as the start of an inline comment.
With the comment and all text to the right of it removed, the command that Bash would execute would be declare -p.
This command would, according to help declare, "display the values and attributes of each NAME," i.e., each variable, and, "if no NAMEs are given, display the attributes and values of all variables," which can be over 100 lines of output.
On the other hand, printf cannot display variables' attributes. See also § Debugging.
$ readonly foo
$ declare -p foo
declare -r foo="abc"
$ printf '<%s>' "${foo}"
<abc>$
Environment
Configurable execution environment(s):[42]
- Shell and session startup files such as ~/.bashrc and ~/.profile (i.e., dotfiles);
- Settings (set built-in) and shell options (shopt built-in) which alter shell behavior;
Shell and session startup Files (a.k.a., "dot files")
When Bash starts, it executes the commands in a variety of dot files.[21]
Unlike Bash shell scripts, dot files typically have neither the execute permission enabled nor an interpreter directive like #!/bin/bash.
- Legacy-compatible Bash startup example
The example ~/.bash_profile below is compatible with the Bourne shell and gives semantics similar to csh for the ~/.bashrc and ~/.bash_login.
The [ -r filename ] && cmd is a short-circuit evaluation that tests if filename exists and is readable, skipping the part after the && if it is not.
[ -r ~/.profile ] && . ~/.profile          # set up environment, once, Bourne-sh syntax only
if [ -n "$PS1" ]; then                     # are we interactive?
  [ -r ~/.bashrc ] && . ~/.bashrc          # tty/prompt/function setup for interactive shells
  [ -r ~/.bash_login ] && . ~/.bash_login  # any at-login tasks for login shell only
fi                                         # End of "if" block
- Operating system issues in Bash startup
Some versions of Unix and Linux contain Bash system startup scripts, generally under the /etc directory.
Bash executes these files as part of its standard initialization, but other startup files can read them in a different order than the documented Bash startup sequence.
The default content of the root user's files may also have issues, as well as the skeleton files the system provides to new user accounts upon setup.
The startup scripts that launch the X window system may also do surprising things with the user's Bash startup scripts in an attempt to set up user-environment variables before launching the window manager.
These issues can often be addressed using a ~/.xsession or ~/.xprofile file to read the ~/.profile — which provides the environment variables that Bash shell windows spawned from the window manager need, such as xterm or Gnome Terminal.
Standard streams
Standard streams - STDIN, STDOUT and STDERR
Commands
System commands
Aliases
Aliases allow a string to be substituted for a word that is in a position in the input where it can be the first word of a simple command. Aliases have names and corresponding values that are set and unset using the alias and unalias builtin commands.
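A brief sketch of defining, inspecting, and removing an alias; the alias name ll is hypothetical:
$ alias ll='ls -l'
$ type ll
ll is aliased to `ls -l'
$ unalias ll
$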
Keywords and reserved words
- function - Bash function declarations which include this particular keyword are not compatible with Bourne/Korn/POSIX scripts; however, Bash does accept the function declaration syntax used by Bourne, Korn and POSIX-compliant shells.
Functions
Shell functions are a way to group commands for later execution using a single name for the group. They are executed just like a "regular" simple command. When the name of a shell function is used as a simple command name, the shell executes the list of commands associated with that function name. Shell functions are executed in the current shell context; there is no new process created to interpret them.
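A minimal sketch of defining and running a shell function; the function name greet is hypothetical:
$ greet() { printf 'hello, %s\n' "$1"; }
$ greet world
hello, world
$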
Builtin commands
- Various Built-In Commands:
  - POSIX Special builtins:[54]
    - cd, pwd, etc.
    - set[55]
      - Xtrace: [set -x | set -o xtrace]. The shell's primary means of debugging.
      - Verbose: [set -v | set -o verbose]. Prints a command to the terminal as Bash reads it. Bash reads constructs all at once, such as compound commands which include if-fi and case-esac blocks. If a set -v is included within a compound command, then "verbose" will be enabled the next time Bash reads code as input, i.e., after the end of the currently executing construct.[56]
      - Both xtrace and verbose can be turned off at the same time with the command set -.
    - shopt[57]
      - expand-aliases: On by default in interactive shells. Some developers discourage its use in scripts.
PATH and system commands
When the shell looks for external commands, it relies on the Bourne shell variable $PATH. $PATH contains a list of directories separated by colons, :.
Beginning with the leftmost directory and selecting directories in a left to right pattern, each directory is searched until a match is found.
In Linux, so that a user can locate additional commands, it's common practice for distribution administrators and package developers to alter the value of an end user's $PATH by including source files in /etc/profile.d and other locations.
When looking for the command, chmod, for instance, after considering internal commands and finding nothing, Bash will search the directories in $PATH and will select the absolute path of the first executable found that has a basename which matches the search string.[18]
If there is more than one command echo available in the directories listed in $PATH, during the process of parsing and executing a commandline, by default only the first command found will be selected.
$PATH lookups are slow.
The shell speeds up the commandline execution process by remembering command locations in a hash table.
To perform a full $PATH search without any interference from the hash table, remove the current table with hash -r and search for all kinds of commands with type -a.
$ # Force a full path search
$ PATH=${PATH}:${HOME}
$ printf 'echo script_file: "$@"\n' > ./echo
$ chmod 0700 ./echo
$ hash -r; type -a echo
echo is a shell builtin
echo is /usr/bin/echo
echo is /home/liveuser/echo
$
In order to execute a commandline with a command found later in the $PATH string, you can specify an absolute path or you can anchor path resolution relative to the current working directory.
$ /home/liveuser/echo foo
script_file: foo
$ ./echo bar
script_file: bar
$
For security reasons it is advisable to make sure the directories in PATH are not world-writeable, or are writeable only by root and trusted users.
Command lookup
- Command position: after expansions, the first word of the full text of the command line.
- Command name lookup is performed, in the following order:
- Commands internal to the shell:
- Shell aliases,
- Shell reserved words,
- Shell functions, and
- Shell built-in commands;
- Commands external to the shell, using the PATH shell variable:
- The resulting string is executed as a command.
Control structures
Subshells
Subshells: (...);
Pipelines
However, by using a pipeline, they can engage in multiple cycles of computation at the same time, substantially increasing their speed. In a pipelined control unit, different instructions simultaneously go through the process but at different points. While one instruction is being fetched, a second is being decoded, and so forth.
Unix-style pipelines: |.
Logical operators
- AND (&&)
- OR (||)
- NOT (!)
Bash supplies "conditional execution" command separators that make execution of a command contingent on the exit code set by a precedent command. For example:
$ cd "$SOMEWHERE" && ./do_something || echo "An error occurred" >&2
Where ./do_something is only executed if the cd (change directory) command was "successful" (returned an exit status of zero) and the echo command would only be executed if either the cd or the ./do_something command return an "error" (non-zero exit status).
Iteration
Sometimes programs are repeated indefinitely or until a specific outcome is reached. Each execution of the instructions is an "iteration."[58] Bash offers the following constructs (a brief sketch follows this list):
- while, until, and select loop compound commands;
- Arithmetic C-style and list-enumerating for loop compound commands; and
- continue, break, return, and exit flow control commands;
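A brief sketch of a list-enumerating for loop and a while loop with a counter; the variable names name and i are hypothetical:
$ for name in foo bar baz; do printf '<%s>\n' "${name}"; done
<foo>
<bar>
<baz>
$ i=0; while (( i < 3 )); do printf '%d\n' "${i}"; (( i++ )); done
0
1
2
$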
Compound commands
compound: something formed by a union of elements or parts.[59]
— Merriam-Webster's Collegiate Dictionary
Bash also supports if ... fi and case ... esac forms of conditional command evaluation.[c]
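A minimal sketch of the two forms; the variable fruit and its values are hypothetical:
$ fruit=apple
$ if [[ ${fruit} == apple ]]; then echo "is an apple"; else echo "is not an apple"; fi
is an apple
$ case ${fruit} in apple) echo pomaceous ;; *) echo unknown ;; esac
pomaceous
$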
Testing
Built-in commands for testing file attributes, comparing string and integer values, etc.:
- Traditional test command,
- Traditional single bracket test: [,
- Modern double bracket test: [[...]], which includes advanced features:
  - Extended regular expression and extglob matching
  - Lexicographic comparisons with < and >;
- ((...)) numeric evaluation and testing; this includes almost all "C" language operators for arithmetic and numeric comparison;
For all commands the exit status is stored in the special variable $?.
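A short sketch contrasting these test forms, assuming a /tmp directory exists; the values compared are hypothetical:
$ test -d /tmp; echo "$?"
0
$ [ 5 -gt 3 ]; echo "$?"
0
$ [[ abc == a* ]]; echo "$?"
0
$ (( 5 > 3 )); echo "$?"
0
$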
Regular Expressions
Bash 3.0 supports in-process regular expression matching using a syntax reminiscent of Perl.[61]
Regexp matching is limited to strings on the right side of the =~ operator in the [[..]] extended test construct.[62]
[[ $line =~ [[:space:]]*(a)?b ]] means values for line like ‘aab’, ‘ aaaaaab’, ‘xaby’, and ‘ ab’ will all match, as will a line containing a ‘b’ anywhere in its value.
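A minimal sketch using one of the example values above; after a successful match, the matched text and capture groups are available in the BASH_REMATCH array:
$ line='xaby'
$ [[ $line =~ [[:space:]]*(a)?b ]] && printf '<%s>\n' "${BASH_REMATCH[0]}" "${BASH_REMATCH[1]}"
<ab>
<a>
$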
Coprocesses
A coprocess is a shell command preceded by the coproc reserved word. A coprocess is executed asynchronously in a subshell, as if the command had been terminated with the ‘&’ control operator, with a two-way pipe established between the executing shell and the coprocess.[63]
— Bash Reference Manual, 3.2.6 Coprocesses
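A minimal sketch, assuming the bc calculator is installed; the coprocess name BC is hypothetical and the job number and PID shown are illustrative. The shell writes to the coprocess through ${BC[1]} and reads its output from ${BC[0]}:
$ coproc BC { bc -l; }
[1] 5021
$ printf '2^10\n' >&"${BC[1]}"
$ read -r answer <&"${BC[0]}"
$ printf '<%s>\n' "${answer}"
<1024>
$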
Data manipulation
Word Splitting
Split into words (i.e., word splitting)
Quoting
When in doubt -- Quote![64]
— Mastering Linux Shell Scripting, by Andrew Mallett
Bash has certain quoting rules: uses of
- single quotes '...'
- double quotes "..."
- backslashes \, and
- ANSI-C quoting $'...'.
See also § Locales, $"..."
See also backticks `...`: § Deprecated syntax.
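A short sketch contrasting the quoting forms; the variable name foo and its value are hypothetical:
$ foo=abc
$ printf '<%s>\n' '$foo' "$foo" \$foo $'\x61\x62\x63'
<$foo>
<abc>
<$foo>
<abc>
$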
Unicode
Support for Unicode in echo -e and ANSI-C quoting.
Brace Expansion
$ echo kernel{,-headers}
kernel kernel-headers
Brace expansion, also called alternation, is a feature copied from the C shell. It generates a set of alternative combinations.[65] Generated results need not exist as files. The results of each expanded string are not sorted and left to right order is preserved:
$ echo a{p,c,d,b}e
ape ace ade abe
$ echo {a,b,c}{d,e,f}
ad ae af bd be bf cd ce cf
Users should not use brace expansions in portable shell scripts, because the Bourne shell does not produce the same output.
$ # bash shell
$ /bin/bash -c 'echo a{p,c,d,b}e'
ape ace ade abe
$ # A traditional shell does not produce the same output
$ /bin/sh -c 'echo a{p,c,d,b}e'
a{p,c,d,b}e
When brace expansion is combined with wildcards, the braces are expanded first, and then the resulting wildcards are substituted normally. Hence, a listing of JPEG and PNG images in the current directory could be obtained using:
ls *.{jpg,jpeg,png}  # expands to *.jpg *.jpeg *.png – after which,
                     # the wildcards are processed
echo *.{png,jp{e,}g} # echo just shows the expansions –
                     # and braces in braces are possible.
In addition to alternation, brace expansion can be used for sequential ranges between two integers or characters separated by double dots. Newer versions of Bash allow a third integer to specify the increment.
$ echo {1..10}
1 2 3 4 5 6 7 8 9 10
$ echo {01..10}
01 02 03 04 05 06 07 08 09 10
$ echo file{1..4}.txt
file1.txt file2.txt file3.txt file4.txt
$ echo {a..e}
a b c d e
$ echo {1..10..3}
1 4 7 10
$ echo {a..j..3}
a d g j
When brace expansion is combined with variable expansion (a.k.a., parameter expansion and parameter substitution) the variable expansion is performed after the brace expansion, which in some cases may necessitate the use of the eval built-in, thus:
$ start=1; end=10
$ echo {$start..$end} # fails to expand due to the evaluation order
{1..10}
$ eval echo {$start..$end} # variable expansion occurs then resulting string is evaluated
1 2 3 4 5 6 7 8 9 10
Tilde Expansion
This section is empty. You can help by adding to it. (August 2025)
Parameter and variable expansion
- Type
  - Shell parameters
  - Environment variables
  - User variables
- Scope
- Arrays
  - Indexed arrays: size is unlimited.
  - Associative arrays: via declare -A[e]
- Parameter Expansion
  - Expansion syntaxes which can perform some tasks more quickly than external utilities (a brief sketch follows this list), including, among others:
    - Pattern Substitution ${foo//x/y} for sed ',s/x/y/g',
    - Remove Matching Prefix or Suffix Pattern ${bar##[a-zA-Z0-9]*} for cut -c8-,
    - Enumerate Array Keys ${!array[@]}, and
    - Display Error if Null or Unset ${var:?error message}.
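A brief sketch of several of these expansions; the variable and array names foo, bar, arr and var are hypothetical:
$ foo=abcabc
$ printf '<%s>\n' "${foo//b/X}"
<aXcaXc>
$ bar=HEADER01-data
$ printf '<%s>\n' "${bar##*-}"
<data>
$ arr=( a b c )
$ printf '<%s>\n' "${!arr[@]}"
<0>
<1>
<2>
$ unset var
$ : "${var:?error message}"
bash: var: error message
$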
Pathname expansion
Pathname expansion, i.e., shell-style globbing and pattern matching using *, ?, [...].[f]
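A minimal sketch, assuming a directory that contains only the three files shown:
$ ls
notes.txt  photo.jpg  script.sh
$ echo *.txt
notes.txt
$ echo ?????.txt
notes.txt
$ echo [ps]*
photo.jpg script.sh
$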
Locales
Locale-specific translation via $"..." quoting syntax.[69]
Process redirections and parsing
Command substitution
Command substitution: $(...),
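A minimal sketch of command substitution; the variable name greeting is hypothetical:
$ greeting=$(echo hello)
$ printf '<%s>\n' "${greeting}"
<hello>
$ printf '<%s>\n' "$(basename /usr/bin/env)"
<env>
$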
Process substitution
Process substitution, <() or >(), when a system supports it:
Bash supports process substitution using the <(command) and >(command) syntax, which substitutes the output of (or input to) a command where a filename is normally used.
(This is implemented through /proc/fd/ unnamed pipes on systems that support that, or via temporary named pipes where necessary).
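A minimal sketch comparing two command outputs without temporary files, assuming GNU diff; each <(...) construct is substituted with a filename connected to the printf output:
$ diff <(printf 'a\nb\n') <(printf 'a\nc\n')
2c2
< b
---
> c
$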
Arithmetic expansion
Arithmetic expansion, ((...)) or $((...)), including
- Integer arithmetic in any base from two to sixty-four, although
- Floating-point arithmetic is not available from within the shell itself (for this functionality, see current versions of bc and awk, among others),
Bash can perform integer calculations ("arithmetic evaluation") without spawning external processes.
It uses the ((...)) command and the $((...)) variable syntax for this purpose.
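A brief sketch of arithmetic expansion and evaluation, including a base-2 literal; the variable name x is hypothetical:
$ printf '%d\n' "$(( (3 + 4) * 2 ))"
14
$ printf '%d\n' "$(( 2#1010 ))"
10
$ x=5
$ (( x += 1 ))
$ printf '%d\n' "${x}"
6
$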
Redirection
Redirections of Standard Input, Standard Output and Standard Error data streams are performed, including
- File writing, >, and appending, >>,
- Here documents, <<,
- Here strings, <<<, which allow parameters to be used as input, and
- A redirection operator, >|, which can force overwriting of a file when a shell's noclobber setting is enabled;
Its syntax simplifies I/O redirection.
For example, it can redirect standard output (stdout) and standard error (stderr) at the same time using the &> operator.
This is simpler to type than the Bourne shell equivalent 'command > file 2>&1'.
Bash supports here documents.
Since version 2.05b Bash can redirect standard input (stdin) from a "here string" using the <<< operator.
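A brief sketch of several redirection forms; the file names are hypothetical and the exact ls error text varies by system:
$ echo first > ./log.txt
$ echo second >> ./log.txt
$ cat ./log.txt
first
second
$ ls ./no-such-file &> ./all-output.txt
$ cat ./all-output.txt
ls: cannot access './no-such-file': No such file or directory
$ tr 'a-z' 'A-Z' <<< "here string"
HERE STRING
$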
Command parsing
- (B) Commands are parsed one line at a time:
  - Control structures are honored, and
  - Backslash \ escapes are also honored at the ends of lines;
- (C) Split into words (i.e., word splitting) according to quoting rules,
  - Including ANSI-C quoting $'...';
- (D) Seven types of expansions are performed in the following order on the resulting string:
  - (Type 1) Brace expansion kernel{,-headers},
  - (Type 2) Tilde expansion ~,
  - In a left-to-right fashion:
    - (Type 3) Parameter and variable expansion $foo or ${bar}, including
    - (Type 4) Command substitution: $(...),
    - (Type 5) Process substitution, <() or >(), when a system supports it:
    - (Type 6) Arithmetic expansion, ((...)) or $((...)), including
      - Integer arithmetic in any base from two to sixty-four, although
      - Floating-point arithmetic is not available from within the shell itself.[g]
  - Word splitting (again),
  - (Type 7) Pathname expansion, i.e., shell-style globbing and pattern matching using *, ?, [...],[f]
  - Quote removal;
- (E) Redirections of Standard Input, Standard Output and Standard Error data streams are performed, including
  - File writing, >, and appending, >>,
  - Here documents, <<,
  - Here strings, <<<, which allow parameters to be used as input, and
  - A redirection operator, >|, which can force overwriting of a file when a shell's noclobber setting is enabled;
- (F) Command name lookup is performed, in the following order:
  - Commands internal to the shell:
    - Shell aliases,
    - Shell reserved words,
    - Shell functions, and
    - Shell built-in commands;
  - Commands external to the shell:
- (G) The resulting string is executed as a command.
Interactive-only features
Command History
Unlimited size command history.[72] This feature is available in interactive mode only.
Directory stack
A directory stack (pushd and popd built-ins) feature is available in interactive mode only.
Programmable completion
Also known as "tab completion" or "command-line completion": when a user presses Tab ↹ within an interactive command shell, Bash automatically uses any available completion scripts to suggest partly typed program names, filenames and variable names.[73][4] The Bash command-line completion system is very flexible and customizable, and is often packaged with functions that complete arguments and filenames for specific programs and tasks.
Bash supports programmable completion via built-in complete, compopt, and compgen commands.[74]
The feature has been available since the beta version of 2.04 released in 2000.[75]
These commands enable complex and intelligent completion specification for commands (i.e., installed programs), functions, variables, and filenames.[76]
The complete and compopt commands specify how the arguments of available commands or options are going to be listed in the readline input. As of version 5.1 completion of the command or the option is usually activated by the Tab ↹ keystroke after typing its name.[76]
This feature is available in interactive mode only.
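A minimal sketch of registering a completion specification; the command name myctl and the word list are hypothetical. After this, typing myctl followed by a partial word and pressing Tab ↹ offers start, stop, and status as candidates:
$ complete -W 'start stop status' myctl
$ complete -p myctl
complete -W 'start stop status' myctl
$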
Prompts
Configurable prompts. This feature is available in interactive mode only.
Documentation
User Manual
A user manual for Bash is provided by the GNU Project.
It is sometimes considered to be a more user-friendly document than the man page.
"You may also find information about Bash ...by looking at /usr/share/doc/bash, /usr/local/share/doc/bash, or similar directories on your system."[77]
On GNU/Linux systems, if the info program is available then the GNU Manual version relevant for your installation should also be available at info bash.[78][79]
Man page
The most recent technical manual, or 'man page', is intended to be the authoritative explanatory technical document for the understanding of how bash operates.
On GNU/Linux systems, the version relevant for your installation is usually available through the man program at man bash.[78][39][80]
help builtin
With recent versions of Bash, information on shell built-in commands can be found by executing help, help [name of builtin] or man builtins at a terminal prompt where bash is installed.
The printf command can be invoked via env to ensure that you run the program found via your shell's search path, and not a shell alias or built-in function: env printf --help.[81]
POSIX Specification
For the purpose of allowing inter-operability among different shell programs running on different operating systems, the POSIX Specification influences how modern UNIX-like shells are written. Bash "is intended to be a conformant implementation of the IEEE POSIX "Shell and Utilities" portion of the IEEE POSIX specification (IEEE Standard 1003.1)."[82] The most recent publication of the standard (2024) is available online.[83]
As the standard upon which bash is based, the POSIX Standard, or IEEE Std 1003.1,[84] et seq, is especially informative.
Further resources
"The project maintainer also has a Bash page which includes Frequently Asked Questions";[77][85][86] this FAQ is current as of Bash version 5.1 and is no longer updated.
Informal avenues of support are available via IRC at libera.chat, in the #bash channel, and mailing lists are available at Bash - GNU Project - Free Software Foundation.
Security and vulnerabilities
Root scripts
Running any shell script as the root user has, for years, been widely criticized as poor security practice. One commonly given reason is that, when a script is executed as root, the negative effects of any bugs in a script would be magnified by root's elevated privileges.
One common example: a script contains the command, rm -rf ${dir}/, but the variable $dir is left undefined.
In Linux, if the script was executed by a regular user, the shell would attempt to execute the command rm -rf / as a regular user, and the command would fail.
However, if the script was executed by the root user, then the command would likely succeed and the filesystem would be erased.
It is recommended to use sudo on a per-command basis instead.
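One commonly suggested mitigation, sketched below with the variable name dir from the example above, is the POSIX "indicate error if null or unset" expansion, which makes an undefined variable abort the command instead of expanding to an empty string; in a non-interactive script the shell exits before rm runs:
$ unset dir
$ rm -rf "${dir:?dir is not set}/"
bash: dir: dir is not set
$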
CGI scripts
CGI scripts are a significant source of vulnerability.[87][88][89][clarification needed]
builtin eval
"The eval command is extremely powerful and extremely easy to abuse."[90]
Input validation
"Input validation is the process of ensuring data has undergone data cleansing to confirm it has data quality, that is, that it is both correct and useful."
Input validation is performed to ensure only properly formed data is entering the workflow in an information system, preventing malformed data from persisting in the database and triggering malfunction of various downstream components. Input validation should happen as early as possible in the data flow, preferably as soon as the data is received from the external party.[91]
— OWASP Input Validation Cheat Sheet
Shellshock
In September 2014, a security bug was discovered[92] in the program. It was dubbed "Shellshock." Public disclosure quickly led to a range of attacks across the Internet.[93][94][95]
Exploitation of the vulnerability could enable arbitrary code execution in CGI scripts executable by certain versions of Bash. The bug involved how Bash passed function definitions to subshells through environment variables.[96] The bug had been present in the source code since August 1989 (version 1.03)[97] and was patched in September 2014 (version 4.3).
Patches to fix the bugs were made available soon after the bugs were identified. Upgrading to a current version is strongly advised.
It was assigned the Common Vulnerability identifiers CVE-2014-6271, CVE-2014-6277 and CVE-2014-7169, among others. Under CVSS Metrics 2.x and 3.x, the bug is regarded as "high" and "critical", respectively.
Deprecated syntax
- Backtick style command substitutions: `...` is deprecated in favor of $(...);
- Use of -a or -o in test / [ / [[ commands,
  - for example, [ -r ./file -a ! -L ./file ] is deprecated in favor of [ -r ./file ] && ! [ -L ./file ];
- Use of the arithmetic syntax $[...] is deprecated in favor of $((...)) or ((...)), as appropriate;
- Use of ^ as a pipeline is deprecated in favor of |;
- Any uses of expr or let.
Debugging
[edit]Table of Features
[edit]| Grammar type | Formal name | Syntax | POSIX 2024 | Description | Bash ver. |
|---|---|---|---|---|---|
| Special Built-In Utility | set / xtrace | set -x | Yes | The shell's primary means of debugging. It "writes to standard error a trace for each command after it expands the command and before it executes it." | ? |
| Special Parameters | Exit Status | "$?" | Yes | "Expands to the shortest representation of the decimal exit status." | ? |
| Parameter Expansions | Indicate Null or Unset | "${parameter:?[word]}" | Yes | "Where the expansion of [word], perhaps an error message or a line number, is written to standard error and the shell exits with a non-zero exit code." | ? |
| Special Parameters | PID of Invoked Shell | "$$" | Yes | "Expands to the shortest representation of the decimal process ID of the invoked shell." | ? |
| Special Built-In Utility | set / verbose | set -v | Yes | "Writes its input to standard error as it is read." | ? |
| Special Built-In Utility | set / pipefail | set -o pipefail | Yes | "Derive the exit status of a pipeline from the exit statuses of all of the commands in the pipeline, not just the last (rightmost) command." | ? |
| Special Built-In Utility | set / nounset | set -u | Yes | When enabled, causes the shell to exit with an error message when it encounters an unset variable expansion. Its use has a number of counter-intuitive pitfalls. | ? |
| Special Built-In Utility | set / errexit | set -e | Yes | Errexit is a setting that, when enabled, causes the shell to exit without an error message under certain very specific conditions whenever a command returns a non-zero exit code. Its use is controversial: adherents claim that errexit provides an assurance of verifiability in situations where shell scripts "must not fail," while opponents regard it as unreliable, deceptively simple, counter-intuitive, and rife with gotchas and pitfalls. Numerous developers of Bash have strongly discouraged the use of this particular setting. | ? |
| Special Built-In Utility | trap / EXIT | trap '[arg]' EXIT | Yes | "If a signal specifier is 0 or EXIT, [arg] is executed when the shell exits." If [arg] contains expansions, then [arg] should be in single quotes. | ? |
| Utility | printf | printf '<%s>\n' "${var}" | Yes | A means of reliably printing the contents of a variable. | ? |
| Bash Variables | BASHPID | "${BASHPID}" | No | "Expands to the process ID of the current bash process."[99] | ? |
| Bash Variables | BASH_ARGC | "${BASH_ARGC[@]}" | No | "An array variable whose values are the number of parameters in each frame of the current bash execution call stack."[100] | ? |
| Bash Variables | BASH_ARGV | "${BASH_ARGV[@]}" | No | "An array variable containing all of the parameters in the current bash execution call stack."[101] | ? |
| Bash Variables | BASH_LINENO | "${BASH_LINENO[@]}" | No | "An array variable whose members are the line numbers in source files where each corresponding member of "${FUNCNAME[@]}" was invoked."[102] | ? |
| Bash Variables | BASH_REMATCH | "${BASH_REMATCH[@]}" | No | "An array variable whose members are assigned by the =~ binary operator to the [[ conditional command."[103] | ? |
| Bash Variables | BASH_SOURCE | "${BASH_SOURCE[@]}" | No | "An array variable whose members are the source filenames where the corresponding shell function names in the "${FUNCNAME[@]}" array variable are defined."[104] | ? |
| Bash Variables | BASH_XTRACEFD | "${BASH_XTRACEFD}" | No | "If set to an integer corresponding to a valid file descriptor, Bash will write the trace output generated when set -x is enabled to that file descriptor."[105] | ? |
| Bash Variables | EPOCHREALTIME | "${EPOCHREALTIME}" | No | "Each time this parameter is referenced, it expands to the number of seconds since the Unix Epoch (see time(3)) as a floating point value with micro-second granularity."[106] | ? |
| Bash Variables | FUNCNAME | "${FUNCNAME[@]}" | No | "An array variable containing the names of all shell functions currently in the execution call stack."[107] | ? |
| Bash Variables | LINENO | "${LINENO}" | No | "Each time this parameter is referenced, the shell substitutes a decimal number representing the current sequential line number (starting with 1) within a script or function."[108] | ? |
| Bash Variables | PIPESTATUS | "${PIPESTATUS[@]}" | No | "An array variable containing a list of exit status values from the processes in the most-recently-executed foreground pipeline (which may contain only a single command)."[109] | ? |
| Bash Variables | PPID | "${PPID}" | No | "The process ID of the shell's parent."[110] | ? |
| Bash Variables | PS4 | "${PS4}" | No | "The value of this parameter is expanded as with PS1 and the value is printed before each command bash displays during an execution trace."[111] | ? |
| Shell Builtin | set / restricted | set -r | No | Restricted mode is intended to improve the security of an individual shell instance against a malicious person with physical access to a machine. As threat models have changed, it is now less commonly used than it once was. | ? |
| Shell Builtin | shopt / extdebug | shopt -s extdebug | No | "Behavior intended for use by debuggers." | ? |
| Shell Builtin | trap / DEBUG | trap '[arg]' DEBUG | No | "If a sigspec is DEBUG, the command arg is executed before" certain kinds of commands. | ? |
| Shell Builtin | trap / ERR | trap '[arg]' ERR | No | "If a sigspec is ERR, the command arg is executed whenever..." certain kinds of commands "return a non-zero exit status", subject to similar restrictions as with errexit. | ? |
| Shell Builtin | trap / RETURN | trap '[arg]' RETURN | No | "If a sigspec is RETURN, the command arg is executed each time a shell function or a script executed with the . or source builtins finishes executing." | ? |
- Shell features specified by POSIX:
- Bash features not specified by POSIX:
- Third party debugging utilities:
Examples
[edit]With the "${var:?}" parameter expansion, an unset or null variable can halt a script.
$ cat ex.sh
#!/bin/bash
bar="foo is not defined"
echo "${foo:?$bar}"
echo this message doesn't print
$ ./ex.sh
./ex.sh: line 3: foo: foo is not defined
$
Reliably printing the contents of an array that contains spaces and newlines, first in a portable syntax and then in Bash.
Note that POSIX does not have named arrays, only the list of positional parameters, "$@", which can be re-set by the set builtin.
$ # In POSIX shell:
$ set -- "a" " b" " 
> c "
$ printf ',%s,\n' "$@"
,a,
, b,
, 
 c ,
Note that in Bash, the number of spaces before the newline is made clear.
$ # In Bash:
$ array=( "a" " b" " 
> c " )
$ declare -p array
declare -a array=([0]="a" [1]=" b" [2]=$' \n c ')
Printing an error message when there's a problem.
$ cat error.sh
#!/usr/bin/env bash
if ! lsblk | grep sdb
then
    echo Error, line "${LINENO}"
fi
$ ./error.sh
Error, line 4
Using xtrace.
If errexit had been enabled, then echo quux would not have been executed.
$ cat test.sh
#!/usr/bin/env bash
set -x
foo=bar; echo "${foo}"
false
echo quux
$ ./test.sh
+ foo=bar
+ echo bar
bar
+ false
+ echo quux
quux
Note: $BASHPID differs from $$ in certain circumstances, such as subshells that do not require bash to be reinitialized.
$ echo $(echo $BASHPID $$) $$ $BASHPID
25680 16920 16920 16920
#   |     |     |     |
#   |     |     |     \-- $BASHPID outside of the subshell
#   |     |     \-- $$ outside of the subshell
#   |     \-- $$ inside of the subshell
#   \-- $BASHPID inside of the subshell
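A brief sketch of two further rows from the table above, PIPESTATUS and trap on EXIT; the script and file names are illustrative:

$ false | true
$ echo "${PIPESTATUS[@]}"
1 0
$ cat cleanup.sh
#!/bin/bash
tmpfile=$(mktemp)
trap 'rm -f -- "$tmpfile"' EXIT    # cleanup runs whenever the script exits
echo "scratch file: $tmpfile"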
Bug reporting
[edit]An external command called bashbug reports Bash shell bugs. When the command is invoked, it brings up the user's default editor with a form to fill in. The form is mailed to the Bash maintainers (or optionally to other email addresses).[125][126]
History
[edit]Shell script functionality originated with files called "runcoms," in reference to the 1963 RUNCOM program of the same name; the suffix "rc" is short for "runcom."[127] The term "shell" was coined by Louis Pouzin in 1964 or 1965 and appeared in his 1965 paper, "The SHELL, A Global Tool for Calling and Chaining Procedures in the System," which describes many features later found in many UNIX shells.[128][129] In 1969, RFC 20 adopted ASCII as the standard character encoding for network interchange.[130]
Timeline
[edit]Significant events in Bash history are listed below:
| Date | Event |
|---|---|
| 1988-01-10 | Brian Fox began coding Bash after Richard Stallman became dissatisfied with the lack of progress being made by a prior developer.[131] Stallman and the FSF considered a free shell that could run existing shell scripts so strategic to a completely free system built from BSD and GNU code that this was one of the few projects they funded themselves. Fox undertook the work as an employee of the FSF.[131][132] |
| 1989-06-08 | Fox released Bash as a beta, version 0.99.[133] The license was GPL-1.0-or-later. "In addition to supporting backward-compatibility for scripting, Bash has incorporated features from the Korn and C shells. You'll find command history, command-line editing, a directory stack (pushd and popd), many useful environment variables, command completion, and more."[134] Eventually it supported "regular expressions (similar to Perl), and associative arrays". |
| 1991 | Bash holds historical significance as one of the earliest programs ported to Linux by Linus Torvalds, alongside the GNU C Compiler (GCC).[135] |
| 1992 ~ 1994 | Brian Fox retired as the primary maintainer sometime between mid-1992[136] and mid-1994.[137][138] His responsibility was transitioned to another early contributor, Chet Ramey.[85][139][9][8] Since then, Bash has become the most popular default interactive shell among the major GNU/Linux distributions, such as Fedora, Debian, and openSUSE, as well as among their derivatives and competitors.[140][141] |
| 1994-01-26 | Debian – initial release. Bash is the default interactive and non-interactive shell.[142] |
| 1996-12-31 | Chet Ramey released Bash 2.0. The license was GPL-2.0-or-later. |
| 1997-06-05 | Bash 2.01 released. |
| 1998-04-18 | Bash 2.02 released. |
| 1999-02-19 | Bash 2.03 released. |
| 2000-03-21 | Bash 2.04 released. |
| 2000-09-14 | The bug-bash mailing list exists.[143] |
| 2001-04-09 | Bash 2.05 released.[144] |
| 2003 | Bash became the default shell on Apple's operating systems (later macOS) starting with OS X 10.3 Panther.[145][146] It was also available on OS X 10.2 Jaguar, where the default shell was tcsh. |
| 2004-07-27 | Bash 3.0 released.[147] |
| 2005-12-09 | Bash 3.1 released.[148] |
| 2006-10-12 | Bash 3.2 released.[149] The license was GPL-2.0-or-later. |
| 2006 | Ubuntu replaced bash with dash as its default /bin/sh (non-interactive system shell); bash remained the default login shell. |
| 2009-02-20 | Bash 4.0 released.[150] Its license is GPL-3.0-or-later. |
| 2010-01-02 | Bash 4.1 released.[151] |
| 2011-02-14 | Bash 4.2 released.[152] |
| 2012 | On Solaris 11, "the default user shell is the Bourne-again (bash) shell."[153] |
| 2014-02-27 | Bash 4.3 released.[154] |
| 2014-09-08 | Shellshock (software bug) disclosed.[155][156] Patches to fix the bugs were made available soon after the bugs were identified.[157] |
| 2015 | Termux and other terminal-emulation applications made Bash available on Android. |
| 2016-09-15 | Bash 4.4 released. |
| 2009 ~ 2018 | Apple declined to accept version 4 of Bash being licensed under version 3 of the GNU GPL, and ceased to supply upgrades to Bash beyond version 3.2 (as supplied in macOS Mojave). |
| 2019-01-07 | Bash 5.0 released.[162] |
| 2019-06-05 | Apple declared zsh its default shell[158] and supplied version 5.7 in its Catalina release of macOS.[159][160][161] |
| 2020-12-07 | Bash 5.1 released.[163] |
| 2022-09-26 | Bash 5.2 released. |
| 2025 | Bash 5.3 released. |
See also
[edit]- Comparison of command shells
- Multics § Commands, exec_com: the first command processor.
- FTP download from GNU Project of Bash versions 1.14.0 to current.[164]
Unix shells
[edit]- Almquist shell (ash)
- Bourne shell (sh)
- BusyBox
- C shell (csh)
- Debian-Almquist Shell (dash)
- Fish shell: Friendly Interactive Shell
- Google Shell (goosh) – a UNIX-like front-end for Google Search.
- Korn shell (ksh), of which there are numerous variations.
- nsh – "A command-line shell like fish, but POSIX compatible."[165]
- osh – "Oil Shell is a Bash-compatible UNIX command-line shell"; available on Arch.
- Mashey or Programmer's Workbench shell
- Qshell for IBM i
- rc from Plan 9
- RUNCOM
- rush – Restricted User Shell, available on Debian.[142]
- Stand-alone shell (sash)
- scsh – The Scheme Shell.
- TENEX C shell (tcsh)
- Thompson shell (tsh)
- Toybox
- yash – Yet Another Shell, aims "to be the most POSIX-compliant shell in the world"; available on Arch.
- Z shell (zsh)
Graphical interface to scripts
[edit]Many programs can provide a graphical or text-based dialog interface for shell scripts; a brief example follows the list.
- curses - a terminal control library for Unix-like systems, enabling the construction of text user interface (TUI) applications.
- dialog - a utility for creating dialog boxes in the console, built on the curses and ncurses libraries.
- gtkdialog - a utility for building GTK+ graphical front-ends driven by bash scripts.[166]
- kdialog - a KDE equivalent of zenity.[167]
- ncurses - a programming library for creating textual user interfaces (TUIs) that work across a wide variety of terminals.
- xdialog - a replacement for dialog that gives programs launched from the terminal an X Window System interface.
- yad - a fork of zenity with more features.[169]
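As a sketch of the kind of call these tools expose, dialog can draw a message box or a yes/no prompt directly from a script; the titles, dimensions, and the backup.sh script name are illustrative:

$ dialog --title "Backup" --msgbox "Backup finished." 8 40
$ dialog --yesno "Proceed with backup?" 8 40 && ./backup.sh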
Further reading
[edit]- [A]
- "Shell Scripting Primer". apple.com. Apple. Retrieved 8 August 2025.
Copyright © 2003, 2014 Apple Inc All Rights Reserved. ... Updated: 2014-03-10
- "Shell Scripting Primer". apple.com. Apple. Retrieved 8 August 2025.
- [G]
- "Shell Style Guide". github.io. Google. Retrieved 8 August 2025.
- [H]
- Stephenson, Neal (2003). In the Beginning... Was the Command Line. HarperCollins. ISBN 978-0380815937.
- [I]
- M. Jones (9 December 2011). "Evolution of shells in Linux: From Bourne to Bash and beyond". ibm.com. IBM. Retrieved 8 August 2025.
- [M]
- Pouzin, Louis (2 April 1965). "The SHELL: A Global Tool for Calling and Chaining Procedures in the System" (PDF). mit.edu. Massachusetts Institute of Technology. Retrieved 8 August 2025.
- [O]
- Newham, Cameron; Rosenblatt, Bill. "Learning the Bash Shell, 2e". oreilly.com. O'Reilly Media, Inc. Retrieved 8 August 2025.
- [U]
- "Scripting Reference :: Scripting with the Bourne-Again Shell (Bash)". berkeley.edu. University of California, Berkeley. Retrieved 19 May 2024.
- "IRIS :: Instructional & Research Information Systems :: FAQ: Unix :: About UNIX Shells". berkeley.edu. University of California, Berkeley. Retrieved 8 August 2025.
Notes
[edit]- ^ not to be confused with the shell variable
$$ - ^ a b Shell scripts do not require compilation before execution and, when certain requirements are met, can be invoked as commands by using their filename.
- ^ concept drawn from ALGOL 68;[60]
- ^ Bash 4 also switches its license to GPL-3.0-or-later.
- ^ In February 2009,[66] Bash 4.0[d][67] introduced support for associative arrays.[4] Associative array indices are strings, in a manner similar to AWK or Tcl.[68] They can be used to emulate multidimensional arrays.
- ^ a b Although they can be used in conjunction, the use of brackets in pattern matching, [...], and the use of brackets in the testing commands, [ and [[ ... ]], are different things.
- ^ for this functionality, see current versions of bc and awk, among others.
References
[edit]- ^ Chet Ramey (5 July 2025). "Bash-5.3-release available". Retrieved 5 July 2025.
- ^
"GNU Bash". Free Software Foundation, Inc. GNU Project. Archived from the original on 26 April 2019. Retrieved 8 August 2025.
Bash is free software, distributed under the terms of the [GNU] General Public License as published by the Free Software Foundation, version 3 of the License (or any later version).
- ^
"bash-1.11". oldlinux.org. Archived from the original on 15 October 2021. Retrieved 8 August 2025.
See test.c for GPL-2.0-or-later
- ^ a b c d "BashFAQ/061: Is there a list of which features were added to specific releases (versions) of Bash?". wooledge.org. Archived from the original on 2 March 2021. Retrieved 8 August 2025.
- ^ "bash-1.05". oldlinux.org. Archived from the original on 6 May 2021. Retrieved 8 August 2025.
- ^ "Is there a way to download the presumably initial bash source bash-0.99?". unix.stackexchange.com. Retrieved 8 August 2025.
- ^ https://www.gnu.org/software/bash/
- ^ a b Morris, Richard (14 December 2015). "Chet Ramey: Geek of the Week". Simple Talk. Archived from the original on 31 July 2020. Retrieved 8 August 2025.
- ^ a b
- Hamilton, Naomi (30 March 2008). "The A-Z of Programming Languages: BASH/Bourne-Again Shell". computerworld.com.au. Computerworld Australia. p. 2. Archived from the original on 11 August 2016. Retrieved 8 August 2025.
When Richard Stallman decided to create a full replacement for the then-encumbered Unix systems, he knew that he would eventually have to have replacements for all of the common utilities, especially the standard shell, and those replacements would have to have acceptable licensing.
- Hamilton, Naomi (30 May 2008). "The A-Z of Programming Languages: BASH/Bourne-Again Shell". readthedocs.io. Computerworld Australia. Retrieved 8 August 2025.
- Hamilton, Naomi (30 March 2008). "The A-Z of Programming Languages: BASH/Bourne-Again Shell". computerworld.com.au. Computerworld Australia. p. 2. Archived from the original on 11 August 2016. Retrieved 8 August 2025.
- ^
Michael, Randal (2008). Mastering Unix Shell Scripting, 2e. Wiley Publishing, Inc., Indianapolis, Indiana. p. 3. ISBN 978-0-470-18301-4. Retrieved 16 August 2025.
UNIX is case sensitive. Because UNIX is case sensitive, our shell scripts are also case sensitive.
- ^ "GNU Bash Manual: 8.2.1 Readline Bare Essentials". Free Software Foundation, Inc. GNU Project. Retrieved 8 August 2025.
- ^ "GNU Bash Manual: 8.5 Readline vi mode". Free Software Foundation, Inc. GNU Project. Retrieved 8 August 2025.
- ^ Rippee, Scott (5 October 2012). "Getting started with BASH: A Bash Tutorial". hypexr.org. Archived from the original on 2 March 2021. Retrieved 8 August 2025.
- ^ "Bash Reference Manual, D.2 Index of Shell Reserved Words". Retrieved 18 August 2025.
- ^ a b c d https://www.gnu.org/software/bash/manual/bash.html
- ^ https://www.gnu.org/software/bash/manual/html_node/Arrays.html
- ^ https://www.linux-magazine.com/Online/News/Bash-4.0-Introduces-Associative-Arrays
- ^ a b
"Introduction to Linux, Ch. 3 About files and the filesystem, 3.2 Orientation in the filesystem, 3.2.1 The path". Linux Documentation Project. Retrieved 13 August 2025.
The PATH environment variable ... lists those directories in the system where executable files can be found, and thus saves the user a lot of typing and memorizing locations of commands.
- ^ "Modular data structures in C". dartmouth.edu. Dartmouth University. Retrieved 15 August 2025.
- ^ "4.1 Bourne Shell Builtins". gnu.org. Free Software Foundation, Inc. Retrieved 26 August 2025.
- ^ a b
Stevens, Al (1 July 2021). "I Almost Get a Linux Editor and Compiler". drdobbs.com. Archived from the original on 2 March 2021. Retrieved 8 August 2025.
But virtually all the configure and install scripts that come with open-source programs are written for bash, and if you want to understand those scripts, you have to know bash.
- ^ "Bash reference manual, 6.2 Bash startup files". Free Software Foundation, Inc. GNU Project. Retrieved 12 August 2025.
- ^ a b Cooper, Mendel. "Advanced Bash Scripting Guide: 36.9: Portability Issues". Linux Documentation Project. ibiblio.org. Archived from the original on 27 January 2012. Retrieved 8 August 2025.
- ^ "6.11 Bash and POSIX". case.edu. Case Western Reserve University. Retrieved 8 August 2025.
- ^ a b "Debian Policy Manual v4.5.0.2: 10 - Files". debian.org. Archived from the original on 12 May 2020. Retrieved 11 May 2020.
- ^ a b – Linux General Commands Manual from ManKier.com
- ^ a b – Linux General Commands Manual from ManKier.com
- ^ "Autoconf: 11: Portable Shell". Free Software Foundation, Inc. GNU Project. Archived from the original on 2 March 2021. Retrieved 20 January 2020.
- ^
"Bash Reference Manual, B.1 Implementation Differences from the SVR4.2 Shell". Free Software Foundation, Inc. GNU Project. Retrieved 13 August 2025.
In a questionable attempt at security, the SVR4.2 shell, when invoked without the -p option, will alter its real and effective UID and GID....
- ^
"Bash Reference Manual, 4.3.1 The Set Builtin". Free Software Foundation, Inc. GNU Project. Retrieved 13 August 2025.
In this mode, the $BASH_ENV and $ENV files are not processed, shell functions are not inherited from the environment, and the SHELLOPTS, BASHOPTS, CDPATH and GLOBIGNORE variables, if they appear in the environment, are ignored....
- ^ a b "BASH Debugger". sourceforge.net. Retrieved 18 August 2025.
- ^ Free Software Foundation. "Bash Reference Manual, 6.12 Shell Compatibility Mode". Free Software Foundation, Inc. GNU Project. Retrieved 5 August 2025.
- ^ See set -v in the documentation.
- ^ The Open Group. "2.15 Special Builtins: colon - null utility". opengroup.org. Base Specifications Issue 8: IEEE Std 1003.1-2024. Retrieved 8 August 2025.
- ^ "Bash Reference Manual: 7.1 Job Control Basics". Free Software Foundation, Inc. GNU Project. Archived from the original on 15 March 2018. Retrieved 8 August 2025.
- ^
Michael, Randal (2008). Mastering Unix Shell Scripting, 2e. Wiley Publishing, Inc., Indianapolis, Indiana. p. 25. ISBN 978-0-470-18301-4. Retrieved 16 August 2025.
19 :: SIGSTOP :: Stop, usually Ctrl + z
- ^
Newham, Cameron (29 March 2005). Learning the bash Shell: Unix Shell Programming. O'Reilly Media, Inc. p. 205. ISBN 978-0-596-55500-9. Retrieved 16 August 2025.
Use KILL only as a last resort!
- ^ "dd(1)". www.man7.org. Retrieved 19 October 2025.
- ^ a b "Bash(1)". case.edu. Case Western Reserve University. Retrieved 8 August 2025.
- ^ "Sending and Trapping Signals". wooledge.org. Retrieved 5 August 2025.
- ^
Michael, Randal (2008). Mastering Unix Shell Scripting, 2e. Wiley Publishing, Inc., Indianapolis, Indiana. p. 20. ISBN 978-0-470-18301-4. Retrieved 16 August 2025.
In Korn shell the echo command recognizes these command options by default. In Bash shell we must add the -e switch to the echo command, echo -e "\n" for one new line.
- ^ "Bash Reference Manual: 3.7.3: Command Execution Environment". Free Software Foundation, Inc. GNU Project. Retrieved 8 August 2025.
- ^
"Bash Reference Manual, 6.6 Aliases". Free Software Foundation, Inc. GNU Project. Retrieved 14 August 2025.
Aliases allow a string to be substituted for a word that is in a position in the input where it can be the first word of a simple command. Aliases have names and corresponding values that are set and unset using the alias and unalias builtin commands.
- ^ "alias - define or display aliases". opengroup.org. The Open Group. Retrieved 14 August 2025.
- ^ Cooper, Mendel. "Advanced Bash Scripting Guide, Ch 25. Aliases". Linux Documentation Project. Retrieved 14 August 2025.
- ^ "Commands and Arguments: Aliases". wooledge.org. Retrieved 14 August 2025.
- ^ "Compound Commands: Aliases". wooledge.org. Retrieved 14 August 2025.
- ^
"Bash Reference Manual, 3.3 Shell Functions". Free Software Foundation, Inc. GNU Project. Retrieved 14 August 2025.
Shell functions are a way to group commands for later execution using a single name for the group. They are executed just like a "regular" simple command. When the name of a shell function is used as a simple command name, the shell executes the list of commands associated with that function name. Shell functions are executed in the current shell context; there is no new process created to interpret them.
- ^ "2.9.5 Function Definition Command". opengroup.org. The Open Group. Retrieved 14 August 2025.
- ^ Cooper, Mendel. "Advanced Bash Scripting Guide, Ch 24. Functions". Linux Documentation Project. Retrieved 14 August 2025.
- ^ "Compound Commands, 4. Functions". wooledge.org. Retrieved 14 August 2025.
- ^ "Bash Programming, 2. Basic Concepts, 7. Functions". wooledge.org. Retrieved 14 August 2025.
- ^ "Bash Weaknesses, 13. Functions". wooledge.org. Retrieved 14 August 2025.
- ^ "Bash Reference Manual: 4.1: Bourne Shell Builtins". Free Software Foundation, Inc. GNU Project. Retrieved 8 August 2025.
- ^ "Bash Reference Manual: 4.3.1: The Set Builtin". Free Software Foundation, Inc. GNU Project. Retrieved 8 August 2025.
- ^ "bug-bash archives, Re: Document that set -v inside case statements is special". bug-bash (Mailing list). GNU Project. 20 April 2021. Retrieved 8 August 2025.
- ^ a b "Bash changes". bash-hackers.org. Archived from the original on 23 September 2019. Retrieved 8 August 2025.
- ^ "Glossary of Coding Terms for Beginners: iteration". syracuse.edu. Syracuse University. 13 January 2020. Retrieved 15 August 2025.
- ^ "compound - noun (1)". merriam-webster.com. Merriam-Webster's Collegiate Dictionary. Retrieved 15 August 2025.
- ^ Stephen R Bourne (12 June 2015). "Early days of Unix and design of sh" (PDF). bsdcan.org. BSDcan 2015: The Technical BSD Conference. Retrieved 8 August 2025.
- ^ "Advanced Bash Scripting Guide: 37.2: Bash, version 3". Linux Documentation Project. Section 37.2 (Bash, version 3). Archived from the original on 5 May 2017. Retrieved 5 March 2017.
- ^ "GNU Bash Manual: 3.2.5.2: Conditional Constructs". Free Software Foundation, Inc. GNU Project. Retrieved 8 August 2025.
- ^ "Bash Reference Manual, 3.2.6 Coprocesses". Free Software Foundation, Inc. GNU Project. Retrieved 15 August 2025.
- ^
Mallett, Andrew (24 December 2015). Mastering Linux Shell Scripting. Packt Publishing, Ltd. p. 56. ISBN 978-1-78439-759-3. Retrieved 16 August 2025.
Learning this now can save us a lot of pain and heartache later, especially....
- ^ "Bash Reference Manual: 5.3.1 Brace Expansion". Free Software Foundation, Inc. GNU Project. Archived from the original on 15 March 2018. Retrieved 8 August 2025.
- ^ "Advanced Bash Scripting Guide: 37.3: Bash, version 4". Linux Documentation Project. Archived from the original on 1 July 2018. Retrieved 8 August 2025.
- ^ "Update bash to version 4.0 on OSX". apple.stackexchange.com. Archived from the original on 25 June 2018. Retrieved 8 August 2025.
- ^ "Bash Reference Manual: 6.7: Arrays". Free Software Foundation, Inc. GNU Project. Archived from the original on 11 July 2018. Retrieved 8 August 2025.
- ^
"Bash Reference Manual, 3.1.2.5 Locale-Specific Translation". Free Software Foundation, Inc. Free Software Foundation. Retrieved 16 August 2025.
Prefixing a double-quoted string with a dollar sign $, such as $"hello, world", causes the string to be translated according to the current locale. The gettext infrastructure performs the lookup and translation, using the $LC_MESSAGES, $TEXTDOMAINDIR, and $TEXTDOMAIN shell variables.
- ^ "The Bash Parser". wooledge.org. Retrieved 15 August 2025.
- ^ Ramey, Chet. "The Architecture of Open Source Applications (Volume 1): The Bourne-Again Shell". aosabook.org. Retrieved 15 August 2025.
- ^ "Bash Reference Manual: 9.2: Bash History Builtins". Free Software Foundation, Inc. GNU Project. Archived from the original on 15 September 2019. Retrieved 8 August 2025.
- ^ "Bash Reference Manual: 8.6: Programmable Completion". Free Software Foundation, Inc. GNU Project. Retrieved 8 August 2025.
- ^ "Bash Reference Manual: 8.6 Programmable completion". case.edu. Case Western Reserve University. Retrieved 8 August 2025.
- ^ "Index of /gnu/bash". swin.edu.au. Swinburne University of Technology. Archived from the original on 8 March 2020. Retrieved 8 August 2025.
- ^ a b "Advanced Bash Scripting Guide: Appendix J: An Introduction to Programmable Completion". Linux Documentation Project. Retrieved 21 January 2022.
- ^ a b "Bash". Free Software Foundation, Inc. GNU Project. Retrieved 8 August 2025.
- ^ a b Free Software Foundation. "GNU Bash manual". Free Software Foundation, Inc. GNU Project. Retrieved 8 August 2025.
- ^ "Bash Reference Manual". case.edu. Case Western Reserve University. Retrieved 8 August 2025.
- ^ "git: index : bash.git". Free Software Foundation, Inc. GNU Project. Retrieved 8 August 2025.
- ^ "GNU Coreutils manual v.9.7, 15.2 printf: Format and print data". Free Software Foundation, Inc. GNU Project. Retrieved 11 August 2025.
- ^ "GNU Bash Manual: 1.1: What is Bash?". Free Software Foundation, Inc. GNU Project. Retrieved 8 August 2025.
- ^ Open Group. "POSIX 2024". Free Software Foundation, Inc. Retrieved 30 July 2025.
- ^ "The Open Group Base Specifications Issue 7, 2018 edition". opengroup.org.
- ^ a b "The GNU Bourne-Again Shell, Top Page". case.edu. Case Western Reserve University. Retrieved 8 August 2025.
- ^ "Frequently Asked Questions". case.edu. Case Western Reserve University. Retrieved 8 August 2025.
- ^ "CVE-2024-2448: Authenticated Command Injection In Progress Kemp LoadMaster". rhinosecuritylabs.com. Rhino Security Labs, Inc. 23 April 2024. Retrieved 17 August 2025.
- ^ "CGI-BIN Specific Vulnerabilities". ucdavis.edu. University of California, Davis. January 1999. Retrieved 17 August 2025.
- ^ "CGI Security". BITS: computing and communications news. Los Alamos National Laboratory. March 1996. Archived from the original on 16 April 2000. Retrieved 17 August 2025.
- ^ "Eval command and security issues". wooledge.org. Archived from the original on 21 July 2025. Retrieved 17 August 2025.
- ^ "Input Validation Cheat Sheet". owasp.org. OWASP. Retrieved 17 August 2025.
- ^ Juliana, Cino (10 June 2017). "Linux bash exit status and how to set exit status in bash - Techolac". techolac.com. Archived from the original on 21 June 2019. Retrieved 21 June 2019.
- ^ Leyden, John (24 September 2014). "Patch Bash NOW: 'Shell Shock' bug blasts OS X, Linux systems wide open". theregister.co.uk. The Register. Archived from the original on 16 October 2014. Retrieved 25 September 2014.
- ^ Perlroth, Nicole (25 September 2014). "Security Experts Expect 'Shellshock' Software Bug in Bash to Be Significant". The New York Times. Archived from the original on 5 April 2019. Retrieved 25 September 2014.
- ^ Seltzer, Larry (29 September 2014). "Shellshock makes Heartbleed look insignificant". zdnet.com. ZDNet. Archived from the original on 14 May 2016.
- ^ Sidhpurwala, Huzaifa (24 September 2014). "Bash specially-crafted environment variables code injection attack". redhat.com. Red Hat. Archived from the original on 25 September 2014. Retrieved 25 September 2014.
- ^ Chazelas, Stephane (4 October 2014). "oss-sec mailing list archives". seclists.org. Archived from the original on 6 October 2014. Retrieved 4 October 2014.
- ^ "Advanced Bash Scripting Guide: 2.3: Debugging Bash scripts". Linux Documentation Project. Archived from the original on 4 November 2018. Retrieved 8 August 2025.
- ^ "Bash Reference Manual: 5.2: Bash Variables". Free Software Foundation, Inc. GNU Project. BASHPID. Retrieved 8 August 2025.
- ^ "Bash Reference Manual: 5.2: Bash Variables". Free Software Foundation, Inc. GNU Project. BASH_ARGC. Retrieved 8 August 2025.
- ^ "Bash Reference Manual: 5.2: Bash Variables". Free Software Foundation, Inc. GNU Project. BASH_ARGV. Retrieved 8 August 2025.
- ^ "Bash Reference Manual: 5.2: Bash Variables". Free Software Foundation, Inc. GNU Project. BASH_LINENO. Retrieved 8 August 2025.
- ^ "Bash Reference Manual: 5.2: Bash Variables". Free Software Foundation, Inc. GNU Project. BASH_REMATCH. Retrieved 8 August 2025.
- ^ "Bash Reference Manual: 5.2: Bash Variables". Free Software Foundation, Inc. GNU Project. BASH_SOURCE. Retrieved 8 August 2025.
- ^ "Bash Reference Manual: 5.2: Bash Variables". Free Software Foundation, Inc. GNU Project. BASH_XTRACEFD. Retrieved 8 August 2025.
- ^ "Bash Reference Manual: 5.2: Bash Variables". Free Software Foundation, Inc. GNU Project. EPOCHREALTIME. Retrieved 8 August 2025.
- ^ "Bash Reference Manual: 5.2: Bash Variables". Free Software Foundation, Inc. GNU Project. FUNCNAME. Retrieved 8 August 2025.
- ^ "Bash Reference Manual: 5.2: Bash Variables". Free Software Foundation, Inc. GNU Project. LINENO. Retrieved 8 August 2025.
- ^ "Bash Reference Manual: 5.2: Bash Variables". Free Software Foundation, Inc. GNU Project. PIPESTATUS. Retrieved 8 August 2025.
- ^ "Bash Reference Manual: 5.2: Bash Variables". Free Software Foundation, Inc. GNU Project. PPID. Retrieved 8 August 2025.
- ^ "Bash Reference Manual: 5.2: Bash Variables". Free Software Foundation, Inc. GNU Project. PS4. Retrieved 8 August 2025.
- ^
- "GNU Bash Manual, 3.5.3: Shell Parameter Expansion". Free Software Foundation, Inc. GNU Project. Retrieved 8 August 2025.
- "Bash(1), Parameter Expansion". case.edu. Case Western Reserve University. Retrieved 8 August 2025.
- "POSIX 2024, 2.6.2 Parameter Expansion". opengroup.org. The Open Group. Retrieved 18 August 2025.
- ^ "Bash Reference Manual: 3.4.2: Special Parameters". Free Software Foundation, Inc. GNU Project. Retrieved 8 August 2025.
- ^ "Shell Command Language". opengroup.org.
- ^ a b
- "GNU Bash Manual, 4.3.1: The Set Builtin". Free Software Foundation, Inc. Retrieved 8 August 2025.
- "POSIX 2024, set". opengroup.org.
- ^ a b c d e "Bash(1), Shell builtin commands". case.edu. Case Western Reserve University. Retrieved 8 August 2025.
- ^ a b "Bourne Shell Builtins (Bash Reference Manual)". Free Software Foundation, Inc. GNU Project. Retrieved 8 August 2025.
- ^ "Bash Reference Manual: 5.2: Bash Variables". Free Software Foundation, Inc. GNU Project. Retrieved 8 August 2025.
- ^ "Bash(1), Special parameters". case.edu. Case Western Reserve University. Retrieved 8 August 2025.
- ^ "The Shopt Builtin (Bash Reference Manual)". Free Software Foundation, Inc. GNU Project. Retrieved 8 August 2025.
- ^
- "ShellCheck: Shell script analysis tool". shellcheck.net.
- Github: shellcheck
- ^ "Debian -- Details of package devscripts in sid". debian.org.
- ^ "Kcov - code coverage". github.io.
- ^ "[Bashdb-devel] Re: [PATCH] fix bashdb script handling of tmp directory". bug-bash (Mailing list). GNU Project. Retrieved 8 August 2025.
- ^ "bashbug(1)". die.net. 21 October 2017. Archived from the original on 2 October 2018.
- ^ "bashbug(1) Mac OS X Manual Page". apple.com. 4 June 2014. Archived from the original on 6 October 2014.
- ^ "In Unix, what do some obscurely named commands stand for?". iu.edu. Indiana University. 5 February 2009. Archived from the original on 10 June 2010. Retrieved 8 August 2025.
- ^ Louis Pouzin (25 November 2000). "The Origin of the Shell". multicians.org.
- ^ "The SHELL, A Global Tool for Calling and Chaining Procedures in the System" (PDF). mit.edu.
- ^ Cerf, Vint (16 October 1969). "ASCII Format for Network Interchange". ietf.org. UCLA, Network Working Group. Retrieved 8 August 2025.
- ^ a b
Stallman, Richard; Ramey, Chet (10 February 1988). "GNU + BSD = ?". google.com. Newsgroup: comp.unix.questions. Usenet: 2362@mandrill.CWRU.Edu. Archived from the original on 28 December 2021. Retrieved 28 December 2021.
For a year and a half, the GNU shell was "just about done". The author made repeated promises to deliver what he had done, and never kept them. Finally I could no longer believe he would ever deliver anything. So Foundation staff member Brian Fox is now implementing an imitation of the Bourne shell.
- ^
Stallman, Richard (3 October 2010). "About the GNU Project". Free Software Foundation, Inc. GNU Project. Archived from the original on 24 April 2011. Retrieved 8 August 2025.
Free Software Foundation employees have written and maintained a number of GNU software packages. Two notable ones are the C library and the shell. ... We funded development of these programs because the GNU Project was not just about tools or a development environment. Our goal was a complete operating system, and these programs were needed for that goal.
- ^ Fox, Brian; Tower Jr., Leonard H. (8 June 1989). "Bash is in beta release!". google.com. Newsgroup: gnu.announce. Archived from the original on 4 May 2013. Retrieved 28 October 2010.
- ^ "Evolution of shells in Linux". ibm.com. IBM. 9 December 2011.
- ^
Torvalds, Linus (26 August 1991). "What would you like to see most in Minix?". google.com. Newsgroup: comp.os.minix. Retrieved 8 August 2025.
To make things really clear - yes I can run gcc on it, and bash, and most of the gnu [bin/file]utilities
- ^ "January 1993 GNU's Bulletin". google.com. Newsgroup: gnu.announce. 20 April 1993. Usenet: gnusenet930421bulletin@prep.ai.mit.edu. Archived from the original on 2 March 2021. Retrieved 28 October 2010.
- ^ Ramey, Chet (1 August 1994). "Bash – the GNU shell (Reflections and Lessons Learned)". linuxjournal.com. Linux Journal. Archived from the original on 5 December 2008. Retrieved 13 November 2008.
- ^ Ramey, Chet (31 October 2010), "Dates in your Computerworld interview", scribd.com, archived from the original on 20 July 2012, retrieved 31 October 2010
- ^
- Ramey, Chet (12 June 1989). "Bash 0.99 fixes & improvements". google.com. Newsgroup: gnu.bash.bug. Archived from the original on 10 November 2012. Retrieved 1 November 2010.
- Ramey, Chet (24 July 1989). "Some bash-1.02 fixes". google.com. Newsgroup: gnu.bash.bug. Archived from the original on 10 November 2012. Retrieved 30 October 2010.
- Fox, Brian (2 March 1990). "Availability of bash 1.05". google.com. Newsgroup: gnu.bash.bug. Archived from the original on 10 November 2012. Retrieved 30 October 2010.
- ^
Bresnahan, Christine; Blum, Richard (April 2015). CompTIA Linux+ Powered by Linux Professional Institute Study Guide: Exam LX0-103 and Exam LX0-104 (3rd ed.). John Wiley & Sons, Inc. p. 5. ISBN 978-1-119-02122-3. Archived from the original on 2 March 2021. Retrieved 6 June 2016.
In Linux, most users run bash because it is the most popular shell.
- ^
Danesh, Arman; Jang, Michael (February 2006). Mastering Linux. John Wiley & Sons, Inc. p. 363. ISBN 978-0-7821-5277-7. Archived from the original on 2 March 2021. Retrieved 6 June 2016.
The Bourne Again Shell (bash) is the most common shell installed with Linux distributions.
- ^ a b "Debian Wiki: Shell". debian.org.
- ^ Ramey, Chet (14 September 2000). "Re: Line-Edit mode is lost if "set -o vi" is in any files sourced on login". bug-bash (Mailing list). GNU Project. Retrieved 8 August 2025.
- ^ Ramey, Chet (9 April 2001). "Bash-2.05 available for FTP". bug-bash (Mailing list). GNU Project. Retrieved 8 August 2025.
- ^ Essential Mac OS X Panther Server Administration, p. 189.
- ^
Foster-Johnson, Eric; Welch, John C.; Anderson, Micah (April 2005). Beginning Shell Scripting. John Wiley & Sons, Inc. p. 6. ISBN 978-0-7645-9791-6. Archived from the original on 2 March 2021. Retrieved 8 August 2025.
Bash is by far the most popular shell and forms the default shell on Linux and Mac OSX systems.
- ^ "Bash-3.0 available for FTP". bug-bash (Mailing list). GNU Project. Retrieved 8 August 2025.
- ^ "Bash-3.1 released". bug-bash (Mailing list). GNU Project. Retrieved 8 August 2025.
- ^ "Bash-3.2 available for FTP". bug-bash (Mailing list). GNU Project. Retrieved 8 August 2025.
- ^ "Bash-4.0 available for FTP". bug-bash (Mailing list). GNU Project. Retrieved 8 August 2025.
- ^ "Bash-4.1 available for FTP". bug-bash (Mailing list). GNU Project. Retrieved 8 August 2025.
- ^ "Bash-4.2 available for FTP". bug-bash (Mailing list). GNU Project. Retrieved 8 August 2025.
- ^ "User Environment Feature Changes". oracle.com. Oracle Corporation. Archived from the original on 12 June 2018. Retrieved 8 August 2025.
- ^ "Bash-4.3 available for FTP". bug-bash (Mailing list). GNU Project. Retrieved 8 August 2025.
- ^
"CVE-2014-6271". cve.org. Retrieved 8 August 2025.
GNU Bash through 4.3 processes trailing strings after function definitions in the values of environment variables, which allows remote attackers to execute arbitrary code via a crafted environment, as demonstrated by vectors involving the ForceCommand feature in OpenSSH sshd, the mod_cgi and mod_cgid modules in the Apache HTTP Server, scripts executed by unspecified DHCP clients, and other situations in which setting the environment occurs across a privilege boundary from Bash execution, aka "ShellShock."
- ^ "CVE-2014-7169". cve.org. Retrieved 8 August 2025.
- ^ "Bash 3.0 Official Patch 1". bug-bash (Mailing list). GNU Project. Retrieved 8 August 2025.
- ^ Briegel, Armin (5 June 2019). "Moving to zsh". scriptingosx.com. Retrieved 8 August 2025.
- ^ "Apple Support – Use zsh as the default shell on your Mac". apple.com. Archived from the original on 2 December 2019. Retrieved 8 August 2025.
- ^
Warren, Tom (4 June 2019). "Apple replaces bash with zsh as the default shell in macOS Catalina". theverge.com. The Verge. Archived from the original on 10 June 2019. Retrieved 8 August 2025.
The bash binary bundled with macOS has been stuck on version 3.2 for a long time now. Bash v4 was released in 2009 and bash v5 in January 2019. The reason Apple has not switched to these newer versions is that they are licensed with GPL v3. Bash v3 is still GPL v2.
- ^ Hughes, Matthew (4 June 2019). "Why does macOS Catalina use Zsh instead of Bash? Licensing". The Next Web. Archived from the original on 31 December 2020. Retrieved 8 August 2025.
- ^ "Bash-5.0 release available". bug-bash (Mailing list). GNU Project. Retrieved 8 August 2025.
- ^ "Bash-5.1 release available". bug-bash (Mailing list). GNU Project. Retrieved 8 August 2025.
- ^ "shell.c", Free Software Foundation, Inc., GNU Project, 29 August 1996, archived from the original on 28 September 2018, retrieved 8 August 2025
- ^ "Command-line shell". archlinux.org. 5 February 2025. Retrieved 8 August 2025.
- ^ Hanny Helal (7 January 2015). "Gtkdialog – Create Graphical (GTK+) Interfaces and Dialog Boxes Using Shell Scripts in Linux". tecmint.com. Tecmint. Retrieved 18 August 2025.
- ^ a b Kristian Kißling (2009). "Adding graphic elements to your scripts with Zenity and KDialog". linux-magazine.com. Linux Magazine. Issue 99. Retrieved 18 August 2025.
- ^ "Bash Shell Scripting/Whiptail". wikibooks.org. WikiMedia. Retrieved 18 August 2025.
- ^ Dmitri Popov (12 March 2012). "Dress Up Bash Scripts with YAD". linux-magazine.com. Linux Magazine. Retrieved 18 August 2025.
- ^ Pete Metcalfe (2021). "Create GUI dialogs in one line of code". linux-magazine.com. Linux Magazine. Issue 247. Retrieved 18 August 2025.
Bash (Unix shell)
View on Grokipedia
Fundamentals
Definition and Purpose
Bash, or the Bourne-Again SHell, is a free software Unix shell and command language interpreter developed by the GNU Project as a replacement for the original Bourne shell (sh).[1] It serves as the default shell for the GNU operating system, providing an enhanced, POSIX-compatible environment for executing commands and scripts.[6] The primary purpose of Bash is to interpret commands entered interactively by users or read from script files, enabling a command-line interface (CLI) for system administration, task automation, and shell programming.[7] It allows users to combine GNU utilities through scripting features, facilitating efficient control over processes, file operations, and system resources.[1] In both interactive and non-interactive modes, Bash processes input from standard input or files, supporting synchronous or asynchronous command execution with redirection capabilities.[8]
Key characteristics of Bash include its compliance with the IEEE POSIX Shell and Utilities standard (IEEE 1003.1), augmented by extensions for advanced functionality, and its integration with the GNU Readline library for command-line editing and history management.[1] Unlike the more limited Bourne shell, Bash offers greater extensibility through features borrowed from other shells like ksh and csh, making it suitable for complex scripting.[7] Bash's basic workflow involves reading user or script input, splitting it into tokens (words and operators), performing expansions such as parameter and filename substitution, parsing the command structure, and executing the resulting commands while managing input/output streams and process exit statuses.[9]
Bash is the default shell on most Linux distributions, including Red Hat Enterprise Linux and Ubuntu, due to its widespread adoption and compatibility.[10][11] On macOS, it served as the default shell until 2019, when Apple transitioned to zsh starting with macOS Catalina.[12] This popularity underscores Bash's role as a versatile, extensible alternative to the POSIX-minimal sh, balancing standards adherence with practical enhancements for everyday use.[7]
History and Development
Bash was developed in 1989 by Brian Fox as part of the GNU Project, aimed at providing a free and open-source implementation of the POSIX shell standard to replace the Bourne shell (sh), which was restricted by licensing constraints.[13] The initial release occurred on June 8, 1989, marking the beginning of Bash as the default shell for the GNU operating system. Fox, the first employee of the Free Software Foundation, designed Bash to be compatible with sh while extending its capabilities for interactive use and scripting.[14] The primary maintenance transitioned to Chet Ramey in 1992, who has served as the lead developer and maintainer since then, overseeing evolution through bug fixes, feature additions, and compliance updates as of 2025.[13]
Bash draws its core syntax from the Bourne shell but incorporates interactive features from the C shell (csh), such as command history and editing, and advanced scripting elements from the Korn shell (ksh), including job control and arrays.[13] This blend made Bash highly versatile, leading to its widespread adoption as the default login shell in GNU/Linux distributions during the 1990s and as the standard shell in macOS from its early versions until macOS Catalina in 2019, when it was replaced by zsh due to licensing and feature considerations.[15][16]
Bash's development is managed under the GNU Project through the Savannah hosting platform, where contributions are coordinated via mailing lists and version control, with releases focusing on POSIX compliance, security enhancements, and new functionalities like coprocesses introduced in version 4.0.[17] Updates are distributed via official tarballs, incorporating community-reported fixes and standards alignment, ensuring portability across Unix-like systems.[13]
| Version | Release Date | Major Additions |
|---|---|---|
| 2.0 | December 23, 1996 | Improved readline integration for command-line editing, unlimited history.[18] |
| 3.0 | July 27, 2004 | Indexed arrays, programmable command completion.[18] |
| 4.0 | February 23, 2009 | Associative arrays, coprocesses, enhanced debugging.[18] |
| 5.0 | January 7, 2019 | New shell variables (EPOCHSECONDS, EPOCHREALTIME), nameref improvements.[18] [19] |
| 5.1 | December 7, 2020 | PROMPT_COMMAND as array, SRANDOM variable, wait -p option.[18] |
| 5.2 | September 26, 2022 | Security fixes for vulnerabilities, minor scripting improvements.[18] |
| 5.3 | July 5, 2025 | New command substitution syntax, glob sorting options, enhanced error reporting.[20] [21] |
Shell Environment
Startup and Configuration Files
When Bash starts, it reads specific configuration files to initialize the shell environment, with the sequence depending on whether the shell is a login shell, an interactive non-login shell, or non-interactive. These files allow system administrators and users to set environment variables, define aliases, and configure shell behavior. The process ensures that global settings are applied before user-specific customizations, promoting consistency across sessions.[22]
For interactive login shells, which occur when a user logs in via a terminal or remote session, Bash first sources the system-wide /etc/profile file if it exists and is readable. This file typically sets global environment variables such as PATH and applies defaults such as umask. Next, Bash attempts to source one of the user-specific profile files in this order: ~/.bash_profile, ~/.bash_login, or ~/.profile, stopping at the first readable file. The ~/.bash_profile is preferred for Bash-specific settings, while ~/.profile provides compatibility with other POSIX shells. These files often export variables like PATH (e.g., export PATH="$PATH:/usr/local/bin") and set permissions with umask 022 to control default file creation modes.[22]
Interactive non-login shells, common in graphical terminal emulators, source different files to focus on session-specific configurations. Bash first checks for and sources /etc/bash.bashrc if it exists, a system-wide file for interactive settings on some distributions. Then, it sources the user-specific ~/.bashrc, which is ideal for defining aliases, shell functions, and prompt customizations without affecting login environments. To ensure consistency, ~/.bash_profile in login shells often includes a conditional statement to source ~/.bashrc, such as:
if [[ -n "$PS1" ]]; then
if [ -f ~/.bashrc ]; then . ~/.bashrc; fi
fi
if [[ -n "$PS1" ]]; then
if [ -f ~/.bashrc ]; then . ~/.bashrc; fi
fi
This conditional checks the $PS1 variable, which is set for interactive shells, to detect and load interactive configurations only when appropriate.[22]
Non-interactive shells, such as those running scripts, do not source startup files by default to avoid unnecessary overhead. However, if the BASH_ENV environment variable is set to a filename, Bash sources that file before executing the script, allowing scripted environments to be customized.[22]
The sourcing order and file availability can vary across systems. On Linux distributions like those using systemd, graphical terminals typically launch non-login interactive shells, directly sourcing ~/.bashrc. In contrast, macOS Terminal.app defaults to login shells, prioritizing ~/.bash_profile and requiring explicit sourcing of ~/.bashrc for interactive features; this stems from macOS's BSD heritage influencing shell invocation. The /etc/bash.bashrc file, for instance, is commonly present on Debian-based Linux systems but absent or unused on macOS.[22]
Bash invocation options allow overriding this behavior. The --login flag forces login shell processing, sourcing profile files regardless of context. Conversely, --noprofile skips /etc/profile and user profile files, while --norc prevents sourcing of /etc/bash.bashrc and ~/.bashrc, useful for clean script execution or debugging. If a file exists but cannot be read, Bash reports an error (unless invoked with --norc or similar). These mechanisms enable precise control over environment initialization.[23]
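For example (the file names are illustrative), an interactive shell can be started without any user customizations, and a non-interactive run can be pointed at a startup file via BASH_ENV:

$ bash --norc                        # interactive shell, skip /etc/bash.bashrc and ~/.bashrc
$ bash --noprofile --login           # login shell, skip /etc/profile and the user profile files
$ BASH_ENV=./env.sh bash ./job.sh    # non-interactive: source env.sh before running job.sh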
Environment Variables and Parameters
In Bash, shell parameters encompass both named variables, which store values assigned via simple statements like name=value, and special parameters that provide predefined information about the shell's state or execution context.[24] Variables are created and modified through assignment, and their attributes—such as locality, immutability, or exportability—can be specified using built-in commands like declare, local, readonly, or export.[24] These mechanisms allow precise control over data storage and accessibility within scripts and interactive sessions.
Variable declaration in Bash supports scoping and inheritance attributes. By default, variables are global, meaning they are visible throughout the shell's execution environment unless overridden.[24] The local keyword, used within functions, declares a variable that is confined to the function's scope and its child processes, preventing interference with outer variables of the same name through dynamic scoping.[25] The readonly attribute renders a variable immutable, prohibiting subsequent assignments or unset operations once set, which is useful for defining constants like configuration flags.[26] Exporting a variable with export or declare -x marks it for inheritance by child processes, ensuring it becomes part of the environment passed to executed commands or subshells.[27]
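A short sketch of these attributes in combination; the names are illustrative:

#!/bin/bash
readonly MAX_RETRIES=3            # immutable for the rest of the script
export CONFIG_DIR="/etc/myapp"    # inherited by child processes

greet() {
    local name="${1:-world}"      # visible only inside greet and its children
    echo "hello, $name"
}

greet "bash"                      # prints: hello, bash
echo "$MAX_RETRIES"               # prints: 3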
Special parameters in Bash provide read-only access to runtime information without explicit declaration. The parameter $0 holds the name of the shell or the invoking script.[28] Positional parameters $1 through $9 capture the first nine command-line arguments passed to the script or function, with higher numbers accessible via shift or indirect reference.[28] The parameter $# reports the number of positional parameters, while $? yields the exit status (0 for success, non-zero for failure) of the most recent command.[28] The parameters $@ and $* represent all positional parameters, with $@ treating them as separate words (especially when quoted) and $* as a single word; $PPID gives the process ID of the shell's parent, and $! the process ID of the most recent background job.[28] These parameters are essential for scripting logic, such as error handling or argument processing.
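A hypothetical script illustrating the common special parameters described above:

#!/bin/bash
# args.sh (illustrative): print the most common special parameters
echo "script name: $0"
echo "first arg:   ${1-<none>}"
echo "arg count:   $#"
ls / > /dev/null
echo "last status: $?"
printf 'one arg:     <%s>\n' "$@"
sleep 1 &
echo "last bg PID: $!"
echo "parent PID:  $PPID"
wait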
The shell's environment consists of exported variables inherited from the parent process upon invocation, forming an array of name=value pairs passed to child processes.[29] Only exported variables are included in this inheritance; non-exported ones remain local to the current shell.[30] The env utility allows viewing or modifying this environment before invoking a command, such as by setting temporary variables or clearing the environment entirely with env -i.[31] Unsetting a variable with the unset command removes it from the current shell and, if exported, from the environment passed to future children; however, unsetting critical variables like PATH or HOME can disrupt command resolution or user directory access, respectively.[27]
Bash supports internationalization through locale-related environment variables, which influence behavior like message formatting and collation. The LANG variable sets the default locale category, while LC_* variables (e.g., LC_MESSAGES for message language, LC_COLLATE for sorting order) override specific aspects; LC_ALL takes precedence to set all categories uniformly.[32] These variables, when exported, ensure child processes adhere to the desired locale settings for consistent output across diverse systems.[32]
Standard Input, Output, and Error Streams
Bash employs three primary streams for handling input and output during command execution, adhering to Unix conventions: standard input (stdin), linked to file descriptor 0; standard output (stdout), file descriptor 1; and standard error (stderr), file descriptor 2. Stdin provides the default source of data for commands requiring input, such as reading user prompts or file contents, while stdout captures normal program output, like results or logs, and stderr directs error messages and diagnostics to ensure they remain distinguishable from regular output. These streams enable seamless integration in pipelines, where the stdout of one command feeds into the stdin of the next.[33]
In interactive Bash sessions, stdin defaults to the keyboard or terminal device, permitting direct user input, whereas both stdout and stderr are routed to the terminal for immediate display, facilitating real-time feedback during command execution. This setup supports typical interactive workflows, such as entering commands and viewing responses on the console.[9]
For non-interactive executions, such as in scripts or background processes, stdin is commonly derived from the script file itself, an argument-supplied source, or /dev/null if no explicit input is provided, while stdout may connect to a pipe, file, or inheriting process, and stderr typically remains directed to the terminal unless redirected. This configuration allows scripts to process batch data without user intervention, with output potentially captured for further processing or logging.[9]
Basic manipulation of these streams is achieved through redirection operators; for example, command > file diverts stdout to a specified file (overwriting if it exists), and command 2> error.log sends stderr to a separate file for error isolation (full redirection syntax is covered in the Command Execution section).[33]
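A few console lines illustrating the default wiring and the basic redirections described above (file names are illustrative):

$ ls /etc /nonexistent > listing.txt 2> errors.txt   # stdout and stderr to separate files
$ ls /etc | wc -l                                    # stdout of ls becomes stdin of wc
$ wc -l < /etc/passwd                                # stdin taken from a file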
Here documents provide a mechanism to supply multi-line content directly to a command's stdin using the << operator, enabling inline data or scripts without external files. The syntax reads lines from the current input until encountering a delimiter, as in:
cat << EOF
Line one of input.
Line two with variables expanded if quoted differently.
EOF
Stream handling can also be combined with the exec builtin; for example, exec 3>&1 duplicates the current stdout (fd 1) onto a new fd 3 for independent access later in the session without altering the original. This technique supports complex I/O routing while preserving default behaviors.[33]
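A sketch of this exec-based redirection, saving and later restoring stdout; the log file path is illustrative:

exec 3>&1                  # save the current stdout on fd 3
exec 1>/tmp/session.log    # send stdout to a log file
echo "this line goes to the log"
exec 1>&3                  # restore the original stdout
exec 3>&-                  # close the spare descriptor
echo "back on the terminal"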
Syntax Basics
Tokens, Words, and Basic Syntax
In Bash, the fundamental units of syntax are tokens, which consist of words and operators derived from the shell's input stream. A token is defined as a sequence of characters treated as a single unit by the shell, encompassing either a word or an operator.[35] The shell reads input—whether from a terminal, script file, or command string—and breaks it into these tokens by identifying metacharacters that separate them. Metacharacters include space, tab, newline, and unquoted symbols such as |, &, ;, (, ), <, >, and others like *, ?, [, which have special meanings unless quoted.[35][36]
Words form the core of commands and arguments, representing sequences of non-metacharacter characters or quoted metacharacters that the shell treats as cohesive units. For instance, in the command ls -l, the shell tokenizes it into two words: ls (the command) and -l (an argument), separated by whitespace.[37] Operators, on the other hand, are tokens containing one or more unquoted metacharacters that control command execution, such as the pipe | for connecting commands or > for output redirection. These operators enable constructs like pipelines (cmd1 | cmd2) and command lists separated by ;, &, or newlines, without requiring semicolons for individual commands.[35][36]
Basic syntax in Bash follows a structure where a simple command comprises an optional list of variable assignments, followed by words (the command name and its arguments), and optional redirections or other operators. Commands are executed sequentially unless modified by operators; for example, echo hello > file.txt directs output to a file using the redirection operator. Reserved words, a subset of words with syntactic significance, include flow-control terms like if, then, else, fi, for, do, done, while, until, case, esac, and {, }, which must appear unquoted in specific grammatical contexts to trigger their special behavior, distinguishing them from ordinary commands or built-ins like alias.[36][38]
The parsing process begins with initial tokenization, where the shell divides input into words and operators while discarding comments (lines or parts starting with #). This precedes further phases, including expansions (handled separately) and command execution, ensuring that metacharacters like * or ? are recognized only if unquoted during token formation. For a command like cat file | grep error, tokenization yields the words cat, file, grep, error and the operator |, forming a pipeline structure.[39][37][36]
Quoting and Escaping
In Bash, quoting mechanisms prevent the shell from interpreting metacharacters, expansions, and special constructs, thereby preserving the literal value of characters or words in commands. These include the backslash escape character, single quotes, double quotes, and ANSI-C quoting forms. Quoting is essential for handling spaces, variables, and special symbols without unintended splitting or substitution.[40] Single quotes ('...') treat all enclosed characters literally, suppressing all forms of expansion, including parameter expansion, command substitution, arithmetic expansion, and process substitution. No special characters, such as the dollar sign ($), backtick (`), or backslash (\), retain their meaning inside single quotes. For instance, the command echo '$HOME is $PATH' outputs the literal string $HOME is $PATH rather than expanding the variables. To embed a single quote within a single-quoted string, the quote must be closed, the literal single quote escaped with a backslash, and then reopened, as in echo 'It'\''s a test', which outputs It's a test.[40]
Double quotes ("...") preserve the literal value of most characters while permitting limited expansions, such as parameter and variable expansion ($var), command substitution ($(command) or `command`), arithmetic expansion ($(())), and history expansion (!). However, they prevent word splitting and pathname expansion (globbing) on the expanded results, treating the content as a single word. Special characters like the backslash (\) inside double quotes escape only the dollar sign, backtick, double quote, backslash, and newline. For example, echo "The date is $(date)" expands the command substitution to output the current date, while echo "It's a test" treats the apostrophe literally without affecting the expansion. This makes double quotes suitable for mixed literal and expanded content, such as echo "User: $USER, home: $HOME".[40]
The backslash (\) serves as an escape character when unquoted, preserving the literal value of the immediately following character, including metacharacters like $, `, \, or spaces. Inside double quotes, it escapes only specific characters: $, `, ", \, and newline (which continues the line). Unquoted backslashes at the end of a line are removed after quoting rules are applied. For instance, echo \$HOME outputs $HOME literally, and echo "Path: \$PATH" outputs Path: $PATH. Backslashes do not escape characters within single quotes, where they are treated literally.[40]
ANSI-C quoting provides advanced literal preservation with escape interpretation. The form $'...' treats the content as a single-quoted string but interprets backslash-escaped sequences according to the ANSI C standard, such as \n for newline, \t for tab, \r for carriage return, \\ for backslash, octal escapes (\nnn), hexadecimal (\xHH), and Unicode (\uHHHH or \UHHHHHHHH). For example, echo $'Hello\nWorld' outputs Hello followed by a newline and World. This form is useful for embedding control characters without external tools. Separately, $"..." enables locale-specific translation, expanding the string to its translated equivalent if a matching message catalog entry exists, while otherwise behaving like double quotes. For instance, echo $"Hello, world" might output a localized greeting based on the current locale.[41]
Nested quoting combines these mechanisms to handle complex strings. Double quotes can enclose apostrophes and expansions together, as in echo "It's $(date): $USER", which outputs It's [current date]: [username]; the apostrophe needs no escaping inside double quotes, while the command substitution and parameter expansion still occur. Single quotes cannot directly nest within themselves but can be simulated via the escape technique mentioned earlier.[40]
A common pitfall arises from omitting quotes around variable expansions, which triggers word splitting on the results using the internal field separator ($IFS, defaulting to space, tab, and newline). For example, if files="a b c", the unquoted loop for f in $files; do echo "$f"; done splits into three iterations (a, b, c), potentially causing errors with filenames containing spaces. Quoting as for f in "$files"; do echo "$f"; done treats the value as a single item, preserving spaces and avoiding unintended splitting. This issue is particularly problematic in scripts handling user input or dynamic data.[42]
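A brief illustration, using a hypothetical filename that contains a space:

name="my file.txt"
ls $name      # unquoted: word splitting passes two arguments, "my" and "file.txt"
ls "$name"    # quoted: passes a single argument, "my file.txt"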
Types of Expansions
Bash performs several types of expansions on the words in a command line after tokenization but before execution, transforming the input into the actual arguments passed to commands. The precise order of these expansions is crucial for predictable behavior, as they are applied sequentially from left to right within words. This sequence ensures that later expansions operate on the results of earlier ones, and it is defined in the GNU Bash Reference Manual as: brace expansion first, followed by tilde expansion, parameter and variable expansion, arithmetic expansion, and command substitution (all in a left-to-right fashion); then word splitting; pathname expansion; and finally quote removal.[43]

Brace expansion generates multiple strings from a pattern enclosed in curly braces, such as {a,b} producing a and b, or {1..3} yielding 1 2 3; it is a Bash-specific extension not present in the POSIX standard shell. Tilde expansion replaces ~ at the start of a word with the home directory path, either the current user's (~) or a specified user's (~username), defaulting to the value of the $HOME environment variable if unset. Parameter and variable expansion substitutes the value of variables or parameters, using forms like $var, ${var}, or special parameters like $? for the exit status of the last command. Arithmetic expansion evaluates integer expressions within $(( )), performing calculations such as $((2 + 3)) resulting in 5. Command substitution executes a command and replaces $(command) or the older `command` with its standard output, trimming trailing newlines.
Following these initial expansions, word splitting divides the resulting words into fields using the Internal Field Separator (IFS), which defaults to space, tab, and newline, though unquoted expansions like $var trigger this while quoted ones do not. Pathname expansion, also known as globbing, matches patterns like *, ?, or [abc] against existing filenames in the current directory, replacing the pattern with a list of matching paths; if no matches are found, the original word is retained unless the nullglob option is set. Finally, quote removal strips unescaped double quotes, single quotes, and backslashes from the words after all other expansions, ensuring the final arguments are clean.[43]
Expansions occur after the shell has tokenized the input into words and operators but before any command execution, allowing the shell to interpret dynamic content like variables within the command line. Nested expansions are supported and processed from innermost to outermost; for example, echo $(echo ${HOSTNAME}) first expands ${HOSTNAME} to the machine's name during the inner command substitution, then uses that output in the outer one. To disable specific expansions, quoting prevents most types—such as double quotes around $var inhibiting word splitting and pathname expansion—while the set -f option (or set -o noglob) globally turns off pathname expansion without affecting other types. The set -u option can treat unset variables as errors during parameter expansion.[43]
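The following interactive commands illustrate several expansion types in turn (outputs depend on the environment):

echo file{1,2}.txt    # brace expansion: file1.txt file2.txt
echo ~                # tilde expansion: the current user's home directory
echo "$USER"          # parameter expansion
echo $((7 * 6))       # arithmetic expansion: 42
echo "$(date +%Y)"    # command substitution
echo *.txt            # pathname expansion against files in the current directory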
The order of expansions in Bash closely aligns with the standardization in IEEE Std 1003.1-2017 (POSIX.1), which specifies tilde, parameter, arithmetic, command substitution, word splitting, pathname expansion, and quote removal, but Bash prepends brace expansion as an extension to enhance scripting flexibility. This POSIX foundation ensures portability across Unix-like systems, though Bash's additions like brace expansion and extended tilde support provide advanced features beyond the base standard.[44][43]
Command Execution
Command Lookup and PATH
Bash performs command lookup by searching for the specified command name in a specific order to determine how to execute it. If the command name contains no slashes, Bash first checks for an alias matching the name. If no alias is found, it then looks for a shell function by that name. Next, it searches for a built-in command. If the command is not a built-in, Bash consults its internal hash table for a cached full pathname of an executable file. If the command is not found in the hash table, Bash searches the directories listed in the PATH environment variable in sequence until it locates an executable file matching the name; the first match is executed. If no match is found after exhausting these steps, Bash typically reports an error, unless a command_not_found_handle function is defined to handle the case.[45]
The PATH environment variable is a colon-separated list of directory paths that Bash uses to locate external executable files when the command name does not include slashes. For example, a PATH value like /usr/local/bin:/usr/bin:/bin directs Bash to search first in /usr/local/bin, then /usr/bin, and finally /bin, executing the first executable found with the matching name. This mechanism allows users to run programs without specifying their full paths, but the order of directories determines execution priority.[45]
To optimize repeated lookups, Bash maintains an internal hash table that caches the full pathnames of previously executed external commands. Upon successful execution of an external command, its full path is automatically added to the hash table. Before searching PATH, Bash checks this table; if an entry exists, it uses the cached path directly, avoiding a full directory traversal. The hash built-in command manages this table: hash without arguments lists its contents, hash -p /full/path command associates a specific path with a command name, and hash -r clears all entries, forcing future lookups to re-search PATH. This caching improves performance in interactive sessions or scripts with frequent command invocations.
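A short interactive sketch of the lookup cache:

type ls        # reports how ls would be resolved (alias, function, builtin, or file)
ls /tmp > /dev/null
hash           # lists cached pathnames with hit counts
hash -r        # clears the cache, forcing a fresh PATH search on the next lookup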
Commands with slashes in their names bypass the standard lookup order and are treated as file paths. An absolute path, such as /bin/ls, specifies the exact location and is executed directly if the file exists and is executable. A relative path, like ./script or subdir/command, is resolved starting from the current working directory. These paths do not consult aliases, functions, builtins, the hash table, or PATH, providing a way to invoke scripts or binaries outside the standard search mechanism.[45]
Modifying PATH insecurely, such as prepending user-writable directories like the current directory (.), can introduce security risks. An attacker with write access to those directories could place a malicious executable with the same name as a system command, causing Bash to execute the trojan instead of the legitimate program during lookup. This path interception technique has been documented as a persistence and privilege escalation vector in Unix-like systems.[46]
To locate executables without executing them, users can employ the which and whereis utilities. The which command, often implemented as a Bash built-in or external program, searches PATH and returns the full path of the first matching executable; for instance, which ls might output /bin/ls. The whereis command, a separate utility, searches a predefined set of standard directories (including PATH, manual pages, and source paths) and reports locations of the binary, source files, and manual pages for the command, such as whereis gcc showing /usr/bin/gcc /usr/share/man/man1/gcc.1.gz. These tools aid in debugging PATH issues or verifying command installations.
Built-in Commands
Built-in commands in Bash are commands implemented directly within the shell's binary, rather than as standalone executable programs in the file system. This internal implementation allows them to execute more rapidly, as they avoid the overhead of forking a new process and performing an exec system call that external commands require. Additionally, built-ins have direct access to the shell's internal state, enabling operations that manipulate the environment or control flow in ways that would be inefficient or impossible with external utilities.[1]

Common built-in commands handle essential tasks such as directory navigation, input/output operations, conditional testing, and signal management. For instance, cd changes the current working directory and updates the shell's internal notion of the current path, while pwd prints the current working directory by accessing the shell's state directly. Output commands include echo, which writes its arguments to standard output with options for newline suppression (-n) and escape sequence interpretation (-e), and printf, which provides formatted output based on a format string, supporting Bash-specific specifiers like %q for quoted strings and %(datefmt)T for timestamps. Input is managed by read, which reads a line from standard input into shell variables, and mapfile (also known as readarray), which loads lines into an array. Conditional evaluation uses test (or its synonym [) for basic tests like file existence or string comparisons, and the Bash-specific [[ for extended tests including pattern matching and arithmetic without forking. Signal handling is provided by trap, which specifies commands to execute upon receipt of signals. Resource usage is reported by times, which displays the user and system times accumulated by the shell and its child processes. File inclusion is achieved with source (or its synonym .), which executes commands from a specified file in the current execution environment, a feature particularly useful for loading configuration or functions.[1]
In contrast to external commands, Bash built-ins like true (which always returns an exit status of 0) and false (which returns a non-zero status) execute instantaneously without process creation overhead, making them preferable in scripts for control flow where speed and shell integration matter. External versions of these, such as /bin/true, incur unnecessary costs and do not interact as seamlessly with the shell's state. Built-ins can be temporarily disabled using the enable -n command, which removes them from the shell's command lookup, allowing external commands with the same name to take precedence; they can be re-enabled with enable name. To list all built-ins, the enable -p or help commands display them, providing a way to inspect available internals.[1]
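For example, disabling a builtin exposes the external command of the same name (the reported path varies by system):

type kill        # kill is a shell builtin
enable -n kill   # disable the builtin
type kill        # now resolves to an external executable such as /bin/kill
enable kill      # re-enable the builtin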
The following table groups Bash's built-in commands by primary function, with brief descriptions; this is not an exhaustive list of all options but representative of core capabilities:
| Category | Built-in Commands | Description |
|---|---|---|
| Directory Management | cd, pwd | cd changes the working directory; pwd prints it. |
| Input/Output | echo, printf, read, mapfile | echo and printf handle output; read and mapfile manage input to variables or arrays. |
| Conditionals | test, [, [[ | Evaluate expressions for file, string, or arithmetic conditions; [[ is Bash-enhanced. |
| Shell Control | source (.), trap, times, true, false | source includes files; trap manages signals; times reports usage; true/false set exit statuses. |
| Variable Management | declare, local, export | declare sets attributes; local scopes variables in functions; export makes them environment variables. |
| Job Control | bg, fg, jobs | Manage background jobs and foreground processes. |
| Other | alias, bind, history, type | alias defines shortcuts; bind configures key bindings; history views command history; type identifies command types. |
Bash-specific built-ins, such as mapfile for array input and enhanced printf formats, extend POSIX standards to provide more powerful scripting features directly within the shell.[1]
Redirections and File Descriptors
In Bash, redirections allow commands to read input from and write output to files or other sources, manipulating the three standard file descriptors: standard input (file descriptor 0), standard output (file descriptor 1), and standard error (file descriptor 2).[47] These operations enable flexible input/output (I/O) handling, such as saving command output to a file or providing input from a string.[47] File descriptors are non-negative integers representing open files or streams, with Bash supporting descriptors beyond the standard three for advanced use cases.[47]

Basic redirection operators include >, which redirects standard output to a file, truncating the file if it exists; >>, which appends standard output to a file without truncation; <, which redirects standard input from a file; and 2>, which redirects standard error to a file, truncating it.[47] For example, the command ls > output.txt writes the directory listing to output.txt, overwriting any existing content, while ls >> output.txt adds to the file.[47] Similarly, command < input.txt reads from input.txt as input, and command 2> errors.txt captures error messages separately.[47]
To target specific file descriptors, Bash uses numbered forms like n> filename for output or n< filename for input, where n is the descriptor number (defaulting to 1 for > and 0 for < if omitted).[47] Descriptors greater than 9 may conflict with shell internals, so lower numbers are preferred for custom use.[47] The &> operator redirects both standard output and standard error to a file, truncating it, while &>> appends both.[47]
File descriptor duplication merges or copies streams using forms like n>&m, which makes descriptor n a copy of output descriptor m, or n<&m for input.[47] A common example is 2>&1, which merges standard error into standard output, as in ls nonexistent > combined.txt 2>&1, sending both streams to the file.[47] Using - instead of a number, such as 2>&-, closes the descriptor.[47] Bash also provides special files like /dev/stdout and /dev/stderr for explicit duplication.[47]
Here documents supply multi-line input using <<word, where the shell reads until it encounters a line matching word exactly (with variable and command expansion if word is unquoted).[47] The <<-word variant strips leading tabs from the input lines and delimiter, aiding indented scripts, as in:
cat <<-EOF
	Line 1
	Line 2 (leading tabs are stripped by <<-)
	EOF
Process substitution allows a command's output or input to be treated as a file, using <(command) for input (e.g., diff <(sort file1) <(sort file2)) and >(command) for output (e.g., command >(grep filter)).[47] This enables treating processes like files in redirections.[47]
For persistent redirections in the current shell, exec modifies descriptors without starting a new process, such as exec 3>> logfile to open descriptor 3 for appending to a log file, or exec >outfile to redirect all subsequent output.[47]
Redirections are applied from left to right before the command executes, affecting the order of operations; for instance, command >file 2>&1 redirects both streams to file, but command 2>&1 >file redirects only standard output to the file, because standard error was duplicated to the original destination of standard output (usually the terminal) before the file redirection took effect.[47] This sequencing ensures predictable I/O behavior in complex pipelines.[47]
Control Flow
Conditional Constructs
Bash provides conditional constructs to enable decision-making in scripts based on command exit statuses, file properties, string comparisons, or arithmetic evaluations. These constructs include the if statement for linear conditional branching, the case statement for multi-way branching using pattern matching, and specialized test commands for evaluating conditions.[48]
The if statement evaluates a condition and executes commands accordingly. Its syntax is if test-commands; then consequent-commands; [elif more-test-commands; then more-consequents;] [else alternate-consequents;] fi, where test-commands are executed first; if they return a zero exit status, the consequent-commands follow. If not, any elif clauses are checked sequentially, and if none succeed, the else block executes if present. The overall return status is that of the last executed command or zero if no condition is true.[48]
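A compact example combining if, elif, and else (the tested path is illustrative):

if [[ -r /etc/hosts ]]; then
    echo "readable"
elif [[ -e /etc/hosts ]]; then
    echo "exists but is not readable"
else
    echo "missing"
fi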
Conditions in if statements typically use the test command, invoked as [ expression ] or its Bash extension [[ expression ]]. The [ ] form performs basic tests with word splitting and filename expansion enabled, supporting unary file operators like -f file (true if file exists and is a regular file) and binary operators like = for string equality or -eq for numeric equality.[49] For example, if [[ -f config.txt ]]; then echo "File exists"; fi checks for the existence of a regular file named config.txt.[49] String comparisons use == or = for equality and != for inequality, as in if [[ "$var" == "value" ]]; then ...; fi.[49]
The [[ ]] construct extends [ ] by disabling word splitting and glob expansion, allowing safer handling of variables, and adds features like pattern matching with == (glob-style) and regex matching with =~. For instance, [[ $string =~ ^[0-9]+$ ]] tests if $string consists entirely of digits using a POSIX regular expression; it returns 0 if true, 1 if false, and 2 if the regex is invalid.[48] The BASH_REMATCH array captures matches from =~, with index 0 holding the full match.[48] Numeric comparisons in [[ ]] use -eq, -ne, -lt, -le, -gt, and -ge.[49]
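A short sketch using =~ and BASH_REMATCH to split a value into its parts:

input="user42"
if [[ $input =~ ^([a-z]+)([0-9]+)$ ]]; then
    echo "word: ${BASH_REMATCH[1]}, number: ${BASH_REMATCH[2]}"   # word: user, number: 42
fi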
For arithmetic conditions, Bash uses the (( expression )) compound command as an alternative to test. It evaluates the arithmetic expression and returns 0 if the result is non-zero, or 1 if zero, enabling uses like if (( count > 0 )); then ...; fi.[48]
Conditions often rely on command exit codes, where 0 indicates success and non-zero indicates failure. The $? variable holds the exit status of the most recent command, as in command; if [[ $? -eq 0 ]]; then ...; fi. For pipelines, the PIPESTATUS array stores exit statuses of all commands in the most recently executed foreground pipeline, allowing checks like cmd1 | cmd2; if [[ ${PIPESTATUS[0]} -eq 0 && ${PIPESTATUS[1]} -eq 0 ]]; then ...; fi. Bash sets PIPESTATUS after pipelines, subshells, or certain compound commands.[50]
The case statement handles multiple conditions via pattern matching. Its syntax is case word in [patterns [| patterns]...) commands ;; ]... esac, where word (typically a variable) is matched against glob patterns; the first matching case's commands execute, using | for alternatives. Patterns support globs like * for default matching, and execution stops at ;; (or continues with ;& or ;;& in Bash 4.0+). For example:
case "$animal" in
horse|dog|cat) echo "four legs" ;;
*) echo "unknown" ;;
esac
case "$animal" in
horse|dog|cat) echo "four legs" ;;
*) echo "unknown" ;;
esac
The commands of the first branch execute if $animal matches "horse", "dog", or "cat"; otherwise the default * pattern matches. The return status is 0 if no patterns match, otherwise that of the last command in the executed branch.[48]
Looping Constructs
Bash provides several looping constructs to enable repetitive execution of commands in scripts, allowing automation of tasks such as processing lists of files or iterating over sequences of values. These include the traditional for loop for iterating over word lists, the while and until loops for condition-based repetition, an arithmetic for loop inspired by C syntax, and the select construct for creating interactive menus. Control over loop execution is managed through the break and continue builtins, which allow early termination or skipping of iterations.[51]

The standard for loop iterates over a list of words, assigning each to a variable for use within the loop body. Its syntax is for name [in words ...]; do commands; done, where the loop executes the commands for each word in the supplied list, binding the current word to the variable name. If the in words clause is omitted, the loop uses the script's positional parameters. For example, to iterate over all files in the current directory using glob expansion, one might write:
for i in *; do
echo "$i"
done
The arithmetic for loop, modeled on C, uses the syntax for ((expr1; expr2; expr3)); do commands; done. Here, expr1 initializes the loop (e.g., setting a counter), expr2 serves as the continuation condition (evaluated as non-zero to continue), and expr3 increments or updates after each iteration. Expressions use shell arithmetic evaluation. For instance, for ((i=1; i<=5; i++)); do echo $i; done outputs numbers 1 through 5. The return status is the last command's status or non-zero if any expression is invalid. This form is useful for precise control in computational scripts.[51]
The while loop repeats commands based on a condition, with syntax while test-commands; do consequent-commands; done. It executes the consequent-commands as long as the test-commands return a zero exit status (true). The until loop inverts this logic: until test-commands; do consequent-commands; done runs while the test returns non-zero (false), stopping when true. Both return the status of the last consequent-command or zero if none executed. These are ideal for loops dependent on external states, such as reading input until end-of-file.[51]
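Two brief sketches, one reading a file line by line with while and one counting with until (the file path is illustrative):

while IFS= read -r line; do
    echo "Read: $line"
done < /etc/hostname

i=0
until (( i >= 3 )); do
    echo "i is $i"
    (( i++ ))
done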
The select construct facilitates interactive menu selection, using syntax select name [in words ...]; do commands; done. It displays a numbered list of the words, prompts with the PS3 prompt (defaulting to "#? "), reads the user's choice into the REPLY variable, and executes commands with the selected word bound to name; an invalid selection sets name to null. The loop continues until a break is issued or end-of-file is read. This is commonly used for simple user-driven choices in scripts, as shown below. The return status matches the last command's or zero if none ran.[51]
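A minimal interactive menu built with select:

PS3="Choose a fruit: "
select fruit in apple banana quit; do
    case $fruit in
        apple|banana) echo "You picked $fruit" ;;
        quit) break ;;
        *) echo "Invalid selection: $REPLY" ;;
    esac
done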
To manage flow within loops, the break builtin exits the enclosing loop (or the nth enclosing loop if specified: break n), while continue skips to the next iteration (continue n resumes the nth enclosing loop). Both apply to for, while, until, and select loops, requiring n ≥ 1, and return zero unless n is invalid. For example, break 2 exits two enclosing loops. These provide essential control for handling exceptions or optimizations during iteration.[52]
Functions and Aliases
Bash provides mechanisms for users to define reusable code blocks through functions and aliases, enhancing script modularity and interactive efficiency. Functions allow for complex logic with argument handling and variable scoping, while aliases offer simple text substitutions primarily for interactive use. These features enable customization without altering core shell behavior.[25][53]

Functions in Bash are defined using one of two syntaxes: name() compound-command or function name [()] compound-command, where the compound-command is typically a braced list of commands, such as { echo "Hello"; }. For example, the following defines a function named greet that outputs a message:
greet() {
echo "Hello, world!"
}
Invoking greet executes the body as if the commands were directly entered in the shell.[25]
Within a function, arguments passed during invocation become available as positional parameters: $1 for the first argument, $2 for the second, and so on, with $# indicating the total number. For instance, if greet "Alice" is called, $1 inside the function holds "Alice". The function's exit status is determined by the last command executed or explicitly set using return n, where n is an integer from 0 to 255; a value of 0 indicates success.[25][24]
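A small sketch showing positional parameters and an explicit return status inside a function:

greet() {
    local name=${1:-world}
    echo "Hello, $name ($# argument(s) received)"
    return 0
}
greet Alice    # prints: Hello, Alice (1 argument(s) received)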
Variable scoping in functions supports locality through the local declaration, which creates variables visible only within the function and its subshells, preventing unintended modifications to the global environment. For example:
myfunc() {
local temp=42
echo $temp # Outputs 42
}
Without local, variables are dynamically scoped and shared with the calling environment. Functions can also be recursive, calling themselves; recursion depth is unconstrained by default but can be limited with the FUNCNEST variable, and is ultimately bounded by system stack size.[25][54]
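A recursive function is possible, as in this illustrative factorial, where each level uses a local variable and command substitution:

factorial() {
    local n=$1
    if (( n <= 1 )); then
        echo 1
    else
        echo $(( n * $(factorial $(( n - 1 ))) ))
    fi
}
factorial 5    # prints 120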
To make a function available in child processes or subshells, use export -f name, which marks it for inheritance similar to environment variables. This is essential for scripts that spawn subprocesses needing the function. For example:
export -f greet
bash -c 'greet' # Executes the function in a new shell
Aliases, in contrast, provide straightforward command shortcuts via the `alias` builtin, using the syntax `alias name=value`, where `value` is the expanded text. A common example is `alias ll='ls -l'`, which substitutes `ll` with `ls -l` upon invocation. Aliases are expanded during the shell's tokenization phase, before other expansions, but only in interactive shells or when `shopt -s expand_aliases` is enabled in scripts. They support recursive expansion if the value ends with a space.[](https://www.gnu.org/software/bash/manual/bash.html#Aliases)[](https://www.gnu.org/software/bash/manual/bash.html#Shell-Operation)
Aliases can be removed with `unalias name` or all at once with `unalias -a`. Listing active aliases is done via `alias` (without arguments) or `alias -p` for a printable format. However, aliases are limited to simple textual replacements and do not handle arguments or multi-line logic, making them unsuitable for anything beyond interactive conveniences like abbreviating common commands.[](https://www.gnu.org/software/bash/manual/bash.html#Aliases)
The primary distinction between functions and aliases lies in their capabilities: aliases suit quick, non-parameterized shortcuts in interactive sessions, whereas functions enable sophisticated scripting with arguments, control structures, and scoping for reusable code in both interactive and non-interactive contexts. For instance, an alias cannot process input like `$1`, but a function can implement conditional behavior based on arguments. This separation ensures aliases remain lightweight while functions provide full shell programming power.[](https://www.gnu.org/software/bash/manual/bash.html#Shell-Functions)[](https://www.gnu.org/software/bash/manual/bash.html#Aliases)
| Feature | Aliases | Functions |
|----------------------|----------------------------------|------------------------------------|
| Definition Syntax | `alias name=value` | `name() { commands; }` or `function name { commands; }` |
| Argument Handling | None | Positional parameters (`$1`, etc.) |
| Complexity | Simple text substitution | Supports logic, loops, conditionals |
| Scoping | Global replacement | Local variables via `local` |
| Exit Status Control | Inherits from substituted command | Explicit via `return n` (0-255) |
| Export to Subshells | Not applicable | Via `export -f` |
| Primary Use Case | Interactive shortcuts | Reusable script modules |
## Advanced Constructs
### Subshells and Process Management
In Bash, a subshell is a child process created as a copy of the current shell, allowing commands to execute in an isolated environment. Subshells are invoked by enclosing a list of commands within parentheses, such as `(command1; command2)`, which forces the shell to spawn a new process for their execution. This mechanism ensures that changes made within the subshell, such as variable assignments, do not persist in the parent shell after completion. For example:
```bash
( export FOO=bar; echo $FOO ) # Outputs 'bar' inside subshell
echo $FOO # Outputs nothing or previous value in parent shell
```
Subshells are also created implicitly for command substitution ($(command)) or asynchronous command execution, providing isolation for potentially disruptive operations. However, starting with Bash 5.3, a new form of command substitution allows execution in the current shell environment without forking a subshell. The syntax is ${c command; }, where c is a space, tab, newline, or |, and the closing brace follows a command terminator such as a semicolon. This applies side effects, like variable assignments, directly to the parent environment and captures the command's output (with trailing newlines removed). A variant, ${| command; }, sets the output to the local REPLY variable without expanding it in the substitution, preserving trailing newlines and leaving standard output unchanged.[55]
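A brief sketch of the first form described above (requires Bash 5.3 or later; the file path is illustrative):

first_line=${ head -n 1 /etc/os-release; }   # runs in the current shell without forking
echo "$first_line"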
Bash provides special parameters to access process identifiers, enabling scripts to track and manage processes. The parameter $$ expands to the process ID (PID) of the current shell, while $PPID yields the PID of the shell's parent process. The parameter $! returns the PID of the most recently started background process, facilitating reference to asynchronous tasks. These parameters are read-only and updated dynamically as processes are created or managed.
Background execution allows commands to run asynchronously without blocking the shell, initiated by appending an ampersand (&) to the command, such as command &. This launches the command in a subshell, returning control to the shell immediately while the background process continues. The shell reports the PID and a job specification upon starting the background command. To bring a background process to the foreground, the fg builtin can be used with the job specification or PID; conversely, bg resumes a suspended job in the background.
The wait builtin synchronizes script execution by pausing until specified background processes complete. Invoked as wait [PID], it returns the exit status of the waited process or 0 if no argument is provided (waiting for all background jobs). This is essential for ensuring dependent operations proceed only after asynchronous tasks finish, as in:
sleep 5 &
wait $!
echo "Background task completed"
The exec builtin replaces the current shell process with a specified command, without creating a new subshell, effectively terminating the shell upon invocation. When used as exec command, it overlays the command onto the shell's process image, inheriting all open file descriptors unless redirected. If no command is given, any redirections instead take effect in the current shell itself, which is useful for reassigning standard input/output in a running session.
In contrast to subshells, command grouping with curly braces { list; } executes commands within the current shell environment, avoiding the overhead and isolation of a new process. This form requires a semicolon before the closing brace and spaces around the braces for proper parsing, as in:
{ export FOO=bar; echo $FOO; } # 'bar' persists in current shell
echo $FOO # Outputs 'bar'
Pipelines and Logical Operators
Bash pipelines allow multiple commands to be connected in a linear fashion, where the standard output of one command serves as the standard input for the next. A pipeline consists of one or more commands separated by the pipe operator |, as in the syntax [time [-p]] [!] command1 [ | or |& command2 ] ....[56] The output of each command in the pipeline is directed to the input of the subsequent command, enabling data to flow sequentially through the chain without intermediate storage to disk.[56] For instance, the command grep pattern file | wc -l searches for a specified pattern in a file and pipes the matching lines to wc -l, which counts them, effectively reporting the number of matching lines.[56]
The pipe operator |& extends this by connecting both the standard output and standard error of the preceding command to the next, equivalent to appending 2>&1 | for redirection.[56] By default, the exit status of a pipeline is that of the last (rightmost) command, unless the pipeline is preceded by !, in which case the status is negated; a pipeline run asynchronously always returns a status of 0.[56] However, when the pipefail shell option is enabled via set -o pipefail, the exit status reflects the rightmost command that failed with a non-zero status, or 0 if all commands succeeded; this promotes robust error detection in scripts.[56]
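The effect of pipefail can be seen with a deliberately failing first command:

false | true; echo $?    # 0: status of the last command in the pipeline
set -o pipefail
false | true; echo $?    # 1: status of the rightmost failing command
set +o pipefail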
Logical operators facilitate conditional and sequential execution of pipelines or commands within lists, which are sequences separated by ;, &&, or ||.[57] The semicolon ; enforces sequential execution, running each command or pipeline one after the other, with the shell waiting for completion before proceeding, and the overall exit status being that of the final command.[57] The && operator (logical AND) executes the subsequent command only if the preceding one exits successfully (status 0), while || (logical OR) executes it only on failure (non-zero status); both associate left-to-right with equal precedence, and the list's exit status is that of the last executed command.[57] For example, command1 && command2 || command3 runs command2 if command1 succeeds, otherwise skips to command3.[57]
Commands or pipelines can be grouped for compound execution using parentheses () or braces {}. Parentheses execute the enclosed list in a separate environment, treating it as a single unit whose redirections apply to the group, with the exit status matching that of the list.[58] Braces execute the list in the current environment, requiring a trailing semicolon or newline before the closing brace, and also yield the list's exit status as the group's.[58] These groupings integrate with pipelines and lists, such as (cmd1 | cmd2); cmd3.[58]
Process substitution enhances pipelines by allowing a command's input or output to be treated as a file. The syntax <(list) provides the output of the list as a readable filename, while >(list) supplies input to the list via a writable filename; these expand during command processing and support asynchronous execution via named pipes or /dev/fd.[59] In pipelines, this enables flexible data routing, as in sort <(cmd1) | cmd2, where cmd1's output is sorted before piping to cmd2.[59] Redirections, such as those for files or devices, can be applied within pipelines but are handled per command unless grouped.[56]
Arrays and Data Structures
Bash supports arrays as a fundamental data structure for storing and manipulating collections of values under a single variable name, primarily through one-dimensional indexed arrays and associative arrays. Indexed arrays use integer indices starting from 0, while associative arrays, available since Bash 4.0, use string keys for mapping values.[60][61] These arrays enable efficient handling of lists, mappings, and dynamic data in scripts without relying on external tools.

Indexed arrays can be explicitly declared using the declare -a builtin, though declaration is often implicit through assignment. For example, elements are assigned with arrayname[index]=value, where the index is a non-negative integer (negative indices, such as -1 for the last element, are supported and count from the end). An array can also be initialized with multiple values using compound assignment: arrayname=(value1 value2 value3). To expand all elements, use ${arrayname[@]} or ${arrayname[*]}, where @ preserves word separation based on the IFS variable and * joins elements with IFS.[60]
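A short sketch of indexed-array usage:

fruits=(apple banana cherry)
fruits[3]=date
echo "${fruits[0]}"     # apple
echo "${fruits[-1]}"    # date (negative indices count from the end)
echo "${fruits[@]}"     # all elements
echo "${#fruits[@]}"    # number of elements: 4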
Associative arrays require explicit declaration with declare -A arrayname (Bash 4.0 and later) and use string keys for assignments like arrayname[key]=value. Initialization follows a similar compound format: declare -A assoc=( [key1]=value1 [key2]=value2 ). Keys must be non-empty strings, and retrieval uses ${arrayname[key]}. Like indexed arrays, expansion of all values employs ${arrayname[@]}.[60][61]
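A short sketch of an associative array (requires Bash 4.0 or later):

declare -A capitals=( [France]=Paris [Japan]=Tokyo )
capitals[Italy]=Rome
echo "${capitals[Japan]}"   # Tokyo
echo "${!capitals[@]}"      # all keys
echo "${#capitals[@]}"      # number of entries: 3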
Common operations on arrays include determining length, unsetting elements, and appending values. The number of elements is obtained with ${#arrayname[@]} (for both types), while the length of a specific element uses ${#arrayname[index]} or ${#arrayname[key]}. To remove an element, unset arrayname[index] or unset arrayname[key] is used; unsetting the entire array clears it with unset arrayname. Appending works via arrayname+=(value) for indexed arrays or arrayname+=( [key]=value ) for associative ones, automatically extending or adding as needed.[60]
Bash has no native multidimensional arrays; they are commonly simulated with compound subscripts used as string keys in an associative array, such as declare -A matrix; matrix[0,0]=value; echo "${matrix[0,0]}", or by computing a flat offset (for example, row*width+column) into an indexed array. In an indexed array, a subscript like 0,0 is evaluated arithmetically rather than treated as a key, so the associative form is preferred for this simulation. This approach lacks true nesting and requires careful index management, as shown below.[60]
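A minimal sketch of the associative-array simulation of a 2x2 matrix:

declare -A matrix
for (( i = 0; i < 2; i++ )); do
    for (( j = 0; j < 2; j++ )); do
        matrix[$i,$j]=$(( i * 10 + j ))
    done
done
echo "${matrix[1,1]}"   # prints 11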
Iteration over arrays is typically done with for loops. For values in an indexed array: for element in "${arrayname[@]}"; do echo "$element"; done. For associative arrays, keys are iterated with for key in "${!arrayname[@]}"; do echo "$key -> ${arrayname[$key]}"; done, where ${!arrayname[@]} expands to all indices or keys. Sparse arrays are supported, permitting non-contiguous indices without wasting space for gaps, but Bash provides no native support for true multidimensional arrays beyond this simulation.[60]
Interactive Features
Command History and Recall
Bash maintains a list of recently executed commands, known as the command history, which is available only in interactive shells. This feature allows users to recall, edit, and reuse previous commands efficiently, enhancing productivity in terminal sessions. By default, Bash enables command history and history expansion for interactive use, storing commands in both memory and a persistent file.[62]

The history list in memory is controlled by the HISTSIZE variable, which specifies the maximum number of commands to maintain, with a default value of 500. When the shell session ends, Bash writes the in-memory history to the history file, typically located at ~/.bash_history, overwriting it by default; if the histappend shell option is enabled via shopt -s histappend, the new entries are appended instead. The size of the history file is limited by the HISTFILESIZE variable, also defaulting to 500 lines; excess lines are truncated when writing. Users can customize the history file location with the HISTFILE variable. Commands can be excluded from history by starting them with a space (if ignorespace is set in HISTCONTROL) or by matching patterns in the colon-separated HISTIGNORE variable, such as HISTIGNORE="ls:cd" to ignore ls and cd commands. Additionally, HISTCONTROL can be set to ignoredups to suppress consecutive duplicates or erasedups to remove all prior instances of a command.[62]
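A typical ~/.bashrc fragment applying these settings (the values are illustrative):

HISTSIZE=5000
HISTFILESIZE=10000
HISTCONTROL=ignorespace:erasedups
HISTIGNORE="ls:cd:history"
shopt -s histappend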
To view or manipulate the history, Bash provides the built-in history command. Invoking history without arguments displays the entire history list, numbered sequentially starting from 1, while history n shows the most recent n commands. Options include history -c to clear the in-memory list, history -d n to delete the nth entry (supporting negative offsets from the end in Bash 5.3 and later), and history -a to append the current session's history to the file immediately. As of Bash 5.3, history -d also supports deleting ranges of entries with syntax like -d start-end. The fc built-in command facilitates editing and re-execution of historical commands; for example, fc -l lists history entries similar to history, fc n invokes the default editor on the nth command for modification before re-execution, and fc -s old=new substitutes old with new in the most recent command matching old and then executes it.[63][64]
History expansion, triggered by the ! character in interactive input, enables quick recall and modification of past commands without listing the full history. This expansion occurs after the command line is read but before it is split into words or executed, and it applies to each line individually. The ! must be quoted (e.g., with \!) if literal use is intended. Event designators specify which command to recall: !! refers to the previous command; !n selects the nth command by its history number; !-n selects the nth previous command (e.g., !-1 is equivalent to !!); !string matches the most recent command starting with string; and !?string matches the most recent containing string, optionally ending with ? for exact search. Word designators follow the event to select parts: ^ for the first argument, $ for the last, n for the nth word (starting from 0 for the command itself), * for all arguments, or x-y for a range. Modifiers alter the selected text, such as :p to print without executing, :s/old/new/ for single substitution, :gs/old/new/ for global substitution, :t to retain only the trailing filename component, :h for the head (directory), :r to remove the extension, and :e for the extension alone. For instance, !! re-executes the last command, !ls:1 inserts the first argument of the last ls command, and !!:s/foo/bar/ substitutes "foo" with "bar" in the previous command before running it. If no match is found, expansion fails and the command aborts unless modified with :p. History expansion is enabled by default in interactive shells and can be disabled with set +H or set +o histexpand.[65]
To share history across multiple terminal sessions in real-time, users can set the PROMPT_COMMAND variable to history -a; history -r, which appends the current session's new commands to the file and reads updates from other sessions after each prompt. Combined with shopt -s histappend, this ensures non-destructive updates without overwriting.[62]
Programmable Completion
Programmable completion in Bash allows users to customize the tab-completion behavior for commands and their arguments in interactive shells, enabling context-aware suggestions that enhance efficiency. This feature, enabled by default via the progcomp shell option, integrates with the Readline library to generate and display possible matches when the Tab key is pressed. By default, Bash attempts completion by first checking for any programmable specifications associated with the command; if none exist, it falls back to filename completion or other defaults like alias expansion. The complete -p command lists all current completion specifications, providing a way to inspect and reuse existing setups. As of Bash 5.3, command completion matches aliases and shell function names case-insensitively if the Readline variable completion-ignore-case is set.[66][67][64]
Customization occurs primarily through the complete builtin command, which defines completion specifications (compspecs) for specific commands or globally. For instance, complete -F func cmd associates a shell function func with the command cmd, where the function generates options dynamically. The compgen builtin aids in this process by generating lists of possible completions, such as compgen -f for filenames or compgen -A alias for aliases, which can be integrated into completion functions. These builtins support various options to refine behavior, including -o bashdefault to combine custom logic with Bash's defaults, -o nospace to prevent adding a space after completion, and -X filterpat to exclude patterns matching a glob.[68][67]
In programmable completion functions, Bash sets several shell variables to provide context, such as COMP_WORDS (an array of words in the current command line), COMP_CWORD (the index of the word containing the cursor), and COMP_LINE (the full command line). The function must populate the COMPREPLY array with matching strings—one per element—to supply the completions, which Readline then uses to display a menu, cycle through options, or insert the match. For example, a completion function for the cd builtin might use compgen -d -- "$cur" to suggest directories, incorporating tilde expansion and the $CDPATH variable for enhanced navigation, and bind it via complete -F _comp_cd cd. This approach allows for sophisticated logic, such as filtering based on previous arguments or integrating external data.[69][67]
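A minimal completion function for a hypothetical deploy command with three subcommands:

_deploy_completions() {
    local cur=${COMP_WORDS[COMP_CWORD]}
    if (( COMP_CWORD == 1 )); then
        COMPREPLY=( $(compgen -W "start stop status" -- "$cur") )
    else
        COMPREPLY=( $(compgen -f -- "$cur") )   # fall back to filename completion
    fi
}
complete -F _deploy_completions deploy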
The bash-completion project extends this capability system-wide by providing a collection of pre-written completion scripts for common commands, loaded automatically from /etc/bash_completion or /etc/bash_completion.d/. This framework, sourced in user profiles like ~/.bashrc, supports on-demand loading and per-user overrides in directories such as ~/.local/share/bash-completion/completions. A prominent example is the Git completion script, which offers detailed tab-completion for Git subcommands, branches, and options; it includes the __git_ps1 function for integrating repository status into the shell prompt, though the core completions are handled by a dedicated _git function bound via complete -F _git git. These scripts demonstrate how programmable completion scales to complex tools, reducing errors and speeding up workflows in development environments.[70][71]
Custom Prompts and Readline
Bash provides extensive customization options for interactive prompts through environment variables and integration with the GNU Readline library, allowing users to tailor the command-line interface for better usability and aesthetics. The primary prompt, displayed before each command in an interactive shell, is defined by the PS1 variable, which supports a variety of backslash-escaped special characters for dynamic content. For instance, \u inserts the current username, \w displays the current working directory (with tilde expansion for the home directory), and \h shows the hostname up to the first period. Bash 4.4 introduced the PS0 prompt string variable, which is expanded and displayed after a command line is read but before it is executed.[72][64]

Secondary prompts extend this customization for specific interactive scenarios. The PS2 variable controls the continuation prompt for multi-line commands, defaulting to "> " and appearing when the shell awaits further input, such as after an open quote or unclosed brace.[72] PS3 defines the prompt for the select built-in command in scripts, prompting for menu choices, while PS4 prefixes debug output when tracing is enabled with set -x, typically set to "+ " to indicate traced commands.[72] Like PS1, these variables interpret the same escape sequences, allowing consistent formatting across prompt types. In Bash 5.3, prompt expansion now quotes the results of the \U escape sequence.[72][64]
Colors and non-printing sequences enhance prompt readability on supported terminals. ANSI escape codes can be embedded within \[ and \] delimiters to apply formatting without affecting cursor positioning or line wrapping; for example, \[\e[32m\] sets green text, and \[\e[0m\] resets to default.[72] A common colored prompt is export PS1='\[\e[32m\]\u@\h:\w\$ \[\e[0m\]', displaying the username and host in green.[72] These escapes rely on the terminal's capabilities, determined by the TERM environment variable (e.g., "xterm-256color" for full color support); users should verify TERM before applying colors to avoid garbled output on basic terminals like "dumb".[50]
The GNU Readline library underpins Bash's command-line editing, providing programmable interfaces for input handling and key customization. Readline supports two primary editing modes: Emacs mode (default, with Ctrl-based shortcuts like Ctrl-A for beginning of line) and Vi mode (enabled via set -o vi or set editing-mode vi in ~/.inputrc), allowing users to switch between familiar keymaps for navigation and editing. As of Readline 8.3 (included with Bash 5.3), new bindable commands next-screen-line and previous-screen-line allow cursor movement by screen lines, and non-incremental Vi-mode searches (N, n) can use shell pattern matching via fnmatch(3) if available. New Readline variables include completion-display-width to set the number of columns used for displaying matches and menu-complete-display-prefix to show a common prefix before cycling through completions. Additionally, the export-completions command writes possible completions to stdout.[73][64]
Key bindings can be defined dynamically using the bind builtin, such as bind '"\C-x": kill-whole-line' to map Ctrl-X to a specific Readline function, or bind -x '"\C-x": some-command' to bind a key sequence to a shell command.[74]
Persistent customizations are managed through the ~/.inputrc file, which Readline reads on startup (falling back to /etc/inputrc if absent). This file supports variable assignments (e.g., set completion-ignore-case on for case-insensitive completion) and key bindings in the format keyseq: function-name, such as "\e[A": history-search-backward to enable upward arrow for prefix-based history navigation.[75] Bindings in ~/.inputrc apply globally to Readline-using applications, including Bash, and can be reloaded interactively with Ctrl-X Ctrl-R.[75] For Vi mode specifics, ~/.inputrc can include $if mode=vi conditionals to set insertion or command-mode bindings separately.[75]
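A short ~/.inputrc combining these elements might look as follows; the particular settings are illustrative choices, not Readline defaults:
# ~/.inputrc - read by Readline on startup
set completion-ignore-case on            # case-insensitive tab completion
set show-all-if-ambiguous on             # list all matches on the first Tab press
"\e[A": history-search-backward          # Up arrow searches history by prefix
"\e[B": history-search-forward           # Down arrow searches forward
$if mode=vi
    set keymap vi-command
    "k": history-search-backward         # prefix search in vi command mode
$endif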
Job Control and Signals
Job control in Bash provides mechanisms for managing multiple processes or jobs within an interactive shell session, allowing users to suspend, resume, and run processes in the background or foreground.[76] Each pipeline executed in the shell constitutes a single job, which Bash tracks with a unique job number starting from 1, displayed in brackets (e.g., [1]) alongside the process ID and status when listed.[76] This feature relies on support from the underlying operating system and terminal driver, enabling selective suspension and resumption of process execution. As of Bash 5.3, in interactive shells, job completion notifications are suppressed while sourcing scripts and printed during trap execution.[76][64] Jobs can be referenced using job specifications (jobspecs) such as %n for the job with number n (e.g., %1), %string for the job whose command begins with string (e.g., %ce for a job starting with "ce"), or %?string for any job containing string.[77] The jobs builtin command lists all active jobs, showing their numbers, PIDs, status (running, stopped, or done), and commands, with the current job marked by a + and the previous job by a -.[77] For example, running sleep 100 & followed by jobs might output [1]+ Running sleep 100 &.
To manage jobs, users can suspend a foreground job by pressing Ctrl+Z, which sends the SIGTSTP signal to stop its execution and return control to the shell.[76] The fg builtin then resumes the specified job (or the current one if none provided) in the foreground, blocking the shell until completion (e.g., fg %1).[77] Conversely, bg %1 resumes a stopped job in the background, allowing the shell to accept new commands while the job runs asynchronously.[77] These operations facilitate multitasking in interactive sessions without terminating processes. Since Bash 5.0, the wait builtin can wait for the last process substitution created and includes a -f option to wait until a job or process terminates rather than merely changes state.[77][64]
Bash handles signals to manage job lifecycle and interruptions, with interactive shells ignoring SIGTERM by default and catching SIGINT (generated by Ctrl+C) to interrupt commands or loops. As of Bash 5.3, in POSIX mode, the SIGCHLD trap runs once per exiting child process even if job control is disabled.[78][64] The trap builtin allows customization of signal handling by specifying a command to execute upon receipt of a signal, such as trap 'echo "Interrupted"' SIGINT, which runs the handler when SIGINT is received.[52] Common signals include SIGINT for user interruptions and SIGTERM for graceful termination requests, though the latter is ignored in interactive mode unless explicitly trapped.[78]
The disown builtin removes jobs from the shell's active table or prevents them from receiving SIGHUP (hangup signal) on shell exit, with disown -h %1 marking a job to ignore SIGHUP while keeping it listed.[77] This is useful for long-running background jobs that should persist after logout.[77] Additionally, wait synchronizes script execution by pausing until specified jobs or processes complete, returning their exit status (e.g., wait %1 for job 1).[77]
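A typical interactive sequence using these builtins might look like the following; the job number and process ID are illustrative:
$ sleep 100 &                  # start a background job
[1] 4321
$ jobs                         # list active jobs
[1]+  Running                 sleep 100 &
$ fg %1                        # bring job 1 to the foreground
sleep 100
^Z                             # Ctrl+Z suspends it (SIGTSTP)
[1]+  Stopped                 sleep 100
$ bg %1                        # resume it in the background
[1]+ sleep 100 &
$ disown -h %1                 # keep it running after the shell exits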
By default, job control is disabled in non-interactive scripts for performance reasons, but it can be enabled with set -m, allowing background job management similar to interactive sessions.[79] In such cases, subshell processes created by scripts can be treated as jobs under this mode.[79] With job control active, Bash places processes in separate process groups, notifying the user upon background job completion.[79]
Debugging and Observability
Tracing and Verbose Output
Bash provides built-in options for enabling tracing and verbose output during script execution, allowing users to observe command processing and expansions for debugging purposes. The set -x option, also known as xtrace, instructs the shell to print each command and its expanded arguments to standard error just before execution.[80] This output is prefixed by the value of the PS4 shell variable, which defaults to + but can be customized, for example, to include line numbers with PS4='Line ${LINENO}: '.[80]
In contrast, the set -v option, or verbose mode, causes the shell to echo each line of input as it is read, before any expansions or substitutions occur.[80] This is particularly useful for verifying the raw script content during processing. Both options can be combined for comprehensive visibility; for instance, running an external script with tracing enabled from the outset is achieved via bash -xv script.sh.[80]
Tracing can be toggled dynamically within a script using set +x to disable xtrace or set +v to turn off verbose mode.[80] For conditional activation, the trap builtin can be employed with the DEBUG trap condition to enable or disable tracing based on specific events, such as errors, by executing set -x or set +x in response.[27] Additionally, the BASH_XTRACEFD shell variable allows redirection of xtrace output to a specific file descriptor rather than the default standard error (file descriptor 2); setting it to an integer like 3 directs output there, while unsetting it reverts to standard error.[81] The file descriptor is automatically closed if BASH_XTRACEFD is unset or reassigned.[81]
These features are commonly used in debugging Bash scripts to log command expansions and trace execution flow, helping identify issues like variable substitution errors or unexpected argument passing without altering the script's logic.[80] For example, enabling xtrace in a complex script reveals the actual commands after globbing and parameter expansion, aiding in troubleshooting.[80]
#!/bin/bash
set -x # Enable tracing
echo "Value: $VAR" # Output: + echo "Value: hello" (assuming VAR=hello)
set +x # Disable tracing
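Trace output can also be redirected away from standard error, as described above; a brief sketch follows, with an illustrative log file path:
#!/bin/bash
exec 3> /tmp/trace.log         # open file descriptor 3 for the trace log
BASH_XTRACEFD=3                # xtrace output now goes to fd 3 instead of stderr
PS4='+${LINENO}: '             # prefix traced commands with their line numbers
set -x
echo "normal output"           # stdout is unaffected; the trace lands in the log
set +x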
Error Handling and Exit Status
In Bash, commands and scripts communicate success or failure through exit status codes, which are integers ranging from 0 to 255. A value of 0 indicates successful execution, while any non-zero value signals an error or failure, with specific conventions such as 126 for commands that cannot be executed due to permissions and 127 for commands not found.[82] The special parameter $? holds the exit status of the most recently executed foreground pipeline or command, allowing scripts to inspect and respond to outcomes programmatically.[28]
To handle errors dynamically, Bash provides the trap builtin, which can intercept non-zero exit statuses via the ERR pseudo-signal. For instance, the command trap 'echo "Error at line $LINENO"' ERR will execute the specified action whenever a command exits with a non-zero status, excluding certain contexts like conditionals or command lists; $LINENO expands to the current line number for precise error location.[83] This mechanism enables custom error logging or cleanup without halting execution unless desired. The set -e option, also known as errexit, causes the shell to terminate immediately upon any command's non-zero exit status, unless the failing command is part of a conditional construct (such as if or while) or executed within an && or || list, promoting stricter error propagation in scripts.[80]
For pipelines, where multiple commands are chained with |, the default exit status is that of the last command, but set -o pipefail alters this to return the exit status of the last command that failed (rightmost non-zero) or 0 if all succeed, ensuring intermediate failures in the pipeline are not masked.[84] This is particularly useful for detecting errors in data processing chains. In functions, the return builtin explicitly sets the function's exit status to a specified value between 0 and 255, overriding the status of the last command executed within it; for example, return 42 sets $? to 42 upon function completion.[27]
Practical examples illustrate these features. To check a command's outcome conditionally, one might use if ! ls /nonexistent; then echo "Command failed"; fi, where ! negates the exit status for the test (detailed further in conditional constructs).[85] Enabling set -e in a script like set -e; false; echo "This won't print" results in immediate exit without printing, halting on the failure.[80] With pipefail, set -o pipefail; false | true; echo $? outputs 1, reflecting the failure in the pipeline.[84] In a function, defining error() { return 1; } and calling error; echo $? prints 1, demonstrating controlled status setting.[25] These tools collectively allow robust error management, balancing automation with explicit control in Bash scripting.
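A short sketch combining these mechanisms; the function name and messages are illustrative:
#!/bin/bash
set -e                                     # errexit: stop at the first failure
set -o pipefail                            # a pipeline fails if any stage fails
trap 'echo "Error at line $LINENO (status $?)" >&2' ERR

must_exist() {
    [ -e "$1" ] || return 1                # function returns an explicit status
}

must_exist /etc/passwd && echo "found"     # checked in an && list, so errexit is not triggered
must_exist /no/such/file                   # non-zero status fires the ERR trap and exits
echo "This line is never reached"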
Comments and Debugging Tools
In Bash scripts, comments are introduced by a hash symbol (#) at the beginning of a word, causing the shell to ignore the # and all subsequent characters until the end of the line.[86] This feature is enabled by default in interactive shells through the interactive_comments shell option, but it applies universally in non-interactive contexts like scripts.[80] Bash does not support native multi-line comments; instead, developers typically achieve this by prefixing each line of a block with # or by employing a here-document to encapsulate explanatory text, though the latter is not strictly a comment mechanism.[34]
For debugging Bash scripts, there is no built-in debugger analogous to gdb for other languages; instead, developers rely on manual techniques and shell builtins.[26] The declare -p command prints the definitions of specified variables, including their attributes and values, which aids in inspecting the script's state during development.[87] Similarly, typeset serves as an alias for declare and can display variable attributes when used without additional options, facilitating the verification of data types and scopes.[87]
The LINENO special variable provides the current line number within a script or function, proving particularly useful in trap handlers to log or respond to errors at specific locations.[28] Best practices for debugging include liberally adding inline comments to clarify logic, grouping related comments into blocks with multiple # lines for readability, and employing the printf builtin for conditional debug output, such as printing variable states only when a debug flag is set (e.g., printf "Debug: var=%s\n" "$var").[86][88] For runtime execution tracing, the set -x option enables verbose output of each command as it runs, though detailed coverage of tracing appears in the dedicated section on tracing and verbose output.[80]
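These aids can be sketched in a few lines; the DEBUG flag below is an ordinary variable chosen for illustration, not a shell feature:
#!/bin/bash
declare -A conf=([host]=localhost [port]=8080)
declare -p conf                            # print the variable's attributes and value

DEBUG=1                                    # illustrative debug flag
[ "${DEBUG:-0}" -eq 1 ] && printf 'Debug: line %d, port=%s\n' "$LINENO" "${conf[port]}"

trap 'printf "Failure at line %d\n" "$LINENO" >&2' ERR   # LINENO locates errors in trap handlers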
Data Manipulation
Parameter and Variable Expansion
Parameter expansion in Bash allows the substitution of the value of a shell parameter, which can be a variable, positional parameter, or special parameter, into the command line. The basic syntax uses the dollar sign followed by the parameter name, such as $var to expand the value of the variable var.[89] For clarity or when the parameter name might be ambiguous, such as with multi-digit positional parameters or when followed by characters that could be part of the name, the preferred form is ${var}.[89] This expansion occurs during the shell's word-expansion phase, replacing the parameter reference with its value before the command is executed.[89]
Bash provides several operators for handling unset or null parameters, enabling defaults, assignments, or error checks. The form ${parameter:-default} substitutes the default value if the parameter is unset or null, but does not assign it to the parameter; for example, echo ${var:-unknown} safely outputs "unknown" if var is unset, otherwise its value.[89] In contrast, ${parameter:=default} assigns the default to the parameter if it is unset or null and then substitutes the value, useful for initializing variables on first use, as in : ${var:=default}; echo $var.[89] For error handling, ${parameter:?error-message} substitutes the parameter's value if set and non-null, but if unset or null, it prints the error message to standard error and exits the shell (or returns a non-zero status in interactive mode).[89]
Substring extraction and pattern-based removal are supported through specific operators. The syntax ${parameter:offset:length} extracts a substring starting from the zero-based offset, taking up to length characters; negative offsets count from the end of the string.[89] For instance, with var=abcdefgh, ${var:2:3} yields "cde".[89] Prefix removal uses ${parameter#pattern} to delete the shortest matching prefix from the expanded value, or ${parameter##pattern} for the longest match; an example is var=/path/to/file; echo ${var#/path} outputting "/to/file".[89]
The length of a parameter's value can be obtained with ${#parameter}, which returns the number of characters in the expanded value for scalars.[89] For arrays, ${#array[@]} or ${#array[*]} gives the number of elements in the array, providing a way to count array size without delving into array-specific structures.[89] Indirect expansion, ${!parameter}, treats the value of parameter as the name of another parameter and substitutes that one's value; for example, if var=HOME and HOME=/home/user, then echo ${!var} outputs "/home/user".[89]
Introduced in Bash version 4.0 and later, case modification operators allow transforming the case of the expanded value. The form ${parameter^^} converts all characters in the expansion to uppercase (or matching a specified pattern), while ${parameter,,} converts to lowercase; for var=hello, echo ${var^^} produces "HELLO".[89] These features enhance string manipulation directly in expansions, reducing the need for external commands like tr.[89]
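The operators described above can be tried interactively; the values are illustrative:
$ var=abcdefgh
$ echo "${var:2:3}"                        # substring: offset 2, length 3
cde
$ unset name; echo "${name:-guest}"        # default value without assignment
guest
$ : "${name:=guest}"; echo "$name"         # assign the default on first use
guest
$ path=/path/to/file; echo "${path##*/}"   # longest matching prefix removed
file
$ echo "${#var}"                           # length of the value
8
$ ref=var; echo "${!ref}"                  # indirect expansion
abcdefgh
$ echo "${var^^}"                          # uppercase conversion (Bash 4.0+)
ABCDEFGH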
Brace, Tilde, and Pathname Expansion
Brace expansion in Bash generates arbitrary strings that share a common prefix and suffix, allowing users to create multiple variations efficiently. It is performed before any other expansions and treats the content strictly textually, preserving special characters for later processing. The syntax consists of an optional preamble, followed by an unquoted opening brace {, a comma-separated list of strings or a sequence expression, and an unquoted closing brace }, optionally followed by a postscript. For example, echo a{d,c,b}e expands to ade ace abe. Nested brace expansions are supported, and the results are generated in left-to-right order.[90]
Sequence expressions, introduced in Bash 3.0, enable numeric or alphabetic ranges within braces. The form {x..y} expands to strings from x to y inclusive, where x and y can be integers or single characters; an optional increment ..incr allows custom steps, defaulting to 1 for ascending or -1 for descending sequences. Integer sequences are zero-padded if the upper bound requires more digits, while character sequences follow lexicographic order in the C locale. For instance, echo file{1..3}.txt produces file1.txt file2.txt file3.txt, and echo {a..c} yields a b c. To prevent expansion, backslashes can escape the braces or commas, or the opening brace can follow a dollar sign as in ${.[90]
Tilde expansion substitutes the tilde ~ at the beginning of an unquoted word with directory paths, facilitating shorthand references to user directories and navigation history. If the word starts with an unquoted ~ followed by characters up to the first unquoted slash (or the end if none), it forms the tilde-prefix for substitution. The plain ~ expands to the value of the $HOME environment variable, representing the current user's home directory. ~user expands to the home directory of the specified user, as determined by the password database. Variants include ~+ for the current working directory ($PWD), ~- for the previous working directory ($OLDPWD if set), and ~N or ~+N for elements in the directory stack via dirs +N or dirs -N. The expansion result is quoted to prevent further splitting or expansion, and invalid prefixes remain unchanged. For example, cd ~user/documents navigates to /home/user/documents. Tilde expansion also applies after the colon or first equals sign in certain variable assignments like PATH or CDPATH.[91]
Pathname expansion, also known as globbing or filename expansion, replaces unquoted patterns containing *, ?, or [ in words with a sorted list of matching filenames from the current directory (or specified paths). The * matches any string of zero or more characters, ? matches exactly one character, and [...] matches any single character from the specified set or range, such as [a-z]. Dots at the start of filenames or after slashes must be matched explicitly unless the dotglob option is enabled. For example, ls *.txt lists all files ending in .txt. By default, patterns that match no files remain unexpanded.[92]
Extended globbing patterns are available when the extglob shell option is enabled via shopt -s extglob, adding operators like !(pattern) to match anything except the given pattern, ?(pattern) for zero or one occurrence, *(pattern) for zero or more, +(pattern) for one or more, and @(pattern) for exactly one of the patterns. These must be enabled before parsing, as parentheses otherwise have syntactic meaning. For instance, with extglob active, ls !(README) lists all files except README.[93]
Several options modify pathname expansion behavior. The nullglob option, set with shopt -s nullglob, causes unmatched patterns to expand to an empty string rather than remaining literal. GLOBIGNORE is a colon-separated list of patterns that, when set, ignores matching filenames during expansion, excluding . and .. by default. Additional options like nocaseglob enable case-insensitive matching, failglob treats no matches as errors, and globstar (Bash 4.0+) allows ** to recursively match directories. Pathname expansion is performed last, after brace and tilde expansion and after word splitting has divided the results of other expansions into words. For example, echo file{1,2}*.txt first expands via braces to file1*.txt file2*.txt, then via pathname expansion to whatever filenames match those patterns.[92][94]
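The following interactive examples combine these expansions; the home directory and file listings shown are hypothetical:
$ echo report_{draft,final}.txt        # brace expansion generates plain strings
report_draft.txt report_final.txt
$ echo {01..05..2}                     # zero-padded sequence with an increment of 2
01 03 05
$ echo ~/projects                      # tilde expansion of $HOME
/home/user/projects
$ shopt -s extglob
$ ls !(*.bak)                          # extended glob: everything except *.bak files
notes.txt  script.sh
$ shopt -s nullglob
$ echo *.missing                       # with nullglob, an unmatched pattern expands to nothing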
Word Splitting and Globbing
In Bash, word splitting is the process by which the shell divides the results of certain expansions—specifically parameter expansion, command substitution, and arithmetic expansion—into individual words, or fields, when they are not enclosed in double quotes.[42] This occurs after the initial expansions but before further processing like filename expansion. The splitting is controlled by the Internal Field Separator (IFS) variable, which by default consists of the space, tab, and newline characters; if IFS is unset, it defaults to this value, and if set to null, no splitting occurs.[42] The rules for word splitting are precise: sequences of IFS whitespace characters (space, tab, or newline) are first stripped from the beginning and end of the expansion result, treating any sequence of such characters as equivalent to a single delimiter without creating empty fields.[42] For non-whitespace IFS characters, the text is split at each occurrence, and consecutive non-whitespace delimiters produce empty fields; however, if IFS contains only whitespace, consecutive delimiters are treated as one, ignoring potential empty fields unless the expansion is quoted.[42] Expansions within double quotes are not subject to word splitting, preserving the entire result as a single word, though explicit null arguments (like "") are retained.[42]
Following word splitting, Bash performs filename expansion, also known as globbing, on each resulting word that contains unquoted pattern characters such as * (matching any string, including the null string), ? (matching any single character), or [...] (matching any single character in the specified set or range).[92] This expansion replaces the pattern with a sorted list of matching filenames in the current directory (or specified path), but only if the pattern does not begin with a / or ./ and is not quoted; if no matches are found, the original word is retained unless modified by shell options.[92] Globbing thus applies independently to each split word, potentially expanding a single split field into multiple filenames.
Several shell options, set via the shopt builtin, influence globbing behavior. The nocaseglob option enables case-insensitive pattern matching, so *.[Tt][Xx][Tt] would match files like file.TXT.[94] The failglob option causes the shell to report an error and exit if a pattern fails to match any files, rather than leaving the pattern unexpanded.[94] These options do not affect word splitting directly but control how the post-split words are further processed.
For example, consider the command var="file1.txt file2.txt"; for i in $var; do echo "$i"; done, which splits $var on spaces (default IFS) into two words, then applies globbing if patterns are present, outputting each filename on a separate line.[42] If IFS is set to a colon, as in IFS=':'; var="a::b"; echo $var, the result splits into three fields: "a", an empty field, and "b", demonstrating how non-whitespace delimiters create null fields.[42] In contrast, quoting prevents this: for i in "$var"; do echo "$i"; done treats the entire expansion as one word.[42]
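The effect of a non-whitespace IFS and of quoting can be seen with printf, which here prints each of its arguments on its own line:
$ IFS=':'
$ var="a::b"
$ printf '<%s>\n' $var          # unquoted: split into three fields, one of them empty
<a>
<>
<b>
$ printf '<%s>\n' "$var"        # quoted: no splitting, a single word
<a::b>
$ unset IFS                     # restore the default splitting behavior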
Security Considerations
Common Vulnerabilities and Exploits
One of the most significant historical vulnerabilities in Bash is Shellshock, identified as CVE-2014-6271, which affected GNU Bash versions through 4.3. This flaw allowed remote attackers to execute arbitrary commands by injecting malicious code into environment variables, as Bash processed trailing strings after function definitions in these variables without proper sanitization. The vulnerability was particularly dangerous in network-facing applications like web servers using CGI scripts, where environment variables such as HTTP headers could be controlled by attackers, leading to widespread exploitation attempts shortly after disclosure. Patches were released in Bash 4.3 update 25 and subsequent versions to prevent execution of this trailing code.[95][96]
Command injection represents a common usage-related vulnerability in Bash scripts, particularly when the eval builtin is used with untrusted input. By passing attacker-controlled data to eval, such as through eval "$untrusted_input", arbitrary shell commands can be executed, potentially compromising the system if the input originates from external sources like user forms or network requests. This issue exploits Bash's ability to interpret and execute strings as code, enabling attackers to append or inject commands that alter script behavior. For instance, if untrusted input contains a semicolon followed by a malicious command, it can chain executions beyond the intended operation.[97]
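A condensed sketch of the eval risk and a safer alternative; the variable contents are illustrative and should never come from untrusted sources:
# Dangerous: attacker-controlled data reaches eval and is executed as code
untrusted='report.txt; cat /etc/passwd'    # illustrative malicious value
eval "wc -l $untrusted"                    # runs wc, then the injected command

# Safer: pass the value as a single quoted argument and avoid eval entirely
wc -l -- "$untrusted"                      # treated as one literal (nonexistent) filename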
Path traversal vulnerabilities arise in Bash scripts that construct file paths using unsanitized user input, allowing attackers to access files outside the intended directory. By injecting sequences like ../ into path variables, an attacker can navigate the filesystem to read or write sensitive files, such as configuration data or logs, if the script performs operations like reading or creating files based on this input. This is especially risky in scripts handling file uploads or dynamic path resolution without validation, leading to unauthorized data exposure or modification.[98]
Time-of-check to time-of-use (TOCTOU) race conditions are prevalent in Bash scripts involving file operations, where a check for file existence or permissions occurs separately from its subsequent use. For example, a script might use [ -f "$file" ] to verify a file's presence before reading or writing to it, but an attacker could replace the file with a malicious one in the intervening time, exploiting the window between the check and use to execute unintended code or overwrite data. These races are exacerbated in multi-process environments and can lead to privilege escalation if the script runs with elevated permissions.[99]
An untrusted PATH environment variable poses risks when Bash searches for executables in directories under attacker control, potentially executing malicious binaries instead of legitimate ones. If the PATH includes writable or remote directories, an attacker can place a trojanized version of a common command (e.g., ls) that performs harmful actions while mimicking normal behavior, leading to arbitrary code execution during script invocation. This vulnerability is common in shared or multi-user systems where PATH is modified without verification.[100][101]
Symlink abuse occurs when Bash scripts create or access temporary files without proper protections, allowing attackers to replace predictable temp files with symbolic links pointing to sensitive targets. For instance, if a script writes to /tmp/predictable_file without atomic operations, an attacker can symlink it to a critical file like /etc/passwd, causing the script's write to overwrite or corrupt the target when executed. This can result in data loss or unauthorized modifications, particularly in privileged scripts using fixed temp names.[102][103]
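A common mitigation is to create temporary files atomically with unpredictable names rather than fixed ones; a minimal sketch, with illustrative data and file prefix:
#!/bin/bash
# Unsafe pattern: a fixed, predictable name in a world-writable directory
# echo "$data" > /tmp/predictable_file

data="example payload"
tmpfile=$(mktemp /tmp/myscript.XXXXXX) || exit 1   # unique name, created with mode 0600
trap 'rm -f "$tmpfile"' EXIT                       # remove the file on any exit
echo "$data" > "$tmpfile"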
Secure Coding Practices
Secure coding practices in Bash scripting emphasize preventing common security risks through rigorous input handling, controlled execution environments, and adherence to least-privilege principles. Developers should prioritize validating all user inputs to mitigate command injection attacks, where untrusted data could alter script behavior. For instance, use the [[ ]] conditional construct instead of the single [ ] test command, as it performs no word splitting or pathname expansion on variables, reducing the risk of unintended command execution. Always quote variables (e.g., "$var") to preserve literal values and prevent globbing or splitting, and apply regular expression matching within [[ ]] for strict validation, such as [[ $input =~ ^[a-zA-Z0-9]+$ ]] to allow only alphanumeric characters. These techniques align with input validation strategies that assume all external data is potentially malicious, enforcing a "whitelist" approach to accept only known-good formats.[85][97][104]
Avoiding dangerous builtins like eval is crucial, as it can execute arbitrary strings constructed from untrusted input, leading to code injection. Instead, opt for safer alternatives such as Bash arrays to build and iterate over dynamic command lists; for example, declare an array with cmds=("/bin/ls" "-l") and execute via "${cmds[@]}" to separate arguments securely. Similarly, secure the PATH environment variable by using absolute paths for commands (e.g., /usr/bin/ls instead of ls) or invoking scripts in a clean environment with env -i PATH=/secure/path script.sh to prevent substitution of malicious binaries in user-controlled directories. File permissions must be managed proactively: set an appropriate umask at the script's start, such as umask 077 to restrict new files to owner-only access, and avoid operating in world-writable directories to prevent tampering.[105][97][104]
Robust error handling enhances security by failing fast and transparently on issues. Enable set -u (nounset) to treat references to unset variables as errors, preventing silent failures that could expose systems to unintended behavior, and set -e (errexit) to exit immediately upon any command returning a non-zero status, ensuring partial executions do not leave systems in insecure states. Combine these with set -o pipefail for pipelines to propagate errors from any segment. For logging, redirect errors and output to secure files (e.g., command 2>> /var/log/script_errors.log) while avoiding inclusion of sensitive data like passwords in messages; use tools like logger for syslog integration if needed. Adhere to least-privilege principles by designing scripts to run as non-root users whenever possible—check the effective UID with id -u and exit if it is zero unless root access is explicitly required—and elevate privileges only for specific operations using sudo with targeted commands. These practices minimize the attack surface and align with established Unix security models.[80][106]
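These recommendations can be combined into a small template; the paths, log file, and validation pattern are illustrative:
#!/bin/bash
set -euo pipefail                           # fail fast on errors, unset variables, and pipeline failures
umask 077                                   # new files are readable by the owner only
PATH=/usr/bin:/bin                          # fixed, trusted search path

[ "$(id -u)" -ne 0 ] || { echo "Refusing to run as root" >&2; exit 1; }

input=${1:-}                                # first positional argument, if any
[[ $input =~ ^[a-zA-Z0-9_]+$ ]] || { echo "Invalid input" >&2; exit 1; }

cmd=(/usr/bin/ls -l -- "$input")            # build the command as an array, not a string
"${cmd[@]}" 2>> /var/log/script_errors.log  # keep errors in a dedicated log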
Compliance and Modes
POSIX Compliance Mode
Bash's POSIX compliance mode configures the shell to adhere more closely to the POSIX Shell and Utilities standard (IEEE Std 1003.1), limiting its behavior to ensure compatibility with other POSIX-compliant shells.[107] This mode can be enabled by invoking Bash with the --posix command-line option, executing set -o posix within a running session, or starting Bash as sh, which enters POSIX mode after processing its startup files.[107] Additionally, setting the POSIXLY_CORRECT environment variable forces Bash into this mode, which can be checked via echo $POSIXLY_CORRECT to verify its status.[107]
In POSIX mode, Bash adjusts numerous behaviors to match the standard rather than acting as a different shell.[107] Alias expansion is always enabled, even in non-interactive shells; redirection operators do not perform filename expansion on their target words unless the shell is interactive; and tilde expansion is performed only on assignments preceding a command name, rather than on all words that resemble assignments.[107] Non-interactive shells exit immediately on errors such as invalid variable assignments or syntax errors in strings processed by eval, and special builtins such as export are found before shell functions during command lookup.[107]
The primary benefit of POSIX compliance mode is enhanced portability, allowing scripts written for Bash to run predictably on other Unix-like systems using POSIX shells like those based on the original Bourne shell, without relying on proprietary extensions.[36] This is particularly useful for system administration tasks or software packaging that must operate across diverse environments, such as Linux distributions and BSD variants.[107]
However, even in POSIX mode Bash does not implement every requirement of the standard, and some behaviors, such as details of word splitting and the default output of the echo and fc builtins, still differ from a strictly conforming shell.[107] Bash in this mode also reads POSIX-specified startup files, such as the file named by the ENV variable, instead of its usual profiles, which may alter initialization but ensures stricter adherence.[107]
Common use cases include testing shell scripts for compatibility with /bin/sh on POSIX systems, developing portable automation tools, and ensuring compliance during software builds or deployments where Bash acts as a drop-in replacement for traditional shells.[107] For instance, developers might invoke bash --posix script.sh to validate that the script avoids Bashisms before committing it to a portable repository.[107]
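In practice, the mode can be enabled in any of the ways described above; script.sh is a placeholder name:
$ bash --posix script.sh              # run a script under POSIX-mode restrictions
$ POSIXLY_CORRECT=1 bash script.sh    # equivalent, via the environment variable
$ set -o posix                        # switch the current session into POSIX mode
$ set +o posix                        # return to default Bash behavior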
Restricted and Other Special Modes
Bash supports a restricted shell mode designed to create a more controlled environment, limiting certain user actions to enhance security in scenarios such as jailed user accounts or limited access systems.[108] This mode is invoked by starting Bash with the -r or --restricted option, or by naming the executable rbash, which causes Bash to enter restricted mode automatically.[108] In restricted mode, the shell behaves like standard Bash but imposes several key limitations: users cannot change directories with cd, modify critical environment variables like PATH, SHELL, ENV, BASH_ENV, or HISTFILE, or execute commands containing slashes in their names, which prevents absolute or relative path usage.[108] Additionally, output redirection operators (such as >, >>, or >|), the exec builtin, and certain options to the enable and command builtins are disabled, while importing function definitions or the value of SHELLOPTS from the environment at startup is prohibited.[108] These restrictions take effect after startup files are read, and they cannot be disabled once enabled, as commands like set +r or shopt -u restricted_shell are ignored.[108] For practical deployment in secure environments, administrators often set the user's shell to /bin/rbash using chsh -s /bin/rbash, combined with a controlled PATH limited to trusted directories and a non-writable home directory to further constrain access.[108]
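An administrator might provision such an account along these lines; the username and directories are illustrative:
# Create a user whose login shell is the restricted Bash
useradd -m -s /bin/rbash limiteduser

# Confine PATH to a directory of approved commands and lock down the startup file
echo 'PATH=$HOME/bin' > /home/limiteduser/.bash_profile
chown root:root /home/limiteduser/.bash_profile
chmod 644 /home/limiteduser/.bash_profile
mkdir -p /home/limiteduser/bin
ln -s /bin/ls /home/limiteduser/bin/ls     # expose only selected commands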
Beyond restricted mode, Bash offers other special invocation and runtime modes that alter its behavior for specific use cases. The login shell mode, activated with bash -l or bash --login, simulates a shell started by a login process, sourcing profile files like ~/.bash_profile or /etc/profile to initialize the environment appropriately for session starts.[109] In non-interactive mode, invoked via bash -c "command_string" to execute a command string or bash -s to read from standard input, Bash omits interactive prompts, expands positional parameters from arguments or input, and exits on end-of-file (EOF) without further input.[109]
For debugging, the -x or --xtrace option enables trace mode, where Bash prints each command and its expanded arguments to standard error before execution, aiding in script troubleshooting; the --debugger variant additionally sets the extdebug shell option for enhanced debugging features like trap tracing.[109] Command-line editing modes can be switched at runtime using the set builtin: set -o emacs enables Emacs-style key bindings for line editing (the default), while set -o vi switches to Vi-style editing, allowing modal insertion and command modes for navigation and modification via the Readline library.[79] These modes apply to interactive shells and the read -e builtin, providing familiar interfaces for users preferring one editing paradigm over the other.[79]
The shopt builtin further allows toggling of optional shell behaviors that function as special modes for customization. For instance, shopt -s nocaseglob enables case-insensitive filename globbing during pathname expansion, useful in file systems where case sensitivity varies.[94] Other options like nocasematch for case-insensitive pattern matching in case statements or [[ tests, and extglob for extended glob patterns (e.g., +(pattern) for one-or-more matches), can be enabled or disabled similarly to fine-tune expansion and matching behaviors without altering core shell invocation.[94] These modes collectively allow Bash to adapt to diverse operational needs while maintaining its POSIX-compatible foundation.[94]