format(docs): remove escaped 's character pairs

Vera De Kalb 2024-03-30 13:22:45 -05:00
parent 93e24b5962
commit a0a82533bc
80 changed files with 385 additions and 385 deletions


@ -77,7 +77,7 @@ The `read` builtin command has some interesting new features.
The `-t` option to specify a timeout value has been slightly tuned. It
now accepts fractional values and the special value 0 (zero). When
`-t 0` is specified, `read` immediately returns with an exit status
indicating if there\'s data waiting or not. However, when a timeout is
indicating if there's data waiting or not. However, when a timeout is
given, and the `read` builtin times out, any partial data received up to
the timeout is stored in the given variable, rather than lost. When a
timeout is hit, `read` exits with a code greater than 128.
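The timeout behavior described above can be sketched as follows (a minimal illustration, not from the original page; the variable name `line` is our own):

```bash
#!/usr/bin/env bash
# -t 0: return immediately; the exit status says whether data is waiting
if read -t 0; then
    echo "data is waiting on stdin"
else
    echo "no data waiting"
fi

# -t 2.5: wait at most 2.5 seconds; on timeout the exit status is > 128,
# and any partial input read so far is kept in the variable
if read -t 2.5 line; then
    printf 'read: %s\n' "$line"
elif (( $? > 128 )); then
    echo "timed out; partial input (if any): $line"
fi
```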


@ -59,7 +59,7 @@ f3
## Notes
- `caller` produces no output unless used within a script that\'s run
- `caller` produces no output unless used within a script that's run
from a real file. It isn\'t particularly useful for interactive use,
but can be used to create a decent `die` function to track down
errors in moderately complex scripts.
@ -68,7 +68,7 @@ f3
are available and a number of special parameters that give more
detail than caller (e.g. BASH_ARG{C,V}). Tools such as
[Bashdb](http://bashdb.sourceforge.net/) can assist in using some of
Bash\'s more advanced debug features.
Bash's more advanced debug features.
- The Bash manpage and help text specifies that the argument to
`caller` is an \"expr\" (whatever that means). Only an integer is
actually allowed, with no special interpretation of an


@ -13,7 +13,7 @@ The `cd` builtin command is used to change the current working directory
- to the given directory (`cd DIRECTORY`)
- to the previous working directory (`cd -`) as saved in the
[OLDPWD](../../syntax/shellvars.md#OLDPWD) shell variable
- to the user\'s home directory as specified in the
- to the user's home directory as specified in the
[HOME](../../syntax/shellvars.md#HOME) environment variable (when used
without a `DIRECTORY` argument)
@ -30,7 +30,7 @@ is given or the shell is configured to do so (see the `-P` option of
-------- ----------------------------------------------------
`-L` Follow symbolic links (default)
`-P` Do not follow symbolic links
`-@` Browse a file\'s extended attributed, if supported
`-@` Browse a file's extended attributes, if supported
### Exit status
@ -41,7 +41,7 @@ is given or the shell is configured to do so (see the `-P` option of
## Examples
### Change the working directory to the user\'s home directory
### Change the working directory to the user's home directory
cd
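The three forms from the description above, as a quick sketch:

```bash
cd /tmp       # cd DIRECTORY: change to the given directory
cd -          # back to the previous directory (stored in $OLDPWD)
cd            # no argument: change to $HOME
```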


@ -22,7 +22,7 @@ variable.
When used in a function, `declare` makes `NAMEs` local variables, unless
used with the `-g` option.
Don\'t use it\'s synonym `typeset` when coding for Bash, since it\'s
Don\'t use its synonym `typeset` when coding for Bash, since it's
tagged as obsolete.
### Options
@ -52,7 +52,7 @@ Below, `[-+]X` indicates an attribute, use `-X` to set the attribute,
`[-+]n` make NAME a reference to the variable named by its value. Introduced in Bash 4.3-alpha.\
\'\' \${!NAME}\'\' reveals the reference variable name, VALUE.\
Use `unset -n NAME` to unset the variable. (`unset -v NAME` unsets the VALUE variable.)\
Use `[[ -R NAME ]]` to test if NAME has been set to a VALUE, another variable\'s name.
Use `[[ -R NAME ]]` to test if NAME has been set to a VALUE, another variable's name.
`-p` display the attributes and value of each NAME
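The nameref machinery described for `[-+]n` can be sketched like this (requires Bash >= 4.3; the names `value` and `ref` are illustrative):

```bash
#!/usr/bin/env bash
value="hello"
declare -n ref=value      # ref now refers to the variable named "value"

echo "${!ref}"            # the name being referenced: value
echo "$ref"               # the referenced variable's value: hello

ref="changed"             # assigns through the reference
echo "$value"             # now: changed

[[ -R ref ]] && echo "ref is a nameref"
unset -n ref              # removes the reference itself, not "value"
echo "$value"             # still set: changed
```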
@ -156,9 +156,9 @@ a=(1 2 3); b=(6 5 4); c=(2 4 6) sum total a b c printf \'Final value of
\"total\" is: %d\\n\' \"\$total\" \</div\>
`typeset -n` is currently implemented in ksh93, mksh, and Bash 4.3. Bash
and mksh\'s implementations are quite similar, but much different from
ksh93\'s. See [Portability considerations](#portability_considerations)
for details. ksh93 namerefs are much more powerful than Bash\'s.
and mksh's implementations are quite similar, but much different from
ksh93's. See [Portability considerations](#portability_considerations)
for details. ksh93 namerefs are much more powerful than Bash's.
## Portability considerations


@ -6,7 +6,7 @@
## Description
`echo` outputs it\'s args to stdout, separated by spaces, followed by a
`echo` outputs its args to stdout, separated by spaces, followed by a
newline. The return status is always `0`. If the
[shopt](../../commands/builtin/shopt.md) option `xpg_echo` is set, Bash
dynamically determines whether echo should expand escape characters


@ -168,7 +168,7 @@ identical to those of [let](../../commands/builtin/let.md).
eval](http://mywiki.wooledge.org/BashFAQ/006#Assigning_indirect.2BAC8-reference_variables)
- [More indirection via
eval](http://fvue.nl/wiki/Bash:_Passing_variables_by_reference)
- [Martin Väth\'s \"push\"](https://github.com/vaeth/push) \--
- [Martin Väth's \"push\"](https://github.com/vaeth/push) \--
`printf %q` work-alike for POSIX.
- [The \"magic alias\"
hack](http://www.chiark.greenend.org.uk/~sgtatham/aliases.html)


@ -44,8 +44,8 @@ command](../../syntax/ccmd/arithmetic_eval.md):
\<WRAP info\> Remember that inside arithmetic evaluation contexts, all
other expansions are processed as usual (from left-to-right), and the
resulting text is evaluated as an arithmetic expression. Arithmetic
already has a way to control precedence using parentheses, so it\'s very
rare to need to nest arithmetic expansions within one another. It\'s
already has a way to control precedence using parentheses, so it's very
rare to need to nest arithmetic expansions within one another. It's
used above only to illustrate how this precedence works. \</WRAP\>
Unlike `((`, being a simple command `let` has its own environment. In
@ -87,10 +87,10 @@ needed.
is more \"standard\" than `let`, the above should always be
preferred. Both [arithmetic expansion](../../syntax/arith_expr.md)s and the
`[` test operator are specified by POSIX(r) and satisfy almost all
of expr\'s use-cases. Unlike `let`, `expr` cannot assign directly to
of expr's use-cases. Unlike `let`, `expr` cannot assign directly to
bash variables but instead returns a result on stdout. `expr` takes
each operator it recognizes as a separate word and then concatenates
them into a single expression that\'s evaluated according to it\'s
them into a single expression that's evaluated according to its
own rules (which differ from shell arithmetic). `let` parses each
word it receives on its own and evaluates it as an expression
without generating any output other than a return code.
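The contrast between the three can be sketched with the same addition done three ways (our own minimal example):

```bash
#!/usr/bin/env bash
let 'x = 1 + 2'        # let: assigns directly, produces no output
echo "$x"              # 3

(( y = 1 + 2 ))        # arithmetic command: same effect, more "standard"
echo "$y"              # 3

z=$(expr 1 + 2)        # expr: each operator is a separate word,
echo "$z"              # 3     and the result comes back on stdout
```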


@ -48,7 +48,7 @@ way, and takes all the same options, with 3 exceptions:
variables follow roughly
[lexical-scoping](http://community.schemewiki.org/?lexical-scope),
except that functions themselves don\'t have scope, just like Bash.
This means that even functions defined within a \"function\'s
This means that even functions defined within a \"function's
scope\" don\'t have access to non-local variables except through
`namerefs`.


@ -29,7 +29,7 @@ given array `ARRAY` is set readonly.
`-t` Remove any trailing newline from a line read, before it is assigned to an array element.
`-u FD` Read from filedescriptor `FD` rather than standard input.
While `mapfile` isn\'t a common or portable shell feature, it\'s
While `mapfile` isn\'t a common or portable shell feature, its
functionality will be familiar to many programmers. Almost all
programming languages (aside from shells) with support for compound
datatypes like arrays, and which handle open file objects in the
@ -43,13 +43,13 @@ use.
## Examples
Here\'s a real-world example of interactive use borrowed from Gentoo
Here's a real-world example of interactive use borrowed from Gentoo
workflow. Xorg updates require rebuilding drivers, and the
Gentoo-suggested command is less than ideal, so let\'s Bashify it. The
Gentoo-suggested command is less than ideal, so let's Bashify it. The
first command produces a list of packages, one per line. We can read
those into the array named \"args\" using `mapfile`, stripping trailing
newlines with the \'-t\' option. The resulting array is then expanded
into the arguments of the \"emerge\" command - an interface to Gentoo\'s
into the arguments of the \"emerge\" command - an interface to Gentoo's
package manager. This type of usage can make for a safe and effective
replacement for xargs(1) in certain situations. Unlike xargs, all
arguments are guaranteed to be passed to a single invocation of the
@ -59,13 +59,13 @@ business.
# eix --only-names -IC x11-drivers | { mapfile -t args; emerge -av1 "${args[@]}" <&1; }
Note the use of command grouping to keep the emerge command inside the
pipe\'s subshell and within the scope of \"args\". Also note the unusual
pipe's subshell and within the scope of \"args\". Also note the unusual
redirection. This is because the -a flag makes emerge interactive,
asking the user for confirmation before continuing, and checking with
isatty(3) to abort if stdin isn\'t pointed at a terminal. Since stdin of
the entire command group is still coming from the pipe even though
mapfile has read all available input, we just borrow FD 1 as it just so
happens to be pointing where we want it. More on this over at greycat\'s
happens to be pointing where we want it. More on this over at greycat's
wiki: <http://mywiki.wooledge.org/BashFAQ/024>
### The callback
@ -209,7 +209,7 @@ each subsequent 2 iterations. The RETURN trap is unimportant.
## To Do
- Create an implementation as a shell function that\'s portable
- Create an implementation as a shell function that's portable
between Ksh, Zsh, and Bash (and possibly other bourne-like shells
with array support).
@ -218,4 +218,4 @@ each subsequent 2 iterations. The RETURN trap is unimportant.
- [arrays](../../syntax/arrays.md)
- [read](../../commands/builtin/read.md) - If you don\'t know about this yet,
why are you reading this page?
- <http://mywiki.wooledge.org/BashFAQ/001> - It\'s FAQ 1 for a reason.
- <http://mywiki.wooledge.org/BashFAQ/001> - It's FAQ 1 for a reason.


@ -23,7 +23,7 @@ POSIX(r) recommends that `printf` is preferred over `echo`.
## General
The `printf` command provides a method to print preformatted text
similar to the `printf()` system interface (C function). It\'s meant as
similar to the `printf()` system interface (C function). It's meant as
successor for `echo` and has far more features and possibilities.
Beside other reasons, POSIX(r) has a very good argument to recommend it:
@ -164,7 +164,7 @@ introductory `%` and the character that specifies the format:
Field output format
--------------------- --------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
`<N>` **Any number**: Specifies a **minimum field width**, if the text to print is shorter, it\'s padded with spaces, if the text is longer, the field is expanded
`<N>` **Any number**: Specifies a **minimum field width**, if the text to print is shorter, it's padded with spaces, if the text is longer, the field is expanded
`.` **The dot**: Together with a field width, the field is **not** expanded when the text is longer, the text is truncated instead. \"`%.s`\" is an undocumented equivalent for \"`%.0s`\", which will force a field width of zero, effectively hiding the field from output
`*` **The asterisk**: the width is given as argument before the string or number. Usage (the \"`*`\" corresponds to the \"`20`\"): `printf "%*s\n" 20 "test string"`
`#` \"Alternative format\" for numbers: see table below
@ -178,8 +178,8 @@ introductory `%` and the character that specifies the format:
Alternative Format
-------------------------------------------------- --------------------------------------------------------------------------------------------------------------------------------------------------------------
`%#o` The octal number is printed with a leading zero, unless it\'s zero itself
`%#x`, `%#X` The hex number is printed with a leading \"`0x`\"/\"`0X`\", unless it\'s zero
`%#o` The octal number is printed with a leading zero, unless it's zero itself
`%#x`, `%#X` The hex number is printed with a leading \"`0x`\"/\"`0X`\", unless it's zero
`%#g`, `%#G` The float number is printed with **trailing zeros** until the number of digits for the current precision is reached (usually trailing zeros are not printed)
all number formats except `%d`, `%o`, `%x`, `%X` Always print a decimal point in the output, even if no digits follow it
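A few of the modifiers from the tables above in action (a quick sketch):

```bash
printf '[%10s]\n' test     # minimum field width 10, right-aligned
printf '[%.2s]\n' test     # precision truncates the string: [te]
printf '[%*s]\n' 10 test   # the width is passed as a separate argument
printf '%#o %#x\n' 8 255   # alternative forms: 010 0xff
```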
@ -382,7 +382,7 @@ readability.
use `%c`, you\'re actually asking for the first byte of the
argument. Likewise, the maximum field width modifier (dot) in
combination with `%s` goes by bytes, not characters. This limits
some of printf\'s functionality to working with ascii only. ksh93\'s
some of printf's functionality to working with ascii only. ksh93's
`printf` supports the `L` modifier with `%s` and `%c` (but so far
not `%S` or `%C`) in order to treat precision as character width,
not byte count. zsh appears to adjust itself dynamically based upon
@ -409,7 +409,7 @@ fmt++;
- mksh has no built-in printf by default (usually). There is an
unsupported compile-time option to include a very poor, basically
unusable implementation. For the most part you must rely upon the
system\'s `/usr/bin/printf` or equivalent. The mksh maintainer
system's `/usr/bin/printf` or equivalent. The mksh maintainer
recommends using `print`. The development version (post- R40f) adds
a new parameter expansion in the form of `${name@Q}` which fills the
role of `printf %q` \-- expanding in a shell-escaped format.
@ -418,7 +418,7 @@ fmt++;
<!-- -->
```
- ksh93 optimizes builtins run from within a command substitution and
which have no redirections to run in the shell\'s process. Therefore
which have no redirections to run in the shell's process. Therefore
the `printf -v` functionality can be closely matched by
`var=$(printf ...)` without a big performance hit.
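The two roughly-equivalent forms look like this in Bash (a sketch; the variable name is our own):

```bash
printf -v greeting '%s, %s!' Hello world   # Bash: assign without a subshell
echo "$greeting"                           # Hello, world!

greeting=$(printf '%s, %s!' Hello world)   # portable, but forks a subshell
echo "$greeting"                           # Hello, world!
```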
@ -447,8 +447,8 @@ fmt++;
- The optional Bash loadable `print` may be useful for ksh
compatibility and to overcome some of
[echo](../../commands/builtin/echo.md)\'s portability pitfalls. Bash, ksh93,
and zsh\'s `print` have an `-f` option which takes a `printf` format
[echo](../../commands/builtin/echo.md)'s portability pitfalls. Bash, ksh93,
and zsh's `print` have an `-f` option which takes a `printf` format
string and applies it to the remaining arguments. Bash lists the
synopsis as:
`print: print [-Rnprs] [-u unit] [-f format] [arguments]`. However,
@ -472,5 +472,5 @@ fmt++;
function](http://pubs.opengroup.org/onlinepubs/9699919799/functions/printf.html)
- [Code snip: Print a horizontal
line](../../snipplets/print_horizontal_line.md) uses some `printf` examples
- [Greg\'s BashFAQ 18: How can I use numbers with leading zeros in a
- [Greg's BashFAQ 18: How can I use numbers with leading zeros in a
loop, e.g., 01, 02?](BashFAQ>018)


@ -43,7 +43,7 @@ line is read). That means the timeout can occur during input, too.
---------------- -------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
`-a <ARRAY>` read the data word-wise into the specified array `<ARRAY>` instead of normal variables
`-d <DELIM>` recognize `<DELIM>` as data-end, rather than `<newline>`
`-e` on interactive shells: use Bash\'s readline interface to read the data. Since version 5.1-alpha, this can also be used on specified file descriptors using `-u`
`-e` on interactive shells: use Bash's readline interface to read the data. Since version 5.1-alpha, this can also be used on specified file descriptors using `-u`
`-i <STRING>` preloads the input buffer with text from `<STRING>`, only works when Readline (`-e`) is used
`-n <NCHARS>` reads `<NCHARS>` characters of input, then quits
`-N <NCHARS>` reads `<NCHARS>` characters of input, *ignoring any delimiter*, then quits
@ -56,7 +56,7 @@ line is read). That means the timeout can occur during input, too.
When both `-a <ARRAY>` and a variable name `<NAME>` are given, the
array is set, but not the variable.
Of course it\'s valid to set individual array elements without using
Of course it's valid to set individual array elements without using
`-a`:
read MYARRAY[5]
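Both forms can be sketched like this (the array names are illustrative; note the quoting of the subscripted name to protect it from globbing):

```bash
#!/usr/bin/env bash
# -a: read the input word-wise into an array
read -r -a words <<< "one two three"
echo "${words[1]}"        # two

# setting a single element without -a
read -r 'arr[5]' <<< "hello"
echo "${arr[5]}"          # hello
```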
@ -99,7 +99,7 @@ array name and index:
Essentially all you need to know about `-r` is to **ALWAYS** use it. The
exact behavior you get without `-r` is completely useless even for weird
purposes. It basically allows the escaping of input which matches
something in IFS, and also escapes line continuations. It\'s explained
something in IFS, and also escapes line continuations. It's explained
pretty well in the [POSIX
read](http://pubs.opengroup.org/onlinepubs/9699919799/utilities/read.html#tag_20_109)
spec.
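What `-r` changes can be shown in two lines (our own sketch: the input is `a\ b`, where the backslash would otherwise escape the IFS space):

```bash
printf 'a\\ b\n' | { read x y; echo "[$x][$y]"; }      # [a b][]
printf 'a\\ b\n' | { read -r x y; echo "[$x][$y]"; }   # [a\][b]
```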
@ -141,7 +141,7 @@ some backslash-escapes or switches (like `-n`).
### Press any key\...
Remember the MSDOS `pause` command? Here\'s something similar:
Remember the MSDOS `pause` command? Here's something similar:
pause() {
local dummy
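The function body is cut off by the excerpt above; a complete version of the idea might look like this (the prompt text and the `-s`/`-n 1` combination are our guess at the elided body, not the article's exact code):

```bash
pause() {
    local dummy
    # -s: don't echo the key; -n 1: return after a single character
    read -s -r -p "Press any key to continue..." -n 1 dummy
    echo
}
```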


@ -77,7 +77,7 @@ maintainer refuses to change either the `shift` or `command` builtins.~~
[Fixed](https://github.com/MirBSD/mksh/commit/996e05548ab82f7ef2dea61f109cc7b6d13837fa).
(Thanks!)
- Perhaps almost as bad as the above, busybox sh\'s `shift` always
- Perhaps almost as bad as the above, busybox sh's `shift` always
returns success, even when attempting to shift beyond the final
argument. \<code\> \$ bb -c \'f() { if shift; then echo \"\$1\";
else echo \"no args\"; fi; }; f\'


@ -34,7 +34,7 @@ Special events
`EXIT` 0 executed on shell exit
`DEBUG` executed before every simple command
`RETURN` executed when a shell function or a sourced code finishes executing
`ERR` executed each time a command\'s failure would cause the shell to exit when the [`-e` option (`errexit`)](../../commands/builtin/set.md) is enabled
`ERR` executed each time a command's failure would cause the shell to exit when the [`-e` option (`errexit`)](../../commands/builtin/set.md) is enabled
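Two of the special events from the table can be sketched together (our own example; note the ERR trap fires on a failing command even without `set -e`):

```bash
#!/usr/bin/env bash
trap 'echo "exit trap: status $?"' EXIT
trap 'echo "a command failed"' ERR

false   # triggers the ERR trap (with errexit it would also end the script)
true    # the EXIT trap then reports status 0
```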
### Options


@ -54,7 +54,7 @@ property allows looking upwards through the stack as variable names are
unset, so long as unset and the local it unsets aren\'t together in the
same scope level.
Here\'s a demonstration of this behavior.
Here's a demonstration of this behavior.
#!/usr/bin/env bash
@ -121,8 +121,8 @@ output:
Some things to observe:
- `unset2` is only really needed once. We remain 5 levels deep in
`f`\'s for the remaining `unset` calls, which peel away the outer
layers of `a`\'s.
`f`'s for the remaining `unset` calls, which peel away the outer
layers of `a`'s.
- Notice that the \"a\" is unset using an ordinary unset command at
recursion depth 1, and subsequently calling unset reveals a again in
the global scope, which has since been modified in a lower scope
@ -145,7 +145,7 @@ expands its arguments.
~ $ ( a=({a..d}); unset 'a[2]'; declare -p a )
declare -a a='([0]="a" [1]="b" [3]="d")'
As usual in such cases, it\'s important to quote the args to avoid
As usual in such cases, it's important to quote the args to avoid
accidental results such as globbing.
~ $ ( a=({a..d}) b=a c=d d=1; set -x; unset "${b}["{2..3}-c\]; declare -p a )


@ -34,7 +34,7 @@ The return status is the return status of the job waited for, or
Status Reason
-------- -------------------------------------------------
0 waited for all jobs in shell\'s job list
0 waited for all jobs in shell's job list
1 the given `ID` is not a valid job or process ID
## Examples
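A minimal sketch of collecting a background job's status with `wait` (our own example; the final plain `wait` waits for every remaining job):

```bash
#!/usr/bin/env bash
sleep 0.1 &
pid=$!

wait "$pid"
echo "job $pid finished with status $?"

wait            # no ID: wait for all jobs in the shell's job list
```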


@ -244,7 +244,7 @@ Let's say, we want to check the following two things (AND):
1. if a string is null (empty)
2. if a command produced an output
Let\'s see:
Let's see:
```bash
if [ -z "false" -a -z "$(echo I am executed >&2)" ] ; then ...
```
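The pitfall here is that `-a` evaluates both operands, so the command substitution runs even though `-z "false"` is already false. Chaining two separate `test` commands short-circuits instead (our own counter-example):

```bash
# The second test never runs: nothing is printed to stderr.
if [ -z "false" ] && [ -z "$(echo I am executed >&2)" ]; then
    echo "both empty"
fi
```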


@ -2,7 +2,7 @@
In terms of UNIX(r), a directory is a special file which contains a list
of [hardlinks](../dict/terms/hardlink.md) to other files. These other files
also can be directories of course, so it\'s possible to create a
also can be directories of course, so it's possible to create a
\"hierarchy of directories\" - the UNIX(r)-typical filesystem structure.
The structure begins at the special directory `/` (root directory) and


@ -38,7 +38,7 @@ purposes, like reporting a termination by a signal:
1-255 failure (in general)
126 the requested command (file) can\'t be executed (but was found)
127 command (file) not found
128 according to ABS it\'s used to report an invalid argument to the exit builtin, but I wasn\'t able to verify that in the source code of Bash (see code 255)
128 according to ABS it's used to report an invalid argument to the exit builtin, but I wasn\'t able to verify that in the source code of Bash (see code 255)
128 + N the shell was terminated by the signal N (also used like this by various other programs)
255 wrong argument to the exit builtin (see code 128)
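The conventional codes from the table can be provoked like this (a sketch; `/etc/hosts` merely stands in for "a file that exists but isn't executable", and the command name is deliberately bogus):

```bash
#!/usr/bin/env bash
bash -c /etc/hosts 2>/dev/null            # found, but not executable
echo $?                                   # 126

bash -c no-such-command-here 2>/dev/null  # not found
echo $?                                   # 127

bash -c 'kill -TERM $$'                   # terminated by signal 15
echo $?                                   # 143 = 128 + 15
```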


@ -1,11 +1,11 @@
# File
A file is a pool of data in the [filesystem](../dict/terms/filesystem.md). On
userlevel, it\'s referenced using a name, a
userlevel, it's referenced using a name, a
[hardlink](../dict/terms/hardlink.md) to the file.
If a file is not referenced anymore (number of hardlinks to it drops to
0) then the space allocated for that file is re-used, unless it\'s still
0) then the space allocated for that file is re-used, unless it's still
used by some process.
The file-data splits into actual payload (file contents) and some


@ -19,5 +19,5 @@ written to (when `mtime` is updated).
## mtime
The mtime is set, whenever a file\'s contents are changed, for example
The mtime is set whenever a file's contents are changed, for example
by editing a file.


@ -18,7 +18,7 @@ really executes
The term \"glob\" originates back in the UNIX(r) days where an
executable `glob` (from \"global\") existed which was used to expand
pattern-matching characters. Later, this functionality was built into
the shell. There\'s still a library function called `glob()` (POSIX(r)),
the shell. There's still a library function called `glob()` (POSIX(r)),
which serves the same purpose.
## See also


@ -15,7 +15,7 @@ hardlink.
The difference between a [symbolic link](../dict/terms/symlink.md) and a hard
link is that there is no easy way to differentiate between a \'real\'
file and a hard link, let\'s take a look at the example:
file and a hard link, let's take a look at the example:
\* create an empty file


@ -16,7 +16,7 @@ A shebang will typically look like
#!/bin/bash
Since the line starting with `#` is a comment for the shell (and some
other scripting languages), it\'s ignored.
other scripting languages), it's ignored.
Regarding the shebang, there are various differences between operating
systems, including:
@ -26,7 +26,7 @@ systems, including:
- may be able to take arguments for the interpreter
- \...
POSIX(r) doesn\'t specify the shebang, though in general it\'s commonly
POSIX(r) doesn\'t specify the shebang, though in general it's commonly
supported by operating systems.
## See also


@ -20,7 +20,7 @@ A shell variable is a parameter denoted by a *variable name*:
- containing only alphanumeric characters and underscores
- beginning with an alphabetic character or an underscore
A value can be assigned to a variable, using the variable\'s name and an
A value can be assigned to a variable, using the variable's name and an
equal-sign:
NAME=VALUE
@ -37,7 +37,7 @@ The nullstring is a valid value:
A positional parameter is denoted by a number other than `0` (zero).
Positional parameters reflect the shell\'s arguments that are not given
Positional parameters reflect the shell's arguments that are not given
to the shell itself (in practice, the script arguments and also the
function arguments). You can\'t directly assign to the positional
parameters, however, [the set builtin command](../commands/builtin/set.md)
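The excerpt breaks off here, but the point about `set` can be sketched briefly (our own example):

```bash
#!/usr/bin/env bash
# Positional parameters can't be assigned directly ($1=foo is a syntax
# error), but set -- replaces them all at once.
set -- one two three
echo "$1 $2 $3"      # one two three
echo "$#"            # 3
shift
echo "$1"            # two
```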


@ -2,12 +2,12 @@
On UNIX(r), the shell is the main interaction tool between the
user-level and the system. That doesn\'t necessarily mean the user
always sits infront of a shell, but it\'s integral part of the system,
always sits in front of a shell, but it's an integral part of the system,
not only an \"optional commandline interpreter\".
The main job of a shell is to execute commands as a user requests them.
This behaviour alone doesn\'t help much. A shell knits some intelligence
and flow control around the possibility to execute commands - it\'s a
and flow control around the possibility to execute commands - it's a
complete commandline-oriented user-interface (UI).
FIXME


@ -1,7 +1,7 @@
# Special file
Unlike a regular file (a bunch of accessible data organized on a
filesystem), it\'s a special filename that points to a ressource or
filesystem), it's a special filename that points to a resource or
similar:
- character special files


@ -140,10 +140,10 @@ command `f`. The stack remains unchanged:
1
Note how the first element that will be popped from the stack is printed
first, if you are used to an HP calculator, it\'s the reverse.
first, if you are used to an HP calculator, it's the reverse.
Don\'t hesitate to put `f` in the examples of this tutorial, it doesn\'t
change the result, and it\'s a good way to see what\'s going on.
change the result, and it's a good way to see what's going on.
## Registers
@ -184,7 +184,7 @@ enclosed in `[]`. You can print it with `p`: `dc <<< '[Hello World!]p'`
and you can evaluate it with x: `dc <<< '[1 2+]xp'`.
This is not that interesting until combined with registers. First,
let\'s say we want to calculate the square of a number (don\'t forget to
let's say we want to calculate the square of a number (don\'t forget to
include `f` if you get lost!):
dc << EOF
@ -256,7 +256,7 @@ remove all those extra spaces newlines and comments:
dc <<< '[lip1-si0li>L]sL10silLx'
dc <<< '[p1-d0<L]sL10lLx' # use the stack instead of a register
I\'ll let you figure out the second example, it\'s not hard, it uses the
I\'ll let you figure out the second example, it's not hard, it uses the
stack instead of a register for the index.
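In case the second one-liner resists figuring out, here is one possible unpacking (our own breakdown, in shell comments; it counts down from 10 to 1):

```bash
#   [p 1- d 0<L]  macro: print the top, subtract 1, duplicate it, then
#                 run L again while the counter is still greater than 0
#   sL            store the macro in register L
#   10            push the start value
#   lLx           load L and execute it
dc <<< '[p1-d0<L]sL10lLx'     # prints 10 down to 1, one per line
```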
## Next


@ -5,7 +5,7 @@
## What is a \"Collapsing Function\"?
A collapsing function is a function whose behavior changes depending
upon the circumstances under which it\'s run. Function collapsing is
upon the circumstances under which it's run. Function collapsing is
useful when you find yourself repeatedly checking a variable whose value
never changes.
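The idea can be sketched like this (a minimal example of our own, with a hypothetical `VERBOSE` flag and function name): the variable is checked once, and the function then redefines itself so later calls skip the test entirely.

```bash
#!/usr/bin/env bash
chatter() {
    if [[ $VERBOSE ]]; then
        chatter() { echo "$@"; }    # from now on: always print
    else
        chatter() { :; }            # from now on: always a no-op
    fi
    chatter "$@"                    # handle this first call too
}

VERBOSE=1
chatter "hello"   # later calls never re-check VERBOSE
```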


@ -16,7 +16,7 @@ key=\"value\" format, otherwise bash will try to interpret commands:
echo "Config for the target host: $cool_host" >&2
So, where do these variables come from? If everything works fine, they
are defined in /etc/cool.cfg which is a file that\'s sourced into the
are defined in /etc/cool.cfg which is a file that's sourced into the
current script or shell. Note: this is **not** the same as executing
this file as a script! The sourced file most likely contains something
like:
@ -40,8 +40,8 @@ usage of the dot is identical:
## Per-user configs
There\'s also a way to provide a system-wide config file in /etc and a
custom config in \~/(user\'s home) to override system-wide defaults. In
There's also a way to provide a system-wide config file in /etc and a
custom config in \~/(user's home) to override system-wide defaults. In
the following example, the if/then construct is used to check for the
existence of a user-specific config:
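The override pattern might be sketched like this (the system-wide path follows the article; the per-user file name `~/.coolrc` is our own invention, and each source is guarded so a missing file isn't fatal):

```bash
#!/usr/bin/env bash
system_cfg=/etc/cool.cfg
user_cfg=~/.coolrc          # hypothetical per-user file name

[ -r "$system_cfg" ] && . "$system_cfg"   # system-wide defaults
[ -r "$user_cfg" ]   && . "$user_cfg"     # per-user overrides win
```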


@ -5,7 +5,7 @@ $ ls *.zip | while read i; do j=`echo $i | sed 's/.zip//g'`; mkdir $j; cd $j; un
```
This is an actual one-liner someone asked about in `#bash`. **There are
several things wrong with it. Let\'s break it down!**
several things wrong with it. Let's break it down!**
``` bash
$ ls *.zip | while read i; do ...; done
@ -21,7 +21,7 @@ But in sh and bash alike, we can loop safely over the glob itself:
$ for i in *.zip; do j=`echo $i | sed 's/.zip//g'`; mkdir $j; cd $j; unzip ../$i; cd ..; done
```
Let\'s break it down some more!
Let's break it down some more!
``` bash
j=`echo $i | sed 's/.zip//g'` # where $i is some name ending in '.zip'
@ -30,14 +30,14 @@ j=`echo $i | sed 's/.zip//g'` # where $i is some name ending in '.zip'
The goal here seems to be get the filename without its `.zip` extension.
In fact, there is a POSIX(r)-compliant command to do this: `basename`
The implementation here is suboptimal in several ways, but the only
thing that\'s genuinely error-prone with this is \"`echo $i`\". Echoing
thing that's genuinely error-prone with this is \"`echo $i`\". Echoing
an *unquoted* variable means
[wordsplitting](../syntax/expansion/wordsplit.md) will take place, so any
whitespace in `$i` will essentially be normalized. In `sh` it is
necessary to use an external command and a subshell to achieve the goal,
but we can eliminate the pipe (subshells, external commands, and pipes
carry extra overhead when they launch, so they can really hurt
performance in a loop). Just for good measure, let\'s use the more
performance in a loop). Just for good measure, let's use the more
readable, [modern](../syntax/expansion/cmdsubst.md) `$()` construct instead
of the old style backticks:
@ -53,7 +53,7 @@ expansion](../syntax/pe.md#substring_removal):
bash $ for i in *.zip; do j="${i%.zip}"; mkdir $j; cd $j; unzip ../$i; cd ..; done
```
Let\'s keep going:
Let's keep going:
``` bash
$ mkdir $j; cd $j; ...; cd ..
@ -64,7 +64,7 @@ program will run. Even if you do, the following best practice will never
hurt: When a following command depends on the success of a previous
command(s), check for success! You can do this with the \"`&&`\"
conjunction, that way, if the previous command fails, bash will not try
to execute the following command(s). It\'s fully POSIX(r). Oh, and
to execute the following command(s). It's fully POSIX(r). Oh, and
remember what I said about [wordsplitting](../syntax/expansion/wordsplit.md)
in the previous step? Well, if you don\'t quote `$j`, wordsplitting can
happen again.
@ -73,10 +73,10 @@ happen again.
$ mkdir "$j" && cd "$j" && ... && cd ..
```
That\'s almost right, but there\'s one problem \-- what happens if `$j`
That's almost right, but there's one problem \-- what happens if `$j`
contains a slash? Then `cd ..` will not return to the original
directory. That\'s wrong! `cd -` causes cd to return to the previous
working directory, so it\'s a much better choice:
directory. That's wrong! `cd -` causes cd to return to the previous
working directory, so it's a much better choice:
``` bash
$ mkdir "$j" && cd "$j" && ... && cd -
@ -84,7 +84,7 @@ $ mkdir "$j" && cd "$j" && ... && cd -
(If it occurred to you that I forgot to check for success after cd -,
good job! You could do this with `{ cd - || break; }`, but I\'m going to
leave that out because it\'s verbose and I think it\'s likely that we
leave that out because it's verbose and I think it's likely that we
will be able to get back to our original working directory without a
problem.)
@ -98,13 +98,13 @@ sh $ for i in *.zip; do j=$(basename "$i" ".zip"); mkdir "$j" && cd "$j" && unzi
bash $ for i in *.zip; do j="${i%.zip}"; mkdir "$j" && cd "$j" && unzip ../$i && cd -; done
```
Let\'s throw the `unzip` command back in the mix:
Let's throw the `unzip` command back in the mix:
``` bash
mkdir "$j" && cd "$j" && unzip ../$i && cd -
```
Well, besides word splitting, there\'s nothing terribly wrong with this.
Well, besides word splitting, there's nothing terribly wrong with this.
Still, did it occur to you that unzip might already be able to target a
directory? There isn\'t a standard for the `unzip` command, but all the
implementations I\'ve seen can do it with the -d flag. So we can drop
@ -122,4 +122,4 @@ sh $ for i in *.zip; do j=$(basename "$i" ".zip"); mkdir "$j" && unzip -d "$j" "
bash $ for i in *.zip; do j="${i%.zip}"; mkdir "$j" && unzip -d "$j" "$i"; done
```
There! That\'s as good as it gets.
There! That's as good as it gets.
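For reference, here is a small sketch of the `${i%.zip}` expansion used above, next to the `basename` call it replaces (the filename is made up):

``` bash
i='./archive one.zip'
j=${i%.zip}                     # strips the .zip suffix; the path part stays
echo "$j"                       # ./archive one
echo "$(basename "$i" .zip)"    # archive one -- basename drops the path too
```

That difference is exactly why the slash-in-`$j` problem discussed above could arise with the parameter expansion but not with `basename`.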

View File

@ -20,7 +20,7 @@ Why `ed`?
- last but not least: standard `ed` has very good editing and
addressing possibilities, compared to standard `sed`
Don\'t get me wrong, this is **not** meant as anti-`sed` article! It\'s
Don\'t get me wrong, this is **not** meant as anti-`sed` article! It's
just meant to show you another way to do the job.
## Commanding ed
@ -127,7 +127,7 @@ which defines an address range for the first to the last line, `,p` thus
means print the whole file, after it has been modified. When your script
runs successfully, you only have to replace the `,p` by a `w`.
Of course, even if the file is not modified by the `p` command, **it\'s
Of course, even if the file is not modified by the `p` command, **it's
always a good idea to have a backup copy!**
## Editing your files
@ -139,7 +139,7 @@ that can\'t be done in `sed` or can only be done with very complex code.
Like `sed`, `ed` also knows the common `s/FROM/TO/` command, and it can
also take line-addresses. **If no substitution is made on the addressed
lines, it\'s considered an error.**
lines, it's considered an error.**
#### Substitutions through the whole file
@ -219,7 +219,7 @@ prints the result to stdout - `,p`):
ed -s FILE1 <<< $'$-1 r FILE2\n,p'
To compare, here\'s a possible `sed` solution which must use Bash
To compare, here's a possible `sed` solution which must use Bash
arithmetic and the external program `wc`:
sed "$(($(wc -l < FILE1)-1))r FILE2" FILE1
@ -260,10 +260,10 @@ about it with the g (global) command:
**\_\_ an error stops the script \_\_**
You might think that it\'s not a problem and that the same thing happens
You might think that it's not a problem and that the same thing happens
with sed and you\'re right, with the exception that if ed does not find
a pattern it\'s an error, while sed just continues with the next line.
For instance, let\'s say that you want to change foo to bar on the first
a pattern it's an error, while sed just continues with the next line.
For instance, let's say that you want to change foo to bar on the first
line of the file and add something after the next line, ed will stop if
it cannot find foo on the first line, sed will continue.
@ -289,7 +289,7 @@ attempt the substitution on all non blank lines
**\_\_ shell parameters are expanded \_\_**
If you don\'t quote the delimiter, \$ has a special meaning. This sounds
obvious but it\'s easy to forget this fact when you use addresses like
obvious but it's easy to forget this fact when you use addresses like
\$-1 or commands like \$a. Either quote the \$ or the delimiter:
#fails
@ -355,7 +355,7 @@ number of lines of the file:
### cat
Yea, it\'s a joke\...
Yea, it's a joke\...
ed -s file <<< $',p'

View File

@ -8,14 +8,14 @@
(`--myoption`) nor XF86-style long options (`-myoption`). So, when you
want to parse command line arguments in a professional ;-) way,
`getopts` may or may not work for you. Unlike its older brother `getopt`
(note the missing *s*!), it\'s a shell builtin command. The advantages
(note the missing *s*!), it's a shell builtin command. The advantages
are:
- No need to pass the positional parameters through to an external
program.
- Being a builtin, `getopts` can set shell variables to use for
parsing (impossible for an *external* process!)
- There\'s no need to argue with several `getopt` implementations
- There's no need to argue with several `getopt` implementations
which had buggy concepts in the past (whitespace, \...)
- `getopts` is defined in POSIX(r).
@ -27,7 +27,7 @@ parameters](../scripting/posparams.md).
### Terminology
It\'s useful to know what we\'re talking about here, so let\'s see\...
It's useful to know what we\'re talking about here, so let's see\...
Consider the following command line:
mybackup -x -f /etc/mybackup.conf -r ./foo.txt ./bar.txt
@ -43,7 +43,7 @@ several logical groups:
itself, but that isn\'t mandatory. Joining the option and option
argument into a single argument `-f/etc/mybackup.conf` is valid.
- `-r` depends on the configuration. In this example, `-r` doesn\'t
take arguments so it\'s a standalone option like `-x`.
take arguments so it's a standalone option like `-x`.
- `./foo.txt` and `./bar.txt` are remaining arguments without any
associated options. These are often used as **mass-arguments**. For
example, the filenames specified for `cp(1)`, or arguments that
@ -58,7 +58,7 @@ line is equivalent to:
which is complex to parse without the help of `getopts`.
The option flags can be **upper- and lowercase** characters, or
**digits**. It may recognize other characters, but that\'s not
**digits**. It may recognize other characters, but that's not
recommended (usability and maybe problems with special characters).
### How it works
@ -71,8 +71,8 @@ parameters. If you want to shift them, it must be done manually:
shift $((OPTIND-1))
# now do something with $@
Since `getopts` sets an exit status of *FALSE* when there\'s nothing
left to parse, it\'s easy to use in a while-loop:
Since `getopts` sets an exit status of *FALSE* when there's nothing
left to parse, it's easy to use in a while-loop:
while getopts ...; do
...
@ -124,7 +124,7 @@ an argument (i.e. to become `-A SOMETHING`) just do:
getopts fA:x VARNAME
If the **very first character** of the option-string is a `:` (colon),
which would normally be nonsense because there\'s no option letter
which would normally be nonsense because there's no option letter
preceding it, `getopts` switches to \"**silent error reporting mode**\".
In productive scripts, this is usually what you want because it allows
you to handle errors yourself without being disturbed by annoying
@ -161,7 +161,7 @@ Regarding error-reporting, there are two modes `getopts` can run in:
For productive scripts I recommend using the silent mode, since
everything looks more professional when you don\'t see annoying
standard messages. Also it\'s easier to handle, since the failure cases
standard messages. Also it's easier to handle, since the failure cases
are indicated in an easier way.
#### Verbose Mode
@ -182,7 +182,7 @@ are indicated in an easier way.
Enough said - action!
Let\'s play with a very simple case: only one option (`-a`) expected,
Let's play with a very simple case: only one option (`-a`) expected,
without any arguments. Also we disable the *verbose error handling* by
preceding the whole option string with a colon (`:`):
@ -204,7 +204,7 @@ done
I put that into a file named `go_test.sh`, which is the name you\'ll see
below in the examples.
Let\'s do some tests:
Let's do some tests:
#### Calling it without any arguments
@ -228,7 +228,7 @@ The arguments given to your script are of course accessible as `$1` -
#### Calling it with option-arguments
Now let\'s trigger `getopts`: Provide options.
Now let's trigger `getopts`: Provide options.
First, an **invalid** one:
@ -249,7 +249,7 @@ Now, a **valid** one (`-a`):
You see, the detection works perfectly. The `a` was put into the
variable `$opt` for our case statement.
Of course it\'s possible to **mix valid and invalid** options when
Of course it's possible to **mix valid and invalid** options when
calling:
$ ./go_test.sh -a -x -b -c
@ -259,7 +259,7 @@ calling:
Invalid option: -c
$
Finally, it\'s of course possible, to give our option **multiple
Finally, it's of course possible, to give our option **multiple
times**:
$ ./go_test.sh -a -a -a -a
@ -278,7 +278,7 @@ The last examples lead us to some points you may consider:
### An option with argument
Let\'s extend our example from above. Just a little bit:
Let's extend our example from above. Just a little bit:
- `-a` now takes an argument
- on an error, the parsing exits with `exit 1`
@ -303,7 +303,7 @@ while getopts ":a:" opt; do
done
```
Let\'s do the very same tests we did in the last example:
Let's do the very same tests we did in the last example:
#### Calling it without any arguments
@ -338,7 +338,7 @@ like programmed.
The option was okay, but there is an argument missing.
Let\'s provide **the argument**:
Let's provide **the argument**:
$ ./go_test.sh -a /etc/passwd
-a was triggered, Parameter: /etc/passwd
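Putting the pieces above together, here is a hedged sketch of a complete silent-mode parser: `-x` as a plain flag and `-f` taking an argument (the option names are illustrative, not from the examples above):

``` bash
#!/bin/bash
# Sketch: -x is a plain flag, -f FILE takes an argument; the leading
# colon in ":xf:" selects silent error reporting.
parse() {
  local opt xflag=0 file="" OPTIND=1   # local OPTIND: reusable function
  while getopts ":xf:" opt "$@"; do
    case $opt in
      x)  xflag=1 ;;
      f)  file=$OPTARG ;;
      \?) echo "Invalid option: -$OPTARG" >&2; return 1 ;;
      :)  echo "Option -$OPTARG requires an argument" >&2; return 1 ;;
    esac
  done
  shift $((OPTIND-1))
  echo "xflag=$xflag file=$file rest=$*"
}

parse -x -f /etc/mybackup.conf ./foo.txt ./bar.txt
```

Wrapping the loop in a function is optional; it just keeps `OPTIND` handling in one place.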

View File

@ -4,14 +4,14 @@
## Why lock?
Sometimes there\'s a need to ensure only one copy of a script runs, i.e.
Sometimes there's a need to ensure only one copy of a script runs, i.e.
prevent two or more copies running simultaneously. Imagine an important
cronjob doing something very important, which will fail or corrupt data
if two copies of the called program were to run at the same time. To
prevent this, a form of `MUTEX` (**mutual exclusion**) lock is needed.
The basic procedure is simple: The script checks if a specific condition
(locking) is present at startup, if yes, it\'s locked - the script
(locking) is present at startup, if yes, it's locked - the script
doesn\'t start.
This article describes locking with common UNIX(r) tools. There are
@ -44,8 +44,8 @@ in the filesystem that can be used as locking indicator:
To create a file or set a file timestamp, usually the command touch is
used. The following problem is implied: A locking mechanism checks for
the existence of the lockfile, if no lockfile exists, it creates one and
continues. Those are **two separate steps**! That means it\'s **not an
atomic operation**. There\'s a small amount of time between checking and
continues. Those are **two separate steps**! That means it's **not an
atomic operation**. There's a small amount of time between checking and
creating, where another instance of the same script could perform
locking (because when it checked, the lockfile wasn\'t there)! In that
case you would have 2 instances of the script running, both thinking
@ -87,13 +87,13 @@ trapped. I am sure there is a better solution than this
suggestion* \-\-- *[sn18](sunny_delhi18@yahoo.com) 2009/12/19 08:24*
**Note:** While perusing the Internet, I found some people asking if the
`mkdir` method works \"on all filesystems\". Well, let\'s say it should.
`mkdir` method works \"on all filesystems\". Well, let's say it should.
The syscall under `mkdir` is guaranteed to work atomically in all cases,
at least on Unices. Two examples of problems are NFS filesystems and
filesystems on cluster servers. With those two scenarios, dependencies
exist related to the mount options and implementation. However, I
successfully use this simple method on an Oracle OCFS2 filesystem in a
4-node cluster environment. So let\'s just say \"it should work under
4-node cluster environment. So let's just say \"it should work under
normal conditions\".
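A minimal sketch of that `mkdir` approach (the lock path is an assumption; use a directory your script owns):

``` bash
#!/bin/bash
lockdir=${TMPDIR:-/tmp}/myscript_demo.lock

if mkdir "$lockdir" 2>/dev/null; then
  trap 'rmdir "$lockdir"' EXIT    # release the lock however we exit
  echo "lock acquired"
  # ... the critical, single-instance work goes here ...
else
  echo "another instance holds the lock" >&2
  exit 1
fi
```

The check and the creation are the *same* `mkdir` call, which is what makes this atomic, unlike the touch-based variant above.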
Another atomic method is setting the `noclobber` shell option

View File

@ -12,7 +12,7 @@ the best features of `tar` and `cpio`, able to handle all common archive
types.
However, this is **not a manpage**, it will **not** list all possible
options, it will **not** give you detailed information about `pax`. It\'s
options, it will **not** give you detailed information about `pax`. It's
only an introduction.
This article is based on the debianized Berkeley implementation of
@ -75,7 +75,7 @@ When you don\'t specify anything special, `pax` will attempt to read
archive data from standard input (read/list modes) and write archive
data to standard output (write mode). This ensures `pax` can be easily
used as part of a shell pipe construct, e.g. to read a compressed
archive that\'s decompressed in the pipe.
archive that's decompressed in the pipe.
The option to specify the pathname of a file to be archived is `-f` This
file will be used as input or output, depending on the operation
@ -84,7 +84,7 @@ file will be used as input or output, depending on the operation
When pax reads an archive, it tries to guess the archive type. However,
in *write* mode, you must specify which type of archive to append using
the `-x <TYPE>` switch. If you omit this switch, a default archive will
be created (POSIX says it\'s implementation defined, Berkeley `pax`
be created (POSIX says it's implementation defined, Berkeley `pax`
creates `ustar` if no options are specified).
The following archive formats are supported (Berkeley implementation):
@ -115,7 +115,7 @@ files to list or extract.
patterns
- if no patterns are given, `pax` will \"match\" (list or extract) all
files from the archive
- **To avoid conflicts with shell pathname expansion, it\'s wise to
- **To avoid conflicts with shell pathname expansion, it's wise to
quote patterns!**
#### Some assorted examples of patterns
@ -269,7 +269,7 @@ happens in the order they are specified.
### Excluding files from an archive
The -s command seen above can be used to exclude a file. The
substitution must result in a null string: For example, let\'s say that
substitution must result in a null string: For example, let's say that
you want to exclude all the CVS directories to create a source code
archive. We are going to replace the names containing /CVS/ with
nothing. Note the .\*: they are needed because we need to match the
@ -277,7 +277,7 @@ entire pathname.
pax -w -x ustar -f release.tar -s',.*/CVS/.*,,' myapplication
You can use several -s options, for instance, let\'s say you also want
You can use several -s options, for instance, let's say you also want
to remove files ending in \~:
    pax -w -x ustar -f release.tar -s',.*/CVS/.*,,' -s'/.*~//' myapplication

View File

@ -201,7 +201,7 @@ So you got a copy of this descriptor:
--- +-----------------------+
Internally each of these is represented by a file descriptor opened by
the operating system\'s `open` calls, and is likely just a pointer to
the operating system's `open` calls, and is likely just a pointer to
the file which has been opened for reading (`stdin` or file descriptor
`0`) or writing (`stdout` /`stderr`).
@ -214,8 +214,8 @@ Similarly for output file descriptors, writing a line to file descriptor
descriptor `t`.
\<note tip\>The syntax is somewhat confusing in that you would think
that the arrow would point in the direction of the copy, but it\'s
reversed. So it\'s `target>&source` effectively.\</note\>
that the arrow would point in the direction of the copy, but it's
reversed. So it's `target>&source` effectively.\</note\>
So, as a simple example (albeit slightly contrived), is the following:
@ -233,7 +233,7 @@ line, their order does matter. They are set up from left to right.
- `2>&1 >file`
A common error, is to do `command 2>&1 > file` to redirect both `stderr`
and `stdout` to `file`. Let\'s see what\'s going on. First we type the
and `stdout` to `file`. Let's see what's going on. First we type the
command in our terminal, the descriptors look like this:
--- +-----------------------+
@ -263,7 +263,7 @@ descriptor look like this:
standard error ( 2 ) ---->| /dev/pts/5 |
--- +-----------------------+
That\'s right, nothing has changed, 2 was already pointing to the same
That's right, nothing has changed, 2 was already pointing to the same
place as 1. Now Bash sees `> file` and thus changes `stdout`:
--- +-----------------------+
@ -278,11 +278,11 @@ place as 1. Now Bash sees `> file` and thus changes `stdout`:
standard error ( 2 ) ---->| /dev/pts/5 |
--- +-----------------------+
And that\'s not what we want.
And that's not what we want.
- `>file 2>&1`
Now let\'s look at the correct `command >file 2>&1`. We start as in the
Now let's look at the correct `command >file 2>&1`. We start as in the
previous example, and Bash sees `> file`:
--- +-----------------------+
@ -313,7 +313,7 @@ Then it sees our duplication `2>&1`:
And voila, both `1` and `2` are redirected to file.
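Both orders can be verified in one short sketch (the file paths are illustrative):

``` bash
#!/bin/bash
f() { echo out; echo err >&2; }

# Wrong order: 2 is duplicated while 1 still points at the terminal,
# so "err" stays on the terminal and only "out" lands in the file.
f 2>&1 >/tmp/demo_wrong.log

# Right order: 1 is redirected to the file first, then 2 copies it,
# so both lines land in the file.
f >/tmp/demo_right.log 2>&1
```

Inspecting the two log files afterwards shows the difference: `demo_wrong.log` holds only `out`, while `demo_right.log` holds both lines.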
## Why sed \'s/foo/bar/\' file \>file Doesn\'t Work
## Why sed \'s/foo/bar/\' file \>file Doesn\'t Work
This is a common error, we want to modify a file using something that
reads from a file and writes the result to `stdout`. To do this, we
@ -356,7 +356,7 @@ script and ran `myscript 2>file`.
commands in your script produce, just add `exec 2>myscript.errors` at
the beginning of your script.
Let\'s see another use case. We want to read a file line by line, this
Let's see another use case. We want to read a file line by line, this
is easy, we just do:
while read -r line;do echo "$line";done < file
@ -386,7 +386,7 @@ and our read inherits these descriptors, and our command
and not from our terminal.
A quick look at `help read` tells us that we can specify a file
descriptor from which `read` should read. Cool. Now let\'s use `exec` to
descriptor from which `read` should read. Cool. Now let's use `exec` to
get another descriptor:
exec 3<file
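A self-contained sketch of reading through such a descriptor (the file name is made up):

``` bash
#!/bin/bash
printf '%s\n' alpha beta gamma > /tmp/fd3_demo.txt

exec 3< /tmp/fd3_demo.txt    # open the file on descriptor 3

read -r first <&3            # fd 0 (our terminal) is left alone
read -r second <&3
echo "first=$first second=$second"

exec 3<&-                    # close descriptor 3 again
```

Because `read` consumes from fd `3`, any command inside the loop body can still read from the terminal on fd `0`.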
@ -415,7 +415,7 @@ and it works.
## Closing The File Descriptors
Closing a file through a file descriptor is easy, just make it a
duplicate of -. For instance, let\'s close `stdin <&-` and
duplicate of -. For instance, let's close `stdin <&-` and
`stderr 2>&-`:
bash -c '{ lsof -a -p $$ -d0,1,2 ;} <&- 2>&-'
@ -427,7 +427,7 @@ we see that inside the `{}` that only `1` is still here.
Though the OS will probably clean up the mess, it is perhaps a good idea
to close the file descriptors you open. For instance, if you open a file
descriptor with `exec 3>file`, all the commands afterwards will inherit
it. It\'s probably better to do something like:
it. It's probably better to do something like:
exec 3>file
.....
@ -462,7 +462,7 @@ on the comp.unix.shell group:
The redirections are processed from left to right, but as the file
descriptors are inherited we will also have to work from the outer to
the inner contexts. We will assume that we run this command in a
terminal. Let\'s start with the outer `{ } 3>&2 4>&1`.
terminal. Let's start with the outer `{ } 3>&2 4>&1`.
--- +-------------+ --- +-------------+
( 0 ) ---->| /dev/pts/5 | ( 3 ) ---->| /dev/pts/5 |
@ -482,7 +482,7 @@ and thus `1` and `2` go to the terminal. As an exercise, you can start
with `1` pointing to `file.stdout` and 2 pointing to `file.stderr`, you
will see why these redirections are very nice.
Let\'s continue with the right part of the second pipe:
Let's continue with the right part of the second pipe:
`| cmd3 3>&- 4>&-`
--- +-------------+
@ -516,7 +516,7 @@ pipe for reading. Now for the left part of the second pipe
First, The file descriptor `1` is connected to the pipe (`|`), then `2`
is made a copy of `1` and thus is made an fd to the pipe (`2>&1`), then
`1` is made a copy of `4` (`>&4`), then `4` is closed. These are the
file descriptors of the inner `{}`. Let\'s go inside and have a look at
file descriptors of the inner `{}`. Let's go inside and have a look at
the right part of the first pipe: `| cmd2 2>&3 3>&-`
--- +-------------+
@ -599,7 +599,7 @@ help you, a redirection is always like the following:
- `lhs` is always a file descriptor, i.e., a number:
- Either we want to open, duplicate, move or we want to close. If
the op is `<` then there is an implicit 0, if it\'s `>` or `>>`,
the op is `<` then there is an implicit 0, if it's `>` or `>>`,
there is an implicit 1.
```{=html}

View File

@ -252,7 +252,7 @@ the only possible completions. This option is enabled by default.
If set, range expressions used in pattern matching behave as if in the
traditional C locale when performing comparisons. That is, the current
locale\'s collating sequence is not taken into account, so b will not
locale's collating sequence is not taken into account, so b will not
collate between A and B, and upper-case and lower-case ASCII characters
will collate together.

View File

@ -15,35 +15,35 @@ And yes, these bashphorisms reflect the daily reality in `#bash`.
Number Bashphorism
-------- ----------------------------------------------------------------------------------------------------------------------------------------------------------------
0 The questioner will never tell you what they are really doing the first time they ask.
1 The questioner\'s first description of the problem/question will be misleading.
1 The questioner's first description of the problem/question will be misleading.
2 The questioner will keep changing the question until it drives the helpers in the channel insane.
3 Offtopicness will continue until someone asks a bash question that falls under bashphorisms 1 and/or 2, and `greycat` gets pissed off.
4 The questioner will not read and apply the answers he is given but will instead continue to practice bashphorism #1 and bashphorism #2.
5 The ignorant will continually mis-educate the other noobies.
6 When given a choice of solutions, the newbie will always choose the wrong one.
7 The newbie will always find a reason to say, \"It doesn\'t work.\"
8 If you don\'t know to whom the bashphorism\'s referring, it\'s you.
8 If you don\'t know to whom the bashphorism's referring, it's you.
9 All examples given by the questioner will be broken, misleading, wrong, and not representative of the actual question.
10 See B1
11 Please apply `(( % 10 ))` to the bashphorism value.
12 All logic is deniable; however, some logic will \*plonk\* you if you deny it.
13 Everyone ignores greycat when he is right. When he is wrong, it is !b1
14 The newbie doesn\'t actually know what he\'s asking. If he did, he wouldn\'t need to ask.
14 The newbie doesn\'t actually know what he's asking. If he did, he wouldn\'t need to ask.
15 The more advanced you are, the more likely you are to be overcomplicating it.
16 The more beginner you are, the more likely you are to be overcomplicating it.
17 A newbie comes to #bash to get his script confirmed. He leaves disappointed.
18 The newbie will not accept the answer you give, no matter how right it is.
19 The newbie is a bloody loon.
20 The newbie will always have some excuse for doing it wrong.
21 When the newbie\'s question is ambiguous, the proper interpretation will be whichever one makes the problem the hardest to solve.
22 The newcomer will abuse the bot\'s factoid triggers for their own entertainment until someone gets annoyed enough to ask them to message it privately instead.
21 When the newbie's question is ambiguous, the proper interpretation will be whichever one makes the problem the hardest to solve.
22 The newcomer will abuse the bot's factoid triggers for their own entertainment until someone gets annoyed enough to ask them to message it privately instead.
23 Everyone is a newcomer.
24 The newcomer will address greybot as if it were human.
25 The newbie won\'t accept any answer that uses practical or standard tools.
26 The newbie will not TELL you about this restriction until you have wasted half an hour.
27 The newbie will lie.
28 When the full horror of the newbie\'s true goal is revealed, the newbie will try to restate the goal to trick you into answering. Newbies are stupid.
29 It\'s always git. Or python virtualenv. Or docker. One of those pieces of shit. ALWAYS.
28 When the full horror of the newbie's true goal is revealed, the newbie will try to restate the goal to trick you into answering. Newbies are stupid.
29 It's always git. Or python virtualenv. Or docker. One of those pieces of shit. ALWAYS.
30 They won\'t show you the homework assignment. That would make it too easy.
31 Your teacher is a f\*\*king idiot.
32 The more horrifyingly wrong a proposed solution is, the more likely it will be used.

View File

@ -1,4 +1,4 @@
# Bash\'s behaviour
# Bash's behaviour
![](keywords>bash shell scripting startup files dotfiles modes POSIX)
@ -8,15 +8,15 @@ FIXME incomplete
### Login shell
As a \"login shell\", Bash reads and sets (executes) the user\'s profile
As a \"login shell\", Bash reads and sets (executes) the user's profile
from `/etc/profile` and one of `~/.bash_profile`, `~/.bash_login`, or
`~/.profile` (in that order, using the first one that\'s readable!).
`~/.profile` (in that order, using the first one that's readable!).
When a login shell exits, Bash reads and executes commands from the file
`~/.bash_logout`, if it exists.
Why an extra login shell mode? There are many actions and variable sets
that only make sense for the initial user login. That\'s why all UNIX(r)
that only make sense for the initial user login. That's why all UNIX(r)
shells have (should have) a \"login\" mode.
[**Methods to start Bash as a login shell:**]{.underline}
@ -44,7 +44,7 @@ they\'re not inherited from the parent shell.
The feature to have a system-wide `/etc/bash.bashrc` or a similar
system-wide rc-file is specific to vendors and distributors that ship
*their own, patched variant of Bash*. The classic way to have a
system-wide rc file is to `source /etc/bashrc` from every user\'s
system-wide rc file is to `source /etc/bashrc` from every user's
`~/.bashrc`.
[**Methods to test for interactive-shell mode:**]{.underline}
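As a sketch, two common checks (both are standard Bash; run from a non-interactive, non-login script they take the `else` branches):

``` bash
# $- contains "i" in interactive shells
if [[ $- == *i* ]]; then
  echo "interactive shell"
else
  echo "non-interactive shell"
fi

# the (read-only) login_shell option is set in login shells
if shopt -q login_shell; then
  echo "login shell"
else
  echo "not a login shell"
fi
```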
@ -64,9 +64,9 @@ system-wide rc file is to `source /etc/bashrc` from every user\'s
When Bash starts in SH compatibility mode, it tries to mimic the startup
behaviour of historical versions of `sh` as closely as possible, while
conforming to the POSIX(r) standard as well. The profile files read are
`/etc/profile` and `~/.profile`, if it\'s a login shell.
`/etc/profile` and `~/.profile`, if it's a login shell.
If it\'s not a login shell, the environment variable
If it's not a login shell, the environment variable
[ENV](../syntax/shellvars.md#ENV) is evaluated and the resulting filename is
used as the name of the startup file.
@ -117,7 +117,7 @@ read any startup files in POSIX(r) mode.
### POSIX run mode
In POSIX(r) mode, Bash follows the POSIX(r) standard regarding behaviour
and parsing (excerpt from a Bash maintainer\'s document):
and parsing (excerpt from a Bash maintainer's document):
Starting Bash with the `--posix' command-line option or executing `set
-o posix' while Bash is running will cause Bash to conform more closely
@ -311,7 +311,7 @@ FIXME help me to find out what breaks in POSIX(r) mode!
### Restricted shell
In restricted mode, Bash sets up (and runs) a shell environment that\'s
In restricted mode, Bash sets up (and runs) a shell environment that's
far more controlled and limited than the standard shell mode. It acts
like normal Bash with the following restrictions:

View File

@ -163,7 +163,7 @@ For this topic, see also
  `read`      `read` checks the first variable argument for validity before trying to read input                                                                       4.3-beta
`help` attempts substring matching (as it did through bash-4.2) if exact string matching fails 4.3-beta2
`fc` interprets option `-0` (zero) as the current command line 4.3-beta2
`cd` new option `-@` to browse a file\'s extended attributes (on systems that support `O_XATTR`) 4.3-rc1
`cd` new option `-@` to browse a file's extended attributes (on systems that support `O_XATTR`) 4.3-rc1
  `kill`      new option `-L` (upper case ell) to list signals like the normal lowercase option `-l` (compatibility with some standalone `kill` commands)               4.4-beta
`mapfile` new option `-d` 4.4-alpha
`wait` new option `-f` 5.0-alpha

View File

@ -49,7 +49,7 @@ following arguments.
/home/bash/bin/test testword hello
The same way, with `#!/bin/bash` the shell \"`/bin/bash`\" is called
with the script filename as an argument. It\'s the same as executing
with the script filename as an argument. It's the same as executing
\"`/bin/bash /home/bash/bin/test testword hello`\"
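That argument passing can be observed with a tiny sketch (the path is illustrative):

``` bash
cat > /tmp/shebang_demo.sh <<'EOF'
#!/bin/bash
echo "script=$0 args=$*"
EOF
chmod +x /tmp/shebang_demo.sh
/tmp/shebang_demo.sh testword hello
# script=/tmp/shebang_demo.sh args=testword hello
```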
If the interpreter can be specified with arguments and how long it can
@ -68,7 +68,7 @@ nothing and it fails, check the shebang. Older Bash versions will
respond with a \"`no such file or directory`\" error for a nonexistent
interpreter specified by the shebang. \</WRAP\>
**Additional note:** When you specify `#!/bin/sh` as shebang and that\'s
**Additional note:** When you specify `#!/bin/sh` as shebang and that's
a link to a Bash, then Bash will run in POSIX(r) mode! See:
- [Bash behaviour](../scripting/bashbehaviour.md).
@ -84,7 +84,7 @@ A common method is to specify a shebang like
Which one you need, or whether you think which one is good, or bad, is
up to you. There is no bulletproof portable way to specify an
interpreter. **It\'s a common misconception that it solves all problems.
interpreter. **It's a common misconception that it solves all problems.
Period.**
## The standard filedescriptors
@ -100,7 +100,7 @@ Usually, they\'re all connected to your terminal, stdin as input file
(keyboard), stdout and stderr as output files (screen). When calling
such a program, the invoking shell can change these filedescriptor
connections away from the terminal to any other file (see redirection).
Why two different output filedescriptors? It\'s convention to send error
Why two different output filedescriptors? It's convention to send error
messages and warnings to stderr and only program output to stdout. This
enables the user to decide if they want to see nothing, only the data,
only the errors, or both - and where they want to see them.
@ -117,7 +117,7 @@ redirection and piping, see:
## Variable names
It\'s good practice to use lowercase names for your variables, as shell
It's good practice to use lowercase names for your variables, as shell
and system-variable names are usually all in UPPERCASE. However, you
should avoid naming your variables any of the following (incomplete
list!):
@ -148,7 +148,7 @@ termination by a signal:
- **126**: the requested command (file) was found, but can\'t be
executed
- **127**: command (file) not found
- **128**: according to ABS it\'s used to report an invalid argument
- **128**: according to ABS it's used to report an invalid argument
to the exit builtin, but I wasn\'t able to verify that in the source
code of Bash (see code 255)
- **128 + N**: the shell was terminated by the signal N
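Several of these codes can be observed directly; a sketch (the failing command name is deliberately bogus):

``` bash
bash -c 'exit 3'                      ; echo "explicit exit:   $?"   # 3
bash -c 'no_such_cmd_xyz' 2>/dev/null ; echo "not found:       $?"   # 127
bash -c '/etc/passwd'     2>/dev/null ; echo "not executable:  $?"   # 126
bash -c 'kill -TERM $$'               ; echo "SIGTERM (15):    $?"   # 128+15 = 143
```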
@ -200,7 +200,7 @@ so others can check the script execution.
## Comments
In a larger, or complex script, it\'s wise to comment the code. Comments
In a larger, or complex script, it's wise to comment the code. Comments
can help with debugging or tests. Comments start with the \# character
(hashmark) and continue to the end of the line:
@ -210,7 +210,7 @@ can help with debugging or tests. Comments start with the \# character
echo "Be liberal in what you accept, and conservative in what you send" # say something
```
The first thing was already explained, it\'s the so-called shebang, for
The first thing was already explained, it's the so-called shebang, for
the shell, **only a comment**. The second one is a comment from the
beginning of the line, the third comment starts after a valid command.
All three are syntactically correct.
@ -219,8 +219,8 @@ All three syntactically correct.
To temporarily disable complete blocks of code you would normally have
to prefix every line of that block with a \# (hashmark) to make it a
comment. There\'s a little trick, using the pseudo command `:` (colon)
and input redirection. The `:` does nothing, it\'s a pseudo command, so
comment. There's a little trick, using the pseudo command `:` (colon)
and input redirection. The `:` does nothing, it's a pseudo command, so
it does not care about standard input. In the following code example,
you want to test mail and logging, but not dump the database, or execute
a shutdown:
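A sketch of that idea, with `echo`s standing in for the real mail, log, dump, and shutdown commands:

``` bash
#!/bin/bash
echo "testing the mail code"      # still runs
echo "testing the logging code"   # still runs
: <<'DISABLED'
echo "dumping the database"       # skipped: fed to : as input, never executed
echo "shutting down"              # skipped
DISABLED
echo "done"
```

Quoting the here-document delimiter (`'DISABLED'`) matters: it stops the shell from expanding anything inside the disabled block.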
@ -278,12 +278,12 @@ program\".
**[Attention:]{.underline}** When you set variables in a child process,
for example a *subshell*, they will be set there, but you will **never**
have access to them outside of that subshell. One way to create a
subshell is the pipe. It\'s all mentioned in a small article about [Bash
subshell is the pipe. It's all mentioned in a small article about [Bash
in the processtree](../scripting/processtree.md)!
### Local variables
Bash provides ways to make a variable\'s scope *local* to a function:
Bash provides ways to make a variable's scope *local* to a function:
- Using the `local` keyword, or
- Using `declare` (which will *detect* when it was called from within
@ -327,7 +327,7 @@ echo $foo
### Environment variables
The environment space is not directly related to the topic about scope,
but it\'s worth mentioning.
but it's worth mentioning.
Every UNIX(r) process has a so-called *environment*. Other items, in
addition to variables, are saved there, the so-called *environment
@ -9,7 +9,7 @@ but as hints and comments about debugging a Bash script.
Do **not** name your script `test`, for example! *Why?* `test` is the
name of a UNIX(r)-command, and [most likely built into your
shell]{.underline} (it\'s a built-in in Bash) - so you won\'t be able to
shell]{.underline} (it's a built-in in Bash) - so you won\'t be able to
run a script with the name `test` in a normal way.
**Don\'t laugh!** This is a classic mistake :-)
@ -41,7 +41,7 @@ From my personal experience, I can suggest `vim` or `GNU emacs`.
## Write logfiles
For more complex scripts, it\'s useful to write to a log file, or to the
For more complex scripts, it's useful to write to a log file, or to the
system log. Nobody can debug your script without knowing what actually
happened and what went wrong.
@ -61,7 +61,7 @@ literal quotes, to see leading and trailing spaces!
pid=$(< fooservice.pid)
echo "DEBUG: read from file: pid=\"$pid\"" >&2
Bash\'s [printf](../commands/builtin/printf.md) command has the `%q` format,
Bash's [printf](../commands/builtin/printf.md) command has the `%q` format,
which is handy for verifying whether strings are what they appear to be.
foo=$(< inputfile)
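A sketch of the idea, with a deliberately messy value standing in for the file content (the stray carriage return is an assumed example):

```shell
foo=$'bar baz\r'                  # stands in for file content with a hidden CR
printf 'DEBUG: foo=%q\n' "$foo"   # %q quotes the CR unambiguously
```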
@ -97,20 +97,20 @@ There are two useful debug outputs for that task (both are written to
### Simple example of how to interpret xtrace output
Here\'s a simple command (a string comparison using the [classic test
Here's a simple command (a string comparison using the [classic test
command](../commands/classictest.md)) executed while in `set -x` mode:
set -x
foo="bar baz"
[ $foo = test ]
That fails. Why? Let\'s see the `xtrace` output:
That fails. Why? Let's see the `xtrace` output:
+ '[' bar baz = test ']'
And now you see that it\'s (\"bar\" and \"baz\") recognized as two
And now you see that it's (\"bar\" and \"baz\") recognized as two
separate words (which you would have realized if you READ THE ERROR
MESSAGES ;) ). Let\'s check it\...
MESSAGES ;) ). Let's check it\...
# next try
[ "$foo" = test ]
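With the quotes in place, the `xtrace` output confirms that the variable is now one word:

```shell
set -x
foo="bar baz"
# the trace line now reads: + '[' 'bar baz' = test ']'
[ "$foo" = test ] || echo "still not equal - but compared as one word"
set +x
```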
@ -218,7 +218,7 @@ This can be wrapped in a shell function for more readable code.
script.sh: line 100: syntax error: unexpected end of file
Usually indicates exactly what it says: An unexpected end of file. It\'s
Usually indicates exactly what it says: An unexpected end of file. It's
unexpected because Bash waits for the closing of a [compound
command](../syntax/ccmd/intro.md):
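A sketch of how to provoke and inspect the message without breaking a real script: the unterminated `for` is kept in a string and checked with `bash -n` (syntax check only):

```shell
script='for f in a b c; do
  echo "$f"'
# bash -n only parses; it reports the still-open "for" at end of input
# (exact wording varies slightly by version):
bash -n -c "$script" 2>&1 | grep -o "unexpected end of file"
```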
@ -288,7 +288,7 @@ definition invalid.
### What is the CRLF issue?
There\'s a big difference in the way that UNIX(r) and Microsoft(r) (and
There's a big difference in the way that UNIX(r) and Microsoft(r) (and
possibly others) handle the **line endings** of plain text files. The
difference lies in the use of the CR (Carriage Return) and LF (Line
Feed) characters.
@ -298,7 +298,7 @@ Feed) characters.
Keep in mind your script is a **plain text file**, and the `CR`
character means nothing special to UNIX(r) - it is treated like any
other character. If it\'s printed to your terminal, a carriage return
other character. If it's printed to your terminal, a carriage return
will effectively place the cursor at the beginning of the *current*
line. This can cause much confusion and many headaches, since lines
containing CRs are not what they appear to be when printed. In summary,
@ -310,7 +310,7 @@ Some possible sources of CRs:
- a DOS/Windows text editor
- a UNIX(r) text editor that is \"too smart\" when determining the
file content type (and thinks \"*it\'s a DOS text file*\")
file content type (and thinks \"*it's a DOS text file*\")
- a direct copy and paste from certain webpages (some pastebins are
known for this)
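A sketch of how to detect and remove CRs (`cat -A` is a GNU coreutils option; on other systems `cat -vet` or `od -c` serve the same purpose):

```shell
printf 'echo "Hello world"\r\n' > crlf-demo.sh
cat -A crlf-demo.sh            # GNU cat: a CRLF line ends in ^M$
tr -d '\r' < crlf-demo.sh > fixed-demo.sh   # strip the CRs portably
rm -f crlf-demo.sh fixed-demo.sh
```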
@ -327,7 +327,7 @@ carriage return character!):
echo "Hello world"^M
...
Here\'s what happens because of the `#!/bin/bash^M` in our shebang:
Here's what happens because of the `#!/bin/bash^M` in our shebang:
- the file `/bin/bash^M` doesn\'t exist (hopefully)
- So Bash prints an error message which (depending on the terminal,
@ -347,7 +347,7 @@ Why? Because when printed literally, the `^M` makes the cursor go back
to the beginning of the line. The whole error message is *printed*, but
you *see* only part of it!
\<note warning\> It\'s easy to imagine the `^M` is bad in other places
\<note warning\> It's easy to imagine the `^M` is bad in other places
too. If you get weird and illogical messages from your script, rule out
the possibility that `^M` is involved. Find and eliminate it! \</note\>
@ -32,8 +32,8 @@ See also:
Give it another name. The executable `test` already exists.
In Bash it\'s a builtin. With other shells, it might be an executable
file. Either way, it\'s bad name choice!
In Bash it's a builtin. With other shells, it might be an executable
file. Either way, it's a bad name choice!
Workaround: You can call it using the pathname:
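A sketch of the workaround: any pathname (relative or absolute) bypasses the builtin lookup, while a bare `test` still hits the builtin:

```shell
printf '#!/bin/bash\necho "my test script"\n' > ./test
chmod +x ./test
./test                           # runs the file - prints: my test script
test; echo "builtin status: $?"  # bare "test" is still the builtin (-> 1)
rm -f ./test
```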
@ -135,14 +135,14 @@ assigned value**:
### Expanding (using) variables
A typical beginner\'s trap is quoting.
A typical beginner's trap is quoting.
As noted above, when you want to **expand** a variable i.e. \"get the
content\", the variable name needs to be prefixed with a dollar-sign.
But, since Bash knows various ways to quote and does word-splitting, the
result isn\'t always the same.
Let\'s define an example variable containing text with spaces:
Let's define an example variable containing text with spaces:
example="Hello world"
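The difference is easy to see when each resulting word is wrapped in visible markers:

```shell
example="Hello world"
printf '<%s>' $example;  echo    # -> <Hello><world>  (split into two words)
printf '<%s>' "$example"; echo   # -> <Hello world>   (one word, as stored)
```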
@ -237,8 +237,8 @@ Or, simpler yet:
grep ^root: /etc/passwd >/dev/null 2>&1 || echo "root was not found - check the pub at the corner"
```
If you need the specific value of `$?`, there\'s no other choice. But if
you need only a \"true/false\" exit indication, there\'s no need for
If you need the specific value of `$?`, there's no other choice. But if
you need only a \"true/false\" exit indication, there's no need for
`$?`.
See also:
@ -247,11 +247,11 @@ See also:
### Output vs. Return Value
It\'s important to remember the different ways to run a child command,
It's important to remember the different ways to run a child command,
and whether you want the output, the return value, or neither.
When you want to run a command (or a pipeline) and save (or print) the
**output**, whether as a string or an array, you use Bash\'s
**output**, whether as a string or an array, you use Bash's
`$(command)` syntax:
$(ls -l /tmp)
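A short sketch contrasting the two: capture the output with `$( )`, or run the command plainly when only the exit status matters:

```shell
listing=$(ls /tmp)            # command substitution: capture the output
if ls /tmp > /dev/null; then  # plain call: only the exit status is used
  echo "/tmp is listable"
fi
```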
@ -38,9 +38,9 @@ In these cases the portable syntax should be preferred.
## Portability rationale
Here is some assorted portability information. Take it as a small guide
to make your scripts a bit more portable. It\'s not complete (it never
will be!) and it\'s not very detailed (e.g. you won\'t find information
about how which shell technically forks off which subshell). It\'s just
to make your scripts a bit more portable. It's not complete (it never
will be!) and it's not very detailed (e.g. you won\'t find information
about which shell technically forks off which subshell). It's just
an assorted small set of portability guidelines. *-Thebonsai*
FIXME UNIX shell gurus out there, please be patient with a newbie like
@ -54,7 +54,7 @@ there are two possibilities:
The *new value* is seen by subsequent programs
- without any special action (e.g. Bash)
- only after an explicit export with `export VARIABLE` (e.g. Sun\'s
- only after an explicit export with `export VARIABLE` (e.g. Sun's
`/bin/sh`)
Since an extra `export` doesn\'t hurt, the safest and most portable way
@ -64,7 +64,7 @@ subsequent processes.
### Arithmetics
Bash has a special compound command to do arithmetic without expansion.
However, POSIX has no such command. In the table at the top, there\'s
However, POSIX has no such command. In the table at the top, there's
the `: $((MATH))` construct mentioned as possible alternative. Regarding
the exit code, a 100% equivalent construct would be:
@ -153,7 +153,7 @@ Find another method.
### Check for a command in PATH
The [PATH](../syntax/shellvars.md#PATH) variable is a colon-delimited list of
directory names, so it\'s basically possible to run a loop and check
directory names, so it's basically possible to run a loop and check
every `PATH` component for the command you\'re looking for and for
executability.
@ -199,6 +199,6 @@ accessible by `PATH`:
echo "sed is available"
fi
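The loop described above might look like this sketch (searching for `sh` as a stand-in command; empty `PATH` components and glob characters in `PATH` are deliberately not handled here):

```shell
cmd=sh
found=
old_ifs=$IFS; IFS=:          # temporarily split PATH on ":"
for dir in $PATH; do
  if [ -x "$dir/$cmd" ]; then
    found=$dir/$cmd
    break
  fi
done
IFS=$old_ifs
[ -n "$found" ] && echo "$cmd is available at $found"
```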
[^1]: \"portable\" doesn\'t necessarily mean it\'s POSIX, it can also
mean it\'s \"widely used and accepted\", and thus maybe more
[^1]: \"portable\" doesn\'t necessarily mean it's POSIX, it can also
mean it's \"widely used and accepted\", and thus maybe more
portable than POSIX(r).
@ -20,7 +20,7 @@ of POSIX, and some may be incompatible with POSIX.
---------------------------------- --------------------------------------------------------- --------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
`&>FILE` and `>&FILE` `>FILE 2>&1` This redirection syntax is short for `>FILE 2>&1` and originates in the C Shell. The latter form is especially uncommon and should never be used, and the explicit form using separate redirections is preferred over both. These shortcuts contribute to confusion about the copy descriptor because the syntax is unclear. They also introduce parsing ambiguity, and conflict with POSIX. Shells without this feature treat `cmd1 &>file cmd2` as: \"background `cmd1` and then execute `cmd2` with its stdout redirected to `file`\", which is the correct interpretation of this expression. See: [redirection](../syntax/redirection.md) \`\$ { bash; dash \</dev/fd/0; } \<\<\<\'echo foo\>/dev/null&\>/dev/fd/2 echo bar\' foo echo bar bar\`
`$[EXPRESSION]` `$((EXPRESSION))` This undocumented syntax is completely replaced by the POSIX-conforming arithmetic expansion `$((EXPRESSION))`. It is unimplemented almost everywhere except Bash and Zsh. See [arithmetic expansion](../syntax/expansion/arith.md). [Some discussion](http://lists.gnu.org/archive/html/bug-bash/2012-04/msg00034.html).
`COMMAND\ |&\ COMMAND` `COMMAND 2>&1 | COMMAND` This is an alternate pipeline operator derived from Zsh. Officially, it is not considered deprecated by Bash, but I highly discourage it. It conflicts with the list operator used for [coprocess](../syntax/keywords/coproc.md) creation in most Korn shells. It also has confusing behavior. The stdout is redirected first like an ordinary pipe, while the stderr is actually redirected last \-- after other redirects preceding the pipe operator. Overall, it\'s pointless syntax bloat. Use an explicit redirect instead.
`COMMAND\ |&\ COMMAND` `COMMAND 2>&1 | COMMAND` This is an alternate pipeline operator derived from Zsh. Officially, it is not considered deprecated by Bash, but I highly discourage it. It conflicts with the list operator used for [coprocess](../syntax/keywords/coproc.md) creation in most Korn shells. It also has confusing behavior. The stdout is redirected first like an ordinary pipe, while the stderr is actually redirected last \-- after other redirects preceding the pipe operator. Overall, it's pointless syntax bloat. Use an explicit redirect instead.
`function\ NAME()\ COMPOUND-CMD` `NAME()\ COMPOUND-CMD` or `function\ NAME\ {\ CMDS;\ }` This is an amalgamation between the Korn and POSIX style function definitions - using both the `function` keyword and parentheses. It has no useful purpose and no historical basis or reason to exist. It is not specified by POSIX. It is accepted by Bash, mksh, zsh, and perhaps some other Korn shells, where it is treated as identical to the POSIX-style function. It is not accepted by AT&T ksh. It should never be used. See the next table for the `function` keyword. Bash doesn\'t have this feature documented as expressly deprecated.
`for x; { ...;}` `do`, `done`, `in`, `esac`, etc. This undocumented syntax replaces the `do` and `done` reserved words with braces. Many Korn shells support various permutations on this syntax for certain compound commands like `for`, `case`, and `while`. Which ones and certain details like whether a newline or semicolon are required vary. Only `for` works in Bash. Needless to say, don\'t use it.
@ -40,15 +40,15 @@ historical reasons.
`[\ EXPRESSION\ ]`\\ and\\ `test\ EXPRESSION` `[[\ EXPRESSION\ ]]` `test` and `[` are the Bourne/POSIX commands for evaluating test expressions (they are almost identical, and `[` is somewhat more common). The expressions consist of regular arguments, unlike the Ksh/Bash `[[` command. While the issue is analogous to `let` vs `((`, the advantages of `[[` vs `[` are even more important because the arguments/expansions aren\'t just concatenated into one expression. With the classic `[` command, the number of arguments is significant. If at all possible, use the [conditional expression](../syntax/ccmd/conditional_expression.md) (\"new test command\") `[[ EXPRESSION ]]`. Unless there is a need for POSIX compatibility, there are only a few reasons to use `[`. `[[` is one of the most portable and consistent non-POSIX ksh extensions available. See: [conditional_expression](../syntax/ccmd/conditional_expression.md) and *[What is the difference between test, \[ and \[\[ ?](http://mywiki.wooledge.org/BashFAQ/031)*
`set -e`, `set -o errexit`\ proper control flow and error handling `set -e` causes untested non-zero exit statuses to be fatal. It is a debugging feature intended for use only during development and should not be used in production code, especially init scripts and other high-availability scripts. Do not be tempted to think of this as \"error handling\"; it\'s not, it\'s just a way to find the place you\'ve *forgotten* to put error handling.\
`set -e`, `set -o errexit`\ proper control flow and error handling `set -e` causes untested non-zero exit statuses to be fatal. It is a debugging feature intended for use only during development and should not be used in production code, especially init scripts and other high-availability scripts. Do not be tempted to think of this as \"error handling\"; it's not, it's just a way to find the place you\'ve *forgotten* to put error handling.\
and the `ERR` trap Think of it as akin to `use strict` in Perl or `throws` in C++: tough love that makes you write better code. Many guides recommend avoiding it entirely because of the apparently-complex rules for when non-zero statuses cause the script to abort. Conversely, large software projects with experienced coders may recommend or even mandate its use.\
Because it provides no notification of the location of the error, it\'s more useful combined with `set -x` or the `DEBUG` trap and other Bash debug features, and both flags are normally better set on the command line rather than within the script itself.\
Because it provides no notification of the location of the error, it's more useful combined with `set -x` or the `DEBUG` trap and other Bash debug features, and both flags are normally better set on the command line rather than within the script itself.\
Most of this also applies to the `ERR` trap, though I\'ve seen it used in a few places in shells that lack `pipefail` or `PIPESTATUS`. The `ERR` trap is not POSIX, but `set -e` is. `failglob` is another Bash feature that falls into this category (mainly useful for debugging).\
**The `set -e` feature generates more questions and false bug reports on the Bash mailing list than all other features combined!** Please do not rely on `set -e` for logic in scripts. If you still refuse to take this advice, make sure you understand **exactly** how it works. See: *[Why doesn\'t set -e (or set -o errexit, or trap ERR) do what I expected?](http://mywiki.wooledge.org/BashFAQ/105)* and <http://www.fvue.nl/wiki/Bash:_Error_handling>
`set -u` or `set -o nounset` Proper control flow and error handling `set -u` causes attempts to expand unset variables or parameters as fatal errors. Like `set -e`, it bypasses control flow and exits immediately from the current shell environment. Like non-zero statuses, unset variables are a normal part of most non-trivial shell scripts. Living with `set -u` requires hacks like `${1+"$1"}` for each expansion that might possibly be unset. Only very current shells guarantee that expanding `@` or `*` won\'t trigger an error when no parameters are set (<http://austingroupbugs.net/view.php?id=155>, <http://www.in-ulm.de/~mascheck/various/bourne_args/>). Apparently some find it useful for debugging. See *[How do I determine whether a variable is already defined? Or a function?](http://mywiki.wooledge.org/BashFAQ/083)* for how to properly test for defined variables. Don\'t use `set -u`.
`${var?msg}` or `${var:?msg}` Proper control flow and error handling Like `set -u`, this expansion causes a fatal error which immediately exits the current shell environment if the given parameter is unset or is null. It prints the error message given, to the right of the operator. If a value is expected and you\'d like to create an assertion or cause errors, it\'s better to test for undefined variables using one of [these techniques](http://mywiki.wooledge.org/BashFAQ/083) and handle the error manually, or call a `die` function. This expansion is defined by POSIX. It\'s better than `set -u`, because it\'s explicit, but not by much. It also allows you to accidentally construct hilariously deceptive error messages: `bash -c 'f() { definitely_not_printf "${printf:?"$1" - No such option}"; }; f -v'
`${var?msg}` or `${var:?msg}` Proper control flow and error handling Like `set -u`, this expansion causes a fatal error which immediately exits the current shell environment if the given parameter is unset or is null. It prints the error message given, to the right of the operator. If a value is expected and you\'d like to create an assertion or cause errors, it's better to test for undefined variables using one of [these techniques](http://mywiki.wooledge.org/BashFAQ/083) and handle the error manually, or call a `die` function. This expansion is defined by POSIX. It's better than `set -u`, because it's explicit, but not by much. It also allows you to accidentally construct hilariously deceptive error messages: `bash -c 'f() { definitely_not_printf "${printf:?"$1" - No such option}"; }; f -v'
bash: printf: -v - No such option`
-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
@ -67,13 +67,13 @@ surprised if you run across someone telling you not to use these.
`function\ NAME\ {\ CMDS;\ }` `NAME()\ COMPOUND-CMD` This is the ksh form of function definition created to extend the Bourne and POSIX form with modified behaviors and additional features like local variables. The idea was for new-style functions to be analogous to regular builtins with their own environment and scope, while POSIX-style functions are more like special builtins. `function` is supported by almost every ksh-derived shell including Bash and Zsh, but isn\'t specified by POSIX. Bash treats all function styles the same, but this is unusual. `function` has some preferable characteristics in many ksh variants, making it more portable for scripts that use non-POSIX extensions by some measures. If you\'re going to use the `function` keyword, it implies that you\'re either targeting Ksh specifically, or that you have detailed knowledge of how to compensate for differences across shells. It should always be used consistently with `typeset`, but never used with `declare` or `local`. Also in ksh93, the braces are not a [command group](../syntax/ccmd/grouping_plain.md), but a required part of the syntax (unlike Bash and others). See [shell function definitions](../syntax/basicgrammar.md#shell_function_definitions)
`typeset` `declare`, `local`, `export`, `readonly` This is closely related to the above, and should often be used together. `typeset` exists primarily for `ksh` compatibility, but is marked as \"deprecated\" in Bash (though I don\'t entirely agree with this). This makes some sense, because future compatibility can\'t be guaranteed, and any compatibility at all, requires understanding the non-POSIX features of other shells and their differences. Using `declare` instead of `typeset` emphasizes your intention to be \"Bash-only\", and definitely breaks everywhere else (except possibly zsh if you\'re lucky). The issue is further complicated by Dash and the [Debian policy](http://www.debian.org/doc/debian-policy/ch-files.html#s-scripts) requirement for a `local` builtin, which is itself not entirely compatible with Bash and other shells.
\'\'let \'EXPR\' \'\' `((EXPR))` or `[\ $((EXPR))\ -ne\ 0 ]` `let` is the \"simple command\" variant of arithmetic evaluation command, which takes regular arguments. Both `let` and `((expr))` were present in ksh88, and everything that supports one should support the other. Neither are POSIX. The compound variant is preferable because it doesn\'t take regular arguments for [wordsplitting](../syntax/expansion/wordsplit.md) and [globbing](../syntax/expansion/globs.md), which makes it safer and clearer. It is also usually faster, especially in Bash, where compound commands are typically significantly faster. Some of the (few) reasons for using `let` are detailed on the [let](../commands/builtin/let.md) page. See [arithmetic evaluation compound command](../syntax/ccmd/arithmetic_eval.md)
`eval` Depends. Often code can be restructured to use better alternatives. `eval` is thrown in here for good measure, as sadly it is so often misused that any use of `eval` (even the rare clever one) is immediately dismissed as wrong by experts, and among the most immediate solutions abused by beginners. In reality, there are correct ways to use `eval`, and even cases in which it\'s necessary, even in sophisticated shells like Bash and Ksh. `eval` is unusual in that it is less frequently appropriate in more feature-rich shells than in more minimal shells like Dash, where it is used to compensate for more limitations. If you find yourself needing `eval` too frequently, it might be a sign that you\'re either better off using a different language entirely, or trying to borrow an idiom from some other paradigm that isn\'t well suited to the shell language. By the same token, there are some cases in which working too hard to avoid `eval` ends up adding a lot of complexity and sacrificing all portability. Don\'t substitute a clever `eval` for something that\'s a bit \"too clever\", just to avoid the `eval`, yet, take reasonable measures to avoid it where it is sensible to do so. See: [eval](../commands/builtin/eval.md) and [Eval command and security issues](http://mywiki.wooledge.org/BashFAQ/048).
`eval` Depends. Often code can be restructured to use better alternatives. `eval` is thrown in here for good measure, as sadly it is so often misused that any use of `eval` (even the rare clever one) is immediately dismissed as wrong by experts, and among the most immediate solutions abused by beginners. In reality, there are correct ways to use `eval`, and even cases in which it's necessary, even in sophisticated shells like Bash and Ksh. `eval` is unusual in that it is less frequently appropriate in more feature-rich shells than in more minimal shells like Dash, where it is used to compensate for more limitations. If you find yourself needing `eval` too frequently, it might be a sign that you\'re either better off using a different language entirely, or trying to borrow an idiom from some other paradigm that isn\'t well suited to the shell language. By the same token, there are some cases in which working too hard to avoid `eval` ends up adding a lot of complexity and sacrificing all portability. Don\'t substitute a clever `eval` for something that's a bit \"too clever\", just to avoid the `eval`, yet, take reasonable measures to avoid it where it is sensible to do so. See: [eval](../commands/builtin/eval.md) and [Eval command and security issues](http://mywiki.wooledge.org/BashFAQ/048).
## See also
- [Non-portable syntax and command uses](../scripting/nonportable.md)
- [bashchanges](../scripting/bashchanges.md)
- [Greg\'s BashFAQ 061: List of essential features added (with the
- [Greg's BashFAQ 061: List of essential features added (with the
Bash version tag)](BashFAQ>061)
- [Bash \<-\> POSIX Portability guide with a focus on
Dash](http://mywiki.wooledge.org/Bashism)
@ -30,7 +30,7 @@ See also [the dictionary entry for
## The first argument
The very first argument you can access is referenced as `$0`. It is
usually set to the script\'s name exactly as called, and it\'s set on
usually set to the script's name exactly as called, and it's set on
shell initialization:
[Testscript]{.underline} - it just echoes `$0`:
@ -52,9 +52,9 @@ However, this isn\'t true for login shells:
> echo "$0"
-bash
In other terms, `$0` is not a positional parameter, it\'s a special
In other terms, `$0` is not a positional parameter, it's a special
parameter independent from the positional parameter list. It can be set
to anything. In the **ideal** case it\'s the pathname of the script, but
to anything. In the **ideal** case it's the pathname of the script, but
since this gets set on invocation, the invoking program can easily
influence it (the `login` program does that for login shells, by
prefixing a dash, for example).
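This is easy to demonstrate with `bash -c`, where the first argument after the command string becomes `$0` (the pathname here is made up):

```shell
bash -c 'echo "$0"' /any/name/at/all
# prints: /any/name/at/all
```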
@ -145,7 +145,7 @@ expands to something, using the [test command](../commands/classictest.md):
done
Looks nice, but has the disadvantage of stopping when `$1` is empty
(null-string). Let\'s modify it to run as long as `$1` is defined (but
(null-string). Let's modify it to run as long as `$1` is defined (but
may be null), using [parameter expansion for an alternate
value](../syntax/pe.md#use_an_alternate_value):
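A sketch of such a loop: `${1+defined}` expands to the word `defined` (any non-empty word works) as long as `$1` is set, even when it is null:

```shell
count_args() {
  n=0
  while [ "${1+defined}" ]; do
    n=$((n + 1))
    shift
  done
  echo "$n arguments"
}
count_args a "" c   # -> 3 arguments (the null string still counts)
```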
@ -163,8 +163,8 @@ There is a [small tutorial dedicated to
### All Positional Parameters
Sometimes it\'s necessary to just \"relay\" or \"pass\" given arguments
to another program. It\'s very inefficient to do that in one of these
Sometimes it's necessary to just \"relay\" or \"pass\" given arguments
to another program. It's very inefficient to do that in one of these
loops, as you will most likely destroy their integrity (word splitting at spaces!).
The shell developers created `$*` and `$@` for this purpose.
@ -197,7 +197,7 @@ your positional parameters to **call another program** (for example in a
wrapper-script), then this is the choice for you, use double quoted
`"$@"`.
Well, let\'s just say: **You almost always want a quoted `"$@"`!**
Well, let's just say: **You almost always want a quoted `"$@"`!**
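A quick sketch of why: unquoted `$@` re-splits parameters that contain spaces, while `"$@"` passes them on exactly as given:

```shell
show() { printf '<%s>' "$@"; echo; }
set -- "two words" three
show $@     # -> <two><words><three>  (integrity destroyed)
show "$@"   # -> <two words><three>   (exactly as given)
```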
### Range Of Positional Parameters
@ -251,7 +251,7 @@ inside the script or function:
# $5: positional
# $6: parameters
It\'s wise to signal \"end of options\" when setting positional
It's wise to signal \"end of options\" when setting positional
parameters this way. If not, the dashes might be interpreted as an
option switch by `set` itself:
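A sketch of the difference:

```shell
set -f "word"     # oops: "-f" is taken as set's own noglob option...
echo "$1"         # -> word   (...and only "word" became a parameter)
set +f            # switch globbing back on
set -- -f "word"  # with "--" the dash survives as data
echo "$1"         # -> -f
```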
@ -274,7 +274,7 @@ To make your program accept options as standard command syntax:
`COMMAND [options] <params>` \# Like \'cat -A file.txt\'
See simple option parsing code below. It\'s not that flexible. It
See simple option parsing code below. It's not that flexible. It
doesn\'t auto-interpret combined options (-fu USER) but it works and is
a good rudimentary way to parse your arguments.
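For comparison, Bash's builtin `getopts` does auto-interpret combined options like `-fu USER`; a sketch (the option letters and variable names are made up for illustration):

```shell
parse() {
  force="" user=""
  OPTIND=1                       # reset so parse() can be called repeatedly
  while getopts "fu:" opt; do
    case $opt in
      f) force=yes ;;
      u) user=$OPTARG ;;
      *) return 1 ;;
    esac
  done
  shift $((OPTIND - 1))
  echo "force=$force user=$user rest=$*"
}
parse -fu alice file.txt   # -> force=yes user=alice rest=file.txt
```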
@ -13,7 +13,7 @@ needed to run the process) i.e. [**The environment**]{.underline}.
Every process has its **own** environment space.
The environment stores, among other things, data that\'s useful to us,
The environment stores, among other things, data that's useful to us,
the **environment variables**. These are strings in common `NAME=VALUE`
form, but they are not related to shell variables. A variable named
`LANG`, for example, is used by every program that looks it up in its
@ -33,10 +33,10 @@ set by login scripts or programs).
## Executing programs
All the diagrams of the process tree use names like \"`xterm`\" or
\"`bash`\", but that\'s just to make it easier to understand what\'s
\"`bash`\", but that's just to make it easier to understand what's
going on, it doesn\'t mean those processes are actually executed.
Let\'s take a short look at what happens when you \"execute a program\"
Let's take a short look at what happens when you \"execute a program\"
from the Bash prompt, a program like \"ls\":
$ ls
@ -100,7 +100,7 @@ into a variable. We run it in a loop here to count input lines:
cat /etc/passwd | while read; do ((counter++)); done
echo "Lines: $counter"
What? It\'s 0? Yes! The number of lines might not be 0, but the variable
What? It's 0? Yes! The number of lines might not be 0, but the variable
`$counter` still is 0. Why? Remember the diagram from above? Rewriting
it a bit, we have:
@ -123,7 +123,7 @@ that sets the counter must be the \"main shell\". For example:
while read; do ((counter++)); done </etc/passwd
echo "Lines: $counter"
It\'s nearly self-explanatory. The `while` loop runs in the **current
It's nearly self-explanatory. The `while` loop runs in the **current
shell**, the counter is incremented in the **current shell**, everything
vital happens in the **current shell**, also the `read` command sets the
variable `REPLY` (the default if nothing is given), though we don\'t use
@ -137,14 +137,14 @@ performs:
### Executing commands
As shown above, Bash will create subprocesses every time it executes
commands. That\'s nothing new.
commands. That's nothing new.
But if your command is a subprocess that sets variables you want to use
in your main script, that won\'t work.
For exactly this purpose, there\'s the `source` command (also: the *dot*
For exactly this purpose, there's the `source` command (also: the *dot*
`.` command). Source doesn\'t execute the script, it imports the other
script\'s code into the current shell:
script's code into the current shell:
source ./myvariables.sh
# equivalent to:
. ./myvariables.sh
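A minimal sketch of the difference (the file name is just an example): a variable set in a sourced file survives in the current shell, while the same file run as a child process changes nothing here:

``` bash
cat > /tmp/myvariables.sh <<'EOF'
myvar='Hello from the sourced file'
EOF

bash /tmp/myvariables.sh        # child process: sets myvar, then exits
echo "after child:  '$myvar'"   # prints '' - nothing reached this shell

. /tmp/myvariables.sh           # the dot command works like source here
echo "after source: '$myvar'"   # prints 'Hello from the sourced file'

rm -f /tmp/myvariables.sh
```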

@ -19,16 +19,16 @@ of course it helps others to read the code.
## Indentation guidelines
Indentation is nothing that technically influences a script, it\'s only
Indentation is nothing that technically influences a script, it's only
for us humans.
I\'m used to seeing/using indentation of *two space characters* (though
many may prefer 4 spaces, see below in the discussion section):
- it\'s easy and fast to type
- it\'s not a hard-tab that\'s displayed differently in different
- it's easy and fast to type
- it's not a hard-tab that's displayed differently in different
environments
- it\'s wide enough to give a visual break and small enough to not
- it's wide enough to give a visual break and small enough to not
waste too much space on the line
Speaking of hard-tabs: Avoid them if possible. They only cause trouble. I
@ -145,7 +145,7 @@ Cryptic constructs, we all know them, we all love them. If they are not
100% needed, avoid them, since nobody except you may be able to decipher
them.
It\'s - just like in C - the middle ground between smart, efficient and
It's - just like in C - the middle ground between smart, efficient and
readable.
If you need to use a cryptic construct, include a comment that explains
@ -179,12 +179,12 @@ loop counting variables, etc., \... (in the example: `file`)
### Variable initialization
As in C, it\'s always a good idea to initialize your variables, though,
As in C, it's always a good idea to initialize your variables, even though
the shell will initialize fresh variables itself (better: Unset
variables will generally behave like variables containing a null
string).
It\'s no problem to pass an **environment variable** to the script. If
It's no problem to pass an **environment variable** to the script. If
you blindly assume that all variables you use for the first time are
**empty**, anybody can **inject** content into a variable by passing it
via the environment.
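A quick demonstration of such an injection (the script and variable names are made up for the example):

``` bash
cat > /tmp/greet.sh <<'EOF'
echo "before init: $greeting"   # picks up whatever the caller exported
greeting='hello'                # explicit initialization defeats the injection
echo "after init: $greeting"
EOF

out=$(greeting='INJECTED' bash /tmp/greet.sh)
echo "$out"
rm -f /tmp/greet.sh
```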
@ -278,7 +278,7 @@ The basic structure of a script simply reads:
### The shebang
If possible (I know it\'s not always possible!), use [a
If possible (I know it's not always possible!), use [a
shebang](../dict/terms/shebang.md).
Be careful with `/bin/sh`: The argument that \"on Linux `/bin/sh` is

@ -14,12 +14,12 @@ won\'t display the character-sequence, but will perform some action. You
can print the codes with a simple `echo` command.
[**Note:**]{.underline} I see codes referenced as \"Bash colors\"
sometimes (several \"Bash tutorials\" etc\...): That\'s a completely
sometimes (several \"Bash tutorials\" etc\...): That's a completely
incorrect definition.
## The tput command
Because there\'s a large number of different terminal control languages,
Because there's a large number of different terminal control languages,
usually a system has an intermediate communication layer. The real codes
are looked up in a database **for the currently detected terminal type**
and you give standardized requests to an API or (from the shell) to a
@ -40,7 +40,7 @@ is unclear! Also the `tput` acronyms are usually the ones dedicated for
ANSI escapes!
I listed only the most relevant codes, of course, any ANSI terminal
understands many more! But let\'s keep the discussion centered on common
understands many more! But let's keep the discussion centered on common
shell scripting ;-)
If I couldn\'t find a matching ANSI escape, you\'ll see a :?: as the
@ -51,7 +51,7 @@ The ANSI codes always start with the ESC character. (ASCII 0x1B or octal
codes directly - use the `tput` command!**
All codes that can be used with `tput` can be found in terminfo(5). (on
OpenBSD at least) See [OpenBSD\'s
OpenBSD at least) See [OpenBSD's
terminfo(5)](http://www.openbsd.org/cgi-bin/man.cgi?query=terminfo&apropos=0&sektion=5&manpath=OpenBSD+Current&arch=i386&format=html)
under the [Capabilities]{.underline} section. The *cap-name* is the code
to use with tput. A description of each code is also provided.
@ -293,7 +293,7 @@ switch to, to get the other 8 colors.
### Mandelbrot set
This is a slightly modified version of Charles Cooke\'s colorful
This is a slightly modified version of Charles Cooke's colorful
Mandelbrot plot scripts ([original w/
screenshot](http://earth.gkhs.net/ccooke/shell.html)) \-- ungolfed,
optimized a bit, and without hard-coded terminal escapes. The `colorBox`

@ -34,7 +34,7 @@ hardwiring codes.
This snipplet sets up associative arrays for basic color codes using
`tput` for Bash, ksh93 or zsh. You can pass it variable names to
correspond with a collection of codes. There\'s a `main` function with
correspond with a collection of codes. There's a `main` function with
example usage.
``` bash

@ -33,7 +33,7 @@ you don\'t have a space after the commas).
Test:
The `awk` command used for the CSV above just prints the fields
separated by `###` to see what\'s going on:
separated by `###` to see what's going on:
$ awk -v FS='", "|^"|"$' '{print $2"###"$3"###"$4}' data.csv
first###second###last

@ -32,11 +32,11 @@ You will get something like this:
Yes, we killed it
This is more or less a normal message. And it can\'t be easily
redirected since it\'s the shell itself that yells this message, not the
redirected since it's the shell itself that yells this message, not the
command `kill` or something else. You would have to redirect the whole
script\'s output.
script's output.
It\'s also useless to temporarily redirect `stderr` when you call the
It's also useless to temporarily redirect `stderr` when you call the
`kill` command, since the successful termination of the job, the
termination of the `kill` command and the message from the shell may not
happen at the same time. And a blind `sleep` after the `kill` would be

@ -7,7 +7,7 @@ ccsalvesen, others type: snipplet
------------------------------------------------------------------------
The purpose of this small code collection is to show some code that
draws a horizontal line using as few external tools as possible (it\'s
draws a horizontal line using as less external tools as possible (it's
not a big deal to do it with AWK or Perl, but with pure or nearly-pure
Bash it gets more interesting).
@ -66,7 +66,7 @@ printf '%*s\n' "${COLUMNS:-$(tput cols)}" '' | tr ' ' -
This one is a bit tricky. The format for the `printf` command is `%.0s`,
which specifies a field with a **maximum** length of **zero**. After
this field, `printf` is told to print a dash. You might remember that
it\'s the nature of `printf` to repeat, if the number of conversion
it's the nature of `printf` to repeat, if the number of conversion
specifications is less than the number of given arguments. With brace
expansion `{1..20}`, 20 arguments are given (you could easily write
`1 2 3 4 ... 20`, of course!). The following happens: The **zero-length
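Condensed, the whole trick fits on one line:

``` bash
# 20 arguments from brace expansion, each formatted as a zero-length
# string followed by a literal dash: the dash is printed 20 times
printf '%.0s-' {1..20}; echo
```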

@ -103,7 +103,7 @@ a=(one two three four five six seven eight nine ten)
printf '%.*s ' $(printf '%s ' "${#a[x=RANDOM%${#a[@]}]} ${a[x]}"{1..10})
```
This generates each parameter and it\'s length in pairs. The \'\*\'
This generates each parameter and its length in pairs. The \'\*\'
modifier instructs printf to use the value preceding each parameter as
the field width. Note the space between the parameters. This example
unfortunately relies upon the unquoted command substitution to perform

@ -20,7 +20,7 @@ from the C programming language.
This article describes the theory of the used syntax and the behaviour.
To get practical examples without big explanations, see [this page on
Greg\'s
Greg's
wiki](http://mywiki.wooledge.org/BashGuide/CompoundCommands#Arithmetic_Evaluation).
## Constants
@ -72,7 +72,7 @@ specified base greater than 10, characters other than 0 to 9 are needed
- `@`
- `_`
Let\'s quickly invent a new number system with base 43 to show what I
Let's quickly invent a new number system with base 43 to show what I
mean:
$ echo $((43#1))
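The same `BASE#NUMBER` notation works for the familiar bases, too:

``` bash
echo $((2#1010))   # prints 10  (binary)
echo $((8#777))    # prints 511 (octal)
echo $((16#ff))    # prints 255 (hexadecimal)
```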
@ -260,7 +260,7 @@ first.
## Arithmetic expressions and return codes
Bash\'s overall language construct is based on exit codes or return
Bash's overall language construct is based on exit codes or return
codes of commands or functions to be executed. `if` statements, `while`
loops, etc., they all take the return codes of commands as conditions.
@ -269,7 +269,7 @@ not 0 means \"FALSE\" or \"FAILURE\") don\'t correspond to the meaning
of the result of an arithmetic expression (0 means \"FALSE\", not 0
means \"TRUE\").
That\'s why all commands and keywords that do arithmetic operations
That's why all commands and keywords that do arithmetic operations
attempt to **translate** the arithmetical meaning into an equivalent
return code. This simply means:
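A quick check of this translation, using the `(( ))` command:

``` bash
((1)) && echo "((1)) succeeded: arithmetic TRUE maps to exit code 0"
((0)) || echo "((0)) failed:    arithmetic FALSE maps to exit code 1"
```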

@ -90,7 +90,7 @@ synonymous with referring to the array name without a subscript.
hi hi hi
The only exceptions to this rule are in a few cases where the array
variable\'s name refers to the array as a whole. This is the case for
variable's name refers to the array as a whole. This is the case for
the `unset` builtin (see [destruction](#Destruction)) and when declaring
an array without assigning any values (see [declaration](#Declaration)).
@ -106,7 +106,7 @@ arrays:
`declare -a ARRAY` Declares an **indexed** array `ARRAY`. An existing array is not initialized.
`declare -A ARRAY` Declares an **associative** array `ARRAY`. This is the one and only way to create associative arrays.
As an example, and for use below, let\'s declare our `NAMES` array as
As an example, and for use below, let's declare our `NAMES` array as
described [above](#purpose):
declare -a NAMES=('Peter' 'Anna' 'Greg' 'Jan')
@ -137,7 +137,7 @@ check the notes about arrays. \</note\>
Syntax Description
----------------------------------------------------------------------- -------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
`${ARRAY[N]}` Expands to the value of the index `N` in the **indexed** array `ARRAY`. If `N` is a negative number, it\'s treated as the offset from the maximum assigned index (can\'t be used for assignment) - 1
`${ARRAY[N]}` Expands to the value of the index `N` in the **indexed** array `ARRAY`. If `N` is a negative number, it's treated as the offset from the maximum assigned index (can\'t be used for assignment) - 1
`${ARRAY[S]}` Expands to the value of the index `S` in the **associative** array `ARRAY`.
`"${ARRAY[@]}" ${ARRAY[@]} "${ARRAY[*]}" ${ARRAY[*]}` Similar to [mass-expanding positional parameters](../scripting/posparams.md#mass_usage), this expands to all elements. If unquoted, both subscripts `*` and `@` expand to the same result, if quoted, `@` expands to all elements individually quoted, `*` expands to all elements quoted as a whole.
`"${ARRAY[@]:N:M}" ${ARRAY[@]:N:M} "${ARRAY[*]:N:M}" ${ARRAY[*]:N:M}` Similar to what this syntax does for the characters of a single string when doing [substring expansion](../syntax/pe.md#substring_expansion), this expands to `M` elements starting with element `N`. This way you can mass-expand individual indexes. The rules for quoting and the subscripts `*` and `@` are the same as above for the other mass-expansions.
@ -146,7 +146,7 @@ For clarification: When you use the subscripts `@` or `*` for
mass-expanding, then the behaviour is exactly what it is for `$@` and
`$*` when [mass-expanding the positional
parameters](../scripting/posparams.md#mass_usage). You should read this
article to understand what\'s going on.
article to understand what's going on.
### Metadata
@ -217,13 +217,13 @@ less explain all the needed background theory.
Now, some examples and comments for you.
Let\'s say we have an array `sentence` which is initialized as follows:
Let's say we have an array `sentence` which is initialized as follows:
sentence=(Be liberal in what you accept, and conservative in what you send)
Since no special code is there to prevent word splitting (no quotes),
every word there will be assigned to an individual array element. When
you count the words you see, you should get 12. Now let\'s see if Bash
you count the words you see, you should get 12. Now let's see if Bash
has the same opinion:
$ echo ${#sentence[@]}
@ -260,7 +260,7 @@ think you could just do
$ for ((i = 0; i < ${#sentence[@]}; i++)); do echo "Element $i: '${sentence[i]}'" ; done
Element 0: 'NAMES'
Obviously that\'s wrong. What about
Obviously that's wrong. What about
$ unset sentence ; declare -a sentence=${NAMES}
@ -271,7 +271,7 @@ Obviously that\'s wrong. What about
$ for ((i = 0; i < ${#sentence[@]}; i++)); do echo "Element $i: '${sentence[i]}'" ; done
Element 0: 'Peter'
So what\'s the **right** way? The (slightly ugly) answer is, reuse the
So what's the **right** way? The (slightly ugly) answer is, reuse the
enumeration syntax:
$ unset sentence ; declare -a sentence=("${NAMES[@]}")
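Put together, a copy made this way really carries all four elements:

``` bash
declare -a NAMES=('Peter' 'Anna' 'Greg' 'Jan')
declare -a sentence=("${NAMES[@]}")   # element-wise copy

for ((i = 0; i < ${#sentence[@]}; i++)); do
  echo "Element $i: '${sentence[i]}'"
done
```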
@ -563,9 +563,9 @@ dynamically calls a function whose name is resolved from the array.
- Assigning or referencing negative indexes in mksh causes
wrap-around. The max index appears to be `UINT_MAX`, which would be
addressed by `arr[-1]`.
- So far, Bash\'s `-v var` test doesn\'t support individual array
- So far, Bash's `-v var` test doesn\'t support individual array
subscripts. You may supply an array name to test whether an array is
defined, but can\'t check an element. ksh93\'s `-v` supports both.
defined, but can\'t check an element. ksh93's `-v` supports both.
Other shells lack a `-v` test.
### Bugs
@ -686,7 +686,7 @@ to generate these results.
variables?](http://mywiki.wooledge.org/BashFAQ/005) - A very
detailed discussion on arrays with many examples.
- [BashSheet - Arrays](http://mywiki.wooledge.org/BashSheet#Arrays) -
Bashsheet quick-reference on Greycat\'s wiki.
Bashsheet quick-reference on Greycat's wiki.
\<div hide\> vim: set fenc=utf-8 ff=unix ts=4 sts=4 sw=4 ft=dokuwiki et
wrap lbr: \</div\>

@ -33,7 +33,7 @@ to the environment of the `ls` program. It doesn\'t affect your current
shell. This also works while calling functions, unless Bash runs in
POSIX(r) mode (in which case it affects your current shell).
Every command has an exit code. It\'s a type of return status. The shell
Every command has an exit code. It's a type of return status. The shell
can catch it and act on it. Exit code range is from 0 to 255, where 0
means success, and the rest mean either something failed, or there is an
issue to report back to the calling program.
@ -41,7 +41,7 @@ issue to report back to the calling program.
\<wrap center round info 90%\> The simple command construct is the
**base** for all higher constructs. Everything you execute, from
pipelines to functions, finally ends up in (many) simple commands.
That\'s why Bash only has one method to [expand and execute a simple
That's why Bash only has one method to [expand and execute a simple
command](../syntax/grammar/parser_exec.md). \</wrap\>
## Pipelines
@ -50,7 +50,7 @@ FIXME Missing an additional article about pipelines and pipelining
`[time [-p]] [ ! ] command [ | command2 ... ]`
**Don\'t get confused** about the name \"pipeline.\" It\'s a grammatic
**Don\'t get confused** about the name \"pipeline.\" It's a grammatic
name for a construct. Such a pipeline isn\'t necessarily a pair of
commands where stdout/stdin is connected via a real pipe.
@ -76,24 +76,24 @@ executed if the pattern \"\^root:\" is **not** found in `/etc/passwd`:
Yes, this is also a pipeline (although there is no pipe!), because the
**exclamation mark to invert the exit code** can only be used in a
pipeline. If `grep`\'s exit code is 1 (FALSE) (the text was not found),
pipeline. If `grep`'s exit code is 1 (FALSE) (the text was not found),
the leading `!` will \"invert\" the exit code, and the shell sees (and
acts on) exit code 0 (TRUE) and the `then` part of the `if` stanza is
executed. One could say we checked for
\"`not grep "^root" /etc/passwd`\".
The [set option pipefail](../commands/builtin/set.md#attributes) determines
the behavior of how bash reports the exit code of a pipeline. If it\'s
the behavior of how bash reports the exit code of a pipeline. If it's
set, then the exit code (`$?`) is that of the last command that exits with non
zero status, if none fail, it\'s zero. If it\'s not set, then `$?`
zero status, if none fail, it's zero. If it's not set, then `$?`
always holds the exit code of the last command (as explained above).
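A short demonstration, using only `true` and `false`:

``` bash
false | true
echo "without pipefail: $?"   # prints 0 - only the last command counts

set -o pipefail
false | true
echo "with pipefail: $?"      # prints 1 - the failing command's status wins
set +o pipefail               # restore the default
```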
The shell option `lastpipe` will execute the last element in a pipeline
construct in the current shell environment, i.e. not a subshell.
There\'s also an array `PIPESTATUS[]` that is set after a foreground
There's also an array `PIPESTATUS[]` that is set after a foreground
pipeline is executed. Each element of `PIPESTATUS[]` reports the exit
code of the respective command in the pipeline. Note: (1) it\'s only for
code of the respective command in the pipeline. Note: (1) it's only for
foreground pipes and (2) for a higher-level structure that is built up from
a pipeline. Like a list, `PIPESTATUS[]` holds the exit status of the last
pipeline command executed.
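For example:

``` bash
true | false | true
echo "${PIPESTATUS[@]}"   # prints: 0 1 0 - one exit code per element
```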
@ -115,7 +115,7 @@ A list is a sequence of one or more [pipelines](../basicgrammar.md#pipelines)
separated by one of the operators `;`, `&`, `&&`, or `││`, and
optionally terminated by one of `;`, `&`, or `<newline>`.
=\> It\'s a group of **pipelines** separated or terminated by **tokens**
=\> It's a group of **pipelines** separated or terminated by **tokens**
that all have **different meanings** for Bash.
Your whole Bash script technically is one big single list!
@ -139,7 +139,7 @@ There are two forms of compound commands:
- form a new syntax element using a list as a \"body\"
- completely independent syntax elements
Essentially, everything else that\'s not described in this article.
Essentially, everything else that's not described in this article.
Compound commands have the following characteristics:
- they **begin** and **end** with a specific keyword or operator (e.g.
@ -208,7 +208,7 @@ Bash allows three equivalent forms of the function definition:
The space between `NAME` and `()` is optional, usually you see it
without the space.
I suggest using the first form. It\'s specified in POSIX and all
I suggest using the first form. It's specified in POSIX and all
Bourne-like shells seem to support it.
[**Note:**]{.underline} Before version `2.05-alpha1`, Bash only
@ -312,7 +312,7 @@ FIXME more\...
- the [list](../basicgrammar.md#lists) that `if` **executes** contains a
simple command (`cp mymusic.mp3 /data/mp3`)
Let\'s invert test command exit code, only one thing changes:
Let's invert test command exit code, only one thing changes:
if ! [ -d /data/mp3 ]; then
cp mymusic.mp3 /data/mp3

@ -28,7 +28,7 @@ mechanisms available in the language.
The `((;;))` syntax at the top of the loop is not an ordinary
[arithmetic compound command](../../syntax/ccmd/arithmetic_eval.md), but is part
of the C-style for-loop\'s own syntax. The three sections separated by
of the C-style for-loop's own syntax. The three sections separated by
semicolons are [arithmetic expression](../../syntax/arith_expr.md) contexts.
Each time one of the sections is to be evaluated, the section is first
processed for: brace, parameter, command, arithmetic, and process
@ -85,7 +85,7 @@ Bash, Ksh93, Mksh, and Zsh also provide an alternate syntax for the
This syntax is **not documented** and shouldn\'t be used. I found the
parser definitions for it in 1.x code, and in modern 4.x code. My guess
is that it\'s there for compatibility reasons. Unlike the other
is that it's there for compatibility reasons. Unlike the other
aforementioned shells, Bash does not support the analogous syntax for
[case..esac](../../syntax/ccmd/case.md#portability_considerations).
@ -176,7 +176,7 @@ value (8 bits).
\</div\>
Why that one begins at 128 (highest value, on the left) and not 1
(lowest value, on the right)? It\'s easier to print from left to
(lowest value, on the right)? It's easier to print from left to
right\...
We arrive at 128 for `n` through the recursive arithmetic expression
@ -227,7 +227,7 @@ variables.
## Bugs
- *Fixed in 4.3*. ~~There appears to be a bug as of Bash 4.2p10 in
which command lists can\'t be distinguished from the for loop\'s
which command lists can\'t be distinguished from the for loop's
arithmetic argument delimiter (both semicolons), so command
substitutions within the C-style for loop expression can\'t contain
more than one command.~~

@ -18,7 +18,7 @@ against every pattern `<PATTERNn>` and on a match, the associated
[list](../../syntax/basicgrammar.md#lists) `<LISTn>` is executed. Every
commandlist is terminated by `;;`. This rule is optional for the very
last commandlist (i.e., you can omit the `;;` before the `esac`). Every
`<PATTERNn>` is separated from it\'s associated `<LISTn>` by a `)`, and
`<PATTERNn>` is separated from it's associated `<LISTn>` by a `)`, and
is optionally preceded by a `(`.
Bash 4 introduces two new action terminators. The classic behavior using
@ -81,10 +81,10 @@ Another one of my stupid examples\...
echo "Unknown fruit - sure it isn't toxic?"
esac
Here\'s a practical example showing a common pattern involving a `case`
Here's a practical example showing a common pattern involving a `case`
statement. If the first argument is one of a valid set of alternatives,
then perform some sysfs operations under Linux to control a video
card\'s power profile. Otherwise, show a usage synopsis, and print the
card's power profile. Otherwise, show a usage synopsis, and print the
current power profile and GPU temperature.
``` bash
@ -146,7 +146,7 @@ f a b c
## Portability considerations
- Only the `;;` delimiter is specified by POSIX.
- zsh and mksh use the `;|` control operator instead of Bash\'s `;;&`.
- zsh and mksh use the `;|` control operator instead of Bash's `;;&`.
Mksh has `;;&` for Bash compatibility (undocumented).
- ksh93 has the `;&` operator, but no `;;&` or equivalent.
- ksh93, mksh, zsh, and posh support a historical syntax where open

@ -53,7 +53,7 @@ for x in 1 2 3
This syntax is **not documented** and should not be used. I found the
parser definitions for it in 1.x code, and in modern 4.x code. My guess
is that it\'s there for compatiblity reasons. This syntax is not
is that it's there for compatibility reasons. This syntax is not
specified by POSIX(r).
### Return status
@ -137,13 +137,13 @@ done
This is just an example. In *general*
- it\'s not a good idea to parse `ls(1)` output
- it's not a good idea to parse `ls(1)` output
- the [while loop](../../syntax/ccmd/while_loop.md) (using the `read` command)
is a better choice to iterate over lines
### Nested for-loops
It\'s of course possible to use another for-loop as `<LIST>`. Here,
It's of course possible to use another for-loop as `<LIST>`. Here,
counting from 0 to 99 in a weird way:
``` bash

@ -79,7 +79,7 @@ quoting:
fi
Compare that to the [classic test command](../../commands/classictest.md), where
word splitting is done (because it\'s a normal command, not something
word splitting is done (because it's a normal command, not something
special):
sentence="Be liberal in what you accept, and conservative in what you send"
@ -178,7 +178,7 @@ both contains whitespace and is not the result of an expansion.
## Portability considerations
- `[[ ... ]]` functionality isn\'t specified by POSIX(R), though it\'s
- `[[ ... ]]` functionality isn\'t specified by POSIX(R), though it's
a reserved word
- Amongst the major \"POSIX-shell superset languages\" (for lack of a
better term) which do have `[[`, the test expression compound
@ -187,7 +187,7 @@ both contains whitespace and is not the result of an expansion.
between Ksh88, Ksh93, mksh, Zsh, and Bash. Ksh93 also adds a large
number of unique pattern matching features not supported by other
shells including support for several different regex dialects, which
are invoked using a different syntax from Bash\'s `=~`, though `=~`
are invoked using a different syntax from Bash's `=~`, though `=~`
is still supported by ksh and defaults to ERE.
- As an extension to POSIX ERE, most GNU software supports
backreferences in ERE, including Bash. According to POSIX, only BRE
@ -207,5 +207,5 @@ both contains whitespace and is not the result of an expansion.
- Internal: [the classic test command](../../commands/classictest.md)
- Internal: [the if-clause](../../syntax/ccmd/if_clause.md)
- [What is the difference between test, \[ and \[\[
?](http://mywiki.wooledge.org/BashFAQ/031) - BashFAQ 31 - Greg\'s
?](http://mywiki.wooledge.org/BashFAQ/031) - BashFAQ 31 - Greg's
wiki.

@ -31,7 +31,7 @@ The input and output **filedescriptors** are cumulative:
This compound command also usually is the body of a [function
definition](../../syntax/basicgrammar.md#shell_function_definitions), though not
the only compound command that\'s valid there:
the only compound command that's valid there:
print_help() {
echo "Options:"

@ -32,4 +32,4 @@ echo "$PWD" # Still in the original directory.
## See also
- [grouping commands](../../syntax/ccmd/grouping_plain.md)
- [Subshells on Greycat\'s wiki](http://mywiki.wooledge.org/SubShell)
- [Subshells on Greycat's wiki](http://mywiki.wooledge.org/SubShell)

@ -22,7 +22,7 @@
## Description
The `if`-clause can control the script\'s flow (what\'s executed) by
The `if`-clause can control the script's flow (what's executed) by
looking at the exit codes of other commands.
All commandsets `<LIST>` are interpreted as [command
@ -65,19 +65,19 @@ commands of the condition that succeeded.
**Multiple commands as condition**
It\'s perfectly valid to do:
It's perfectly valid to do:
if echo "I'm testing!"; [ -e /some/file ]; then
...
fi
The exit code that dictates the condition\'s value is the exit code of
The exit code that dictates the condition's value is the exit code of
the very last command executed in the condition-list (here: The
`[ -e /some/file ]`)
**A complete pipe as condition**
A complete pipe can also be used as condition. It\'s very similar to the
A complete pipe can also be used as condition. It's very similar to the
example above (multiple commands):
if echo "Hello world!" | grep -i hello >/dev/null 2>&1; then

@ -1,6 +1,6 @@
# Bash compound commands
The main part of Bash\'s syntax are the so-called **compound commands**.
The main part of Bash's syntax are the so-called **compound commands**.
They\'re called like that because they use \"real\" commands ([simple
commands](../../syntax/basicgrammar.md#simple_commands) or
[lists](../../syntax/basicgrammar.md#lists)) and knit some intelligence around

@ -46,7 +46,7 @@ loop body in `{...}` instead of `do ... done`:
This syntax is **not documented** and should not be used. I found the
parser definitions for it in 1.x code, and in modern 4.x code. My guess
is that it\'s there for compatiblity reasons. This syntax is not
is that it's there for compatibility reasons. This syntax is not
specified by POSIX(R).
## Examples

@ -8,7 +8,7 @@ The [arithmetic expression](../../syntax/arith_expr.md) `<EXPRESSION>` is
evaluated and expands to the result. The output of the arithmetic
expansion is guaranteed to be one word and a digit in Bash.
Please **do not use the second form `$[ ... ]`**! It\'s deprecated. The
Please **do not use the second form `$[ ... ]`**! It's deprecated. The
preferred and standardized form is `$(( ... ))`!
Example
@ -62,7 +62,7 @@ echo $(($x[0])) # Error. This expands to $((1[0])), an invalid expression.
have to use something like `expr(1)` within backticks instead. Since
`expr` is horrible (as are backticks), and arithmetic expansion is
required by POSIX, you should not worry about this, and preferably
fix any code you find that\'s still using `expr`.
fix any code you find that's still using `expr`.
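For example, a legacy `expr` call next to its modern replacement:

``` bash
old=$(expr 6 \* 7)   # external process, the * must be escaped
new=$((6 * 7))       # built-in arithmetic expansion, POSIX
echo "$old $new"     # prints: 42 42
```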
## See also

@ -17,10 +17,10 @@ Brace expansion is used to generate arbitrary strings. The specified
strings are used to generate **all possible combinations** with the
optional surrounding prefixes and suffixes.
Usually it\'s used to generate mass-arguments for a command, that follow
Usually it's used to generate mass-arguments for a command that follow
a specific naming-scheme.
:!: It is the very first step in expansion-handling, it\'s important to
:!: It is the very first step in expansion-handling, it's important to
understand that. When you use
echo {a,b}$PATH
@ -32,7 +32,7 @@ in a **later step**. Brace expansion just makes it being:
Another common pitfall is to assume that a range like `{1..200}` can be
expressed with variables using `{$a..$b}`. Due to what I described
above, it **simply is not possible**, because it\'s the very first step
above, it **simply is not possible**, because it's the very first step
in doing expansions. A possible way to achieve this, if you really
can\'t handle this in another way, is using the `eval` command, which
basically evaluates a commandline twice: `eval echo {$a..$b}` For
@ -79,7 +79,7 @@ With prefix or suffix strings, the result is a space-separated list of
The brace expansion is only performed if the given string list is
really a **list of strings**, i.e., if there is a minimum of one \"`,`\"
(comma)! Something like `{money}` doesn\'t expand to something special,
it\'s really only the text \"`{money}`\".
it's really only the text \"`{money}`\".
## Ranges
@ -112,7 +112,7 @@ generated range is zero padded, too:
$ echo {01..10}
01 02 03 04 05 06 07 08 09 10
There\'s a chapter of Bash 4 brace expansion changes at [the end of this
There's a chapter of Bash 4 brace expansion changes at [the end of this
article](#new_in_bash_4.0).
Similar to the expansion using stringlists, you can add prefix and
@ -127,7 +127,7 @@ suffix strings:
## Combining and nesting
When you combine more brace expansions, you effectively use a brace
expansion as prefix or suffix for another one. Let\'s generate all
expansion as prefix or suffix for another one. Let's generate all
possible combinations of uppercase letters and digits:
$ echo {A..Z}{0..9}
@ -147,7 +147,7 @@ Hey.. that **saves you writing** 260 strings!
Brace expansions can be nested, but too much of it usually makes you
lose the overview a bit ;-)
Here\'s a sample to generate the alphabet, first the uppercase letters,
Here's a sample to generate the alphabet, first the uppercase letters,
then the lowercase ones:
$ echo {{A..Z},{a..z}}
@ -165,13 +165,13 @@ download.
wget http://docs.example.com/documentation/slides_part{1,2,3,4,5,6}.html
Of course it\'s possible, and even easier, to do that with a sequence:
Of course it's possible, and even easier, to do that with a sequence:
wget http://docs.example.com/documentation/slides_part{1..6}.html
### Generate a subdirectory structure
Your life is hard? Let\'s ease it a bit - that\'s what shells are here
Your life is hard? Let's ease it a bit - that's what shells are here
for.
mkdir /home/bash/test/{foo,bar,baz,cat,dog}
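With `mkdir -p` and nested brace expansions, a whole tree appears in one line (the path is only an example):

``` bash
mkdir -p /tmp/test.$$/{foo,bar,baz}/{old,new}   # 6 leaf directories at once
[ -d /tmp/test.$$/baz/new ] && echo "tree created"
rm -rf /tmp/test.$$
```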
@ -214,7 +214,7 @@ Can be written as
#### More fun
The optimal brace expansion to expand n arguments of
course consists of n\'s prime factors. We can use the \"factor\" program
course consists of n's prime factors. We can use the \"factor\" program
bundled with GNU coreutils to emit a brace expansion that will expand
any number of arguments.

@ -23,7 +23,7 @@ results, quote the command substitution!**
The second form `` `COMMAND` `` is more or less obsolete for Bash, since
it has some trouble with nesting (\"inner\" backticks need to be
escaped) and escaping characters. Use `$(COMMAND)`, it\'s also POSIX!
escaped) and escaping characters. Use `$(COMMAND)`, it's also POSIX!
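Nesting shows the difference most clearly; `$()` needs no escaping, backticks do:

``` bash
outer=$(echo "inner: $(echo nested)")      # nests naturally
outer2=`echo "inner: \`echo nested\`"`     # inner backticks must be escaped
echo "$outer"    # prints: inner: nested
echo "$outer2"   # prints: inner: nested
```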
When you [call an explicit subshell](../../syntax/ccmd/grouping_subshell.md)
`(COMMAND)` inside the command substitution `$()`, then take care, this
@ -51,8 +51,8 @@ command was `cat FILE`.
## A closer look at the two forms
In general you really should only use the form `$()`, it\'s
escaping-neutral, it\'s nestable, it\'s also POSIX. But take a look at
In general you really should only use the form `$()`, it's
escaping-neutral, it's nestable, it's also POSIX. But take a look at
the following code snips to decide yourself which form you need under
specific circumstances:
@ -77,7 +77,7 @@ normal on a commandline. No special escaping of **nothing** is needed:
**[Constructs you should avoid]{.underline}**
It\'s not all shiny with `$()`, at least for my current Bash
It's not all shiny with `$()`, at least for my current Bash
(`3.1.17(1)-release`. :!: [**Update:** Fixed since `3.2-beta` together
with a misinterpretation of \'))\' being recognized as arithmetic
expansion \[by redduck666\]]{.underline}). This command seems to
@ -101,10 +101,10 @@ uncommon ;-)) construct like:
In general, the `$()` should be the preferred method:
- it\'s clean syntax
- it\'s intuitive syntax
- it\'s more readable
- it\'s nestable
- it's clean syntax
- it's intuitive syntax
- it's more readable
- it's nestable
- its inner parsing is separate
## Examples

View File

@ -47,7 +47,7 @@ other `IFS`-characters they contain.
recursively get all `*.c` filenames.
- when the shell option `globasciiranges` is set, the bracket-range
globs (e.g. `[A-Z]`) use C locale order rather than the configured
locale\'s order (i.e. `ABC...abc...` instead of e.g. `AaBbCc...`) -
locale's order (i.e. `ABC...abc...` instead of e.g. `AaBbCc...`) -
since 4.3-alpha
- the variable [GLOBIGNORE](../../syntax/shellvars.md#GLOBIGNORE) can be set
to a colon-separated list of patterns to be removed from the list

View File

@ -15,7 +15,7 @@ The most simple example of this behaviour is a referenced variable:
echo "$mystring"
The `echo` program definitely doesn\'t care about what a shell variable
is. It is Bash\'s job to deal with the variable. Bash **expands** the
is. It is Bash's job to deal with the variable. Bash **expands** the
string \"`$mystring`\" to \"`Hello world`\", so that `echo` will only
see `Hello world`, not the variable or anything else!
@ -27,7 +27,7 @@ is called **quote-removal**.
## Overview
Saw a possible expansion syntax but don\'t know what it is? Here\'s a
Saw a possible expansion syntax but don\'t know what it is? Here's a
small list.
- [Parameter expansion](../../syntax/pe.md) (it has its own [overview

View File

@ -52,7 +52,7 @@ the caller when the callee returns.
In essence, process substitutions expanded to variables within functions
remain open until the function in which the process substitution occurred
returns - even when assigned to locals that were set by a function\'s
returns - even when assigned to locals that were set by a function's
caller. Dynamic scope doesn\'t protect them from closing.
## Examples
@ -87,7 +87,7 @@ where those files are written to and destroyed automatically.
\<WRAP center round info 60%\> See Also:
[BashFAQ/024](http://mywiki.wooledge.org/BashFAQ/024) \-- *I set
variables in a loop that\'s in a pipeline. Why do they disappear after
variables in a loop that's in a pipeline. Why do they disappear after
the loop terminates? Or, why can\'t I pipe data to read?* \</WRAP\>
One of the most common uses for process substitutions is to avoid the
@ -124,7 +124,7 @@ echo "$counter files"
```
This is the normal input file redirection `< FILE`, just that the `FILE`
in this case is the result of process substitution. It\'s important to
in this case is the result of process substitution. It's important to
note that the space is required in order to disambiguate the syntax from
[here documents](../../syntax/redirection.md#here_documents).
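A minimal sketch of that redirection form, feeding a loop from a process substitution so the counter survives the loop (the names are illustrative):

``` bash
# Count lines produced by a command without a pipeline subshell
counter=0
while IFS= read -r line; do
  counter=$((counter + 1))
done < <(printf '%s\n' one two three)
echo "$counter lines"            # → 3 lines
```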
@ -138,7 +138,7 @@ note that the space is required in order to disambiguate the syntax from
This example demonstrates how process substitutions can be made to
resemble \"passable\" objects. This results in converting the output of
`f`\'s argument to uppercase.
`f`'s argument to uppercase.
``` bash
f() {

View File

@ -23,7 +23,7 @@ The tilde expansion is used to expand to several specific pathnames:
Tilde expansion is only performed, when the tilde-construct is at the
beginning of a word, or a separate word.
If there\'s nothing to expand, i.e., in case of a wrong username or any
If there's nothing to expand, i.e., in case of a wrong username or any
other error condition, the tilde construct is not replaced; it stays
as it is.
Tilde expansion is also performed every time a variable is assigned:
`TARGET=file:~moonman/share`
\<note info\> As of now (Bash 4.3-alpha) the following constructs
**also** works, though it\'s not a variable assignment:
**also** work, though it's not a variable assignment:
echo foo=~
echo foo=:~
@ -68,7 +68,7 @@ operating system for the associated home directory for `<NAME>`.
To find the home directory of the current user (`~`), Bash has a
precedence:
- expand to the value of [HOME](../../syntax/shellvars.md#HOME) if it\'s
- expand to the value of [HOME](../../syntax/shellvars.md#HOME) if it's
defined
- expand to the home directory of the user executing the shell
(operating system)
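That precedence is easy to observe; a sketch using a made-up `HOME` value inside a subshell so the change stays local:

``` bash
# Tilde follows $HOME when it is set
( HOME=/tmp/fakehome
  echo ~         # → /tmp/fakehome
  echo ~/sub )   # → /tmp/fakehome/sub
```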

View File

@ -49,4 +49,4 @@ is solely responsible.
- [WordSplitting](http://mywiki.wooledge.org/WordSplitting),
[IFS](http://mywiki.wooledge.org/IFS), and
[DontReadLinesWithFor](http://mywiki.wooledge.org/DontReadLinesWithFor) -
Greg\'s wiki
Greg's wiki

View File

@ -56,7 +56,7 @@ redirected to the pipe.
#### Avoid the final pipeline subshell
The traditional Ksh workaround to avoid the subshell when doing
`command | while read` is to use a coprocess. Unfortunately, Bash\'s
`command | while read` is to use a coprocess. Unfortunately, Bash's
behavior differs.
In Ksh you would do:
@ -81,14 +81,14 @@ bash: read: line: invalid file descriptor specification
By the time we start reading from the output of the coprocess, the file
descriptor has been closed.
See [this FAQ entry on Greg\'s
See [this FAQ entry on Greg's
wiki](http://mywiki.wooledge.org/BashFAQ/024) for other pipeline
subshell workarounds.
#### Buffering
In the first example, we GNU awk\'s `fflush()` command. As always, when
you use pipes the I/O operations are buffered. Let\'s see what happens
In the first example, we used GNU awk's `fflush()` command. As always, when
you use pipes the I/O operations are buffered. Let's see what happens
with `sed`:
``` bash
@ -102,7 +102,7 @@ nothing read
Even though this example is the same as the first `awk` example, the
`read` doesn\'t return because the output is waiting in a buffer.
See [this faq entry on Greg\'s
See [this faq entry on Greg's
wiki](http://mywiki.wooledge.org/BashFAQ/009) for some workarounds and
more information on buffering issues.
@ -149,7 +149,7 @@ Here, fd 3 is inherited.
Unlike ksh, Bash doesn\'t have true anonymous coprocesses. Instead, Bash
assigns FDs to a default array named `COPROC` if no `NAME` is supplied.
Here\'s an example:
Here's an example:
``` bash
$ coproc awk '{print "foo" $0;fflush()}'
@ -209,10 +209,10 @@ exec >&${tee[1]} 2>&1
- The `coproc` keyword is not specified by POSIX(R)
- The `coproc` keyword appeared in Bash version 4.0-alpha
- The `-p` option to Bash\'s `print` loadable is a NOOP and not
- The `-p` option to Bash's `print` loadable is a NOOP and not
connected to Bash coprocesses in any way. It is only recognized as
an option for ksh compatibility, and has no effect.
- The `-p` option to Bash\'s `read` builtin conflicts with that of all
- The `-p` option to Bash's `read` builtin conflicts with that of all
kshes and zsh. The equivalent in those shells is to add a `\?prompt`
suffix to the first variable name argument to `read`. i.e., if the
first variable name given contains a `?` character, the remainder of
@ -236,14 +236,14 @@ and another coprocess may then be started, again using `|&`.
zsh coprocesses are very similar to ksh except in the way they are
started. zsh adds the shell reserved word `coproc` to the pipeline
syntax (similar to the way Bash\'s `time` keyword works), so that the
pipeline that follows is started as a coproc. The coproc\'s input and
syntax (similar to the way Bash's `time` keyword works), so that the
pipeline that follows is started as a coproc. The coproc's input and
output FDs can then be accessed and moved using the same `read`/`print`
`-p` and redirects used by the ksh shells.
It is unfortunate that Bash chose to go against existing practice in
its coproc implementation, especially considering it was the last of
the major shells to incorporate this feature. However, Bash\'s method
the major shells to incorporate this feature. However, Bash's method
accomplishes the same without requiring nearly as much additional
syntax. The `coproc` keyword is easy enough to wrap in a function such
that it takes Bash code as an ordinary argument and/or stdin like
@ -265,6 +265,6 @@ than one. This may be overridden at compile-time with the
## See also
- [Anthony Thyssen\'s Coprocess
- [Anthony Thyssen's Coprocess
Hints](http://www.ict.griffith.edu.au/anthony/info/shell/co-processes.hints) -
excellent summary of everything around the topic

View File

@ -14,7 +14,7 @@ A pattern is a **string description**. Bash uses them in various ways:
- Pattern-based branching using the [case command](../syntax/ccmd/case.md)
The pattern description language is relatively easy. Any character
that\'s not mentioned below matches itself. The `NUL` character may not
that's not mentioned below matches itself. The `NUL` character may not
occur in a pattern. If special characters are quoted, they\'re matched
literally, i.e., without their special meaning.
@ -28,7 +28,7 @@ share some symbols and do similar matching work.
`*` Matches **any string**, including the null string (empty string)
`?` Matches any **single character**
`X` Matches the character `X` which can be any character that has no special meaning
`\X` Matches the character `X`, where the character\'s special meaning is stripped by the backslash
`\X` Matches the character `X`, where the character's special meaning is stripped by the backslash
`\\` Matches a backslash
`[...]` Defines a pattern **bracket expression** (see below). Matches any of the enclosed characters at this position.
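The basic symbols in action inside a `case` command (the filenames are invented):

``` bash
f=archive.tar.gz
case $f in
  \*.gz)     echo "literal asterisk" ;;  # \* strips the special meaning
  *.tar.gz)  echo "gzipped tarball"  ;;  # * matches any string
  ?.gz)      echo "one-char name"    ;;  # ? matches a single character
  *)         echo "something else"   ;;
esac
# → gzipped tarball
```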
@ -127,7 +127,7 @@ least) ksh93 and zsh translate patterns into regexes and then use a
regex compiler to emit and cache optimized pattern matching code. This
means Bash may be an order of magnitude or more slower in cases that
involve complex back-tracking (usually that means extglob quantifier
nesting). You may wish to use Bash\'s regex support (the `=~` operator)
nesting). You may wish to use Bash's regex support (the `=~` operator)
if performance is a problem, because Bash will use your C library regex
implementation rather than its own pattern matcher.
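A hedged sketch of reaching for `=~` instead of an extglob (the pattern and input are illustrative):

``` bash
line="user42"
if [[ $line =~ ^[[:alpha:]]+[[:digit:]]+$ ]]; then
  echo "matched"            # → matched
fi
echo "${BASH_REMATCH[0]}"   # → user42 (the whole match lands in BASH_REMATCH)
```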
@ -141,8 +141,8 @@ to those described above.
\* ksh93 supports arbitrary quantifiers just like ERE using the
`{from,to}(pattern-list)` syntax. `{2,4}(foo)bar` matches between 2-4
\"foo\"\'s followed by \"bar\". `{2,}(foo)bar` matches 2 or more
\"foo\"\'s followed by \"bar\". You can probably figure out the rest. So
\"foo\"'s followed by \"bar\". `{2,}(foo)bar` matches 2 or more
\"foo\"'s followed by \"bar\". You can probably figure out the rest. So
far, none of the other shells support this syntax.
\* In ksh93, a `pattern-list` may be delimited by either `&` or `|`. `&`
@ -155,7 +155,7 @@ double extglob negation. The aforementioned ksh93 pattern is equivalent
in Bash to: `[[ fo0bar == !(!(fo[0-9])|!(+([[:alnum:]])))bar ]]`, which
is technically more portable, but ugly.
\* ksh93\'s [printf](../commands/builtin/printf.md) builtin can translate from
\* ksh93's [printf](../commands/builtin/printf.md) builtin can translate from
shell patterns to ERE and back again using the `%R` and `%P` format
specifiers respectively.

View File

@ -84,7 +84,7 @@ Looking for a specific syntax you saw, without knowing the name?
`${PARAMETER}`
The easiest form is to just use a parameter\'s name within braces. This
The easiest form is to just use a parameter's name within braces. This
is identical to using `$FOO` like you see it everywhere, but has the
advantage that it can be immediately followed by characters that would
be interpreted as part of the parameter name otherwise. Compare these
@ -287,7 +287,7 @@ This list will also include [array names](../syntax/arrays.md).
`${PARAMETER%%PATTERN}`
This one can **expand only a part** of a parameter\'s value, **given a
This one can **expand only a part** of a parameter's value, **given a
pattern to describe what to remove** from the string. The pattern is
interpreted just like a pattern to describe a filename to match
(globbing). See [Pattern matching](../syntax/pattern.md) for more.
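A quick sketch of shortest (`%`) versus longest (`%%`) suffix removal (the filename is invented):

``` bash
file=archive.tar.gz
echo "${file%.*}"     # → archive.tar  (shortest suffix matching .* removed)
echo "${file%%.*}"    # → archive      (longest suffix matching .* removed)
```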
@ -392,7 +392,7 @@ The first one (*one slash*) is to only substitute **the first
occurrence** of the given pattern, the second one (*two slashes*) is to
substitute **all occurrences** of the pattern.
First, let\'s try to say \"happy\" instead of \"conservative\" in our
First, let's try to say \"happy\" instead of \"conservative\" in our
example string:
${MYSTRING//conservative/happy}
@ -403,8 +403,8 @@ example string:
Since there is only one \"conservative\" in that example, it really
doesn\'t matter which of the two forms we use.
Let\'s play with the word \"in\", I don\'t know if it makes any sense,
but let\'s substitute it with \"by\".
Let's play with the word \"in\", I don\'t know if it makes any sense,
but let's substitute it with \"by\".
[**First form: Substitute first occurrence**]{.underline}
@ -455,7 +455,7 @@ Assume: `array=(This is a text)`
`${#PARAMETER}`
When you use this form, the length of the parameter\'s value is
When you use this form, the length of the parameter's value is
expanded. Again, a quote from a big man, to have a test text:
MYSTRING="Be liberal in what you accept, and conservative in what you send"
@ -468,7 +468,7 @@ The length is reported in characters, not in bytes. Depending on your
environment this may not always be the same (multibyte-characters, like
in UTF8 encoding).
There\'s not much to say about it, mh?
There's not much to say about it, mh?
### (String) length: Arrays
@ -500,10 +500,10 @@ number of elements!**
`${PARAMETER:OFFSET:LENGTH}`
This one can expand only a **part** of a parameter\'s value, given a
This one can expand only a **part** of a parameter's value, given a
**position to start** and maybe a **length**. If `LENGTH` is omitted,
the parameter will be expanded up to the end of the string. If `LENGTH`
is negative, it\'s taken as a second offset into the string, counting
is negative, it's taken as a second offset into the string, counting
from the end of the string.
`OFFSET` and `LENGTH` can be **any** [arithmetic
@ -534,7 +534,7 @@ In the second form we also give a length value:
### Negative Offset Value
If the given offset is negative, it\'s counted from the end of the
If the given offset is negative, it's counted from the end of the
string, i.e. an offset of -1 is the last character. In that case, the
length still counts forward, of course. One special thing is to do when
using a negative offset: You need to separate the (negative) number from
@ -543,12 +543,12 @@ the colon:
${MYSTRING: -10:5}
${MYSTRING:(-10):5}
Why? Because it\'s interpreted as the parameter expansion syntax to [use
Why? Because it's interpreted as the parameter expansion syntax to [use
a default value](../syntax/pe.md#use_a_default_value).
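Seeing both parses side by side makes the ambiguity obvious (the test string is borrowed from the examples above):

``` bash
MYSTRING="Be liberal in what you accept"
echo "${MYSTRING: -6}"   # → accept  (substring: the last six characters)
echo "${MYSTRING:-6}"    # → Be liberal in what you accept
                         #   (default-value syntax; MYSTRING is set, so its value is used)
```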
### Negative Length Value
If the `LENGTH` value is negative, it\'s used as offset from the end of
If the `LENGTH` value is negative, it's used as offset from the end of
the string. The expansion happens from the first to the second offset
then:
@ -614,7 +614,7 @@ have to make a difference between expanding an individual element by a
given index and mass-expanding the array using the `@` and `*`
subscripts.
- For individual elements, it\'s the very same: If the expanded
- For individual elements, it's the very same: If the expanded
element is `NULL` or unset (watch the `:-` and `-` variants), the
default text is expanded
- For mass-expansion syntax, the default text is expanded if the array
@ -683,7 +683,7 @@ assigned when the parameter was **unset**.
After the first expansion here (`${HOME:=/home/$USER}`), `HOME` is set
and usable.
Let\'s change our code example from above:
Let's change our code example from above:
#!/bin/bash
@ -696,7 +696,7 @@ Let\'s change our code example from above:
For [arrays](../syntax/arrays.md) this expansion type is limited. For an
individual index, it behaves like for a \"normal\" parameter, the
default value is assigned to this one element. The mass-expansion
subscripts `@` and `*` **can not be used here** because it\'s not
subscripts `@` and `*` **can not be used here** because it's not
possible to assign to them!
## Use an alternate value
@ -706,7 +706,7 @@ possible to assign to them!
`${PARAMETER+WORD}`
This form expands to nothing if the parameter is unset or empty. If it
is set, it does not expand to the parameter\'s value, **but to some text
is set, it does not expand to the parameter's value, **but to some text
you can specify**:
echo "The Java application was installed and can be started.${JAVAPATH:+ NOTE: JAVAPATH seems to be set}"
@ -900,7 +900,7 @@ Removing the first 6 characters from a text string:
```{=html}
<!-- -->
```
- Bash (and most other shells) don\'t allow .\'s in identifiers. In
- Bash (and most other shells) don\'t allow .'s in identifiers. In
ksh93, dots in variable names are used to reference methods (i.e.
\"Discipline Functions\"), attributes, special shell variables, and
to define the \"real value\" of an instance of a class.
@ -1030,7 +1030,7 @@ it. Bash will actually expand the command as one of these:
To the best of my knowledge, ksh93 is the only shell that acts
differently. Rather than forcing nested expansions into quoting, a quote
at the beginning and end of the nested region will cause the quote state
to reverse itself within the nested part. I have no idea whether it\'s
to reverse itself within the nested part. I have no idea whether it's
an intentional or documented effect, but it does solve the problem and
consequently adds a lot of potential power to these expansions.

View File

@ -43,7 +43,7 @@ backslash:
echo \$HOME is set to \"$HOME\"
- `\$HOME` won\'t expand because it\'s not in variable-expansion
- `\$HOME` won\'t expand because it's not in variable-expansion
syntax anymore
- The backslash changes the quotes into literals - otherwise Bash
would interpret them
@ -68,7 +68,7 @@ meaning to bash. [Exception:]{.underline} Inside a single-quoted string
## Weak quoting
Inside a weak-quoted string there\'s **no special interpretion of**:
Inside a weak-quoted string there's **no special interpretation of**:
- spaces as word-separators (on initial command line splitting and on
[word splitting](../syntax/expansion/wordsplit.md)!)
@ -88,7 +88,7 @@ unless you have a file named `*`, spit out an error.
echo "Your PATH is: $PATH"
Will work as expected. `$PATH` is expanded, because it\'s double (weak)
Will work as expected. `$PATH` is expanded, because it's double (weak)
quoted.
If a backslash occurs in double quotes (\"weak quoting\"), there are 2
@ -112,8 +112,8 @@ single-quote that closes the string.
echo 'Your PATH is: $PATH'
`$PATH` won\'t be expanded, it\'s interpreted as ordinary text because
it\'s surrounded by strong quotes.
`$PATH` won\'t be expanded, it's interpreted as ordinary text because
it's surrounded by strong quotes.
In practice that means, to produce a text like `Here's my test...` as a
single-quoted string, you have to leave and re-enter the single quoting
@ -222,7 +222,7 @@ is seen as **one word**. The for loop iterates exactly one time, with
The command `test` or `[ ... ]` ([the classic test
command](../commands/classictest.md)) is an ordinary command, so ordinary
syntax rules apply. Let\'s take string comparison as an example:
syntax rules apply. Let's take string comparison as an example:
[ WORD = WORD ]
@ -232,7 +232,7 @@ writing this as a test command it would be:
test WORD = WORD
When you compare variables, it\'s wise to quote them. Let\'s create a
When you compare variables, it's wise to quote them. Let's create a
test string with spaces:
mystring="my string"

View File

@ -2,7 +2,7 @@
\<wrap left todo\>Fix me: To be continued\</wrap\>\
Redirection makes it possible to control where the output of a command
goes to, and where the input of a command comes from. It\'s a mighty
goes to, and where the input of a command comes from. It's a mighty
tool that, together with pipelines, makes the shell powerful. The
redirection operators are checked whenever a [simple command is about to
be executed](../syntax/grammar/parser_exec.md).
@ -99,11 +99,11 @@ truncated** before writing starts.
>& TARGET
This special syntax redirects both `stdout` and `stderr` to the
specified target. It\'s **equivalent** to
specified target. It's **equivalent** to
> TARGET 2>&1
Since Bash4, there\'s `&>>TARGET`, which is equivalent to
Since Bash4, there's `&>>TARGET`, which is equivalent to
`>> TARGET 2>&1`.
\<wrap center important\>This syntax is deprecated and should not be
@ -146,7 +146,7 @@ omitted, filedescriptor 0 (`stdin`) is assumed.
A here-document is an input redirection using source data specified
directly at the command line (or in the script), no \"external\" source.
The redirection-operator `<<` is used together with a tag `TAG` that\'s
The redirection-operator `<<` is used together with a tag `TAG` that's
used to mark the end of input later:
# display help
@ -214,7 +214,7 @@ to hide it), this is **the wrong way**:
Why? Relatively easy:
- initially, `stdout` points to your terminal (you read it)
- same applies to `stderr`, it\'s connected to your terminal
- same applies to `stderr`, it's connected to your terminal
- `2>&1` redirects `stderr` away from the terminal to the target for
`stdout`: **the terminal** (again\...)
- `1>/dev/null` redirects `stdout` away from your terminal to the file
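A small sketch with a made-up `demo` function writing to both streams shows why the order matters:

``` bash
demo() { echo "to stdout"; echo "to stderr" >&2; }

demo 2>&1 1>/dev/null   # wrong order: "to stderr" still reaches the terminal
demo 1>/dev/null 2>&1   # right order: both streams end up in /dev/null
```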

View File

@ -19,17 +19,17 @@
`?` question mark Status of the most recently executed foreground-pipeline (exit/return code)
`-` dash Current option flags set by the shell itself, on invocation, or using the [set builtin command](../commands/builtin/set.md). It\'s just a set of characters, like `himB` for `h`, `i`, `m` and `B`.
`-` dash Current option flags set by the shell itself, on invocation, or using the [set builtin command](../commands/builtin/set.md). It's just a set of characters, like `himB` for `h`, `i`, `m` and `B`.
`$` dollar-sign The process ID (PID) of the shell. In an [explicit subshell](../syntax/ccmd/grouping_subshell.md) it expands to the PID of the current \"main shell\", not the subshell. This is different from `$BASHPID`!
`!` exclamation mark The process ID (PID) of the most recently executed background pipeline (like started with `command &`)
`0` zero The name of the shell or the shell script (filename). Set by the shell itself.\
If Bash is started with a filename to execute (script), it\'s set to this filename. If started with the `-c <CMDLINE>` option (commandline given as argument), then `$0` will be the first argument after the given `<CMDLINE>`. Otherwise, it is set to the string given on invocation for `argv[0]`.\
If Bash is started with a filename to execute (script), it's set to this filename. If started with the `-c <CMDLINE>` option (commandline given as argument), then `$0` will be the first argument after the given `<CMDLINE>`. Otherwise, it is set to the string given on invocation for `argv[0]`.\
Unlike popular belief, `$0` is *not a positional parameter*.
`_` underscore A kind of catch-all parameter. Directly after shell invocation, it\'s set to the filename used to invoke Bash, or the absolute or relative path to the script, just like `$0` would show it. Subsequently, expands to the last argument to the previous command. Placed into the environment when executing commands, and set to the full pathname of these commands. When checking mail, this parameter holds the name of the mail file currently being checked.
`_` underscore A kind of catch-all parameter. Directly after shell invocation, it's set to the filename used to invoke Bash, or the absolute or relative path to the script, just like `$0` would show it. Subsequently, expands to the last argument to the previous command. Placed into the environment when executing commands, and set to the full pathname of these commands. When checking mail, this parameter holds the name of the mail file currently being checked.
--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
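A few of these parameters in action (a sketch; exact PIDs vary between runs, so only `$?` has a fixed value here):

``` bash
true
echo "$?"     # → 0  (exit status of the last foreground command)
echo "$$"     # PID of this shell
sleep 0 &
echo "$!"     # PID of the background pipeline just started
wait
```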
## Shell Variables
@ -170,7 +170,7 @@ is the command executing at the time of the trap.
Type: normal variable Read-only: no
Set by Bash: no Default: n/a
The value is used to set the shell\'s compatibility level. The value may
The value is used to set the shell's compatibility level. The value may
be a decimal number (e.g., `4.2`) or an integer (e.g., `42`)
corresponding to the desired compatibility level. If `BASH_COMPAT` is
unset or set to the empty string, the compatibility level is set to the
@ -269,7 +269,7 @@ follows:
Expands to a string describing the version of this instance of Bash.
Since Bash 2.0 it includes the shell\'s \"release status\" (alpha\[N\],
Since Bash 2.0 it includes the shell's \"release status\" (alpha\[N\],
beta\[N\], release).
### CHILD_MAX
@ -618,7 +618,7 @@ contain only a single command).
Type: integer variable Read-only: yes
Set by Bash: yes Default: n/a
The process ID of the shell\'s parent process.
The process ID of the shell's parent process.
### PWD
@ -1309,7 +1309,7 @@ indicate multiple levels of indirection.
The full pathname to the shell is kept in this environment variable. If
it is not set when the shell starts, Bash assigns the full pathname of
the current user\'s login shell.
the current user's login shell.
### SRANDOM
@ -1386,7 +1386,7 @@ arrive.
Set by Bash: no Default: n/a
If set, Bash uses its value as the name of a directory in which Bash
creates temporary files for the shell\'s use.
creates temporary files for the shell's use.
### auto_resume
@ -1408,7 +1408,7 @@ The substring value provides functionality analogous to the %? job
identifier.
If set to any other value, the supplied string must be a prefix of a
stopped job\'s name; this provides functionality analogous to the
stopped job's name; this provides functionality analogous to the
`%string` job identifier.
### histchars

View File

@ -4,7 +4,7 @@
FIXME This article needs a review, it covers two topics (command line
splitting and word splitting) and mixes both a bit too much. But in
general, it\'s still usable to help understand this behaviour, it\'s
general, it's still usable to help understand this behaviour, it's
\"wrong but not wrong\".
One fundamental principle of Bash is to recognize words entered at the
@ -31,7 +31,7 @@ In other words, something you do (and Bash does) everyday. The
characters where Bash splits the command line (SPACE, TAB i.e. blanks)
are recognized as delimiters. There is no null argument generated when
you have 2 or more blanks in the command line. **A sequence of more
blank characters is treated as a single blank.** Here\'s an example:
blank characters is treated as a single blank.** Here's an example:
$ echo Hello little world
Hello little world
@ -46,7 +46,7 @@ ways to tell Bash not to treat them special: **Escaping** and
**quoting**.
Escaping a character means to **take away its special meaning**. Bash
will use an escaped character as text, even if it\'s a special one.
will use an escaped character as text, even if it's a special one.
Escaping is done by preceding the character with a backslash:
$ echo Hello\ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ little \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ world
@ -93,7 +93,7 @@ For a more technical description, please read the [article about word
splitting](../syntax/expansion/wordsplit.md)!
The first kind of splitting is done to parse the command line into
separate tokens. This is what was described above, it\'s a pure
separate tokens. This is what was described above, it's a pure
**command line parsing**.
After the command line has been split into words, Bash will perform
@ -122,7 +122,7 @@ word splitting:
## Example
Let\'s follow an unquoted command through these steps, assuming that the
Let's follow an unquoted command through these steps, assuming that the
variable is set:
MYFILE="THE FILE.TXT"
@ -148,7 +148,7 @@ splitting](../syntax/expansion/wordsplit.md) on the results:
Word 1 Word 2 Word 3 Word 4 Word 5 Word 6 Word 7
`echo` `The` `file` `is` `named` `THE` `FILE.TXT`
Now let\'s imagine we quoted `$MYFILE`, the command line now looks like:
Now let's imagine we quoted `$MYFILE`, the command line now looks like:
echo The file is named "$MYFILE"
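Counting the resulting words directly shows the difference (same `MYFILE` as above):

``` bash
MYFILE="THE FILE.TXT"
set -- "$MYFILE"; echo $#   # → 1  (quoted: the value stays one word)
set -- $MYFILE;  echo $#    # → 2  (unquoted: split at the space)
```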