Merge pull request #16 from coronary/format/escapes

formatting(docs): remove majority of escape character sequences for legibility
This commit is contained in:
Florian Köhler 2024-04-02 19:06:36 +02:00 committed by GitHub
commit 93c513af25
No known key found for this signature in database
GPG Key ID: B5690EEEBB952194
98 changed files with 874 additions and 874 deletions

View File

@ -77,7 +77,7 @@ The `read` builtin command has some interesting new features.
The `-t` option to specify a timeout value has been slightly tuned. It
now accepts fractional values and the special value 0 (zero). When
`-t 0` is specified, `read` immediately returns with an exit status
indicating if there's data waiting or not. However, when a timeout is
given, and the `read` builtin times out, any partial data received up to
the timeout is stored in the given variable, rather than lost. When a
timeout is hit, `read` exits with a code greater than 128.
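The behavior above can be sketched like this (Bash 4+ for fractional timeouts; variable names are illustrative):

```bash
# -t 0: return immediately; the exit status reports whether data is waiting.
if read -t 0; then
    echo "input is waiting"
fi

# A real timeout: on expiry the exit status is > 128, and any partial
# input read so far is kept in the variable rather than discarded.
if read -r -t 2.5 line; then
    echo "got: $line"
elif (( $? > 128 )); then
    echo "timed out, partial data (if any): $line"
fi
```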
@ -90,7 +90,7 @@ See [read](commands/builtin/read.md)
### Changes to the "help" builtin
The builtin itself didn't change much, but the data displayed is more
structured now. The help texts are in a better format, much easier to
read.

View File

@ -59,8 +59,8 @@ f3
## Notes
- `caller` produces no output unless used within a script that's run
from a real file. It isn't particularly useful for interactive use,
but can be used to create a decent `die` function to track down
errors in moderately complex scripts.
`{ bash /dev/stdin; } <<<$'f(){ g; }\ng(){ h; }\nh(){ while caller $((n++)); do :; done; }\nf'`
@ -68,7 +68,7 @@ f3
are available and a number of special parameters that give more
detail than caller (e.g. BASH_ARG{C,V}). Tools such as
[Bashdb](http://bashdb.sourceforge.net/) can assist in using some of
Bash's more advanced debug features.
- The Bash manpage and help text specifies that the argument to
`caller` is an \"expr\" (whatever that means). Only an integer is
actually allowed, with no special interpretation of an
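The "decent `die` function" mentioned in the notes can be sketched as follows (the function name and message format are illustrative, not from the manual):

```bash
# Hedged sketch of a 'die' helper built on caller.
die() {
    local frame=0
    printf 'FATAL: %s\n' "$*" >&2
    # Walk up the call stack; each caller line is "line [function file]".
    # (caller only produces output when the script runs from a real file.)
    while caller "$frame"; do
        ((frame++))
    done >&2
    exit 1
}
```

Typical usage would be something like `[[ -r $file ]] || die "cannot read $file"`.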

View File

@ -13,7 +13,7 @@ The `cd` builtin command is used to change the current working directory
- to the given directory (`cd DIRECTORY`)
- to the previous working directory (`cd -`) as saved in the
[OLDPWD](../../syntax/shellvars.md#OLDPWD) shell variable
- to the user's home directory as specified in the
[HOME](../../syntax/shellvars.md#HOME) environment variable (when used
without a `DIRECTORY` argument)
@ -30,7 +30,7 @@ is given or the shell is configured to do so (see the `-P` option of
-------- ----------------------------------------------------
`-L` Follow symbolic links (default)
`-P` Do not follow symbolic links
`-@` Browse a file's extended attributes, if supported
### Exit status
@ -41,7 +41,7 @@ is given or the shell is configured to do so (see the `-P` option of
## Examples
### Change the working directory to the user's home directory
cd

View File

@ -22,7 +22,7 @@ variable.
When used in a function, `declare` makes `NAMEs` local variables, unless
used with the `-g` option.
Don't use its synonym `typeset` when coding for Bash, since it's
tagged as obsolete.
### Options
@ -52,7 +52,7 @@ Below, `[-+]X` indicates an attribute, use `-X` to set the attribute,
`[-+]n` make NAME a reference to the variable named by its value. Introduced in Bash 4.3-alpha.\
`${!NAME}` reveals the reference variable name, VALUE.\
Use `unset -n NAME` to unset the variable. (`unset -v NAME` unsets the VALUE variable.)\
Use `[[ -R NAME ]]` to test if NAME has been set to a VALUE, another variable's name.
`-p` display the attributes and value of each NAME
@ -138,7 +138,7 @@ RHS.
sum total a b c
printf 'Final value of "total" is: %d\n' "$total"
<div hide> function sum {
typeset -n _result=$1
shift
@ -153,12 +153,12 @@ RHS.
}
a=(1 2 3); b=(6 5 4); c=(2 4 6)
sum total a b c
printf 'Final value of "total" is: %d\n' "$total" </div>
`typeset -n` is currently implemented in ksh93, mksh, and Bash 4.3. Bash
and mksh's implementations are quite similar, but much different from
ksh93's. See [Portability considerations](#portability_considerations)
for details. ksh93 namerefs are much more powerful than Bash's.
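The nameref behavior described above can be sketched as follows (Bash 4.3+; the variable names `target` and `ref` are illustrative):

```bash
# Nameref sketch (Bash 4.3+). 'target' and 'ref' are illustrative names.
target="initial"
declare -n ref=target     # ref now refers to the variable named 'target'

echo "$ref"               # reads through the reference -> initial
echo "${!ref}"            # reveals the referenced name  -> target

ref="changed"             # assigns through the reference
echo "$target"            # -> changed

[[ -R ref ]] && echo "ref is a nameref"
unset -n ref              # removes the reference; 'target' keeps its value
```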
## Portability considerations
@ -167,7 +167,7 @@ for details. ksh93 namerefs are much more powerful than Bash\'s.
possible exception of Zsh in Bash compatibility mode. Bash marks the
synonym `typeset` as obsolete, which in Bash behaves identically to
`declare`. All other Korn-like shells use `typeset`, so it probably
isn't going away any time soon. Unfortunately, being a non-standard
builtin, `typeset` differs significantly between shells. ksh93 also
considers `typeset` a special builtin, while Bash does not - even in
POSIX mode. If you use `typeset`, you should attempt to only use it

View File

@ -6,11 +6,11 @@
## Description
`echo` outputs its args to stdout, separated by spaces, followed by a
newline. The return status is always `0`. If the
[shopt](../../commands/builtin/shopt.md) option `xpg_echo` is set, Bash
dynamically determines whether echo should expand escape characters
(listed below) by default based on the current platform. `echo` doesn't
interpret `--` as the end of options, and will simply print this string
if given.
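A quick illustration of the `--` pitfall, and why a fixed-format `printf` is the safer choice for arbitrary data:

```bash
# echo does not treat '--' as an end-of-options marker; it just prints it:
echo -- hello          # -> -- hello

# But an argument that *looks* like an option is swallowed:
var='-n'
echo "$var"            # prints nothing visible: '-n' is parsed as an option
printf '%s\n' "$var"   # -> -n   (printf with a fixed format string is safe)
```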

View File

@ -33,7 +33,7 @@ the `eval` command below it.
Frequently, `eval` is used to cause side-effects by performing a pass of
expansion on the code before executing the resulting string. This allows
for things that otherwise wouldn't be possible with ordinary Bash
syntax. This also, of course, makes `eval` the most powerful command in
all of shell programming (and in most other languages for that matter).
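For instance, the extra expansion pass allows simple indirect assignment (a sketch; the names `name` and `greeting` are ours, not from the manual):

```bash
# The string is expanded once, *then* executed, so the variable name
# itself can come from data:
name=greeting
eval "$name=hello"     # after the expansion pass this runs: greeting=hello
echo "$greeting"       # -> hello
```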
@ -106,10 +106,10 @@ controlled carefully by the caller is a good way to use it.
of eval will leak out into the surrounding environment. It is
possible to work around this limitation by prefixing special
builtins with the `command` regular builtin, but current versions of
~~ksh93~~ and zsh don't do this properly
([fixed](http://article.gmane.org/gmane.comp.programming.tools.ast.devel/686)
in ksh 93v- 2012-10-24 alpha). Earlier versions of zsh work (with
`setopt POSIX_BUILTINS` -- looks like a regression). This works
correctly in Bash POSIX mode, Dash, and mksh.
```{=html}
@ -129,12 +129,12 @@ controlled carefully by the caller is a good way to use it.
$ ( x=a; eval "$x"'=( a b\ c d )'; printf '<%s> ' "${a[@]}"; echo ) # Proper quoting then gives us the expected results.
<a> <b c> <d>
We don't know why Bash does this. Since parentheses are metacharacters,
they must ordinarily be quoted or escaped when used as arguments. The
first example above is the same error as the second in all non-Bash
shells, even those with compound assignment.
In the case of `eval` it isn't recommended to use this behavior,
because unlike e.g. [declare](../../commands/builtin/declare.md), the initial
expansion is still subject to all expansions including
[word-splitting](../../syntax/expansion/wordsplit.md) and [pathname
@ -161,14 +161,14 @@ identical to those of [let](../../commands/builtin/let.md).
## See also
- [BashFAQ 48 - eval and security
issues](http://mywiki.wooledge.org/BashFAQ/048) -- **IMPORTANT**
- [Another eval
article](http://fvue.nl/wiki/Bash:_Why_use_eval_with_variable_expansion%3F)
- [Indirection via
eval](http://mywiki.wooledge.org/BashFAQ/006#Assigning_indirect.2BAC8-reference_variables)
- [More indirection via
eval](http://fvue.nl/wiki/Bash:_Passing_variables_by_reference)
- [Martin Väth's "push"](https://github.com/vaeth/push) --
`printf %q` work-alike for POSIX.
- [The \"magic alias\"
hack](http://www.chiark.greenend.org.uk/~sgtatham/aliases.html)

View File

@ -22,7 +22,7 @@ There are no options.
### Exit status
Naturally, you can't ask for the exit status from within the shell that
executed the `exit` command, because the shell exits.
Status Reason

View File

@ -24,7 +24,7 @@ difference being `let` is a builtin (simple command), and `((` is a
compound command. The arguments to `let` are therefore subject to all
the same expansions and substitutions as any other simple command -
requiring proper quoting and escaping - whereas the contents of `((`
aren't subject to [word-splitting](../../syntax/expansion/wordsplit.md) or
[pathname expansion](../../syntax/expansion/globs.md) (almost never desirable
for arithmetic). For this reason, **the [arithmetic compound
command](../../syntax/ccmd/arithmetic_eval.md) should generally be preferred
@ -41,12 +41,12 @@ command](../../syntax/ccmd/arithmetic_eval.md):
$ echo "$a - $b - $?"
4 - 2 - 0
<WRAP info> Remember that inside arithmetic evaluation contexts, all
other expansions are processed as usual (from left-to-right), and the
resulting text is evaluated as an arithmetic expression. Arithmetic
already has a way to control precedence using parentheses, so it's very
rare to need to nest arithmetic expansions within one another. It's
used above only to illustrate how this precedence works. </WRAP>
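The quoting difference between the two forms can be sketched like this (values are illustrative):

```bash
# let is a simple command: its words undergo expansion and word splitting,
# so quote the expression to keep it one word:
a=5 b=3
let 'a = a + b'     # unquoted, the spaces would split this into many words
echo "$a"           # -> 8

# (( ... )) is a compound command: no word splitting inside, no quoting needed
(( a = a + b ))
echo "$a"           # -> 11
```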
Unlike `((`, being a simple command `let` has its own environment. In
Bash, built-ins that can set variables process any arithmetic under
@ -80,17 +80,17 @@ needed.
- It seems to be a common misunderstanding that `let` has some legacy
purpose. Both `let` and [[^1]](../../syntax/ccmd/arithmetic_eval.md) were
ksh88 features and almost identical in terms of portability as
everything that inherited one also tended to get the other. Don't
choose `let` over `((` expecting it to work in more places.
- [expr(1)](http://pubs.opengroup.org/onlinepubs/9699919799/utilities/expr.html#tag_20_42)
is a command one is likely to come across sooner or later. While it
is more \"standard\" than `let`, the above should always be
preferred. Both [arithmetic expansion](../../syntax/arith_expr.md)s and the
`[` test operator are specified by POSIX(r) and satisfy almost all
of expr's use-cases. Unlike `let`, `expr` cannot assign directly to
bash variables but instead returns a result on stdout. `expr` takes
each operator it recognizes as a separate word and then concatenates
them into a single expression that's evaluated according to its
own rules (which differ from shell arithmetic). `let` parses each
word it receives on its own and evaluates it as an expression
without generating any output other than a return code.
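The contrast described above, sketched side by side:

```bash
# expr parses each operator as a separate word and prints the result:
sum=$(expr 2 + 3)      # note: '+' must be its own argument
echo "$sum"            # -> 5

# let / arithmetic expansion assign directly and produce no output:
let 'sum = 2 + 3'
echo "$sum"            # -> 5
echo "$(( 2 + 3 ))"    # -> 5
```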

View File

@ -26,7 +26,7 @@ way, and takes all the same options, with 3 exceptions:
## Portability considerations
- `local` is not specified by POSIX. Most bourne-like shells don't
have a builtin called `local`, but some such as `dash` and the
busybox shell do.
@ -41,15 +41,15 @@ way, and takes all the same options, with 3 exceptions:
```{=html}
<!-- -->
```
- In ksh93, using POSIX-style function definitions, `typeset` doesn't
set `local` variables, but rather acts upon variables of the
next-outermost scope (e.g. setting attributes). Using `typeset`
within functions defined using ksh `function name {` syntax,
variables follow roughly
[lexical-scoping](http://community.schemewiki.org/?lexical-scope),
except that functions themselves don't have scope, just like Bash.
This means that even functions defined within a "function's
scope" don't have access to non-local variables except through
`namerefs`.
## See also

View File

@ -29,13 +29,13 @@ given array `ARRAY` is set readonly.
`-t` Remove any trailing newline from a line read, before it is assigned to an array element.
`-u FD` Read from filedescriptor `FD` rather than standard input.
While `mapfile` isn't a common or portable shell feature, its
functionality will be familiar to many programmers. Almost all
programming languages (aside from shells) with support for compound
datatypes like arrays, and which handle open file objects in the
traditional way, have some analogous shortcut for easily reading all
lines of some input as a standard feature. In Bash, `mapfile` in itself
can't do anything that couldn't already be done using read and a loop,
and if portability is even a slight concern, should never be used.
However, it does *significantly* outperform a read loop, and can make
for shorter and cleaner code - especially convenient for interactive
@ -43,13 +43,13 @@ use.
## Examples
Here's a real-world example of interactive use borrowed from Gentoo
workflow. Xorg updates require rebuilding drivers, and the
Gentoo-suggested command is less than ideal, so let's Bashify it. The
first command produces a list of packages, one per line. We can read
those into the array named \"args\" using `mapfile`, stripping trailing
newlines with the \'-t\' option. The resulting array is then expanded
into the arguments of the \"emerge\" command - an interface to Gentoo\'s
into the arguments of the \"emerge\" command - an interface to Gentoo's
package manager. This type of usage can make for a safe and effective
replacement for xargs(1) in certain situations. Unlike xargs, all
arguments are guaranteed to be passed to a single invocation of the
@ -59,13 +59,13 @@ business.
# eix --only-names -IC x11-drivers | { mapfile -t args; emerge -av1 "${args[@]}" <&1; }
Note the use of command grouping to keep the emerge command inside the
pipe's subshell and within the scope of "args". Also note the unusual
redirection. This is because the -a flag makes emerge interactive,
asking the user for confirmation before continuing, and checking with
isatty(3) to abort if stdin isn't pointed at a terminal. Since stdin of
the entire command group is still coming from the pipe even though
mapfile has read all available input, we just borrow FD 1 as it just so
happens to be pointing where we want it. More on this over at greycat's
wiki: <http://mywiki.wooledge.org/BashFAQ/024>
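A minimal non-interactive sketch of the same pattern (input is synthesized here so it runs anywhere):

```bash
# Read all input lines into an array with one builtin call (Bash 4+):
mapfile -t lines < <(printf 'one\ntwo\nthree\n')

echo "${#lines[@]}"                  # -> 3
printf '<%s> ' "${lines[@]}"; echo   # -> <one> <two> <three>
```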
### The callback
@ -121,7 +121,7 @@ illustrates the callback behavior:
Since redirects are syntactically allowed anywhere in a command, we put
it before the printf to stay out of the way of additional arguments.
Rather than opening \"outfile\<n\>\" for appending on each call by
Rather than opening \"outfile<n>\" for appending on each call by
calculating the filename, open an FD for each first and calculate which
FD to send output to by measuring the size of x mod 2. The zero-width
format specification is used to absorb the index number argument.
@ -209,13 +209,13 @@ each subsequent 2 iterations. The RETURN trap is unimportant.
## To Do
- Create an implementation as a shell function that's portable
between Ksh, Zsh, and Bash (and possibly other bourne-like shells
with array support).
## See also
- [arrays](../../syntax/arrays.md)
- [read](../../commands/builtin/read.md) - If you don't know about this yet,
why are you reading this page?
- <http://mywiki.wooledge.org/BashFAQ/001> - It's FAQ 1 for a reason.

View File

@ -1,18 +1,18 @@
# The printf command
<div center round todo box 70%> FIXME Stranger, this is a very big
topic that needs experience - please fill in missing information, extend
the descriptions, and correct the details if you can! </div> <div
center round tip 70%> [**Attention:**]{.underline} This is about the
Bash-builtin command `printf` - however, the description should be
nearly identical for an external command that follows POSIX(r).
[GNU Awk](http://www.gnu.org/software/gawk/manual/gawk.html#Printf)
expects a comma after the format string and between each of the
arguments of a **printf** command. For examples, see: [code
snippet](../../printf?&.md#using_printf_inside_of_awk). </div>
Unlike other documentation, I don't want to redirect you to the manual
page for the `printf()` C function family. However, if you're more
experienced, that should be the most detailed description for the format
strings and modifiers.
@ -23,7 +23,7 @@ POSIX(r) recommends that `printf` is preferred over `echo`.
## General
The `printf` command provides a method to print preformatted text
similar to the `printf()` system interface (C function). It's meant as
a successor to `echo` and has far more features and possibilities.
Beside other reasons, POSIX(r) has a very good argument to recommend it:
@ -54,10 +54,10 @@ argument!).
`-v VAR` If given, the output is assigned to the variable `VAR` instead of printed to `stdout` (comparable to `sprintf()` in some way)
---------- -------------------------------------------------------------------------------------------------------------------------------
The `-v` option can't assign directly to array indexes in Bash versions
older than Bash 4.1.
<note warning> In versions newer than 4.1, one must be careful when
performing expansions into the first non-option argument of printf as
this opens up the possibility of an easy code injection vulnerability.
@ -66,11 +66,11 @@ this opens up the possibility of an easy code injection vulnerability.
declare -a x='([0]="hi")'
...where the echo can of course be replaced with any arbitrary command.
If you must, either specify a hard-coded format string or use -- to
signal the end of options. The exact same issue also applies to
[read](../../commands/builtin/read.md), and a similar one to
[mapfile](../../commands/builtin/mapfile.md), though performing expansions into
their arguments is less common. </note>
### Arguments
@ -84,8 +84,8 @@ recognized to give a number-argument to `printf`:
`0N` An octal number
`0xN` A hexadecimal number
`0XN` A hexadecimal number
`"X` (a literal double-quote infront of a character): interpreted as number (underlying codeset) **don\'t forget escaping**
`'X` (a literal single-quote infront of a character): interpreted as number (underlying codeset) **don\'t forget escaping**
`"X` (a literal double-quote infront of a character): interpreted as number (underlying codeset) **don't forget escaping**
`'X` (a literal single-quote infront of a character): interpreted as number (underlying codeset) **don't forget escaping**
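The number-argument forms from the table above, in action:

```bash
# Number arguments as printf understands them:
printf '%d\n' 0x1F     # hexadecimal -> 31
printf '%d\n' 017      # octal       -> 15
printf '%d\n' \'A      # leading quote: codepoint of 'A' -> 65
printf '%d\n' '"A'     # same with a double quote (quoted for the shell) -> 65
```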
[**If more arguments than format specifiers**]{.underline} are present,
then the format string is re-used until the last argument is
@ -97,7 +97,7 @@ Take care to avoid [word splitting](../../syntax/expansion/wordsplit.md), as
accidentally passing the wrong number of arguments can produce wildly
different and unexpected results. See [this article](../../syntax/words.md).
<note warning> [**Again, attention:**]{.underline} When a numerical
format expects a number, the internal `printf`-command will use the
common Bash arithmetic rules regarding the base. A command like the
following example **will** throw an error, since `08` is not a valid
@ -105,7 +105,7 @@ octal number (`00` to `07`!):
printf '%d\n' 08
</note>
### Format strings
@ -143,7 +143,7 @@ all mean the same: A placeholder for data with a specified format:
`%G` Same as `%g`, but print it like `%E`
`%c` Interprets the associated argument as **char**: only the first character of a given argument is printed
`%s` Interprets the associated argument literally as string
`%n` Assigns the number of characters printed so far to the variable named in the corresponding argument. Can't specify an array index. If the given name is already an array, the value is assigned to the zeroth element.
`%a` Interprets the associated argument as **double**, and prints it in the form of a C99 [hexadecimal floating-point literal](http://www.exploringbinary.com/hexadecimal-floating-point-constants/).
`%A` Same as `%a`, but print it like `%E`
`%(FORMAT)T` output the date-time string resulting from using `FORMAT` as a format string for `strftime(3)`. The associated argument is the number of seconds since Epoch, or `-1` (current time) or `-2` (shell startup time). If no corresponding argument is supplied, the current time is used as default
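A short `%(FORMAT)T` sketch (Bash 4.2+; `TZ` is pinned here only so the sample output is reproducible):

```bash
# %(FORMAT)T hands the seconds-since-Epoch argument to strftime(3):
TZ=UTC printf '%(%Y-%m-%d)T\n' 0    # -> 1970-01-01
printf 'now: %(%H:%M:%S)T\n' -1     # -1 selects the current time
```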
@ -164,7 +164,7 @@ introductory `%` and the character that specifies the format:
Field output format
--------------------- --------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
`<N>` **Any number**: Specifies a **minimum field width**, if the text to print is shorter, it's padded with spaces, if the text is longer, the field is expanded
`.` **The dot**: Together with a field width, the field is **not** expanded when the text is longer, the text is truncated instead. "`%.s`" is an undocumented equivalent for "`%.0s`", which will force a field width of zero, effectively hiding the field from output
`*` **The asterisk**: the width is given as argument before the string or number. Usage (the "`*`" corresponds to the "`20`"): `printf "%*s\n" 20 "test string"`
`#` "Alternative format" for numbers: see table below
@ -178,8 +178,8 @@ introductory `%` and the character that specifies the format:
Alternative Format
-------------------------------------------------- --------------------------------------------------------------------------------------------------------------------------------------------------------------
`%#o` The octal number is printed with a leading zero, unless it's zero itself
`%#x`, `%#X` The hex number is printed with a leading "`0x`"/"`0X`", unless it's zero
`%#g`, `%#G` The float number is printed with **trailing zeros** until the number of digits for the current precision is reached (usually trailing zeros are not printed)
all number formats except `%d`, `%o`, `%x`, `%X` Always print a decimal point in the output, even if no digits follow it
@ -192,7 +192,7 @@ that precedes the number to print, like (prints 4,3000000000):
printf "%.*f\n" 10 4,3
The format `.*N` to specify the N'th argument for precision does not
work in Bash.
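The working `*` precision form, sketched (`LC_ALL=C` pins the radix character to `.` so the float parses the same everywhere):

```bash
# '*' pulls the precision from the argument preceding the value:
LC_ALL=C printf '%.*f\n' 2 3.14159   # -> 3.14

# For %s, precision caps the number of characters printed:
printf '%.3s\n' abcdef               # -> abc
```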
For strings, the precision specifies the maximum number of characters to
@ -216,8 +216,8 @@ argument corresponding to a `%b` format.
`\v` Prints a vertical tabulator
`\"` Prints a `'`
`\?` Prints a `?`
`\<NNN>` Interprets `<NNN>` as **octal** number and prints the corresponding character from the character set
`\0<NNN>` same as `\<NNN>`
`\x<NNN>` Interprets `<NNN>` as **hexadecimal** number and prints the corresponding character from the character set (**3 digits**)
`\u<NNNN>` same as `\x<NNN>`, but **4 digits**
`\U<NNNNNNNN>` same as `\x<NNN>`, but **8 digits**
@ -364,7 +364,7 @@ correct awk syntax.
With appropriate metacharacter escaping the bash printf can be called
from inside awk (as from perl and other languages that support shell
callout) as long as you don't care about program efficiency or
readability.
echo "Foo" | awk '{ system( "printf \"%s\\n \" \"" $1 "\"" ) }'
@ -382,7 +382,7 @@ readability.
use `%c`, you're actually asking for the first byte of the
argument. Likewise, the maximum field width modifier (dot) in
combination with `%s` goes by bytes, not characters. This limits
some of printf's functionality to working with ascii only. ksh93's
`printf` supports the `L` modifier with `%s` and `%c` (but so far
not `%S` or `%C`) in order to treat precision as character width,
not byte count. zsh appears to adjust itself dynamically based upon
@ -409,16 +409,16 @@ fmt++;
- mksh has no built-in printf by default (usually). There is an
unsupported compile-time option to include a very poor, basically
unusable implementation. For the most part you must rely upon the
system's `/usr/bin/printf` or equivalent. The mksh maintainer
recommends using `print`. The development version (post- R40f) adds
a new parameter expansion in the form of `${name@Q}` which fills the
role of `printf %q` -- expanding in a shell-escaped format.
```{=html}
<!-- -->
```
- ksh93 optimizes builtins run from within a command substitution and
which have no redirections to run in the shell's process. Therefore
the `printf -v` functionality can be closely matched by
`var=$(printf ...)` without a big performance hit.
@ -447,13 +447,13 @@ fmt++;
- The optional Bash loadable `print` may be useful for ksh
compatibility and to overcome some of
[echo](../../commands/builtin/echo.md)'s portability pitfalls. Bash, ksh93,
and zsh's `print` have an `-f` option which takes a `printf` format
string and applies it to the remaining arguments. Bash lists the
synopsis as:
`print: print [-Rnprs] [-u unit] [-f format] [arguments]`. However,
only `-Rrnfu` are actually functional. Internally, `-p` is a noop
(it doesn't tie in with Bash coprocs at all), and `-s` only sets a
flag but has no effect. `-Cev` are unimplemented.
```{=html}
@ -472,5 +472,5 @@ fmt++;
function](http://pubs.opengroup.org/onlinepubs/9699919799/functions/printf.html)
- [Code snip: Print a horizontal
line](../../snipplets/print_horizontal_line.md) uses some `printf` examples
- [Greg's BashFAQ 18: How can I use numbers with leading zeros in a
loop, e.g., 01, 02?](BashFAQ>018)

View File

@ -20,7 +20,7 @@ If `<NAME...>` is given, the line is word-split using
`<NAME>`. The remaining words are all assigned to the last `<NAME>` if
more words than variable names are present.
<WRAP center round info 90%> If no `<NAME>` is given, the whole line
read (without performing word-splitting!) is assigned to the shell
variable [REPLY](../../syntax/shellvars.md#REPLY). Then, `REPLY` really contains
the line as it was read, without stripping pre- and postfix spaces and
@ -30,7 +30,7 @@ other things!
printf '"%s"\n' "$REPLY"
done <<<" a line with prefix and postfix space "
</WRAP>
If a timeout is given, or if the shell variable
[TMOUT](../../syntax/shellvars.md#TMOUT) is set, it is counted from initially
@ -43,25 +43,25 @@ line is read). That means the timeout can occur during input, too.
---------------- -------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
`-a <ARRAY>` read the data word-wise into the specified array `<ARRAY>` instead of normal variables
`-d <DELIM>` recognize `<DELIM>` as data-end, rather than `<newline>`
`-e` on interactive shells: use Bash's readline interface to read the data. Since version 5.1-alpha, this can also be used on specified file descriptors using `-u`
`-i <STRING>` preloads the input buffer with text from `<STRING>`, only works when Readline (`-e`) is used
`-n <NCHARS>` reads `<NCHARS>` characters of input, then quits
`-N <NCHARS>` reads `<NCHARS>` characters of input, *ignoring any delimiter*, then quits
`-p <PROMPT>` the prompt string `<PROMPT>` is output (without a trailing automatic newline) before the read is performed
`-r` raw input - **disables** interpretation of **backslash escapes** and **line-continuation** in the read data
`-s` secure input - don't echo input if on a terminal (passwords!)
`-t <TIMEOUT>` wait for data `<TIMEOUT>` seconds, then quit (exit code 1). Fractional seconds ("5.33") are allowed since Bash 4. A value of 0 immediately returns and indicates if data is waiting in the exit code. Timeout is indicated by an exit code greater than 128. If timeout arrives before data is read completely (before end-of-line), the partial data is saved.
`-u <FD>` use the filedescriptor number `<FD>` rather than `stdin` (0)
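A few of these options combined, as a minimal sketch (the prompt text and the variable name `answer` are illustrative, not from the page):

```bash
#!/usr/bin/env bash
# -r: raw input, -t: 2-second timeout, -p: prompt without trailing newline
if read -r -t 2 -p "Your name: " answer; then
    echo "Hello, $answer"
else
    rc=$?
    # an exit code greater than 128 indicates the timeout fired
    if [ "$rc" -gt 128 ]; then
        echo "timed out" >&2
    fi
fi
```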
When both `-a <ARRAY>` and a variable name `<NAME>` are given, the
array is set, but not the variable.
Of course it's valid to set individual array elements without using
`-a`:
read MYARRAY[5]
<WRAP center round important 90%>
Reading into array elements using the syntax above **may cause [pathname
expansion](../../syntax/expansion/globs.md) to occur**.
@ -81,7 +81,7 @@ array name and index:
read 'x[1]'
</WRAP>
### Return status
@ -90,7 +90,7 @@ array name and index:
0 no error
0 error when assigning to a read-only variable [^1]
2 invalid option
>128 timeout (see `-t`)
!=0 invalid filedescriptor supplied to `-u`
!=0 end-of-file reached
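A hedged sketch of checking these statuses (the herestring input is illustrative):

```bash
#!/usr/bin/env bash
# poll with -t 0: the exit status alone tells whether data is waiting
if read -r -t 0 <<< "queued line"; then
    echo "data is waiting"
fi
# after `read -r -t <TIMEOUT> var`, a status above 128 means the timeout fired
```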
@ -99,7 +99,7 @@ array name and index:
Essentially all you need to know about `-r` is to **ALWAYS** use it. The
exact behavior you get without `-r` is completely useless even for weird
purposes. It basically allows the escaping of input which matches
something in IFS, and also escapes line continuations. It's explained
pretty well in the [POSIX
read](http://pubs.opengroup.org/onlinepubs/9699919799/utilities/read.html#tag_20_109)
spec.
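A quick illustration of the difference (a sketch; the input string is made up):

```bash
#!/usr/bin/env bash
# the input data is a literal backslash followed by "t": a\tb
printf 'a\\tb\n' | { read    x; echo "without -r: $x"; }  # backslash is eaten
printf 'a\\tb\n' | { read -r x; echo "with -r: $x"; }     # data kept as-is
```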
@ -141,7 +141,7 @@ some backslash-escapes or switches (like `-n`).
### Press any key...
Remember the MSDOS `pause` command? Here's something similar:
pause() {
local dummy
@ -254,9 +254,9 @@ date/time string are recognized correctly.
- POSIX(r) only specified the `-r` option (raw read); `-r` is not only
POSIX, you can find it in earlier Bourne source code
- POSIX(r) doesn't support arrays
- `REPLY` is not POSIX(r), you need to set `IFS` to the empty string
to get the whole line for shells that don't know `REPLY`.
`while IFS= read -r line; do
...
done < text.txt
View File
@ -7,7 +7,7 @@
## Description
The `readonly` builtin command is used to mark variables or functions as
read-only, which means unchangeable. This implies that it can't be
unset anymore. A `readonly` variable may not be redefined in child
scopes. A readonly global may not be redefined as a function local
variable. Simple command environment assignments may not reference
View File
@ -18,7 +18,7 @@ There are no options.
### Exit status
If everything is okay, the `return` command doesn't come back. If it
comes back, there was a problem in doing the return.
Status Reason
View File
@ -27,7 +27,7 @@ set flags (true for most commands on UNIX(r)).
Flag Optionname Description
------ ---------------------------------------------------------- ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
`-a` `allexport` Automatically mark new and altered variables to be exported to subsequent environments.
`-b` `notify` Don't wait for the next prompt to print when showing the reports for a terminated background job (only with job control)
`-e` `errexit` When set, the shell exits when a simple command in a command list exits non-zero (`FALSE`). This is not done in situations where the exit code is already checked (`if`, `while`, `until`, `||`, `&&`)
`-f` `noglob` Disable [pathname expansion](../../syntax/expansion/globs.md) (globbing)
`-h` `hashall` Remembers the location of commands when they're called (hashing). Enabled by default.
@ -41,10 +41,10 @@ set flags (true for most commands on UNIX(r)).
`-v` `verbose` Print shell input lines as they are read - useful for debugging.
`-x` `xtrace` Print commands just before execution - with all expansions and substitutions done, and words marked - useful for debugging.
`-B` `braceexpand` The shell performs [brace expansion](../../syntax/expansion/brace.md). This is on by default.
`-C` <BOOKMARK:tag_noclobber>`noclobber` Don't overwrite files on redirection operations. You can override that by specifying the `>|` redirection operator when needed. See [redirection](../../syntax/redirection.md)
`-E` `errtrace` `ERR`-traps are inherited by shell functions, command substitutions, and commands executed in a subshell environment.
`-H` `histexpand` Enable `!`-style history expansion. Defaults to `on` for interactive shells.
`-P` `physical` Don't follow symlinks when changing directories - use the physical filesystem structure.
`-T` `functrace` `DEBUG`- and `RETURN`-traps are inherited by subsequent environments, like `-E` for `ERR` trap.
`-` \"End of options\" - all following arguments are assigned to the positional parameters, even when they begin with a dash. `-x` and `-v` options are turned off. Positional parameters are unchanged (unlike using `--`!) when no further arguments are given.
`--` If no arguments follow, the positional parameters are unset. With arguments, the positional parameters are set, even if the strings begin with a `-` (dash) like an option.
View File
@ -63,26 +63,26 @@ There are no options.
dash: 1: shift: can't shift that many
` In most shells, you can work around this problem using the
[command](../../commands/builtin/command.md) builtin to suppress fatal
errors caused by *special builtins*. <code> $ dash -c 'f() { if
command shift 2>/dev/null; then echo "$1"; else echo "no
args"; fi; }; f'
no args </code> While POSIX requires this behavior, it isn't very
obvious and some shells don't do it correctly. To work around this, you
can use something like:
<code> $ mksh -c 'f() { if ! ${1+false} && shift; then echo
"$1"; else echo "no args"; fi; }; f' no args </code> ~~The mksh
maintainer refuses to change either the `shift` or `command` builtins.~~
[Fixed](https://github.com/MirBSD/mksh/commit/996e05548ab82f7ef2dea61f109cc7b6d13837fa).
(Thanks!)
- Perhaps almost as bad as the above, busybox sh's `shift` always
returns success, even when attempting to shift beyond the final
argument. <code> $ bb -c 'f() { if shift; then echo "$1";
else echo "no args"; fi; }; f'
(no output) </code> The above mksh workaround will work in this case
too.
## See also
View File
@ -33,7 +33,7 @@ When listing options, the exit code is `TRUE` (0), if all options are
enabled, `FALSE` otherwise.
When setting/unsetting an option, the exit code is `TRUE` unless the
named option doesn't exist.
## Examples
View File
@ -34,7 +34,7 @@ Special events
`EXIT` 0 executed on shell exit
`DEBUG` executed before every simple command
`RETURN` executed when a shell function or a sourced code finishes executing
`ERR` executed each time a command's failure would cause the shell to exit when the [`-e` option (`errexit`)](../../commands/builtin/set.md) is enabled
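As a sketch of the `EXIT` event in use (the scratch-file pattern is illustrative, not from the page):

```bash
#!/usr/bin/env bash
# remove the scratch file no matter how the script terminates
scratch=$(mktemp)
trap 'rm -f "$scratch"' EXIT
echo "intermediate data" > "$scratch"
# ... work with "$scratch"; the trap fires on normal exit and on fatal signals
```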
### Options
View File
@ -51,10 +51,10 @@ variable has been set, then the variable of the same name in the
next-outermost scope becomes visible to its scope and all children - as
if the variable that was unset was never set to begin with. This
property allows looking upwards through the stack as variable names are
unset, so long as unset and the local it unsets aren't together in the
same scope level.
Here's a demonstration of this behavior.
#!/usr/bin/env bash
@ -121,8 +121,8 @@ output:
Some things to observe:
- `unset2` is only really needed once. We remain 5 levels deep in
`f`'s for the remaining `unset` calls, which peel away the outer
layers of `a`'s.
- Notice that the \"a\" is unset using an ordinary unset command at
recursion depth 1, and subsequently calling unset reveals a again in
the global scope, which has since been modified in a lower scope
@ -130,7 +130,7 @@ Some things to observe:
- Declaring a global with declare -g bypasses all locals and sets or
modifies the variable of the global scope (outside of all
functions). It has no effect on the visibility of the global.
- This doesn't apply to individual array elements. If two local
arrays of the same name appear in different scopes, the entire array
of the inner scope needs to be unset before any elements of the
outer array become visible. This makes "unset" and "unset2"
@ -145,7 +145,7 @@ expands its arguments.
~ $ ( a=({a..d}); unset 'a[2]'; declare -p a )
declare -a a='([0]="a" [1]="b" [3]="d")'
As usual in such cases, it's important to quote the args to avoid
accidental results such as globbing.
~ $ ( a=({a..d}) b=a c=d d=1; set -x; unset "${b}["{2..3}-c\]; declare -p a )
View File
@ -34,7 +34,7 @@ The return status is the return status of the job waited for, or
Status Reason
-------- -------------------------------------------------
0 waited for all jobs in shell's job list
1 the given `ID` is not a valid job or process ID
## Examples
View File
@ -87,60 +87,60 @@ Since Bash 4.1, all tests related to permissions respect ACLs, if the underlying
| Operator syntax | Description |
| ------------------------- | ----------------------------------------------------------------------------------------------- |
| **-a** <FILE> | True if <FILE> exists. :warning: (not recommended, may collide with `-a` for `AND`, see below) |
| **-e** <FILE> | True if <FILE> exists. |
| **-f** <FILE> | True, if <FILE> exists and is a **regular** file. |
| **-d** <FILE> | True, if <FILE> exists and is a **directory**. |
| **-c** <FILE> | True, if <FILE> exists and is a **character special** file. |
| **-b** <FILE> | True, if <FILE> exists and is a **block special** file. |
| **-p** <FILE> | True, if <FILE> exists and is a **named pipe** (FIFO). |
| **-S** <FILE> | True, if <FILE> exists and is a **socket** file. |
| **-L** <FILE> | True, if <FILE> exists and is a **symbolic link**. |
| **-h** <FILE> | True, if <FILE> exists and is a **symbolic link**. |
| **-g** <FILE> | True, if <FILE> exists and has **sgid bit** set. |
| **-u** <FILE> | True, if <FILE> exists and has **suid bit** set. |
| **-r** <FILE> | True, if <FILE> exists and is **readable**. |
| **-w** <FILE> | True, if <FILE> exists and is **writable**. |
| **-x** <FILE> | True, if <FILE> exists and is **executable**. |
| **-s** <FILE> | True, if <FILE> exists and has size bigger than 0 (**not empty**). |
| **-t** <fd> | True, if file descriptor <fd> is open and refers to a terminal. |
| <FILE1> **-nt** <FILE2> | True, if <FILE1> is **newer than** <FILE2> (mtime). :warning: |
| <FILE1> **-ot** <FILE2> | True, if <FILE1> is **older than** <FILE2> (mtime). :warning: |
| <FILE1> **-ef** <FILE2> | True, if <FILE1> and <FILE2> refer to the **same device and inode numbers**. :warning: |
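A few of these file tests in action, as a sketch against a scratch file (the path comes from `mktemp`, purely for illustration):

```bash
#!/usr/bin/env bash
f=$(mktemp)                          # a fresh, empty regular file
[ -e "$f" ] && echo "exists"
[ -f "$f" ] && echo "regular file"
[ -s "$f" ] || echo "empty (size 0)" # -s is false for an empty file
rm -f "$f"
```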
## String tests
| Operator syntax | Description |
| ---------------------------- | -------------------------------------------------------------------------------------------------------------------------------- |
| **-z** <STRING> | True, if <STRING> is **empty**. |
| **-n** <STRING> | True, if <STRING> is **not empty** (this is the default operation). |
| <STRING1> **=** <STRING2> | True, if the strings are **equal**. |
| <STRING1> **!=** <STRING2> | True, if the strings are **not equal**. |
| <STRING1> **<** <STRING2>   | True if <STRING1> sorts **before** <STRING2> lexicographically (pure ASCII, not current locale!). Remember to escape! Use `\<` |
| <STRING1> **>** <STRING2>   | True if <STRING1> sorts **after** <STRING2> lexicographically (pure ASCII, not current locale!). Remember to escape! Use `\>` |
## Arithmetic tests
| Operator syntax | Description |
| ------------------------------- | ------------------------------------------------------------------- |
| <INTEGER1> **-eq** <INTEGER2> | True, if the integers are **equal**. |
| <INTEGER1> **-ne** <INTEGER2> | True, if the integers are **NOT equal**. |
| <INTEGER1> **-le** <INTEGER2> | True, if the first integer is **less than or equal** second one. |
| <INTEGER1> **-ge** <INTEGER2> | True, if the first integer is **greater than or equal** second one. |
| <INTEGER1> **-lt** <INTEGER2> | True, if the first integer is **less than** second one. |
| <INTEGER1> **-gt** <INTEGER2> | True, if the first integer is **greater than** second one. |
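A short sketch contrasting the numeric operators with string comparison (values are made up):

```bash
#!/usr/bin/env bash
a=3 b=10
[ "$a" -lt "$b" ] && echo "$a is less than $b"
# note: -lt compares numerically; the string test [ 9 \< 10 ] would be
# false, because "9" sorts after "10" lexicographically
```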
## Misc syntax
| Operator syntax | Description |
| ------------------------ | ------------------------------------------------------------------------------------------------------------------------ |
| <TEST1> **-a** <TEST2> | True, if <TEST1> **and** <TEST2> are true (AND). Note that `-a` also may be used as a file test (see above) |
| <TEST1> **-o** <TEST2> | True, if either <TEST1> **or** <TEST2> is true (OR). |
| **!** <TEST> | True, if <TEST> is **false** (NOT). |
| **(** <TEST> **)**      | Group a test (for precedence). **Attention:** In normal shell-usage, the `(` and `)` must be escaped; use `\(` and `\)`! |
| **-o** <OPTION_NAME> | True, if the [shell option](../internals/shell_options.md) <OPTION_NAME> is set. |
| **-v** <VARIABLENAME> | True if the variable <VARIABLENAME> has been set. Use `var[n]` for array elements. |
| **-R** <VARIABLENAME> | True if the variable <VARIABLENAME> has been set and is a nameref variable (since 4.3-alpha) |
## Number of Arguments Rules
@ -244,7 +244,7 @@ Let's say, we want to check the following two things (AND):
1. if a string is null (empty)
2. if a command produced an output
Let's see:
```bash
if [ -z "false" -a -z "$(echo I am executed >&2)" ] ; then ...
@ -290,7 +290,7 @@ true
For the test command, the precedence-grouping characters are likewise `( )`, but you need to escape or quote them so that the shell doesn't try to interpret them:
```bash
$ if [ \( "true" -o -e /does/not/exist \) -a -e /does/not/exist ]; then echo true; else echo false; fi
false
# equivalent, but less readable IMHO:
@ -527,7 +527,7 @@ done
- Internal: [conditional expression](../syntax/ccmd/conditional_expression.md) (aka "the new test command")
- Internal: [the if-clause](../syntax/ccmd/if_clause.md)
[^1]: <rant>Of course, one can wonder what is the use of including the
parenthesis in the specification without defining the behaviour with
more than 4 arguments or how useful are the examples with 7 or 9
arguments attached to the specification.</rant>
View File
@ -2,7 +2,7 @@
In terms of UNIX(r), a directory is a special file which contains a list
of [hardlinks](../dict/terms/hardlink.md) to other files. These other files
also can be directories of course, so it's possible to create a
"hierarchy of directories" - the UNIX(r)-typical filesystem structure.
The structure begins at the special directory `/` (root directory) and
View File
@ -36,9 +36,9 @@ purposes, like reporting a termination by a signal:
--------- ------------------------------------------------------------------------------------------------------------------------------------------------------------
0 success
1-255 failure (in general)
126 the requested command (file) can't be executed (but was found)
127 command (file) not found
128 according to ABS it's used to report an invalid argument to the exit builtin, but I wasn't able to verify that in the source code of Bash (see code 255)
128 + N the shell was terminated by the signal N (also used like this by various other programs)
255 wrong argument to the exit builtin (see code 128)
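Two of these codes are easy to demonstrate (a sketch; the nonexistent path is deliberate):

```bash
#!/usr/bin/env bash
# 127: command (file) not found
/does/not/exist 2>/dev/null
echo "not found -> $?"
# 128 + N: terminated by signal N, e.g. SIGTERM (15) gives 143
bash -c 'kill -TERM $$'
echo "killed -> $?"
```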
@ -51,7 +51,7 @@ control statements like `if` or `while`.
## Portability
Tables of shell behavior involving non-portable side-effects or common
bugs with exit statuses. Note heirloom doesn't support pipeline
negation (`! pipeline`).
### Misc
@ -68,7 +68,7 @@ negation (`! pipeline`).
`` eval echo \$? <&0`false` `` 1 1 1 1 0 0 0 0 1
`while :; do ! break; done; echo $?` 1 1 1 1 0 0 1 1 -
[discussion](https://lists.gnu.org/archive/html/bug-bash/2010-09/msg00009.html)`false; : | echo $?` 1 1 1 0 1 1 1 1 0
@ -90,9 +90,9 @@ transparency of the return builtin.
`f() { return $?; }; false; f; echo $?` 1 1 1 1 1 1 1 1 1
`f() { ! return; }; f; echo $?` 0 0 1 0 0 0 1 1 -
`f() { ! return; }; false; f; echo $?` 1 1 0 0 1 1 0 0 -
`` f() { return; }; x=`false` f; echo $? `` 1 1 1 1 0 0 0 0 0
View File
@ -1,11 +1,11 @@
# File
A file is a pool of data in the [filesystem](../dict/terms/filesystem.md). On
userlevel, it's referenced using a name, a
[hardlink](../dict/terms/hardlink.md) to the file.
If a file is not referenced anymore (number of hardlinks to it drops to
0) then the space allocated for that file is re-used, unless it's still
used by some process.
The file-data splits into actual payload (file contents) and some
View File
@ -19,5 +19,5 @@ written to (when `mtime` is updated).
## mtime
The mtime is set, whenever a file's contents are changed, for example
by editing a file.
View File
@ -18,7 +18,7 @@ really executes
The term "glob" originates back in the UNIX(r) days where an
executable `glob` (from "global") existed which was used to expand
pattern-matching characters. Later, this functionality was built into
the shell. There's still a library function called `glob()` (POSIX(r)),
which serves the same purpose.
## See also
View File
@ -15,7 +15,7 @@ hardlink.
The difference between a [symbolic link](../dict/terms/symlink.md) and a hard
link is that there is no easy way to differentiate between a 'real'
file and a hard link, let's take a look at the example:
* create an empty file
@ -26,7 +26,7 @@ file and a hard link, let's take a look at the example:
$ ln a b
$ ln -s a c
as you can see file(1) can't differentiate between a real file 'a'
and a hard link 'b', but it can tell 'c' is a sym link
$ file *
@ -41,7 +41,7 @@ same inode number AND are on the same file system it means they are
$ ls -i *
5262 a 5262 b 5263 c
hard links don't consume additional space on the filesystem, the space
is freed when the last hard link pointing to it is deleted.
## See also
View File
@ -16,7 +16,7 @@ A shebang will typically look like
#!/bin/bash
Since the line starting with `#` is a comment for the shell (and some
other scripting languages), it's ignored.
Regarding the shebang, there are various, differences between operating
systems, including:
@ -26,7 +26,7 @@ systems, including:
- may be able to take arguments for the interpreter
- ...
POSIX(r) doesn't specify the shebang, though in general it's commonly
supported by operating systems.
## See also
View File
@ -20,7 +20,7 @@ A shell variable is a parameter denoted by a *variable name*:
- containing only alphanumeric characters and underscores
- beginning with an alphabetic character or an underscore
A value can be assigned to a variable, using the variable's name and an
equal-sign:
NAME=VALUE
@ -37,9 +37,9 @@ The nullstring is a valid value:
A positional parameter is denoted by a number other than `0` (zero).
Positional parameters reflect the shell's arguments that are not given
to the shell itself (in practice, the script arguments, also the
function arguments). You can't directly assign to the positional
parameters, however, [the set builtin command](../commands/builtin/set.md)
can be used to indirectly set them.
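A minimal sketch of that indirect assignment via `set` (the argument words are made up):

```bash
#!/usr/bin/env bash
set -- first second third   # indirectly assigns $1, $2, $3
echo "$1 / $2 / $#"         # first / second / 3
```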
View File
@ -1,13 +1,13 @@
# Shell
On UNIX(r), the shell is the main interaction tool between the
user-level and the system. That doesn't necessarily mean the user
always sits in front of a shell, but it's an integral part of the system,
not only an "optional commandline interpreter".
The main job of a shell is to execute commands as a user requests them.
This behaviour alone doesn't help much. A shell knits some intelligence
and flow control around the possibility to execute commands - it's a
complete commandline-oriented user-interface (UI).
FIXME
View File
@ -1,7 +1,7 @@
# Special file
Unlike a regular file (a bunch of accessible data organized on a
filesystem), it's a special filename that points to a resource or
similar:
- character special files
View File
@ -5,7 +5,7 @@ to another filename. Since it really only points to another **filename**
it can
- reference filenames on other filesystems
- reference filenames that don't actually exist
- save a reference to the name of a directory
## See also
View File
@ -18,7 +18,7 @@ In brief, the *reverse polish notation* means the numbers are put on the
stack first, then an operation is applied to them. Instead of writing
`1+1`, you write `1 1+`.
By default `dc`, unlike `bc`, doesn't print anything, the result is
pushed on the stack. You have to use the "p" command to print the
element at the top of the stack. Thus a simple operation looks like:
@ -26,7 +26,7 @@ element at the top of the stack. Thus a simple operation looks like:
2
I used a "here string" present in bash 3.x, ksh93 and zsh. If your
shell doesn't support this, you can use `echo '1 1+p' | dc` or if you
have GNU `dc`, you can use `dc -e '1 1+p'`.
Of course, you can also just run `dc` and enter the commands.
@ -140,10 +140,10 @@ command `f`. The stack remains unchanged:
1
Note how the first element that will be popped from the stack is printed
first, if you are used to an HP calculator, it's the reverse.
Don't hesitate to put `f` in the examples of this tutorial, it doesn't
change the result, and it's a good way to see what's going on.
## Registers
@ -159,7 +159,7 @@ supposed to use the NUL byte. Using a register is easy:
+p # add the 2 values and print
EOF
The above snippet uses newlines to embed comments, but it doesn't
really matter, you can use `echo '12sa10la+p'| dc`, with the same
results.
@ -184,7 +184,7 @@ enclosed in `[]`. You can print it with `p`: `dc <<< '[Hello World!]p'`
and you can evaluate it with x: `dc <<< '[1 2+]xp'`.
This is not that interesting until combined with registers. First,
let's say we want to calculate the square of a number (don't forget to
include `f` if you get lost!):
dc << EOF
@ -223,7 +223,7 @@ we are used to reading:
Some `dc` have `>R <R =R`, GNU `dc` had some more, check your manual.
Note that the test "consumes" its operands: the 2 first elements are
popped off the stack (you can verify that
`dc <<< "[f]sR 2 1 >R 1 2 >R f"` doesn't print anything)
Have you noticed how we can *include* a macro (string) in a macro? and
as `dc` relies on a stack we can, in fact, use the macro recursively
@ -256,12 +256,12 @@ remove all those extra spaces newlines and comments:
dc <<< '[lip1-si0li>L]sL10silLx'
dc <<< '[p1-d0<L]sL10lLx' # use the stack instead of a register
I'll let you figure out the second example, it's not hard, it uses the
stack instead of a register for the index.
## Next
Check your dc manual, I haven't described everything, like arrays (only
documented with "; : are used by bc(1) for array operations" on
solaris, probably because *echo '1 0:a 0Sa 2 0:a La 0;ap' | dc*
results in //Segmentation Fault (core dump) //, the latest solaris uses
View File
@ -5,7 +5,7 @@
## What is a "Collapsing Function"?
A collapsing function is a function whose behavior changes depending
upon the circumstances under which it's run. Function collapsing is
useful when you find yourself repeatedly checking a variable whose value
never changes.
@ -41,7 +41,7 @@ common example is a script that gives the user the option of having
## How does it work?
The first time you run chatter(), the function redefines itself based on
the value of verbose. Thereafter, chatter doesn't check $verbose, it
simply is. Further calls to the function reflect its collapsed nature.
If verbose is unset, chatter will echo nothing, with no extra effort
from the developer.
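As a sketch (the function body and the `verbose` setting here are illustrative), a collapsing `chatter` might look like this:

``` bash
#!/usr/bin/env bash
# Sketch of a collapsing function: on the first call, chatter()
# redefines itself based on $verbose; later calls skip the check.
verbose=1

chatter() {
  if [ "$verbose" ]; then
    chatter() { echo "$@"; }   # collapsed form: always echo
    chatter "$@"
  else
    chatter() { :; }           # collapsed form: do nothing
  fi
}

chatter "building target..."
chatter "done."
```

After the first call, `chatter` *is* the one-line echo (or the no-op); no further `$verbose` tests are performed.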

---

## General
For this task, you don't have to write large parser routines (unless
you want it 100% secure or you want a special file syntax) - you can use
the Bash source command. The file to be sourced should be formatted in
key="value" format, otherwise bash will try to interpret commands:
echo "Config for the target host: $cool_host" >&2
So, where do these variables come from? If everything works fine, they
are defined in /etc/cool.cfg which is a file that's sourced into the
current script or shell. Note: this is **not** the same as executing
this file as a script! The sourced file most likely contains something
like:
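For instance (the variable names and values here are illustrative), the sourced file might contain plain `key="value"` assignments:

``` bash
# Sketch of a sourceable config file -- assignments only, no commands
cool_host="cheese"
mailto="netadmin@example.com"
```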
## Per-user configs
There's also a way to provide a system-wide config file in /etc and a
custom config in ~/(user's home) to override system-wide defaults. In
the following example, the if/then construct is used to check for the
existence of a user-specific config:
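A sketch of such a check (the paths and the default value are illustrative):

``` bash
# Built-in default, then system-wide config, then per-user override.
# Whatever is sourced last wins.
cool_host="defaulthost"

[ -r /etc/cool.cfg ] && . /etc/cool.cfg

if [ -r "$HOME/.coolrc" ]; then
  . "$HOME/.coolrc"
fi

echo "Using host: $cool_host"
```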
For example, imagine a config file that contains malicious code:
echo rm -fr ~/*
mailto=netadmin@example.com
You don't want these `echo`-commands (which could be any other
commands!) to be executed. One way to be a bit safer is to filter only
the constructs you want, write the filtered results to a new file and
source the new file. We also need to be sure something nefarious hasn't
been added to the end of one of our name=value parameters, perhaps using
; or && command separators. In those cases, perhaps it is simplest to
just ignore the line entirely. Egrep (`grep -E`) will help us here, it
source "$configfile"
**[To make clear what it does:]{.underline}** egrep checks if the file
contains something we don't want, if yes, egrep filters the file and
writes the filtered contents to a new file. If done, the original file
name is changed to the name stored in the variable `configfile`. The
file named by that variable is sourced, as if it were the original file.
This filter allows only `NAME=VALUE` and comments in the file, but it
doesn't prevent all methods of code execution. I will address that
later.
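Putting the pieces together, a self-contained sketch of that filtering approach (the sample config and the temp-file handling are illustrative):

``` bash
#!/usr/bin/env bash
# Sanitize a config file before sourcing it.
configfile=$(mktemp)
configfile_secured=$(mktemp)

# A config file with a command-injection attempt in the middle
cat > "$configfile" <<'EOF'
# sample config
fullname="John Doe"
echo rm -fr ~/*
mailto=netadmin@example.com
EOF

# If the file contains anything but comments or NAME=VALUE lines,
# keep only those safe lines and source the filtered copy instead.
if grep -Evq '^#|^[^ ]*=[^;&]*' "$configfile"; then
  grep -E '^#|^[^ ]*=[^;&]*' "$configfile" > "$configfile_secured"
  configfile="$configfile_secured"
fi

source "$configfile"
echo "fullname is: $fullname"   # the safe assignment survived; the echo did not run
rm -f "$configfile" "$configfile_secured"
```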

---

``` bash
$ ls *.zip | while read i; do j=`echo $i | sed 's/.zip//g'`; mkdir $j; cd $j; unzip ../$i; cd ..; done
```
This is an actual one-liner someone asked about in `#bash`. **There are
several things wrong with it. Let's break it down!**
``` bash
$ ls *.zip | while read i; do ...; done
```

But in sh and bash alike, we can loop safely over the glob itself:

``` bash
$ for i in *.zip; do j=`echo $i | sed 's/.zip//g'`; mkdir $j; cd $j; unzip ../$i; cd ..; done
```
Let's break it down some more!
``` bash
j=`echo $i | sed 's/.zip//g'` # where $i is some name ending in '.zip'
```
The goal here seems to be get the filename without its `.zip` extension.
In fact, there is a POSIX(r)-compliant command to do this: `basename`
The implementation here is suboptimal in several ways, but the only
thing that's genuinely error-prone with this is "`echo $i`". Echoing
an *unquoted* variable means
[wordsplitting](../syntax/expansion/wordsplit.md) will take place, so any
whitespace in `$i` will essentially be normalized. In `sh` it is
necessary to use an external command and a subshell to achieve the goal,
but we can eliminate the pipe (subshells, external commands, and pipes
carry extra overhead when they launch, so they can really hurt
performance in a loop). Just for good measure, let's use the more
readable, [modern](../syntax/expansion/cmdsubst.md) `$()` construct instead
of the old style backticks:
``` bash
sh $ for i in *.zip; do j=$(basename "$i" ".zip"); mkdir $j; cd $j; unzip ../$i; cd ..; done
```
In Bash we don't need the subshell or the external basename command.
See [Substring removal with parameter
expansion](../syntax/pe.md#substring_removal):
``` bash
bash $ for i in *.zip; do j="${i%.zip}"; mkdir $j; cd $j; unzip ../$i; cd ..; done
```
Let's keep going:
``` bash
$ mkdir $j; cd $j; ...; cd ..
```

program will run. Even if you do, the following best practice will never
hurt: When a following command depends on the success of a previous
command(s), check for success! You can do this with the "`&&`"
conjunction, that way, if the previous command fails, bash will not try
to execute the following command(s). It's fully POSIX(r). Oh, and
remember what I said about [wordsplitting](../syntax/expansion/wordsplit.md)
in the previous step? Well, if you don't quote `$j`, wordsplitting can
happen again.
``` bash
$ mkdir "$j" && cd "$j" && ... && cd ..
```
That's almost right, but there's one problem -- what happens if `$j`
contains a slash? Then `cd ..` will not return to the original
directory. That's wrong! `cd -` causes cd to return to the previous
working directory, so it's a much better choice:
``` bash
$ mkdir "$j" && cd "$j" && ... && cd -
```
(If it occurred to you that I forgot to check for success after cd -,
good job! You could do this with `{ cd - || break; }`, but I'm going to
leave that out because it's verbose and I think it's likely that we
will be able to get back to our original working directory without a
problem.)
``` bash
sh $ for i in *.zip; do j=$(basename "$i" ".zip"); mkdir "$j" && cd "$j" && unzip ../$i && cd -; done
bash $ for i in *.zip; do j="${i%.zip}"; mkdir "$j" && cd "$j" && unzip ../$i && cd -; done
```
Let's throw the `unzip` command back in the mix:
``` bash
mkdir "$j" && cd "$j" && unzip ../$i && cd -
```
Well, besides word splitting, there's nothing terribly wrong with this.
Still, did it occur to you that unzip might already be able to target a
directory? There isn't a standard for the `unzip` command, but all the
implementations I've seen can do it with the -d flag. So we can drop
the cd commands entirely:
``` bash
sh $ for i in *.zip; do j=$(basename "$i" ".zip"); mkdir "$j" && unzip -d "$j" "$i"; done
bash $ for i in *.zip; do j="${i%.zip}"; mkdir "$j" && unzip -d "$j" "$i"; done
```
There! That's as good as it gets.

---

Why `ed`?
- maybe your `sed` doesn't support in-place edit
- maybe you need to be as portable as possible
- maybe you need to really edit in-file (and not create a new file
like GNU `sed`)
- last but not least: standard `ed` has very good editing and
addressing possibilities, compared to standard `sed`
Don't get me wrong, this is **not** meant as an anti-`sed` article! It's
just meant to show you another way to do the job.
## Commanding ed
which defines an address range for the first to the last line, `,p` thus
means print the whole file, after it has been modified. When your script
runs successfully, you only have to replace the `,p` by a `w`.
Of course, even if the file is not modified by the `p` command, **it's
always a good idea to have a backup copy!**
## Editing your files
Most of these things can be done with `sed`. But there are also things
that can't be done in `sed` or can only be done with very complex code.
### Simple word substitutions
Like `sed`, `ed` also knows the common `s/FROM/TO/` command, and it can
also take line-addresses. **If no substitution is made on the addressed
lines, it's considered an error.**
#### Substitutions through the whole file
...using the `m` command: `<ADDRESS> m <TARGET-ADDRESS>`
This is definitely something that can't be done easily with sed.
# moving lines 5-9 to the end of the file
ed -s test.txt <<< $'5,9m$\nw'
Joining all lines into one can be done with a single command: `j` (join).
ed -s file <<< $'1,$j\nw'
Compared with two other methods (using `tr` or `sed`), you don't have
to delete all newlines and manually add one at the end.
### File operations
For example, to insert another file before the last line (and print the result to stdout - `,p`):
ed -s FILE1 <<< $'$-1 r FILE2\n,p'
To compare, here's a possible `sed` solution which must use Bash
arithmetic and the external program `wc`:
sed "$(($(wc -l < FILE1)-1))r FILE2" FILE1
**__ an error stops the script __**
You might think that it's not a problem and that the same thing happens
with sed and you're right, with the exception that if ed does not find
a pattern it's an error, while sed just continues with the next line.
For instance, let's say that you want to change foo to bar on the first
line of the file and add something after the next line, ed will stop if
it cannot find foo on the first line, sed will continue.
**__ shell parameters are expanded __**
If you don't quote the delimiter, $ has a special meaning. This sounds
obvious but it's easy to forget this fact when you use addresses like
$-1 or commands like $a. Either quote the $ or the delimiter:
#fails
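The elided examples aside, the underlying quoting difference can be shown in pure shell (the variable and strings are illustrative):

``` bash
#!/usr/bin/env bash
# Why the $ must be protected before it reaches ed:
a="oops"                      # imagine an unrelated shell variable

expanded="$a appended"        # double quotes: the shell substitutes $a
literal=$'$a\nappended'       # ANSI-C quoting: $a stays literal for ed

printf '%s\n' "$literal"
```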
# equivalent
ed -s file <<< 'g/foo/'
The name `grep` is derived from the notation `g/RE/p` (global => regular
expression => print). ref
<http://www.catb.org/~esr/jargon/html/G/grep.html>
### wc -l
### cat
Yea, it's a joke...
ed -s file <<< $',p'

---

(`--myoption`) nor XF86-style long options (`-myoption`). So, when you
want to parse command line arguments in a professional ;-) way,
`getopts` may or may not work for you. Unlike its older brother `getopt`
(note the missing *s*!), it's a shell builtin command. The advantages
are:
- No need to pass the positional parameters through to an external
program.
- Being a builtin, `getopts` can set shell variables to use for
parsing (impossible for an *external* process!)
- There's no need to argue with several `getopt` implementations
which had buggy concepts in the past (whitespace, ...)
- `getopts` is defined in POSIX(r).
### Terminology
It's useful to know what we're talking about here, so let's see...
Consider the following command line:
mybackup -x -f /etc/mybackup.conf -r ./foo.txt ./bar.txt
This command line can be split into several logical groups:
- `-f` is also an option, but this option has an associated **option
argument** (an argument to the option `-f`): `/etc/mybackup.conf`.
The option argument is usually the argument following the option
itself, but that isn't mandatory. Joining the option and option
argument into a single argument `-f/etc/mybackup.conf` is valid.
- `-r` depends on the configuration. In this example, `-r` doesn't
  take arguments so it's a standalone option like `-x`.
- `./foo.txt` and `./bar.txt` are remaining arguments without any
associated options. These are often used as **mass-arguments**. For
example, the filenames specified for `cp(1)`, or arguments that
don't need an option to be recognized because of the intended
behavior of the program. POSIX(r) calls them **operands**.
To give you an idea about why `getopts` is useful, the above command
line is equivalent to:
which is complex to parse without the help of `getopts`.
The option flags can be **upper- and lowercase** characters, or
**digits**. It may recognize other characters, but that's not
recommended (usability and maybe problems with special characters).
### How it works
parameters. If you want to shift them, it must be done manually:
shift $((OPTIND-1))
# now do something with $@
Since `getopts` sets an exit status of *FALSE* when there's nothing
left to parse, it's easy to use in a while-loop:
while getopts ...; do
...
done
`getopts` will parse options and their possible arguments. It will stop
parsing on the first non-option argument (a string that doesn't begin
with a hyphen (`-`) that isn't an argument for any option in front of
it). It will also stop parsing when it sees the `--` (double-hyphen),
which means [end of options](../dict/terms/end_of_options.md).
| Variable | Description |
| --- | --- |
| [OPTIND](../syntax/shellvars.md#OPTIND) | Holds the index to the next argument to be processed. This is how `getopts` "remembers" its own status between invocations. Also useful to shift the positional parameters after processing with `getopts`. `OPTIND` is initially set to 1, and **needs to be re-set to 1 if you want to parse anything again with getopts** |
| [OPTARG](../syntax/shellvars.md#OPTARG) | This variable is set to any argument for an option found by `getopts`. It also contains the option flag of an unknown option. |
| [OPTERR](../syntax/shellvars.md#OPTERR) | (Values 0 or 1) Indicates if Bash should display error messages generated by the `getopts` builtin. The value is initialized to **1** on every shell startup - so be sure to always set it to **0** if you don't want to see annoying messages! **`OPTERR` is not specified by POSIX for the `getopts` builtin utility --- only for the C `getopt()` function in `unistd.h` (`opterr`).** `OPTERR` is bash-specific and not supported by shells such as ksh93, mksh, zsh, or dash. |
`getopts` also uses these variables for error reporting (they're set to
value-combinations which aren't possible in normal operation).
#### The option-string
The option-string tells `getopts` which options to expect and which of
them must have an argument. The syntax is very simple --- every option
character is simply named as is, this example-string would tell
`getopts` to look for `-f`, `-A` and `-x`:
If you want `-A` to take an argument (i.e. to become `-A SOMETHING`) just do:
getopts fA:x VARNAME
If the **very first character** of the option-string is a `:` (colon),
which would normally be nonsense because there's no option letter
preceding it, `getopts` switches to "**silent error reporting mode**".
In productive scripts, this is usually what you want because it allows
you to handle errors yourself without being disturbed by annoying
You can give your own set of arguments to the utility to parse. Whenever
additional arguments are given after the `VARNAME` parameter, `getopts`
doesn't try to parse the positional parameters, but these given words.
This way, you are able to parse any option set you like, here for
example from an array:
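A sketch of that idea (the array name and option set are illustrative):

``` bash
#!/usr/bin/env bash
# Let getopts parse an array instead of the positional parameters.
my_args=( -x -f /etc/mybackup.conf )

parsed=()
while getopts ":xf:" opt "${my_args[@]}"; do
  case $opt in
    x) parsed+=( "x" ) ;;
    f) parsed+=( "f=$OPTARG" ) ;;
  esac
done

echo "${parsed[@]}"
```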
Regarding error-reporting, there are two modes `getopts` can run in:
- silent mode
For productive scripts I recommend using the silent mode, since
everything looks more professional, when you don't see annoying
standard messages. Also it's easier to handle, since the failure cases
are indicated in an easier way.
#### Verbose Mode
Enough said - action!
Let's play with a very simple case: only one option (`-a`) expected,
without any arguments. Also we disable the *verbose error handling* by
preceding the whole option string with a colon (`:`):
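The script itself is elided here; presumably it looked roughly like this (wrapped in a function so it can be called repeatedly):

``` bash
#!/usr/bin/env bash
# Sketch of go_test.sh -- one option (-a), silent error reporting.
go_test() {
  local opt OPTIND=1
  while getopts ":a" opt "$@"; do
    case $opt in
      a)  echo "-a was triggered!" ;;
      \?) echo "Invalid option: -$OPTARG" ;;
    esac
  done
}

go_test -a
go_test -b
```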
I put that into a file named `go_test.sh`, which is the name you'll see
below in the examples.
Let's do some tests:
#### Calling it without any arguments
$ ./go_test.sh
$
Nothing happened? Right. `getopts` didn't see any valid or invalid
options (letters preceded by a dash), so it wasn't triggered.
#### Calling it with non-option arguments
$ ./go_test.sh /etc/passwd
$
Again --- nothing happened. The **very same** case: `getopts` didn't
see any valid or invalid options (letters preceded by a dash), so it
wasn't triggered.
The arguments given to your script are of course accessible as `$1` -
`${N}`.
#### Calling it with option-arguments
Now let's trigger `getopts`: Provide options.
First, an **invalid** one:
Invalid option: -b
$
As expected, `getopts` didn't accept this option and acted like told
above: It placed `?` into `$opt` and the invalid option character (`b`)
into `$OPTARG`. With our `case` statement, we were able to detect this.
Now, a **valid** one (`-a`):
You see, the detection works perfectly. The `a` was put into the
variable `$opt` for our case statement.
Of course it's possible to **mix valid and invalid** options when
calling:
$ ./go_test.sh -a -x -b -c
Invalid option: -c
$
Finally, it's of course possible to give our option **multiple
times**:
$ ./go_test.sh -a -a -a -a
The last examples lead us to some points you may consider:
- **invalid options don't stop the processing**: If you want to stop
the script, you have to do it yourself (`exit` in the right place)
- **multiple identical options are possible**: If you want to disallow
these, you have to check manually (e.g. by setting a variable or so)
### An option with argument
Let's extend our example from above. Just a little bit:
- `-a` now takes an argument
- on an error, the parsing exits with `exit 1`
``` bash
while getopts ":a:" opt; do
  case $opt in
    a)  echo "-a was triggered, Parameter: $OPTARG" >&2 ;;
    \?) echo "Invalid option: -$OPTARG" >&2; exit 1 ;;
    :)  echo "Option -$OPTARG requires an argument." >&2; exit 1 ;;
  esac
done
```
Let's do the very same tests we did in the last example:
#### Calling it without any arguments
$ ./go_test.sh
$
As above, nothing happened. It wasn't triggered.
#### Calling it with non-option arguments
$ ./go_test.sh /etc/passwd
$
The **very same** case: It wasn't triggered.
#### Calling it with option-arguments
Invalid option: -b
$
As expected, as above, `getopts` didn't accept this option and acted
like programmed.
**Valid** option, but without the mandatory **argument**:
The option was okay, but there is an argument missing.
Let's provide **the argument**:
$ ./go_test.sh -a /etc/passwd
-a was triggered, Parameter: /etc/passwd

---

## Why lock?
Sometimes there's a need to ensure only one copy of a script runs, i.e.
prevent two or more copies running simultaneously. Imagine an important
cronjob doing something very important, which will fail or corrupt data
if two copies of the called program were to run at the same time. To
prevent this, a form of `MUTEX` (**mutual exclusion**) lock is needed.
The basic procedure is simple: The script checks if a specific condition
(locking) is present at startup, if yes, it's locked - the script
doesn't start.
This article describes locking with common UNIX(r) tools. There are
other special locking tools available, but they're not standardized, or
worse yet, you can't be sure they're present when you want to run your
scripts. **A tool designed specifically for this purpose does the
job much better than general purpose code.**
## Choose the locking method
The best way to set a global lock condition is the UNIX(r) filesystem.
Variables aren't enough, as each process has its own private variable
space, but the filesystem is global to all processes (yes, I know about
chroots, namespaces, ... special case). You can "set" several things
in the filesystem that can be used as locking indicator:
To create a file or set a file timestamp, usually the command touch is
used. The following problem is implied: A locking mechanism checks for
the existence of the lockfile, if no lockfile exists, it creates one and
continues. Those are **two separate steps**! That means it's **not an
atomic operation**. There's a small amount of time between checking and
creating, where another instance of the same script could perform
locking (because when it checked, the lockfile wasn't there)! In that
case you would have 2 instances of the script running, both thinking
they are successfully locked, and can operate without colliding. Setting
the timestamp is similar: One step to check the timestamp, a second step
to set the timestamp.
<WRAP center round tip 60%> [**Conclusion:**]{.underline} We need an
operation that does the check and the locking in one step. </WRAP>
A simple way to get that is to create a **lock directory** - with the
mkdir command. It will:
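A minimal sketch of `mkdir`-based locking (the lock path is illustrative; in a real script you would `exit` on failure and install a `trap ... EXIT` to remove the lock directory):

``` bash
#!/usr/bin/env bash
# mkdir does "check and create" in one atomic step.
LOCKDIR=$(mktemp -u)             # a unique pathname that does not exist yet

if mkdir "$LOCKDIR" 2>/dev/null; then
  echo "lock acquired"           # we are the only instance
fi

# a second attempt (another "instance") fails while the lock is held
if ! mkdir "$LOCKDIR" 2>/dev/null; then
  echo "already locked"
fi

rmdir "$LOCKDIR"                 # release the lock
```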
atomic. Maybe a while loop checking continuously for the existence of the
lock in the background and sending a signal such as USR1, if the
directory is not found, can be done. The signal would need to be
trapped. I am sure there is a better solution than this
suggestion* --- *[sn18](sunny_delhi18@yahoo.com) 2009/12/19 08:24*
**Note:** While perusing the Internet, I found some people asking if the
`mkdir` method works "on all filesystems". Well, let's say it should.
The syscall under `mkdir` is guaranteed to work atomically in all cases,
at least on Unices. Two examples of problems are NFS filesystems and
filesystems on cluster servers. With those two scenarios, dependencies
exist related to the mount options and implementation. However, I
successfully use this simple method on an Oracle OCFS2 filesystem in a
4-node cluster environment. So let's just say "it should work under
normal conditions".
Another atomic method is setting the `noclobber` shell option
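A sketch of that variant (the lockfile path is illustrative):

``` bash
#!/usr/bin/env bash
# With noclobber set, "> file" fails if the file already exists,
# which again combines check and create into one atomic step.
lockfile=$(mktemp -u)

set -o noclobber
if echo "$$" > "$lockfile" 2>/dev/null; then
  echo "lock acquired"
fi
if ! echo "$$" > "$lockfile" 2>/dev/null; then
  echo "already locked"
fi
set +o noclobber

rm -f "$lockfile"
```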
differences compared to the very simple example above:
- traps are created to automatically remove the lock when the script
terminates, or is killed
Details on how the script is killed aren't given, only code relevant to
the locking process is shown:

---

the best features of `tar` and `cpio`, able to handle all common archive
types.
However, this is **not a manpage**, it will **not** list all possible
options, it will **not** give you detailed information about `pax`. It's
only an introduction.
This article is based on the debianized Berkeley implementation of
`pax`, but implementation-specific things should be tagged as such.
Unfortunately, the Debian package doesn't seem to be maintained
anymore.
## Overview
### Archive data
When you don't specify anything special, `pax` will attempt to read
archive data from standard input (read/list modes) and write archive
data to standard output (write mode). This ensures `pax` can be easily
used as part of a shell pipe construct, e.g. to read a compressed
archive that's decompressed in the pipe.
The option to specify the pathname of a file to be archived is `-f`. This
file will be used as input or output, depending on the operation
When pax reads an archive, it tries to guess the archive type. However,
in *write* mode, you must specify which type of archive to append using
the `-x <TYPE>` switch. If you omit this switch, a default archive will
be created (POSIX says it's implementation defined, Berkeley `pax`
creates `ustar` if no options are specified).
The following archive formats are supported (Berkeley implementation):
patterns
- if no patterns are given, `pax` will "match" (list or extract) all
files from the archive
- **To avoid conflicts with shell pathname expansion, it's wise to
quote patterns!**
#### Some assorted examples of patterns
specification is strongly derived from the `printf(3)` format
specification.
**Unfortunately** the `pax` utility delivered with Debian doesn't seem
to support these extended listing formats.
However, `pax` lists archive members in a `ls -l`-like format, when you
The same, but with an archive, can be accomplished by:
pax -w -T 0000 -f /n/mybackups/$(date +%A)
In this case, the day-name is an archive-file (you don't need a
filename extension like `.tar` but you can add one, if desired).
### Changing filenames while archiving
### Excluding files from an archive
The -s command seen above can be used to exclude a file. The
substitution must result in a null string: For example, let's say that
you want to exclude all the CVS directories to create a source code
archive. We are going to replace the names containing /CVS/ with
nothing, note the `.*`: they are needed because we need to match the
entire pathname.
pax -w -x ustar -f release.tar -s',.*/CVS/.*,,' myapplication
You can use several -s options, for instance, let's say you also want
to remove files ending in ~:
pax -w -x ustar -f release.tar -'s,.*/CVS/.*,,' -'s/.*~//' myapplication
Like `cpio`, pax can read filenames from standard input (`stdin`). This
provides great flexibility - for example, a `find(1)` command may select
files/directories in ways pax can\'t do itself. In **write** mode
files/directories in ways pax can't do itself. In **write** mode
(creating an archive) or **copy** mode, when no filenames are given, pax
expects to read filenames from standard input. For example:


@ -44,7 +44,7 @@ connected to `/dev/pts/5`.
# Simple Redirections
## Output Redirection \"n\> file\"
## Output Redirection \"n> file\"
`>` is probably the simplest redirection.
@ -97,7 +97,7 @@ pointing to `file`. The command will then start with:
What will the command do with this descriptor? It depends. Often
nothing. We will see later why we might want other file descriptors.
## Input Redirection \"n\< file\"
## Input Redirection \"n< file\"
When you run a command using `command < file`, it changes the file
descriptor `0` so that it looks like:
@ -148,7 +148,7 @@ descriptors.
# More On File Descriptors
## Duplicating File Descriptor 2\>&1
## Duplicating File Descriptor 2>&1
We have seen how to open (or redirect) file descriptors. Let us see how
to duplicate them, starting with the classic `2>&1`. What does this
@ -201,7 +201,7 @@ So you got a copy of this descriptor:
--- +-----------------------+
Internally each of these is represented by a file descriptor opened by
the operating system\'s `fopen` calls, and is likely just a pointer to
the operating system's `fopen` calls, and is likely just a pointer to
the file which has been opened for reading (`stdin` or file descriptor
`0`) or writing (`stdout` /`stderr`).
@ -213,9 +213,9 @@ Similarly for output file descriptors, writing a line to file descriptor
`s` will append a line to a file as will writing a line to file
descriptor `t`.
\<note tip\>The syntax is somewhat confusing in that you would think
that the arrow would point in the direction of the copy, but it\'s
reversed. So it\'s `target>&source` effectively.\</note\>
<note tip>The syntax is somewhat confusing in that you would think
that the arrow would point in the direction of the copy, but it's
reversed. So it's `target>&source` effectively.</note>
So, as a simple example (albeit slightly contrived), is the following:
@ -225,15 +225,15 @@ So, as a simple example (albeit slightly contrived), is the following:
exec 1>&3 # Copy 3 back into 1
echo Done # Output to original stdout
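The snippet above is only a fragment; a self-contained sketch of the same save/restore dance looks like this (the log-file name is made up):

```shell
exec 3>&1                   # save a copy of the current stdout in fd 3
exec 1>/tmp/fd_demo.log     # point stdout at a (hypothetical) log file
echo "goes to the log"      # lands in /tmp/fd_demo.log
exec 1>&3 3>&-              # copy 3 back into 1, then close 3
echo "back on the original stdout"
```

The second `echo` appears on the terminal again, while the first one is in the log file.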
## Order Of Redirection, i.e., \"\> file 2\>&1\" vs. \"2\>&1 \>file\"
## Order Of Redirection, i.e., \"> file 2>&1\" vs. \"2>&1 >file\"
While it doesn\'t matter where the redirections appear on the command
While it doesn't matter where the redirections appear on the command
line, their order does matter. They are set up from left to right.
- `2>&1 >file`
A common error, is to do `command 2>&1 > file` to redirect both `stderr`
and `stdout` to `file`. Let\'s see what\'s going on. First we type the
and `stdout` to `file`. Let's see what's going on. First we type the
command in our terminal, the descriptors look like this:
--- +-----------------------+
@ -263,7 +263,7 @@ descriptor look like this:
standard error ( 2 ) ---->| /dev/pts/5 |
--- +-----------------------+
That\'s right, nothing has changed, 2 was already pointing to the same
That's right, nothing has changed, 2 was already pointing to the same
place as 1. Now Bash sees `> file` and thus changes `stdout`:
--- +-----------------------+
@ -278,11 +278,11 @@ place as 1. Now Bash sees `> file` and thus changes `stdout`:
standard error ( 2 ) ---->| /dev/pts/5 |
--- +-----------------------+
And that\'s not what we want.
And that's not what we want.
- `>file 2>&1`
Now let\'s look at the correct `command >file 2>&1`. We start as in the
Now let's look at the correct `command >file 2>&1`. We start as in the
previous example, and Bash sees `> file`:
--- +-----------------------+
@ -313,7 +313,7 @@ Then it sees our duplication `2>&1`:
And voila, both `1` and `2` are redirected to file.
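The difference between the two orders can be observed directly; a small sketch (the function and file names are arbitrary):

```shell
demo() { echo out; echo err >&2; }   # one line to stdout, one to stderr

demo > /tmp/both.log 2>&1   # correct: 1 goes to the file first, then 2 duplicates 1
demo 2>&1 > /tmp/only.log   # wrong order: 2 duplicates the terminal, then only 1 moves
```

After running this, `/tmp/both.log` contains both lines, while `/tmp/only.log` contains only `out` and `err` still went to the terminal.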
## Why sed \'s/foo/bar/\' file \>file Doesn\'t Work
## Why sed 's/foo/bar/' file >file Doesn't Work
This is a common error, we want to modify a file using something that
reads from a file and writes the result to `stdout`. To do this, we
@ -322,14 +322,14 @@ as we have seen, the redirections are setup before the command is
actually executed.
So **BEFORE** sed starts, standard output has already been redirected,
with the additional side effect that, because we used \>, \"file\" gets
with the additional side effect that, because we used >, \"file\" gets
truncated. When `sed` starts to read the file, it contains nothing.
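The usual workaround is to write to a temporary file and move it into place afterwards; a sketch (file names are made up):

```shell
printf 'foo\n' > /tmp/sed_demo.txt
# sed 's/foo/bar/' /tmp/sed_demo.txt > /tmp/sed_demo.txt   # would truncate the file first!
sed 's/foo/bar/' /tmp/sed_demo.txt > /tmp/sed_demo.tmp \
    && mv /tmp/sed_demo.tmp /tmp/sed_demo.txt
cat /tmp/sed_demo.txt   # → bar
```

Some sed implementations offer `-i` for in-place editing, but the temp-file-and-move approach is the portable one.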
## exec
In Bash the `exec` built-in replaces the shell with the specified
program. So what does this have to do with redirection? `exec` also
allow us to manipulate the file descriptors. If you don\'t specify a
allow us to manipulate the file descriptors. If you don't specify a
program, the redirection after `exec` modifies the file descriptors of
the current shell.
@ -356,7 +356,7 @@ script and ran `myscript 2>file`.
commands in your script produce, just add `exec 2>myscript.errors` at
the beginning of your script.
Let\'s see another use case. We want to read a file line by line, this
Let's see another use case. We want to read a file line by line, this
is easy, we just do:
while read -r line;do echo "$line";done < file
@ -366,7 +366,7 @@ user to press a key:
while read -r line;do echo "$line"; read -p "Press any key" -n 1;done < file
And, surprise this doesn\'t work. Why? because the shell descriptor of
And, surprise this doesn't work. Why? because the shell descriptor of
the while loop looks like:
--- +-----------------------+
@ -386,7 +386,7 @@ and our read inherits these descriptors, and our command
and not from our terminal.
A quick look at `help read` tells us that we can specify a file
descriptor from which `read` should read. Cool. Now let\'s use `exec` to
descriptor from which `read` should read. Cool. Now let's use `exec` to
get another descriptor:
exec 3<file
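Putting the pieces together, a complete sketch of the two-descriptor loop: the file is read via fd 3, so fd 0 (the terminal) stays free for the user prompt.

```shell
printf 'line1\nline2\n' > /tmp/read_demo.txt
exec 3< /tmp/read_demo.txt          # open the file on descriptor 3
while read -r line <&3; do          # read the file via fd 3 ...
    echo "$line"
    # read -p "Press any key" -n 1  # ... so this would still read from fd 0
done
exec 3<&-                           # close fd 3 when done
```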
@ -415,7 +415,7 @@ and it works.
## Closing The File Descriptors
Closing a file through a file descriptor is easy, just make it a
duplicate of -. For instance, let\'s close `stdin <&-` and
duplicate of -. For instance, let's close `stdin <&-` and
`stderr 2>&-`:
bash -c '{ lsof -a -p $$ -d0,1,2 ;} <&- 2>&-'
@ -427,7 +427,7 @@ we see that inside the `{}` that only `1` is still here.
Though the OS will probably clean up the mess, it is perhaps a good idea
to close the file descriptors you open. For instance, if you open a file
descriptor with `exec 3>file`, all the commands afterwards will inherit
it. It\'s probably better to do something like:
it. It's probably better to do something like:
exec 3>file
.....
@ -438,11 +438,11 @@ it. It\'s probably better to do something like:
#we don't need 3 any more
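Filling in the elided middle of that snippet, a minimal open-use-close sequence (the file name is made up):

```shell
exec 3>/tmp/fd3_demo.log   # open fd 3 for writing
echo "first"  >&3
echo "second" >&3
exec 3>&-                  # close fd 3; commands run after this no longer inherit it
```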
I\'ve seen some people using this as a way to discard, say stderr, using
something like: command 2\>&-. Though it might work, I\'m not sure if
something like: command 2>&-. Though it might work, I\'m not sure if
you can expect all applications to behave correctly with a closed
stderr.
When in doubt, I use 2\>/dev/null.
When in doubt, I use 2>/dev/null.
# An Example
@ -462,7 +462,7 @@ on the comp.unix.shell group:
The redirections are processed from left to right, but as the file
descriptors are inherited we will also have to work from the outer to
the inner contexts. We will assume that we run this command in a
terminal. Let\'s start with the outer `{ } 3>&2 4>&1`.
terminal. Let's start with the outer `{ } 3>&2 4>&1`.
--- +-------------+ --- +-------------+
( 0 ) ---->| /dev/pts/5 | ( 3 ) ---->| /dev/pts/5 |
@ -482,7 +482,7 @@ and thus `1` and `2` go to the terminal. As an exercise, you can start
with `1` pointing to `file.stdout` and 2 pointing to `file.stderr`, you
will see why these redirections are very nice.
Let\'s continue with the right part of the second pipe:
Let's continue with the right part of the second pipe:
`| cmd3 3>&- 4>&-`
--- +-------------+
@ -516,7 +516,7 @@ pipe for reading. Now for the left part of the second pipe
First, The file descriptor `1` is connected to the pipe (`|`), then `2`
is made a copy of `1` and thus is made an fd to the pipe (`2>&1`), then
`1` is made a copy of `4` (`>&4`), then `4` is closed. These are the
file descriptors of the inner `{}`. Let\'s go inside and have a look at
file descriptors of the inner `{}`. Let's go inside and have a look at
the right part of the first pipe: `| cmd2 2>&3 3>&-`
--- +-------------+
@ -599,7 +599,7 @@ help you, a redirection is always like the following:
- `lhs` is always a file description, i.e., a number:
- Either we want to open, duplicate, move or we want to close. If
the op is `<` then there is an implicit 0, if it\'s `>` or `>>`,
the op is `<` then there is an implicit 0, if it's `>` or `>>`,
there is an implicit 1.
```{=html}
@ -628,7 +628,7 @@ The shell is pretty loose about what it considers a valid redirect.
While opinions probably differ, this author has some (strong)
recommendations:
- **Always** keep redirections \"tightly grouped\" \-- that is, **do
- **Always** keep redirections \"tightly grouped\" -- that is, **do
not** include whitespace anywhere within the redirection syntax
except within quotes if required on the RHS (e.g. a filename that
contains a space). Since shells fundamentally use whitespace to


@ -23,7 +23,7 @@ We have a simple **stat.sh** script:
echo "PYTHON LINES: $LINES"
This script evaluates the number of python files and the number of python
code lines in the files. We can use it like **./stat.sh \<dir\>**
code lines in the files. We can use it like **./stat.sh <dir>**
### Create testsuite


@ -3,7 +3,7 @@
![](keywords>bash shell scripting options runtime variable behaviour)
This information was taken from a Bash version \"`4.1`\", every now and
then new options are added, so likely, this list isn\'t complete.
then new options are added, so likely, this list isn't complete.
The shell-options can be set with the [shopt builtin
command](../commands/builtin/shopt.md).
@ -252,7 +252,7 @@ the only possible completions. This option is enabled by default.
If set, range expressions used in pattern matching behave as if in the
traditional C locale when performing comparisons. That is, the current
locale\'s collating sequence is not taken into account, so b will not
locale's collating sequence is not taken into account, so b will not
collate between A and B, and upper-case and lower-case ASCII characters
will collate together.
@ -490,7 +490,7 @@ There are two options that influence the parsing this way:
- `extglob`
- `extquote`
Consequence: You **can\'t** use the new syntax (e.g. the extended
Consequence: You **can't** use the new syntax (e.g. the extended
globbing syntax) and the command to enable it **in the same line**.
$ shopt -s extglob; echo !(*.txt) # this is the WRONG way!
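The right way is to enable the option on its own line, so the parser already knows the new syntax when it reaches the glob. A sketch using a throwaway directory:

```shell
mkdir -p /tmp/extglob_demo
cd /tmp/extglob_demo
touch a.txt b.md
bash <<'EOF'
shopt -s extglob      # takes effect before the next line is parsed
echo !(*.txt)         # matches b.md, i.e. everything that is not *.txt
EOF
```

(`bash -O extglob -c '...'` achieves the same for one-liners, since `-O` enables the option before the command string is parsed.)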


@ -15,38 +15,38 @@ And yes, these bashphorisms reflect the daily reality in `#bash`.
Number Bashphorism
-------- ----------------------------------------------------------------------------------------------------------------------------------------------------------------
0 The questioner will never tell you what they are really doing the first time they ask.
1 The questioner\'s first description of the problem/question will be misleading.
1 The questioner's first description of the problem/question will be misleading.
2 The questioner will keep changing the question until it drives the helpers in the channel insane.
3 Offtopicness will continue until someone asks a bash question that falls under bashphorisms 1 and/or 2, and `greycat` gets pissed off.
4 The questioner will not read and apply the answers he is given but will instead continue to practice bashphorism #1 and bashphorism #2.
5 The ignorant will continually mis-educate the other noobies.
6 When given a choice of solutions, the newbie will always choose the wrong one.
7 The newbie will always find a reason to say, \"It doesn\'t work.\"
8 If you don\'t know to whom the bashphorism\'s referring, it\'s you.
7 The newbie will always find a reason to say, \"It doesn't work.\"
8 If you don't know to whom the bashphorism's referring, it's you.
9 All examples given by the questioner will be broken, misleading, wrong, and not representative of the actual question.
10 See B1
11 Please apply `(( % 10 ))` to the bashphorism value.
12 All logic is deniable; however, some logic will \*plonk\* you if you deny it.
13 Everyone ignores greycat when he is right. When he is wrong, it is !b1
14 The newbie doesn\'t actually know what he\'s asking. If he did, he wouldn\'t need to ask.
14 The newbie doesn't actually know what he's asking. If he did, he wouldn't need to ask.
15 The more advanced you are, the more likely you are to be overcomplicating it.
16 The more beginner you are, the more likely you are to be overcomplicating it.
17 A newbie comes to #bash to get his script confirmed. He leaves disappointed.
18 The newbie will not accept the answer you give, no matter how right it is.
19 The newbie is a bloody loon.
20 The newbie will always have some excuse for doing it wrong.
21 When the newbie\'s question is ambiguous, the proper interpretation will be whichever one makes the problem the hardest to solve.
22 The newcomer will abuse the bot\'s factoid triggers for their own entertainment until someone gets annoyed enough to ask them to message it privately instead.
21 When the newbie's question is ambiguous, the proper interpretation will be whichever one makes the problem the hardest to solve.
22 The newcomer will abuse the bot's factoid triggers for their own entertainment until someone gets annoyed enough to ask them to message it privately instead.
23 Everyone is a newcomer.
24 The newcomer will address greybot as if it were human.
25 The newbie won\'t accept any answer that uses practical or standard tools.
25 The newbie won't accept any answer that uses practical or standard tools.
26 The newbie will not TELL you about this restriction until you have wasted half an hour.
27 The newbie will lie.
28 When the full horror of the newbie\'s true goal is revealed, the newbie will try to restate the goal to trick you into answering. Newbies are stupid.
29 It\'s always git. Or python virtualenv. Or docker. One of those pieces of shit. ALWAYS.
30 They won\'t show you the homework assignment. That would make it too easy.
28 When the full horror of the newbie's true goal is revealed, the newbie will try to restate the goal to trick you into answering. Newbies are stupid.
29 It's always git. Or python virtualenv. Or docker. One of those pieces of shit. ALWAYS.
30 They won't show you the homework assignment. That would make it too easy.
31 Your teacher is a f\*\*king idiot.
32 The more horrifyingly wrong a proposed solution is, the more likely it will be used.
33 The newbie cannot explain what he is doing, or why. He will show you incomprehensible, nonworking code instead. What? You can\'t read his mind?!
33 The newbie cannot explain what he is doing, or why. He will show you incomprehensible, nonworking code instead. What? You can't read his mind?!
Please feel free to correct or extend this page whenever needed.


@ -28,7 +28,7 @@ Usually Bash and/or Linux (GNU Toolset) specific.
$ sed statement <<< cat
cement
$ \(-
$ (-
bash: (-: command not found
$ echo '[q]sa[ln0=aln256%Pln256/snlbx]sb3135071790101768542287578439snlbxq'|dc


@ -1,4 +1,4 @@
# Bash\'s behaviour
# Bash's behaviour
![](keywords>bash shell scripting startup files dotfiles modes POSIX)
@ -8,15 +8,15 @@ FIXME incomplete
### Login shell
As a \"login shell\", Bash reads and sets (executes) the user\'s profile
As a \"login shell\", Bash reads and sets (executes) the user's profile
from `/etc/profile` and one of `~/.bash_profile`, `~/.bash_login`, or
`~/.profile` (in that order, using the first one that\'s readable!).
`~/.profile` (in that order, using the first one that's readable!).
When a login shell exits, Bash reads and executes commands from the file
`~/.bash_logout`, if it exists.
Why an extra login shell mode? There are many actions and variable sets
that only make sense for the initial user login. That\'s why all UNIX(r)
that only make sense for the initial user login. That's why all UNIX(r)
shells have (should have) a \"login\" mode.
[**Methods to start Bash as a login shell:**]{.underline}
@ -44,7 +44,7 @@ they\'re not inherited from the parent shell.
The feature to have a system-wide `/etc/bash.bashrc` or a similar
system-wide rc-file is specific to vendors and distributors that ship
*their own, patched variant of Bash*. The classic way to have a
system-wide rc file is to `source /etc/bashrc` from every user\'s
system-wide rc file is to `source /etc/bashrc` from every user's
`~/.bashrc`.
[**Methods to test for interactive-shell mode:**]{.underline}
@ -64,9 +64,9 @@ system-wide rc file is to `source /etc/bashrc` from every user\'s
When Bash starts in SH compatibility mode, it tries to mimic the startup
behaviour of historical versions of `sh` as closely as possible, while
conforming to the POSIX(r) standard as well. The profile files read are
`/etc/profile` and `~/.profile`, if it\'s a login shell.
`/etc/profile` and `~/.profile`, if it's a login shell.
If it\'s not a login shell, the environment variable
If it's not a login shell, the environment variable
[ENV](../syntax/shellvars.md#ENV) is evaluated and the resulting filename is
used as the name of the startup file.
@ -76,7 +76,7 @@ mode (for running, not for starting!)](#posix_run_mode).
[**Bash starts in `sh` compatibility mode when:**]{.underline}
- the base filename in `argv[0]` is `sh` (:!: NB: `/bin/sh` may be
linked to `/bin/bash`, but that doesn\'t mean it acts like
linked to `/bin/bash`, but that doesn't mean it acts like
`/bin/bash` :!:)
### POSIX mode
@ -85,7 +85,7 @@ When Bash is started in POSIX(r) mode, it follows the POSIX(r) standard
for startup files. In this mode, **interactive shells** expand the
[ENV](../syntax/shellvars.md#ENV) variable and commands are read and executed
from the file whose name is the expanded value.\
No other startup files are read. Hence, a non-interactive shell doesn\'t
No other startup files are read. Hence, a non-interactive shell doesn't
read any startup files in POSIX(r) mode.
[**Bash starts in POSIX(r) mode when:**]{.underline}
@ -104,11 +104,11 @@ read any startup files in POSIX(r) mode.
Mode `/etc/profile` `~/.bash_profile` `~/.bash_login` `~/.profile` `~/.bashrc` `${ENV}`
----------------------- ---------------- ------------------- ----------------- -------------- ------------- ----------
Login shell X X X X \- \-
Interactive shell \- \- \- \- X \-
SH compatible login X \- \- X \- \-
SH compatible \- \- \- \- \- X
POSIX(r) compatiblity \- \- \- \- \- X
Login shell X X X X - -
Interactive shell - - - - X -
SH compatible login X - - X - -
SH compatible - - - - - X
POSIX(r) compatibility - - - - - X
## Bash run modes
@ -117,7 +117,7 @@ read any startup files in POSIX(r) mode.
### POSIX run mode
In POSIX(r) mode, Bash follows the POSIX(r) standard regarding behaviour
and parsing (excerpt from a Bash maintainer\'s document):
and parsing (excerpt from a Bash maintainer's document):
Starting Bash with the `--posix' command-line option or executing `set
-o posix' while Bash is running will cause Bash to conform more closely
@ -311,33 +311,33 @@ FIXME help me to find out what breaks in POSIX(r) mode!
### Restricted shell
In restricted mode, Bash sets up (and runs) a shell environment that\'s
In restricted mode, Bash sets up (and runs) a shell environment that's
far more controlled and limited than the standard shell mode. It acts
like normal Bash with the following restrictions:
- the `cd` command can\'t be used to change directories
- the `cd` command can't be used to change directories
- the variables [SHELL](../syntax/shellvars.md#SHELL),
[PATH](../syntax/shellvars.md#PATH), [ENV](../syntax/shellvars.md#ENV) and
[BASH_ENV](../syntax/shellvars.md#BASH_ENV) can\'t be set or unset
- command names that contain a `/` (slash) can\'t be called (hence
[BASH_ENV](../syntax/shellvars.md#BASH_ENV) can't be set or unset
- command names that contain a `/` (slash) can't be called (hence
you\'re limited to `PATH`)
- filenames containing a `/` (slash) can\'t be specified as argument
- filenames containing a `/` (slash) can't be specified as argument
to the `source` or `.` builtin command
- filenames containing a `/` (slash) can\'t be specified as argument
- filenames containing a `/` (slash) can't be specified as argument
to the `-p` option of the `hash` builtin command
- function definitions are not inherited from the environment at shell
startup
- the environment variable [SHELLOPTS](../syntax/shellvars.md#SHELLOPTS) is
ignored at startup
- redirecting output using the `>`, `>|`, `<>`, `>&`, `&>`, and `>>`
redirection operators isn\'t allowed
- the `exec` builtin command can\'t replace the shell with another
redirection operators isn't allowed
- the `exec` builtin command can't replace the shell with another
process
- adding or deleting builtin commands with the `-f` and `-d` options
to the enable builtin command is forbidden
- using the `enable` builtin command to enable disabled shell builtins
doesn\'t work
- the `-p` option to the `command` builtin command doesn\'t work
doesn't work
- the `-p` option to the `command` builtin command doesn't work
- turning off restricted mode with `set +r` or `set +o restricted` is
(of course) forbidden
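A few of these restrictions can be observed directly by running throwaway commands under `bash -r` (a sketch; the exact error texts vary between versions):

```shell
bash -r -c 'cd /tmp' 2>/dev/null                 || echo "cd refused: $?"
bash -r -c '/bin/echo hi' 2>/dev/null            || echo "slash command refused: $?"
bash -r -c 'echo hi > /tmp/rsh_demo' 2>/dev/null || echo "redirection refused: $?"
```

All three commands fail with a non-zero status, while the same commands succeed in a normal shell.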


@ -39,7 +39,7 @@ For this topic, see also
`gnu_errfmt` 3.0-alpha
`force_fignore` 3.0-alpha
`failglob` 3.0-alpha
`extquote` 3.0-alpha unsure \-- verify!
`extquote` 3.0-alpha unsure -- verify!
`extdebug` 3.0-alpha
`pipefail` (for `set -o`) 3.0
`functrace` (for `set -o`) 3.0
@ -108,7 +108,7 @@ For this topic, see also
`[[...]]`: new 2.02-alpha1 KSH93
`[[...]]`: regex support (`=~`) 3.0-alpha
`[[...]]`: quotable right-hand-side of `=~` forces string matching 3.2-alpha for consistency with pattern matching
`[[...]]`: `<` and `>` operators respect locale 4.1-alpha for consistency, since 4.1-beta: ensure you have set compatiblity to \>4.0 (default)
`[[...]]`: `<` and `>` operators respect locale 4.1-alpha for consistency, since 4.1-beta: ensure you have set compatibility to >4.0 (default)
`test`/`[`/`[[`: `-v` 4.2-alpha check if a variable is set
`test`/`[`/`[[`: `-v` 4.2-alpha support array syntax to check for elements
`test`/`[`/`[[`: `-N` accepts nanoseconds 5.1-alpha
@ -154,7 +154,7 @@ For this topic, see also
`echo` `\uNNNN` and `\UNNNNNNNN` escape sequences 4.2-alpha for: `printf`, `echo -e`, `$'...'`
`exec` option `-a` to give a `argv[0]` string 4.2-alpha
`time` allowed as a command by itself to display timing values of the shell and its children 4.2-alpha POSIX change
`help` `help` now searches exact topic-strings (i.e. `help read` won\'t find `readonly` anymore) 4.3-alpha
`help` `help` now searches exact topic-strings (i.e. `help read` won't find `readonly` anymore) 4.3-alpha
`return` accept negative values as return value (e.g. `return -1` will show as (8 bit) 255 in the caller) 4.3-alpha
`exit` accept negative values as return value (e.g. `return -1` will show as (8 bit) 255 in the caller) 4.3-alpha
`read` `read` skips `NUL` (ASCII Code 0) in input 4.3-alpha
@ -163,7 +163,7 @@ For this topic, see also
`read` `read` checks first variable argument for validity before trying to read input 4.3-beta
`help` attempts substring matching (as it did through bash-4.2) if exact string matching fails 4.3-beta2
`fc` interprets option `-0` (zero) as the current command line 4.3-beta2
`cd` new option `-@` to browse a file\'s extended attributes (on systems that support `O_XATTR`) 4.3-rc1
`cd` new option `-@` to browse a file's extended attributes (on systems that support `O_XATTR`) 4.3-rc1
`kill` new option `-L` (upper case ell) to list signals like the normal lowercase option `-l` (compatibility with some standalone `kill` commands) 4.4-beta
`mapfile` new option `-d` 4.4-alpha
`wait` new option `-f` 5.0-alpha


@ -40,7 +40,7 @@ You can follow the process by using `echo` as a fake interpreter:
#!/bin/echo
We don\'t need a script body here, as the file will never be interpreted
We don't need a script body here, as the file will never be interpreted
and executed by \"`echo`\". You can see what the Operating System does,
it calls \"`/bin/echo`\" with the name of the executable file and
following arguments.
@ -49,7 +49,7 @@ following arguments.
/home/bash/bin/test testword hello
The same way, with `#!/bin/bash` the shell \"`/bin/bash`\" is called
with the script filename as an argument. It\'s the same as executing
with the script filename as an argument. It's the same as executing
\"`/bin/bash /home/bash/bin/test testword hello`\"
If the interpreter can be specified with arguments and how long it can
@ -58,17 +58,17 @@ be is system-specific (see
executes a file with a #!/bin/bash shebang, the shebang itself is
ignored, since the first character is a hashmark \"#\", which indicates
a comment. The shebang is for the operating system, not for the shell.
Programs that don\'t ignore such lines, may not work as shebang driven
Programs that don't ignore such lines, may not work as shebang driven
interpreters.
\<WRAP center round important 60%\> [**Attention:**]{.underline}When the
<WRAP center round important 60%> [**Attention:**]{.underline}When the
specified interpreter is unavailable or not executable (permissions),
you usually get a \"`bad interpreter`\" error message. If you get
nothing and it fails, check the shebang. Older Bash versions will
respond with a \"`no such file or directory`\" error for a nonexistent
interpreter specified by the shebang. \</WRAP\>
interpreter specified by the shebang. </WRAP>
**Additional note:** When you specify `#!/bin/sh` as shebang and that\'s
**Additional note:** When you specify `#!/bin/sh` as shebang and that's
a link to a Bash, then Bash will run in POSIX(r) mode! See:
- [Bash behaviour](../scripting/bashbehaviour.md).
@ -84,7 +84,7 @@ A common method is to specify a shebang like
Which one you need, or whether you think which one is good, or bad, is
up to you. There is no bulletproof portable way to specify an
interpreter. **It\'s a common misconception that it solves all problems.
interpreter. **It's a common misconception that it solves all problems.
Period.**
## The standard filedescriptors
@ -100,7 +100,7 @@ Usually, they\'re all connected to your terminal, stdin as input file
(keyboard), stdout and stderr as output files (screen). When calling
such a program, the invoking shell can change these filedescriptor
connections away from the terminal to any other file (see redirection).
Why two different output filedescriptors? It\'s convention to send error
Why two different output filedescriptors? It's convention to send error
messages and warnings to stderr and only program output to stdout. This
enables the user to decide if they want to see nothing, only the data,
only the errors, or both - and where they want to see them.
@ -117,7 +117,7 @@ redirection and piping, see:
## Variable names
It\'s good practice to use lowercase names for your variables, as shell
It's good practice to use lowercase names for your variables, as shell
and system-variable names are usually all in UPPERCASE. However, you
should avoid naming your variables any of the following (incomplete
list!):
@ -145,11 +145,11 @@ code is a number between 0 and 255. Values from 126 to 255 are reserved
for use by the shell directly, or for special purposes, like reporting a
termination by a signal:
- **126**: the requested command (file) was found, but can\'t be
- **126**: the requested command (file) was found, but can't be
executed
- **127**: command (file) not found
- **128**: according to ABS it\'s used to report an invalid argument
to the exit builtin, but I wasn\'t able to verify that in the source
- **128**: according to ABS it's used to report an invalid argument
to the exit builtin, but I wasn't able to verify that in the source
code of Bash (see code 255)
- **128 + N**: the shell was terminated by the signal N
- **255**: wrong argument to the exit builtin (see code 128)
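Some of the reserved codes can be observed directly; a sketch (the not-found command name is made up):

```shell
/etc/passwd 2>/dev/null         || echo "not executable: $?"          # 126
no_such_command_xyz 2>/dev/null || echo "not found: $?"               # 127
bash -c 'kill -s TERM $$'       || echo "terminated by SIGTERM: $?"   # 128 + 15 = 143
```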
@ -200,7 +200,7 @@ so others can check the script execution.
## Comments
In a larger, or complex script, it\'s wise to comment the code. Comments
In a larger, or complex script, it's wise to comment the code. Comments
can help with debugging or tests. Comments start with the \# character
(hashmark) and continue to the end of the line:
@ -210,7 +210,7 @@ can help with debugging or tests. Comments start with the \# character
echo "Be liberal in what you accept, and conservative in what you send" # say something
```
The first thing was already explained, it\'s the so-called shebang, for
The first thing was already explained, it's the so-called shebang, for
the shell, **only a comment**. The second one is a comment from the
beginning of the line, the third comment starts after a valid command.
All three are syntactically correct.
@ -219,8 +219,8 @@ All three syntactically correct.
To temporarily disable complete blocks of code you would normally have
to prefix every line of that block with a \# (hashmark) to make it a
comment. There\'s a little trick, using the pseudo command `:` (colon)
and input redirection. The `:` does nothing, it\'s a pseudo command, so
comment. There's a little trick, using the pseudo command `:` (colon)
and input redirection. The `:` does nothing, it's a pseudo command, so
it does not care about standard input. In the following code example,
you want to test mail and logging, but not dump the database, or execute
a shutdown:
@ -242,7 +242,7 @@ SOMEWORD
```
What happened? The `:` pseudo command was given some input by
redirection (a here-document) - the pseudo command didn\'t care about
redirection (a here-document) - the pseudo command didn't care about
it, effectively, the entire block was ignored.
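The trick in full, with the hypothetical commands from the scenario above stubbed in:

```shell
echo "test the email functions"
: <<'SOMEWORD'
dump_database        # hypothetical commands we want to skip while testing
shutdown -h now
SOMEWORD
echo "test the logging"
```

Only the two `echo` lines run; everything between the quoted tag lines is fed to `:` and discarded.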
The here-document-tag was quoted here **to avoid substitutions** in the
@ -278,12 +278,12 @@ program\".
**[Attention:]{.underline}** When you set variables in a child process,
for example a *subshell*, they will be set there, but you will **never**
have access to them outside of that subshell. One way to create a
subshell is the pipe. It\'s all mentioned in a small article about [Bash
subshell is the pipe. It's all mentioned in a small article about [Bash
in the processtree](../scripting/processtree.md)!
### Local variables
Bash provides ways to make a variable\'s scope *local* to a function:
Bash provides ways to make a variable's scope *local* to a function:
- Using the `local` keyword, or
- Using `declare` (which will *detect* when it was called from within
@ -327,7 +327,7 @@ echo $foo
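A minimal sketch of the `local` keyword mentioned above:

```shell
f() {
    local foo=inner      # visible only inside f
    echo "inside:  $foo"
}
foo=outer
f
echo "outside: $foo"     # the global value is untouched
```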
### Environment variables
The environment space is not directly related to the topic about scope,
but it\'s worth mentioning.
but it's worth mentioning.
Every UNIX(r) process has a so-called *environment*. Other items, in
addition to variables, are saved there, the so-called *environment


@ -9,16 +9,16 @@ but as hints and comments about debugging a Bash script.
Do **not** name your script `test`, for example! *Why?* `test` is the
name of a UNIX(r)-command, and [most likely built into your
shell]{.underline} (it\'s a built-in in Bash) - so you won\'t be able to
shell]{.underline} (it's a built-in in Bash) - so you won't be able to
run a script with the name `test` in a normal way.
**Don\'t laugh!** This is a classic mistake :-)
**Don't laugh!** This is a classic mistake :-)
## Read the error messages
Many people come into IRC and ask something like *\"Why does my script
fail? I get an error!\"*. And when you ask them what the error message
is, they don\'t even know. Beautiful.
is, they don't even know. Beautiful.
Reading and interpreting error messages is 50% of your job as debugger!
Error messages actually **mean** something. At the very least, they can
@ -41,7 +41,7 @@ From my personal experience, I can suggest `vim` or `GNU emacs`.
## Write logfiles
For more complex scripts, it\'s useful to write to a log file, or to the
For more complex scripts, it's useful to write to a log file, or to the
system log. Nobody can debug your script without knowing what actually
happened and what went wrong.
@ -61,7 +61,7 @@ literal quotes, to see leading and trailing spaces!
pid=$(< fooservice.pid)
echo "DEBUG: read from file: pid=\"$pid\"" >&2
Bash\'s [printf](../commands/builtin/printf.md) command has the `%q` format,
Bash's [printf](../commands/builtin/printf.md) command has the `%q` format,
which is handy for verifying whether strings are what they appear to be.
foo=$(< inputfile)
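For instance, `%q` makes otherwise invisible characters obvious (a sketch; the trailing space in `foo` is deliberate):

```bash
#!/bin/bash
foo='bar '                            # note the trailing space

echo "DEBUG: foo=\"$foo\"" >&2        # the space is easy to miss here
printf 'DEBUG: foo=%q\n' "$foo" >&2   # %q escapes it: foo=bar\ 
```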
@ -97,20 +97,20 @@ There are two useful debug outputs for that task (both are written to
### Simple example of how to interpret xtrace output
Here\'s a simple command (a string comparison using the [classic test
Here's a simple command (a string comparison using the [classic test
command](../commands/classictest.md)) executed while in `set -x` mode:
set -x
foo="bar baz"
[ $foo = test ]
That fails. Why? Let\'s see the `xtrace` output:
That fails. Why? Let's see the `xtrace` output:
+ '[' bar baz = test ']'
And now you see that it\'s (\"bar\" and \"baz\") recognized as two
And now you see that it's ("bar" and "baz") recognized as two
separate words (which you would have realized if you READ THE ERROR
MESSAGES ;) ). Let\'s check it\...
MESSAGES ;) ). Let's check it...
# next try
[ "$foo" = test ]
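Running the corrected version under `set -x` shows the difference (a sketch; the exact trace formatting varies between Bash versions):

```bash
#!/bin/bash
set -x
foo="bar baz"
[ "$foo" = test ]   # traced as: + '[' 'bar baz' = test ']' - one word now
# The comparison is still false, but it no longer errors out.
```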
@ -218,7 +218,7 @@ This can be wrapped in a shell function for more readable code.
script.sh: line 100: syntax error: unexpected end of file
Usually indicates exactly what it says: An unexpected end of file. It\'s
Usually indicates exactly what it says: An unexpected end of file. It's
unexpected because Bash waits for the closing of a [compound
command](../syntax/ccmd/intro.md):
@ -288,7 +288,7 @@ definition invalid.
### What is the CRLF issue?
There\'s a big difference in the way that UNIX(r) and Microsoft(r) (and
There's a big difference in the way that UNIX(r) and Microsoft(r) (and
possibly others) handle the **line endings** of plain text files. The
difference lies in the use of the CR (Carriage Return) and LF (Line
Feed) characters.
@ -298,7 +298,7 @@ Feed) characters.
Keep in mind your script is a **plain text file**, and the `CR`
character means nothing special to UNIX(r) - it is treated like any
other character. If it\'s printed to your terminal, a carriage return
other character. If it's printed to your terminal, a carriage return
will effectively place the cursor at the beginning of the *current*
line. This can cause much confusion and many headaches, since lines
containing CRs are not what they appear to be when printed. In summary,
@ -310,7 +310,7 @@ Some possible sources of CRs:
- a DOS/Windows text editor
- a UNIX(r) text editor that is "too smart" when determining the
file content type (and thinks \"*it\'s a DOS text file*\")
file content type (and thinks "*it's a DOS text file*")
- a direct copy and paste from certain webpages (some pastebins are
known for this)
@ -327,13 +327,13 @@ carriage return character!):
echo "Hello world"^M
...
Here\'s what happens because of the `#!/bin/bash^M` in our shebang:
Here's what happens because of the `#!/bin/bash^M` in our shebang:
- the file `/bin/bash^M` doesn\'t exist (hopefully)
- the file `/bin/bash^M` doesn't exist (hopefully)
- So Bash prints an error message which (depending on the terminal,
the Bash version, or custom patches!) may or may not expose the
problem.
- the script can\'t be executed
- the script can't be executed
The error message can vary. If you're lucky, you'll get:
@ -347,9 +347,9 @@ Why? Because when printed literally, the `^M` makes the cursor go back
to the beginning of the line. The whole error message is *printed*, but
you *see* only part of it!
\<note warning\> It\'s easy to imagine the `^M` is bad in other places
<note warning> It's easy to imagine the `^M` is bad in other places
too. If you get weird and illogical messages from your script, rule out
the possibility that`^M` is involved. Find and eliminate it! \</note\>
the possibility that`^M` is involved. Find and eliminate it! </note>
### How can I find and eliminate them?
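One way (a sketch; `script.sh` stands for your file) is to detect CRs with `grep` and strip them with `tr`:

```bash
# Does the file contain carriage return characters at all?
grep -q $'\r' script.sh && echo "CRLF line endings found"

# Write a cleaned copy without CRs, then replace the original:
tr -d '\r' < script.sh > script.clean && mv script.clean script.sh
```

Dedicated converters like `dos2unix` do the same job, where available.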

View File

@ -28,12 +28,12 @@ See also:
- [Bash startup mode: SH mode](../scripting/bashbehaviour.md#sh_mode)
- [Bash run mode: POSIX mode](../scripting/bashbehaviour.md#posix_run_mode)
### Your script named \"test\" doesn\'t execute
### Your script named "test" doesn't execute
Give it another name. The executable `test` already exists.
In Bash it\'s a builtin. With other shells, it might be an executable
file. Either way, it\'s bad name choice!
In Bash it's a builtin. With other shells, it might be an executable
file. Either way, it's a bad name choice!
Workaround: You can call it using the pathname:
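For instance (assuming the script sits in the current directory and is executable):

```bash
chmod +x ./test   # make it executable once
./test            # the explicit path bypasses the builtin lookup
```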
@ -59,7 +59,7 @@ text formed by the prefix, the postfix and the braces themselves are
generated. In the example, these are only two: `-i*.vob` and `-i`. The
filename expansion happens **after** that, so there is a chance that
`-i*.vob` is expanded to a filename - if you have files like
`-ihello.vob`. But it definitely doesn\'t do what you expected.
`-ihello.vob`. But it definitely doesn't do what you expected.
Please see:
@ -90,7 +90,7 @@ variable! Bash is not PHP!
A variable name preceded with a dollar-sign always means that the
variable gets **expanded**. In the example above, it might expand to
nothing (because it wasn\'t set), effectively resulting in\...
nothing (because it wasn't set), effectively resulting in...
="Hello world!"
@ -135,14 +135,14 @@ assigned value**:
### Expanding (using) variables
A typical beginner\'s trap is quoting.
A typical beginner's trap is quoting.
As noted above, when you want to **expand** a variable i.e. "get the
content", the variable name needs to be prefixed with a dollar-sign.
But, since Bash knows various ways to quote and does word-splitting, the
result isn\'t always the same.
result isn't always the same.
Let\'s define an example variable containing text with spaces:
Let's define an example variable containing text with spaces:
example="Hello world"
@ -195,7 +195,7 @@ process you execute to start that script `./script.sh`):
Exporting is one-way. The direction is from parent process to child
process, not the reverse. The above example **will** work, when you
don\'t execute the script, but include (\"source\") it:
don't execute the script, but include ("source") it:
$ source ./script.sh
$ echo $hello
@ -213,7 +213,7 @@ Please see:
### Reacting to exit codes
If you just want to react to an exit code, regardless of its specific
value, you **don\'t need** to use `$?` in a test command like this:
value, you **don't need** to use `$?` in a test command like this:
``` bash
grep ^root: /etc/passwd >/dev/null 2>&1
@ -237,8 +237,8 @@ Or, simpler yet:
grep ^root: /etc/passwd >/dev/null 2>&1 || echo "root was not found - check the pub at the corner"
```
If you need the specific value of `$?`, there\'s no other choice. But if
you need only a \"true/false\" exit indication, there\'s no need for
If you need the specific value of `$?`, there's no other choice. But if
you need only a "true/false" exit indication, there's no need for
`$?`.
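Written as a plain `if`, the same check needs no `$?` at all (a sketch reusing the `grep` example from above):

```bash
if grep -q '^root:' /etc/passwd; then
    echo "root was found"
else
    echo "root was not found - check the pub at the corner"
fi
```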
See also:
@ -247,11 +247,11 @@ See also:
### Output vs. Return Value
It\'s important to remember the different ways to run a child command,
It's important to remember the different ways to run a child command,
and whether you want the output, the return value, or neither.
When you want to run a command (or a pipeline) and save (or print) the
**output**, whether as a string or an array, you use Bash\'s
**output**, whether as a string or an array, you use Bash's
`$(command)` syntax:
$(ls -l /tmp)

View File

@ -22,9 +22,9 @@ In these cases the portable syntax should be preferred.
`command\ <<<\ WORD` `command <<MARKER WORD MARKER` a here-string, a special form of the here-document, avoid it in portable scripts! POSIX(r)
`export VAR=VALUE` `VAR=VALUE export VAR` Though POSIX(r) allows it, some shells don\'t want the assignment and the exporting in one command POSIX(r), zsh, ksh, \...
`export VAR=VALUE` `VAR=VALUE export VAR` Though POSIX(r) allows it, some shells don't want the assignment and the exporting in one command POSIX(r), zsh, ksh, ...
`(( MATH ))` `: $(( MATH ))` POSIX(r) does\'t define an arithmetic compund command, many shells don\'t know it. Using the pseudo-command `:` and the arithmetic expansion `$(( ))` is a kind of workaround here. **Attention:** Not all shell support assignment like `$(( a = 1 + 1 ))`! Also see below for a probably more portable solution. all POSIX(r) compatible shells
`(( MATH ))` `: $(( MATH ))` POSIX(r) doesn't define an arithmetic compound command, many shells don't know it. Using the pseudo-command `:` and the arithmetic expansion `$(( ))` is a kind of workaround here. **Attention:** Not all shells support assignment like `$(( a = 1 + 1 ))`! Also see below for a probably more portable solution. all POSIX(r) compatible shells
`[[\ EXPRESSION\ ]]` `[ EXPRESSION ]`\ The Bashish test keyword is reserved by POSIX(r), but not defined. Use the old fashioned way with the `test` command. See [the classic test command](../commands/classictest.md) POSIX(r) and others
or\
@ -38,9 +38,9 @@ In these cases the portable syntax should be preferred.
## Portability rationale
Here is some assorted portability information. Take it as a small guide
to make your scripts a bit more portable. It\'s not complete (it never
will be!) and it\'s not very detailed (e.g. you won\'t find information
about how which shell technically forks off which subshell). It\'s just
to make your scripts a bit more portable. It's not complete (it never
will be!) and it's not very detailed (e.g. you won't find information
about how which shell technically forks off which subshell). It's just
an assorted small set of portability guidelines. *-Thebonsai*
FIXME UNIX shell gurus out there, please be patient with a newbie like
@ -54,17 +54,17 @@ there are two possibilities:
The *new value* is seen by subsequent programs
- without any special action (e.g. Bash)
- only after an explicit export with `export VARIABLE` (e.g. Sun\'s
- only after an explicit export with `export VARIABLE` (e.g. Sun's
`/bin/sh`)
Since an extra `export` doesn\'t hurt, the safest and most portable way
Since an extra `export` doesn't hurt, the safest and most portable way
is to always (re-)export a changed variable if you want it to be seen by
subsequent processes.
### Arithmetics
Bash has a special compound command to do arithmetic without expansion.
However, POSIX has no such command. In the table at the top, there\'s
However, POSIX has no such command. In the table at the top, there's
the `: $((MATH))` construct mentioned as possible alternative. Regarding
the exit code, a 100% equivalent construct would be:
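A sketch of such an exit-code-equivalent construct (the expression `3 == 3` is just an example):

```bash
# Bash's compound command: exit status 0 because the
# expression evaluates to a non-zero value
(( 3 == 3 )); echo $?              # 0

# POSIX construct with the same exit-status behaviour,
# built from the test command and arithmetic expansion
[ "$((3 == 3))" -ne 0 ]; echo $?   # 0
```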
@ -84,8 +84,8 @@ aritrhmetic expansions, so the most portable is *with quotes*.
The overall problem with `echo` is, that there are 2 (maybe more)
mainstream flavours around. The only case where you may safely use an
`echo` on all systems is: Echoing non-variable arguments that don\'t
start with a `-` (dash) and don\'t contain a `\` (backslash).
`echo` on all systems is: Echoing non-variable arguments that don't
start with a `-` (dash) and don't contain a `\` (backslash).
Why? (list of known behaviours)
@ -113,7 +113,7 @@ existance of [the `printf` command](../commands/builtin/printf.md).
#### PWD
[PWD](../syntax/shellvars.md#PWD) is POSIX but not Bourne. Most shells are
*not POSIX* in that they don\'t ignore the value of the `PWD`
*not POSIX* in that they don't ignore the value of the `PWD`
environment variable. Workaround to fix the value of `PWD` at the start
of your script:
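One possible sketch of such a fix uses `pwd -P`, which asks the system for the physical current directory instead of trusting the inherited value:

```bash
# Re-derive PWD from the real current directory:
PWD=$(pwd -P)
export PWD
```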
@ -153,11 +153,11 @@ Find another method.
### Check for a command in PATH
The [PATH](../syntax/shellvars.md#PATH) variable is a colon-delimited list of
directory names, so it\'s basically possible to run a loop and check
directory names, so it's basically possible to run a loop and check
every `PATH` component for the command you're looking for and for
executability.
However, this method doesn\'t look nice. There are other ways of doing
However, this method doesn't look nice. There are other ways of doing
this, using commands that are *not directly* related to this task.
#### hash
@ -183,7 +183,7 @@ Somewhat of a mass-check:
fi
done
Here (bash 3), `hash` also respects builtin commands. I don\'t know if
Here (bash 3), `hash` also respects builtin commands. I don't know if
this works everywhere, but it seems logical.
#### command
@ -199,6 +199,6 @@ accessible by `PATH`:
echo "sed is available"
fi
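The full pattern presumably looks like this (a sketch using `command -v`; `sed` is just the example command):

```bash
if command -v sed >/dev/null 2>&1; then
    echo "sed is available"
fi
```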
[^1]: \"portable\" doesn\'t necessarily mean it\'s POSIX, it can also
mean it\'s \"widely used and accepted\", and thus maybe more
[^1]: "portable" doesn't necessarily mean it's POSIX, it can also
mean it's "widely used and accepted", and thus maybe more
portable than POSIX(r)

View File

@ -18,11 +18,11 @@ of POSIX, and some may be incompatible with POSIX.
Syntax Replacement Description
---------------------------------- --------------------------------------------------------- --------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
`&>FILE` and `>&FILE` `>FILE 2>&1` This redirection syntax is short for `>FILE 2>&1` and originates in the C Shell. The latter form is especially uncommon and should never be used, and the explicit form using separate redirections is preferred over both. These shortcuts contribute to confusion about the copy descriptor because the syntax is unclear. They also introduce parsing ambiguity, and conflict with POSIX. Shells without this feature treat `cmd1 &>file cmd2` as: \"background `cmd1` and then execute `cmd2` with its stdout redirected to `file`\", which is the correct interpretation of this expression. See: [redirection](../syntax/redirection.md) \`\$ { bash; dash \</dev/fd/0; } \<\<\<\'echo foo\>/dev/null&\>/dev/fd/2 echo bar\' foo echo bar bar\`
`&>FILE` and `>&FILE` `>FILE 2>&1` This redirection syntax is short for `>FILE 2>&1` and originates in the C Shell. The latter form is especially uncommon and should never be used, and the explicit form using separate redirections is preferred over both. These shortcuts contribute to confusion about the copy descriptor because the syntax is unclear. They also introduce parsing ambiguity, and conflict with POSIX. Shells without this feature treat `cmd1 &>file cmd2` as: "background `cmd1` and then execute `cmd2` with its stdout redirected to `file`", which is the correct interpretation of this expression. See: [redirection](../syntax/redirection.md) `$ { bash; dash </dev/fd/0; } <<<'echo foo>/dev/null&>/dev/fd/2 echo bar' foo echo bar bar`
`$[EXPRESSION]` `$((EXPRESSION))` This undocumented syntax is completely replaced by the POSIX-conforming arithmetic expansion `$((EXPRESSION))`. It is unimplemented almost everywhere except Bash and Zsh. See [arithmetic expansion](../syntax/expansion/arith.md). [Some discussion](http://lists.gnu.org/archive/html/bug-bash/2012-04/msg00034.html).
`COMMAND\ |&\ COMMAND` `COMMAND 2>&1 | COMMAND` This is an alternate pipeline operator derived from Zsh. Officially, it is not considered deprecated by Bash, but I highly discourage it. It conflicts with the list operator used for [coprocess](../syntax/keywords/coproc.md) creation in most Korn shells. It also has confusing behavior. The stdout is redirected first like an ordinary pipe, while the stderr is actually redirected last \-- after other redirects preceding the pipe operator. Overall, it\'s pointless syntax bloat. Use an explicit redirect instead.
`function\ NAME()\ COMPOUND-CMD` `NAME()\ COMPOUND-CMD` or `function\ NAME\ {\ CMDS;\ }` This is an amalgamation between the Korn and POSIX style function definitions - using both the `function` keyword and parentheses. It has no useful purpose and no historical basis or reason to exist. It is not specified by POSIX. It is accepted by Bash, mksh, zsh, and perhaps some other Korn shells, where it is treated as identical to the POSIX-style function. It is not accepted by AT&T ksh. It should never be used. See the next table for the `function` keyword. Bash doesn\'t have this feature documented as expressly deprecated.
`for x; { ...;}` `do`, `done`, `in`, `esac`, etc. This undocumented syntax replaces the `do` and `done` reserved words with braces. Many Korn shells support various permutations on this syntax for certain compound commands like `for`, `case`, and `while`. Which ones and certain details like whether a newline or semicolon are required vary. Only `for` works in Bash. Needless to say, don\'t use it.
`COMMAND\ |&\ COMMAND` `COMMAND 2>&1 | COMMAND` This is an alternate pipeline operator derived from Zsh. Officially, it is not considered deprecated by Bash, but I highly discourage it. It conflicts with the list operator used for [coprocess](../syntax/keywords/coproc.md) creation in most Korn shells. It also has confusing behavior. The stdout is redirected first like an ordinary pipe, while the stderr is actually redirected last -- after other redirects preceding the pipe operator. Overall, it's pointless syntax bloat. Use an explicit redirect instead.
`function\ NAME()\ COMPOUND-CMD` `NAME()\ COMPOUND-CMD` or `function\ NAME\ {\ CMDS;\ }` This is an amalgamation between the Korn and POSIX style function definitions - using both the `function` keyword and parentheses. It has no useful purpose and no historical basis or reason to exist. It is not specified by POSIX. It is accepted by Bash, mksh, zsh, and perhaps some other Korn shells, where it is treated as identical to the POSIX-style function. It is not accepted by AT&T ksh. It should never be used. See the next table for the `function` keyword. Bash doesn't have this feature documented as expressly deprecated.
`for x; { ...;}` `do`, `done`, `in`, `esac`, etc. This undocumented syntax replaces the `do` and `done` reserved words with braces. Many Korn shells support various permutations on this syntax for certain compound commands like `for`, `case`, and `while`. Which ones and certain details like whether a newline or semicolon are required vary. Only `for` works in Bash. Needless to say, don't use it.
This table lists syntax that is specified by POSIX (unless otherwise
specified below), but has been superseded by superior alternatives
@ -34,21 +34,21 @@ historical reasons.
-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Syntax Replacement Description
----------------------------------------------------------------------------------------------------- ----------------------------------------------------------------------------------------------------------------------------------------------------- -----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Unquoted expansions, [wordsplit](../syntax/expansion/wordsplit.md), and [globs](../syntax/expansion/globs.md) [Proper quoting](http://mywiki.wooledge.org/Quotes), Ksh/Bash-style [arrays](../syntax/arrays.md), The \"\$@\" expansion, [read](../commands/builtin/read.md) *Quoting errors* are a broad category of common mistakes brought about by a few unintuitive features carried over from the Bourne shell due to complaints of broken scripts and changes in previously documented behavior. Most of the important expansions are performed at the same time from left to right. However, a few expansions, most notably word-splitting and globbing, and in shells other than Bash, [brace expansion](../syntax/expansion/brace.md), are performed **on the results of previous expansions, by default, unless they are quoted.** This means that the act of expanding an unquoted variable in an ordinary argument context, depending on the value of the variable, can yield different results depending on possibly uncontrolled side-effects like the value of `IFS`, and the names of files in the current working directory. You can\'t get globbing without word-splitting, or vice versa (without `set -f`). [You can\'t store a command or character-delimited list in a variable and safely evaluate it with unquoted expansion](http://mywiki.wooledge.org/BashFAQ/050). If possible, always choose a shell that supports Korn shell arrays such as Bash. They are a vital but non-standard feature for writing clean, safe scripts. Well-written scripts don\'t use word-splitting. A few exceptions are listed on the [word splitting page](../syntax/expansion/wordsplit.md). A significant proportion of the issues on the famous [Pitfalls list](http://mywiki.wooledge.org/BashPitfalls) fall under this category. See also: *[Don\'t read lines with for!](http://mywiki.wooledge.org/DontReadLinesWithFor)*
Unquoted expansions, [wordsplit](../syntax/expansion/wordsplit.md), and [globs](../syntax/expansion/globs.md) [Proper quoting](http://mywiki.wooledge.org/Quotes), Ksh/Bash-style [arrays](../syntax/arrays.md), The "$@" expansion, [read](../commands/builtin/read.md) *Quoting errors* are a broad category of common mistakes brought about by a few unintuitive features carried over from the Bourne shell due to complaints of broken scripts and changes in previously documented behavior. Most of the important expansions are performed at the same time from left to right. However, a few expansions, most notably word-splitting and globbing, and in shells other than Bash, [brace expansion](../syntax/expansion/brace.md), are performed **on the results of previous expansions, by default, unless they are quoted.** This means that the act of expanding an unquoted variable in an ordinary argument context, depending on the value of the variable, can yield different results depending on possibly uncontrolled side-effects like the value of `IFS`, and the names of files in the current working directory. You can't get globbing without word-splitting, or vice versa (without `set -f`). [You can't store a command or character-delimited list in a variable and safely evaluate it with unquoted expansion](http://mywiki.wooledge.org/BashFAQ/050). If possible, always choose a shell that supports Korn shell arrays such as Bash. They are a vital but non-standard feature for writing clean, safe scripts. Well-written scripts don't use word-splitting. A few exceptions are listed on the [word splitting page](../syntax/expansion/wordsplit.md). A significant proportion of the issues on the famous [Pitfalls list](http://mywiki.wooledge.org/BashPitfalls) fall under this category. See also: *[Don't read lines with for!](http://mywiki.wooledge.org/DontReadLinesWithFor)*
`` `COMMANDS` `` `$(COMMANDS)` This is the older Bourne-compatible form of the [command substitution](../syntax/expansion/cmdsubst.md). Both the `` `COMMANDS` `` and `$(COMMANDS)` syntaxes are specified by POSIX, but the latter is [greatly]{.underline} preferred, though the former is unfortunately still very prevalent in scripts. New-style command substitutions are widely implemented by every modern shell (and then some). The only reason for using backticks is for compatibility with a real Bourne shell (like Heirloom). Backtick command substitutions require special escaping when nested, and examples found in the wild are improperly quoted more often than not. See: *[Why is \$(\...) preferred over \`\...\` (backticks)?](http://mywiki.wooledge.org/BashFAQ/082)*.
`[\ EXPRESSION\ ]`\\ and\\ `test\ EXPRESSION` `[[\ EXPRESSION\ ]]` `test` and `[` are the Bourne/POSIX commands for evaluating test expressions (they are almost identical, and `[` is somewhat more common). The expressions consist of regular arguments, unlike the Ksh/Bash `[[` command. While the issue is analogous to `let` vs `((`, the advantages of `[[` vs `[` are even more important because the arguments/expansions aren\'t just concatenated into one expression. With the classic `[` command, the number of arguments is significant. If at all possible, use the [conditional expression](../syntax/ccmd/conditional_expression.md) (\"new test command\") `[[ EXPRESSION ]]`. Unless there is a need for POSIX compatibility, there are only a few reasons to use `[`. `[[` is one of the most portable and consistent non-POSIX ksh extensions available. See: [conditional_expression](../syntax/ccmd/conditional_expression.md) and *[What is the difference between test, \[ and \[\[ ?](http://mywiki.wooledge.org/BashFAQ/031)*
`[\ EXPRESSION\ ]`\\ and\\ `test\ EXPRESSION` `[[\ EXPRESSION\ ]]` `test` and `[` are the Bourne/POSIX commands for evaluating test expressions (they are almost identical, and `[` is somewhat more common). The expressions consist of regular arguments, unlike the Ksh/Bash `[[` command. While the issue is analogous to `let` vs `((`, the advantages of `[[` vs `[` are even more important because the arguments/expansions aren't just concatenated into one expression. With the classic `[` command, the number of arguments is significant. If at all possible, use the [conditional expression](../syntax/ccmd/conditional_expression.md) ("new test command") `[[ EXPRESSION ]]`. Unless there is a need for POSIX compatibility, there are only a few reasons to use `[`. `[[` is one of the most portable and consistent non-POSIX ksh extensions available. See: [conditional_expression](../syntax/ccmd/conditional_expression.md) and *[What is the difference between test, \[ and \[\[ ?](http://mywiki.wooledge.org/BashFAQ/031)*
`set -e`, `set -o errexit`\ proper control flow and error handling `set -e` causes untested non-zero exit statuses to be fatal. It is a debugging feature intended for use only during development and should not be used in production code, especially init scripts and other high-availability scripts. Do not be tempted to think of this as \"error handling\"; it\'s not, it\'s just a way to find the place you\'ve *forgotten* to put error handling.\
`set -e`, `set -o errexit`\ proper control flow and error handling `set -e` causes untested non-zero exit statuses to be fatal. It is a debugging feature intended for use only during development and should not be used in production code, especially init scripts and other high-availability scripts. Do not be tempted to think of this as "error handling"; it's not, it's just a way to find the place you've *forgotten* to put error handling.\
and the `ERR` trap Think of it as akin to `use strict` in Perl or `throws` in C++: tough love that makes you write better code. Many guides recommend avoiding it entirely because of the apparently-complex rules for when non-zero statuses cause the script to abort. Conversely, large software projects with experienced coders may recommend or even mandate its use.\
Because it provides no notification of the location of the error, it\'s more useful combined with `set -x` or the `DEBUG` trap and other Bash debug features, and both flags are normally better set on the command line rather than within the script itself.\
Because it provides no notification of the location of the error, it's more useful combined with `set -x` or the `DEBUG` trap and other Bash debug features, and both flags are normally better set on the command line rather than within the script itself.\
Most of this also applies to the `ERR` trap, though I've seen it used in a few places in shells that lack `pipefail` or `PIPESTATUS`. The `ERR` trap is not POSIX, but `set -e` is. `failglob` is another Bash feature that falls into this category (mainly useful for debugging).\
**The `set -e` feature generates more questions and false bug reports on the Bash mailing list than all other features combined!** Please do not rely on `set -e` for logic in scripts. If you still refuse to take this advice, make sure you understand **exactly** how it works. See: *[Why doesn\'t set -e (or set -o errexit, or trap ERR) do what I expected?](http://mywiki.wooledge.org/BashFAQ/105)* and <http://www.fvue.nl/wiki/Bash:_Error_handling>
**The `set -e` feature generates more questions and false bug reports on the Bash mailing list than all other features combined!** Please do not rely on `set -e` for logic in scripts. If you still refuse to take this advice, make sure you understand **exactly** how it works. See: *[Why doesn't set -e (or set -o errexit, or trap ERR) do what I expected?](http://mywiki.wooledge.org/BashFAQ/105)* and <http://www.fvue.nl/wiki/Bash:_Error_handling>
`set -u` or `set -o nounset` Proper control flow and error handling `set -u` causes attempts to expand unset variables or parameters as fatal errors. Like `set -e`, it bypasses control flow and exits immediately from the current shell environment. Like non-zero statuses, unset variables are a normal part of most non-trivial shell scripts. Living with `set -u` requires hacks like `${1+"$1"}` for each expansion that might possibly be unset. Only very current shells guarantee that expanding `@` or `*` won\'t trigger an error when no parameters are set (<http://austingroupbugs.net/view.php?id=155>, <http://www.in-ulm.de/~mascheck/various/bourne_args/>). Apparently some find it useful for debugging. See *[How do I determine whether a variable is already defined? Or a function?](http://mywiki.wooledge.org/BashFAQ/083)* for how to properly test for defined variables. Don\'t use `set -u`.
`set -u` or `set -o nounset` Proper control flow and error handling `set -u` causes attempts to expand unset variables or parameters as fatal errors. Like `set -e`, it bypasses control flow and exits immediately from the current shell environment. Like non-zero statuses, unset variables are a normal part of most non-trivial shell scripts. Living with `set -u` requires hacks like `${1+"$1"}` for each expansion that might possibly be unset. Only very current shells guarantee that expanding `@` or `*` won't trigger an error when no parameters are set (<http://austingroupbugs.net/view.php?id=155>, <http://www.in-ulm.de/~mascheck/various/bourne_args/>). Apparently some find it useful for debugging. See *[How do I determine whether a variable is already defined? Or a function?](http://mywiki.wooledge.org/BashFAQ/083)* for how to properly test for defined variables. Don't use `set -u`.
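A sketch of what living with `nounset` looks like (`undefined_var` and `count_args` are made-up names for illustration):

```bash
#!/usr/bin/env bash
set -u

# A bare $undefined_var would abort the script here, so every
# possibly-unset expansion needs a default ...
echo "${undefined_var-fallback}"   # prints "fallback" instead of dying

# ... or the classic ${1+"$1"} hack for positional parameters:
count_args() { echo "$#"; }
set --                             # no positional parameters at all
count_args ${1+"$1"}               # prints "0": expands to no word when $1 is unset
```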
`${var?msg}` or `${var:?msg}` Proper control flow and error handling Like `set -u`, this expansion causes a fatal error which immediately exits the current shell environment if the given parameter is unset or is null. It prints the error message given, to the right of the operator. If a value is expected and you\'d like to create an assertion or cause errors, it\'s better to test for undefined variables using one of [these techniques](http://mywiki.wooledge.org/BashFAQ/083) and handle the error manually, or call a `die` function. This expansion is defined by POSIX. It\'s better than `set -u`, because it\'s explicit, but not by much. It also allows you to accidentally construct hilariously deceptive error messages: `bash -c 'f() { definitely_not_printf "${printf:?"$1" - No such option}"; }; f -v'
`${var?msg}` or `${var:?msg}` Proper control flow and error handling Like `set -u`, this expansion causes a fatal error which immediately exits the current shell environment if the given parameter is unset or is null. It prints the error message given, to the right of the operator. If a value is expected and you'd like to create an assertion or cause errors, it's better to test for undefined variables using one of [these techniques](http://mywiki.wooledge.org/BashFAQ/083) and handle the error manually, or call a `die` function. This expansion is defined by POSIX. It's better than `set -u`, because it's explicit, but not by much. It also allows you to accidentally construct hilariously deceptive error messages: `bash -c 'f() { definitely_not_printf "${printf:?"$1" - No such option}"; }; f -v'
bash: printf: -v - No such option`
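For comparison, a sketch of the expansion next to an explicit test (`CONFIG_FILE` and `die` are illustrative names, not part of any standard):

```bash
#!/usr/bin/env bash
die() { printf 'error: %s\n' "$*" >&2; exit 1; }

# The expansion aborts a non-interactive shell and prints its message:
msg=$(bash -c ': "${CONFIG_FILE:?CONFIG_FILE must be set}"' 2>&1) || true
echo "$msg"    # something like: bash: CONFIG_FILE: CONFIG_FILE must be set

# The explicit alternative: test first, then fail along a clear path.
( [ -n "${CONFIG_FILE-}" ] || die "CONFIG_FILE must be set" ) || true
```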
-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
@ -59,22 +59,22 @@ portability requirements, or in order to make use of some subtle
behavioral differences. These are frequently (mis)used for no reason.
Writing portable scripts that go outside of POSIX features requires
knowing how to account for many (often undocumented) differences across
many shells. If you do happen to know what you\'re doing, don\'t be too
many shells. If you do happen to know what you're doing, don't be too
surprised if you run across someone telling you not to use these.
Syntax Replacement Description
------------------------------- --------------------------------------------------------------------- ----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
`function\ NAME\ {\ CMDS;\ }` `NAME()\ COMPOUND-CMD` This is the ksh form of function definition created to extend the Bourne and POSIX form with modified behaviors and additional features like local variables. The idea was for new-style functions to be analogous to regular builtins with their own environment and scope, while POSIX-style functions are more like special builtins. `function` is supported by almost every ksh-derived shell including Bash and Zsh, but isn\'t specified by POSIX. Bash treats all function styles the same, but this is unusual. `function` has some preferable characteristics in many ksh variants, making it more portable for scripts that use non-POSIX extensions by some measures. If you\'re going to use the `function` keyword, it implies that you\'re either targeting Ksh specifically, or that you have detailed knowledge of how to compensate for differences across shells. It should always be used consistently with `typeset`, but never used with `declare` or `local`. Also in ksh93, the braces are not a [command group](../syntax/ccmd/grouping_plain.md), but a required part of the syntax (unlike Bash and others). See [shell function definitions](../syntax/basicgrammar.md#shell_function_definitions)
`typeset` `declare`, `local`, `export`, `readonly` This is closely related to the above, and should often be used together. `typeset` exists primarily for `ksh` compatibility, but is marked as \"deprecated\" in Bash (though I don\'t entirely agree with this). This makes some sense, because future compatibility can\'t be guaranteed, and any compatibility at all, requires understanding the non-POSIX features of other shells and their differences. Using `declare` instead of `typeset` emphasizes your intention to be \"Bash-only\", and definitely breaks everywhere else (except possibly zsh if you\'re lucky). The issue is further complicated by Dash and the [Debian policy](http://www.debian.org/doc/debian-policy/ch-files.html#s-scripts) requirement for a `local` builtin, which is itself not entirely compatible with Bash and other shells.
\'\'let \'EXPR\' \'\' `((EXPR))` or `[\ $((EXPR))\ -ne\ 0 ]` `let` is the \"simple command\" variant of arithmetic evaluation command, which takes regular arguments. Both `let` and `((expr))` were present in ksh88, and everything that supports one should support the other. Neither are POSIX. The compound variant is preferable because it doesn\'t take regular arguments for [wordsplitting](../syntax/expansion/wordsplit.md) and [globbing](../syntax/expansion/globs.md), which makes it safer and clearer. It is also usually faster, especially in Bash, where compound commands are typically significantly faster. Some of the (few) reasons for using `let` are detailed on the [let](../commands/builtin/let.md) page. See [arithmetic evaluation compound command](../syntax/ccmd/arithmetic_eval.md)
`eval` Depends. Often code can be restructured to use better alternatives. `eval` is thrown in here for good measure, as sadly it is so often misused that any use of `eval` (even the rare clever one) is immediately dismissed as wrong by experts, and among the most immediate solutions abused by beginners. In reality, there are correct ways to use `eval`, and even cases in which it\'s necessary, even in sophisticated shells like Bash and Ksh. `eval` is unusual in that it is less frequently appropriate in more feature-rich shells than in more minimal shells like Dash, where it is used to compensate for more limitations. If you find yourself needing `eval` too frequently, it might be a sign that you\'re either better off using a different language entirely, or trying to borrow an idiom from some other paradigm that isn\'t well suited to the shell language. By the same token, there are some cases in which working too hard to avoid `eval` ends up adding a lot of complexity and sacrificing all portability. Don\'t substitute a clever `eval` for something that\'s a bit \"too clever\", just to avoid the `eval`, yet, take reasonable measures to avoid it where it is sensible to do so. See: [eval](../commands/builtin/eval.md) and [Eval command and security issues](http://mywiki.wooledge.org/BashFAQ/048).
`function NAME { CMDS; }` `NAME() COMPOUND-CMD` This is the ksh form of function definition created to extend the Bourne and POSIX form with modified behaviors and additional features like local variables. The idea was for new-style functions to be analogous to regular builtins with their own environment and scope, while POSIX-style functions are more like special builtins. `function` is supported by almost every ksh-derived shell including Bash and Zsh, but isn't specified by POSIX. Bash treats all function styles the same, but this is unusual. `function` has some preferable characteristics in many ksh variants, making it more portable for scripts that use non-POSIX extensions by some measures. If you're going to use the `function` keyword, it implies that you're either targeting Ksh specifically, or that you have detailed knowledge of how to compensate for differences across shells. It should always be used consistently with `typeset`, but never used with `declare` or `local`. Also in ksh93, the braces are not a [command group](../syntax/ccmd/grouping_plain.md), but a required part of the syntax (unlike Bash and others). See [shell function definitions](../syntax/basicgrammar.md#shell_function_definitions)
`typeset` `declare`, `local`, `export`, `readonly` This is closely related to the above, and should often be used together. `typeset` exists primarily for `ksh` compatibility, but is marked as "deprecated" in Bash (though I don't entirely agree with this). This makes some sense, because future compatibility can't be guaranteed, and any compatibility at all, requires understanding the non-POSIX features of other shells and their differences. Using `declare` instead of `typeset` emphasizes your intention to be "Bash-only", and definitely breaks everywhere else (except possibly zsh if you're lucky). The issue is further complicated by Dash and the [Debian policy](http://www.debian.org/doc/debian-policy/ch-files.html#s-scripts) requirement for a `local` builtin, which is itself not entirely compatible with Bash and other shells.
`let 'EXPR'` `((EXPR))` or `[ $((EXPR)) -ne 0 ]` `let` is the "simple command" variant of arithmetic evaluation command, which takes regular arguments. Both `let` and `((expr))` were present in ksh88, and everything that supports one should support the other. Neither are POSIX. The compound variant is preferable because it doesn't take regular arguments for [wordsplitting](../syntax/expansion/wordsplit.md) and [globbing](../syntax/expansion/globs.md), which makes it safer and clearer. It is also usually faster, especially in Bash, where compound commands are typically significantly faster. Some of the (few) reasons for using `let` are detailed on the [let](../commands/builtin/let.md) page. See [arithmetic evaluation compound command](../syntax/ccmd/arithmetic_eval.md)
`eval` Depends. Often code can be restructured to use better alternatives. `eval` is thrown in here for good measure, as sadly it is so often misused that any use of `eval` (even the rare clever one) is immediately dismissed as wrong by experts, and among the most immediate solutions abused by beginners. In reality, there are correct ways to use `eval`, and even cases in which it's necessary, even in sophisticated shells like Bash and Ksh. `eval` is unusual in that it is less frequently appropriate in more feature-rich shells than in more minimal shells like Dash, where it is used to compensate for more limitations. If you find yourself needing `eval` too frequently, it might be a sign that you're either better off using a different language entirely, or trying to borrow an idiom from some other paradigm that isn't well suited to the shell language. By the same token, there are some cases in which working too hard to avoid `eval` ends up adding a lot of complexity and sacrificing all portability. Don't substitute a clever `eval` for something that's a bit "too clever", just to avoid the `eval`, yet, take reasonable measures to avoid it where it is sensible to do so. See: [eval](../commands/builtin/eval.md) and [Eval command and security issues](http://mywiki.wooledge.org/BashFAQ/048).
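As one example of a contained, legitimate use: assigning to a computed variable name in plain `sh`, where no `declare` or namerefs exist. The `assign` helper below is an illustrative sketch; the point is that the name is validated before `eval` ever sees it, and the value is never re-parsed by the shell:

```bash
#!/bin/sh
# assign NAME VALUE -- set the variable NAME to VALUE, portably.
assign() {
    case $1 in
        '' | [0-9]* | *[!A-Za-z0-9_]*)
            printf 'assign: invalid name: %s\n' "$1" >&2
            return 1 ;;
    esac
    # Only the validated name is interpolated; the value travels as $2
    # inside the eval'd assignment and is never re-parsed.
    eval "$1=\$2"
}

assign greeting 'hello; this is data, not code'
printf '%s\n' "$greeting"    # → hello; this is data, not code
```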
## See also
- [Non-portable syntax and command uses](../scripting/nonportable.md)
- [bashchanges](../scripting/bashchanges.md)
- [Greg\'s BashFAQ 061: List of essential features added (with the
- [Greg's BashFAQ 061: List of essential features added (with the
Bash version tag)](http://mywiki.wooledge.org/BashFAQ/061)
- [Bash \<-\> POSIX Portability guide with a focus on
- [Bash <-> POSIX Portability guide with a focus on
Dash](http://mywiki.wooledge.org/Bashism)
- <http://mywiki.wooledge.org/BashPitfalls>

View File

@ -30,7 +30,7 @@ See also [the dictionary entry for
## The first argument
The very first argument you can access is referenced as `$0`. It is
usually set to the script\'s name exactly as called, and it\'s set on
usually set to the script's name exactly as called, and it's set on
shell initialization:
[Testscript]{.underline} - it just echoes `$0`:
@ -47,14 +47,14 @@ is the prompt\...):
> /usr/bin/testscript
/usr/bin/testscript
However, this isn\'t true for login shells:
However, this isn't true for login shells:
> echo "$0"
-bash
In other terms, `$0` is not a positional parameter, it\'s a special
In other terms, `$0` is not a positional parameter, it's a special
parameter independent from the positional parameter list. It can be set
to anything. In the **ideal** case it\'s the pathname of the script, but
to anything. In the **ideal** case it's the pathname of the script, but
since this gets set on invocation, the invoking program can easily
influence it (the `login` program does that for login shells, by
prefixing a dash, for example).
@ -97,7 +97,7 @@ While useful in another situation, this way lacks flexibility. The
maximum number of arguments is a fixed value - which is a bad idea if you
write a script that takes many filenames as arguments.
=\> forget that one
=> forget that one
### Loops
@ -135,7 +135,7 @@ a given wordlist. The loop uses the positional parameters as a wordlist:
------------------------------------------------------------------------
The next method is similar to the first example (the `for` loop), but it
doesn\'t test for reaching `$#`. It shifts and checks if `$1` still
doesn't test for reaching `$#`. It shifts and checks if `$1` still
expands to something, using the [test command](../commands/classictest.md):
while [ "$1" ]
@ -145,7 +145,7 @@ expands to something, using the [test command](../commands/classictest.md):
done
Looks nice, but has the disadvantage of stopping when `$1` is empty
(null-string). Let\'s modify it to run as long as `$1` is defined (but
(null-string). Let's modify it to run as long as `$1` is defined (but
may be null), using [parameter expansion for an alternate
value](../syntax/pe.md#use_an_alternate_value):
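Such a loop can be sketched like this (the word `defined` is arbitrary; any non-empty string does the job):

```bash
#!/usr/bin/env bash
set -- 'one' '' 'three'    # the second parameter is set but null

count=0
while [ "${1+defined}" ]; do
    count=$((count + 1))
    shift
done
echo "$count"    # → 3   the null parameter was not skipped
```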
@ -163,8 +163,8 @@ There is a [small tutorial dedicated to
### All Positional Parameters
Sometimes it\'s necessary to just \"relay\" or \"pass\" given arguments
to another program. It\'s very inefficient to do that in one of these
Sometimes it's necessary to just "relay" or "pass" given arguments
to another program. It's very inefficient to do that in one of these
loops, as you will most likely destroy their integrity (spaces!).
The shell developers created `$*` and `$@` for this purpose.
@ -197,7 +197,7 @@ your positional parameters to **call another program** (for example in a
wrapper-script), then this is the choice for you, use double quoted
`"$@"`.
Well, let\'s just say: **You almost always want a quoted `"$@"`!**
Well, let's just say: **You almost always want a quoted `"$@"`!**
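A small demonstration of the difference (argument values chosen for illustration):

```bash
#!/usr/bin/env bash
count_args() { echo "$#"; }

set -- "first arg" "second arg"
count_args "$@"    # → 2   each parameter arrives intact
count_args "$*"    # → 1   everything joined into a single word
count_args $*      # → 4   unquoted: word splitting tears the arguments apart
```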
### Range Of Positional Parameters
@ -251,7 +251,7 @@ inside the script or function:
# $5: positional
# $6: parameters
It\'s wise to signal \"end of options\" when setting positional
It's wise to signal "end of options" when setting positional
parameters this way. If not, the dashes might be interpreted as an
option switch by `set` itself:
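A sketch of the safe form:

```bash
#!/usr/bin/env bash
# With the separator, a leading dash is plain data, not an option to `set`:
set -- -e "file.txt"
echo "$#"    # → 2
echo "$1"    # → -e

# Without `--`, `set -e "file.txt"` would have enabled errexit instead
# and left only one positional parameter.
```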
@ -274,8 +274,8 @@ To make your program accept options as standard command syntax:
`COMMAND [options] <params>` # Like 'cat -A file.txt'
See simple option parsing code below. It\'s not that flexible. It
doesn\'t auto-interpret combined options (-fu USER) but it works and is
See simple option parsing code below. It's not that flexible. It
doesn't auto-interpret combined options (-fu USER) but it works and is
a good rudimentary way to parse your arguments.
#!/bin/sh
@ -335,7 +335,7 @@ This simple wrapper enables filtering unwanted options (here: `-a` and
`--all` for `ls`) out of the command line. It reads the positional
parameters and builds a filtered array consisting of them, then calls
`ls` with the new option set. It also respects the `--` as "end of
options\" for `ls` and doesn\'t change anything after it:
options\" for `ls` and doesn't change anything after it:
#!/bin/bash

View File

@ -8,12 +8,12 @@ The processes in UNIX(r) are - unlike other systems - **organized as a
tree**. Every process has a parent process that started it, or is
responsible for it. Every process has its own **context memory** (Not
the memory where the process stores its data, rather, the memory where
data is stored that doesn\'t directly belong to the process, but is
data is stored that doesn't directly belong to the process, but is
needed to run the process) i.e. [**The environment**]{.underline}.
Every process has its **own** environment space.
The environment stores, among other things, data that\'s useful to us,
The environment stores, among other things, data that's useful to us,
the **environment variables**. These are strings in common `NAME=VALUE`
form, but they are not related to shell variables. A variable named
`LANG`, for example, is used by every program that looks it up in its
@ -33,10 +33,10 @@ set by login scripts or programs).
## Executing programs
All the diagrams of the process tree use names like "`xterm`" or
\"`bash`\", but that\'s just to make it easier to understand what\'s
going on, it doesn\'t mean those processes are actually executed.
\"`bash`\", but that's just to make it easier to understand what's
going on, it doesn't mean those processes are actually executed.
Let\'s take a short look at what happens when you \"execute a program\"
Let's take a short look at what happens when you "execute a program"
from the Bash prompt, a program like "ls":
$ ls
@ -65,7 +65,7 @@ The copy of the environment from the first step (forking) becomes the
environment for the final running program (in this case, `ls`).
[**What is so important about it?**]{.underline} In our example, what
the program `ls` does inside its own environment, it can\'t affect the
the program `ls` does inside its own environment, it can't affect the
environment of its parent process (in this case, `bash`). The
environment was copied when ls was executed. Nothing is "copied back"
to the parent environment when `ls` terminates.
@ -73,7 +73,7 @@ to the parent environment when `ls` terminates.
## Bash playing with pipes
Pipes are a very powerful tool. You can connect the output of one
process to the input of another process. We won\'t delve into piping at
process to the input of another process. We won't delve into piping at
this point, we just want to see how it looks in the process tree. Again,
we execute some commands, this time, we\'ll run `ls` and `grep`:
@ -85,8 +85,8 @@ It results in a tree like this:
xterm ----- bash --|
+-- grep
Note once again, `ls` can\'t influence the `grep` environment, `grep`
can\'t influence the `ls` environment, and neither `grep` nor `ls` can
Note once again, `ls` can't influence the `grep` environment, `grep`
can't influence the `ls` environment, and neither `grep` nor `ls` can
influence the `bash` environment.
[**How is that related to shell programming?!?**]{.underline}
@ -100,7 +100,7 @@ into a variable. We run it in a loop here to count input lines:
cat /etc/passwd | while read; do ((counter++)); done
echo "Lines: $counter"
What? It\'s 0? Yes! The number of lines might not be 0, but the variable
What? It's 0? Yes! The number of lines might not be 0, but the variable
`$counter` still is 0. Why? Remember the diagram from above? Rewriting
it a bit, we have:
@ -112,10 +112,10 @@ See the relationship? The forked Bash process will count the lines like
a charm. It will also set the variable `counter` as directed. But if
everything ends, this extra process will be terminated - **your
\"counter\" variable is gone.** You see a 0 because in the main shell it
was 0, and wasn\'t changed by the child process!
was 0, and wasn't changed by the child process!
[**So, how do we count the lines?**]{.underline} Easy: **Avoid the
subshell.** The details don\'t matter, the important thing is the shell
subshell.** The details don't matter, the important thing is the shell
that sets the counter must be the "main shell". For example:
counter=0
@ -123,10 +123,10 @@ that sets the counter must be the \"main shell\". For example:
while read; do ((counter++)); done </etc/passwd
echo "Lines: $counter"
It\'s nearly self-explanatory. The `while` loop runs in the **current
It's nearly self-explanatory. The `while` loop runs in the **current
shell**, the counter is incremented in the **current shell**, everything
vital happens in the **current shell**, also the `read` command sets the
variable `REPLY` (the default if nothing is given), though we don\'t use
variable `REPLY` (the default if nothing is given), though we don't use
it here.
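As a side note, Bash 4.2 and newer offer a third way: the `lastpipe` shell option runs the last element of a pipeline in the current shell (scripts only, since job control must be off). A sketch:

```bash
#!/usr/bin/env bash
shopt -s lastpipe   # bash >= 4.2; only effective without job control (scripts)

counter=0
printf '%s\n' one two three | while read -r _; do
    counter=$((counter + 1))
done

echo "Lines: $counter"    # → Lines: 3
```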
## Actions that create a subshell
@ -137,14 +137,14 @@ performs:
### Executing commands
As shown above, Bash will create subprocesses every time it executes
commands. That\'s nothing new.
commands. That's nothing new.
But if your command is a subprocess that sets variables you want to use
in your main script, that won\'t work.
in your main script, that won't work.
For exactly this purpose, there\'s the `source` command (also: the *dot*
`.` command). Source doesn\'t execute the script, it imports the other
script\'s code into the current shell:
For exactly this purpose, there's the `source` command (also: the *dot*
`.` command). Source doesn't execute the script, it imports the other
script's code into the current shell:
source ./myvariables.sh
# equivalent to:

View File

@ -10,8 +10,8 @@ it\".
This is not a bible, of course. But I have seen so much ugly and
terrible code (not only in shell) during all the years, that I'm 100%
convinced there needs to be *some* code layout and style. No matter
which one you use, use it throughout your code (at least don\'t change
it within the same shellscript file); don\'t change your code layout
which one you use, use it throughout your code (at least don't change
it within the same shellscript file); don't change your code layout
with your mood.
Some good code layout helps you to read your own code after a while. And
@ -19,16 +19,16 @@ of course it helps others to read the code.
## Indentation guidelines
Indentation is nothing that technically influences a script, it\'s only
Indentation is nothing that technically influences a script, it's only
for us humans.
I'm used to seeing/using indentation of *two space characters* (though
many may prefer 4 spaces, see below in the discussion section):
- it\'s easy and fast to type
- it\'s not a hard-tab that\'s displayed differently in different
- it's easy and fast to type
- it's not a hard-tab that's displayed differently in different
environments
- it\'s wide enough to give a visual break and small enough to not
- it's wide enough to give a visual break and small enough to not
waste too much space on the line
Speaking of hard-tabs: Avoid them if possible. They only make trouble. I
@ -145,7 +145,7 @@ Cryptic constructs, we all know them, we all love them. If they are not
100% needed, avoid them, since nobody except you may be able to decipher
them.
It\'s - just like in C - the middle ground between smart, efficient and
It's - just like in C - the middle ground between smart, efficient and
readable.
If you need to use a cryptic construct, include a comment that explains
@ -179,12 +179,12 @@ loop counting variables, etc., \... (in the example: `file`)
### Variable initialization
As in C, it\'s always a good idea to initialize your variables, though,
As in C, it's always a good idea to initialize your variables, though,
the shell will initialize fresh variables itself (better: Unset
variables will generally behave like variables containing a null
string).
It\'s no problem to pass an **environment variable** to the script. If
It's no problem to pass an **environment variable** to the script. If
you blindly assume that all variables you use for the first time are
**empty**, anybody can **inject** content into a variable by passing it
via the environment.
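A quick demonstration of such an injection (`tempdir` is an arbitrary example name):

```bash
#!/usr/bin/env bash
# The child script never sets tempdir itself and blindly trusts it:
snippet='echo "using: $tempdir"'
tempdir='/evil/path' bash -c "$snippet"      # → using: /evil/path

# Initializing the variable first renders the injection harmless:
snippet='tempdir=/tmp/safe; echo "using: $tempdir"'
tempdir='/evil/path' bash -c "$snippet"      # → using: /tmp/safe
```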
@ -203,7 +203,7 @@ in-code documentation for them.
Unless you are really sure what you're doing, **quote every parameter
expansion**.
There are some cases where this isn\'t needed from a technical point of
There are some cases where this isn't needed from a technical point of
view, e.g.
- inside `[[ ... ]]` (other than the RHS of the `==`, `!=`, and `=~`
@ -214,7 +214,7 @@ view, e.g.
But quoting these is never a mistake. If you quote every parameter
expansion, you\'ll be safe.
If you need to parse a parameter as a list of words, you can\'t quote,
If you need to parse a parameter as a list of words, you can't quote,
of course, e.g.
list="one two three"
@ -260,7 +260,7 @@ Avoid it, unless absolutely necessary:
- `eval` can be your neckshot
- there are most likely other ways to achieve what you want
- if possible, re-think the way your script works, if it seems you
can\'t avoid `eval` with your current method
can't avoid `eval` with your current method
- if you really, really, have to use it, then take care, and be sure
about what you're doing
@ -278,7 +278,7 @@ The basic structure of a script simply reads:
### The shebang
If possible (I know it\'s not always possible!), use [a
If possible (I know it's not always possible!), use [a
shebang](../dict/terms/shebang.md).
Be careful with `/bin/sh`: The argument that "on Linux `/bin/sh` is
@ -309,7 +309,7 @@ declared before the main script code runs. This gives a far better
overview and ensures that all function names are known before they are
used.
Since a function isn\'t parsed before it is executed, you usually don\'t
Since a function isn't parsed before it is executed, you usually don't
have to ensure they're in a specific order.
The portable form of the function definition should be used, without the
@ -321,7 +321,7 @@ command](../syntax/ccmd/grouping_plain.md)):
}
Speaking about the command grouping in function definitions using
`{ ...; }`: If you don\'t have a good reason to use another compound
`{ ...; }`: If you don't have a good reason to use another compound
command directly, you should always use this one.
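A minimal example of that portable form:

```bash
#!/bin/sh
# POSIX-style definition: NAME() followed by the { ...; } command group.
warn() {
    printf 'warning: %s\n' "$*" >&2
}

warn "disk space is low"    # works in sh, dash, ksh, bash and zsh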
## Behaviour and robustness
@ -372,7 +372,7 @@ operation status of your script.
You know: **\"One of the main causes of the fall of the Roman Empire was
that, lacking zero, they had no way to indicate successful termination
of their C programs.\"** *\-- Robert Firth*
of their C programs."** *-- Robert Firth*
## Misc

View File

@ -4,22 +4,22 @@
Terminal (control) codes are used to issue specific commands to your
terminal. This can be related to switching colors or positioning the
cursor, i.e. anything that can\'t be done by the application itself.
cursor, i.e. anything that can't be done by the application itself.
## How it technically works
A terminal control code is a special sequence of characters that is
printed (like any other text). If the terminal understands the code, it
won\'t display the character-sequence, but will perform some action. You
won't display the character-sequence, but will perform some action. You
can print the codes with a simple `echo` command.
[**Note:**]{.underline} I see codes referenced as "Bash colors"
sometimes (several \"Bash tutorials\" etc\...): That\'s a completely
sometimes (several "Bash tutorials" etc...): That's a completely
incorrect definition.
## The tput command
Because there\'s a large number of different terminal control languages,
Because there's a large number of different terminal control languages,
usually a system has an intermediate communication layer. The real codes
are looked up in a database **for the currently detected terminal type**
and you give standardized requests to an API or (from the shell) to a
@ -40,18 +40,18 @@ is unclear! Also the `tput` acronyms are usually the ones dedicated for
ANSI escapes!
I listed only the most relevant codes, of course, any ANSI terminal
understands many more! But let\'s keep the discussion centered on common
understands many more! But let's keep the discussion centered on common
shell scripting ;-)
If I couldn\'t find a matching ANSI escape, you\'ll see a :?: as the
If I couldn't find a matching ANSI escape, you\'ll see a :?: as the
code. Feel free to mail me or fix it.
The ANSI codes always start with the ESC character. (ASCII 0x1B or octal
033) This isn\'t part of the list, but **you should avoid using the ANSI
033) This isn't part of the list, but **you should avoid using the ANSI
codes directly - use the `tput` command!**
All codes that can be used with `tput` can be found in terminfo(5). (on
OpenBSD at least) See [OpenBSD\'s
OpenBSD at least) See [OpenBSD's
terminfo(5)](http://www.openbsd.org/cgi-bin/man.cgi?query=terminfo&apropos=0&sektion=5&manpath=OpenBSD+Current&arch=i386&format=html)
under the [Capabilities]{.underline} section. The *cap-name* is the code
to use with tput. A description of each code is also provided.
@ -178,7 +178,7 @@ termcap/terminfo. While `xterm` and most of its clones (`rxvt`, `urxvt`,
etc) will support the instructions, your operating system may not
include references to them in its default xterm profile. (FreeBSD, in
particular, falls into this category.) If \`tput smcup\` appears to do
nothing for you, and you don\'t want to modify your system
nothing for you, and you don't want to modify your system
termcap/terminfo data, and you KNOW that you are using a compatible
xterm application, the following may work for you:
@ -210,7 +210,7 @@ terminals](https://gist.github.com/XVilka/8346728#now-supporting-truecolour)
also support full 24-bit colors, and any X11 color code can be written
directly into a special escape sequence. ([More
infos](https://gist.github.com/XVilka/8346728)) Only a few programs make
use of anything beyond 256 colors, and tput doesn\'t know about them.
use of anything beyond 256 colors, and tput doesn't know about them.
Colors beyond 16 usually only apply to modern terminal emulators running
in graphical environments.
@ -293,9 +293,9 @@ switch to, to get the other 8 colors.
### Mandelbrot set
This is a slightly modified version of Charles Cooke\'s colorful
This is a slightly modified version of Charles Cooke's colorful
Mandelbrot plot scripts ([original w/
screenshot](http://earth.gkhs.net/ccooke/shell.html)) \-- ungolfed,
screenshot](http://earth.gkhs.net/ccooke/shell.html)) -- ungolfed,
optimized a bit, and without hard-coded terminal escapes. The `colorBox`
function is [memoized](http://en.wikipedia.org/wiki/Memoization) to
collect `tput` output only when required and output a new escape only
@ -304,7 +304,7 @@ at most 16, and reduces raw output by more than half. The `doBash`
function uses integer arithmetic, but is still ksh93-compatible (run as
e.g. `bash ./mandelbrot` to use it). The ksh93-only floating-point
`doKsh` is almost 10x faster than `doBash` (thus the ksh shebang by
default), but uses only features that don\'t make the Bash parser crash.
default), but uses only features that don't make the Bash parser crash.
#!/usr/bin/env ksh
View File
@ -1,6 +1,6 @@
# Add Color to your scripts
\-\-\-- dataentry snipplet \-\-\-- snipplet_tags : terminal, color
---- dataentry snipplet ---- snipplet_tags : terminal, color
LastUpdate_dt : 2013-03-23 Contributors : Frank Lazzarini, Dan Douglas
type : snipplet
@ -8,7 +8,7 @@ type : snipplet
Make your scripts output more readable using bash colors. Simply add
these variables to your script, and you will be able to echo in color.
(I haven\'t added all the colors available, just some basics)
(I haven't added all the colors available, just some basics)
# Colors
ESC_SEQ="\x1b["
@ -34,7 +34,7 @@ hardwiring codes.
This snipplet sets up associative arrays for basic color codes using
`tput` for Bash, ksh93 or zsh. You can pass it variable names to
correspond with a collection of codes. There\'s a `main` function with
correspond with a collection of codes. There's a `main` function with
example usage.
``` bash
View File
@ -1,6 +1,6 @@
# Using `awk` to deal with CSV that uses quoted/unquoted delimiters
\-\-\-- dataentry snipplet \-\-\-- snipplet_tags : awk, csv
---- dataentry snipplet ---- snipplet_tags : awk, csv
LastUpdate_dt : 2010-07-31 Contributors : SiegX (IRC) type : snipplet
------------------------------------------------------------------------
@ -14,7 +14,7 @@ data fields that can contain the delimiter.
"fir,st", "second", "last"
"firtst one", "sec,ond field", "final,ly"
Simply using the comma as separator for `awk` won\'t work here, of
Simply using the comma as separator for `awk` won't work here, of
course.
Solution: Use the field separator `", "|^"|"$` for `awk`.
@ -28,12 +28,12 @@ This is an OR-ed list of 3 possible separators:
-------- -----------------------------------------------
You can tune these delimiters if you have other needs (for example if
you don\'t have a space after the commas).
you don't have a space after the commas).
Test:
The `awk` command used for the CSV above just prints the fields
separated by `###` to see what\'s going on:
separated by `###` to see what's going on:
$ awk -v FS='", "|^"|"$' '{print $2"###"$3"###"$4}' data.csv
first###second###last
View File
@ -1,6 +1,6 @@
# Show size of a file
\-\-\-- dataentry snipplet \-\-\-- snipplet_tags: files, file size
---- dataentry snipplet ---- snipplet_tags: files, file size
LastUpdate_dt: 2010-07-31 Contributors: Frank Lazzarini type: snipplet
------------------------------------------------------------------------
View File
@ -1,6 +1,6 @@
# Kill a background job without a message
\-\-\-- dataentry snipplet \-\-\-- snipplet_tags: kill, process
---- dataentry snipplet ---- snipplet_tags: kill, process
management, jobs LastUpdate_dt: 2010-07-31 Contributors: Jan Schampera
type: snipplet
@ -31,12 +31,12 @@ You will get something like this:
./bg_kill1.sh: line 11: 3413 Killed sleep 300
Yes, we killed it
This is more or less a normal message. And it can\'t be easily
redirected since it\'s the shell itself that yells this message, not the
This is more or less a normal message. And it can't be easily
redirected since it's the shell itself that yells this message, not the
command `kill` or something else. You would have to redirect the whole
script\'s output.
script's output.
It\'s also useless to temporarily redirect `stderr` when you call the
It's also useless to temporarily redirect `stderr` when you call the
`kill` command, since the successful termination of the job, the
termination of the `kill` command and the message from the shell may not
happen at the same time. And a blind `sleep` after the `kill` would be
View File
@ -1,6 +1,6 @@
# Get largest file
\-\-\-- dataentry snipplet \-\-\-- snipplet_tags: directory, recursive,
---- dataentry snipplet ---- snipplet_tags: directory, recursive,
find, crawl LastUpdate_dt: 2013-03-23 Contributors: Dan Douglas type:
snipplet
View File
@ -1,6 +1,6 @@
# Pausing a script (like MSDOS pause command)
\-\-\-- dataentry snipplet \-\-\-- snipplet_tags: terminal, pause, input
---- dataentry snipplet ---- snipplet_tags: terminal, pause, input
LastUpdate_dt: 2010-07-31 Contributors: Jan Schampera type: snipplet
------------------------------------------------------------------------
View File
@ -1,6 +1,6 @@
# Print argument list for testing
\-\-\-- dataentry snipplet \-\-\-- snipplet_tags: debug, arguments
---- dataentry snipplet ---- snipplet_tags: debug, arguments
LastUpdate_dt: 2013-03-23 Contributors: Snappy (IRC), Dan Douglas type:
snipplet
View File
@ -1,13 +1,13 @@
# Print a horizontal line
\-\-\-- dataentry snipplet \-\-\-- snipplet_tags: terminal, line
---- dataentry snipplet ---- snipplet_tags: terminal, line
LastUpdate_dt: 2010-07-31 Contributors: Jan Schampera, prince_jammys,
ccsalvesen, others type: snipplet
------------------------------------------------------------------------
The purpose of this small code collection is to show some code that
draws a horizontal line using as few external tools as possible (it\'s
draws a horizontal line using as few external tools as possible (it's
not a big deal to do it with AWK or Perl, but with pure or nearly-pure
Bash it gets more interesting).
@ -66,7 +66,7 @@ printf '%*s\n' "${COLUMNS:-$(tput cols)}" '' | tr ' ' -
This one is a bit tricky. The format for the `printf` command is `%.0s`,
which specified a field with the **maximum** length of **zero**. After
this field, `printf` is told to print a dash. You might remember that
it\'s the nature of `printf` to repeat, if the number of conversion
it's the nature of `printf` to repeat, if the number of conversion
specifications is less than the number of given arguments. With brace
expansion `{1..20}`, 20 arguments are given (you could easily write
`1 2 3 4 ... 20`, of course!). Following happens: The **zero-length
@ -98,7 +98,7 @@ principle as this [string reverse
example](../commands/builtin/eval.md#expansion_side-effects). It completely
depends on Bash due to its brace expansion evaluation order and array
parameter parsing details. As above, the eval only inserts the COLUMNS
expansion into the expression and isn\'t involved in the rest, other
expansion into the expression and isn't involved in the rest, other
than to put the `_` value into the environment of the `_[0]` expansion.
This works well since we\'re not creating one set of arguments and then
editing or deleting them to create another as in the previous examples.
View File
@ -1,6 +1,6 @@
# Print a random string or select random elements
\-\-\-- dataentry snipplet \-\-\-- snipplet_tags: terminal, line
---- dataentry snipplet ---- snipplet_tags: terminal, line
LastUpdate_dt: 2013-04-30 Contributors: Dan Douglas (ormaaj) type:
snipplet
@ -31,7 +31,7 @@ basically the same principle as the `rndstr` function above.
~ $ ( set -- foo bar baz bork; printf '%s ' "${!_[_=RANDOM%$#+1,0]"{0..10}"}"; echo )
bork bar baz baz foo baz baz baz baz baz bork
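The core of the trick, stripped of the golfing, can be sketched as a small helper (the name `pick_random` is made up for this sketch, it is not part of the original snippet):

```bash
# Hypothetical helper, not in the original snippet: print one random
# element from the given arguments. RANDOM % count always yields a
# valid index between 0 and count-1.
pick_random() {
    local -a pool=("$@")
    printf '%s\n' "${pool[RANDOM % ${#pool[@]}]}"
}

pick_random foo bar baz bork
```

This loses the one-line elegance of the brace-expansion version, but needs no `eval` and no unquoted expansions.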
\<div hide\> This has some interesting option parsing concepts, but is
<div hide> This has some interesting option parsing concepts, but is
overly complex. This is a good example of working too hard to avoid an
eval for no benefit and some performance penalty. :/
@ -63,9 +63,9 @@ rndstr()
fi
```
\</div\>
</div>
The remaining examples don\'t use quite the same tricks, which will
The remaining examples don't use quite the same tricks, which will
hopefully be explained elsewhere eventually. See
[unset](../commands/builtin/unset.md#scope) for why doing assignments in this
way works well.
@ -83,7 +83,7 @@ printf '%.1s' "${a[RANDOM%${#a[@]}]}"{0..9} $'\n'
The extra detail that makes this work is to notice that in Bash, [brace
expansion](../syntax/expansion/brace.md) is usually the very first type of
expansion to be processed, always before parameter expansion. Bash is
unique in this respect \-- all other shells with a brace expansion
unique in this respect -- all other shells with a brace expansion
feature perform it almost last, just before pathname expansion. First
the sequence expansion generates ten parameters, then the parameters are
expanded left-to-right causing the [arithmetic](../syntax/arith_expr.md) for
@ -103,15 +103,15 @@ a=(one two three four five six seven eight nine ten)
printf '%.*s ' $(printf '%s ' "${#a[x=RANDOM%${#a[@]}]} ${a[x]}"{1..10})
```
This generates each parameter and it\'s length in pairs. The \'\*\'
This generates each parameter and its length in pairs. The `*`
modifier instructs printf to use the value preceding each parameter as
the field width. Note the space between the parameters. This example
unfortunately relies upon the unquoted command substitution to perform
unsafe wordsplitting so that the outer printf gets each argument. Values
in the array can\'t contain characters in IFS, or anything that might be
in the array can't contain characters in IFS, or anything that might be
interpreted as a pattern without using `set -f`.
Lastly, empty brace expansions can be used which don\'t generate any
Lastly, empty brace expansions can be used which don't generate any
output that would need to be filtered. The disadvantage of course is
that you must construct the brace expansion syntax to add up to the
number of arguments to be generated, where the most optimal solution is
View File
@ -1,6 +1,6 @@
# Save and restore terminal/screen content
\-\-\-- dataentry snipplet \-\-\-- snipplet_tags: terminal, restore
---- dataentry snipplet ---- snipplet_tags: terminal, restore
screen LastUpdate_dt: 2010-07-31 Contributors: Greg Wooledge type:
snipplet
View File
@ -1,6 +1,6 @@
# Fetching SSH hostkeys without interaction
\-\-\-- dataentry snipplet \-\-\-- snipplet_tags: ssh, ssh-keys
---- dataentry snipplet ---- snipplet_tags: ssh, ssh-keys
LastUpdate_dt: 2010-07-31 Contributors: Jan Schampera
------------------------------------------------------------------------
@ -8,7 +8,7 @@ LastUpdate_dt: 2010-07-31 Contributors: Jan Schampera
Applies at least to `openssh`.
To get the hostkeys for a server, and write them to `known_hosts`-file
(to avoid that yes/no query when the key isn\'t known), you can do:
(to avoid that yes/no query when the key isn't known), you can do:
ssh-keyscan -t rsa foo foo.example.com 1.2.3.4 >> ~/.ssh/known_hosts
View File
@ -1,6 +1,6 @@
# Run some bash commands with SSH remotely using local variables
\-\-\-- dataentry snipplet \-\-\-- snipplet_tags: ssh, variables
---- dataentry snipplet ---- snipplet_tags: ssh, variables
LastUpdate_dt: 2010-07-31 Contributors: cweiss type: snipplet
------------------------------------------------------------------------
View File
@ -1,6 +1,6 @@
# Generate code with own arguments properly quoted
\-\-\-- dataentry snipplet \-\-\-- snipplet_tags: arguments, quoting,
---- dataentry snipplet ---- snipplet_tags: arguments, quoting,
escaping, wrapper LastUpdate_dt: 2010-07-31 Contributors: Jan Schampera
type: snipplet
View File
@ -1,6 +1,6 @@
# X-Clipboard on Commandline
\-\-\-- dataentry snipplet \-\-\-- snipplet_tags: clipboard, x11, xclip,
---- dataentry snipplet ---- snipplet_tags: clipboard, x11, xclip,
readline LastUpdate_dt: 2010-07-31 Contributors: Josh Triplett type:
snipplet
@ -31,4 +31,4 @@ The behaviour is a bit tricky to explain:
Of course you can use any other command, you\'re not limited to `xclip`
here.
Note: C-@ as well as M-SPC both works and set the mark for me \-- pgas
Note: C-@ as well as M-SPC both works and set the mark for me -- pgas
View File
@ -20,7 +20,7 @@ from the C programming language.
This article describes the theory of the used syntax and the behaviour.
To get practical examples without big explanations, see [this page on
Greg\'s
Greg's
wiki](http://mywiki.wooledge.org/BashGuide/CompoundCommands#Arithmetic_Evaluation).
## Constants
@ -64,7 +64,7 @@ When no base is specified, the base 10 (decimal) is assumed, except when
the prefixes as mentioned above (octals, hexadecimals) are present. The
specified base can range from 2 to 64. To represent digits in a
specified base greater than 10, characters other than 0 to 9 are needed
(in this order, low =\> high):
(in this order, low => high):
- `0 ... 9`
- `a ... z`
@ -72,7 +72,7 @@ specified base greater than 10, characters other than 0 to 9 are needed
- `@`
- `_`
Let\'s quickly invent a new number system with base 43 to show what I
Let's quickly invent a new number system with base 43 to show what I
mean:
$ echo $((43#1))
@ -91,7 +91,7 @@ mean:
bash: 43#H: value too great for base (error token is "43#H")
If you have no clue what a base is and why there might be other bases,
and what numbers are and how they are built, then you don\'t need
and what numbers are and how they are built, then you don't need
different bases.
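For illustration, here are the common constant notations side by side, all expressing the same value (this comparison is not from the original article):

```bash
echo $((255))         # plain decimal: 255
echo $((0xff))        # 0x prefix means hexadecimal: 255
echo $((0377))        # a leading zero means octal: 255
echo $((2#11111111))  # explicit BASE#NUMBER form, base 2: 255
echo $((16#ff))       # same value again, base 16: 255
```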
If you want to convert between the usual bases (octal, decimal, hex),
@ -103,7 +103,7 @@ strings.
Shell variables can of course be used as operands, even when the integer
attribute is not turned on (by `declare -i <NAME>`). If the variable is
empty (null) or unset, its reference evaluates to 0. If the variable
doesn\'t hold a value that looks like a valid expression (numbers or
doesn't hold a value that looks like a valid expression (numbers or
operations), the expression is re-used to reference, for example, the
named parameters, e.g.:
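A minimal sketch of that lookup chain (the variable names are arbitrary):

```bash
b=5
a=b          # a holds the string "b", which is not a number

echo $((a))  # bash resolves a -> "b" -> 5, so this prints 5

unset c
echo $((c))  # an unset variable evaluates to 0
```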
@ -223,7 +223,7 @@ That means, the following `if`-clause will execute the `else`-thread:
`-` unary minus
`<EXPR> ? <EXPR> : <EXPR>` conditional (ternary) operator\
\<condition\> ? \<result-if-true\> : \<result-if-false\>
<condition> ? <result-if-true> : <result-if-false>
`<EXPR> , <EXPR>` expression list
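A quick sketch of the ternary operator in use (the values are made up for illustration):

```bash
x=10
echo $(( x > 5 ? 1 : 0 ))     # condition is true, prints 1

# the branches can be expressions themselves; parenthesize for readability
y=$(( x % 2 == 0 ? x / 2 : 3 * x + 1 ))
echo "$y"                     # 10 is even, so this prints 5
```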
@ -232,7 +232,7 @@ That means, the following `if`-clause will execute the `else`-thread:
## Precedence
The operator precedence is as follows (highest -\> lowest):
The operator precedence is as follows (highest -> lowest):
- Postfix (`id++`, `id--`)
- Prefix (`++id`, `--id`)
@ -260,16 +260,16 @@ first.
## Arithmetic expressions and return codes
Bash\'s overall language construct is based on exit codes or return
Bash's overall language construct is based on exit codes or return
codes of commands or functions to be executed. `if` statements, `while`
loops, etc., they all take the return codes of commands as conditions.
Now the problem is: The return codes (0 means \"TRUE\" or \"SUCCESS\",
not 0 means \"FALSE\" or \"FAILURE\") don\'t correspond to the meaning
not 0 means \"FALSE\" or \"FAILURE\") don't correspond to the meaning
of the result of an arithmetic expression (0 means \"FALSE\", not 0
means \"TRUE\").
That\'s why all commands and keywords that do arithmetic operations
That's why all commands and keywords that do arithmetic operations
attempt to **translate** the arithmetical meaning into an equivalent
return code. This simply means:
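Sketched with the simplest possible expressions:

```bash
(( 1 ))     # arithmetically TRUE  -> exit code 0 (shell SUCCESS)
echo $?     # 0
(( 0 ))     # arithmetically FALSE -> exit code 1 (shell FAILURE)
echo $?     # 1

if (( 5 > 3 )); then
    echo "the expression is true, so the then-branch runs"
fi
```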
@ -292,7 +292,7 @@ else
fi
```
\<WRAP center round important\> Beware that `set -e` can change the
<WRAP center round important> Beware that `set -e` can change the
runtime behavior of scripts. For example,
This non-equivalence of code behavior deserves some attention. Consider
@ -332,7 +332,7 @@ echo $?
(\"SUCCESS\")
This change in code behavior was discovered once the script was run
under set -e. \</WRAP\>
under set -e. </WRAP>
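The classic trap is a post-increment starting at zero; a sketch of the failure mode:

```bash
#!/usr/bin/env bash
set -e
i=0
(( i++ )) || true   # i++ yields the OLD value 0, i.e. arithmetic FALSE,
                    # i.e. exit code 1 -- without "|| true" the script
                    # would silently terminate right here under set -e
echo "still alive, i=$i"
```

Using the pre-increment `(( ++i ))` sidesteps the problem in this particular case, since it yields the new, non-zero value.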
## Arithmetic expressions in Bash
View File
@ -43,7 +43,7 @@ Bash supports two different types of ksh-like one-dimensional arrays.
arrays allow you to look up a value from a table based upon its
corresponding string label. **Associative arrays are always
unordered**, they merely *associate* key-value pairs. If you
retrieve multiple values from the array at once, you can\'t count on
retrieve multiple values from the array at once, you can't count on
them coming out in the same order you put them in. Associative
arrays always carry the `-A` attribute, and unlike indexed arrays,
Bash requires that they always be declared explicitly (as indexed
@ -90,7 +90,7 @@ synonymous with referring to the array name without a subscript.
hi hi hi
The only exceptions to this rule are in a few cases where the array
variable\'s name refers to the array as a whole. This is the case for
variable's name refers to the array as a whole. This is the case for
the `unset` builtin (see [destruction](#Destruction)) and when declaring
an array without assigning any values (see [declaration](#Declaration)).
@ -106,7 +106,7 @@ arrays:
`declare -a ARRAY` Declares an **indexed** array `ARRAY`. An existing array is not initialized.
`declare -A ARRAY` Declares an **associative** array `ARRAY`. This is the one and only way to create associative arrays.
As an example, and for use below, let\'s declare our `NAMES` array as
As an example, and for use below, let's declare our `NAMES` array as
described [above](#purpose):
declare -a NAMES=('Peter' 'Anna' 'Greg' 'Jan')
@ -127,17 +127,17 @@ variables.
`ARRAY+=(E1\ E2\ ...)` Append to ARRAY.
`ARRAY=("${ANOTHER_ARRAY[@]}")` Copy ANOTHER_ARRAY to ARRAY, copying each element.
As of now, arrays can\'t be exported.
As of now, arrays can't be exported.
### Getting values
\<note\> For completeness and details on several parameter expansion
<note> For completeness and details on several parameter expansion
variants, see the [article about parameter expansion](../syntax/pe.md) and
check the notes about arrays. \</note\>
check the notes about arrays. </note>
Syntax Description
----------------------------------------------------------------------- -------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
`${ARRAY[N]}` Expands to the value of the index `N` in the **indexed** array `ARRAY`. If `N` is a negative number, it\'s treated as the offset from the maximum assigned index (can\'t be used for assignment) - 1
`${ARRAY[N]}` Expands to the value of the index `N` in the **indexed** array `ARRAY`. If `N` is a negative number, it's treated as the offset from the maximum assigned index (can't be used for assignment) - 1
`${ARRAY[S]}` Expands to the value of the index `S` in the **associative** array `ARRAY`.
`"${ARRAY[@]}" ${ARRAY[@]} "${ARRAY[*]}" ${ARRAY[*]}` Similar to [mass-expanding positional parameters](../scripting/posparams.md#mass_usage), this expands to all elements. If unquoted, both subscripts `*` and `@` expand to the same result, if quoted, `@` expands to all elements individually quoted, `*` expands to all elements quoted as a whole.
`"${ARRAY[@]:N:M}" ${ARRAY[@]:N:M} "${ARRAY[*]:N:M}" ${ARRAY[*]:N:M}` Similar to what this syntax does for the characters of a single string when doing [substring expansion](../syntax/pe.md#substring_expansion), this expands to `M` elements starting with element `N`. This way you can mass-expand individual indexes. The rules for quoting and the subscripts `*` and `@` are the same as above for the other mass-expansions.
@ -146,7 +146,7 @@ For clarification: When you use the subscripts `@` or `*` for
mass-expanding, then the behaviour is exactly what it is for `$@` and
`$*` when [mass-expanding the positional
parameters](../scripting/posparams.md#mass_usage). You should read this
article to understand what\'s going on.
article to understand what's going on.
### Metadata
@ -185,7 +185,7 @@ It is best to [explicitly specify
-v](../commands/builtin/unset.md#portability_considerations) when unsetting
variables with unset.
\<note warning\> Specifying unquoted array elements as arguments to any
<note warning> Specifying unquoted array elements as arguments to any
command, such as with the syntax above **may cause [pathname
expansion](../syntax/expansion/globs.md) to occur** due to the presence of
glob characters.
@ -205,7 +205,7 @@ To avoid this, **always quote** the array name and index:
unset -v 'x[1]'
This applies generally to all commands which take variable names as
arguments. Single quotes preferred. \</note\>
arguments. Single quotes preferred. </note>
## Usage
@ -217,13 +217,13 @@ less explain all the needed background theory.
Now, some examples and comments for you.
Let\'s say we have an array `sentence` which is initialized as follows:
Let's say we have an array `sentence` which is initialized as follows:
sentence=(Be liberal in what you accept, and conservative in what you send)
Since no special code is there to prevent word splitting (no quotes),
every word there will be assigned to an individual array element. When
you count the words you see, you should get 12. Now let\'s see if Bash
you count the words you see, you should get 12. Now let's see if Bash
has the same opinion:
$ echo ${#sentence[@]}
@ -246,7 +246,7 @@ sometimes. Please understand that **numerical array indexing begins at 0
The method above, walking through an array by just knowing its number of
elements, only works for arrays where all elements are set, of course.
If one element in the middle is removed, then the calculation is
nonsense, because the number of elements doesn\'t correspond to the
nonsense, because the number of elements doesn't correspond to the
highest used index anymore (we call them \"*sparse arrays*\").
Now, suppose that you want to replace your array `sentence` with the
@ -260,7 +260,7 @@ think you could just do
$ for ((i = 0; i < ${#sentence[@]}; i++)); do echo "Element $i: '${sentence[i]}'" ; done
Element 0: 'NAMES'
Obviously that\'s wrong. What about
Obviously that's wrong. What about
$ unset sentence ; declare -a sentence=${NAMES}
@ -271,7 +271,7 @@ Obviously that\'s wrong. What about
$ for ((i = 0; i < ${#sentence[@]}; i++)); do echo "Element $i: '${sentence[i]}'" ; done
Element 0: 'Peter'
So what\'s the **right** way? The (slightly ugly) answer is, reuse the
So what's the **right** way? The (slightly ugly) answer is, reuse the
enumeration syntax:
$ unset sentence ; declare -a sentence=("${NAMES[@]}")
@ -297,7 +297,7 @@ starting at zero) just is replaced with an arbitrary string:
sentence[End]='in what you send'
sentence['Very end']=...
[**Beware:**]{.underline} don\'t rely on the fact that the elements are
[**Beware:**]{.underline} don't rely on the fact that the elements are
ordered in memory like they were declared, it could look like this:
# output from 'set' command
@ -305,7 +305,7 @@ ordered in memory like they were declared, it could look like this:
This effectively means, you can get the data back with
`"${sentence[@]}"`, of course (just like with numerical indexing), but
you can\'t rely on a specific order. If you want to store ordered data,
you can't rely on a specific order. If you want to store ordered data,
or re-order data, go with numerical indexes. For associative arrays, you
usually query known index values:
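Continuing the `sentence` example as an associative array (a sketch; the key names are made up):

```bash
declare -A sentence
sentence[Begin]='Be liberal in what you accept,'
sentence[End]='and conservative in what you send'

# query the known keys in whatever order *you* choose:
echo "${sentence[Begin]} ${sentence[End]}"
```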
@ -369,13 +369,13 @@ strings would have been inserted into the integer array without
evaluating the arithmetic. A special-case of this is shown in the next
section.
\<note\> Bash declaration commands are really keywords in disguise. They
<note> Bash declaration commands are really keywords in disguise. They
magically parse arguments to determine whether they are in the form of a
valid assignment. If so, they are evaluated as assignments. If not, they
undergo normal argument expansion before being passed to the builtin
which evaluates the resulting string as an assignment (somewhat like
`eval`, but there are differences.) **Todo:** Discuss this in detail.
\</note\>
</note>
### Indirection
@ -479,7 +479,7 @@ dynamically calls a function whose name is resolved from the array.
prior to assignment. In order to preserve attributes, you must use
the `+=` operator. However, declaring an associative array, then
attempting an `a=(...)` style compound assignment without specifying
indexes is an error. I can\'t explain this
indexes is an error. I can't explain this
inconsistency.` $ ksh -c 'function f { typeset -a a; a=([0]=foo [1]=bar); typeset -p a; }; f' # Attribute is lost, and since subscripts are given, we default to associative.
typeset -A a=([0]=foo [1]=bar)
$ ksh -c 'function f { typeset -a a; a+=([0]=foo [1]=bar); typeset -p a; }; f' # Now using += gives us the expected results.
@ -490,7 +490,7 @@ dynamically calls a function whose name is resolved from the array.
- Only Bash and mksh support compound assignment with mixed explicit
subscripts and automatically incrementing subscripts. In ksh93, in
order to specify individual subscripts within a compound assignment,
all subscripts must be given (or none). Zsh doesn\'t support
all subscripts must be given (or none). Zsh doesn't support
specifying individual subscripts at all.
- Appending to a compound assignment is a fairly portable way to
append elements after the last index of an array. In Bash, this also
@ -526,7 +526,7 @@ dynamically calls a function whose name is resolved from the array.
portability.
- Zsh and mksh do not support compound assignment arguments to
`typeset`.
- Ksh88 didn\'t support modern compound array assignment syntax. The
- Ksh88 didn't support modern compound array assignment syntax. The
original (and most portable) way to assign multiple elements is to
use the `set -A name arg1 arg2 ...` syntax. This is supported by
almost all shells that support ksh-like arrays except for Bash.
@ -563,9 +563,9 @@ dynamically calls a function whose name is resolved from the array.
- Assigning or referencing negative indexes in mksh causes
wrap-around. The max index appears to be `UINT_MAX`, which would be
addressed by `arr[-1]`.
- So far, Bash\'s `-v var` test doesn\'t support individual array
- So far, Bash's `-v var` test doesn't support individual array
subscripts. You may supply an array name to test whether an array is
defined, but can\'t check an element. ksh93\'s `-v` supports both.
defined, but can't check an element. ksh93's `-v` supports both.
Other shells lack a `-v` test.
### Bugs
@ -608,7 +608,7 @@ dynamically calls a function whose name is resolved from the array.
<a b cfoo>
`
- **Fixed in 4.3** Process substitutions are evaluated within array
indexes. Zsh and ksh don\'t do this in any arithmetic context.
indexes. Zsh and ksh don't do this in any arithmetic context.
`# print "moo"
dev=fd=1 _[1<(echo moo >&2)]=
@ -686,7 +686,7 @@ to generate these results.
variables?](http://mywiki.wooledge.org/BashFAQ/005) - A very
detailed discussion on arrays with many examples.
- [BashSheet - Arrays](http://mywiki.wooledge.org/BashSheet#Arrays) -
Bashsheet quick-reference on Greycat\'s wiki.
Bashsheet quick-reference on Greycat's wiki.
\<div hide\> vim: set fenc=utf-8 ff=unix ts=4 sts=4 sw=4 ft=dokuwiki et
wrap lbr: \</div\>
<div hide> vim: set fenc=utf-8 ff=unix ts=4 sts=4 sw=4 ft=dokuwiki et
wrap lbr: </div>
View File
@ -7,7 +7,7 @@ code you see everywhere, the code you use, is based on those rules.
However, **this is a very theoretical view**, but if you\'re interested,
it may help you understand why things look the way they look.
If you don\'t know the commands used in the following examples, just
If you don't know the commands used in the following examples, just
trust the explanation.
## Simple Commands
@ -29,20 +29,20 @@ Every complex Bash operation can be split into simple commands:
LC_ALL=C ls
The last one might not be familiar. That one simply adds \"`LC_ALL=C`\"
to the environment of the `ls` program. It doesn\'t affect your current
to the environment of the `ls` program. It doesn't affect your current
shell. This also works while calling functions, unless Bash runs in
POSIX(r) mode (in which case it affects your current shell).
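A sketch of the difference (the variable name `MYVAR` is made up; the parent-side output assumes `MYVAR` is not already set in your shell):

```bash
# the assignment exists only in the environment of the child process:
MYVAR=hello bash -c 'echo "in the child:  $MYVAR"'
echo "in the parent: ${MYVAR:-<unset>}"
```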
Every command has an exit code. It\'s a type of return status. The shell
Every command has an exit code. It's a type of return status. The shell
can catch it and act on it. Exit code range is from 0 to 255, where 0
means success, and the rest mean either something failed, or there is an
issue to report back to the calling program.
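The special parameter `$?` holds the exit code of the most recently executed command, for example:

```bash
true        # a command that always succeeds
echo $?     # 0

false       # a command that always fails
echo $?     # 1

ls /no/such/path 2>/dev/null
echo $?     # non-zero; the exact value depends on the ls implementation
```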
\<wrap center round info 90%\> The simple command construct is the
<wrap center round info 90%> The simple command construct is the
**base** for all higher constructs. Everything you execute, from
pipelines to functions, finally ends up in (many) simple commands.
That\'s why Bash only has one method to [expand and execute a simple
command](../syntax/grammar/parser_exec.md). \</wrap\>
That's why Bash only has one method to [expand and execute a simple
command](../syntax/grammar/parser_exec.md). </wrap>
## Pipelines
@ -50,8 +50,8 @@ FIXME Missing an additional article about pipelines and pipelining
`[time [-p]] [ ! ] command [ | command2 ... ]`
**Don\'t get confused** about the name \"pipeline.\" It\'s a grammatical
name for a construct. Such a pipeline isn\'t necessarily a pair of
**Don't get confused** about the name \"pipeline.\" It's a grammatical
name for a construct. Such a pipeline isn't necessarily a pair of
commands where stdout/stdin is connected via a real pipe.
Pipelines are one or more [simple
@ -76,24 +76,24 @@ executed if the pattern \"\^root:\" is **not** found in `/etc/passwd`:
Yes, this is also a pipeline (although there is no pipe!), because the
**exclamation mark to invert the exit code** can only be used in a
pipeline. If `grep`\'s exit code is 1 (FALSE) (the text was not found),
pipeline. If `grep`'s exit code is 1 (FALSE) (the text was not found),
the leading `!` will \"invert\" the exit code, and the shell sees (and
acts on) exit code 0 (TRUE) and the `then` part of the `if` stanza is
executed. One could say we checked for
\"`not grep "^root" /etc/passwd`\".
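The same idea in a self-contained form, with `printf` standing in for `/etc/passwd`:

``` bash
# grep exits 1 (no match); the leading ! inverts that to 0 (TRUE):
if ! printf 'daemon:x:1:\n' | grep -q '^root:'; then
    echo "no root line found"
fi
```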
The [set option pipefail](../commands/builtin/set.md#attributes) determines
the behavior of how bash reports the exit code of a pipeline. If it\'s
the behavior of how bash reports the exit code of a pipeline. If it's
set, then the exit code (`$?`) is that of the last command that exits with a
non-zero status; if none fail, it\'s zero. If it\'s not set, then `$?`
non-zero status; if none fail, it's zero. If it's not set, then `$?`
always holds the exit code of the last command (as explained above).
The shell option `lastpipe` will execute the last element in a pipeline
construct in the current shell environment, i.e. not a subshell.
There\'s also an array `PIPESTATUS[]` that is set after a foreground
There's also an array `PIPESTATUS[]` that is set after a foreground
pipeline is executed. Each element of `PIPESTATUS[]` reports the exit
code of the respective command in the pipeline. Note: (1) it\'s only for
code of the respective command in the pipeline. Note: (1) it's only set for
foreground pipelines, and (2) for higher-level structures built up from
a pipeline, such as lists, `PIPESTATUS[]` holds the exit status of the last
pipeline command executed.
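A short sketch of both behaviours (note that `PIPESTATUS[]` must be read immediately, since the next command overwrites it):

``` bash
false | true
echo "exit: $?  pipestatus: ${PIPESTATUS[*]}"   # exit: 0  pipestatus: 1 0

set -o pipefail
false | true
echo "exit: $?"                                 # exit: 1
set +o pipefail
```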
@ -115,7 +115,7 @@ A list is a sequence of one or more [pipelines](../basicgrammar.md#pipelines)
separated by one of the operators `;`, `&`, `&&`, or `││`, and
optionally terminated by one of `;`, `&`, or `<newline>`.
=\> It\'s a group of **pipelines** separated or terminated by **tokens**
=> It's a group of **pipelines** separated or terminated by **tokens**
that all have **different meanings** for Bash.
Your whole Bash script technically is one big single list!
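A sketch of the three most common list operators:

``` bash
echo first ; echo second        # ;  - run sequentially, regardless of exit codes
true  && echo "after success"   # && - run only if the previous command succeeded
false || echo "after failure"   # || - run only if the previous command failed
```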
@ -139,7 +139,7 @@ There are two forms of compound commands:
- form a new syntax element using a list as a \"body\"
- completely independent syntax elements
Essentially, everything else that\'s not described in this article.
Essentially, everything else that's not described in this article.
Compound commands have the following characteristics:
- they **begin** and **end** with a specific keyword or operator (e.g.
@ -151,17 +151,17 @@ overview):
Compound command syntax Description
------------------------------------------------------------ ---------------------------------------------------------------------------------------------------------------------------------------------------------
`( <LIST> )` Execute `<LIST>` in an extra subshell =\> [article](../syntax/ccmd/grouping_subshell.md)
`{ <LIST> ; }` Execute `<LIST>` as separate group (but not in a subshell) =\> [article](../syntax/ccmd/grouping_plain.md)
`(( <EXPRESSION> ))` Evaluate the arithmetic expression `<EXPRESSION>` =\> [article](../syntax/ccmd/arithmetic_eval.md)
`[[ <EXPRESSION> ]]` Evaluate the conditional expression `<EXPRESSION>` (aka \"the new test command\") =\> [article](../syntax/ccmd/conditional_expression.md)
`for <NAME> in <WORDS> ; do <LIST> ; done` Executes `<LIST>` while setting the variable `<NAME>` to one of `<WORDS>` on every iteration (classic for-loop) =\> [article](../syntax/ccmd/classic_for.md)
`for (( <EXPR1> ; <EXPR2> ; <EXPR3> )) ; do <LIST> ; done` C-style for-loop (driven by arithmetic expressions) =\> [article](../syntax/ccmd/c_for.md)
`select <NAME> in <WORDS> ; do <LIST> ; done` Provides simple menus =\> [article](../syntax/ccmd/user_select.md)
`case <WORD> in <PATTERN>) <LIST> ;; ... esac` Decisions based on pattern matching - executing `<LIST>` on match =\> [article](../syntax/ccmd/case.md)
`if <LIST> ; then <LIST> ; else <LIST> ; fi` The if clause: makes decisions based on exit codes =\> [article](../syntax/ccmd/if_clause.md)
`while <LIST1> ; do <LIST2> ; done` Execute `<LIST2>` while `<LIST1>` returns TRUE (exit code) =\> [article](../syntax/ccmd/while_loop.md)
`until <LIST1> ; do <LIST2> ; done` Execute `<LIST2>` until `<LIST1>` returns TRUE (exit code) =\> [article](../syntax/ccmd/until_loop.md)
`( <LIST> )` Execute `<LIST>` in an extra subshell => [article](../syntax/ccmd/grouping_subshell.md)
`{ <LIST> ; }` Execute `<LIST>` as separate group (but not in a subshell) => [article](../syntax/ccmd/grouping_plain.md)
`(( <EXPRESSION> ))` Evaluate the arithmetic expression `<EXPRESSION>` => [article](../syntax/ccmd/arithmetic_eval.md)
`[[ <EXPRESSION> ]]` Evaluate the conditional expression `<EXPRESSION>` (aka \"the new test command\") => [article](../syntax/ccmd/conditional_expression.md)
`for <NAME> in <WORDS> ; do <LIST> ; done` Executes `<LIST>` while setting the variable `<NAME>` to one of `<WORDS>` on every iteration (classic for-loop) => [article](../syntax/ccmd/classic_for.md)
`for (( <EXPR1> ; <EXPR2> ; <EXPR3> )) ; do <LIST> ; done` C-style for-loop (driven by arithmetic expressions) => [article](../syntax/ccmd/c_for.md)
`select <NAME> in <WORDS> ; do <LIST> ; done` Provides simple menus => [article](../syntax/ccmd/user_select.md)
`case <WORD> in <PATTERN>) <LIST> ;; ... esac` Decisions based on pattern matching - executing `<LIST>` on match => [article](../syntax/ccmd/case.md)
`if <LIST> ; then <LIST> ; else <LIST> ; fi` The if clause: makes decisions based on exit codes => [article](../syntax/ccmd/if_clause.md)
`while <LIST1> ; do <LIST2> ; done` Execute `<LIST2>` while `<LIST1>` returns TRUE (exit code) => [article](../syntax/ccmd/while_loop.md)
`until <LIST1> ; do <LIST2> ; done` Execute `<LIST2>` until `<LIST1>` returns TRUE (exit code) => [article](../syntax/ccmd/until_loop.md)
## Shell Function Definitions
@ -208,7 +208,7 @@ Bash allows three equivalent forms of the function definition:
The space between `NAME` and `()` is optional; usually you see it
without the space.
I suggest using the first form. It\'s specified in POSIX and all
I suggest using the first form. It's specified in POSIX and all
Bourne-like shells seem to support it.
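A minimal sketch of the POSIX form (`greet` is a made-up name):

``` bash
greet() {
    printf 'Hello, %s!\n' "$1"
}

greet world   # Hello, world!
```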
[**Note:**]{.underline} Before version `2.05-alpha1`, Bash only
@ -254,14 +254,14 @@ It is possible to create function names containing slashes:
echo LS FAKE
}
The elements of this name aren\'t subject to a path search.
The elements of this name aren't subject to a path search.
Weird function names should not be used. Quote from the maintainer:
- * It was a mistake to allow such characters in function names
(\`unset\' doesn\'t work to unset them without forcing -f, for
(\`unset\' doesn't work to unset them without forcing -f, for
instance). We\'re stuck with them for backwards compatibility, but I
don\'t have to encourage their use. *
don't have to encourage their use. *
## Grammar summary
@ -312,7 +312,7 @@ FIXME more\...
- the [list](../basicgrammar.md#lists) that `if` **executes** contains a
simple command (`cp mymusic.mp3 /data/mp3`)
Let\'s invert test command exit code, only one thing changes:
Let's invert the test command's exit code; only one thing changes:
if ! [ -d /data/mp3 ]; then
cp mymusic.mp3 /data/mp3

View File

@ -28,7 +28,7 @@ mechanisms available in the language.
The `((;;))` syntax at the top of the loop is not an ordinary
[arithmetic compound command](../../syntax/ccmd/arithmetic_eval.md), but is part
of the C-style for-loop\'s own syntax. The three sections separated by
of the C-style for-loop's own syntax. The three sections separated by
semicolons are [arithmetic expression](../../syntax/arith_expr.md) contexts.
Each time one of the sections is to be evaluated, the section is first
processed for: brace, parameter, command, arithmetic, and process
@ -68,7 +68,7 @@ command](../../syntax/ccmd/arithmetic_eval.md) would be structured as:
(( <EXPR3> ))
done
The equivalent `while` construct isn\'t exactly the same, because both,
The equivalent `while` construct isn't exactly the same, because
the `for` and the `while` loop behave differently in case you use the
[continue](../../commands/builtin/continuebreak.md) command.
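A side-by-side sketch of both loops counting 0 to 2 (the `while` version differs once `continue` is used, since `continue` would skip the `(( i++ ))`):

``` bash
for (( i = 0; i < 3; i++ )); do
    echo "$i"
done

i=0
while (( i < 3 )); do
    echo "$i"
    (( i++ ))
done
```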
@ -83,9 +83,9 @@ Bash, Ksh93, Mksh, and Zsh also provide an alternate syntax for the
echo $x
}
This syntax is **not documented** and shouldn\'t be used. I found the
This syntax is **not documented** and shouldn't be used. I found the
parser definitions for it in 1.x code, and in modern 4.x code. My guess
is that it\'s there for compatibility reasons. Unlike the other
is that it's there for compatibility reasons. Unlike the other
aforementioned shells, Bash does not support the analogous syntax for
[case..esac](../../syntax/ccmd/case.md#portability_considerations).
@ -96,15 +96,15 @@ The return status is that of the last command executed from `<LIST>`, or
## Alternatives and best practice
\<div center round todo 60%\>TODO: Show some alternate usages involving
functions and local variables for initialization.\</div\>
<div center round todo 60%>TODO: Show some alternate usages involving
functions and local variables for initialization.</div>
## Examples
### Simple counter
A simple counter, the loop iterates 101 times (\"0\" to \"100\" are 101
numbers -\> 101 runs!), and everytime the variable `x` is set to the
numbers -> 101 runs!), and every time the variable `x` is set to the
current value.
- It **initializes** `x = 0`
@ -132,7 +132,7 @@ will count from 0 to 100, but with a **step of 10**.
This example loops through the bit-values of a Byte, beginning from 128,
ending at 1. If that bit is set in the `testbyte`, it prints \"`1`\",
else \"`0`\" =\> it prints the binary representation of the `testbyte`
else \"`0`\" => it prints the binary representation of the `testbyte`
value (8 bits).
#!/usr/bin/env bash
@ -161,7 +161,7 @@ value (8 bits).
# vim: set fenc=utf-8 ff=unix ft=sh :
\<div hide\>
<div hide>
testbyte=123
for (( n = 128 ; n >= 1 ; n /= 2 )); do
@ -173,10 +173,10 @@ value (8 bits).
done
echo
\</div\>
</div>
Why does this one begin at 128 (highest value, on the left) and not 1
(lowest value, on the right)? It\'s easier to print from left to
(lowest value, on the right)? It's easier to print from left to
right...
We arrive at 128 for `n` through the recursive arithmetic expression
@ -196,7 +196,7 @@ variables.
printf '%*s\n' "$((n+1))" "$n"
done
\<code\> \~ \$ bash \<(xclip -o) 1
<code> ~ $ bash <(xclip -o) 1
2
3
@ -216,20 +216,20 @@ variables.
3
2
1 \</code\>
1 </code>
## Portability considerations
- C-style for loops aren\'t POSIX. They are available in Bash, ksh93,
- C-style for loops aren't POSIX. They are available in Bash, ksh93,
and zsh. All 3 have essentially the same syntax and behavior.
- C-style for loops aren\'t available in mksh.
- C-style for loops aren't available in mksh.
## Bugs
- *Fixed in 4.3*. ~~There appears to be a bug as of Bash 4.2p10 in
which command lists can\'t be distinguished from the for loop\'s
which command lists can't be distinguished from the for loop's
arithmetic argument delimiter (both semicolons), so command
substitutions within the C-style for loop expression can\'t contain
substitutions within the C-style for loop expression can't contain
more than one command.~~
## See also

View File

@ -18,16 +18,16 @@ against every pattern `<PATTERNn>` and on a match, the associated
[list](../../syntax/basicgrammar.md#lists) `<LISTn>` is executed. Every
commandlist is terminated by `;;`. This rule is optional for the very
last commandlist (i.e., you can omit the `;;` before the `esac`). Every
`<PATTERNn>` is separated from it\'s associated `<LISTn>` by a `)`, and
`<PATTERNn>` is separated from its associated `<LISTn>` by a `)`, and
is optionally preceded by a `(`.
Bash 4 introduces two new action terminators. The classic behavior using
`;;` is to execute only the list associated with the first matching
pattern, then break out of the `case` block. The `;&` terminator causes
`case` to also execute the next block without testing its pattern. The
`;;&` operator is like `;;`, except the case statement doesn\'t
`;;&` operator is like `;;`, except the case statement doesn't
terminate after executing the associated list - Bash just continues
testing the next pattern as though the previous pattern didn\'t match.
testing the next pattern as though the previous pattern didn't match.
Using these terminators, a `case` statement can be configured to test
against all patterns, or to share code between blocks, for example.
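A small sketch of all three terminators on the input `a` (requires Bash 4 or newer for `;&` and `;;&`):

``` bash
case a in
    a)    echo "matched a"     ;&   # fall through: run the next list untested
    b)    echo "ran b's list"  ;;&  # keep testing the remaining patterns
    [ab]) echo "matched [ab]"  ;;
esac
# prints all three lines
```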
@ -74,17 +74,17 @@ Another one of my stupid examples\...
echo 'Hm, a bit awry, no?'
;;
orange|tangerine)
echo $'Eeeks! I don\'t like those!\nGo away!'
echo $'Eeeks! I don\'t like those!\nGo away!'
exit 1
;;
*)
echo "Unknown fruit - sure it isn't toxic?"
esac
Here\'s a practical example showing a common pattern involving a `case`
Here's a practical example showing a common pattern involving a `case`
statement. If the first argument is one of a valid set of alternatives,
then perform some sysfs operations under Linux to control a video
card\'s power profile. Otherwise, show a usage synopsis, and print the
card's power profile. Otherwise, show a usage synopsis, and print the
current power profile and GPU temperature.
``` bash
@ -146,14 +146,14 @@ f a b c
## Portability considerations
- Only the `;;` delimiter is specified by POSIX.
- zsh and mksh use the `;|` control operator instead of Bash\'s `;;&`.
- zsh and mksh use the `;|` control operator instead of Bash's `;;&`.
Mksh has `;;&` for Bash compatibility (undocumented).
- ksh93 has the `;&` operator, but no `;;&` or equivalent.
- ksh93, mksh, zsh, and posh support a historical syntax where open
and close braces may be used in place of `in` and `esac`:
`case word { x) ...; };`. This is similar to the alternate form Bash
supports for its [for loops](../../syntax/ccmd/classic_for.md), but Bash
doesn\'t support this syntax for `case..esac`.
doesn't support this syntax for `case..esac`.
## See also

View File

@ -53,7 +53,7 @@ for x in 1 2 3
This syntax is **not documented** and should not be used. I found the
parser definitions for it in 1.x code, and in modern 4.x code. My guess
is that it\'s there for compatiblity reasons. This syntax is not
is that it's there for compatibility reasons. This syntax is not
specified by POSIX(r).
### Return status
@ -137,13 +137,13 @@ done
This is just an example. In *general*
- it\'s not a good idea to parse `ls(1)` output
- it's not a good idea to parse `ls(1)` output
- the [while loop](../../syntax/ccmd/while_loop.md) (using the `read` command)
is a better choice to iterate over lines
### Nested for-loops
It\'s of course possible to use another for-loop as `<LIST>`. Here,
It's of course possible to use another for-loop as `<LIST>`. Here,
counting from 0 to 99 in a weird way:
``` bash

View File

@ -8,7 +8,7 @@
The conditional expression is meant as the modern variant of the
[classic test command](../../commands/classictest.md). Since it is **not** a
normal command, Bash doesn\'t need to apply the normal commandline
normal command, Bash doesn't need to apply the normal commandline
parsing rules like recognizing `&&` as [command
list](../../syntax/basicgrammar.md#lists) operator.
@ -52,7 +52,7 @@ When the operators `<` and `>` are used (string collation order), the
test happens using the current locale when the `compat` level is greater
than \"40\".
Operator precedence (highest =\> lowest):
Operator precedence (highest => lowest):
- `( <EXPRESSION> )`
- `! <EXPRESSION>`
@ -79,7 +79,7 @@ quoting:
fi
Compare that to the [classic test command](../../commands/classictest.md), where
word splitting is done (because it\'s a normal command, not something
word splitting is done (because it's a normal command, not something
special):
sentence="Be liberal in what you accept, and conservative in what you send"
@ -151,18 +151,18 @@ Example:
### Behaviour differences compared to the builtin test command
As of Bash 4.1 alpha, the test primaries \'\<\' and \'\>\' (compare
As of Bash 4.1 alpha, the test primaries '<' and '>' (compare
strings lexicographically) use the current locale settings, while the
same primitives for the builtin test command don\'t. This leads to the
same primitives for the builtin test command don't. This leads to the
following situation where they behave differently:
$ ./cond.sh
[[ ' 4' < '1' ]] --> exit 1
[[ 'step+' < 'step-' ]] --> exit 1
[ ' 4' \< '1' ] --> exit 0
[ 'step+' \< 'step-' ] --> exit 0
[ ' 4' \< '1' ] --> exit 0
[ 'step+' \< 'step-' ] --> exit 0
It won\'t be aligned. The conditional expression continues to respect
It won't be aligned. The conditional expression continues to respect
the locale, as introduced with 4.1-alpha, the builtin `test`/`[` command
continues to behave differently.
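Note also that inside the classic test command, `<` and `>` must be escaped, or they are parsed as redirections; inside `[[ ]]` no escaping is needed:

``` bash
[[ ' 4' < '1' ]]; echo "$?"   # locale-dependent comparison, no escaping needed
[ ' 4' \< '1' ]; echo "$?"    # 0 - the builtin test compares byte order

# Unescaped, [ ' 4' < '1' ] would make the shell read from a file named "1"!
```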
@ -178,7 +178,7 @@ both contains whitespace and is not the result of an expansion.
## Portability considerations
- `[[ ... ]]` functionality isn\'t specified by POSIX(R), though it\'s
- `[[ ... ]]` functionality isn't specified by POSIX(R), though it's
a reserved word
- Amongst the major \"POSIX-shell superset languages\" (for lack of a
better term) which do have `[[`, the test expression compound
@ -187,12 +187,12 @@ both contains whitespace and is not the result of an expansion.
between Ksh88, Ksh93, mksh, Zsh, and Bash. Ksh93 also adds a large
number of unique pattern matching features not supported by other
shells including support for several different regex dialects, which
are invoked using a different syntax from Bash\'s `=~`, though `=~`
are invoked using a different syntax from Bash's `=~`, though `=~`
is still supported by ksh and defaults to ERE.
- As an extension to POSIX ERE, most GNU software supports
backreferences in ERE, including Bash. According to POSIX, only BRE
is supposed to support them. This requires Bash to be linked against
glibc, so it won\'t necessarily work on all platforms. For example,
glibc, so it won't necessarily work on all platforms. For example,
`$(m='(abc(def))(\1)(\2)'; [[ abcdefabcdefdef =~ $m ]]; printf '<%s> ' $? "${BASH_REMATCH[@]}" )`
will give `<0> <abcdefabcdefdef> <abcdef> <def> <abcdef> <def>`.
- the `=~` (regex) operator was introduced in Bash 3.0, and its
@ -207,5 +207,5 @@ both contains whitespace and is not the result of an expansion.
- Internal: [the classic test command](../../commands/classictest.md)
- Internal: [the if-clause](../../syntax/ccmd/if_clause.md)
- [What is the difference between test, \[ and \[\[
?](http://mywiki.wooledge.org/BashFAQ/031) - BashFAQ 31 - Greg\'s
?](http://mywiki.wooledge.org/BashFAQ/031) - BashFAQ 31 - Greg's
wiki.

View File

@ -31,7 +31,7 @@ The input and output **filedescriptors** are cumulative:
This compound command also usually is the body of a [function
definition](../../syntax/basicgrammar.md#shell_function_definitions), though not
the only compound command that\'s valid there:
the only compound command that's valid there:
print_help() {
echo "Options:"
@ -64,8 +64,8 @@ the only compound command that\'s valid there:
without extra separator (also in some other shells), **example**:
`{ while sleep 1; do echo ZzZzzZ; done }` is valid. But this is not
documented; in fact, the documentation explicitly says that a
semicolon or a newline must separate the enclosed list. \-- thanks
semicolon or a newline must separate the enclosed list. -- thanks
`geirha` at Freenode
[^2]: The main reason is the fact that in shell grammar, the curly
braces are not control operators but reserved words \-- TheBonsai
braces are not control operators but reserved words -- TheBonsai

View File

@ -32,4 +32,4 @@ echo "$PWD" # Still in the original directory.
## See also
- [grouping commands](../../syntax/ccmd/grouping_plain.md)
- [Subshells on Greycat\'s wiki](http://mywiki.wooledge.org/SubShell)
- [Subshells on Greycat's wiki](http://mywiki.wooledge.org/SubShell)

View File

@ -22,7 +22,7 @@
## Description
The `if`-clause can control the script\'s flow (what\'s executed) by
The `if`-clause can control the script's flow (what's executed) by
looking at the exit codes of other commands.
All commandsets `<LIST>` are interpreted as [command
@ -65,19 +65,19 @@ commands of the condition that succeeded.
**Multiple commands as condition**
It\'s perfectly valid to do:
It's perfectly valid to do:
if echo "I'm testing!"; [ -e /some/file ]; then
...
fi
The exit code that dictates the condition\'s value is the exit code of
The exit code that dictates the condition's value is the exit code of
the very last command executed in the condition-list (here: The
`[ -e /some/file ]`)
**A complete pipe as condition**
A complete pipe can also be used as condition. It\'s very similar to the
A complete pipe can also be used as condition. It's very similar to the
example above (multiple commands):
if echo "Hello world!" | grep -i hello >/dev/null 2>&1; then

View File

@ -1,6 +1,6 @@
# Bash compound commands
The main part of Bash\'s syntax are the so-called **compound commands**.
The main part of Bash's syntax are the so-called **compound commands**.
They\'re called like that because they use \"real\" commands ([simple
commands](../../syntax/basicgrammar.md#simple_commands) or
[lists](../../syntax/basicgrammar.md#lists)) and knit some intelligence around

View File

@ -46,7 +46,7 @@ loop body in `{...}` instead of `do ... done`:
This syntax is **not documented** and should not be used. I found the
parser definitions for it in 1.x code, and in modern 4.x code. My guess
is that it\'s there for compatiblity reasons. This syntax is not
is that it's there for compatibility reasons. This syntax is not
specified by POSIX(R).
## Examples

View File

@ -8,7 +8,7 @@ The [arithmetic expression](../../syntax/arith_expr.md) `<EXPRESSION>` is
evaluated and expands to the result. The output of the arithmetic
expansion is guaranteed to be one word and a digit in Bash.
Please **do not use the second form `$[ ... ]`**! It\'s deprecated. The
Please **do not use the second form `$[ ... ]`**! It's deprecated. The
preferred and standardized form is `$(( ... ))`!
Example
@ -26,7 +26,7 @@ function printSum {
}
```
**Note** that in Bash you don\'t need the arithmetic expansion to check
**Note** that in Bash you don't need the arithmetic expansion to check
for the boolean value of an arithmetic expression. This can be done
using the [arithmetic evaluation compound
command](../../syntax/ccmd/arithmetic_eval.md):
@ -58,11 +58,11 @@ echo $(($x[0])) # Error. This expands to $((1[0])), an invalid expression.
## Bugs and Portability considerations
- The original Bourne shell doesn\'t have arithmetic expansions. You
- The original Bourne shell doesn't have arithmetic expansions. You
have to use something like `expr(1)` within backticks instead. Since
`expr` is horrible (as are backticks), and arithmetic expansion is
required by POSIX, you should not worry about this, and preferably
fix any code you find that\'s still using `expr`.
fix any code you find that's still using `expr`.
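A minimal before/after sketch:

``` bash
x=6

# POSIX arithmetic expansion:
echo "$(( x * 7 ))"     # 42

# legacy Bourne-style code you may still find (avoid this):
echo `expr $x \* 7`     # 42
```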
## See also

View File

@ -17,10 +17,10 @@ Brace expansion is used to generate arbitrary strings. The specified
strings are used to generate **all possible combinations** with the
optional surrounding prefixes and suffixes.
Usually it\'s used to generate mass-arguments for a command, that follow
Usually it's used to generate mass arguments for a command that follow
a specific naming-scheme.
:!: It is the very first step in expansion-handling, it\'s important to
:!: It is the very first step in expansion handling; it's important to
understand that. When you use
echo {a,b}$PATH
@ -32,9 +32,9 @@ in a **later step**. Brace expansion just makes it being:
Another common pitfall is to assume that a range like `{1..200}` can be
expressed with variables using `{$a..$b}`. Due to what I described
above, it **simply is not possible**, because it\'s the very first step
above, it **simply is not possible**, because it's the very first step
in doing expansions. A possible way to achieve this, if you really
can\'t handle this in another way, is using the `eval` command, which
can't handle this in another way, is using the `eval` command, which
basically evaluates a commandline twice: `eval echo {$a..$b}` For
instance, when embedded inside a for loop :
`for i in $(eval echo {$a..$b})` This requires that the entire command
@ -78,8 +78,8 @@ With prefix or suffix strings, the result is a space-separated list of
The brace expansion is only performed if the given string list is
really a **list of strings**, i.e., if there is a minimum of one \"`,`\"
(comma)! Something like `{money}` doesn\'t expand to something special,
it\'s really only the text \"`{money}`\".
(comma)! Something like `{money}` doesn't expand to something special,
it's really only the text \"`{money}`\".
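And because brace expansion runs before parameter expansion, a variable range like `{$a..$b}` stays literal, as described earlier; a sketch of the `eval` workaround:

``` bash
a=1 b=3
echo {$a..$b}          # {1..3}  - only parameter expansion happened
eval "echo {$a..$b}"   # 1 2 3   - eval forces a second parsing pass
```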
## Ranges
@ -112,7 +112,7 @@ generated range is zero padded, too:
$ echo {01..10}
01 02 03 04 05 06 07 08 09 10
There\'s a chapter of Bash 4 brace expansion changes at [the end of this
There's a chapter of Bash 4 brace expansion changes at [the end of this
article](#new_in_bash_4.0).
Similar to the expansion using stringlists, you can add prefix and
@ -127,7 +127,7 @@ suffix strings:
## Combining and nesting
When you combine more brace expansions, you effectively use a brace
expansion as prefix or suffix for another one. Let\'s generate all
expansion as prefix or suffix for another one. Let's generate all
possible combinations of uppercase letters and digits:
$ echo {A..Z}{0..9}
@ -147,7 +147,7 @@ Hey.. that **saves you writing** 260 strings!
Brace expansions can be nested, but too much of it usually makes you
lose the overview a bit ;-)
Here\'s a sample to generate the alphabet, first the uppercase letters,
Here's a sample to generate the alphabet, first the uppercase letters,
then the lowercase ones:
$ echo {{A..Z},{a..z}}
@ -160,18 +160,18 @@ then the lowercase ones:
In this example, `wget` is used to download documentation that is split
over several numbered webpages.
`wget` won\'t see your braces. It will see **6 different URLs** to
`wget` won't see your braces. It will see **6 different URLs** to
download.
wget http://docs.example.com/documentation/slides_part{1,2,3,4,5,6}.html
Of course it\'s possible, and even easier, to do that with a sequence:
Of course it's possible, and even easier, to do that with a sequence:
wget http://docs.example.com/documentation/slides_part{1..6}.html
### Generate a subdirectory structure
Your life is hard? Let\'s ease it a bit - that\'s what shells are here
Your life is hard? Let's ease it a bit - that's what shells are here
for.
mkdir /home/bash/test/{foo,bar,baz,cat,dog}
@ -209,12 +209,12 @@ Can be written as
...which is a kind of hack, but hey, it works.
\<div round info\>
<div round info>
#### More fun
The optimal brace expansion to expand n arguments of
course consists of n\'s prime factors. We can use the \"factor\" program
course consists of n's prime factors. We can use the \"factor\" program
bundled with GNU coreutils to emit a brace expansion that will expand
any number of arguments.
@ -234,7 +234,7 @@ In this case, the output is:
eval printf "$arg"{,,}{,,}{,,}{,,}{,,}{,,}{,,,,,}{,,,,,}{,,,,,}{,,,,,}{,,,,,}{,,,,,}
\</div\>
</div>
## New in Bash 4.0

View File

@ -23,7 +23,7 @@ results, quote the command substitution!**
The second form `` `COMMAND` `` is more or less obsolete for Bash, since
it has some trouble with nesting (\"inner\" backticks need to be
escaped) and escaping characters. Use `$(COMMAND)`, it\'s also POSIX!
escaped) and escaping characters. Use `$(COMMAND)`, it's also POSIX!
When you [call an explicit subshell](../../syntax/ccmd/grouping_subshell.md)
`(COMMAND)` inside the command substitution `$()`, then take care, this
@ -51,8 +51,8 @@ command was `cat FILE`.
## A closer look at the two forms
In general you really should only use the form `$()`, it\'s
escaping-neutral, it\'s nestable, it\'s also POSIX. But take a look at
In general you really should only use the form `$()`, it's
escaping-neutral, it's nestable, it's also POSIX. But take a look at
the following code snips to decide yourself which form you need under
specific circumstances:
@ -77,7 +77,7 @@ normal on a commandline. No special escaping of **nothing** is needed:
**[Constructs you should avoid]{.underline}**
It\'s not all shiny with `$()`, at least for my current Bash
It's not all shiny with `$()`, at least for my current Bash
(`3.1.17(1)-release`. :!: [**Update:** Fixed since `3.2-beta` together
with a misinterpretion of \'))\' being recognized as arithmetic
expansion \[by redduck666\]]{.underline}). This command seems to
@ -101,10 +101,10 @@ uncommon ;-)) construct like:
In general, the `$()` should be the preferred method:
- it\'s clean syntax
- it\'s intuitive syntax
- it\'s more readable
- it\'s nestable
- it's clean syntax
- it's intuitive syntax
- it's more readable
- it's nestable
- its inner parsing is separate
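The nesting difference in a nutshell:

``` bash
# $() nests without any extra escaping:
echo "$(echo "outer $(echo inner)")"    # outer inner

# the backtick form needs the inner backticks escaped:
echo "`echo outer \`echo inner\``"      # outer inner
```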
## Examples

View File

@ -26,7 +26,7 @@ other `IFS`-characters they contain.
be matched explicitly
- the dot at the beginning of a filename must be matched explicitly
(also one following a `/` in the glob)
- a glob that doesn\'t match a filename is unchanged and remains what
- a glob that doesn't match a filename is unchanged and remains what
it is
## Customization
@ -42,12 +42,12 @@ other `IFS`-characters they contain.
- when the shell option `dirspell` is set, Bash performs spelling
corrections when matching directory names
- when the shell option `globstar` is set, the glob `**` will
recursively match all files and directories. This glob isn\'t
\"configurable\", i.e. you **can\'t** do something like `**.c` to
recursively match all files and directories. This glob isn't
\"configurable\", i.e. you **can't** do something like `**.c` to
recursively get all `*.c` filenames.
- when the shell option `globasciiranges` is set, the bracket-range
globs (e.g. `[A-Z]`) use C locale order rather than the configured
locale\'s order (i.e. `ABC...abc...` instead of e.g. `AaBbCc...`) -
locale's order (i.e. `ABC...abc...` instead of e.g. `AaBbCc...`) -
since 4.3-alpha
- the variable [GLOBIGNORE](../../syntax/shellvars.md#GLOBIGNORE) can be set
to a colon-separated list of patterns to be removed from the list
@ -80,7 +80,7 @@ with an error, since no file named `*.txt` exists.
Now, when the shell option `nullglob` is set, Bash will remove the
entire glob from the command line. In case of the for-loop here, not
even one iteration will be done. It just won\'t run.
even one iteration will be done. It just won't run.
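A minimal sketch of `nullglob` (the directory name is made up and assumed not to exist):

``` bash
shopt -s nullglob
for f in /no/such/dir/*.txt; do
    echo "would handle: $f"   # never reached: the glob expanded to nothing
done
shopt -u nullglob
```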
So in our first example:

View File

@ -14,20 +14,20 @@ The most simple example of this behaviour is a referenced variable:
mystring="Hello world"
echo "$mystring"
The `echo` program definitely doesn\'t care about what a shell variable
is. It is Bash\'s job to deal with the variable. Bash **expands** the
The `echo` program definitely doesn't care about what a shell variable
is. It is Bash's job to deal with the variable. Bash **expands** the
string \"`$mystring`\" to \"`Hello world`\", so that `echo` will only
see `Hello world`, not the variable or anything else!
After all these expansions and substitutions are done, all quotes that
are not meant literally (i.e., [the quotes that marked contiguous
words](../../syntax/quoting.md), as part of the shell syntax) are removed from
the commandline text, so the called program won\'t see them. This step
the commandline text, so the called program won't see them. This step
is called **quote-removal**.
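Word splitting and quote removal in a nutshell (`printf '<%s>'` shows each argument the command actually received):

``` bash
v='Hello world'
printf '<%s>' "$v"; echo   # <Hello world>  - quoted: one word, quotes removed
printf '<%s>' $v;  echo    # <Hello><world> - unquoted: word splitting applied
```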
## Overview
Saw a possible expansion syntax but don\'t know what it is? Here\'s a
Saw a possible expansion syntax but don't know what it is? Here's a
small list.
- [Parameter expansion](../../syntax/pe.md) (it has its own [overview

View File

@ -24,13 +24,13 @@ is connected to a FIFO or a file in `/dev/fd/`. The filename (where the
filedescriptor is connected) is then used as a substitution for the
`<(...)`-construct.
That, for example, allows to give data to a command that can\'t be
reached by pipelining (that doesn\'t expect its data from `stdin` but
That, for example, allows to give data to a command that can't be
reached by pipelining (that doesn't expect its data from `stdin` but
from a file).
### Scope
<note important> Note: According to multiple comments and sources, the
scope of process substitution file descriptors is **not** stable,
guaranteed, or specified by bash. Newer versions of bash (5.0+) seem to
have shorter scope, and substitutions scope seems to be shorter than
@ -39,7 +39,7 @@ function scope. See
and
[stackoverflow](https://stackoverflow.com/questions/46660020/bash-what-is-the-scope-of-the-process-substitution);
the latter discussion contains a script that can test the scoping
behavior case-by-case </note>
If a process substitution is expanded as an argument to a function,
expanded to an environment variable during calling of a function, or
@ -52,8 +52,8 @@ the caller when the callee returns.
In essence, process substitutions expanded to variables within functions
remain open until the function in which the process substitution occurred
returns - even when assigned to locals that were set by a function's
caller. Dynamic scope doesn't protect them from closing.
## Examples
@ -74,7 +74,7 @@ diff <(ls "$first_directory") <(ls "$second_directory")
```
This will compare the contents of each directory. In this command, each
*process* is *substituted* for a *file*, and diff doesn't see <(bla),
it sees two files, so the effective command is something like
``` bash
@ -85,10 +85,10 @@ where those files are written to and destroyed automatically.
### Avoiding subshells
<WRAP center round info 60%> See Also:
[BashFAQ/024](http://mywiki.wooledge.org/BashFAQ/024) -- *I set
variables in a loop that's in a pipeline. Why do they disappear after
the loop terminates? Or, why can't I pipe data to read?* </WRAP>
One of the most common uses for process substitutions is to avoid the
final subshell that results from executing a pipeline. The following is
@ -124,7 +124,7 @@ echo "$counter files"
```
This is the normal input file redirection `< FILE`, just that the `FILE`
in this case is the result of process substitution. It's important to
note that the space is required in order to disambiguate the syntax from
[here documents](../../syntax/redirection.md#here_documents).
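A small sketch of this redirection form, feeding a loop in the current shell (the input data is made up with `printf`):

``` bash
counter=0
while IFS= read -r line; do
  counter=$((counter + 1))
done < <(printf '%s\n' one two three)   # note the space between < and <( ... )

echo "$counter lines"   # prints "3 lines" - the loop ran in the current shell
```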
@ -138,7 +138,7 @@ note that the space is required in order to disambiguate the syntax from
This example demonstrates how process substitutions can be made to
resemble \"passable\" objects. This results in converting the output of
`f`'s argument to uppercase.
``` bash
f() {
@ -159,7 +159,7 @@ See the above section on [#scope](#scope)
- Process substitution is supported only on systems that support
either named pipes (FIFO - a [special
file](../../dict/terms/special_file.md)) or the `/dev/fd/*` method for
accessing open files. If the system doesn't support `/dev/fd/*`,
Bash falls back to creating named pipes. Note that not all shells
that support process substitution have that fallback.
- Bash evaluates process substitutions within array indices, but not

View File

@ -23,7 +23,7 @@ The tilde expansion is used to expand to several specific pathnames:
Tilde expansion is only performed, when the tilde-construct is at the
beginning of a word, or a separate word.
If there's nothing to expand, i.e., in case of a wrong username or any
other error condition, the tilde construct is not replaced, it stays
what it is.
Tilde expansion is also performed every time a variable is assigned:
- after **every** `:` (colon) in the assigned value:
`TARGET=file:~moonman/share`
<note info> As of now (Bash 4.3-alpha) the following constructs
**also** works, though it's not a variable assignment:
echo foo=~
echo foo=:~
I don't know yet, if this is a bug or intended. </note>
This way you can correctly use the tilde expansion in your
[PATH](../../syntax/shellvars.md#PATH):
@ -60,15 +60,15 @@ This way you can correctly use the tilde expansion in your
This form expands to the home-directory of the current user (`~`) or the
home directory of the given user (`~<NAME>`).
If the given user doesn't exist (or if his home directory isn't
determinable, for some reason), it doesn't expand to something else, it
stays what it is. The requested home directory is found by asking the
operating system for the associated home directory for `<NAME>`.
To find the home directory of the current user (`~`), Bash has a
precedence:
- expand to the value of [HOME](../../syntax/shellvars.md#HOME) if it's
defined
- expand to the home directory of the user executing the shell
(operating system)
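That precedence can be checked directly; the path and user name below are invented (the second example assumes no user of that name exists on the system):

``` bash
# HOME wins over the passwd database for a bare tilde
HOME=/tmp/fakehome
echo ~                        # prints /tmp/fakehome

# an unknown user name leaves the construct untouched
echo ~hopefully_no_such_user  # prints ~hopefully_no_such_user, unchanged
```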

View File

@ -18,9 +18,9 @@ are **not double-quoted**!
The `IFS` variable holds the characters that Bash sees as word
boundaries in this step. The default contains the characters
- <space>
- <tab>
- <newline>
These characters are also assumed when IFS is **unset**. When `IFS` is
**empty** (nullstring), no word splitting is performed at all.
@ -31,14 +31,14 @@ The results of the expansions mentioned above are scanned for
`IFS`-characters. If **one or more** (in a sequence) of them is found,
the expansion result is split at these positions into multiple words.
This doesn't happen when the expansion results were **double-quoted**.
When a null-string (e.g., something that before expanded to
>>nothing<<) is found, it is removed, unless it is quoted (`''` or
`""`).
[**Again note:**]{.underline} Without any expansion beforehand, Bash
won't perform word splitting! In this case, the initial token parsing
is solely responsible.
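The effect of `IFS` and of double quotes can be made visible by counting the resulting words (the sample string is made up):

``` bash
data="a:b:c"

set -- $data     # default IFS: ':' is not a separator
echo $#          # 1

IFS=':'
set -- $data     # now the unquoted expansion splits into three words
echo $#          # 3

set -- "$data"   # double quotes suppress word splitting entirely
echo $#          # 1
unset IFS        # back to the default behavior
```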
## See also
@ -49,4 +49,4 @@ is solely responsible.
- [WordSplitting](http://mywiki.wooledge.org/WordSplitting),
[IFS](http://mywiki.wooledge.org/IFS), and
[DontReadLinesWithFor](http://mywiki.wooledge.org/DontReadLinesWithFor) -
Greg's wiki

View File

@ -10,12 +10,12 @@ evaluate and execute is the simple command.
## Simple command expansion
<div center round info 60%>
- <http://lists.gnu.org/archive/html/bug-bash/2013-01/msg00040.html>
- <http://lists.research.att.com/pipermail/ast-developers/2013q2/002456.html>
</div>
This step happens after the initial command line splitting.
@ -69,8 +69,8 @@ Otherwise, if a command name results:
entire script to terminate*
The behavior regarding the variable assignment errors can be tested:
<div center round info
60%><http://lists.gnu.org/archive/html/bug-bash/2013-01/msg00054.html></div>
**[This one exits the script completely]{.underline}**
@ -113,7 +113,7 @@ As of Bash Version 4, when a command search fails, the shell executes a
shell function named `command_not_found_handle()` using the failed
command as arguments. This can be used to provide user friendly messages
or install software packages etc. Since this function runs in a separate
execution environment, you can't really influence the main shell with
it (changing directory, setting variables).
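A minimal handler sketch (Bash >= 4.0) - the function name is fixed by Bash, but the message and the exit status chosen here are our own assumptions, not a prescribed convention beyond 127 meaning "command not found":

``` bash
command_not_found_handle() {
  printf 'sorry, "%s" is not installed\n' "$1" >&2
  return 127   # conventional "command not found" status
}

# a deliberately nonexistent command name triggers the handler
this_command_does_not_exist_123 || echo "handler returned $?"
```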
FIXME to be continued

View File

@ -56,7 +56,7 @@ redirected to the pipe.
#### Avoid the final pipeline subshell
The traditional Ksh workaround to avoid the subshell when doing
`command | while read` is to use a coprocess. Unfortunately, Bash's
behavior differs.
In Ksh you would do:
@ -81,14 +81,14 @@ bash: read: line: invalid file descriptor specification
By the time we start reading from the output of the coprocess, the file
descriptor has been closed.
See [this FAQ entry on Greg's
wiki](http://mywiki.wooledge.org/BashFAQ/024) for other pipeline
subshell workarounds.
#### Buffering
In the first example, we use GNU awk's `fflush()` command. As always, when
you use pipes the I/O operations are buffered. Let's see what happens
with `sed`:
``` bash
@ -100,9 +100,9 @@ nothing read
```
Even though this example is the same as the first `awk` example, the
`read` doesn't return because the output is waiting in a buffer.
See [this faq entry on Greg's
wiki](http://mywiki.wooledge.org/BashFAQ/009) for some workarounds and
more information on buffering issues.
@ -147,9 +147,9 @@ Here, fd 3 is inherited.
### Anonymous Coprocess
Unlike ksh, Bash doesn't have true anonymous coprocesses. Instead, Bash
assigns FDs to a default array named `COPROC` if no `NAME` is supplied.
Here's an example:
``` bash
$ coproc awk '{print "foo" $0;fflush()}'
@ -171,7 +171,7 @@ $ IFS= read -ru ${COPROC[0]} x; printf '%s\n' "$x"
foobar
```
When we don't need our command anymore, we can kill it via its pid:
$ kill $COPROC_PID
$
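A self-contained round-trip through an anonymous coprocess (Bash >= 4.0); `cat` is used here instead of awk simply because it echoes every line back without any output-buffering tricks:

``` bash
coproc cat

printf 'hello\n' >&"${COPROC[1]}"     # write to the coprocess's stdin
IFS= read -r reply <&"${COPROC[0]}"   # read its answer back
echo "$reply"                         # prints "hello"

eval "exec ${COPROC[1]}>&-"           # closing its stdin lets cat terminate
```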
@ -209,10 +209,10 @@ exec >&${tee[1]} 2>&1
- The `coproc` keyword is not specified by POSIX(R)
- The `coproc` keyword appeared in Bash version 4.0-alpha
- The `-p` option to Bash's `print` loadable is a NOOP and not
connected to Bash coprocesses in any way. It is only recognized as
an option for ksh compatibility, and has no effect.
- The `-p` option to Bash's `read` builtin conflicts with that of all
kshes and zsh. The equivalent in those shells is to add a `\?prompt`
suffix to the first variable name argument to `read`. i.e., if the
first variable name given contains a `?` character, the remainder of
@ -227,23 +227,23 @@ which all do approximately the same thing. ksh93 and mksh have virtually
identical syntax and semantics for coprocs. A *list* operator: `|&` is
added to the language which runs the preceding *pipeline* as a coprocess
(This is another reason not to use the special `|&` pipe operator in
Bash -- its syntax is conflicting). The `-p` option to the `read` and
`print` builtins can then be used to read and write to the pipe of the
coprocess (whose FD isn't yet known). Special redirects are added to
move the last spawned coprocess to a different FD: `<&p` and `>&p`, at
which point it can be accessed at the new FD using ordinary redirection,
and another coprocess may then be started, again using `|&`.
zsh coprocesses are very similar to ksh except in the way they are
started. zsh adds the shell reserved word `coproc` to the pipeline
syntax (similar to the way Bash's `time` keyword works), so that the
pipeline that follows is started as a coproc. The coproc's input and
output FDs can then be accessed and moved using the same `read`/`print`
`-p` and redirects used by the ksh shells.
It is unfortunate that Bash chose to go against existing practice in
their coproc implementation, especially considering it was the last of
the major shells to incorporate this feature. However, Bash's method
accomplishes the same without requiring nearly as much additional
syntax. The `coproc` keyword is easy enough to wrap in a function such
that it takes Bash code as an ordinary argument and/or stdin like
@ -261,10 +261,10 @@ The ability to use multiple coprocesses in Bash is considered
\"experimental\". Bash will throw an error if you attempt to start more
than one. This may be overridden at compile-time with the
`MULTIPLE_COPROCS` option. However, at this time there are still issues
-- see the above mailing list discussion.
## See also
- [Anthony Thyssen's Coprocess
Hints](http://www.ict.griffith.edu.au/anthony/info/shell/co-processes.hints) -
excellent summary of everything around the topic

View File

@ -14,7 +14,7 @@ A pattern is a **string description**. Bash uses them in various ways:
- Pattern-based branching using the [case command](../syntax/ccmd/case.md)
The pattern description language is relatively easy. Any character
that's not mentioned below matches itself. The `NUL` character may not
occur in a pattern. If special characters are quoted, they\'re matched
literally, i.e., without their special meaning.
@ -28,7 +28,7 @@ share some symbols and do similar matching work.
`*` Matches **any string**, including the null string (empty string)
`?` Matches any **single character**
`X` Matches the character `X` which can be any character that has no special meaning
`\X` Matches the character `X`, where the character's special meaning is stripped by the backslash
`\\` Matches a backslash
`[...]` Defines a pattern **bracket expression** (see below). Matches any of the enclosed characters at this position.
@ -55,23 +55,23 @@ Some simple examples using normal pattern matching:
- Pattern `"Hello world"` matches
- `Hello world`
- Pattern `[Hh]"ello world"` matches
- => `Hello world`
- => `hello world`
- Pattern `Hello*` matches (for example)
- => `Hello world`
- => `Helloworld`
- => `HelloWoRlD`
- => `Hello`
- Pattern `Hello world[[:punct:]]` matches (for example)
- => `Hello world!`
- => `Hello world.`
- => `Hello world+`
- => `Hello world?`
- Pattern
`[[.backslash.]]Hello[[.vertical-line.]]world[[.exclamation-mark.]]`
matches (using [collation
symbols](https://pubs.opengroup.org/onlinepubs/009696899/basedefs/xbd_chap07.html#tag_07_03_02_04))
- => `\Hello|world!`
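The first few of these patterns can be tried out in a `case` command (the `classify` helper is only for illustration):

``` bash
# classify a string with the patterns from the examples above
classify() {
  case $1 in
    [Hh]"ello world") echo greeting ;;
    Hello*)           echo prefix   ;;
    *)                echo no-match ;;
  esac
}

classify "hello world"   # prints "greeting"
classify "HelloWoRlD"    # prints "prefix"
classify "Goodbye"       # prints "no-match"
```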
## Extended pattern language
@ -127,7 +127,7 @@ least) ksh93 and zsh translate patterns into regexes and then use a
regex compiler to emit and cache optimized pattern matching code. This
means Bash may be an order of magnitude or more slower in cases that
involve complex back-tracking (usually that means extglob quantifier
nesting). You may wish to use Bash's regex support (the `=~` operator)
if performance is a problem, because Bash will use your C library regex
implementation rather than its own pattern matcher.
@ -141,8 +141,8 @@ to those described above.
\* ksh93 supports arbitrary quantifiers just like ERE using the
`{from,to}(pattern-list)` syntax. `{2,4}(foo)bar` matches between 2-4
\"foo\"\'s followed by \"bar\". `{2,}(foo)bar` matches 2 or more
\"foo\"\'s followed by \"bar\". You can probably figure out the rest. So
\"foo\"'s followed by \"bar\". `{2,}(foo)bar` matches 2 or more
\"foo\"'s followed by \"bar\". You can probably figure out the rest. So
far, none of the other shells support this syntax.
\* In ksh93, a `pattern-list` may be delimited by either `&` or `|`. `&`
@ -155,7 +155,7 @@ double extglob negation. The aforementioned ksh93 pattern is equivalent
in Bash to: `[[ fo0bar == !(!(fo[0-9])|!(+([[:alnum:]])))bar ]]`, which
is technically more portable, but ugly.
\* ksh93's [printf](../commands/builtin/printf.md) builtin can translate from
shell patterns to ERE and back again using the `%R` and `%P` format
specifiers respectively.

View File

@ -84,7 +84,7 @@ Looking for a specific syntax you saw, without knowing the name?
`${PARAMETER}`
The easiest form is to just use a parameter's name within braces. This
is identical to using `$FOO` like you see it everywhere, but has the
advantage that it can be immediately followed by characters that would
be interpreted as part of the parameter name otherwise. Compare these
@ -168,9 +168,9 @@ It was an unfortunate design decision to use the `!` prefix for
indirection, as it introduces parsing ambiguity with other parameter
expansions that begin with `!`. Indirection is not possible in
combination with any parameter expansion whose modifier requires a
prefix to the parameter name. Specifically, indirection isn't possible
on the `${!var@}`, `${!var*}`, `${!var[@]}`, `${!var[*]}`, and `${#var}`
forms. This means the `!` prefix can't be used to retrieve the indices
of an array, the length of a string, or number of elements in an array
indirectly (see [syntax/arrays#indirection](../syntax/arrays.md#indirection)
for workarounds). Additionally, the `!`-prefixed parameter expansion
@ -208,14 +208,14 @@ The `^` operator modifies the first character to uppercase, the `,`
operator to lowercase. When using the double-form (`^^` and `,,`), all
characters are converted.
<wrap center round info 60%>
The (**currently undocumented**) operators `~` and `~~` reverse the case
of the given text (in `PARAMETER`).`~` reverses the case of first letter
of words in the variable while `~~` reverses case for all. Thanks to
`Bushmills` and `geirha` on the Freenode IRC channel for this finding.
</wrap>
[**Example: Rename all `*.txt` filenames to lowercase**]{.underline}
@ -245,13 +245,13 @@ examples:
Assume: `array=(This is some Text)`
- `echo "${array[@],}"`
- => `this is some text`
- `echo "${array[@],,}"`
- => `this is some text`
- `echo "${array[@]^}"`
- => `This Is Some Text`
- `echo "${array[@]^^}"`
- => `THIS IS SOME TEXT`
```{=html}
<!-- -->
@ -267,7 +267,7 @@ Assume: `array=(This is some Text)`
This expands to a list of all set **variable names** beginning with the
string `PREFIX`. The elements of the list are separated by the first
character in the `IFS`-variable (<space> by default).
This will show all defined variable names (not values!) beginning with
\"BASH\":
@ -287,7 +287,7 @@ This list will also include [array names](../syntax/arrays.md).
`${PARAMETER%%PATTERN}`
This one can **expand only a part** of a parameter's value, **given a
pattern to describe what to remove** from the string. The pattern is
interpreted just like a pattern to describe a filename to match
(globbing). See [Pattern matching](../syntax/pattern.md) for more.
@ -336,16 +336,16 @@ filename**. Just look at the following list with examples:
- **Get name without extension**
- `${FILENAME%.*}`
- => `bash_hackers.txt`
- **Get extension**
- `${FILENAME##*.}`
- => `bash_hackers.txt`
- **Get directory name**
- `${PATHNAME%/*}`
- => `/home/bash/bash_hackers.txt`
- **Get filename**
- `${PATHNAME##*/}`
- => `/home/bash/bash_hackers.txt`
These are the syntaxes for filenames with a single extension. Depending
on your needs, you might need to adjust shortest/longest match.
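Those four expansions together, as one runnable snippet (the path is illustrative; nothing is read from disk):

``` bash
FILENAME=bash_hackers.txt
PATHNAME=/home/bash/bash_hackers.txt

echo "${FILENAME%.*}"    # bash_hackers      (name without extension)
echo "${FILENAME##*.}"   # txt               (extension)
echo "${PATHNAME%/*}"    # /home/bash        (directory part)
echo "${PATHNAME##*/}"   # bash_hackers.txt  (filename part)
```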
@ -362,7 +362,7 @@ expansion):
Assume: `array=(This is a text)`
- `echo "${array[@]%is}"`
- => `Th a text`
- (it was: `This is a text`)
All other variants of this expansion behave the same.
@ -392,31 +392,31 @@ The first one (*one slash*) is to only substitute **the first
occurrence** of the given pattern, the second one (*two slashes*) is to
substitute **all occurrences** of the pattern.
First, let's try to say \"happy\" instead of \"conservative\" in our
example string:
${MYSTRING//conservative/happy}
=>
`Be liberal in what you accept, and conservativehappy in what you send`
Since there is only one \"conservative\" in that example, it really
doesn't matter which of the two forms we use.
Let's play with the word \"in\", I don't know if it makes any sense,
but let's substitute it with \"by\".
[**First form: Substitute first occurrence**]{.underline}
${MYSTRING/in/by}
=> `Be liberal inby what you accept, and conservative in what you send`
[**Second form: Substitute all occurrences**]{.underline}
${MYSTRING//in/by}
=>
`Be liberal inby what you accept, and conservative inby what you send`
[**Anchoring**]{.underline} Additionally you can \"anchor\" an
@ -447,28 +447,28 @@ A simple example, changing the (lowercase) letter `t` to `d`:
Assume: `array=(This is a text)`
- `echo "${array[@]/t/d}"`
- => `This is a dext`
- `echo "${array[@]//t/d}"`
- => `This is a dexd`
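The substitution forms from this section, condensed into one snippet (using the same quote as the running example):

``` bash
MYSTRING="Be liberal in what you accept, and conservative in what you send"

echo "${MYSTRING/in/by}"    # first occurrence only
echo "${MYSTRING//in/by}"   # every occurrence
echo "${MYSTRING/#Be/BE}"   # anchored: only at the very beginning of the string
```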
## String length
`${#PARAMETER}`
When you use this form, the length of the parameter's value is
expanded. Again, a quote from a big man, to have a test text:
MYSTRING="Be liberal in what you accept, and conservative in what you send"
Using echo `${#MYSTRING}`...
=> `64`
The length is reported in characters, not in bytes. Depending on your
environment this may not always be the same (multibyte-characters, like
in UTF8 encoding).
There's not much to say about it, mh?
### (String) length: Arrays
@ -484,14 +484,14 @@ Example:
Assume: `array=(This is a text)`
- `echo ${#array[1]}`
- => 2 (the word \"is\" has a length of 2)
- `echo ${#array[@]}`
- => 4 (the array contains 4 elements)
[**Attention:**]{.underline} The number of used elements does not need
to conform to the highest index. Sparse arrays are possible in Bash,
that means you can have 4 elements, but with indexes 1, 7, 20, 31. **You
can't loop through such an array with a counter loop based on the
number of elements!**
## Substring expansion
@ -500,10 +500,10 @@ number of elements!**
`${PARAMETER:OFFSET:LENGTH}`
This one can expand only a **part** of a parameter's value, given a
**position to start** and maybe a **length**. If `LENGTH` is omitted,
the parameter will be expanded up to the end of the string. If `LENGTH`
is negative, it's taken as a second offset into the string, counting
from the end of the string.
`OFFSET` and `LENGTH` can be **any** [arithmetic
@ -520,7 +520,7 @@ that the offset 0 is the first character:
echo ${MYSTRING:35}
=>
`<del>Be liberal in what you accept, and </del>conservative in what you send`
### Using Offset and Length
@ -529,12 +529,12 @@ In the second form we also give a length value:
echo ${MYSTRING:35:12}
=>
`<del>Be liberal in what you accept, and </del>conservative<del> in what you send</del>`
### Negative Offset Value
If the given offset is negative, it's counted from the end of the
string, i.e. an offset of -1 is the last character. In that case, the
length still counts forward, of course. One special thing is to do when
using a negative offset: You need to separate the (negative) number from
@ -543,18 +543,18 @@ the colon:
${MYSTRING: -10:5}
${MYSTRING:(-10):5}
Why? Because it's interpreted as the parameter expansion syntax to [use
a default value](../syntax/pe.md#use_a_default_value).
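The disambiguation can be seen side by side (same running example string):

``` bash
MYSTRING="Be liberal in what you accept, and conservative in what you send"

echo "${MYSTRING: -4}"    # send - space before the minus sign
echo "${MYSTRING:(-4)}"   # send - parentheses work as well
echo "${MYSTRING:-4}"     # the whole string: parsed as "use 4 as default value",
                          # and MYSTRING is set and non-empty
```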
### Negative Length Value
If the `LENGTH` value is negative, it's used as offset from the end of
the string. The expansion happens from the first to the second offset
then:
echo "${MYSTRING:11:-17}"
=>
`<del>Be liberal </del>in what you accept, and conservative<del> in what you send</del>`
This works since Bash 4.2-alpha, see also
@ -575,9 +575,9 @@ Example:
Assume: `array=(This is a text)`
- `echo ${array[0]:2:2}`
- => `is` (the \"is\" in \"This\", array element 0)
- `echo ${array[@]:1:2}`
- => `is a` (from element 1 inclusive, 2 elements are expanded,
i.e. element 1 and 2)
## Use a default value
@ -603,7 +603,7 @@ useful, you need to put that parameter syntax in.
read -p "Enter your gender (just press ENTER to not tell us): " GENDER
echo "Your gender is ${GENDER:-a secret}."
It will print \"Your gender is a secret.\" when you don\'t enter the
It will print \"Your gender is a secret.\" when you don't enter the
gender. Note that the default value is **used on expansion time**, it is
**not assigned to the parameter**.
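Both points can be verified without the interactive `read` (the second line uses the `+` and `-` forms to show that nothing was assigned):

``` bash
unset GENDER
echo "Your gender is ${GENDER:-a secret}."   # prints "Your gender is a secret."

# the default was only expanded, never assigned:
echo "GENDER is ${GENDER+set}${GENDER-still unset}"   # prints "GENDER is still unset"
```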
@ -614,7 +614,7 @@ have to make a difference between expanding an individual element by a
given index and mass-expanding the array using the `@` and `*`
subscripts.
- For individual elements, it's the very same: If the expanded
element is `NULL` or unset (watch the `:-` and `-` variants), the
default text is expanded
- For mass-expansion syntax, the default text is expanded if the array
@ -683,7 +683,7 @@ assigned when the parameter was **unset**.
After the first expansion here (`${HOME:=/home/$USER}`), `HOME` is set
and usable.
Let's change our code example from above:
#!/bin/bash
@ -696,7 +696,7 @@ Let\'s change our code example from above:
For [arrays](../syntax/arrays.md) this expansion type is limited. For an
individual index, it behaves like for a \"normal\" parameter, the
default value is assigned to this one element. The mass-expansion
subscripts `@` and `*` **can not be used here** because it's not
possible to assign to them!
## Use an alternate value
@ -706,7 +706,7 @@ possible to assign to them!
`${PARAMETER+WORD}`
This form expands to nothing if the parameter is unset or empty. If it
is set, it does not expand to the parameter's value, **but to some text
you can specify**:
echo "The Java application was installed and can be started.${JAVAPATH:+ NOTE: JAVAPATH seems to be set}"
@ -802,7 +802,7 @@ Removing the first 6 characters from a text string:
- **Fixed in 4.2.36**
([patch](ftp://ftp.cwru.edu/pub/bash/bash-4.2-patches/bash42-036)).
Bash doesn't follow either POSIX or its own documentation when
expanding either a quoted `"$@"` or `"${arr[@]}"` with an adjacent
expansion. `"$@$x"` expands in the same way as `"$*$x"` - i.e. all
parameters plus the adjacent expansion are concatenated into a
@ -857,7 +857,7 @@ Removing the first 6 characters from a text string:
`When `IFS` is set to a non-null value, or unset, all shells behave
the same - first expanding into separate args, then applying
pathname expansion and word-splitting to the results, except for
zsh, which doesn't do pathname expansion in its default mode.
```{=html}
<!-- -->
@ -884,7 +884,7 @@ Removing the first 6 characters from a text string:
ksh : <a> <b> <ca> <b> <c> <a b c>
zsh : <a> <b> <ca> <b> <c> <a-b-c>
`ksh93 and mksh can additionally achieve this side effect (and
others) via the `${ cmds;}` expansion. I haven't yet tested every
possible side-effect that can affect expansion halfway through
expansion that way.
@ -900,7 +900,7 @@ Removing the first 6 characters from a text string:
```{=html}
<!-- -->
```
- Bash (and most other shells) don't allow .'s in identifiers. In
ksh93, dots in variable names are used to reference methods (i.e.
\"Discipline Functions\"), attributes, special shell variables, and
to define the \"real value\" of an instance of a class.
@ -920,7 +920,7 @@ Removing the first 6 characters from a text string:
- Bash only evaluates the subscripts of the slice expansion
(`${x:y:z}`) if the parameter is set (for both nested expansions and
arithmetic). For ranges, Bash evaluates as little as possible, i.e.,
if the first part is out of range, the second won't be evaluated.
ksh93 and mksh always evaluate the subscript parts even if the
parameter is unset.
` $ bash -c 'n="y[\$(printf yo >&2)1]" m="y[\$(printf jo >&2)1]"; x=(); echo "${x[@]:n,6:m}"' # No output
@ -1030,7 +1030,7 @@ it. Bash will actually expand the command as one of these:
To the best of my knowledge, ksh93 is the only shell that acts
differently. Rather than forcing nested expansions into quoting, a quote
at the beginning and end of the nested region will cause the quote state
to reverse itself within the nested part. I have no idea whether it's
an intentional or documented effect, but it does solve the problem and
consequently adds a lot of potential power to these expansions.

View File

@ -43,12 +43,12 @@ backslash:
echo \$HOME is set to \"$HOME\"
- `\$HOME` won't expand because it's not in variable-expansion
syntax anymore
- The backslash changes the quotes into literals - otherwise Bash
would interpret them
The sequence `\<newline>` (an unquoted backslash, followed by a
`<newline>` character) is interpreted as **line continuation**. It is
removed from the input stream and thus effectively ignored. Use it to
beautify your code:
@ -68,7 +68,7 @@ meaning to bash. [Exception:]{.underline} Inside a single-quoted string
## Weak quoting
Inside a weak-quoted string there\'s **no special interpretation of**:
Inside a weak-quoted string there's **no special interpretation of**:
- spaces as word-separators (on initial command line splitting and on
[word splitting](../syntax/expansion/wordsplit.md)!)
@ -88,7 +88,7 @@ unless you have a file named `*`, spit out an error.
echo "Your PATH is: $PATH"
Will work as expected. `$PATH` is expanded, because it\'s double (weak)
Will work as expected. `$PATH` is expanded, because it's double (weak)
quoted.
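A small sketch of what weak quoting buys you once a value contains spaces (the variable name and path are made up for illustration):

```shell
dir="/tmp/my dir"            # hypothetical path containing a space

# Unquoted: word splitting produces two separate arguments
printf '<%s>\n' $dir

# Weak-quoted: the expansion survives as one word
printf '<%s>\n' "$dir"
```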
If a backslash occurs inside double quotes ("weak quoting"), there are 2
@ -112,8 +112,8 @@ single-quote that closes the string.
echo 'Your PATH is: $PATH'
`$PATH` won\'t be expanded, it\'s interpreted as ordinary text because
it\'s surrounded by strong quotes.
`$PATH` won't be expanded, it's interpreted as ordinary text because
it's surrounded by strong quotes.
In practice that means: to produce a text like `Here's my test...` as a
single-quoted string, you have to leave and re-enter the single quoting
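One way to spell that out (a sketch; the `\'` supplies the literal single quote between the two strong-quoted parts):

```shell
# 'Here' ends the first string, \' is a literal single quote,
# 's my test...' starts a new strong-quoted string; Bash joins
# the three adjacent parts into one word.
echo 'Here'\''s my test...'
```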
@ -222,7 +222,7 @@ is seen as **one word**. The for loop iterates exactly one time, with
The command `test` or `[ ... ]` ([the classic test
command](../commands/classictest.md)) is an ordinary command, so ordinary
syntax rules apply. Let\'s take string comparison as an example:
syntax rules apply. Let's take string comparison as an example:
[ WORD = WORD ]
@ -232,7 +232,7 @@ writing this as a test command it would be:
test WORD = WORD
When you compare variables, it\'s wise to quote them. Let\'s create a
When you compare variables, it's wise to quote them. Let's create a
test string with spaces:
mystring="my string"
@ -259,8 +259,8 @@ Now the command has three parameters, which makes sense for a binary
(two argument) operator.
**[Hint:]{.underline}** Inside the [conditional
expression](../syntax/ccmd/conditional_expression.md) (`[[ ]]`) Bash doesn\'t
perform word splitting, and thus you don\'t need to quote your variable
expression](../syntax/ccmd/conditional_expression.md) (`[[ ]]`) Bash doesn't
perform word splitting, and thus you don't need to quote your variable
references - they are always seen as "one word".
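A small sketch of the difference (the variable name is illustrative):

```shell
mystring="my string"

# Inside [[ ]] the unquoted expansion is not word-split,
# so this comparison works without quoting the variable:
if [[ $mystring = "my string" ]]; then
    echo "match"
fi
```

With the classic `[ ... ]` command, the same unquoted expansion would split into two words and produce a syntax error.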
## See also

View File

@ -1,8 +1,8 @@
# Redirection
\<wrap left todo\>Fix me: To be continued\</wrap\>\
<wrap left todo>Fix me: To be continued</wrap>\
Redirection makes it possible to control where the output of a command
goes to, and where the input of a command comes from. It\'s a mighty
goes to, and where the input of a command comes from. It's a mighty
tool that, together with pipelines, makes the shell powerful. The
redirection operators are checked whenever a [simple command is about to
be executed](../syntax/grammar/parser_exec.md).
@ -16,9 +16,9 @@ file descriptors 0, 1 and 2, all connected to your terminal:
`stdout` 1 standard output stream (e.g. monitor)
`stderr` 2 standard error output stream (usually also on monitor)
\<wrap center info\>The terms \"monitor\" and \"keyboard\" refer to the
<wrap center info>The terms "monitor" and "keyboard" refer to the
same device, the **terminal** here. Check your preferred UNIX(r)-FAQ for
details, I\'m too lazy to explain what a terminal is ;-) \</wrap\>
details, I'm too lazy to explain what a terminal is ;-) </wrap>
Both `stdout` and `stderr` are output file descriptors. Their
difference is the **convention** that a program outputs payload on
@ -45,9 +45,9 @@ these examples are equivalent:
cat >new.txt foo.txt bar.txt
>new.txt cat foo.txt bar.txt
\<wrap center important\>Every redirection operator takes one or two
<wrap center important>Every redirection operator takes one or two
words as operands. If you have to use operands (e.g. filenames to
redirect to) that contain spaces you **must** quote them!\</wrap\>
redirect to) that contain spaces you **must** quote them!</wrap>
## Valid redirection targets and sources
@ -99,16 +99,16 @@ truncated** before writing starts.
>& TARGET
This special syntax redirects both `stdout` and `stderr` to the
specified target. It\'s **equivalent** to
specified target. It's **equivalent** to
> TARGET 2>&1
Since Bash 4, there\'s `&>>TARGET`, which is equivalent to
Since Bash 4, there's `&>>TARGET`, which is equivalent to
`>> TARGET 2>&1`.
\<wrap center important\>This syntax is deprecated and should not be
<wrap center important>This syntax is deprecated and should not be
used. See the page about [obsolete and deprecated
syntax](../scripting/obsolete.md).\</wrap\>
syntax](../scripting/obsolete.md).</wrap>
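A sketch of the preferred, portable spelling (the target file is created with `mktemp` purely for illustration):

```shell
log=$(mktemp)   # illustrative target file

# Redirect stdout first, then duplicate stderr onto stdout's
# target -- the portable equivalent of the deprecated &>.
{ echo "payload"; echo "oops" >&2; } > "$log" 2>&1
```

After running this, both the payload line and the error line end up in the same file.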
## Appending redirected output and error output
@ -134,7 +134,7 @@ omitted, filedescriptor 0 (`stdin`) is assumed.
## Here documents
\<BOOKMARK:tag_heredoc\>
<BOOKMARK:tag_heredoc>
<<TAG
...
@ -146,7 +146,7 @@ omitted, filedescriptor 0 (`stdin`) is assumed.
A here-document is an input redirection using source data specified
directly at the command line (or in the script), no "external" source.
The redirection-operator `<<` is used together with a tag `TAG` that\'s
The redirection-operator `<<` is used together with a tag `TAG` that's
used to mark the end of input later:
# display help
@ -178,11 +178,11 @@ here-documents.
The tag you use **must** be the only word in the line, to be recognized
as the end-of-here-document marker.
\<wrap center info\>It seems that here-documents (tested on versions
<wrap center info>It seems that here-documents (tested on versions
`1.14.7`, `2.05b` and `3.1.17`) are correctly terminated when there is
an EOF before the end-of-here-document tag. The reason is unknown, but
it seems to be done on purpose. Bash 4 introduced a warning message when
end-of-file is seen before the tag is reached.\</wrap\>
end-of-file is seen before the tag is reached.</wrap>
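A minimal here-document sketch (the tag name `EOF` is conventional, not required):

```shell
# Everything up to the line consisting solely of the tag
# becomes standard input for cat.
cat <<EOF
Hello
World
EOF
```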
## Here strings
@ -214,7 +214,7 @@ to hide it), this is **the wrong way**:
Why? Relatively easy:
- initially, `stdout` points to your terminal (you read it)
- same applies to `stderr`, it\'s connected to your terminal
- same applies to `stderr`, it's connected to your terminal
- `2>&1` redirects `stderr` away from the terminal to the target for
`stdout`: **the terminal** (again...)
- `1>/dev/null` redirects `stdout` away from your terminal to the file
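A sketch of both orderings (using a path that presumably does not exist; `|| true` just tolerates the failing `ls`):

```shell
# Wrong: 2>&1 first duplicates stderr onto stdout's current
# target (the terminal); only stdout ends up in /dev/null.
ls /nonexistent 2>&1 >/dev/null || true

# Right: point stdout at /dev/null first, then duplicate
# stderr onto it -- both streams are silenced.
ls /nonexistent >/dev/null 2>&1 || true
```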

View File

@ -19,17 +19,17 @@
`?` question mark Status of the most recently executed foreground-pipeline (exit/return code)
`-` dash Current option flags set by the shell itself, on invocation, or using the [set builtin command](../commands/builtin/set.md). It\'s just a set of characters, like `himB` for `h`, `i`, `m` and `B`.
`-` dash Current option flags set by the shell itself, on invocation, or using the [set builtin command](../commands/builtin/set.md). It's just a set of characters, like `himB` for `h`, `i`, `m` and `B`.
`$` dollar-sign The process ID (PID) of the shell. In an [explicit subshell](../syntax/ccmd/grouping_subshell.md) it expands to the PID of the current "main shell", not the subshell. This is different from `$BASHPID`!
`!` exclamation mark The process ID (PID) of the most recently executed background pipeline (like started with `command &`)
`0` zero The name of the shell or the shell script (filename). Set by the shell itself.\
If Bash is started with a filename to execute (script), it\'s set to this filename. If started with the `-c <CMDLINE>` option (commandline given as argument), then `$0` will be the first argument after the given `<CMDLINE>`. Otherwise, it is set to the string given on invocation for `argv[0]`.\
If Bash is started with a filename to execute (script), it's set to this filename. If started with the `-c <CMDLINE>` option (commandline given as argument), then `$0` will be the first argument after the given `<CMDLINE>`. Otherwise, it is set to the string given on invocation for `argv[0]`.\
Contrary to popular belief, `$0` is *not a positional parameter*.
`_` underscore A kind of catch-all parameter. Directly after shell invocation, it\'s set to the filename used to invoke Bash, or the absolute or relative path to the script, just like `$0` would show it. Subsequently, expands to the last argument to the previous command. Placed into the environment when executing commands, and set to the full pathname of these commands. When checking mail, this parameter holds the name of the mail file currently being checked.
`_` underscore A kind of catch-all parameter. Directly after shell invocation, it's set to the filename used to invoke Bash, or the absolute or relative path to the script, just like `$0` would show it. Subsequently, expands to the last argument to the previous command. Placed into the environment when executing commands, and set to the full pathname of these commands. When checking mail, this parameter holds the name of the mail file currently being checked.
--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
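A quick tour of a few of these parameters (the printed PIDs naturally vary from run to run):

```shell
true
echo "exit status of last command: $?"

echo "PID of this shell: $$"

sleep 1 &
echo "PID of last background job: $!"
wait
```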
## Shell Variables
@ -170,7 +170,7 @@ is the command executing at the time of the trap.
Type: normal variable Read-only: no
Set by Bash: no Default: n/a
The value is used to set the shell\'s compatibility level. The value may
The value is used to set the shell's compatibility level. The value may
be a decimal number (e.g., `4.2`) or an integer (e.g., `42`)
corresponding to the desired compatibility level. If `BASH_COMPAT` is
unset or set to the empty string, the compatibility level is set to the
@ -269,7 +269,7 @@ follows:
Expands to a string describing the version of this instance of Bash.
Since Bash 2.0 it includes the shell\'s \"release status\" (alpha\[N\],
Since Bash 2.0 it includes the shell's "release status" (alpha[N],
beta[N], release).
### CHILD_MAX
@ -618,7 +618,7 @@ contain only a single command).
Type: integer variable Read-only: yes
Set by Bash: yes Default: n/a
The process ID of the shell\'s parent process.
The process ID of the shell's parent process.
### PWD
@ -883,7 +883,7 @@ the history list:
--------------- ------------------------------------------------------------------------------------------------------------
`ignorespace` lines which begin with a space character are not saved in the history list
`ignoredups` don\'t save lines matching the previous history entry
`ignoredups` don't save lines matching the previous history entry
`ignoreboth` short for `ignorespace:ignoredups`
`erasedups` remove all previous lines matching the current line from the history list before the current line is saved
--------------- ------------------------------------------------------------------------------------------------------------
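For example, a common interactive setting (typically placed in `~/.bashrc`):

```shell
# Don't record duplicate commands or commands that start with
# a space (ignoreboth is short for ignorespace:ignoredups).
HISTCONTROL=ignoreboth
```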
@ -1273,7 +1273,7 @@ Prompt](https://www.gnu.org/software/bash/manual/bash.html#Controlling-the-Promp
Variable: `PS2` Since: unknown
-------------- ----------------- ------------ -----------------
Type: normal variable Read-only: no
Set by Bash: if unset Default: \"\'\'\> \'\'\"
Set by Bash: if unset Default: "> "
The value of this parameter is expanded as with PS1 and used as the
secondary prompt string.
@ -1309,7 +1309,7 @@ indicate multiple levels of indirection.
The full pathname to the shell is kept in this environment variable. If
it is not set when the shell starts, Bash assigns the full pathname of
the current user\'s login shell.
the current user's login shell.
### SRANDOM
@ -1386,7 +1386,7 @@ arrive.
Set by Bash: no Default: n/a
If set, Bash uses its value as the name of a directory in which Bash
creates temporary files for the shell\'s use.
creates temporary files for the shell's use.
### auto_resume
@ -1408,7 +1408,7 @@ The substring value provides functionality analogous to the %? job
identifier.
If set to any other value, the supplied string must be a prefix of a
stopped job\'s name; this provides functionality analogous to the
stopped job's name; this provides functionality analogous to the
`%string` job identifier.
### histchars

View File

@ -4,7 +4,7 @@
FIXME This article needs a review, it covers two topics (command line
splitting and word splitting) and mixes both a bit too much. But in
general, it\'s still usable to help understand this behaviour, it\'s
general, it's still usable to help understand this behaviour, it's
"wrong but not wrong".
One fundamental principle of Bash is to recognize words entered at the
@ -22,7 +22,7 @@ by a space. When you enter an echo command at the Bash prompt, Bash will
look for those special characters, and use them to separate the
parameters.
You don\'t know what I\'m talking about? I\'m talking about this:
You don't know what I'm talking about? I'm talking about this:
$ echo Hello little world
Hello little world
@ -31,7 +31,7 @@ In other words, something you do (and Bash does) everyday. The
characters where Bash splits the command line (SPACE, TAB i.e. blanks)
are recognized as delimiters. There is no null argument generated when
you have 2 or more blanks in the command line. **A sequence of more
blank characters is treated as a single blank.** Here\'s an example:
blank characters is treated as a single blank.** Here's an example:
$ echo Hello little world
Hello little world
@ -40,13 +40,13 @@ Bash splits the command line at the blanks into words, then it calls
echo with **each word as an argument**. In this example, echo is called
with three arguments: "`Hello`", "`little`" and "`world`"!
[Does that mean we can\'t echo more than one Space?]{.underline} Of
[Does that mean we can't echo more than one Space?]{.underline} Of
course not! Bash treats blanks as special characters, but there are two
ways to tell Bash not to treat them special: **Escaping** and
**quoting**.
Escaping a character means to **take away its special meaning**. Bash
will use an escaped character as text, even if it\'s a special one.
will use an escaped character as text, even if it's a special one.
Escaping is done by preceding the character with a backslash:
$ echo Hello\ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ little \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ world
@ -56,7 +56,7 @@ None of the escaped spaces will be used to perform word splitting. Thus,
echo is called with one argument: "`Hello little world`".
Bash has a mechanism to "escape" an entire string: **Quoting**. In the
context of command-splitting, which this section is about, it doesn\'t
context of command-splitting, which this section is about, it doesn't
matter which kind of quoting you use: weak quoting or strong quoting,
both cause Bash to not treat spaces as special characters:
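A sketch of both forms; the argument reaches `echo` with its inner spaces intact either way:

```shell
# Weak quoting -- expansions still happen inside the quotes:
echo "Hello    little     world"

# Strong quoting -- everything between the quotes is literal:
echo 'Hello    little     world'
```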
@ -93,7 +93,7 @@ For a more technical description, please read the [article about word
splitting](../syntax/expansion/wordsplit.md)!
The first kind of splitting is done to parse the command line into
separate tokens. This is what was described above, it\'s a pure
separate tokens. This is what was described above, it's a pure
**command line parsing**.
After the command line has been split into words, Bash will perform
@ -122,7 +122,7 @@ word splitting:
## Example
Let\'s follow an unquoted command through these steps, assuming that the
Let's follow an unquoted command through these steps, assuming that the
variable is set:
MYFILE="THE FILE.TXT"
@ -148,7 +148,7 @@ splitting](../syntax/expansion/wordsplit.md) on the results:
Word 1 Word 2 Word 3 Word 4 Word 5 Word 6 Word 7
`echo` `The` `file` `is` `named` `THE` `FILE.TXT`
Now let\'s imagine we quoted `$MYFILE`, the command line now looks like:
Now let's imagine we quoted `$MYFILE`, the command line now looks like:
echo The file is named "$MYFILE"
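Counting the resulting words with `set --` makes the difference visible (a sketch of the same example):

```shell
MYFILE="THE FILE.TXT"

# Unquoted: the expansion is word-split, yielding 6 arguments
set -- The file is named $MYFILE
echo "$# arguments"

# Quoted: the expansion stays one word, yielding 5 arguments
set -- The file is named "$MYFILE"
echo "$# arguments"
```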