GCD

GCD (∨) is a dyadic scalar function which returns the Greatest Common Divisor of two integer arguments. It is an extension of Or that preserves Or's results on Boolean arguments and its identity element 0, in the same way that LCM (∧) extends And.
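The agreement with Or on Boolean arguments can be checked directly. The following is a Python sketch (Python's `math.gcd`/`math.lcm` stand in for the APL primitives; `math.lcm` requires Python 3.9+):

```python
import math

# On Boolean arguments {0, 1}, GCD agrees with logical Or,
# and LCM agrees with logical And.
for x in (0, 1):
    for y in (0, 1):
        assert math.gcd(x, y) == (x or y)
        assert math.lcm(x, y) == (x and y)
print("GCD matches Or and LCM matches And on Booleans")
```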

Examples
For positive integer arguments, the GCD is the largest positive integer which divides both arguments. If one of the arguments is zero, GCD returns the other argument.
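For illustration, here are these cases in Python, using `math.gcd` as a stand-in for the APL primitive:

```python
import math

print(math.gcd(12, 18))  # largest integer dividing both 12 and 18: 6
print(math.gcd(0, 7))    # with a zero argument, the other argument: 7
print(math.gcd(0, 0))    # both arguments zero: 0
```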

While the mathematical definition of GCD does not cover non-integers, some implementations accept them as arguments. In this case, the return value of X∨Y is chosen so that both X÷X∨Y and Y÷X∨Y are integers (or Gaussian integers, when X and/or Y are complex numbers).
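One way to realize this for rational arguments is the ordinary Euclidean algorithm applied to fractions, which terminates because it is equivalent to an integer GCD on a common denominator. A Python sketch (`gcd_rational` is a hypothetical helper, not a standard-library function):

```python
from fractions import Fraction

def gcd_rational(x, y):
    """Hypothetical rational GCD: the Euclidean algorithm on Fractions.
    The result g is a positive rational such that x/g and y/g are integers."""
    x, y = abs(x), abs(y)
    while y:
        x, y = y, x % y  # Fraction supports the % (residue) operator
    return x

g = gcd_rational(Fraction(3, 4), Fraction(5, 6))
print(g)                                       # 1/12
print(Fraction(3, 4) / g, Fraction(5, 6) / g)  # 9 10 -- both integers
```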

Description
If both arguments are integers, the GCD is the greatest positive integer which divides both numbers evenly, or 0 if both arguments are 0. That is, each argument is an integer multiple of the GCD of both arguments. Under this definition, any number is considered a divisor of 0, because multiplying it by 0 results in 0. Using Residue, we might also write the divisibility criterion as 0=(X∨Y)|X (and likewise for Y).
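The residue form of the criterion can be checked in Python, where the `%` operator plays the role of Residue:

```python
import math

x, y = 24, 36
g = math.gcd(x, y)
print(g)            # 12
# Divisibility criterion via residue: each argument leaves
# remainder 0 when divided by the GCD of both.
print(x % g, y % g)  # 0 0
```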

Because 1 divides every integer, any pair of arguments has at least one common divisor, so the GCD is well-defined. The identity element for GCD is 0 on the domain of non-negative real numbers: the other argument is always a divisor of 0, and so it is returned as the result (for an arbitrary real number, the result is its absolute value). Because 1 is the only positive divisor of itself, the GCD of 1 and any other number is 1.
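Both properties can be demonstrated in Python, again with `math.gcd` standing in for the APL primitive:

```python
import math

# 0 is the identity element for GCD on non-negative arguments:
for n in (0, 1, 5, 42):
    assert math.gcd(0, n) == n
# For a negative argument, the result is its absolute value:
print(math.gcd(0, -7))  # 7
# 1 paired with any number yields 1:
print(math.gcd(1, 99))  # 1
```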

History
The use of GCD as an extension of Or, and its extension to complex rational numbers, was proposed by Eugene McDonnell at APL75. This definition has become common among many APLs, with SHARP APL, Dyalog APL (as of version 11.0), J, NARS2000, GNU APL, ngn/apl, and dzaima/APL adopting it. However, some APLs, such as APL2 and APLX, keep Or as a purely Boolean function and do not extend it.

Documentation

 * Dyalog
 * J Dictionary, NuVoc