A prime number is a positive integer with exactly two distinct factors: 1 and itself. A factor of a number is an integer that divides evenly into it, that is, divides with a remainder of zero. By this definition, neither 1 nor 0 is prime: 1 has only one factor (itself), and 0 divided by any nonzero integer leaves a remainder of zero, so 0 effectively has infinitely many factors.
Primality is frequently used for instructional purposes in computer science. A naive primality-testing algorithm is usually complex enough to serve as an exercise for a new programmer (barring primality tests built into a language), yet simple enough to be reused as a building block in later problems.
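As an illustration of such a naive test, here is a minimal sketch in Python using trial division; the function name `is_prime` is an assumption, not taken from any task on this page. It checks candidate divisors only up to the square root of n, since any larger factor would pair with a smaller one already tried:

```python
def is_prime(n: int) -> bool:
    """Naive trial-division primality test (illustrative sketch)."""
    if n < 2:
        return False          # 0, 1, and negatives are not prime
    if n % 2 == 0:
        return n == 2         # 2 is the only even prime
    d = 3
    while d * d <= n:         # only need divisors up to sqrt(n)
        if n % d == 0:
            return False      # found a nontrivial factor
        d += 2                # skip even candidates
    return True
```

For example, `is_prime(97)` returns `True`, while `is_prime(91)` returns `False` (91 = 7 × 13).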
Below is a list of tasks that involve:
- the calculation (or generation) of primes or types of primes
- the use of primes in finding other types of numbers
- the factorization of integers
- the use of various algorithms in finding or detecting primes or types of primes
- the coding of various types of primality tests
- the use of primes in generating various (number) sequences
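For the generation side of these tasks, a common classical approach is the sieve of Eratosthenes. The sketch below (the name `primes_up_to` is an assumption for illustration) marks off multiples of each prime to produce all primes up to a limit:

```python
def primes_up_to(limit: int) -> list[int]:
    """Sieve of Eratosthenes: all primes <= limit (illustrative sketch)."""
    if limit < 2:
        return []
    sieve = [True] * (limit + 1)
    sieve[0] = sieve[1] = False          # 0 and 1 are not prime
    for p in range(2, int(limit ** 0.5) + 1):
        if sieve[p]:                     # p is prime; mark its multiples
            for multiple in range(p * p, limit + 1, p):
                sieve[multiple] = False
    return [i for i, flag in enumerate(sieve) if flag]
```

For example, `primes_up_to(30)` yields `[2, 3, 5, 7, 11, 13, 17, 19, 23, 29]`.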