Problem statement

The following iterative sequence is defined for the set of positive integers:

n → n/2 (n is even)
n → 3n + 1 (n is odd)

Using the rule above and starting with 13, we generate the following sequence:

13 → 40 → 20 → 10 → 5 → 16 → 8 → 4 → 2 → 1

It can be seen that this sequence (starting at 13 and finishing at 1) contains 10 terms. Although it has not been proved yet (Collatz Problem), it is thought that all starting numbers finish at 1.

Which starting number, under one million, produces the longest chain?

NOTE: Once the chain starts, the terms are allowed to go above one million.
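
As a quick sanity check on the rule, here is a minimal sketch (the helper name chain_length is mine, not part of the solution below) that counts the terms of a single chain; fed the seed 13 it should print 10, matching the example above.

#include <stdio.h>

/* Count the terms of the chain starting at n, including n itself and the final 1. */
static unsigned int chain_length(unsigned long long n)
{
	unsigned int count = 1;
	while (n != 1) {
		n = ((n % 2) == 0) ? n / 2 : (3 * n) + 1;
		count++;
	}
	return count;
}

int main(void)
{
	printf("%u\n", chain_length(13));	/* expected output: 10 */
}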

Thoughts

As far as I am aware, there is no shortcut for finding the length of a chain from a given seed, aside from just computing it. I think the solution to this problem is to compute the chain for every number up to one million. However, because the chains overlap so heavily, we can stop walking a chain as soon as it connects to the tree of chains we have already computed. For example, 26 halves straight to 13, so once the ten-term chain for 13 is known, the chain for 26 can be recorded as 11 terms without following it any further.

Solution

Initialise an array to store chain lengths. For each number, apply the rule until reaching a number whose chain length has already been computed (then add that length on top and stop), or until reaching 1.

#include <stdio.h>

#define LIMIT 1000000

int main(void)
{
	/* lens[n] holds the number of terms in the chain that starts at n.
	 * Static storage is used because roughly 4 MB could overflow the stack. */
	static unsigned int lens[LIMIT + 1] = { 0 };
	unsigned int i;
	unsigned int longest = 1;

	for (i = 1; i < LIMIT + 1; i++) {
		/* 64-bit term: intermediate values can climb above one million
		 * and may not fit in 32 bits. */
		unsigned long long n = i;
		lens[i] = 1;	/* the seed itself counts as the first term */
		while (n != 1) {
			if ((n % 2) == 0) {
				n = n / 2;
			} else {
				n = (3 * n) + 1;
			}

			/* Reuse the length of any term whose chain is already known. */
			if (n < LIMIT + 1 && lens[n] != 0) {
				lens[i] += lens[n];
				break;
			} else {
				lens[i]++;
			}
		}
		if (lens[i] > lens[longest]) {
			longest = i;
		}
	}

	printf("Seed %u produces chain of length %u\n", longest, lens[longest]);
	return 0;
}
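
Compiling with something like cc -O2 euler14.c -o euler14 (the file name is arbitrary) and running the binary prints the winning seed and its chain length; with the memoisation in place the search should complete in well under a second on typical hardware. Note that only terms below the limit are cached, so any excursion above one million is recomputed each time it occurs, but in practice chains fall back under the limit quickly enough that this costs little.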