## Spider algorithm with a path

In an appendix to the paper from which I implemented the slow mating algorithm in my previous post, there is a brief description of another algorithm:

> **The Thurston Algorithm for quadratic matings**
>
> **Initialization A.2 (Spider algorithm with a path).** Suppose \(\theta = \theta_1 \in \mathbb{Q} \backslash \mathbb{Z}\) has preperiod \(k\) and period \(p\). Define \((x_1(t), \ldots, x_{k+p}(t))\) for \(0 \le t \le 1\) as
>
> \[ x_1(t) = t e^{i 2 \pi \theta_1} \\ x_p(t) = (1 - t) e^{i 2 \pi \theta_p}, \text{ if } k = 0 \\ x_j(t) = e^{i 2 \pi \theta_j}, \text{ otherwise.} \]
>
> Pull this path back continuously with \(x_i(t + 1) = \pm \sqrt{x_{i+1}(t) - x_1(t)}\). Then it converges to the marked points of \(f_c\) with appropriate collisions.

In short, given a rational \(\theta\) measured in turns, this provides a way to calculate the parameter \(c\) in the Mandelbrot set that has the corresponding dynamics. Here \(\theta_j = 2^{j - 1} \theta \mod 1\), and the desired \(c = x_1(\infty)\).
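The initialization and pullback above can be sketched end-to-end as a discretised iteration. The following is a minimal sketch, not the mandelbrot-numerics implementation: the function name is mine, the path is sampled at \(s\) grid points per unit of \(t\), each \(\pm\) sign is chosen for continuity with the previous value, and double precision stands in for the arbitrary precision a real implementation needs.

```python
import cmath

def spider_path_c(theta, k, p, segments=30, s=16):
    """Sketch: spider algorithm with a path, in double precision.
    theta in turns, preperiod k, period p; returns x_1 at the
    largest computed t, an approximation to c = x_1(infinity)."""
    n = k + p
    thetas = [(2.0**j * theta) % 1.0 for j in range(n)]   # theta_{j+1}
    circle = [cmath.exp(2j * cmath.pi * a) for a in thetas]
    # initial path x(t) for t in [0, 1], sampled at s + 1 grid points
    path = []
    for g in range(s + 1):
        t = g / s
        x = list(circle)                  # x_j(t) = e^{i 2 pi theta_j}
        x[0] = t * x[0]                   # x_1(t) = t e^{i 2 pi theta_1}
        if k == 0:
            x[p - 1] = (1.0 - t) * x[p - 1]  # x_p(t) = (1-t) e^{i 2 pi theta_p}
        path.append(x)
    # forward orbit: x_i maps to x_{i+1}, and the last point back to x_{k+1}
    succ = [i + 1 if i + 1 < n else k for i in range(n)]
    for _ in range(segments):
        prev = path[-1]   # x at the new segment's start time, for continuity
        new_path = []
        for x in path:    # x_i(t+1) = +/- sqrt(x_{i+1}(t) - x_1(t))
            y = []
            for i in range(n):
                r = cmath.sqrt(x[succ[i]] - x[0])
                y.append(r if abs(r - prev[i]) <= abs(r + prev[i]) else -r)
            prev = y
            new_path.append(y)
        path = new_path
    return path[-1][0]
```

For example, the preperiodic angle \(\theta = 1/2\) (preperiod \(1\), period \(1\)) should approach \(c = -2\), and the periodic angle \(\theta = 1/3\) (period \(2\)) the basilica \(c = -1\).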

This week I implemented it in my mandelbrot-numerics library, in the hope that it might be faster than my previous method of tracing external rays. Alas, it wasn't to be: both algorithms are \(O(n^2)\) (where \(n = k + p\)) when ignoring the way cost varies with numerical precision, and the spider path algorithm has higher constant factors and requires \(O(n)\) space versus ray tracing's \(O(1)\). In one single-threaded test at period 469, spider path was about 6x slower than ray tracing, and I imagine it would be slower still at higher periods and precisions.

This isn't entirely surprising: spider path does \(s n\) complex square roots to extend the paths from \(t\) to \(t + 1\), while ray tracing does \(s t\) arithmetical operations to extend the ray from depth \(t\) to \(t + 1\) (with \(s\) steps per unit in each case). The \(O(n^2)\) comes from \(t\) empirically needing to be about \(2 n\) before it is close enough to switch to the faster Newton's method root finding.
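Plugging in the empirical stopping depth \(t \approx 2 n\) makes both quadratic totals explicit (treating \(s\) as a constant):

\[ \text{ray tracing: } \sum_{t=1}^{2n} s\,t \approx 2 s n^2, \qquad \text{spider path: } \sum_{t=1}^{2n} s\,n = 2 s n^2. \]

The operation counts match at leading order; the constant-factor gap comes from each complex square root costing much more than a basic arithmetical operation.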

Moreover, spider path needs very high precision all the way through: the initial points on the unit circle need at least \(n\) bits (I used about \(2 n\) to be sure) to resolve the small differences between external angles, even though the final root can usually be distinguished from other roots of the same period using much less precision. In fact I measured spider path's running time to be around \(O(n^{2.9})\), presumably because of the precision; ray tracing was very close to \(O(n^2)\).
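To see where the \(n\)-bit requirement comes from: doubled angles of period \(p\) are fractions with denominator \(2^p - 1\), so distinct \(\theta_j\) can lie as little as about \(2^{-p}\) apart on the circle. A small exact-arithmetic illustration (the example angle is mine, chosen for simplicity):

```python
from fractions import Fraction

p = 5
theta = Fraction(1, 2**p - 1)                     # period-5 angle 1/31
angles = [(2**j * theta) % 1 for j in range(p)]   # 1/31, 2/31, 4/31, 8/31, 16/31
gaps = [abs(a - b) for a in angles for b in angles if a != b]
print(min(gaps))  # 1/31, roughly 2^-5
```

Distinguishing points \(e^{i 2 \pi \theta_j}\) separated by \(\approx 2^{-p}\) needs on the order of \(p\) bits, and extra headroom on top of that for the square roots not to lose the distinction.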

Ray tracing has a natural stopping condition: when the ray enters the atom domain with period \(p\), Newton's method is very likely to converge to the nucleus at its center. I imagine something similar applies to preperiodic Misiurewicz domains, but I have not checked yet. I tried the same stopping condition with spider path, but in one instance I got a false positive and ended up at a different minibrot from the one I wanted.
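The atom-domain test can be sketched as follows, assuming the definition that \(c\) lies in the period-\(p\) atom domain when \(|z_p|\) is the smallest of \(|z_1|, \ldots, |z_p|\) along the critical orbit; the helper name is mine for illustration, not the library's API:

```python
def in_atom_domain(c, p):
    """True when p gives the closest approach of the critical
    orbit z -> z^2 + c to 0 among the first p iterates."""
    z = 0j
    best_q, best_abs = 0, float("inf")
    for q in range(1, p + 1):
        z = z * z + c
        if abs(z) < best_abs:
            best_q, best_abs = q, abs(z)
    return best_q == p
```

For instance `in_atom_domain(-1 + 0.05j, 2)` holds, since \(z_2\) is tiny near the period-2 nucleus \(c = -1\), while a point far from any period-2 atom fails the test.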

The only possible advantages that remain for the spider path algorithm are that it can be parallelized more effectively than ray tracing, and that the numbers all lie in the range \([-2,2]\), which means fixed-point arithmetic could be used. Perhaps a GPU implementation of spider path would be competitive with ray tracing on an elapsed wall-clock time metric, though it would probably still lose on power consumption.

I plotted a couple of graphs of the spider paths: the path points end up log-spiraling around their final resting places, which I think means the convergence is linear. Ray tracing is also linear when far from the landing point (before the period-doubling cascade starts in earnest). Newton's method converges quadratically, doubling the number of accurate digits at each step, but it needs to start from somewhere accurate enough.
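Both methods ultimately hand over to the same quadratically convergent stage: Newton's method for the period-\(p\) nucleus, solving \(z_p(c) = 0\) with the derivative \(dz_p/dc\) accumulated along the orbit. A minimal sketch (the function name is mine; mandelbrot-numerics has its own interface):

```python
def nucleus(c, p, steps=20):
    """Newton's method for the period-p nucleus: iterate
    c <- c - z_p(c) / (dz_p/dc), starting from a good guess."""
    for _ in range(steps):
        z, dz = 0j, 0j
        for _ in range(p):
            dz = 2 * z * dz + 1   # chain rule: d(z^2 + c)/dc
            z = z * z + c
        if dz == 0:
            break
        c = c - z / dz
    return c
```

Starting from `-1.1 + 0.1j` with `p = 2`, this converges rapidly to the basilica nucleus \(c = -1\); starting too far away, Newton's method may instead find a different root of \(z_p\), which is exactly why the stopping condition for the linear stage matters.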