Great Circle Intersections

I was driving through Northern California a few years back, wondering what a particular mountain peak off in the distance was. It occurred to me that if I could take a bearing on it from two different points, I’d be able to triangulate its position on a map. So I decided to write an iPhone app called WhatThePeak to do this.

Download WhatThePeak in the iTunes App Store.

I’m going to discuss here the math that makes the app work.

Great circle – Wikipedia, the free encyclopedia

The shortest path between two points on the surface of the earth follows a Great Circle: a circle whose plane passes through the center of the earth. This is why airlines fly routes that, viewed on a flat map, seem unintuitive, such as Chicago to Moscow over the North Pole. If you trace the path on a globe, you’ll see that it is indeed the shortest way to get there, other than drilling a tunnel through the earth.

Any two distinct great circles intersect at exactly two points on the earth, which are antipodal to each other.

You can define a great circle with a latitude, longitude, and initial heading from that point.

The easy way to find the intersections of two great circles is to use two points on each of the circles to define the planes that the circles pass through. Next, calculate the normal vectors for those planes, using the cross product of the vectors defined by those points. Finally, take the cross product of those two normal vectors. This cross product describes one intersection between the two great circles. Its negative vector describes the other intersection.
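A minimal sketch of that recipe, using the equator and the prime meridian as the two circles (two easy points on each):

```python
import numpy as np

# Two easy points on each circle (unit vectors):
# the equator passes through 0N/0E and 0N/90E,
# the prime meridian through 0N/0E and the north pole
eq_a, eq_b = np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0])
pm_a, pm_b = np.array([1.0, 0.0, 0.0]), np.array([0.0, 0.0, 1.0])

n1 = np.cross(eq_a, eq_b)   # normal of the equatorial plane
n2 = np.cross(pm_a, pm_b)   # normal of the prime meridian's plane

inter = np.cross(n1, n2)
inter = inter / np.linalg.norm(inter)   # one intersection: (1, 0, 0), i.e. 0N, 0E
other = -inter                          # the antipodal intersection
```

The equator and the prime meridian do indeed cross at 0°N, 0°E (and at its antipode).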

As an example, I’ll use two points located in my home town, in tuples of (longitude, latitude, heading):

centennial = (-112.554006, 45.995948, 155.43)

continental = (-112.474351, 45.961395, 278.30)

The first point is looking south-southeast from Centennial Blvd, and the second roughly west from Continental Drive.

wpid-conti_cent_map-2014-07-26-15-43.png

Quaternions and spatial rotation – Wikipedia, the free encyclopedia

To turn those points into great circles, I’ll use a technique called quaternion rotation. A quaternion is a 4-element value that encodes an axis and an angle of rotation about that axis, and a unit quaternion can describe any rotation in 3-space.

For each tuple of coordinates and heading, I’ll first use the coordinates to generate a Cartesian (x, y, z) vector in a coordinate system defined on the earth: the x-axis positive toward the prime meridian at the equator, the y-axis positive toward 90 degrees east at the equator, and the z-axis pointing at the north pole. This vector is my axis of rotation. To finish defining the rotation quaternion, I’ll use the negative of the heading, since rotations in compass coordinates are positive in the clockwise direction, while rotations in Cartesian space are positive in the counter-clockwise direction.

Next, I’ll use this quaternion to rotate a vector pointing at the north pole (0, 0, 1). The resulting vector is oriented along the great circle defined by the point and heading.
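The rotation itself can be sketched without any quaternion library; below is a hand-rolled axis-angle quaternion and the standard q·v·q⁻¹ rotation, shown rotating the north-pole vector for a hypothetical point at 0°N, 0°E with a heading of 90° (due east):

```python
import numpy as np

def quat_about_axis(theta, axis):
    # Unit quaternion (w, x, y, z) for a rotation of theta radians about axis
    axis = np.asarray(axis, dtype=float)
    axis = axis / np.linalg.norm(axis)
    return np.concatenate(([np.cos(theta / 2)], np.sin(theta / 2) * axis))

def quat_mul(a, b):
    # Hamilton product of two quaternions
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def quat_rotate(v, q):
    # Rotate vector v by unit quaternion q: q * (0, v) * q_conjugate
    qv = np.concatenate(([0.0], v))
    q_conj = q * np.array([1.0, -1.0, -1.0, -1.0])
    return quat_mul(quat_mul(q, qv), q_conj)[1:]

north = np.array([0.0, 0.0, 1.0])
axis = np.array([1.0, 0.0, 0.0])   # the point 0N, 0E as a unit vector
# heading 90 degrees (east) -> rotate by -90 (compass is CW, Cartesian is CCW)
rotated = quat_rotate(north, quat_about_axis(np.radians(-90), axis))
# rotated is now (0, 1, 0): a point due east of 0N, 0E along the equator
```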

The cross product of these two vectors is the vector normal to the plane in which the great circle sits.

Generating a normal vector from each pair of coordinates and heading, I take the cross product of the two normals to find a vector that points at one of the intersections of the two great circles. The other intersection is found by negating it.

The distance between two vectors can be determined by taking the magnitude of the vector produced by subtracting one vector from the other. I compare the distance between each of the intersections and the first coordinate, and choose whichever intersection is closer.

In the diagram below, the equator is yellow, and the prime meridian is green. The red circle is the bearing west from Continental Drive, and the blue circle is the bearing south from Centennial. The light blue and red vectors are the calculated second vectors along the great circles, and the dark vectors are the surface normals of the great circles’ planes. The magenta vector is the chosen intersection.

The top view is looking down from the north pole, and the front view is looking at the earth from 90 degrees west along the equator.

wpid-earth_conti_cent-2014-07-26-15-43.png

Below are the Python functions that generate the great circles and calculate their intersection.

I use the cross product function from NumPy, along with functions from a quaternion library (transformations.py) to do the quaternion rotations.

gs_orient() and gs_intersect() are the interesting functions here.



import numpy as np
import math as m
import transformations as tran

rad2deg = lambda x: (x / m.pi) * 180
deg2rad = lambda d: (d * m.pi) / 180

def quat(axis, theta):
    return tran.quaternion_about_axis(theta, axis)

def qtransform(tpoint, q):
    tmat = tran.quaternion_matrix(q)
    point = np.array([tpoint[0], tpoint[1], tpoint[2], 0])
    return np.dot(tmat, point)[0:3]

mag = lambda v: np.sqrt(np.vdot(v, v))
norm = lambda v: v / mag(v)

north = np.array([0, 0, 1])  # north pole
#ninety = np.array([1, 0, 0])  # looking at 90 degrees west on the equator
origin = np.array([0, 0, 0])

def lonlat_xyz(longitude, latitude, r=1):
    lat = deg2rad(latitude)
    lon = deg2rad(longitude)
    x = r * m.cos(lat) * m.cos(lon)
    y = r * m.cos(lat) * m.sin(lon)
    z = r * m.sin(lat)
    return np.array([x, y, z])

def xyz_lonlat(x, y, z, r=1):
    lat = rad2deg(m.asin(z / r))
    lon = rad2deg(m.atan2(y, x))
    return np.array([lon, lat])

def decdeg2dms(dd):
    minutes, seconds = divmod(abs(dd) * 3600, 60)
    degrees, minutes = divmod(minutes, 60)
    degrees = degrees if (dd >= 0) else -degrees
    return "%do%d'%f''" % (degrees, minutes, seconds)

def gs_orient(lon, lat, azimuth):
    # find a vector on the great circle defined by a longitude, latitude, and heading

    # define an axis of rotation based on the vector representation of the (lon, lat)
    p1 = lonlat_xyz(lon, lat)
    # this axis of rotation, along with the negated azimuth
    # (compass rotation is clockwise, cartesian rotation is CCW),
    # defines a quaternion that can be used for rotation
    trans_quat = quat(p1, deg2rad(-azimuth))
    # use this quaternion to rotate a vector at the north pole to a
    # new position on the great circle defined by the (lon, lat) and azimuth
    p2 = qtransform(north, trans_quat)
    return p2

def gs_intersect(lla1, lla2):
    # find the intersection of two great circles, each defined by a
    # longitude, latitude, and heading

    # turn the first point into a vector from a (lon, lat) pair
    p1 = lonlat_xyz(lla1[0], lla1[1])
    # find another vector on the great circle from the first point along the first heading
    gv1 = gs_orient(lla1[0], lla1[1], lla1[2])
    # find the normal to the plane described by these two vectors
    norm1 = np.cross(gv1, p1)

    # do the same for the second point
    p2 = lonlat_xyz(lla2[0], lla2[1])
    gv2 = gs_orient(lla2[0], lla2[1], lla2[2])
    norm2 = np.cross(gv2, p2)

    # an intersection of the two great circles is the cross product
    # of the two planes' normals
    inter = norm(np.cross(norm1, norm2))
    # the other intersection is the first vector negated
    inter2 = -1 * inter

    # find the straight-line distance between the first point and both intersections
    dist1 = mag(inter - p1)
    dist2 = mag(inter2 - p1)

    # return the (lon, lat) pair of the closer intersection
    if dist1 < dist2:
        int_ll = xyz_lonlat(*inter)
    else:
        int_ll = xyz_lonlat(*inter2)

    return int_ll
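Putting it all together on the two example points: the following is a self-contained sketch that swaps the quaternion library for an equivalent direct Rodrigues rotation (a single axis-angle rotation, which is all the quaternion is doing here):

```python
import numpy as np

def lonlat_xyz(lon, lat):
    # (longitude, latitude) in degrees -> unit vector on the sphere
    lon, lat = np.radians(lon), np.radians(lat)
    return np.array([np.cos(lat) * np.cos(lon),
                     np.cos(lat) * np.sin(lon),
                     np.sin(lat)])

def rotate(v, axis, theta):
    # Rodrigues' rotation formula: rotate v by theta radians about a unit axis
    axis = axis / np.linalg.norm(axis)
    return (v * np.cos(theta)
            + np.cross(axis, v) * np.sin(theta)
            + axis * np.dot(axis, v) * (1 - np.cos(theta)))

def gc_normal(lon, lat, azimuth):
    # Normal of the plane of the great circle through (lon, lat) at the given heading
    p = lonlat_xyz(lon, lat)
    gv = rotate(np.array([0.0, 0.0, 1.0]), p, np.radians(-azimuth))
    return np.cross(gv, p)

def gc_intersect(lla1, lla2):
    # Intersection of two great circles, keeping the one nearer the first point
    n1, n2 = gc_normal(*lla1), gc_normal(*lla2)
    inter = np.cross(n1, n2)
    inter = inter / np.linalg.norm(inter)
    p1 = lonlat_xyz(lla1[0], lla1[1])
    if np.linalg.norm(-inter - p1) < np.linalg.norm(inter - p1):
        inter = -inter
    return (np.degrees(np.arctan2(inter[1], inter[0])),
            np.degrees(np.arcsin(inter[2])))

centennial = (-112.554006, 45.995948, 155.43)
continental = (-112.474351, 45.961395, 278.30)
lon, lat = gc_intersect(centennial, continental)
# the fix lands a few kilometers west of the two observation points
```

The closer-intersection check mirrors the straight-line distance comparison described above.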

Some pretty graphs of filters

Given this RLC filter circuit:

wpid-pastedgraphic-3-2014-05-11-11-23.png

wpid-pastedgraphic-4-2014-05-11-11-23.png

Producing this transfer function:

wpid-pastedgraphic-5-2014-05-11-11-23.png
Stimulated at 3 different frequencies:

wpid-pastedgraphic-6-2014-05-11-11-23.png

I have this Bode plot of the magnitude response across frequency:

wpid-pastedgraphic-7-2014-05-11-11-23.png
And for each of the test frequencies, I have a plot of the log-scaled transfer function, the input stimulation, the output based on the convolution of the input with the transfer function (blue), and the steady-state output (magenta), derived by replacing the ‘s’ in the transfer function with the angular frequency times j (s = jω). Note how, after the initial transient response, the convolution output catches up with the steady-state output. How quickly it does so depends on the filter.

wpid-pastedgraphic-8-2014-05-11-11-23.png
wpid-pastedgraphic-9-2014-05-11-11-23.png
wpid-pastedgraphic-10-2014-05-11-11-23.png
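The s = jω substitution is easy to play with numerically. Here is a sketch with an assumed series-RLC band-pass transfer function H(s) = RCs / (LCs² + RCs + 1); the component values are placeholders, not the ones from the schematic above:

```python
import numpy as np

# Placeholder component values (not the ones from the schematic)
R, L, C = 1e3, 10e-3, 100e-9   # 1 kOhm, 10 mH, 100 nF

def H(s):
    # Series-RLC band-pass transfer function (voltage taken across R)
    return (R * C * s) / (L * C * s**2 + R * C * s + 1)

def steady_state(f):
    # Evaluate H at s = j*omega to get the steady-state gain and phase
    s = 1j * 2 * np.pi * f
    h = H(s)
    return abs(h), np.degrees(np.angle(h))

f0 = 1 / (2 * np.pi * np.sqrt(L * C))   # resonant frequency, ~5 kHz here
gain, phase = steady_state(f0)
# At resonance the band-pass gain is 1 and the phase shift is 0
```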

Fiber Optic Tree Arduino/LED Upgrade

A number of years ago, my girlfriend at the time bought me a fiber optic Christmas tree to cheer up my office.

Something similar to this one:

Amazon.com – 32 Inch Fiber Optic Christmas Tree with Stand

I loved the thing, but it had its problems: it wasn’t built to last, with a cheap, noisy, hot motor and a 10W halogen bulb that was prone to overheating. The bulb wasn’t exactly cheap to replace, either.

After last Christmas, and the third or fourth bulb replacement, I decided I’d had enough.

I left the original wall wart and switch in the system and scooped everything else out, replaced the noisy motor with a stepper and the halogen with a 1W LED, and control the whole thing with an Arduino.

I’m using an off-the-shelf buck converter and a bridge rectifier to get 5V DC efficiently from the 12VAC wall wart.
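Back-of-the-envelope numbers for that front end, assuming a typical ~0.7V drop per diode (two diodes conduct at a time in a bridge):

```python
import math

vac_rms = 12.0      # wall wart's nominal AC output
diode_drop = 0.7    # assumed typical drop; two diodes conduct at a time

v_peak = vac_rms * math.sqrt(2)    # ~17.0 V peak
v_dc = v_peak - 2 * diode_drop     # ~15.6 V (unloaded) into the buck converter
```

Plenty of headroom for a buck converter regulating down to 5V.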

LED running from a MOSFET:

MOSFET + LED

Stepper and color wheel in place:

Added Stepper Motor

Here’s the whole thing running:

Arduino Fiber Optic Tree

I’m pondering adding a wifi breakout to it and making it reactive to network events.

The LED doesn’t have enough red in it, so I’ll probably replace it with a warmer one. Otherwise, this was a fun 2-hour hack.

PiBot’s Mast cam and Command Interface

I’ve started building the command interface and computer vision software for PiBot. The core architecture of PiBot is an Arduino Mega acting as sort of a brainstem, speaking a serial command language. It can listen on either a UART link connected to a Raspberry Pi, or an XBee module.

In this video, I’m controlling the sensor mast using a Playstation 3 controller, with a PyGame program that reads the controller and generates the command language.

OpenGL and Inertial Measurement

To teach myself modern OpenGL, I wrote a sort of 3D sandbox I call Holospectre. It’s a Cocoa application that bolts an interface onto an NSOpenGLView. The interesting part is that it embeds a Python interpreter and runs Python modules to generate displays. I’ve done a couple of cool things with it. Here it is bound to the output of my MSP430 spectrum analyzer:

I’m working on a quadrotor autopilot, which I’ve mentioned before. It’s the basic reason for PiBot’s existence. This “poor man’s Wiimote” was my first experiment in working with the output of an Inertial Measurement Unit, in this case the MPU-6050. Obviously, I’m not correcting for gyroscope drift at all, but this was basically a smoke check: MPU to MSP430, serial output to the computer, and then interpretation of the data on the desktop.

I needed a wiki.

Last fall, it became horribly apparent that I needed a new way of organizing my notes.  After evaluating a few options, I decided to roll my own wiki, with a few interesting features:

  • integration with my own circuit analysis code libraries
  • ability to display inline LaTeX
  • ability to manage a library of PDF textbooks
  • a MacOS integrated clipping application that I’ll describe separately later
  • highlighted code display
  • tags

The final app is a Django application backed by Postgres.  I’ve never really liked building web apps, but I find this one pretty useful.

LED strip spectrum analyzer

I bought a TM1803-based LED strip from Radio Shack this spring. It has 10 addressable segments of 24-bit RGB, but it’s a bit tricky to program.

It can be controlled by a 16MHz microcontroller, but barely: the timing requirements are pretty tight, and jitter can be a bitch. Even so, I managed to get a pretty cool spectrum analyzer light show going with it:

It’s an MSP430 driving the LED strip, with an LM386 amplifying the audio input.
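For a sense of how tight the timing is: the TM1803’s pulse widths are on the order of a microsecond (the 680 ns figure below is approximate, from memory of similar one-wire LED protocols, not from the datasheet), which doesn’t leave many instruction cycles at 16MHz:

```python
cpu_hz = 16_000_000
cycle_ns = 1e9 / cpu_hz          # 62.5 ns per clock at 16MHz

short_pulse_ns = 680             # approximate short mark/space width (assumed)
cycles_available = short_pulse_ns / cycle_ns   # ~10.9 cycles to hit the edge
# A couple of cycles of interrupt or loop jitter is a large fraction of that window
```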

The code was originally written to make a bar-graph display on a 16×2 LCD. I spent a lot of time getting the FFT code working there, and had it pretty much canned by the time I got the LED strip.

Developing the bar graph:

When starting this project, I first chose a target sampling rate of 8000Hz, giving me a maximum detectable frequency of 4000Hz (the Nyquist limit).
Additionally, I had a target output frame rate of 60Hz, roughly double the human eye’s update rate. 8000 / 60 is 133.3, and the largest power of two that fits is 128. I decided to try running the chip at 1MHz and sampling as often as I could, giving a 128-point FFT. This was optimistic, but it gave me a lot of room to adjust down.
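That sizing arithmetic, spelled out:

```python
sample_rate = 8000   # Hz; the Nyquist limit puts the top detectable frequency at 4000 Hz
frame_rate = 60      # Hz target display update

samples_per_frame = sample_rate / frame_rate   # 133.33...
fft_size = 1
while fft_size * 2 <= samples_per_frame:
    fft_size *= 2
# fft_size is now 128, the largest power of two that fits in one frame
bin_width = sample_rate / fft_size             # 62.5 Hz per FFT bin
```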
My next task was to select an integer-based fast Fourier transform that would run acceptably given the limitations of the machine: slow, without a floating-point unit or even a hardware multiplier, and with an amazing 1K of available RAM.
I found exactly what I was looking for in fix_fft.c, a bit of code that has been floating around the net and used by microcontroller enthusiasts since 1989.

I built a test harness using Python and its ability to load C dynamic libraries. This allowed me to run the same code in test as I would be deploying on the microcontroller.

When I got all of the code running on the chip and driving the LCD, I found that my update frequency was a staggering 0.25 Hz. This is where I started adjusting my expectations down. Realizing that I only had 16 output buckets to work with (the LCD is 16×2 characters), I switched over to a 32-point FFT. Coupling this with a clock speed of 16MHz made the display more than fast enough.

Finally, after I managed to get text output working on the LCD, I wanted to set up a set of dot-addressable characters to use as a bar graph. This necessitated a trip into the data sheet for the Sitronix ST7032 LCD display controller, where I found a new definition for poor documentation. I’ve never encountered abuse of the English language on this scale before.

Shortly after I got the display working, I managed to fry it. It took less than a (horribly frustrating) week to get a replacement, during which time I switched over to outputting data to the host computer using the UART.
My final problem ended up being caused by developing on a 64-bit machine and testing on a 16-bit machine: the MSP430 has 16-bit ints, and the FFT code I was using assumed at least 32-bit ints. This made for some frustrating testing, since I was feeding the same numbers to my test harness as were being run on the microcontroller and getting different results. I ended up writing some Python code that would use actual data copied out of the debugger, as shown below. I banged my head on this for a couple of hours before it occurred to me to check the assumptions in fix_fft.c: it turns out that this piece of code is so old that it was never updated to use the explicit type sizes from <stdint.h>. I changed all of the ints to int32_t, and finally got the same results on the micro and my computer.
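That 16-bit/32-bit mismatch is easy to reproduce on the desktop. Python ints are arbitrary precision, so an intermediate that overflows a 16-bit int looks fine in the test harness; forcing it through a 16-bit type (the values here are illustrative, not taken from fix_fft.c) shows the divergence:

```python
def int16(x):
    # Reduce to 16-bit two's complement, like the MSP430's native int
    x &= 0xFFFF
    return x - 0x10000 if x >= 0x8000 else x

# An intermediate product in a fixed-point multiply (illustrative values)
a, b = 20000, 20000
full = a * b           # 400000000: fine in Python or with 32-bit ints
wrapped = int16(full)  # what a 16-bit intermediate would hold: -31744
```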

Here it is running.

But the LED strip had its own amusing issues …

I naively assumed, when I started playing with the LED strip, that it would be easy to program. After all, it came off the shelf from Radio Shack with a pretty nifty canned Arduino program.

Whoever wrote the program must have pulled a lot of hair in trial and error. Modifying its parameters didn’t give very good results, which, after a lot of oscilloscope and logic analyzer work, I finally tracked down to jitter. I was able to get some goodish results, but the final product has always been a bit of a disappointment to me.

This has led me to design a better solution in Verilog, implemented on an FPGA or CPLD. I’ll write about that in another post.

My robot tentacle made Hackaday!

Neat, Hackaday came across one of my projects.

Robotic tentacles for a disturbing haunted house

They got some of the details wrong, but the gist was right.

This is a prototype tentacle robot I built this summer for a friend’s haunted house.

Electronically, it’s an Arduino with a motor shield running a NEMA17 stepper. Mechanically, the stepper output goes through a 2.5:1 reduction and drives a PVC cylinder that the tentacle’s control lines are wrapped around: one line plays out as the other is drawn in.
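For a sense of the drum resolution, assuming a typical 200-step (1.8°) NEMA17, which the post doesn’t specify:

```python
motor_steps_per_rev = 200   # assumed: a typical 1.8-degree NEMA17
reduction = 2.5             # motor revolutions per drum revolution

steps_per_drum_rev = motor_steps_per_rev * reduction   # 500
degrees_per_step = 360 / steps_per_drum_rev            # 0.72 degrees at the drum
```

The reduction also multiplies the stepper’s torque at the drum by the same 2.5 factor, minus losses.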

The really cool thing is that it took me 10 days from seeing the tentacle to having parts fabricated. I used the laser cutting service Ponoko.com. The wooden parts are 6.7mm 3-layer bamboo plywood; the plastic bits are 3.2mm delrin, which has some really nice mechanical properties. I got the parts with a one-week turnaround. Other than the bearings and axle, which I had to order from Amazon Supply, the L-brackets, shaft collars, and PVC all came from Ace Hardware.

Gears were generated with a python script that outputs a .dxf file.

Here’s a flickr set of the AutoCAD assembly. From AutoCAD, parts are exported as PDF, then edited into an EPS file in Illustrator for Ponoko, using one of their three standard stock-size templates.

Jawathulhu Tentacle Mechanism

Flickr Set

I accidentally built a robot

I was cleaning my desk last weekend, and accidentally built a robot.

Done-ish PiBot

(Admittedly, the reason my desk was cluttered was that I had been gathering parts for this robot, but I wasn’t really anticipating building it this weekend.)
Brains are an Arduino Mega + Raspberry Pi. The chassis has 4 pretty nice gear motors, a Sabertooth motor controller, and 2 quadrature encoders (the motors on each side are wired in parallel).
On-board sensors are an accelerometer/gyro/compass, GPS, and barometric pressure. The sensor mast swivels 180 degrees. One side carries 3 Parallax PING))) ultrasonic sensors; on the same swivel, but independently movable in the horizontal, are a RocketFish HD webcam with 2 lasers in the same plane, and a bigass light. This gives me hemispheric sensor coverage out front.
(The reason for all the motion sensing stuff is this thing is going to be a development platform for quad rotor autopilots.)
That’s a GoPro on front and a 5.8 GHz FPV system. Main control is through an XBee on the Arduino, although the Pi has WiFi.
Mechanically, the chassis is laser-cut MDF, the superstructure is a combination of MicroRax aluminum extrusion and some delrin I got cut at Ponoko.com.
Now I just need to wire the thing up, and, you know, program it.

Reddit discussion