Computing pushed math – and led to a nuclear option

Mathematics had a big hand in the development of computers, but during the early days of computing it was the latter that helped drive the former, leading to some very odd but useful applications in Canada – particularly in the oil and gas sector.

For instance, in the mid-1960s the U.S. Air Force contracted a Canadian professor – then head of the mathematics department at the University of Manitoba – to develop a particular computer solution: finding what’s known as “non-cyclic Latin squares of high order,” which are still quite difficult to compute today.
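To see what a Latin square is, here is a minimal sketch. The cyclic kind is trivial to build – each row is the previous one shifted by one – which is precisely why the Air Force problem concerned the non-cyclic kind; this snippet only illustrates the definition, not the hard search.

```python
def cyclic_latin_square(n):
    """Build the n x n cyclic Latin square: row i is 0..n-1 shifted by i."""
    return [[(i + j) % n for j in range(n)] for i in range(n)]

def is_latin_square(sq):
    """A Latin square has every symbol exactly once in each row and column."""
    n = len(sq)
    symbols = set(range(n))
    rows_ok = all(set(row) == symbols for row in sq)
    cols_ok = all({sq[i][j] for i in range(n)} == symbols for j in range(n))
    return rows_ok and cols_ok
```

Non-cyclic squares of high order cannot be produced by such a shift rule, which is what made them a serious computing problem.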

Another problem was to calculate exactly how much stress was being put on turbines when aircraft turned at high speed. A spinning jet engine acts like a gyroscope that resists the turn, resulting in tremendous stress on the turbine blades. The computer solution was a calculus-like computational technique, which in turn helped engine designers ensure turbines did not disintegrate in flight.

But it was in the oil industry that math and computers, working together, had the most impact. Linear programming had recently been developed as a way of optimizing outputs for a given set of inputs. For instance, if a variety of cat foods needed to be created by mixing a series of nutritious ingredients, linear programming could tell you the most profitable mixture. The oil industry adapted it to optimize the outputs of gasoline and other petroleum products, given the type of crude input used at the refinery.
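A refinery blending problem of this kind can be sketched with a modern LP solver. The numbers below are entirely hypothetical – invented prices, crude supply, and cracking capacity – chosen only to show the shape of the optimization; this assumes SciPy is available.

```python
from scipy.optimize import linprog

# Hypothetical problem: split up to 100 barrels of crude between
# gasoline (x) and heating oil (y).  Gasoline earns $4/barrel and uses
# 2 units of cracking capacity; heating oil earns $3/barrel and uses
# 1 unit; 150 units of capacity are available.
c = [-4, -3]                     # linprog minimizes, so negate the profits
A_ub = [[1, 1],                  #  x +  y <= 100  (barrels of crude)
        [2, 1]]                  # 2x +  y <= 150  (cracking capacity)
b_ub = [100, 150]

res = linprog(c, A_ub=A_ub, b_ub=b_ub,
              bounds=[(0, None), (0, None)], method="highs")
barrels_gasoline, barrels_heating_oil = res.x
profit = -res.fun                # best achievable profit for this crude
```

Here the solver balances the two products (50 barrels of each, for $350) rather than making only the higher-priced one, because gasoline consumes cracking capacity twice as fast – exactly the kind of trade-off refinery planners used the technique for.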

At first only technical people were allowed to use this technique at the refineries. But it led to the development of IT training programs for refinery managers who could then use it in their roles.

Another challenge was finding the oil. At the time, companies in Alberta were allowed to explore for hydrocarbons in designated areas. Upon finding hydrocarbons, a company could select half the designated territory for exploitation, with the other half going to the highest bidder. The selected half had to consist of limited-size rectangles that could join at the corners but not side by side, like the steps of a staircase. The problem was how to optimize the selection of rectangles.

Initially the problem was tackled with a computerized “Monte Carlo” technique (akin to throwing dice a few thousand times). Then the American mathematician George B. Dantzig was asked to develop a method of getting integer solutions, so that the rectangles selected conformed to the legal requirement of having non-fractional results – a process now known as integer programming.
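The contrast between the two approaches can be sketched on a toy version of the selection problem. The values and areas below are hypothetical, and the awkward corner-adjacency rule is omitted – this only shows why Monte Carlo sampling gives a good-but-not-guaranteed answer while exhaustive integer search gives the exact one.

```python
import itertools
import random

values = [7, 3, 9, 4, 8, 2, 6, 5]   # hypothetical reserve estimate per rectangle
areas  = [4, 2, 5, 3, 4, 1, 3, 2]   # hypothetical area of each rectangle
LIMIT = 10                           # total area the company may keep

def value(subset):
    """Total value of a subset of rectangles, or -1 if it exceeds the limit."""
    if sum(areas[i] for i in subset) > LIMIT:
        return -1
    return sum(values[i] for i in subset)

# Monte Carlo: try a few thousand random subsets and keep the best seen.
rng = random.Random(42)
best_mc = 0
for _ in range(5000):
    subset = [i for i in range(len(values)) if rng.random() < 0.5]
    best_mc = max(best_mc, value(subset))

# Exact integer answer: enumerate every subset (feasible at this tiny size;
# real instances needed Dantzig-style integer programming instead).
best_exact = max(value(list(s))
                 for r in range(len(values) + 1)
                 for s in itertools.combinations(range(len(values)), r))
```

At industrial scale the subsets cannot be enumerated, which is why a branch-and-bound integer programming method was the real breakthrough.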

Drilling with bombs

Mathematics was also used in another, controversial plan: the potential release of oil from the deep tar sands of Alberta by exploding nuclear devices underneath them – a joint project considered by Imperial Oil and Richfield Oil.

The theory postulated that the shock waves would crack the heavy oil, making it more fluid, and the generated heat would enable it to flow. Tests in Nevada established the feasibility of the plan. A J-shaped tunnel was created with the bomb at the end of the J. Tar sands were placed in the tunnel and the bomb exploded, showing the theory worked even in the “tuff” rock of Nevada. But rocks behave differently under extreme heat and pressure, so the equations had to be mathematically converted to the rock types of Alberta.

I was a part of the project team, and we estimated that each bomb, costing $1,000,000 with the necessary infrastructure, would generate between 250,000 and 1,000,000 barrels of oil – a worst-case cost of $4 per barrel and a best case of $1 per barrel. But it never happened: somehow the project information was released in the U.S. before Canada had approved it for Alberta. Prime Minister Diefenbaker was so incensed at this that he killed the project.

One statistical technique adapted to the computer was regression analysis, used to analyze the best way of building a pipeline, drilling wells at minimal cost, and constructing a gas processing plant. It involved starting with all possible variables, then dropping the least significant one and repeating the process until only significant variables remained.
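That procedure is what statisticians now call backward elimination. A minimal sketch, using only NumPy and synthetic data (the variable names, threshold, and data are all invented for illustration):

```python
import numpy as np

def backward_eliminate(X, y, names, t_min=2.0):
    """Fit least squares, drop the variable with the smallest |t|-statistic,
    and refit, until every remaining variable clears the t_min threshold."""
    X, names = X.copy(), list(names)
    while X.shape[1] > 1:
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        resid = y - X @ beta
        dof = len(y) - X.shape[1]
        sigma2 = (resid @ resid) / dof                    # residual variance
        stderr = np.sqrt(np.diag(sigma2 * np.linalg.inv(X.T @ X)))
        t = np.abs(beta / stderr)
        worst = int(np.argmin(t))
        if t[worst] >= t_min:
            break                       # everything left is significant
        X = np.delete(X, worst, axis=1)
        del names[worst]
    return names

# Synthetic demo: y really depends on x1 only; x2 and x3 are pure noise.
rng = np.random.default_rng(0)
x1, x2, x3 = rng.normal(size=(3, 300))
y = 3.0 * x1 + rng.normal(size=300)
kept = backward_eliminate(np.column_stack([x1, x2, x3]), y, ["x1", "x2", "x3"])
```

Run on this data, the noise variables are eliminated and the genuinely significant one survives – the same winnowing the oil companies ran on pipeline and well-cost data.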

Another oil-seeking development was the technique of building what’s known as “synthetic seismograms.” Up to this point geophysicists set off explosions at ground level and graphed the deflections from underground surfaces.

By looking at this rather complicated output they tried to predict where it might be useful to drill a well. It was surmised that if they could generate a seismogram synthetically, with known parameters, it would help them analyze the signals generated from the surface explosions – which proved to be the case.
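The standard modern construction of a synthetic seismogram convolves a series of known reflection coefficients with a source wavelet; whether the early oil-industry work used exactly this recipe is not stated in the article, so treat this as an illustrative sketch with invented depths and coefficients.

```python
import numpy as np

def ricker(f, dt, length=0.128):
    """Ricker (Mexican-hat) wavelet with peak frequency f Hz, sampled every dt s."""
    t = np.arange(-length / 2, length / 2, dt)
    a = (np.pi * f * t) ** 2
    return (1 - 2 * a) * np.exp(-a)

dt = 0.002                        # 2 ms sampling interval
trace = np.zeros(500)             # 1 s of two-way travel time
trace[[100, 230, 400]] = [0.4, -0.25, 0.3]   # hypothetical reflection coefficients

# The synthetic seismogram: known reflectivity convolved with a known wavelet.
seismogram = np.convolve(trace, ricker(25.0, dt), mode="same")
```

Because every parameter here is known, the resulting trace can be compared against field recordings to interpret what the real underground reflectors look like.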

Many years later, the medical profession developed synthetic electrocardiograms and electroencephalograms. Had the oil industry not been so secretive the medical profession could have saved years of work.

Hodson is an Ottawa-based IT industry veteran who has helped develop Canadian computer science programs. Contact him at [email protected].
