The Role of Technology in Meeting Current and Future Petroleum Energy Demand
Author: John Rudesill
August 31st, 2005
The current debate over “peak oil” is both vigorous and contentious, reflecting the high stakes involved. Optimists maintain that the future availability of petroleum crude oil is unlimited and there is no need to worry about satisfying rapidly increasing consumption, even though production in major mature fields is declining and the rate of major new discoveries is in danger of not keeping pace with reserves depletion. Pessimists assert that the historical peak in petroleum production is imminent and will herald dire shortages and political unrest. Other articles in this issue address the scientific objectivity and validity of these opposing viewpoints.
Many discussions of “peak oil” only briefly consider the role of technology, as though it were not important in meeting petroleum product demand. Our task in this article is to assume as a given that supplies of natural petroleum are unlimited and then attempt to answer the following question: What role does technology have in limiting our ability to meet current and future demand for petroleum products? We will endeavor to justify to the reader our conclusion that while technology does play a very important role in meeting our petroleum fuel needs, that role depends on investment and, ultimately, political priorities.
Introduction and Background
A meaningful discussion about any role technology plays needs a working definition of the term technology. In the most general case, technology can be seen as any means that enables and/or facilitates the accomplishment of a given useful result. We may recall the old cartoon image of a cave man struggling to move a stone too heavy for him to budge. Then, he gets an idea (light bulb above head) and fashions the first lever made from a sturdy tree limb and he is then able to pry the stone to move in useful increments with reasonable effort. This example demonstrates the principles of mechanical advantage and leverage. Ideas are initially intellectual curiosities and only become technology when they have demonstrated a practical ability to achieve a given useful economic result. In this sense, language and mathematics, for example, qualify as technology when applied to get a useful result. The value of technology is a function of the magnitude of its potential to create economic advantage. The value of some technologies is so great that the political power structures are obliged to assert control over access. Energy supplies and weapons of mass destruction are two such areas currently subject to careful monitoring and control.
Petroleum products reach consumers after passing through a multitude of complicated processes highlighted by several key steps. The sequence begins with exploration and production (E&P). This phase includes the exploration, drilling, discovery, well completion, production of the crude petroleum oil from the well, and transport to refineries. At the refinery, the raw crude oil is converted into its various petroleum products and from there transported to consumers. Many of these conversion steps require intense and sophisticated technology to achieve. As we study the key steps, we will focus on the current status in the U.S. (the world’s largest and most sophisticated refining industry). We will then review a range of technology choices we have available toward meeting increasing petroleum product demand, especially in developing countries, and we will project the likely consequences of these choices. Finally, we will conclude with some preferred courses of action.
Petroleum products are essentially all liquid hydrocarbons as used. There are many new and historical processes to synthesize liquid hydrocarbons from other natural fossil carbon sources that are not formally considered petroleum crude oil. Technology provides the methods to either extract synthetic crude in the case of oil shale and tar sands or convert in the case of coal, bitumen, petroleum coke, petroleum resid, and certainly natural gas to the desired liquid hydrocarbons. All of the synthesis scenarios require gasifying the carbon source and are often lumped under the subject “gas to liquids” (GTL). We will also have some commentary on the developing schemes to make use of the large, even vast, untapped natural methane resources in remote locations, including methane hydrates in permafrost and ocean bottoms.
A wide range of resources is available to anyone interested in learning about petroleum issues, and many are online, from the U.S. Government DOE and EPA web pages and from many major oil companies, for a start. There are many recent books out, as well as older sources and endless blogs one can peruse for further perspective. Where a source is directly used it will be noted; otherwise the views are those of the author.

To help the reader grasp the magnitude of the petroleum industry on Earth, it is useful to know some contextual background information. Total worldwide daily production in round figures is about 80 million barrels per day (b/d); 1 barrel = 42 gallons. This figure includes actual crude oil and the crude equivalents of light hydrocarbon liquids condensed from natural gas production. The U.S. consumes about 20 million b/d, or 25% of the world total. This means that ~300 million people, only ~5% of the world’s population, consume one-quarter of the petroleum produced. Petroleum is 40% of the total energy consumed in the U.S. and about 90% of that is used in transportation of all types. This converts to 2.8 gal. per day per person in the U.S., while the rest of the world averages only 0.42 gal. per day per person. The least developed countries consume less than a tenth of a gallon per day per person. We are clearly the big consumers. It is important to realize, however, that if the U.S. stopped using any oil, it would only raise the average daily per capita consumption of the remaining six billion humans to 0.56 gal. per day. Conversely, for the same six billion humans to enjoy our standard of consumption would require production to increase from 80 million b/d to 400 million b/d, a five-fold increase in the production and use of petroleum. Even the optimistic oil supply proponents would have to admit that this level of greenhouse gas carbon dioxide emissions is asking for climatic problems.
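The per-capita figures above follow directly from the round numbers quoted; a short script reproduces the arithmetic (population counts rounded as the author rounds them):

```python
# Per-capita oil consumption arithmetic, using the round figures
# quoted in the text (2005-era values, populations approximate).
GAL_PER_BARREL = 42

world_bpd = 80e6   # world production, barrels/day
us_bpd = 20e6      # U.S. consumption, barrels/day
us_pop = 300e6     # ~300 million people
row_pop = 6e9      # rest-of-world ("the remaining six billion humans")

us_gal_per_cap = us_bpd * GAL_PER_BARREL / us_pop
row_gal_per_cap = (world_bpd - us_bpd) * GAL_PER_BARREL / row_pop

# If the U.S. stopped consuming and all 80 million b/d went to the rest:
redistributed = world_bpd * GAL_PER_BARREL / row_pop

# Production needed for six billion people at the U.S. rate:
needed_bpd = row_pop * us_gal_per_cap / GAL_PER_BARREL

print(f"U.S.: {us_gal_per_cap:.1f} gal/day/person")            # 2.8
print(f"Rest of world: {row_gal_per_cap:.2f} gal/day/person")  # 0.42
print(f"Redistributed: {redistributed:.2f} gal/day/person")    # 0.56
print(f"Needed: {needed_bpd / 1e6:.0f} million b/d")           # 400
```

Every figure in the paragraph checks out, including the five-fold increase (400 vs. 80 million b/d).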
Here is a surprising statistic (from a DOE report by Energetics, attributed in the Refining Section) that gave me pause. While the petroleum refining industry in the U.S. provides 40% of our total energy as stated above, it also consumes 7.5% of the total! When normalized to a petroleum basis, we find that the equivalent of almost 19% of the energy value of the petroleum itself is consumed during refining. Further consumption occurs during production, during transportation in crude tankers for imported oil, and during transport from the refinery to the end user, all of which perhaps add another 1% to the 19%, giving ~20% as the total self-consumption. This is a very substantial conversion loss and reflects the intense technological input needed to raise the quality of crude petroleum to meet the standards for clean and convenient use in transportation engines, heating, and power generation. The phrase “wells to wheels” was coined to capture the overall efficiency of realizing a unit of motive energy at the wheels of, say, a car compared to the energy potential at the well where the necessary quantity of petroleum originated. In the following sections, we will see how technology drives this efficiency.
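The normalization step above is worth making explicit: refining takes 7.5% of all U.S. energy, while petroleum supplies 40% of it, so refining’s share of the petroleum energy itself is the ratio of the two.

```python
# Normalizing refinery energy use to a petroleum basis, per the
# figures in the text above.
refining_share_of_total = 0.075   # refining uses 7.5% of total U.S. energy
petroleum_share_of_total = 0.40   # petroleum supplies 40% of total U.S. energy

refining_share_of_petroleum = refining_share_of_total / petroleum_share_of_total
print(f"{refining_share_of_petroleum:.1%}")  # 18.8%, i.e. "almost 19%"

# Adding the author's rough ~1% for production and transport losses:
total_self_consumption = refining_share_of_petroleum + 0.01
print(f"~{total_self_consumption:.0%} total self-consumption")  # ~20%
```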
Exploration and Production
Natural petroleum crude oil has been known since ancient times at places where it naturally seeps out from the Earth’s surface, e.g. the famous La Brea Tar Pits. The modern extraction of petroleum dates from 1859, started by Col. Edwin Drake in Titusville, Pennsylvania. Crude oil is composed primarily of carbon and hydrogen atoms combined in thousands of different molecular variations, ranging from a low molecular weight of 16 g/mol for methane (CH4) up to 1,000 or more for very large asphalt molecules. Most commercial crude oils have an average molecular weight in the 400-500 range. Crude oil composition varies greatly from one source to another, such that each producing zone has a unique chemical fingerprint identity. There are many impurities in crude oil, with sulfur and nitrogen atoms being the most common and problematic. Both can exceed 1% by weight and are substituted for carbon atoms in the hydrocarbon molecules. The most valuable crude oil has both low sulfur (<1%, termed “sweet crude”) and a lower average molecular weight (“light crude”). Such “light sweet crude” requires the least refining and yields the most gasoline, leading to the highest refining margins. Lesser amounts of oxygen and trace metal atoms like nickel and vanadium also occur and sometimes present special challenges during refining, which will be addressed in a later section.
Petroleum geologists are the specialists who probe the jumble of rock formations as much as five miles below the surface on land and below the ocean bottoms and decide where to drill exploratory wells. This is the highest risk part of the “wells to wheels” enterprise and enormous investments are required to proceed. How then do petroleum geologists decide where to drill exploratory wells?
In the days of Col. Drake, drilling was attempted by guess work based on surface seeps and often resulted in producing wells from as poor as 1 in 40 attempts to 1 out of 10 times at best. This form of near random drilling is truly “wildcatting.” We can see a diagram of the various technologies used to find oil on the Chevron Texaco website (“Petroleum Prospecting Primer”): http://www.chevron.com/learning_center/primer/.
Finding and producing petroleum has benefited from dramatic advances in technology in recent years and in some cases has been at the forefront of specialized development and use of computer hardware and software. Exxon’s website (http://www.exxonmobil.com/corporate/Campaign/Corp_campaignhome.asp) has information on many aspects of E&P technology. We will summarize the relevant points in the following paragraphs.
The interest in petroleum crude oil in the second half of the nineteenth century was mainly to replace coal oil in kerosene lamps introduced in 1854. The gasoline fraction was burned off, as it was too volatile for safe use in technology of the day. What this points out is that technology advances play a powerful role on the demand side of the economic equation as well as the supply side. Toward the end of the nineteenth century, the spark ignition internal combustion engine became practical and the demand for the gasoline fraction of crude oil to fuel it started to grow even as the electric light was displacing oil lamps. The twentieth century saw continuous growth in the demand for gasoline to fuel automobiles and the near complete dominance of electric lighting.
An oil well success ratio of 1 in 10 was unacceptable, and geologists began to recognize that certain types of geologic sedimentary basins overlay potential oil-bearing strata. They also began to realize that natural petroleum was apparently forming (biogenic or otherwise) in what are called source rocks containing trapped decomposing carbonaceous matter. Mobile liquids then migrated to nearby porous reservoir rock, where they remained only if capped by impervious strata that formed a trap. It is very expensive and slow to map a promising formation with the numerous core drillings used for mining surveys. The outcome was the development of seismic surveying technology, begun around 1930. It is a crude form of what we today recognize as high-tech medical ultrasound imaging. Geologists would place a string of evenly spaced geophones (like microphones) in straight lines over the strata of interest and then set off a dynamite charge to create a low-frequency sound pulse. The geophones would receive the reflected sound pulses, and the signals from each phone were recorded on paper strip charts, or on magnetic media in recent years. The plotted result was a “2D” two-dimensional slice image of the underlying strata. The resolution was poor by today’s standards, and there is a current effort to re-examine old seismic surveys with modern interpretation tools to see if reservoirs were missed. As technology improvements provided better quality data acquisition and faster computers, it became practical to run parallel 2D slices and construct a “3D” three-dimensional image of the strata. An array of thousands of geophones is used in modern surveys. Some of the most powerful supercomputers are now used to process seismic data, and Exxon has what they call “4D” imaging, which even shows the movement of oil in the reservoir strata by repeating “3D” seismic surveys spaced over time.
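The core idea of the seismic survey described above can be sketched in a few lines: the two-way echo time of the sound pulse, together with an assumed sound velocity in the rock, gives the depth of a reflecting layer. The velocity figure below is an illustrative assumption (a typical value for sedimentary rock), not a number from the article.

```python
# Minimal sketch of the reflection-seismology principle: a surface
# source emits a pulse, a geophone records the echo, and the two-way
# travel time locates the reflecting stratum.

def reflector_depth_ft(two_way_time_s: float,
                       velocity_ft_per_s: float = 10_000) -> float:
    """Depth of a horizontal reflector directly below a coincident
    source/receiver pair: the pulse travels down and back, so the
    one-way distance is velocity * time / 2."""
    return velocity_ft_per_s * two_way_time_s / 2

# An echo arriving 2 seconds after the shot, at the assumed velocity:
print(reflector_depth_ft(2.0))  # 10000.0 ft down
```

Real processing is enormously more involved (many geophones, oblique ray paths, velocity varying with depth), which is why supercomputers are used, but every 2D, 3D, and 4D image is built from this travel-time relationship.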
Petroleum geologists and reservoir engineers employ very sophisticated and highly proprietary software to model the life of a producing reservoir so they can optimize initial well placement, production and depletion rates, and plan eventual rework and maintenance of the reservoir. The incentive, of course, is to maximize the recovery and profits long term. The successful drilling ratio now approaches 1 in 2.
Seismic surveying has been adapted to map the strata below the ocean bottom by towing strings of evenly spaced hydrophones behind a vessel and using compressed air guns to supply the sound pulses. Today deep water offshore oil wells are made possible by remarkable advances in technology. It is now practical to drill from a floating platform in 7,000 feet of water to depths of five miles below the sea bottom. When combined with offset drilling (introduced in the late 1970s), which enables horizontal boring, we can access ~100 square miles of undersea producing zones from a single platform using multiple offset wells splayed out like spokes. A deep water project with a floating platform and multiple wells will cost several hundred million dollars and take several years to start producing. The ocean bottom accessible for exploration and drilling has expanded tremendously with the innovation of floating platforms with multiple offset wells and is no longer limited to the continental shelves along each continent’s coast.
Wells were drilled for decades by mounting a fixed drill bit on the end of the drill pipe string and then rotating the entire drill string lubricated by specially formulated drilling mud. That is quite a trick when you have miles of pipe hanging on the derrick and drive mechanism and then you have to replace a dull or broken bit! The entire string of pipe has to be removed piece by piece and then reinserted after the bit is changed. The offset drilling technology uses a downhole electric motor-driven cutting head that can be canted like a wrist joint to accomplish the turn from vertical to horizontal. The heads are guided by various technologies, including inertial. The systems have to be able to operate at elevated temperatures up to 400°F and pressures up to 20,000 psig and are models of robust engineering.
Materials technology has improved drill bit longevity and cutting efficiency through advances such as diamond and other super-hard coatings for the bits.
There are a number of high-tech downhole instruments used to analyze the strata the hole is exposing for signs of oil and for information about the porosity of the rock. Among the instruments used are radiation sensors, gamma and neutron probes, acoustic porosity sensors, temperature, conductivity, and others. The measurements are logged as a function of depth and used to determine the location of hydrocarbon zones and rock porosity. The field engineers use the data to make recommendations for finishing the well to put it into production.
A final note on high-tech instruments in the exploration phase: Gravity and magnetic anomaly maps are made from aerial surveys with gravity meters and magnetometers and are used to help decide where to begin seismic surveys. Sniffing instruments are also in use at or near the surface to sense telltale gases often found above oil and gas deposits. Portable gas chromatographs with various detectors can be used for this purpose.
During production, crude oil frequently is associated with brines and a certain amount of brine is produced with the crude oil. The brine must then be separated and disposed of properly. Crude production takes place in several phases. Essentially all 11 million b/d of the Saudi Arabian crude oil is from what is called secondary production using what is termed “water flood.” This means water from either deep aquifers or salt water from the Persian Gulf is pumped down injection flood wells and the brine pushes or sweeps the crude oil along to the producing wells, where it comes to the surface under pressure from the water flood. If water injection is stopped, oil production soon stops. Reservoir modeling technology is used to optimize the placement of the water injection wells and to determine the optimum pumping rate to recover the most total crude oil. Primary production results from natural gas and hydraulic pressures in the producing zone forcing the crude oil to the surface. So-called gushers result when this pressure is allowed to vent without control. Rapid release of this pressure must be prevented, as doing so can severely damage the well so that ultimate recovery is much less. Production rate declines as the natural pressure falls off after a few years in service and if the producing zone is not suitable for secondary water or carbon dioxide flood, then a lift pump is inserted into the well and the well is called a “stripper well,” yielding less than 20 b/d. There are huge numbers of stripper wells operating in all the older major producing regions. I remember as a child seeing them in the early 1950s in Southern California.
One of the major production challenges is transporting the oil from where the wells are to where the demand is. In the case of the U.S., we import several million b/d onboard supertankers. Here again technology has delivered by providing floating tankage that can exceed two million barrels in one load. These massive ~1,000’ long vessels are double hulled for protection from hull breaches and leakage, and they are so automated that the crew may only be a couple of dozen. They typically travel at 12-15 knots underway and may reach the U.S. Gulf Coast in about three weeks from the Middle East if they are too large to pass the Suez Canal. These ships are not built overnight, nor are the shipyards that build them readily available, and so as consumption remote from the wells increases it will be necessary to build more of these supertankers in phase with the demand increase. The current tanker fleet contains many old smaller tankers that need either to be scrapped or overhauled, but the needed investment is lagging, increasing the risk of oil spills.

Figure 1. Refinery outputs 1996. From the Petroleum Refining Industry Study, available at www.eere.energy.gov/industry/petroleum_refining/pdfs/profile.pdf
In the final sections of this article, we are going to proceed with the idea that the foregoing E&P section is rolled into our base assumption that the crude oil supply is unlimited, so that we can focus on any roles technology plays in limiting the ability of the crude oil refining infrastructure to meet current and increasing demand for petroleum products.
Refining

Crude oil refining is a large mature industrial enterprise that does mainly two things: One, it removes unwanted impurities from the crude oil. Two, it converts the jumble of large hydrocarbon molecules into the structures most valued by consumers, i.e. gasoline. The U.S. market demands that about 45% of the crude oil be converted to gasoline. See Figure 1 and Table 1 for a breakdown of the typical U.S. refinery product slate, and note that there is a volume increase through the refinery, due to the decrease in density from crude oil feedstock to products, such that a 42 gallon barrel of feed gives 44.6 gallons of products. It is interesting that we have not built any new grassroots refineries in the U.S. since the early 1970s, coincidentally when U.S. crude production was peaking at over 12 million b/d. Instead there has been an ongoing closure of refineries and consolidation of ownership into a few rather large refining companies. Since both U.S. and world crude demand has steadily increased, it has been necessary to build refineries in other parts of the world (South Korea, India, and Indonesia, to name a few) to match refining capacity with demand. The present U.S. refining capacity is therefore tightly managed to just meet market demands.
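The volume increase through the refinery, known in the industry as processing gain, is easy to quantify from the figures above:

```python
# "Processing gain": refined products are less dense than the crude
# feed, so total volume grows through the refinery. Figures from the
# text: one 42 gallon barrel of crude yields 44.6 gallons of products.
feed_gal = 42.0
product_gal = 44.6

gain = product_gal / feed_gal - 1
print(f"volume gain: {gain:.1%}")  # 6.2%
```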
Table 1. Average yields for U.S. refineries in 2000. Available online at: www.eere.energy.gov/industry/petroleum_refining/profile.html.
We have noted that a high level technology is used to manage the refining assets to just meet the market demands. How does that translate down to the physical and chemical technology at work in the refinery? A more familiar analogy may help here. Consider a large integrated bakery that takes in flour as the main ingredient and then has to try to use as much of the machine capacity they have to make each item, such as bread, donuts, cookies, pizza dough, etc., and somehow match exactly what the consumers want of each item. If they make too many cookies, they may have to sell some at a loss or even trash them. If they don’t make enough bread, then they lose sales and the consumer may buy from a competitor. The engineers and managers running an oil refinery face similar challenges.
As we said before, the oil refinery removes impurities from the crude oil and restructures the molecular stew to meet standardized product specifications. The capacities for each of these processes were set when the refinery was planned, usually based on a particular crude oil composition and a particular expectation for product demand. Since U.S. refineries are about 30 years old, they are not seeing either the crudes they were designed for or the product demands and specifications expected. These changes are part of the reason that some refineries became unprofitable and were shut down. Consumers have demanded cleaner, more environmentally friendly low-sulfur fuels at the same time that refiners are driven to process less costly higher-sulfur crudes just to survive. This bind has further contributed to refinery shutdowns, as the investment necessary to make low-sulfur products from increasingly higher-sulfur feeds is prohibitive for some companies. Wall Street investors continue to prefer the higher short-term returns of many flashier high-tech investments over oil refining stocks.
Figure 2. Simplified refinery process flow diagram. From the Petroleum Refining Industry Study, www.eere.energy.gov/industry/petroleum_refining/pdfs/profile.pdf.
We have given a brief description of how sulfur and other contaminants are replaced with hydrogen. Now let’s look at how carbon is rejected in other major process units in a typical refinery. There are two main types of carbon rejection process units in the refinery. The first is the fluid catalytic cracking unit and the second is the delayed coker unit. In both units, a shift of carbon and hydrogen arrangements occurs where large molecules often boiling above 950°F are broken or cracked and a low hydrogen coke residue is made along with a full boiling range of liquids, some C1-C4 gases, and hydrogen. Some sulfur and nitrogen atoms are cracked out of the molecules and are recovered as hydrogen sulfide and ammonia.
The gasoline liquids from the FCCU and the coker still don’t add up to the demand for 45% gasoline product. Further, the straight run gasoline from the crude unit is of such low octane value as to be unusable in modern engines without upgrading it.
Petroleum Liquid Fuels from Other Carbon Sources
Large amounts of fossil carbon exist in forms other than crude oil, and some of these resources are being exploited already. The most significant example is the extraction and conversion of bitumen from oil sand into syncrude in Alberta, Canada. This operation supplies about 13% of Canada’s liquid fuels, and some is exported to the U.S. The bitumen in place is estimated at 2-3 trillion barrels, enough on paper for many decades of total current world consumption, though only a fraction is recoverable. It is a technology-limited process because huge amounts of oil sand have to be surface mined and extracted to recover 90% of the bitumen. Large amounts of water and natural gas are also needed to run the process. Current annual output is only equal to a few days of U.S. demand. Large tracts of land are disturbed, and the sensitive habitats have to be restored. Until a more efficient extraction technology is developed, oil sand syncrude is not going to rival crude oil any time soon.
There are similarly large amounts of heavy solid oil in Venezuela awaiting efficient extraction and conversion technology to truly supplement crude oil. We will probably utilize the more liquid heavy crudes found in Colombia, Saudi Arabia, and other locations before the Venezuelan tar is used in any quantity.
Shale oil is another carbon resource (found in large quantities in the Western U.S.) that awaits an efficient technology to make its extraction attractive. It also requires large amounts of process water and disruption of surface habitat.
Gas to Liquids (GTL)

This is a technology-limited process that can convert many different carbon sources into clean low-sulfur products. We can look to the Exxon website and find that they are the leading major oil company in this technology. There are several competing catalytic processes to make liquids from gasified carbon sources. They are all based on the German Fischer-Tropsch hydrocarbon synthesis, known for almost a century. In this reaction, the carbon source is partially oxidized in air or pure oxygen to give mostly carbon monoxide, which can be reacted with steam to make hydrogen and carbon dioxide. The hydrogen is then used to react with more carbon monoxide, and linear hydrocarbon chains can be synthesized up to solid waxes. The Sasol Corporation in South Africa has been making liquid fuels from coal by this process for decades, due to the embargo on petroleum imports during the apartheid era.
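The chemistry just described can be written as three simplified, idealized reactions, and their stoichiometry checked programmatically (these are textbook forms of the reactions, not plant-specific conditions):

```python
# Sketch of the Fischer-Tropsch chemistry outlined above, as
# simplified textbook reactions:
#   partial oxidation: 2 C + O2 -> 2 CO
#   water-gas shift:   CO + H2O -> CO2 + H2
#   FT synthesis:      n CO + (2n+1) H2 -> CnH(2n+2) + n H2O

def ft_synthesis_balance(n: int) -> bool:
    """Check atom balance for n CO + (2n+1) H2 -> CnH(2n+2) + n H2O,
    the synthesis step that builds linear hydrocarbon chains."""
    c_in, h_in, o_in = n, 2 * (2 * n + 1), n
    c_out = n                    # carbons in the alkane chain
    h_out = (2 * n + 2) + 2 * n  # alkane hydrogens plus water hydrogens
    o_out = n                    # oxygens leave as water
    return (c_in, h_in, o_in) == (c_out, h_out, o_out)

# Balanced for any chain length, from n=1 (methane) to n=20 (a wax):
print(all(ft_synthesis_balance(n) for n in range(1, 21)))  # True
```

The balance holds for every chain length, which is why the same reactor can yield everything from light gases to solid waxes depending on catalyst and conditions.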
Exxon is planning to utilize natural gas that is either flared or not produced to make liquids. There is a lot of gas that occurs too far away from pipelines to use it, so it is left in the ground. There is also a lot of natural gas produced with some crude oils that has no ready access to market and is either flared or the wells are left shut in. Exxon is beginning a large project in Qatar in the Persian Gulf to convert large amounts of natural gas available there into clean distillate liquids potentially usable to help meet low-sulfur diesel targets by blending with crude oil-based diesel. These plants are huge, requiring several billion dollars to make a single 100,000 b/d plant. There are offshore areas where natural gas is available, but they are too remote to access a pipeline and liquefied natural gas plants are not yet planned for remote gas. Exxon and others do contemplate building GTL plants for some of the off shore remote natural gas. The investment required is very large and putting such a plant on a floating platform is unprecedented.
Then there is the enormous amount of methane believed to be held in methane hydrates at the bottom of the ocean over large areas and in some polar permafrost soils. Can this be produced economically? That is a speculative issue now, as no proven technology exists to safely bring the methane up from the bottom without leaking it to the atmosphere, where it is a potent greenhouse gas. Some research programs are underway, as the payoff is potentially very rich. The danger is that the unstable methane hydrates could be disturbed by trying to extract the methane from the ocean bottom, triggering a chain reaction of massive methane release and a greenhouse global warming event. There is also concern about disrupting the sensitive ocean bottom ecosystem on a grand scale. A lot of research and development has to be done fast if methane hydrates are going to be part of our fuel mix any time soon. Since most of the methane hydrates are well offshore, they would need a platform GTL plant to make production viable. Here is another investment opportunity choice.
We have touched on a lot of technology, from petroleum crude oil exploration and production through its refining and alternative carbon source conversion to petroleum liquids. A single theme repeatedly occurs. There are real technology limits at key points in these processes, but they are always dependent on investment. So even if the fundamental natural supply of crude oil is unlimited, that fact alone does not mean we will have enough petroleum products to meet increasing demand. Diligent scientists and engineers can likely invent innovative technology to overcome many if not all existing limits. However, as long as our collective demands for investment returns are too high and too short term to properly underwrite the large long cycle time investments in our petroleum energy infrastructure, we risk precipitating disruptions and shortages in petroleum product supply.

We should keep in mind also that the demand side of technology can change the outcome quite effectively. The whole SUV phenomenon is mostly a marketing coup and has very little objective necessity. SUVs and other low gas mileage consumer vehicles have artificially increased demand for gasoline in the U.S. We don’t have to consume energy at near the rates that we do. We need to apply return on investment metrics to our personal energy consumption to see what we are getting for our expenditures.

We also need to engage the political process to find a way to raise the incentive to invest in long term, lower return financial instruments. The intended result is to ensure that sufficient capital is available to support improvements in refining technology and in conversion of alternative carbon sources to liquid petroleum products. The technology versus investment allocation issue seems like a “chicken or the egg first” question, and at that point it appeals to our basic strength of will and courage to take the necessary risks and strive to build a healthy rewarding future.
Originally Published in Infinite Energy Magazine Issue 60, March/April 2005