
Economics Make All the Difference


An interview with Alexander Sitnikov, Deputy Chief Executive Officer for Geology and Field Development, Gazprom Neft Science and Technology Center

Oil & Gas Journal Russia

Alexander Sitnikov is the Deputy Chief Executive Officer for Geology and Field Development at Gazpromneft NTC. He graduated summa cum laude from the Department of Physics and the Department of Geology and Geophysics at Novosibirsk State University, and was trained at the Petroleum Learning Center at Tomsk Polytechnic University (Heriot-Watt University). His career started at OJSC Sibneft-Noyabrskneftegaz; he was then employed as head of the stimulation and downhole engineering department at Gazpromneft-Vostok LLC and held a series of management positions in oil and gas companies before joining the NTC in 2010. He has authored over thirty scholarly articles.

Alexander, what role do scientific research centers play in the oil and gas industry?

A scientific and technological basis is critical in the contemporary petroleum business. The quality of the reserves we have to deal with is getting worse from year to year, making it impossible to develop them effectively unless operational performance is continuously upgraded and costs are tightly controlled by opting for the most effective way of developing a field at every stage.

The NTC’s core competence is the engineering component of the process: we prepare the scientific and technological basis for the investment and management decisions to be made by the team of the Gazprom Neft Corporate Center. What does that mean? At the planning stage of field development, a target model is formed to describe the conceptual approaches to developing an asset: which parameters are to be impacted, which of them are the key ones, and which issues should be addressed first to eliminate the main project uncertainties. The NTC’s primary job is to simulate a string of processes (physical, mathematical, technological and others) and crunch loads of numbers to settle on the best solution.

However, we prepare models both at the planning and at the monitoring stages of operations, i.e. we adjust them based on field test results. Subsequently, we monitor the ongoing status of an asset, recording changes that affect the efficacy of the solution produced, and decide which technologies will be capable of delivering the best result. That is why two-thirds of all Gazprom Neft development geologists, across all business units, are clustered at the NTC.

Word at the NTC is that one of the key aspects of your work is conceptual engineering. Please give some more details on it.

The philosophy lies in applying a hierarchical approach, i.e. we progress from the simple to the complex. The design stage is a stage of smart simplification. Einstein is said to have remarked that everything should be made as simple as possible, but not simpler. On the one hand, conceptual engineering provides an insight into the integrated link between geology, reservoir engineering, drilling, production and field facilities development; on the other hand, it keeps track of essential parameters that may substantially impact field development at all stages.

What is the difference between the conceptual engineering of a green field and the way of operations at existing assets?

Normally, there is little knowledge of a new asset; there are a host of uncertainties on many fronts, so we have to borrow data from similar fields.

For example, when we set about developing Achimov deposits in one location, we take into account the knowledge gained in similar projects.

The mission of conceptual engineering is to assemble all the key parameters and uncertainty values, sort them out, find the most critical ones and plan what to do next with field development.

Naturally, a solution can be designed around the expected parameter values (normally known as the baseline scenario), or we can create a model that is as stable as possible across a broad range of values. However, our experience shows that the most stable decision often proves to be far from the best one.

When working with new assets, it is essential to use a tool like sensitivity analysis, i.e. to ascertain which parameters affect project development the most. For instance, reservoir permeability and thickness will be of the utmost value to the developer. By varying them, we see how the parameters change and what effect this has on production output, infrastructure and other factors. In the end, our job is to lay out a roadmap for eliminating uncertainties. Note should also be taken of the so-called value of information, i.e. an understanding of how particular data affect the performance of the whole project. In other words, if the cost of acquiring new knowledge about the field geology exceeds the benefit gained from that knowledge, the overall project economics go downhill, and it makes no sense to invest in this knowledge.
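
The one-at-a-time sensitivity screening described above can be sketched in a few lines. This is a toy illustration, not the NTC's actual workflow: the NPV proxy, the parameter names and every number in it are invented for demonstration.

```python
# A minimal one-at-a-time sensitivity sketch with entirely hypothetical
# numbers: vary each reservoir parameter over its low/high range while
# holding the others at base values, then rank parameters by NPV swing.

def npv_proxy(permeability_md, thickness_m, oil_price=60.0, capex=500.0):
    """Toy NPV proxy: production scales with permeability * thickness."""
    production = 0.2 * permeability_md * thickness_m  # illustrative units
    return production * oil_price - capex

base = {"permeability_md": 10.0, "thickness_m": 8.0}
ranges = {"permeability_md": (4.0, 25.0), "thickness_m": (5.0, 12.0)}

swings = {}
for name, (low, high) in ranges.items():
    lo_case = dict(base, **{name: low})
    hi_case = dict(base, **{name: high})
    swings[name] = abs(npv_proxy(**hi_case) - npv_proxy(**lo_case))

# Parameters sorted by impact tell us which uncertainty to resolve first.
ranking = sorted(swings, key=swings.get, reverse=True)
print(ranking)
```

The same swing-versus-cost comparison underlies the value-of-information argument: if buying data to narrow a parameter's range costs more than the NPV swing it removes, the acquisition is not worth it.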

As for the intermediate stages of work, the basic set of start-up facilities that would pay off in any project scenario is made available on site first; the parameters are then refined, and a concept is shaped that balances the level of knowledge against decision-making flexibility. This allows the most efficient solution to be found.

Conceptual engineering is one of the key field development aspects

And how does it work at mature assets?

A large number of wells have been drilled there, and various parameters are known with a high degree of reliability and accuracy. However, the infrastructure is also already in place, so there is none of the flexibility present at the conceptual engineering stage of a new asset. What you get is a conventional project curve: a huge chunk of knowledge is amassed as a field matures and becomes better explored, but it becomes more difficult to influence the development process.

In this case conceptual engineering comes down to re-engineering: decisions are made on how to change the existing crude oil gathering and treatment system or the waterflooding development and management system, or on what to do with those prospects that have not yet been fully explored, to take maximum advantage of the available resources.

What are the criteria for selecting wells during conceptual engineering?

Economics make all the difference. From this perspective, we consider solutions for each type of well: deviated, horizontal, and horizontal with multi-stage hydraulic fracturing. We then look for the best option, factoring in the cost of drilling, the location of well pads and so on, because by drilling a horizontal well, even if it is twice as costly as a deviated one, we might get a flow rate three times as high while cutting costs by curtailing the number of wells on location. In other words, we set great store by the specific cost of the oil produced, and we pick out the most efficacious solution with this very parameter in mind.
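
The "twice the cost, three times the rate" arithmetic can be made explicit with a specific-cost comparison. All figures below are hypothetical placeholders, not Gazprom Neft data.

```python
# Hedged illustration of "economics make all the difference": compare
# well types by cost per unit of initial production rate. Lower is
# better. All numbers are invented for illustration only.

def specific_cost(drill_cost_musd, flow_rate_t_per_day):
    """Drilling cost divided by initial rate: cost per tonne/day."""
    return drill_cost_musd / flow_rate_t_per_day

wells = {
    "deviated": specific_cost(2.0, 50.0),          # baseline case
    "horizontal": specific_cost(4.0, 150.0),       # 2x cost, 3x rate
    "horizontal_msf": specific_cost(5.5, 200.0),   # multi-stage frac
}
best = min(wells, key=wells.get)
print(best, round(wells[best], 4))
```

With these assumed numbers the horizontal well wins despite costing twice as much, which is exactly the logic the interview describes.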

I will give an example. When the development plans for the Novoportovskoye field were on the drawing board, there was no doubt that horizontal wells would have to be constructed, since a mobile gas cap sat on top.

We were presented with a proposal to drill a 400–600 m horizontal section, but once we had built a cost curve, i.e. identified cost as a function of horizontal section length, the best option came out at 2 km. First, we had to address many technological issues related to drilling conditions, the fluids and drilling muds to be used, and so on, so initially we planned on drilling no more than 1.5 km. Then we started to construct wells with horizontal sections as long as 2 km. As construction progressed, it became clear which operations we could perform at lower and lower cost. With the new well cost parameters entering the equation, we recalculated the concept model and found that the best option had moved to a 2.5 km hole section, which allows us to reduce the overall well count in the field. So, by setting new tasks that might at first appear unachievable, we, first, move forward technologically and, second, stretch our envelope of capabilities.

The company has changed its approach to well construction based on new cost estimates. Could you cite any examples from other areas, such as fraccing or other EOR methods?

We decided long ago that we should proceed from the need to resolve a specific problem rather than from whether the requisite technologies are available.

To that end, nine areas of the Technology Strategy were identified. We pinpointed which areas are of primary interest to us and which issues are most relevant to our reserves.

The role of the scientific research center is to create a target model, use it as a stepping stone to a concept model, and figure out which parameter to impact by leveraging innovative technologies to get the maximum economic benefit from field development.

For example, we began to carry out multi-stage fraccing jobs with an increasing number of ports, but we realized that if the ports were located too close to each other, they started to compete for the oil flowing between them. A study was therefore undertaken to arrive at the most reasonable distance between port locations.
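
The port-competition trade-off can be caricatured as follows: ports spaced closer than their drainage region overlap and compete for the same oil, while each extra port still costs money. Every coefficient here is invented; a real spacing study would use coupled flow simulation.

```python
# Toy port-spacing study: net value of a lateral as a function of frac
# port spacing, with an overlap penalty for ports packed closer than a
# hypothetical drainage distance and a fixed cost per port.

def net_value(spacing_m, lateral_m=2000.0, drainage_m=120.0,
              rate_per_port=10.0, cost_per_port=3.0):
    ports = int(lateral_m // spacing_m)
    # Share of a port's drainage lost to overlap with its neighbours.
    overlap = max(0.0, (drainage_m - spacing_m) / drainage_m)
    return ports * rate_per_port * (1.0 - overlap) - ports * cost_per_port

best = max(range(40, 241, 10), key=net_value)
print(best)
```

In this toy model the optimum lands at the assumed drainage distance: closer spacing adds ports that mostly cannibalise their neighbours, wider spacing leaves oil between ports untouched.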

In other words, by identifying the areas and issues requiring change, we can see the best solutions within technological reach and adapt them to the project specifics in the most efficient manner. This approach embodies our philosophy.

Note another thing. Gone is the time when companies developed and patented know-how and locked the solution within their internal environment. Today the upper hand and the competitive advantage go to those who, first, are as open to cooperation as possible and, second, adapt solutions available on the market to their own needs and reserves as fast as possible.

You have used fraccing as an example. Broadly speaking, is it the situation with reserves that compels you to employ fraccing?

Fraccing is a technique enabling us to develop a raft of assets. For instance, Priobskoye, a very large oil field, could not be brought into operation until the 1990s, when fraccing technology came into existence.

However, fraccing is not a universal remedy; it might prove inapplicable at fields with a massive gas cap. Gas is 3–4 times as mobile as oil, so fraccing immediately leads to a breakthrough into the gas cap, effectively turning the well into a gas well.
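
The "3–4 times as mobile" remark follows from the definition of fluid mobility as relative permeability over viscosity; gas viscosity is far lower than oil's. The fluid properties below are assumed round numbers, not measured values.

```python
# Rough mobility comparison behind the gas-cap breakthrough risk.
# Mobility = relative permeability / viscosity; all inputs are
# illustrative assumptions.

def mobility(rel_perm, viscosity_cp):
    return rel_perm / viscosity_cp

oil_mob = mobility(rel_perm=0.6, viscosity_cp=1.2)    # assumed light oil
gas_mob = mobility(rel_perm=0.15, viscosity_cp=0.08)  # assumed gas-cap gas
ratio = gas_mob / oil_mob
print(round(ratio, 2))
```

Even with a much lower relative permeability, the gas phase ends up several times more mobile, which is why a fracture that touches the gas cap is quickly dominated by gas flow.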

In such cases it is necessary to improve reservoir connectivity while stopping short of contacting the gas cap. For that purpose other techniques are applied, e.g. multi-hole drilling.

Today fraccing is a comprehensive term. For instance, the slick-water frac technology instrumental in bringing about the shale revolution is the same fraccing job, albeit with lower viscosity and an increased injection rate. The reservoir is impacted in such a way that a cluster of branched cracks is formed, as opposed to a single major crack: the reservoir is effectively shattered in a disorderly manner, like glass. For shale gas this technology has proved to be the best one, since the reservoir has critically low permeability.

There is also the skin frac, i.e. a small-size frac job forming narrow fissures that only treat the near-wellbore zone. In certain environments, e.g. when a thin clay rim between gas and oil would not hold up against the impact of conventional fraccing, a small-size fraccing job might be just what it takes.

So, the choice of a development system is dictated by the applicability of this or that technology. For example, at the Messoyakha gas field, the heavily dissected cross-section and channel deposits are interspersed with small clay ridges. They can hold back a fluid, but not a fraccing job. Therefore, there we have created branches patterned on the fishbone technology to improve the vertical connectivity across the reservoir. The same goes for the Kuyumba oil, the only difference being that we have merged those branches, linking naturally fractured zones to the well to improve inflow and enhance oil recovery.

Please tell me what fraccing simulation is all about.

Essentially, it is about simulating reservoir behavior to figure out how it would crack, how high and how open the fissures would be, and how the proppant would be distributed if we pumped a fluid with certain properties at a certain flow rate. It is a sort of in-reservoir fissure geometry management. The fraccing design is called upon to find the best option for inducing such a fissure based on the properties of the reservoir, the fluid, the proppant and so on.
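
One of the simplest relations inside any frac design is a volume balance: the fluid that stays in the fracture fixes the product of height, width and half-length. This sketch ignores the coupled elasticity and leak-off physics a real simulator solves, and every input is an assumed number.

```python
# Minimal volume-balance sketch of a frac design estimate: propped
# half-length from injected volume, fluid efficiency, fracture height
# and average width. Two symmetric wings are assumed.

def half_length_m(volume_m3, efficiency, height_m, avg_width_mm):
    """V * efficiency = 2 * xf * h * w  ->  solve for xf."""
    width_m = avg_width_mm / 1000.0
    return volume_m3 * efficiency / (2.0 * height_m * width_m)

xf = half_length_m(volume_m3=60.0, efficiency=0.5, height_m=25.0,
                   avg_width_mm=6.0)
print(round(xf, 1))
```

A real design tool iterates this against rock stress, fluid rheology and proppant transport; the balance above only shows why pumping more fluid, or containing height, extends the fissure.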

The role of the scientific research center is to create a target model, using it as a stepping stone to a concept model to derive the maximum benefit from field development.

One way or another, does everything hinge on economics?

Yes, inevitably. The petroleum business is dominated by three trends. The first is innovation and digitalisation. The second is cooperation and collaboration. The third is cost management. We are not in a position to change oil prices, but we can well trim development and production costs.

In this pursuit we should not be scared of innovations that appear expensive today. In the future, their cost might come down substantially on the back of replication effects and mature competence in handling innovative technologies. Therefore, while running calculations, we look into the future and estimate the target cost that can be reached by giving a project the go-ahead.

The Digitized Core project builds a replicated 3D core model from thin rock sections.

What technologies is the company working on?

For example, a consortium led by the Ministry of Energy and the Ministry of Trade, with Gazprom Neft and other companies involved, is currently at work on a frac simulator, the so-called cyber fracturing project. Our company plays the role of moderator in this group, working jointly with MFTI (Moscow Institute of Physics and Technology), Skoltech and many other players in the innovation field. Our task is to create a domestic platform that simulates fraccing operations and customizes the optimum parameters for a specific project.

Even though similar overseas software applications currently exist, we get varied results using them. A multitude of parameters influence fraccing simulation, with some vendors taking all of them into account and some only a portion. In any case, manufacturers do not fully disclose the algorithms and calculations behind their proprietary applications, and we do not always understand why this or that value is obtained as the final result. Having created our own product, we will be confident that the calculations are correct and reliable.

The Company is developing the ’Downhole Lithology’ technology. What is it all about?

As drilling gets under way, we obtain information on the properties of the reservoir we produce from. The dedicated measurement instruments fitted into the drill string assembly are spaced 15–20 m behind the bit, i.e. we cannot obtain this information right away. And it is vital to us, because when a reservoir is only 2 m thick, as is not infrequently the case today, it pays to know that the drill bit is at all times inside the productive horizon and has not strayed outside it.

However, while drilling, we record a variety of indirect data, such as vibration, weight on bit and so on. In the ’Downhole Lithology’ project these data are compared against the physical parameters recorded by the instruments to reveal the existing relationships and predict reservoir properties at the drill bit location from the indirect indicators. It is machine learning and big data management in pure form, helping to make things more efficient. At present we are running this project in conjunction with Skoltech and IBM at the drilling support center.
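
The core idea, learning a mapping from indirect drilling signals to lithology and then applying it at the bit where no direct log exists yet, can be shown with a deliberately tiny stand-in. A 1-nearest-neighbour rule replaces the real machine-learning pipeline, and the training rows are invented.

```python
# Toy lithology prediction from indirect drilling signals: pairs of
# (vibration level, weight on bit), both in arbitrary units, labelled
# with lithology. Predict the label at a new signal via 1-NN.

train = [
    ((0.9, 12.0), "sandstone"),
    ((0.8, 11.0), "sandstone"),
    ((2.1, 7.0), "clay"),
    ((2.4, 6.5), "clay"),
]

def predict(sample):
    def sq_dist(row):
        (x, y), _label = row
        return (x - sample[0]) ** 2 + (y - sample[1]) ** 2
    return min(train, key=sq_dist)[1]

print(predict((1.0, 11.5)))  # a signal near the sandstone cluster
```

The production system presumably trains far richer models on logged intervals where both the indirect signals and the lagging instrument readings are known, then runs inference in real time at the bit position.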

Do other companies have similar projects?

I will put it like this: today many companies are only thinking of having a technology like this sometime in the future, but we already have it up and running; we have the first results and developments.

What technologies are you developing?

Technologically, we are interested in ’Waterflooding Management’. From mature fields with a large number of wells we receive information on hundreds of various parameters. Collecting, processing and comprehending these data efficiently and then finding the best development option is a challenging task, and no amount of human resources will be equal to it. However, it is possible to make use of machine metamodeling techniques, i.e. techniques for generating models of models.

This enables us to perceive what is going on within a field without engaging in detailed modeling. Employing metamodeling practices, we have already formed some algorithms for addressing waterflooding management issues, tested them at a small field, obtained the first results and are currently finding out how to proceed and replicate the project. This does not imply that metamodels will allow us to completely forsake detailed physical modeling; however, relying on metamodels, we can benefit from the experience acquired to identify fairly typical situations and develop a suitable action plan, i.e. we will not have to tackle similar tasks from scratch every time.
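
A metamodel in the barest form is a cheap surrogate fitted to a handful of expensive full-physics runs, then queried in their place. Everything below is a stand-in: the "detailed model" is a toy function, and linear interpolation substitutes for whatever surrogate the NTC actually uses.

```python
# Bare-bones "model of a model": sample an expensive detailed model at
# a few points, then answer queries by interpolating between the
# precomputed runs instead of re-running the physics.

def detailed_model(injection_rate):
    """Stands in for an expensive full-physics waterflooding run."""
    return 100.0 * injection_rate - 4.0 * injection_rate ** 2

samples = {r: detailed_model(r) for r in (2.0, 6.0, 10.0, 14.0)}

def surrogate(rate):
    """Linear interpolation between the precomputed sample runs."""
    xs = sorted(samples)
    for lo, hi in zip(xs, xs[1:]):
        if lo <= rate <= hi:
            t = (rate - lo) / (hi - lo)
            return samples[lo] + t * (samples[hi] - samples[lo])
    raise ValueError("rate outside the sampled range")

print(round(surrogate(8.0), 1), round(detailed_model(8.0), 1))
```

The surrogate is exact at the sampled points and approximate in between; the engineering trade is that its answers cost microseconds rather than the hours a detailed simulation run takes.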

Let me give an example of another project in the pipeline. In a drilled well we can measure various parameters, but since wells are spaced apart, for the space between them we can only make assumptions. The information can then be updated upon completion of infill drilling. By processing all these data with machine learning methods, we will be able to find algorithms for modeling the parameters of the cross-well space without modeling the physical processes.

What does ’Digitized Core’ mean?

’Digitized Core’ is our project, run jointly with MFTI, that enables the reproduction of a 3D core image from thin rock sections, ultimately leading to a database of 3D digitized core samples. By employing machine learning we will be able to restore core properties without expensive and protracted laboratory tests; effectively, testing will be digital rather than physical. This is of particularly high value in planning enhanced oil recovery methods. For example, planning gas EOR methods for low-permeability reservoirs may take as long as a few months. With machine learning employed to handle selected data, we will be able to avoid hundreds of tests, being left to deal with testing on a much smaller scale. Using digitalized core samples, we will be able to plan and predict the most suitable EOR methods more quickly, or to pick out new ones even at the initial stage, prior to the transition to physical tests, so as to comprehend which new EOR methods are most efficient and utilize only them.
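
One of the simplest properties a digital core yields is porosity, read directly off the reconstructed voxel grid. The grid below is a synthetic 2x2x2 stand-in; real digitized cores are millions of voxels and support far richer computations (permeability, wettability, displacement experiments).

```python
# Toy digital-core computation: given a binary 3D voxel grid
# reconstructed from thin sections (1 = pore space, 0 = grain),
# porosity is the fraction of pore voxels.

def porosity(grid):
    voxels = [v for plane in grid for row in plane for v in row]
    return sum(voxels) / len(voxels)

# Synthetic 2x2x2 "core" with 2 pore voxels out of 8.
core = [
    [[1, 0], [0, 0]],
    [[0, 1], [0, 0]],
]
print(porosity(core))
```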

The NTC has a one-of-a-kind Drilling Support Center

What is the essence of your new product dubbed ERA.GRAD?

ERA stands for Electronic Asset Development. It is a major technological area in which we develop our proprietary IT projects and software products. ERA.GRAD is one such product; in effect, it is a developer’s workplace, bringing together a variety of IT computing tools and thus taking on the role of an integrated platform for improving field development by changing various parameters: drilling sequence, drilling rate, and any available options for development improvements at mature assets. It will provide integrated automation of our operational processes rather than merely accumulating and analyzing geological and process data.

You have made reference to the Drilling Support Center at the NTC. Please give some more details on it.

This is an unparalleled center pooling geologists, developers and drillers at one location. Its synergy comes from the ability to make prompt, coordinated and final decisions. As a result, the rate of drilling directly within the productive reservoir has improved: before the center was organized, the reservoir penetration rate was 60%; today this figure exceeds 90%, and we achieved it in a mere 2–3 years.

The center’s advantage is that the well drilling process is coordinated with petroleum engineers, i.e. it is managed so that there are no process complications, while also addressing the downtime required to lift the bit, restring the assembly and so on. We make faster and more efficient progress by coordinating our efforts.

What areas do you work in under the framework of Gazprom Neft’s Technology Strategy?

There are three major programs falling within the Technology Strategy.

The first is ’Enhanced Oil Recovery and Well Stimulation’. Here we are working on enhancing reservoir productivity, sweep and displacement efficiency using chemical and gas methods. We are developing competence in mixed gas displacement, screening CO2 and wet gases for oil displacement and enhanced oil recovery.

The second is ’Oil Rims’. We are developing fields containing both oil and gas reserves, and working out effective recovery methods is a critical task for our company.

The third is ’Carbonate and Fractured Reservoirs’. When we started to make inroads into Eastern Siberia and the Orenburg region, where the carbonate reservoirs are not porous but fractured, we realized the physics here was a whole new ball game. For example, while in terrigenous reservoirs a fraccing job expands the contact between the well and the formation through an induced crack, in carbonate reservoirs the tasks are reformulated.

First, what we see here are natural fissures that we need to engage in the most efficient manner. We also have to deal with the hydrophilicity of carbonates, a job that was not on the list for terrigenous reservoirs. It can vary, so you need to look for a balance in applying waterflooding methods and choosing reservoir operating conditions to ensure the best recovery.

Naturally, other areas in the Technology Strategy are also associated with geology and development, such as unconventional reserves development and drilling techniques, and of course electronic asset development, where we will continue to implement process automation projects, work with big data and integrate machine learning technologies, since this enables us to move up to the next level of information processing, providing yet more opportunities for development.