
Signal Processing


Last updated 1 year ago


Requirement: plugin-processing-1.0.0+, plugin-spectrum-1.0.0+

Live provides pipe functions that improve signal data processing, helping users make better business decisions.

Filters

The pipe function for filtering signals removes unwanted harmonic components, producing a cleaner curve, using low-pass, band-pass, or high-pass filters.

Sine Wave Combined with Multiple Frequencies and Gaussian White Noise

def sine_wave(x, f=60, amp=1, theta=0): amp * sin(2*pi()*f*x + theta);

=> over all every sec
=> count() as x, normrandom(0, 2) as noise over all every item
=> sine_wave(x, 1/60, 10)
  + sine_wave(x, 1/30, 5)
  + sine_wave(x, 1/15, 2.5)
  + noise as y, x every item

Generate FFT of Original Signal

def sine_wave(x, f=60, amp=1, theta=0): amp * sin(2*pi()*f*x + theta);

=> over last min every sec
=> count() as x, normrandom(0, 2) as noise over all every item
=> sine_wave(x, 1/60, 10)
  + sine_wave(x, 1/30, 5)
  + sine_wave(x, 1/15, 2.5)
  + noise as y, x every item

=> signal.FFT(x, y#, 1, false) as result over last 5 min every min
=> result->magnitudes:seq:get(0) as mag, result->frequencies:seq as freq
=> @for range(freq:len()) as i, mag, freq
=> mag[i] as y, freq[i] as x
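
As a conceptual cross-check outside Pipes, the same FFT analysis can be sketched in Python with NumPy. This is an illustrative equivalent, not the plugin's implementation: it builds the same three-sine-plus-noise signal and confirms that the strongest spectral peak lands at 1/60 Hz.

```python
import numpy as np

# Build the same composite signal: three sines plus Gaussian white noise,
# sampled once per second (fs = 1 Hz), over 5 minutes.
fs = 1.0
x = np.arange(300)
rng = np.random.default_rng(0)
y = (10 * np.sin(2 * np.pi * x / 60)
     + 5 * np.sin(2 * np.pi * x / 30)
     + 2.5 * np.sin(2 * np.pi * x / 15)
     + rng.normal(0, 2, x.size))

# Real-input FFT: one magnitude per frequency bin.
freqs = np.fft.rfftfreq(x.size, d=1 / fs)
mags = np.abs(np.fft.rfft(y)) / x.size

# The dominant component (amplitude 10 at 1/60 Hz) should win; skip the DC bin.
peak_freq = freqs[np.argmax(mags[1:]) + 1]
```

Plotting `mags` against `freqs` reproduces the spectrum chart: three spikes at 1/60, 1/30, and 1/15 Hz above the noise floor.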

Applying Low Pass IIR Butterworth Filter

def sine_wave(x, f=60, amp=1, theta=0): amp * sin(2*pi()*f*x + theta);

=> over all every sec
=> count() as x, normrandom(0, 2) as noise over all every item
=> sine_wave(x, 1/60, 10)
  + sine_wave(x, 1/30, 5)
  + sine_wave(x, 1/15, 2.5)
  + noise as y, x every item
 
=> signal.filter(x, y, 0, 1/60, 1, 1, "low", "butterworth") as res, 
y over all every min
=> res->result:seq:get(0) as yArr, res->timestamps:seq as xArr
=> @for range(xArr:len()) as x, yArr
=> yArr[x] as y_filtered, x as x
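
The equivalent low-pass Butterworth filtering can be sketched in Python with SciPy. This is a conceptual analogue of `signal.filter(..., "low", "butterworth")`, not the plugin's code; it attenuates the 1/30 and 1/15 Hz components and the noise, leaving mostly the 1/60 Hz sine.

```python
import numpy as np
from scipy import signal

# Same composite signal: slow sine plus faster harmonics and noise.
fs = 1.0                                  # one sample per second
x = np.arange(600)
rng = np.random.default_rng(1)
y = (10 * np.sin(2 * np.pi * x / 60)
     + 5 * np.sin(2 * np.pi * x / 30)
     + 2.5 * np.sin(2 * np.pi * x / 15)
     + rng.normal(0, 2, x.size))

# 1st-order IIR Butterworth low-pass with cutoff 1/60 Hz.
b, a = signal.butter(1, 1 / 60, btype="low", fs=fs)

# Zero-phase filtering (forward and backward) avoids phase lag.
y_filtered = signal.filtfilt(b, a, y)
```

Since the higher-frequency components are suppressed, the filtered curve is visibly smoother and has a smaller spread than the raw signal.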

FFT of the Filtered Signal

def sine_wave(x, f=60, amp=1, theta=0): amp * sin(2*pi()*f*x + theta);

=> over last min every sec
=> count() as x, normrandom(0, 2) as noise over all every item
=> sine_wave(x, 1/60, 10)
  + sine_wave(x, 1/30, 5)
  + sine_wave(x, 1/15, 2.5)
  + noise as y, x every item
 
=> signal.filter(x, y, 0, 1/60, 1, 1, "low", "butterworth") as res, 
y over all every 10 sec
=> res->result:seq:get(0) as yArr, res->timestamps:seq as xArr
=> @for range(xArr:len()) as x, yArr
=> yArr[x] as y, x as x

=> signal.FFT(x, y#, 1, false) as fftResultData over last 5 min every min
=> fftResultData:json():jsonparse() as result
=> result->magnitudes:seq as mag, result->frequencies:seq as freq
=> @for range(freq:len()) as i, mag, freq
=> mag[i] as y, freq[i] as x

Peak Detection

The pipes find peaks function finds peaks or valleys within a given sample. The values found can be filtered within a certain range, taking into account height, plateau size, distance, prominence, and width.

Detecting Peaks and Troughs on a Channel

mnemonic:MNEMONIC => value, timestamp

=> signal.findPeaks(timestamp#, value#) as res at the end
=> res->timestamps as xArr, res->heights as yArr
=> @for range(xArr:len) as x, xArr, yArr
=> yArr[x] as peak, xArr[x] as timestamp
mnemonic:MNEMONIC => value, timestamp

=> signal.findTroughs(timestamp#, value#) as res at the end
=> res->timestamps as xArr, res->heights as yArr
=> @for range(xArr:len) as x, xArr, yArr
=> yArr[x]#:abs() as trough, xArr[x] as timestamp
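
Conceptually, `signal.findPeaks` and `signal.findTroughs` behave like SciPy's `find_peaks`, with troughs detected by negating the signal. The sketch below is an illustrative analogue, not the plugin's implementation:

```python
import numpy as np
from scipy.signal import find_peaks

# A clean sine with period 60: one crest and one valley per period.
t = np.arange(600)
value = 10 * np.sin(2 * np.pi * t / 60)

# Peaks: local maxima above a height threshold, at least 30 samples apart
# (height, distance, prominence, width, etc. map to the pipes parameters).
peaks, props = find_peaks(value, height=5, distance=30)

# Troughs: run the same detection on the negated signal.
troughs, _ = find_peaks(-value, height=5, distance=30)
```

With 600 samples and a 60-sample period, this finds ten peaks (near t = 15, 75, ...) and ten troughs (near t = 45, 105, ...).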

Wave Generation

Sine Waves

=> over last 10 min every sec
=> count() as x over all every item
=> signal.generate_wave("sin", x, 0.01, 0, 0) as y_0_deg,
   signal.generate_wave("sin", x, 0.01, 45/pi(), 0) as y_45_deg,
   signal.generate_wave("sin", x, 0.01, 60/pi(), 0) as y_60_deg

Sine Wave with Noise

=> over last 10 min every sec
=> count() as x over all every item
=> signal.generate_wave("sin", x, 0.01, 0, 0.02) as y_0_deg,
   signal.generate_wave("sin", x, 0.01, 45/pi(), 0.03) as y_45_deg,
   signal.generate_wave("sin", x, 0.01, 60/pi(), 0.04) as y_60_deg

Square Wave

=> over last 10 min every sec
=> count() as x over all every item
=> signal.generate_wave("square", x, 0.01, 0, 0) as y_0_deg

Square Wave with Noise

=> over last 10 min every sec
=> count() as x over all every item
=> signal.generate_wave("square", x, 0.01, 0, 0.05) as y_0_deg

Square Wave with Different Duty Cycles

=> over last 10 min every sec
=> count() as x over all every item
=> signal.generate_wave("square", x, 0.01, 0, 0, 0.10) as y_10,
   signal.generate_wave("square", x, 0.01, 0, 0, 0.50) as y_50
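
For reference, the same waveforms can be generated in Python. This is a conceptual equivalent of `signal.generate_wave` (sine via NumPy, square via `scipy.signal.square`), not the plugin's code; the frequency 0.01 and duty cycles mirror the snippets above.

```python
import numpy as np
from scipy.signal import square

x = np.arange(600)
f = 0.01                                  # same frequency as the examples above

# Sine wave with Gaussian noise added on top.
rng = np.random.default_rng(2)
y_sine = np.sin(2 * np.pi * f * x) + rng.normal(0, 0.02, x.size)

# Square waves with 10% and 50% duty cycles:
# the output is +1 for the first duty fraction of each period, -1 otherwise.
y_10 = square(2 * np.pi * f * x, duty=0.10)
y_50 = square(2 * np.pi * f * x, duty=0.50)
```

With a 100-sample period, roughly 10% of `y_10` samples and 50% of `y_50` samples sit at +1.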

Outliers Removal

An outlier is an observation that is unusually far from the other values in a data set. Removing outliers is a common step to obtain cleaner data.

Remove the top 5% and bottom 5% values

data_with_outliers
=> signal.removeOutliers(timestamp#, value#, 'top_bottom') as filteredData over all every 10 minutes
=> filteredData->timestamps as xArr, filteredData->values as yArr
=> @for range(xArr:len) as x, xArr, yArr
=> yArr[x] as filtered, xArr[x] as timestamp
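
A plausible reading of the `'top_bottom'` strategy is percentile trimming: drop everything below the 5th percentile or above the 95th. The helper below is a hypothetical Python sketch of that idea, not the plugin's actual algorithm:

```python
import numpy as np

def remove_outliers_top_bottom(timestamps, values, frac=0.05):
    """Drop the top and bottom `frac` of values, keeping timestamps aligned.

    Hypothetical sketch of a 'top_bottom' strategy: anything below the
    5th percentile or above the 95th percentile is treated as an outlier.
    """
    values = np.asarray(values, dtype=float)
    timestamps = np.asarray(timestamps)
    lo, hi = np.quantile(values, [frac, 1 - frac])
    keep = (values >= lo) & (values <= hi)
    return timestamps[keep], values[keep]

# 96 well-behaved points plus 4 extreme spikes.
ts = np.arange(100)
vals = np.concatenate([np.full(96, 10.0), [1000.0, -1000.0, 500.0, -500.0]])
kept_ts, kept_vals = remove_outliers_top_bottom(ts, vals)
```

The four spikes fall outside the 5th–95th percentile band and are removed, leaving the 96 regular samples.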

Interpolation functions

We can use pipes to estimate points using interpolation functions.

Pipes provides two types of interpolation: linear and polynomial (Lagrange method).

def @@x: (1.0, 2.0, 4.0, 5.0);
def @@y: (4.0, 6.0, 11.0, 17.0);
def @@xi: (3.0, 4.0);

=> @@x:seq() as x,
    @@y:seq() as y,
  @@xi:seq() as xi 
at the end
=> signal.linear_interpolation(x, y, xi) as result
def @@x: (1.0, 2.0, 4.0, 5.0);
def @@y: (4.0, 6.0, 11.0, 17.0);
def @@xi: (3.0, 4.0);

=> @@x:seq() as x,
    @@y:seq() as y,
  @@xi:seq() as xi 
at the end
=> signal.polynomial_lagrange_interpolation(x, y, xi) as result
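
Using the same sample points as the snippets above, the two interpolation methods can be reproduced in Python (an illustrative analogue, not the Pipes implementation). Linear interpolation connects neighbouring samples with straight lines; Lagrange interpolation fits a single polynomial through all four points.

```python
import numpy as np
from scipy.interpolate import lagrange

# Same sample points as the Pipes snippets.
x = np.array([1.0, 2.0, 4.0, 5.0])
y = np.array([4.0, 6.0, 11.0, 17.0])
xi = np.array([3.0, 4.0])

# Linear interpolation between neighbouring samples.
yi_linear = np.interp(xi, x, y)           # [8.5, 11.0]

# Polynomial (Lagrange) interpolation: a cubic through all four points.
poly = lagrange(x, y)
yi_lagrange = poly(xi)
```

At x = 4 both methods return 11.0 exactly, since it is a sample point; at x = 3 they differ, because the cubic bends between samples while the linear method does not.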

Here's an example where the linear interpolation function can be used with real-time data. On a pipes-based chart, create two layers with the snippets below.

def @@channels: ("PRESSURE");
def @@INITIAL_TIMESTAMP: 0;

rigA .timestamp:adjusted_index_timestamp adjusted_index_timestamp:* mnemonic!:@@channels
  => @compress.swingingDoor value# by mnemonic
  => newmap("y", value#, "x", timestamp) as data
  
=> @yield
=> list(_["x"]# - @@INITIAL_TIMESTAMP) as x, list(_["y"]) as y at the end
=> @for range(x:len) |> (x:get(_) as x, y:get(_) as y) as res
=> res->x as x, res->y as y_original
def @@channels: ("PRESSURE");
def @@INITIAL_TIMESTAMP: 0;

rigA .timestamp:adjusted_index_timestamp adjusted_index_timestamp:* mnemonic!:@@channels
  => @compress.swingingDoor value# by mnemonic
  => newmap("y", value#, "x", timestamp) as data
  
=> @yield
=> list(_["x"]# - @@INITIAL_TIMESTAMP) as x, list(_["y"]) as y at the end

=> range(100) |> x:get(_) + _*8000 as xi, x, y
=> signal.linear_interpolation(x, y, xi) as yi, xi
=> @for range(xi:len) |> (xi:get(_) as x, yi:get(_) as y) as res
=> res->x as x, res->y as yi_interpolated

The result should look something like the image below.

Multi Linear Regression

Multi linear regression is a statistical method used to model the relationship between a dependent variable and one or more independent variables. There are several types of regression functions, including linear, polynomial, logarithmic, exponential, exponential decay, and power functions. The output of a regression analysis typically includes predicted values, coefficients, and statistical measures of goodness-of-fit.

The Multi Linear Regression pipes functions aggregate data over a certain period of time, receiving the x and y values, the type of the function, and, in the polynomial case, the degree. The return is a row containing the predicted values (a sequence of numbers), the function coefficients, and, if present, an error string indicating what went wrong. Errors can be caused by an invalid function type or by not having enough data to fit the regression. The signature is shown in the snippet below:

signal.regression(x, y, function_type, polynomial_degree)
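
The function families listed above can all be fitted by transforming the data into a linear least-squares problem. The helper below is a hypothetical Python sketch of such a dispatcher (the `regression` function, its return shape, and the transformations are illustrative assumptions, not the plugin's implementation):

```python
import numpy as np

def regression(x, y, function_type, degree=1):
    """Hypothetical sketch mirroring signal.regression's signature.

    Returns (predicted values, coefficients). Nonlinear families are
    fitted by transforming x and/or y into a linear least-squares problem.
    """
    x, y = np.asarray(x, float), np.asarray(y, float)
    if function_type == "poly":
        coef = np.polyfit(x, y, degree)
        pred = np.polyval(coef, x)
    elif function_type == "log":               # y = a + b*ln(x)
        coef = np.polyfit(np.log(x), y, 1)
        pred = np.polyval(coef, np.log(x))
    elif function_type in ("exp", "expd"):     # y = A*exp(b*x); b < 0 = decay
        coef = np.polyfit(x, np.log(y), 1)
        pred = np.exp(np.polyval(coef, x))
    elif function_type == "pow":               # y = A*x**b
        coef = np.polyfit(np.log(x), np.log(y), 1)
        pred = np.exp(np.polyval(coef, np.log(x)))
    else:
        raise ValueError(f"invalid function type: {function_type}")
    return pred, coef

# A degree-2 fit recovers an exact quadratic.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
pred, coef = regression(x, 2 * x**2 + 1, "poly", 2)
```

Passing an unknown `function_type` raises an error, analogous to the error string the pipes function returns for an invalid type.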

For all examples (except real-time) the same base layer is used:

def @@channels: ("pressure");

event_type .timestamp:adjusted_index_timestamp adjusted_index_timestamp:* mnemonic!:@@channels
  => @compress.swingingDoor value# by mnemonic
  => value# as original

Polynomial

def @@channels: ("pressure");
def @@DEGREE: 2;

event_type .timestamp:adjusted_index_timestamp adjusted_index_timestamp:* mnemonic!:@@channels
  => @compress.swingingDoor value# by mnemonic
  => value# as y, timestamp# as x
  => signal.regression(x, y, "poly", @@DEGREE) as res, list(x) as xArr at the end
  => @for range(xArr:len) |> (xArr:get(_) as x, res->pred:get(_) as y) as r
  => r->x as timestamp, r->y as yPredOrder2

Logarithmic

def @@channels: ("pressure");

event_type .timestamp:adjusted_index_timestamp adjusted_index_timestamp:* mnemonic!:@@channels
  => @compress.swingingDoor value# by mnemonic
  => value# as y, timestamp# as x
  => signal.regression(x, y, "log") as res, list(x) as xArr at the end
  => @for range(xArr:len) |> (xArr:get(_) as x, res->pred:get(_) as y) as r
  => r->x as timestamp, r->y as yPredOrder2

Exponential

def @@channels: ("pressure");

event_type .timestamp:adjusted_index_timestamp adjusted_index_timestamp:* mnemonic!:@@channels
  => @compress.swingingDoor value# by mnemonic
  => value# as y, timestamp# as x
  => signal.regression(x, y, "exp") as res, list(x) as xArr at the end
  => @for range(xArr:len) |> (xArr:get(_) as x, res->pred:get(_) as y) as r
  => r->x as timestamp, r->y as yPredOrder2

Exponential Decay

def @@channels: ("pressure");

event_type .timestamp:adjusted_index_timestamp adjusted_index_timestamp:* mnemonic!:@@channels
  => @compress.swingingDoor value# by mnemonic
  => value# as y, timestamp# as x
  => signal.regression(x, y, "expd") as res, list(x) as xArr at the end
  => @for range(xArr:len) |> (xArr:get(_) as x, res->pred:get(_) as y) as r
  => r->x as timestamp, r->y as yPredOrder2

Power

def @@channels: ("pressure");

event_type .timestamp:adjusted_index_timestamp adjusted_index_timestamp:* mnemonic!:@@channels
  => @compress.swingingDoor value# by mnemonic
  => value# as y, timestamp# as x
  => signal.regression(x, y, "pow") as res, list(x) as xArr at the end
  => @for range(xArr:len) |> (xArr:get(_) as x, res->pred:get(_) as y) as r
  => r->x as timestamp, r->y as yPredOrder2

Real Time Usage

Layer 1:

def @@channels: ("pressure");

event_type .timestamp:adjusted_index_timestamp adjusted_index_timestamp:* mnemonic!:@@channels
  => @compress.swingingDoor value# by mnemonic
  => value# as original

Layer 2:

def @@channels: ("pressure");
def @@DEGREE: 2;

event_type .timestamp:adjusted_index_timestamp adjusted_index_timestamp:* mnemonic!:@@channels
  => @compress.swingingDoor value# by mnemonic
  => value# as y, timestamp# as x
 
  => signal.regression(x, y, "lin", @@DEGREE) as res, list(x) as xArr over last 10 sec every sec
  => res->pred:get(res->pred:len-1) as yPred, xArr:get(xArr:len - 1) as timestamp

Pipeless Aggregations

Latest versions

Requirement: plugin-processing-1.2.0+, liverig-5.2.0+, liverig-vis-4.6.0+

In this new update, the aggregations are nested within the channel card to make it clearer which data is used to create the modified series. To add a new aggregation to a channel, click the + button next to the channel name.

Then select which aggregation is going to be created. At this point, plugin-processing provides the following aggregation types, which correspond to the pipes functions listed previously in this document: Moving Average, Signal Filtering, Peak and Trough Detection, and Outliers Removal.

After adding the aggregations, the interface assumes a tree-like format.

The following four images show how the configurations are now set for the aggregation functions. Each of them has its own configuration panel, where the user can even customize the plotting style.

The peaks detection configuration uses a different plotting style (scatter). It is now possible to plot an aggregation as either a line or a scatter plot.

Enabling and disabling an aggregation can be done using an eye button, just like on the channel cards.

All configurations can also be done in the view mode.

Older versions

Requirement: plugin-processing-1.1.0+, liverig-5.1.1+, liverig-vis-4.6.0+

To use a pipeless aggregation create a new temporal chart with the desired channels.

Next select which aggregations are going to be applied over the data.

Each aggregation has configuration fields that resemble the parameters passed to the corresponding pipes function.

It's possible to hide the original channel using the chart legend.

The aggregations can also be added in the visualization mode using the new chart configuration menu.

It is possible to turn on filters, moving average, and outliers removal on a chart to calculate them over a temporal range.
