Quick Start

This document outlines the basic tasks required to get up and running with the Drilling Operations and Monitoring Accelerator.

Copyright Notice

COPYRIGHT© 2017 TIBCO Software Inc. This document is unpublished and the foregoing notice is affixed to protect TIBCO Software Inc. in the event of inadvertent publication. All rights reserved. No part of this document may be reproduced in any form, including photocopying or transmission electronically to any computer, without prior written consent of TIBCO Software Inc. The information contained in this document is confidential and proprietary to TIBCO Software Inc. and may not be used or disclosed except as expressly authorized in writing by TIBCO Software Inc. Copyright protection includes material generated from our software programs displayed on the screen, such as icons, screen displays, and the like.

Trademarks

Technologies described herein are either covered by existing patents or patent applications are in progress. All brand and product names are trademarks or registered trademarks of their respective holders and are hereby acknowledged.

Confidentiality

The information in this document is subject to change without notice. This document contains information that is confidential and proprietary to TIBCO Software Inc. and may not be copied, published, or disclosed to others, or used for any purposes other than review, without written authorization of an officer of TIBCO Software Inc. Submission of this document does not represent a commitment to implement any portion of this specification in the products of the submitters.

Content Warranty

The information in this document is subject to change without notice. THIS DOCUMENT IS PROVIDED "AS IS" AND TIBCO MAKES NO WARRANTY, EXPRESS, IMPLIED, OR STATUTORY, INCLUDING BUT NOT LIMITED TO ALL WARRANTIES OF MERCHANTABILITY OR FITNESS FOR A PARTICULAR PURPOSE. TIBCO Software Inc. shall not be liable for errors contained herein or for incidental or consequential damages in connection with the furnishing, performance or use of this material. For more information, please contact:

TIBCO Software Inc.
3303 Hillview Avenue
Palo Alto, CA 94304
USA

Preface

Purpose of Document

The following document describes the steps required to get up and running with the Drilling Operations and Monitoring Accelerator. It also outlines how to use the signature demo to illustrate how the accelerator works.

Scope

This document outlines the following:

  • Required software

  • Installation requirements

  • Post-install steps

  • Running the demo

  • Using the simulator

  • Recommended demo scripts

Overview

Business Scenario

In the Oil and Gas industry, Drilling Operations refers to the process of drilling into the earth’s surface in order to extract oil. As one can imagine, this is a complex process and one that generates a great deal of information. The drill head is instrumented to send information such as ROP, depth, and temperature back to the surface, where it is typically captured in a historian. As the information is captured, there are valuable insights to be had from the data in real time. This Accelerator enables the real-time streaming of drilling data and its concurrent analysis, and gives operators control over the data coming off the rig.

Benefits and Business Value

The Drilling Operations and Monitoring Accelerator (DOMA) gives flexibility over what is measured from the rig data and how this information is presented to the drilling engineers. TIBCO Spotfire provides the ability to host the real-time data alongside the historical data. This combination of real-time and historical data greatly assists analysis in the event of a real-time anomaly.

Functional Objectives

The TIBCO Drilling Operations and Monitoring Accelerator and demo, as described, provide a drilling application. The stages in this application are outlined in the figure below:

Figure 2. High Level Architecture

In summary, the processing steps are:

  • Ingest rig data from WITSML data aggregators (PASON, NOV, etc.)

  • Process this data through Indicator modules to calculate Rig State, ROP, and RPM

  • Pass the calculated fields into TIBCO Live Datamart tables

  • Visualize the tables in TIBCO Spotfire and LVWeb (both custom and LVWeb layouts)

Technical Scenario

The Drilling Operations and Monitoring Accelerator provides users with the ability to:

  • Connect to WITSML data stores

  • Normalize the data from WITSML list format to StreamBase tuples

  • Enrich the data stream with custom indicators (e.g. average RPM, Torque and Drag)

  • Visualize this data stream in a custom JavaScript UI

  • Embed real-time visualizations in TIBCO Spotfire

Components

Top Level Application

High Level Architecture

The Accelerator comprises three main components:

  1. TIBCO StreamBase EventFlow application

  2. TIBCO Live Datamart tables

  3. TIBCO Live Datamart UI

The StreamBase application pictured above comprises five top-level modules and extension points that encapsulate the following functions:

  1. DataSource

  2. IndicatorContainer

  3. DataSinks

DataSource

This provides the stream of tuples from WITSML, WITS, or CSV downstream to the rest of the application. In the case of WITSML, which is a primary data source in the accelerator, the StreamBase WITSML adapters provide the connectivity to the WITSML servers. The WITSML StreamBase project contains EventFlow handler code that wraps the adapters to build functional blocks comprising:

  • Register Active Well

  • Manually Subscribe to a Well

  • Read a WITSML log file

  • Read a WITSML trajectory file

The reading of the log files is controlled by a Metronome that is configured via the WITSML_POLL_INTERVAL variable in the Drilling_engine.conf file.
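
As a minimal sketch only: assuming the same key = value form used in the settings tables later in this document (the exact enclosing HOCON structure of Drilling_engine.conf may differ by release), the poll interval could be set as follows.

    // Hypothetical fragment of Drilling_engine.conf:
    // poll the WITSML store every 10 seconds (value is in seconds)
    WITSML_POLL_INTERVAL = 10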

In the case of WITS, which is also a primary data source in the accelerator, the StreamBase WITS adapters provide the connectivity to the WITS0 data via a serial port. The WITS StreamBase project contains EventFlow handler code that wraps the adapters to build functional blocks comprising:

  • Connect to serial port

  • Send handshake

  • Receive and parse WITS0 data

Data sources must implement the Datasource.sbint interface located in the Drilling_Schema project.

DataSource

IndicatorContainer

This extension point contains modules for enriching the WITSML data feed with calculations. Examples of these indicators are:

  • OperationsIndicator

    • This is where the Rig State is calculated

  • AverageRotaryRPM

    • This is where the average RPM is calculated, allowing outliers to be detected at a later stage.

It’s important to note that this part of the accelerator is designed to be extended on a per-project basis. Another example of an indicator that could be added here is Torque and Drag. Typically the indicators combine StreamBase EventFlow code and model operator code (such as TERR). The model operator code is executed on each event as it comes from the WITSML data source. The indicators reside in the folder structure shown below:

Indicator List

The indicators are executed in StaticIndicatorContainer.sbapp via three sbapp modules:

Indicator Groups

In each IndicatorGroup module there is an extension point, as shown below:

IndicatorExtension Point

The above EventFlow code enables multiple indicators to be executed before the result is fed downstream.

DataSinks

This extension point hosts the application's outbound endpoints. The accelerator provides several options for data publication.

Data sinks are configured in the DrillingOperationAndMonitoring.sbapp in the DrillingOperationsAndMonitoring project, in the properties of the DataSink extension point.

Installation

Base Software

The following software is required to develop and/or run the accelerator. The role of each item is outlined in the table below, as well as the recommended installation location.

Table 1. TIBCO Components

Software:

  • TIBCO StreamBase Standard Edition 10.4.4

  • TIBCO Enterprise Runtime for R (TERR)

  • TIBCO Spotfire 10 Desktop (optional; required only to run the dxp)

Build

Importing the Accelerator code

Import the accelerator code as follows:

  • Unzip the OilAndGasDrillingAccelerator.zip to a workspace location

  • In StreamBase Studio select File→Import

  • Select Maven\Existing Maven Projects

  • For Root Directory, navigate to the location of the unpacked OilAndGasDrillingAccelerator.zip

Import

Import the available projects into the blank StreamBase workspace:

  • Drilling

  • Drilling_App

  • Drilling_LDM

  • Drilling_Rig

  • Drilling_Schema

  • Drilling_WITS

  • Drilling_WITSML

  • DrillingAccelerator

In OilAndGasDrillingAccelerator.zip there is a Spotfire directory containing the dxp file with the live charts.

Customizing

Configuration

The configuration is stored in the Drilling/src/main/configuration/Drilling_engine.conf file located in the Drilling project.

General

Each setting is listed below, followed by its description.

DATA_SINK_CSV_FILENAME

The CSV filename to use when writing the data to file. This option only applies if the CSV data sink has been added to the top-level application.

DATA_SOURCE_FEEDSIM_FILENAME

The feedsim file to use. This file is only used if the feedsim is added to the data sources in the top level application.

LDM_URL

The LDM URL to use when sending data to an LDM system. This option is only used when the LDM Publisher is added as a data sink to the top level application.
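
As a hedged illustration of the three general settings above, using the same key = value form (both filenames are hypothetical; the LDM URL matches the default used by the epadmin examples later in this document):

    // Hypothetical values for the general settings
    DATA_SINK_CSV_FILENAME = "drilling-data-out.csv"     // used only by the CSV data sink
    DATA_SOURCE_FEEDSIM_FILENAME = "feedsim-input.csv"   // used only by the feedsim data source
    LDM_URL = "lv://localhost:10080"                     // used only by the LDM Publisher data sink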

WITSML

Each setting is listed below, followed by its description.

WITSML_SERVICE_URI

The WITSML service endpoint URI

WITSML_ENABLE_AUTH

true/false: whether authentication is required to connect

WITSML_USERNAME

The username to use when connecting

WITSML_PASSWORD

The password to use when connecting

WITSML_ACTIVE_WELL_STATUS

The active well status to filter for, or blank to ignore filtering

WITSML_CONNECT_TIMEOUT

The WITSML connect timeout when making SOAP requests to the WITSML server.

WITSML_READ_TIMEOUT

The WITSML read timeout when making SOAP requests to the WITSML server.

WITSML_MNEMONIC_TIME_FIELD

The field in the WITSML log data that represents the time. This value is used to determine the last time received to offset the next query to the WITSML store.

WITSML_MNEMONIC_DEPTH_FIELD

The mnemonic field in the WITSML log data that represents the depth. This value is used to determine the last depth received to offset the next query to the WITSML store.

WITSML_VERSION

The version of WITSML to use. Available options are: NONE, 1.3.1.1, 1.4.1.1

WITSML_POLL_INTERVAL

The number of seconds to wait between sending requests to the WITSML store to get new data.

WITSML_POLL_ON

Determines if WITSML polling is enabled on startup

WITSML_POLL_DATE_TIME_SECONDS

The number of seconds of time data to fetch for each poll. If the end date goes past the endDateTimeIndex, this value is ignored and the system fetches all remaining data available. A blank value fetches all data.

WITSML_POLL_INDEX_SIZE_VALUE

The number of depth records to fetch past the current index for each poll interval. If the current index plus this value goes past the endIndex, this value is ignored and the system fetches all remaining data available. A blank value fetches all data.

WITSML_POLL_INDEX_SIZE_UOM

The UOM value to use with WITSML_POLL_INDEX_SIZE_VALUE, or blank to use the existing UOM of the log.

WITSML_POLL_TRAJECTORY_DATE_TIME_SECONDS

The number of seconds of time data to fetch for each poll. If the end date goes past the dTimTrajEnd, this value is ignored and the system fetches all remaining data available. A blank value fetches all data unless the index is used.

WITSML_POLL_TRAJECTORY_INDEX_SIZE_VALUE

The number of depth intervals to fetch past the current index for each poll interval. If the current index plus this value goes past the mdMx, this value is ignored and the system fetches all remaining data available. A blank value fetches all data.

WITSML_POLL_TRAJECTORY_INDEX_SIZE_UOM

The UOM value to use with WITSML_POLL_TRAJECTORY_INDEX_SIZE_VALUE, or blank to use the existing UOM of the trajectory.

WITSML_DATE_FORMAT_IN_LOGS

The date format used in the actual data stored in the WITSML logs. See the Date Format Reference below.

WITSML_DATE_FORMAT_LOG

The date format used by the WITSML log store. See the Date Format Reference below.

WITSML_DATE_FORMAT_SUBSCRIBE_LOG

The date format used when subscribing to a WITSML log. See the Date Format Reference below.

WITSML_DATE_FORMAT_SUBSCRIBE_TRAJECTORY

The date format used when subscribing to a WITSML trajectory. See the Date Format Reference below.

WITSML_DATE_FORMAT_SUBSCRIBE_WELL

The date format used when subscribing to a WITSML well. See the Date Format Reference below.

WITSML_DATE_FORMAT_SUBSCRIBE_WELLBORE

The date format used when subscribing to a WITSML wellbore. See the Date Format Reference below.

WITSML_DATE_FORMAT_TRAJECTORY

The date format used in the actual trajectory data. See the Date Format Reference below.

WITSML_LOG_NAME_REGEX

A regular expression used to select the specific logs to be processed.

WITSML_TRAJECTORY_NAME_REGEX

A regular expression used to select the specific trajectories to be processed.

WITSML_MISSING_DATA

The value assigned to fields that have a matching mnemonic but no value in the logs.

WITSML_MISSING_DATA_INCLUDE_MISSING_FIELDS

true/false. If true, fields of the data schema will be set to the WITSML_MISSING_DATA value even if no mnemonic is present for that field.

WITSML_LOG_ADAPTER_LOG_LEVEL

The WITSML log adapter’s log level. Set to DEBUG to view the SOAP request/response. Set to TRACE for complete transaction and parsing logging.

WITSML_TRAJECTORY_ADAPTER_LOG_LEVEL

The WITSML trajectory adapter’s log level. Set to DEBUG to view the SOAP request/response. Set to TRACE for complete transaction and parsing logging.

WITSML_WELLBORE_ADAPTER_LOG_LEVEL

The WITSML wellbore adapter’s log level. Set to DEBUG to view the SOAP request/response. Set to TRACE for complete transaction and parsing logging.

WITSML_WELL_ADAPTER_LOG_LEVEL

The WITSML well adapter’s log level. Set to DEBUG to view the SOAP request/response. Set to TRACE for complete transaction and parsing logging.

WITSML_NON_MATCH_LOG_SUBSCRIPTION_LOG_LEVEL

The WITSML non-match log subscription adapter’s log level. Set to DEBUG to view the SOAP request/response. Set to TRACE for complete transaction and parsing logging.

WITSML_NON_MATCH_TRAJECTORY_SUBSCRIPTION_LOG_LEVEL

The WITSML non-match trajectory subscription adapter’s log level. Set to DEBUG to view the SOAP request/response. Set to TRACE for complete transaction and parsing logging.

WITSML_ENABLE_PROXY

true/false: whether a proxy server is used to connect to the WITSML server.

WITSML_PROXY_HOST

The proxy server’s host IP.

WITSML_PROXY_USER

The username required to connect to the proxy server.

WITSML_PROXY_PASSWORD

The password required to connect to the proxy server.

WITSML_PROXY_PORT

The proxy server’s port.

WITSML_READ_LOG_METADATA_THREAD_COUNT

The number of concurrent threads used to fetch log metadata on startup. WARNING: Setting this value causes multiple concurrent connections to the WITSML server and may overload the server.

WITSML_READ_LOG_THREAD_COUNT

The number of concurrent threads used to fetch log data. WARNING: Setting this value causes multiple concurrent connections to the WITSML server and may overload the server.

WITSML_READ_TRAJECTORY_THREAD_COUNT

The number of concurrent threads used to fetch trajectory data. WARNING: Setting this value causes multiple concurrent connections to the WITSML server and may overload the server.

WITSML_SUBSCRIBE_THREAD_COUNT

The number of concurrent threads used to fetch well and wellbore metadata on startup. WARNING: Setting this value causes multiple concurrent connections to the WITSML server and may overload the server.
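
The sketch below pulls a handful of the settings above into a minimal, hypothetical WITSML configuration; the endpoint and credentials are placeholders, and the TIME mnemonic follows the DataPoint.Timepoint mapping described in the WITSML Mnemonics section later in this document.

    // Hypothetical minimal WITSML configuration
    WITSML_SERVICE_URI = "https://witsml.example.com/store"   // placeholder endpoint
    WITSML_ENABLE_AUTH = true
    WITSML_USERNAME = "drilling_user"                         // placeholder credentials
    WITSML_PASSWORD = "changeme"
    WITSML_VERSION = "1.4.1.1"                                // one of: NONE, 1.3.1.1, 1.4.1.1
    WITSML_POLL_ON = true                                     // enable polling on startup
    WITSML_POLL_INTERVAL = 10                                 // seconds between polls
    WITSML_MNEMONIC_TIME_FIELD = "TIME"                       // mnemonic mapped to DataPoint.Timepoint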

Date Format Reference

Note: Date formats with microseconds will be converted to milliseconds before parsing. Also, ISO time zones will be converted from Z to +0000 before parsing. For example, the date 2018-12-09T07:59:27.3057523Z will be converted to 2018-12-09T07:59:27.305+0000.

Letter   Date or Time Component   Presentation         Examples
G        Era designator           Text                 AD
y        Year                     Year                 1996; 96
M        Month in year            Month                July; Jul; 07
w        Week in year             Number               27
W        Week in month            Number               2
D        Day in year              Number               189
d        Day in month             Number               10
F        Day of week in month     Number               2
E        Day in week              Text                 Tuesday; Tue
a        Am/pm marker             Text                 PM
H        Hour in day (0-23)       Number               0
k        Hour in day (1-24)       Number               24
K        Hour in am/pm (0-11)     Number               0
h        Hour in am/pm (1-12)     Number               12
m        Minute in hour           Number               30
s        Second in minute         Number               55
S        Millisecond              Number               978
z        Time zone                General time zone    Pacific Standard Time; PST; GMT-08:00
Z        Time zone                RFC 822 time zone    -0800
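
For example, a pattern built from these letters that parses the converted date in the note above (2018-12-09T07:59:27.305+0000) is yyyy-MM-dd'T'HH:mm:ss.SSSZ. As an illustration only, one of the date-format settings could be set like this (the value shown is an assumption, not a shipped default):

    // Hypothetical date format: parses 2018-12-09T07:59:27.305+0000
    WITSML_DATE_FORMAT_LOG = "yyyy-MM-dd'T'HH:mm:ss.SSSZ"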

WITS

The WITS configuration is stored in Drilling/src/main/resources/adapter-configuration.xml; please see the WITS adapter documentation for all the available settings. MQTT is currently the default message bus used to send data from a rig to a datacenter.

Each setting is listed below, followed by its description.

WITS_PLAYBACK_FILENAME

If set, this is a file containing a WITS0-formatted feed to use as a playback file on startup. This can be used for testing connectivity and WITS data without being connected to a WITS serial comm port.

WITS_RECORD_FILENAME

If set, this file will be created and will contain a recording of all the WITS0 data captured.

WITS_MQTT_SUBSCRIPTION_QOS

The MQTT subscription quality of service level: 0 = at most once, 1 = at least once, 2 = exactly once.

WITS_MQTT_PUBLISH_QOS

The MQTT publish quality of service level: 0 = at most once, 1 = at least once, 2 = exactly once.

WITS_MQTT_TOPIC_PREDICATE

The MQTT topic predicate, which is the string prepended to each record type to create an MQTT topic. For example, if the predicate is 'WITSRecord' and the record is 23, the MQTT topic is 'WITSRecord23'.
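
As a sketch only, shown in the same key = value form as the other settings (per the note above, the authoritative WITS adapter settings live in adapter-configuration.xml):

    // Hypothetical WITS settings: replay a recorded WITS0 feed and publish over MQTT
    WITS_PLAYBACK_FILENAME = "wits0-recording.txt"   // placeholder playback file
    WITS_MQTT_SUBSCRIPTION_QOS = 1                   // at least once
    WITS_MQTT_PUBLISH_QOS = 1                        // at least once
    WITS_MQTT_TOPIC_PREDICATE = "WITSRecord"         // record 23 publishes to topic WITSRecord23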

WITSML Mnemonics

The accelerator comes with a pre-determined set of logCurveInfo fields pulled from the WITSML log data. The initial fields need to be edited to match your WITSML server’s mnemonic list. To do so:

  1. Open Drilling_Schema/src/main/eventflow/com.streambase.accelerator.drilling.operationsandmonitoring.schemas/Schema.sbint

    1. Edit the MnemonicSchema

      1. Update each field name with the correct mnemonic name from your WITSML log feed. The mnemonic field is mapped to the structured data via the name in the description; for example, the current TIME mnemonic is mapped to the DataPoint.Timepoint field.

      2. To add new fields follow the instructions below.

Adding WITSML Log Data Field

The accelerator currently extracts a set number of fields from the log data and converts that data into StreamBase tuples with a well-defined schema. To include more fields from the WITSML log data, edit the following files:

  1. Open Drilling_Schema/src/main/eventflow/com.streambase.accelerator.drilling.operationsandmonitoring.schemas/Schema.sbint

    1. Edit the DataPointSchema

      1. Add the new data point field to the schema, for example 'WeightOnBit' as a double

    2. Edit the MnemonicSchema

      1. Add any new mnemonics to this schema directly as the mnemonic name found in the log, for example 'WOB' as a double

      2. In the description column of the field, add a mapping to the DataPointSchema by setting the description to DataPoint.FieldName; for example, to map to the new field created above you would enter 'DataPoint.WeightOnBit' (see the sketch below)
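
After the WeightOnBit example above, the two schemas would look roughly like the conceptual sketch below (the fields are edited in the Studio schema editor; this is not literal file content):

    // Conceptual sketch, not literal file content:
    // DataPointSchema:  add field  WeightOnBit (double)
    // MnemonicSchema:   add field  WOB (double)  with description "DataPoint.WeightOnBit"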

Adding a new indicator

The accelerator currently has minimal indicators, to demonstrate how and where to process the incoming data into indicators. In the current structure, indicators are broken into three groups so that later groups can use data from earlier groups to refine the data processing. First determine whether your indicator needs pre-processing; if so, you need an indicator in group1, group2, and possibly group3, with group1 doing the pre-processing.

  1. Create a new StreamBase EventFlow module and place it in the Drilling/src/main/eventflow/com.streambase.accelerator.drilling.operationsandmonitoring.indicators source folder.

    1. Under the Definitions tab add an interface to "com.streambase.accelerator.drilling.operationsandmonitoring.interfaces.Indicator"

    2. Now implement your interface

  2. Finally, add your indicator to the appropriate group by adding it to one of the "IndicatorGroup" modules

    1. "Drilling/src/main/eventflow/com.streambase.accelerator.drilling.operationsandmonitoring.indicators.containers/IndicatorGroup01.sbapp"

    2. "Drilling/src/main/eventflow/com.streambase.accelerator.drilling.operationsandmonitoring.indicators.containers/IndicatorGroup02.sbapp"

    3. "Drilling/src/main/eventflow/com.streambase.accelerator.drilling.operationsandmonitoring.indicators.containers/IndicatorGroup03.sbapp"

Starting Up

Studio

Run Nodes

Now we can start up the nodes in StreamBase Studio. It is recommended to start the LDM node first so that the other nodes have a connection endpoint to send data to when started. To start the nodes:

  1. Select Run\Run Configurations from the StreamBase Studio menu

    Run configs
  2. Select the LiveView Fragment\LDM launch config and click the 'Run' button. This starts the node launch in the background.

    Run configs 2
  3. Select the EventFlow Fragment\DrillingOperationAndMonitoring launch config and click the 'Run' button. Make sure the conf files are selected as shown:

    Run configs 3
  4. Now start the simulated data: in the Manual Input tab, enter 'Start' for stream 'DataSource.FeedsimDataSource.CommandIn'

    Start Simulation Data

Command Line

  1. Open a StreamBase command console, navigate to the root of your workspace where the OilAndGasDrillingAccelerator.zip was extracted, and build the accelerator

    Maven must be installed on your machine to perform command-line builds
    mvn clean install
  2. Now install the nodes

    1. Install the LDM Node

      epadmin install node nodename=node1.ldm.drilling  substitutions="LDM_NODE_NAME=node1,NODE_CLIENT_PORT=10020,LDM_CLIENT_PORT=10080" application=Drilling_App/target/Drilling_App-1.4.1-ep-application.zip
    2. Install the DataCollector Node

      epadmin install node nodename=node1.datacenter.drilling substitutions="DATACOLLECTOR_NODE_NAME=node1,NODE_CLIENT_PORT=10000,LDM_URL=lv://localhost:10080" application=Drilling_App/target/Drilling_App-1.4.1-ep-application.zip
  3. Now start the nodes

    epadmin servicename=.drilling start node
  4. Now do the same for the Rig nodes (Rig nodes usually run on the rig and connect to WITS0 data via serial ports). This is optional and applies only if you have a rig setup; if not, skip the rig setup and use simulated data at step 6

    epadmin install node nodename=node1.rig.drilling substitutions="RIG_NODE_NAME=node1,NODE_CLIENT_PORT=10010" application=Drilling_App/target/Drilling_App-1.4.1-ep-application.zip
  5. Now start the rig node

    epadmin servicename=node1.rig.drilling start node
  6. Now start the simulated data

    The nodes may take time to start up; you can use the diagnostic commands at step 16 to determine whether the nodes are running
    epadmin servicename=node1.datacenter.drilling enqueue stream format=json path=drilling.DataSource.FeedsimDataSource.CommandIn
    The values correspond to the feedsim command schema with fields:
    
    command string,
    value double
  7. Wait for the 'Reading from keyboard' prompt and enter:

    {"command":"start", "value":"1"}
  8. To run the feedsim at 2x speed, enter:

    {"command":"scale", "value":"2"}
  9. To stop the feedsim, enter:

    {"command":"stop", "value":"1"}
  10. Press Ctrl-Z to exit input mode

  11. To start a WITSML subscription based on defaults from your config

    epadmin servicename=node1.datacenter.drilling enqueue stream format=json path=drilling.DataSource.WITSML_DataSource.subscribe
    The values correspond to the WITSML subscription schema with fields:
    
    type string,
    uid string,
    uidWell string,
    uidWellbore string,
    name string,
    wellName string,
    wellboreName string,
    statusWell string,
    field string,
    country string,
    state string,
    county string,
    region string,
    district string,
    log (
      regex string,
      afterDateTimeIndex timestamp,
      afterIndex (value double, uom string),
      poll (dateTimeSeconds long, indexSize (value double, uom string))
    ),
    trajectory (
      regex string,
      afterDateTimeIndex timestamp,
      afterIndex (value double, uom string),
      poll (dateTimeSeconds long, indexSize (value double, uom string))
    )
  12. Wait for the 'Reading from keyboard' prompt and enter:

    {}

    Or, to subscribe to wells with a status of active, use the following:

    {"statusWell":"active"}
  13. Press Ctrl-Z to exit input mode, then perform the following command to see sample data

    epadmin servicename=node1.datacenter.drilling dequeue stream path=drilling.DataOut
  14. To unsubscribe from all data

    epadmin servicename=node1.datacenter.drilling enqueue stream format=json path=drilling.DataSource.WITSML_DataSource.unsubscribe
    The values correspond to the WITSML unsubscribe schema with fields:
    
    type string,
    uid string,
    uidWell string,
    uidWellbore string,
    name string,
    wellName string,
    wellboreName string,
    statusWell string,
    field string,
    country string,
    state string,
    county string,
    region string,
    district string
  15. Wait for the 'Reading from keyboard' prompt and enter:

    {}
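
    Presumably a filtered payload of the same shape unsubscribes from a single well; the uidWell value below is hypothetical:

    {"uidWell":"W-123"}
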
  16. The following commands can help diagnose issues

    1. Command to show all event flow applications running

      epadmin display services servicetype=eventflow
    2. Command to show all nodes running

      epadmin display services servicetype=node
    3. Command to tail log files of all nodes running

      epadmin servicename=.drilling tail logging
    4. Command to display the engine info for all nodes running

      epadmin servicename=.drilling display engine

Access the User Interface

In a web browser, open the URL localhost:10080/app/drilling. The demo is configured to read data from a CSV file. Once this is streaming into the application you will see the following screen:

Drilling Interface

Select Region East Well from the well selector in the top right of the screen.

East Well

This view shows the Rig State as well as numerous metrics displayed in vertical line charts.

Click the 3D tab on the left-hand side of the screen. This view illustrates a 3D chart of the drill trajectory. It will need to be configured to run with WITSML data once a live data source is available.

3D View
For Live Datamart charts in Spotfire, the Spotfire server needs to be updated per the following instructions on the TIBCO Community: https://community.tibco.com/questions/integrate-liveview-spotfire

Next, open the dxp file DrillingSF10.dxp. This will open TIBCO Spotfire. The screenshot below illustrates the live-updating charts in Spotfire. Historical data can be viewed side by side with these live charts.

Time Logs

Spotfire

Depth Logs

SpotfireDepth

Tripping

SpotfireTripping

Bit Wear

SpotfireBitWear

Now open localhost:10080/lvweb in the browser.

LVWeb View

Stopping

Stop All Nodes

epadmin servicename=.drilling stop node
epadmin servicename=.drilling remove node

Stop Individual Nodes

epadmin servicename=node1.ldm.drilling stop node
epadmin servicename=node1.ldm.drilling remove node

epadmin servicename=node1.datacenter.drilling stop node
epadmin servicename=node1.datacenter.drilling remove node

epadmin servicename=node1.rig.drilling stop node
epadmin servicename=node1.rig.drilling remove node

Cluster Monitoring

The cluster can be monitored with a built-in cluster monitoring tool.

  1. Start the cluster monitor by opening a StreamBase command prompt and performing the following operations

    epadmin install node nodename=drilling.monitor substitutions="NODE_NAME=drilling.monitor" application=%TIBCO_EP_HOME%/distrib/tibco/sb/applications/cluster-monitor.zip
    
    epadmin servicename=drilling.monitor start node
  2. Now open a web browser to http://localhost:11080/lvweb