
Hemodynamic Beat-to-Beat Analysis

HHEART: An Automated Beat-to-Beat

Cardiovascular Analysis Package Using Matlab®

Mark J. Schroeder1,2, Ph.D., Bill Perreault2, M.S., Daniel L. Ewert1,2, Ph.D., and Steven C. Koenig1, Ph.D.

1Jewish Hospital Heart and Lung Institute, Department of Surgery, University of Louisville, Louisville, KY 40202

2Cardiovascular Research Laboratory, Department of Electrical and Computer Engineering, North Dakota State University, Fargo, ND 58105

*Funding provided by a grant from the Jewish Hospital Heart and Lung Institute

Running Title: Hemodynamic Beat-to-Beat Analysis

Word Count: 4,117

Address Correspondence To:

Primary:

Mark J. Schroeder, Ph.D.
Electrical and Computer Engineering Dept.
North Dakota State University
Fargo, ND 58105
PH: (701)-231-8049
FAX: (701)-231-8677
Email:

Secondary:

Steven C. Koenig, Ph.D.
Cardiovascular Research Center
500 South Floyd Street, Room 118
Department of Surgery
University of Louisville
Louisville, KY 40202
PH: (502)-852-4416
FAX: (502)-852-1795
E-mail:



Abstract

A computer program is described for beat-to-beat analysis of cardiovascular parameters from high-fidelity pressure and flow waveforms. The Hemodynamic Estimation and Analysis Research Tool (HEART) is a post-processing analysis software package developed in Matlab® that enables scientists and clinicians to document, load, view, calibrate, and analyze experimental data that have been digitally saved in ascii or binary format. Analysis routines include traditional hemodynamic parameter estimates as well as more sophisticated analyses to estimate lumped arterial model parameters and vascular impedance frequency spectra. Cardiovascular parameter values of all analyzed beats can be viewed and statistically analyzed. The most attractive feature of the HEART program is the ability to analyze data with visual quality assurance throughout the process, establishing a framework through which Good Laboratory Practice (GLP) compliance can be obtained. Additionally, the development of HEART on the Matlab® platform provides users with the flexibility to adapt or create study-specific analysis files according to their needs.

Author Key Words

Data acquisition, data analysis, beat-to-beat, hemodynamics, cardiovascular, computer software package, Matlab®
1. Background

Scientists, engineers, and clinicians record hemodynamic data in order to investigate cardiovascular dynamics and improve treatment of cardiovascular disease. The selected method of data analysis is a critical factor in extracting the most information from these data. Unfortunately, little attention is often paid to the details of the data analysis methods, and the investigator settles for a less-than-ideal method of obtaining cardiovascular metrics.

Since the 1960s, most data acquisition methods have involved recording and analyzing data using strip chart recorders [1,2]. The methods used to obtain hemodynamic parameters from strip chart paper are subject to numerous measurement inaccuracies that can ultimately lead to incorrect results and conclusions. Additionally, the tedious and time-intensive nature of analog data analysis often means that only small samples of data, or a single representative beat, are analyzed, creating errors due to the unsteady nature of hemodynamic data, e.g., changes caused by respiration.

Over the past several decades, there has been a migration from standard strip chart recorders toward digital data acquisition and analysis systems in which data can be streamed directly to a digital storage device. Despite the obvious time and space saving features of using such a method, many investigators hesitate to make the transition to digital acquisition systems due to the user’s inability to perform post-processing of the digital data. This, along with the realization that ever-increasing amounts of data can be stored digitally as data storage capabilities grow, signifies a need for fast and accurate digital data analysis methods.

The first data acquisition and analysis (DAQA) programs were developed on Apple [3-5] and Macintosh computers [6], microcomputers [2,7-9], and VAX systems [10,11]. As computer technology rapidly improved in the 1990s, DAQA programs were developed for personal computers [12-17]. These earlier programs were capable of performing real-time data acquisition, but were often limited by data acquisition and storage capabilities and required off-line data analysis. More recently, real-time data acquisition with real-time data analysis has been introduced and has been used successfully in support of cardiovascular research [18-21].

A number of turn-key computer software packages are also now commercially available, including BioBench™ (National Instruments, Austin, TX), PowerLab™ (ADInstruments, Grand Junction, CO), ARIA-1™ (Millar Instruments, Houston, TX), DADiSP™ (DSP Development Corp, Newton, MA), and PO-NE-MAH™ (Gould Instrument Systems, Valley View, OH). Although they have offered some solace to investigators, these commercial ‘what-you-see-is-what-you-get’ (WYSIWYG) packages provide a limited number of analysis options and often lack the flexibility necessary to meet the evolving study-specific demands of some investigators.

In sharp contrast to the recent trend toward real-time data acquisition (DAQ) and real-time data analysis (DA), our group has made a philosophical decision to keep these procedures separate. The rationale is two-fold: (1) it enables a high degree of control for quality assurance throughout the data acquisition and data analysis processes, thereby establishing a framework that will meet Good Laboratory Practice (GLP) guidelines regulated by the Food and Drug Administration (FDA) for the testing of medical devices, and (2) it exploits industry-standard software packages designed explicitly for developing data acquisition (LabVIEW™, National Instruments, Austin, TX) and data analysis (Matlab®, MathWorks, Natick, MA) programs.

Over the past ten years, our group has been developing PC-based data acquisition and analysis programs. Our experience in handling large amounts of digital data has led us to develop an efficient and accurate data analysis program that allows a user to easily document, load, view, calibrate, analyze, and reduce cardiovascular data on a beat-to-beat basis. The program was created within the Matlab® environment (Mathworks, Natick, MA), a mathematical tool commonly used by mathematicians, engineers, and scientists. Equipped with myriad mathematical capabilities, this analysis package and platform provide tremendous flexibility and power to meet the user’s present and future needs. We describe in detail the design and functionality of this program, called HEART (Hemodynamic Estimation and Analysis Research Tool).

2. Design

The HEART program was developed with the goals of ease of use, time efficiency, accuracy, flexibility, and quality assurance for every step of the data analysis process. These goals were accomplished by implementing the program with window-based graphical user interfaces (GUIs) and a menu-driven format that make navigation of the system easy and user-friendly. All programming was performed using Matlab® version 6.0 (Mathworks, Natick, MA), a matrix-based mathematical package capable of handling large amounts of data and performing time-efficient mathematical manipulations. HEART can be run either as a stand-alone data analysis tool or within the Matlab® programming environment (requires purchase of Matlab® software), thereby allowing users to modify or add m-file routines to the main HEART program tailored to study-specific needs.

The complete HEART program uses a total of seventeen Matlab® m-files. As illustrated in the flowchart below (Figure 1), HEART was designed to carry out numerous functions to help investigators analyze their data. The flowchart outlines the order in which these generalized functions are performed. The first step in this process is accessing or loading the data by using the “Profile” window, which was designed to load either binary or ascii data from files containing numerous variables stored in columnar format. Data of various word sizes can be loaded, as can data containing header information at the beginning of the data file. The “Data Viewer” can be used to plot and/or save any or all data variables over any desired length of time. The data can be viewed versus data point index or time and, if desired, with pre-selected beat boundaries. Separate windows exist for calibrating data, automatically selecting beat boundaries, and viewing mathematically manipulated data. A “Data Analysis” window allows the user to select from a variety of automated data analysis routines. Finally, windows exist for viewing the analysis results and exporting the results and other information to an ascii file for data reduction and/or statistical analysis. The following sections describe each function in greater detail.

[ Figure 1 ]
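Of the functions outlined above, automatic beat-boundary selection is the most algorithmic. The paper does not specify HEART’s beat-picking method, so the following Python sketch (illustrative only, not HEART’s Matlab® code) shows one common approach: marking a beat onset wherever the pressure derivative first rises above half of its peak value, with a short refractory period so each beat is counted once.

```python
import numpy as np

def find_beat_onsets(pressure, fs):
    """Locate beat onsets as steep positive upstrokes in a pressure signal.

    Illustrative derivative-threshold sketch only; HEART's actual
    beat-selection algorithm is not described in the text.
    """
    dp = np.diff(pressure) * fs            # approximate dP/dt in units/s
    thresh = 0.5 * dp.max()                # half the steepest upstroke slope
    above = dp > thresh
    # indices where dp first rises above the threshold
    crossings = np.flatnonzero(~above[:-1] & above[1:]) + 1
    # refractory period: ignore re-crossings within 250 ms of the last onset
    onsets, min_sep = [], int(0.25 * fs)
    for i in crossings:
        if not onsets or i - onsets[-1] >= min_sep:
            onsets.append(int(i))
    return np.asarray(onsets)
```

On a clean synthetic pressure pulse train this returns one onset index per beat; a real implementation would also need to handle baseline drift and measurement noise.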

2.1. Profile

The first step in using HEART is to start the Matlab® program and then type “heart” at the command line prompt. Assuming the HEART program is located in a Matlab-accessible directory, the program will open to the “Profile” window (Figure 2). This window allows the user to either create or load a previously saved “profile”. A profile is an information set linked to a unique set of data: it is initially created to provide the information necessary for loading the data set into the Matlab® work environment, and it ultimately also stores beat indices, calibration factors, and analysis results. The user must create a profile or load a previously saved profile prior to accessing physiologic data. To create a profile, one must first select the “Create Profile” button. A data file can then be selected using the browse feature. If a calibration file exists (to be discussed later), its name and path can likewise be located using the browse feature. The names of the signals in the data file and their corresponding units must be entered, in the order in which they appear in the file, using the signal name and signal unit popup menus. Additional signal names can be created using the “Edit: Preferences” option (to be discussed later).

[ Figure 2 ]
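As a concrete, purely illustrative picture of what a profile holds, the record can be sketched in Python as a plain dictionary; the field names below are hypothetical and do not reflect HEART’s actual internal layout.

```python
def make_profile(data_path, signal_names, signal_units, fs,
                 file_format="ascii", word_size="int16", n_header_lines=0):
    """Build a profile record: the information set linked to one data file.

    Field names are illustrative only; HEART's internal profile
    structure is not documented in the text.
    """
    return {
        "data_path": data_path,
        "signal_names": list(signal_names),
        "signal_units": list(signal_units),
        "sampling_rate_hz": fs,
        "file_format": file_format,       # "ascii" or "binary"
        "word_size": word_size,           # data word size/type
        "n_header_lines": n_header_lines,
        # populated later in the workflow:
        "beat_indices": [],
        "calibration_factors": {},
        "analysis_results": {},
    }
```

The point of the sketch is the life cycle: the loading fields are filled in when the profile is created, while beat indices, calibration factors, and analysis results accumulate as the later steps are performed.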

A section exists in the profile window for specifying information concerning the data file. The four specifications include the data file format, sampling rate, data word size, and number of lines of the file header. The file format can be either ascii or binary. The sampling frequency must be entered in units of samples per second. The data word size, or type and length of each data value, can be selected from a popup menu that provides six different word lengths and types including integer and floating point numbers. If the data file has a header, the number of lines can be specified so that header lines are skipped when loading the data. A “View Header” button opens a separate window showing the specified number of lines of the data set. The entered number of header lines can then be adjusted if it was inaccurate. To load the data, the user selects the “Load Data File” button. A small window will appear indicating that data is being loaded, which is followed by a window that indicates whether or not the data was successfully loaded.

A description of the data and pertinent notes can be entered and displayed in the “Profile Description and Notes” panel by clicking on the “Edit Description” button. Additionally, the profile name and path, along with the original and last-saved dates, are provided in the “Other Profile Information” panel.

Once the profile has been created, it can be saved as a “*.pro” file. The default profile name is the name of the specified data file but with an extension of “.pro”. Of course, this name can be altered when saving the profile. Multiple profile handling was incorporated so that several profiles can be opened and data loaded before proceeding to other program options. Each HEART subroutine provides the user with an option of which profile/data set to use. This eliminates the need to repeatedly access the profile window or close profiles so that another one can be opened.

2.2 Data Preparation: View Data

Once a profile has been called or created and the experimental data have been loaded, one will generally want to inspect the waveforms to ensure the data loaded properly. This can be accomplished using the data viewer (Figure 3), which allows selected signals to be plotted over any selected range. If desired, a user can save the raw data in the plot window, along with the data file name, data range, sampling rate, and beat indices, to a Matlab® or ascii file by clicking on the “Save Plotted Data to a File” button. Additionally, a “Mark Point” button can be used to display the values of all plotted signals at a selected x-value. Several features of the “View Data” window are also utilized in other windows that involve plotting. These include zoom, a “VCR Player Control” that scrolls through the data at adjustable speeds, and options to view the data in its entirety, calibrated or uncalibrated, normalized, or versus time or index. If the beats have already been picked, one can also view the picked beat boundaries for the signal of choice.

[ Figure 3 ]

2.3. Data Preparation: Data Calibration

The HEART program offers a simple means for calibrating data and storing calibration factors. Calibration of data against a known standard is necessary to convert binary bit counts to physiologic units and to ensure the accuracy of the acquired data prior to analysis. The calibration process begins by loading data collected at various known measurement values; for instance, data collected from a pressure catheter subjected to known pressures of 0 mmHg and 100 mmHg. The data can then be viewed in the “Data Calibration Factors Extrapolation” window found under the “Data Preparation” pull-down menu, as shown in Figure 4. Data calibration zones are selected at each desired measurement level by clicking on the “Pick Calibration Zone” button and then choosing the left and right sides of a data section using a cursor. The user is prompted to enter the actual known value of each zone, for instance, 0 mmHg (minimum) and 100 mmHg (maximum). This information is stored in memory, and the procedure must be performed for at least two zones of differing values. Incorrectly selected zones can be deleted by selecting the “Delete a Zone” button.

[ Figure 4 ]

The user then selects whether a first- or second-order fit is desired. A first-order fit determines the coefficients of the straight line (y = Ax + B) that best fits the average values of the calibration zones to the entered known values in a least-squares sense. A second-order fit requires at least three calibration zones and determines the coefficients of a non-linear fit of the form y = Ax² + Bx + C. The first-order fit is generally the method of choice, as transducers should be linear. The calibration procedure must be repeated for all other signals that require calibration.
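The zone-to-value fit itself is a small least-squares problem. As a hedged Python sketch (HEART is Matlab® code; these helper names are hypothetical), mapping zone means to known values with numpy.polyfit reproduces both fit orders:

```python
import numpy as np

def calibration_fit(zone_means, known_values, order=1):
    """Least-squares fit mapping zone-average raw values (e.g. bit counts)
    to known physical values. order=1 gives y = A*x + B; order=2, which
    needs at least three zones, gives y = A*x**2 + B*x + C.

    Illustrative helper, not HEART's actual routine.
    """
    if order == 2 and len(zone_means) < 3:
        raise ValueError("a second-order fit requires at least three zones")
    return np.polyfit(zone_means, known_values, order)

def apply_calibration(coeffs, raw):
    """Convert raw samples to physical units with the fitted polynomial."""
    return np.polyval(coeffs, np.asarray(raw, dtype=float))
```

For example, with two zones whose means are 2048 and 3072 bit counts at known pressures of 0 and 100 mmHg, the first-order fit maps a raw value of 2560 to 50 mmHg.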