12 results for concept analysis

in DigitalCommons@The Texas Medical Center


Relevance: 60.00%

Abstract:

Introduction: Nursing clinical credibility, a complex and abstract concept, is rarely mentioned explicitly in the clinical setting but is implicitly understood by nurses and physicians. The concept has been neither defined nor explored, despite its repeated use in the literature. A review of the extant literature formed the basis for a concept analysis of nursing clinical credibility, which is currently under review for publication.

Methods: Taxonomic analysis was applied to the findings of a descriptive qualitative study in which registered nurses and physicians identified attributes of nursing clinical credibility as it applied to nurses in direct care roles in a hospital setting; these findings formed the basis for the development of taxonomies of nursing clinical credibility. A secondary review of the literature was undertaken to verify congruence of the taxonomic domains with the work of previous researchers who studied credibility and source credibility.

Results: Three taxonomies of nursing clinical credibility emerged from the taxonomic analysis. Using an inductive approach, two separate taxonomies were developed: one from registered nurses' descriptions of nursing clinical credibility and the other from physicians' descriptions. A third and final taxonomy reflects the commonalities between the two. Three domains were consistent for both nurses and physicians: trustworthiness, expertise, and caring. The two disciplines differed in categories and emphases within the domains; both focused on the attributes of trustworthiness and caring, but they differed on the components of expertise.

Discussion: Findings from this study of nursing clinical credibility concur with the work of previous researchers who identified trustworthiness and expertise as attributes of credibility and source credibility. The findings suggest, however, that trustworthiness and expertise alone are not sufficient attributes of nursing clinical credibility; caring emerged as an essential domain according to both nurses and physicians.

Products: Products of this research include a concept analysis, two discipline-specific taxonomies of nursing clinical credibility, a third and final taxonomy, and a monograph that describes the development of the final taxonomy of nursing clinical credibility.

Relevance: 60.00%

Abstract:

Developing a Model: Interruption is a known human factor that contributes to errors and catastrophic events in healthcare as well as in other high-risk industries. The landmark Institute of Medicine (IOM) report, To Err Is Human, brought attention to the significance of preventable errors in medicine and suggested that interruptions could be a contributing factor. Previous studies of interruptions in healthcare did not offer a conceptual model by which to study interruptions. Given the serious consequences of interruptions documented in other high-risk industries, there is a need for a model to describe, understand, explain, and predict interruptions and their consequences in healthcare. The purpose of this study was therefore to develop a model grounded in the literature and to use the model to describe and explain interruptions in healthcare, specifically interruptions occurring in a Level One Trauma Center. A trauma center was chosen because this environment is characterized as intense, unpredictable, and interrupt-driven.

The first step in developing the model was a review of the literature, which revealed that the concept of interruption did not have a consistent definition in either the healthcare or non-healthcare literature. Walker and Avant's method of concept analysis was used to clarify and define the concept. The analysis led to the identification of five defining attributes: (1) a human experience, (2) an intrusion of a secondary, unplanned, and unexpected task, (3) discontinuity, (4) external or internal initiation, and (5) situation within a context. Before an interruption can commence, five conditions known as antecedents must occur: (1) an intent to interrupt is formed by the initiator, (2) a physical signal passes a threshold test of detection by the recipient, (3) the sensory system of the recipient is stimulated to respond to the initiator, (4) an interruption task is presented to the recipient, and (5) the interruption task is either accepted or rejected by the recipient. An interruption was determined to be quantifiable by (1) the frequency of occurrence of interruptions, (2) the number of times the primary task is suspended to perform an interrupting task, (3) the length of time the primary task is suspended, and (4) the frequency of returning, or not returning, to the primary task.

As a result of the concept analysis, a definition of an interruption was derived from the literature. An interruption is defined as a break in the performance of a human activity, initiated internal or external to the recipient and occurring within the context of a setting or location. This break results in the suspension of the initial task by initiating the performance of an unplanned task, with the assumption that the initial task will be resumed. The definition is inclusive of all the defining attributes of an interruption and can serve as a standard definition for the healthcare industry. From the definition, a visual model of an interruption was developed.

The model was used to describe and explain the interruptions recorded during an instrumental case study of physicians and registered nurses (RNs) working in a Level One Trauma Center. Five physicians were observed for a total of 29 hours, 31 minutes; eight registered nurses were observed for a total of 40 hours, 9 minutes. Observations were made on either the 0700–1500 or the 1500–2300 shift using the shadowing technique and were recorded as field notes. The field notes were analyzed with a hybrid method for categorizing activities and interruptions, developed by combining a deductive a priori classification framework with an inductive process of line-by-line coding and constant comparison as described in grounded theory. The following categories were identified as relevant to this study:

Intended Recipient – the person to be interrupted
Unintended Recipient – not the intended recipient of an interruption (e.g., receiving a phone call that was incorrectly dialed)
Indirect Recipient – the incidental recipient of an interruption (e.g., talking with another person, thereby suspending the original activity)
Recipient Blocked – the intended recipient does not accept the interruption
Recipient Delayed – the intended recipient postpones an interruption
Self-interruption – a person, independent of another person, suspends one activity to perform another (e.g., while walking, stops abruptly and talks to another person)
Distraction – briefly disengaging from a task
Organizational Design – the physical layout of the workspace that causes a disruption in workflow
Artifacts Not Available – supplies and equipment that are not available in the workspace, causing a disruption in workflow
Initiator – a person who initiates an interruption

Interruption by Organizational Design and Artifacts Not Available were identified as two new categories of interruption that had not previously been cited in the literature. Analysis of the observations indicated that physicians performed slightly fewer activities per hour than RNs, a variance that may be attributed to differing roles and responsibilities. Physicians had more of their activities interrupted than RNs; however, RNs experienced more interruptions per hour. Other people were the medium most commonly used to deliver an interruption; additional mediums included the telephone, pager, and one's self. Both physicians and RNs were observed to resume an original interrupted activity more often than not, and in most cases they performed only one or two interrupting activities before returning to the original interrupted activity. In conclusion, the model explained all interruptions observed during the study; however, a more comprehensive study will be required to establish its predictive value.
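The four quantification measures named in the concept analysis lend themselves to simple tallying from observation records. The following Python fragment is an illustrative sketch only; the record format and the example values are hypothetical and are not the study's actual instrument.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class InterruptionEvent:
    """One observed interruption of a primary task (hypothetical record format)."""
    primary_task: str
    suspended_seconds: float     # how long the primary task was suspended
    returned_to_primary: bool    # whether the primary task was observed to resume

def interruption_metrics(events: List[InterruptionEvent], observed_hours: float) -> dict:
    """Compute the four quantification measures named in the concept analysis."""
    n = len(events)
    return {
        "interruptions_per_hour": n / observed_hours if observed_hours else 0.0,
        "primary_task_suspensions": n,
        "total_suspended_minutes": sum(e.suspended_seconds for e in events) / 60.0,
        "return_rate": sum(e.returned_to_primary for e in events) / n if n else 0.0,
    }

# Example with made-up values: two interruptions in a one-hour observation window.
events = [
    InterruptionEvent("medication preparation", 90, True),
    InterruptionEvent("charting", 45, False),
]
print(interruption_metrics(events, observed_hours=1.0))
```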

Relevance: 60.00%

Abstract:

Clinical Research Data Quality Literature Review and Pooled Analysis: We present a literature review and secondary analysis of data accuracy in clinical research and related secondary data uses. A total of 93 papers meeting our inclusion criteria were categorized according to the data processing methods used. Quantitative data accuracy information was abstracted from the articles and pooled. Our analysis demonstrates that the accuracy associated with data processing methods varies widely, with error rates ranging from 2 to 5,019 errors per 10,000 fields. Medical record abstraction was associated with the highest error rates (70–5,019 errors per 10,000 fields). Data entered and processed at healthcare facilities had error rates comparable to data processed at central data processing centers, and error rates for data processed with single entry in the presence of on-screen checks were comparable to those of double-entered data. While data processing and cleaning methods may explain a significant amount of the variability in data accuracy, additional factors not resolvable here likely exist.

Defining Data Quality for Clinical Research: A Concept Analysis: Despite notable previous attempts by experts to define data quality, the concept remains ambiguous and subject to the vagaries of natural language, and this lack of clarity continues to hamper research related to data quality issues. We present a formal concept analysis of data quality that builds on and synthesizes previously published work. We further posit that discipline-level specificity may be required to achieve the desired definitional clarity. To this end, we combine work from the clinical research domain with findings from the general data quality literature to produce a discipline-specific definition and operationalization of data quality in clinical research. While the results are helpful to clinical research, the methodology of concept analysis may be useful in other fields to clarify data quality attributes and to achieve operational definitions.

Medical Record Abstractors' Perceptions of Factors Impacting the Accuracy of Abstracted Data: Medical record abstraction (MRA) is known to be a significant source of data errors in secondary data uses, yet the factors impacting the accuracy of abstracted data are not reported consistently in the literature. Two Delphi processes were conducted with experienced medical record abstractors to assess abstractors' perceptions of these factors. The Delphi process identified 9 factors not found in the literature, differed from the literature on 5 factors in the top 25%, and refuted 7 factors reported in the literature as impacting the quality of abstracted data. The results provide insight into, and indicate content validity of, a significant number of the factors reported in the literature. Further, the results indicate general consistency between the perceptions of clinical research medical record abstractors and those of registry and quality improvement abstractors.

Distributed Cognition Artifacts on Clinical Research Data Collection Forms: Medical record abstraction, a primary mode of data collection in secondary data use, is associated with high error rates, yet distributed cognition in medical record abstraction has not been studied as a possible explanation for abstraction errors. We employed the theory of distributed representation and representational analysis to systematically evaluate cognitive demands in medical record abstraction and the extent of external cognitive support provided by a sample of clinical research data collection forms. We show that the cognitive load required for abstraction was high for 61% of the sampled data elements, and exceedingly so for 9%; further, the data collection forms did not support external cognition for the most complex data elements. High working memory demands are a possible explanation for the association of data errors with data elements requiring abstractor interpretation, comparison, mapping, or calculation. The representational analysis used here can be applied to identify data elements with high cognitive demands.
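The "errors per 10,000 fields" scale used in the pooled analysis is straightforward to reproduce from study-level counts. The sketch below is illustrative only; the study names, error counts, and field counts are hypothetical, and the simple field-weighted pooling is an assumption rather than the dissertation's actual pooling method.

```python
# Normalizing study-level error counts to the errors-per-10,000-fields scale.
studies = [
    {"name": "study_A", "errors": 14,  "fields_inspected": 52_000},
    {"name": "study_B", "errors": 310, "fields_inspected": 8_500},
]

def errors_per_10k(errors: int, fields: int) -> float:
    """Error rate expressed per 10,000 fields inspected."""
    return 10_000 * errors / fields

for s in studies:
    print(s["name"], round(errors_per_10k(s["errors"], s["fields_inspected"]), 1))

# A simple pooled estimate weights each study by the number of fields inspected.
pooled = errors_per_10k(
    sum(s["errors"] for s in studies),
    sum(s["fields_inspected"] for s in studies),
)
print("pooled errors per 10,000 fields:", round(pooled, 1))
```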

Relevance: 60.00%

Abstract:

Background: The failure rate of health information systems is high, partially due to fragmented, incomplete, or incorrect identification and description of specific and critical domain requirements. In order to systematically transform the requirements of work into a real information system, an explicit conceptual framework is essential to summarize the work requirements and guide system design. Recently, Butler, Zhang, and colleagues proposed a conceptual framework called Work Domain Ontology (WDO) to formally represent users' work. The WDO approach has been successfully demonstrated in a real-world design project on aircraft scheduling. However, as a top-level conceptual framework, the WDO has not defined an explicit and well-specified schema (WDOS), it lacks a generalizable and operationalized procedure that can be easily applied to develop a WDO, and it has not been developed for any concrete healthcare domain. These limitations hinder the utility of the WDO in real-world information systems in general and in health information systems in particular.

Objective: The objective of this research is to formalize the WDOS, operationalize a procedure to develop a WDO, and evaluate the WDO approach using the Self-Nutrition Management (SNM) work domain.

Method: Concept analysis was used to formalize the WDOS. Focus group interviews were conducted to capture concepts in the SNM work domain. Ontology engineering methods were adopted to model the SNM WDO. A subset of the concepts under the primary goal "staying healthy" for SNM was selected and transformed into a semi-structured survey to evaluate the acceptance, explicitness, completeness, consistency, and experience dependency of the SNM WDO.

Result: Four concepts, "goal, operation, object, and constraint," were identified and formally modeled in the WDOS with definitions and attributes. Seventy-two SNM WDO concepts under the primary goal were selected and transformed into semi-structured survey questions. The evaluation indicated that the major concepts of the SNM WDO were accepted by 41 overweight subjects. The SNM WDO is generally independent of user domain experience but partially dependent on SNM application experience. Of 41 paired concepts, 23 had significant correlations. Two concepts were identified as ambiguous, and 8 additional concepts were recommended to improve the completeness of the SNM WDO.

Conclusion: The preliminary WDOS is ready, with an operationalized procedure. The SNM WDO has been developed to guide future SNM application design. This research is an essential step towards Work-Centered Design (WCD).
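To make the four WDOS concepts concrete, the following is a minimal Python sketch of how "goal, operation, object, and constraint" might be represented. The field names and the SNM example instance are illustrative assumptions, not the dissertation's actual schema.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class WorkObject:
    name: str                      # thing acted upon in the work domain

@dataclass
class Constraint:
    description: str               # condition that bounds how the goal may be pursued

@dataclass
class Operation:
    name: str                      # action performed on objects
    objects: List[WorkObject] = field(default_factory=list)

@dataclass
class Goal:
    name: str                      # user's work goal, independent of any implementation
    operations: List[Operation] = field(default_factory=list)
    constraints: List[Constraint] = field(default_factory=list)

# Hypothetical Self-Nutrition Management example under the primary goal "staying healthy".
staying_healthy = Goal(
    name="staying healthy",
    operations=[Operation("track meals", [WorkObject("meal record")])],
    constraints=[Constraint("daily calorie target")],
)
print(staying_healthy.name, len(staying_healthy.operations), len(staying_healthy.constraints))
```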

Relevance: 60.00%

Abstract:

Manuscript 1: "Conceptual Analysis: Externalizing Nursing Knowledge." We use concept analysis to establish that the report tools nurses prepare, carry, reference, amend, and use as temporary data repositories are examples of cognitive artifacts. This tool, integrally woven throughout the work and practice of nurses, is important to cognition and clinical decision-making. Establishing the tool as a cognitive artifact will support new dimensions of study. Such studies can characterize how this report tool supports cognition, the internal representation of knowledge and skills, and the external representation of knowledge of the nurse.

Manuscript 2: "Research Methods: Exploring Cognitive Work." The purpose of this paper is to describe a complex, cross-sectional, multi-method approach to the study of personal cognitive artifacts in the clinical environment. The complex data arrays present in these cognitive artifacts warrant the use of multiple methods of data collection. Use of a less robust research design may result in an incomplete understanding of the meaning, value, and content of personal cognitive artifacts in the clinical environment, and of their relationship to the cognitive work of the user.

Manuscript 3: "Making the Cognitive Work of Registered Nurses Visible." Purpose: Knowledge representations and structures are created and used by registered nurses to guide patient care. Understanding is limited regarding how these knowledge representations, or cognitive artifacts, contribute to working memory, prioritization, organization, cognition, and decision-making. The purpose of this study was to identify and characterize the role of a specific cognitive artifact, its knowledge representation and structure, as it contributes to the cognitive work of the registered nurse. Methods: Data collection was completed using qualitative research methods, shadowing and interviewing 25 registered nurses. Data analysis employed triangulation and iterative analytic processes. Results: Nurse cognitive artifacts support recall, data evaluation, decision-making, organization, and prioritization. These cognitive artifacts demonstrated spatial, longitudinal, chronologic, visual, and personal cues to support the cognitive work of nurses. Conclusions: Nurse cognitive artifacts are an important adjunct to the cognitive work of nurses and directly support patient care. Nurses need to be able to configure their cognitive artifacts in ways that are meaningful and that support their internal knowledge representations.

Relevance: 30.00%

Abstract:

Do siblings of centenarians tend to have longer life spans? To answer this question, the life spans of 184 siblings of 42 centenarians were evaluated. Two important questions were addressed in analyzing the sibling data. First, a standard needed to be established against which the life spans of the 184 siblings could be compared. In this report, an external reference population is constructed from the U.S. life tables, and its estimated mortality rates are treated as baseline hazards from which the relative mortality of the siblings is estimated. Second, standard survival models, which assume independent observations, are invalid when correlation within families exists, because they underestimate the true variance. Three approaches that allow for such correlation are illustrated. First, the cumulative relative excess mortality between the siblings and their comparison group is calculated and used as an effective graphic tool, along with the product-limit estimator of the survival function; the variance estimator of the cumulative relative excess mortality is adjusted for potential within-family correlation using a Taylor linearization approach. Second, approaches that adjust for the inflated variance are examined: an adjusted one-sample log-rank test using the design effect originally proposed by Rao and Scott in the correlated binomial or Poisson setting, and a robust variance estimator derived from the log-likelihood function of a multiplicative model. Neither of these two approaches provides an estimate of the correlation within families, but the comparison with the standard remains valid under dependence. Last, using the frailty model concept, the multiplicative model, in which the baseline hazards are known, is extended by adding a random frailty term based on the positive stable or the gamma distribution; the two frailty distributions are compared by simulation. Based on the results from these approaches, it is concluded that the siblings of centenarians had significantly lower mortality rates than their cohorts. The frailty models also indicate significant correlations between the life spans of the siblings.
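The core comparison against life-table baseline hazards can be illustrated with a short numeric sketch. The person-years, death counts, and baseline rates below are made up, and the study's actual estimators (e.g., the Taylor-linearized variance and the frailty models) are not reproduced here.

```python
import numpy as np

# Person-years at risk and observed deaths in three age bands for the siblings (hypothetical).
person_years = np.array([1200.0, 800.0, 350.0])
observed_deaths = np.array([6, 14, 21])

# Baseline (life-table) mortality rates per person-year for the same age bands (hypothetical).
baseline_hazard = np.array([0.01, 0.03, 0.10])

expected_deaths = baseline_hazard * person_years
relative_mortality = observed_deaths.sum() / expected_deaths.sum()   # SMR-style ratio

# Cumulative excess mortality across age bands (observed minus expected deaths).
cumulative_excess = np.cumsum(observed_deaths - expected_deaths)

print("expected deaths:", expected_deaths.round(1))
print("relative mortality:", round(relative_mortality, 2))
print("cumulative excess deaths by age band:", cumulative_excess.round(1))
```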

Relevance: 30.00%

Abstract:

Dielectrophoresis (DEP) has been used to manipulate cells in low-conductivity suspending media using AC electrical fields generated on micro-fabricated electrode arrays. This has created the possibility of automatically performing, on a micro-scale, cell processing more sophisticated than that which currently requires substantial laboratory equipment, reagent volumes, time, and human intervention. In this research, the manipulation of aqueous droplets in an immiscible, low-permittivity suspending medium is described to complement previous work on dielectrophoretic cell manipulation. Such droplets can serve as carriers not only for air- and water-borne samples, contaminants, chemical reagents, viral and gene products, and cells, but also for the reagents needed to process and characterize these samples. A long-term goal of this area of research is to perform chemical and biological assays on automated, micro-scaled devices at or near the point of care, which will increase the availability of modern medicine to people who do not have ready access to large medical institutions and decrease the cost and delays associated with that lack of access. In this research I present proofs of concept for droplet manipulation and droplet-based biochemical analysis using dielectrophoresis as the motive force. Proofs of concept developed for the first time in this research include: (1) droplet movement on a two-dimensional array of electrodes, (2) controlled dielectric droplet injection, (3) fusing and reacting of droplets, and (4) a protein fluorescence assay using micro-droplets.

Relevance: 30.00%

Abstract:

Next-generation DNA sequencing platforms can effectively detect the entire spectrum of genomic variation and are emerging as a major tool for systematic exploration of the universe of variants and interactions in the entire genome. However, the data produced by next-generation sequencing technologies suffer from three basic problems: sequence errors, assembly errors, and missing data. Current statistical methods for genetic analysis are well suited to detecting the association of common variants but are less suitable for rare variants, which poses a great challenge for sequence-based genetic studies of complex diseases.

This dissertation research used the genome continuum model as a general principle, and stochastic calculus and functional data analysis as tools, to develop novel and powerful statistical methods for the next generation of association studies of both qualitative and quantitative traits in the context of sequencing data, ultimately shifting the paradigm of association analysis from the current locus-by-locus analysis to the collective analysis of genome regions.

In this project, functional principal component (FPC) methods coupled with high-dimensional data reduction techniques were used to develop novel and powerful methods for testing the association of the entire spectrum of genetic variation within a segment of the genome or a gene, regardless of whether the variants are common or rare.

Classical quantitative genetics methods suffer from high type I error rates and low power for rare variants. To overcome these limitations for resequencing data, this project used functional linear models with a scalar response to develop statistics for identifying quantitative trait loci (QTLs) for both common and rare variants. To illustrate their application, the functional linear models were applied to five quantitative traits in the Framingham Heart Study.

This project also proposed a novel concept of gene-gene co-association, in which a gene or a genomic region is taken as the unit of association analysis, and used stochastic calculus to develop a unified framework for testing the association of multiple genes or genomic regions for both common and rare alleles. The proposed methods were applied to gene-gene co-association analysis of psoriasis in two independent GWAS datasets, leading to the discovery of networks significantly associated with psoriasis.
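The general flavor of an FPC-style region test can be sketched as follows: genotypes across a region are treated as discretized functions, the leading principal component scores summarize the region, and those scores are regressed against a quantitative trait. All data below are simulated and the estimator is deliberately simplified; this is an assumption-laden illustration, not the dissertation's actual method.

```python
import numpy as np

rng = np.random.default_rng(0)
n_subjects, n_positions = 200, 50
genotypes = rng.binomial(2, 0.05, size=(n_subjects, n_positions)).astype(float)
trait = genotypes[:, 10] * 0.8 + rng.normal(size=n_subjects)   # one simulated causal variant

# Functional-PCA style decomposition of the discretized genotype "functions".
centered = genotypes - genotypes.mean(axis=0)
cov = centered.T @ centered / n_subjects
eigvals, eigvecs = np.linalg.eigh(cov)            # eigenvalues in ascending order
top = eigvecs[:, ::-1][:, :3]                     # three leading eigenfunctions
scores = centered @ top                           # FPC scores per subject
explained = eigvals[::-1][:3].sum() / eigvals.sum()

# Simple least-squares association of the trait with the FPC scores.
X = np.column_stack([np.ones(n_subjects), scores])
beta, *_ = np.linalg.lstsq(X, trait, rcond=None)

print("variance explained by 3 FPCs:", round(float(explained), 3))
print("regression coefficients for FPC scores:", np.round(beta[1:], 3))
```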

Relevance: 30.00%

Abstract:

The Advisory Committee on Immunization Practices (ACIP) develops written recommendations for the routine administration of vaccines to children and adults in the U.S. civilian population and is the only entity in the federal government that makes such recommendations. The ACIP elaborates on the selection of its members and dispels concerns regarding its integrity, but it fails to provide information about the importance of economic analysis in vaccine selection. ACIP recommendations can have large health and economic consequences, and the emphasis on economic evaluation in health is a likely response to severe pressures on federal and state health budgets. This study describes the economic aspects considered by the ACIP while sanctioning a vaccine and reviews the economic evaluations (our economic data) provided for vaccine deliberations. A five-year study period from 2004 to 2009 was adopted, using publicly available data from the ACIP web database. The Drummond et al. (2005) checklist served as a guide for assessing the quality of the economic evaluations presented. Because the Drummond et al. checklist is comprehensive, it is unrealistic to expect every ACIP deliberation to meet all of its criteria; for practical purposes, we selected seven of the Drummond et al. criteria that we judged to be significant.

Twenty-four data points (economic evaluations) were obtained over the five-year period. Our results show that of these twenty-four data points, only five received a score of six, that is, six of the seven items on the list were met. None of the data points received a perfect score of seven. Seven of the twenty-four data points received a score of five, and the lowest score, two, was received by only one of the economic analyses. The type-of-economic-evaluation criterion, the model criterion, and the ICER/QALY criterion were each met at a rate of 0.875 (87.5%), the highest rates among the seven criteria studied. The perspective criterion was met at 0.583 (58.3%), followed by the source and sensitivity analysis criteria, both at 0.541 (54.1%). The discounting criterion was met at 0.250 (25.0%).

Economic analysis is not a novel concept to the ACIP; it has been practiced and presented at these meetings on a regular basis for more than five years. The ACIP's stated goal is to utilize good-quality epidemiologic, clinical, and economic analyses to help policy makers choose among the alternatives presented and thus reach better-informed decisions. As seen in our study, however, the economic analyses presented over the years are inconsistent. This large variability, coupled with the lack of a standardized format, may compromise the utility of the economic information for decision-making. Because the ACIP takes into account all available information about a vaccine when making recommendations, it is vital that standardized, high-quality economic information be provided at ACIP meetings. Our study may provide a call for the ACIP to further investigate deficiencies within the system and thereby improve the economic evaluation data presented.
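The per-criterion "met at" rates and per-evaluation scores described above amount to simple tallies over a criteria-by-evaluation grid. The sketch below illustrates that arithmetic; the criterion labels follow the seven areas discussed in the abstract, but the pass/fail values are made up for illustration.

```python
# Tallying how often each checklist criterion is met across scored economic evaluations.
criteria = ["evaluation_type", "model", "icer_qaly", "perspective",
            "data_sources", "sensitivity_analysis", "discounting"]

# Each row: one economic evaluation, True where the criterion was met (hypothetical values).
evaluations = [
    dict(zip(criteria, [True, True, True, True, False, True, False])),
    dict(zip(criteria, [True, True, True, False, True, False, False])),
    dict(zip(criteria, [True, False, True, True, True, True, True])),
]

for c in criteria:
    rate = sum(e[c] for e in evaluations) / len(evaluations)
    print(f"{c}: met at {rate:.3f}")

# Per-evaluation total score out of seven.
print("scores:", [sum(e.values()) for e in evaluations])
```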

Relevance: 30.00%

Abstract:

This dissertation develops and tests a comparative effectiveness methodology utilizing a novel application of Data Envelopment Analysis (DEA) in health studies. The concept of performance tiers (PerT) is introduced as terminology to express a relative risk class for individuals within a peer group, and the PerT calculation is implemented with operations research (DEA) and spatial algorithms. The DEA-PerT methodology discriminates the individual data observations into relative risk classes. The performance of two distance measures, kNN (k-nearest neighbor) and Mahalanobis, was subsequently tested for classifying new entrants into the appropriate tier. The methods were applied to subject data for the 14-year-old cohort in the Project HeartBeat! study.

The concepts presented herein represent a paradigm shift in the potential for public health applications to identify and respond to individual health status. The resultant classification scheme provides descriptive, and potentially prescriptive, guidance for assessing and implementing treatments and strategies to improve the delivery and performance of health systems.
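One of the two distance measures tested, the Mahalanobis distance, can be illustrated with a short classification sketch: a new entrant is assigned to the tier whose centroid is closest in Mahalanobis terms. The tier centroids, covariance matrix, feature meanings, and new observation below are hypothetical, not values from the study.

```python
import numpy as np

tier_centroids = {
    "tier_1": np.array([0.95, 0.10]),   # e.g., (efficiency score, risk index) - assumed features
    "tier_2": np.array([0.75, 0.35]),
    "tier_3": np.array([0.50, 0.70]),
}
cov = np.array([[0.02, 0.005],
                [0.005, 0.04]])
cov_inv = np.linalg.inv(cov)

def mahalanobis(x: np.ndarray, center: np.ndarray) -> float:
    """Mahalanobis distance of x from a tier centroid under the shared covariance."""
    d = x - center
    return float(np.sqrt(d @ cov_inv @ d))

new_subject = np.array([0.70, 0.40])
distances = {tier: mahalanobis(new_subject, c) for tier, c in tier_centroids.items()}
print("assigned tier:", min(distances, key=distances.get))
print(distances)
```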

Relevance: 30.00%

Abstract:

Proton therapy is growing increasingly popular due to its superior dose characteristics compared to conventional photon therapy. Protons travel a finite range in the patient body and then stop, delivering no dose beyond their range. However, because the range of a proton beam is heavily dependent on the tissue density along its beam path, uncertainties in patient setup position and in the inherent range calculation can degrade the dose distribution significantly. Despite these challenges, which are unique to proton therapy, current management of uncertainties during treatment planning for proton therapy has been similar to that of conventional photon therapy. The goal of this dissertation research was to develop a treatment planning method and a plan evaluation method that address proton-specific issues regarding setup and range uncertainties.

Treatment plan design method adapted to proton therapy: Currently, for proton therapy using a scanning beam delivery system, setup uncertainties are largely accounted for by geometrically expanding a clinical target volume (CTV) to a planning target volume (PTV). However, a PTV alone cannot adequately account for range uncertainties coupled to misaligned patient anatomy in the beam path, since it does not account for the change in tissue density. To remedy this problem, we proposed a beam-specific PTV (bsPTV) that accounts for the change in tissue density along the beam path due to these uncertainties. Our proposed method was successfully implemented, and its superiority over the conventional PTV was shown through a controlled experiment. Furthermore, we showed that the bsPTV concept can be incorporated into beam angle optimization for better target coverage and normal tissue sparing for a selected lung cancer patient.

Treatment plan evaluation method adapted to proton therapy: The dose-volume histogram of the clinical target volume (CTV), or of any other volume of interest, at the time of planning does not represent the most probable dosimetric outcome of a given plan, as it does not include the uncertainties mentioned above. Currently, the PTV is used as a surrogate for the CTV's worst-case scenario for target dose estimation; however, because proton dose distributions are subject to change under these uncertainties, the validity of the PTV analysis method is questionable. To remedy this problem, we proposed the use of statistical parameters to quantify uncertainties on both the dose-volume histogram and the dose distribution directly. The robust plan analysis tool was successfully implemented to compute both the expectation value and the standard deviation of dosimetric parameters of a treatment plan under the uncertainties. For 15 lung cancer patients, the proposed method was used to quantify the dosimetric difference between the nominal situation and its expected value under the uncertainties.
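The robust-evaluation idea of reporting an expectation value and standard deviation for a dosimetric parameter can be sketched as a weighted average over uncertainty scenarios. The scenario doses, the choice of CTV D95 as the parameter, and the scenario probabilities below are illustrative assumptions, not the dissertation's actual implementation.

```python
import numpy as np

# Hypothetical CTV D95 (in Gy) recomputed for each setup/range uncertainty scenario.
scenario_d95 = np.array([60.1, 59.4, 58.7, 60.0, 57.9, 59.8])
# Assumed scenario probabilities (nominal scenario weighted most heavily); they sum to 1.
weights = np.array([0.30, 0.14, 0.14, 0.14, 0.14, 0.14])

expected_d95 = np.sum(weights * scenario_d95)
std_d95 = np.sqrt(np.sum(weights * (scenario_d95 - expected_d95) ** 2))

print(f"expected CTV D95: {expected_d95:.1f} Gy")
print(f"standard deviation under uncertainty: {std_d95:.1f} Gy")
```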

Relevance: 30.00%

Abstract:

The first manuscript, entitled "Time-Series Analysis as Input for Clinical Predictive Modeling: Modeling Cardiac Arrest in a Pediatric ICU," lays out the theoretical background for the project. Several core concepts are presented in this paper. First, traditional multivariate models (where each variable is represented by only one value) provide single point-in-time snapshots of patient status: they are incapable of characterizing deterioration. Since deterioration is consistently identified as a precursor to cardiac arrests, we maintain that the traditional multivariate paradigm is insufficient for predicting arrests. We identify time series analysis as a method capable of characterizing deterioration in an objective, mathematical fashion, and describe how to build a general foundation for predictive modeling using time series analysis results as latent variables.

Building a solid foundation for any given modeling task involves addressing a number of issues during the design phase, including selecting the proper candidate features on which to base the model and selecting the most appropriate tools to measure them. We also identified several unique design issues that are introduced when time series data elements are added to the set of candidate features. One such issue is defining the duration and resolution of the time series elements required to sufficiently characterize the time series phenomena being considered as candidate features for the predictive model. Once the duration and resolution are established, there must also be explicit mathematical or statistical operations that produce the time series analysis result to be used as a latent candidate feature.

In synthesizing the comprehensive framework for building a predictive model based on time series data elements, we identified at least four classes of data that can be used in the model design. The first two classes are shared with traditional multivariate models: multivariate data and clinical latent features. Multivariate data are represented by the standard one-value-per-variable paradigm and are widely employed in a host of clinical models and tools; they are often represented by a number in a given cell of a table. Clinical latent features are derived, rather than directly measured, data elements that more accurately represent a particular clinical phenomenon than any of the directly measured data elements in isolation. The second two classes are unique to time series data elements. The first of these is the raw data elements, represented by multiple values per variable; these constitute the measured observations typically available to end users when they review time series data and are often represented as dots on a graph. The final class of data results from performing time series analysis and represents the fundamental concept on which our hypothesis is based. The specific statistical or mathematical operations are up to the modeler to determine, but we generally recommend that a variety of analyses be performed in order to maximize the likelihood of producing a representation of the time series data elements that is able to distinguish between two or more classes of outcomes.

The second manuscript, entitled "Building Clinical Prediction Models Using Time Series Data: Modeling Cardiac Arrest in a Pediatric ICU," provides a detailed description, start to finish, of the methods required to prepare the data and to build and validate a predictive model that uses the time series data elements determined in the first paper. One of the fundamental tenets of the second paper is that manual implementations of time-series-based models are infeasible due to the relatively large number of data elements and the complexity of the preprocessing that must occur before data can be presented to the model. Each of the seventeen steps is analyzed from the perspective of how it may be automated, when necessary. We identify the general objectives and available strategies of each step, and we present our rationale for choosing a specific strategy for each step in the case of predicting cardiac arrest in a pediatric intensive care unit. Another issue brought to light by the second paper is that the individual steps required to use time series data for predictive modeling are more numerous and more complex than those used for modeling with traditional multivariate data. Even after complexities attributable to the design phase (addressed in our first paper) have been accounted for, the management and manipulation of the time series elements (the preprocessing steps in particular) are issues not present in a traditional multivariate modeling paradigm. In our methods, we present the issues that arise from the time series data elements: defining a reference time; imputing and reducing time series data in order to conform to a predefined structure specified during the design phase; and normalizing variable families rather than individual variable instances.

The final manuscript, entitled "Using Time-Series Analysis to Predict Cardiac Arrest in a Pediatric Intensive Care Unit," presents the results obtained by applying the theoretical construct and its associated methods (detailed in the first two papers) to the case of cardiac arrest prediction in a pediatric intensive care unit. Our results showed that utilizing the trend analysis from the time series data elements reduced the number of classification errors by 73%, and the area under the Receiver Operating Characteristic curve increased from a baseline of 87% to 98% with the inclusion of the trend analysis. In addition to the performance measures, we were also able to demonstrate that adding raw time series data elements without their associated trend analyses improved classification accuracy compared to the baseline multivariate model, but diminished classification accuracy compared to adding the trend analysis features alone (i.e., without the raw time series data elements). We believe this phenomenon was largely attributable to overfitting, which is known to increase as the ratio of candidate features to class examples rises. Furthermore, although we employed several feature reduction strategies to counteract the overfitting problem, they failed to improve performance beyond that achieved by excluding the raw time series elements. Finally, our data demonstrated that pulse oximetry and systolic blood pressure readings tend to start diminishing about 10–20 minutes before an arrest, whereas heart rates tend to diminish rapidly less than 5 minutes before an arrest.
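The notion of a trend-analysis latent feature can be made concrete with a brief sketch: a vital-sign window is summarized by a least-squares slope and used alongside the traditional point-in-time value. The readings, window length, and resolution below are simulated assumptions, not data or methods from the study.

```python
import numpy as np

def trend_slope(values: np.ndarray, minutes_per_sample: float = 1.0) -> float:
    """Least-squares slope (units per minute) over the observation window."""
    t = np.arange(len(values)) * minutes_per_sample
    slope, _intercept = np.polyfit(t, values, deg=1)
    return float(slope)

# Simulated 15-minute window of systolic blood pressure readings (mmHg).
sbp_window = np.array([112, 110, 109, 108, 105, 104, 102, 101,
                       99, 98, 96, 95, 93, 92, 90], dtype=float)

features = {
    "sbp_latest": sbp_window[-1],                  # traditional point-in-time value
    "sbp_trend_per_min": trend_slope(sbp_window),  # time series latent feature
}
print(features)
```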