932 results for Qualitative data analysis software
Abstract:
The evaluation's overarching question was "Did the activities undertaken through the state's LSTA plan achieve results related to priorities identified in the Act?" The evaluation was conducted and is organized according to the six LSTA priorities. The research design employed two major methodologies:

1. Data sources from Iowa Library Services / State Library of Iowa as well as U.S. and state sources were identified for quantitative analysis. These sources, which primarily reflect outputs for various projects, included:
- Statistics from the Public Library Annual Survey
- Statistics collected internally by Iowa Library Services, such as the number of libraries subscribing to sponsored databases, number of database searches, attendance at continuing education events, and number of interlibrary loan transactions
- Evaluation surveys from library training sessions, professional development workshops, and other programs supported by LSTA funds
- Internal databases maintained by Iowa Library Services
- Impact results from post-training evaluations conducted by Iowa Library Services
- 2010 Iowa census data from the U.S. Census Bureau
- LSTA State Program Reports for the grant period

2. Following the quantitative analysis, the evaluator gathered qualitative data through interviews with key employees, a telephone focus group with district library consultants, and two surveys: LSTA Evaluation Survey (Public Libraries) and LSTA Evaluation Survey (Academic Libraries). Both surveys provided sound samples, with 43 representatives of Iowa's 77 academic libraries and 371 representatives of Iowa's 544 public libraries participating. Respondents represented libraries of all sizes and geographical areas. Both surveys included multiple-choice and rating-scale items as well as open-ended questions, from which results were coded to identify trends, issues and recommendations.
Abstract:
The use of synthetic combinatorial peptide libraries in positional scanning format (PS-SCL) has emerged recently as an alternative approach for the identification of peptides recognized by T lymphocytes. The choice of both the PS-SCL used for screening experiments and the method used for data analysis are crucial for implementing this approach. With this aim, we tested the recognition of different PS-SCL by a tyrosinase 368-376-specific CTL clone and analyzed the data obtained with a recently developed biometric data analysis based on a model of independent and additive contribution of individual amino acids to peptide antigen recognition. Mixtures defined with amino acids present at the corresponding positions in the native sequence were among the most active for all of the libraries. Somewhat surprisingly, a higher number of native amino acids were identifiable by using amidated COOH-terminal rather than free COOH-terminal PS-SCL. Also, our data clearly indicate that when using PS-SCL longer than optimal, frame shifts occur frequently and should be taken into account. Biometric analysis of the data obtained with the amidated COOH-terminal nonapeptide library allowed the identification of the native ligand as the sequence with the highest score in a public human protein database. However, the adequacy of the PS-SCL data for the identification of the peptide ligand varied depending on the PS-SCL used. Altogether these results provide insight into the potential of PS-SCL for the identification of CTL-defined tumor-derived antigenic sequences and may significantly improve our ability to interpret the results of these analyses.
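The biometric model described above assumes that each peptide position contributes independently and additively to recognition, so a candidate sequence can be scored by summing per-position mixture activities. A minimal sketch of that scoring scheme follows; the function names and the activity values are hypothetical illustrations, not data from the study:

    from typing import Dict, List

    def score_peptide(peptide: str, activity: List[Dict[str, float]]) -> float:
        """Score = sum of per-position mixture activities (additive model)."""
        if len(peptide) != len(activity):
            raise ValueError("peptide length must match the library length")
        return sum(pos.get(aa, 0.0) for pos, aa in zip(activity, peptide))

    def rank_candidates(peptides: List[str], activity: List[Dict[str, float]]) -> List[str]:
        """Rank database peptides (e.g. all 9-mers) by additive score, highest first."""
        return sorted(peptides, key=lambda p: score_peptide(p, activity), reverse=True)

    # Toy example: a 3-position "library" over a reduced amino-acid alphabet
    activity = [{"A": 1.2, "Y": 0.1}, {"A": 0.2, "Y": 2.0}, {"A": 0.5, "Y": 0.4}]
    print(rank_candidates(["AYA", "AAA", "YYY"], activity))

Applied to every peptide of library length in a protein database, the top-ranked sequence is the predicted ligand, which is how the native tyrosinase peptide was recovered in the study.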
Abstract:
Quantitative information from magnetic resonance imaging (MRI) may substantiate clinical findings and provide additional insight into the mechanism of clinical interventions in therapeutic stroke trials. The PERFORM study is exploring the efficacy of terutroban versus aspirin for secondary prevention in patients with a history of ischemic stroke. We report on the design of an exploratory longitudinal MRI follow-up study that was performed in a subgroup of the PERFORM trial. An international multi-centre longitudinal follow-up MRI study was designed for different MR systems employing safety and efficacy readouts: new T2 lesions, new DWI lesions, whole brain volume change, hippocampal volume change, changes in tissue microstructure as depicted by mean diffusivity and fractional anisotropy, vessel patency on MR angiography, and the presence and development of new microbleeds. A total of 1,056 patients (men and women ≥ 55 years) were included. The data analysis included 3D reformation, image registration of different contrasts, tissue segmentation, and automated lesion detection. This large international multi-centre study demonstrates how new MRI readouts can be used to provide key information on the evolution of cerebral tissue lesions and of the macrovasculature after atherothrombotic stroke in a large sample of patients.
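At its simplest, the "new lesion" readout reduces to comparing co-registered baseline and follow-up lesion masks. A minimal sketch under that assumption (the study's registration and segmentation steps are taken as given, and the masks below are synthetic placeholders):

    import numpy as np
    from scipy import ndimage

    def count_new_lesions(mask_baseline: np.ndarray, mask_followup: np.ndarray) -> int:
        """Count connected lesion components present at follow-up but not at baseline."""
        new_voxels = np.logical_and(mask_followup, np.logical_not(mask_baseline))
        _, n_components = ndimage.label(new_voxels)
        return n_components

    # Toy 3-D masks standing in for registered T2 lesion segmentations
    base = np.zeros((32, 32, 32), dtype=bool)
    follow = base.copy()
    follow[5:8, 5:8, 5:8] = True        # one new lesion appears at follow-up
    print(count_new_lesions(base, follow))  # -> 1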
Abstract:
A large percentage of bridges in the state of Iowa are classified as structurally or functionally deficient. These bridges annually compete for a share of Iowa's limited transportation budget. To avoid an increase in the number of deficient bridges, the state of Iowa decided to implement a comprehensive Bridge Management System (BMS) and selected the Pontis BMS software as a bridge management tool. This program will be used to provide a selection of maintenance, repair, and replacement strategies for the bridge networks to achieve an efficient and possibly optimal allocation of resources. The Pontis BMS software uses a new rating system to evaluate extensive and detailed inspection data gathered for all bridge elements. To manually collect these data would be a highly time-consuming job. The objective of this work was to develop an automated, computerized methodology for an integrated database that includes the rating conditions as defined in the Pontis program. Several of the available techniques that can be used to capture inspection data were reviewed, and the most suitable method was selected. To accomplish the objectives of this work, two user-friendly programs were developed. One program is used in the field to collect inspection data following a step-by-step procedure without the need to refer to the Pontis user's manuals. The other program is used in the office to read the inspection data and prepare input files for the Pontis BMS software. These two programs require users to have very limited knowledge of computers. On-line help screens as well as options for preparing, viewing, and printing inspection reports are also available. The developed data collection software will improve and expedite the process of conducting bridge inspections and preparing the required input files for the Pontis program. In addition, it will eliminate the need for large storage areas and will simplify retrieval of inspection data. Furthermore, the approach developed herein will facilitate transferring these captured data electronically between offices within the Iowa DOT and across the state.
Abstract:
Introduction: ICM+ software encapsulates our 20 years' experience in brain monitoring. It collects data from a variety of bedside monitors and produces time trends of parameters defined using configurable mathematical formulae. To date it is being used in nearly 40 clinical research centres worldwide. We present its application for continuous monitoring of cerebral autoregulation using near-infrared spectroscopy (NIRS).

Methods: Data from multiple bedside monitors are processed by ICM+ in real time using a large selection of signal processing methods. These include various time and frequency domain analysis functions as well as fully customisable digital filters. The final results are displayed in a variety of ways, including simple time trends, as well as time window based histograms, cross histograms, correlations, and so forth. All this allows complex information from bedside monitors to be summarized in a concise fashion and presented to medical and nursing staff in a simple way that alerts them to the development of various pathological processes.

Results: One hundred and fifty patients monitored continuously with NIRS, arterial blood pressure (ABP) and intracranial pressure (ICP), where available, were included in this study. There were 40 severely head-injured adult patients and 27 SAH patients (NCCU, Cambridge); 60 patients undergoing cardiopulmonary bypass (Johns Hopkins Hospital, Baltimore); and 23 patients with sepsis (University Hospital, Basel). In addition, MCA flow velocity (FV) was monitored intermittently using transcranial Doppler. FV-derived and ICP-derived pressure reactivity indices (PRx, Mx), as well as NIRS-derived reactivity indices (Cox, Tox, Thx), were calculated and showed significant correlation with each other in all cohorts. Errorbar charts showing reactivity index PRx versus CPP (optimal CPP chart) as well as similar curves for NIRS indices versus CPP and ABP were also demonstrated.

Conclusions: ICM+ software is proving to be a very useful tool for enhancing the battery of available means for monitoring cerebral vasoreactivity and potentially facilitating autoregulation-guided therapy. Complexity of data analysis is also hidden inside loadable profiles, thus allowing investigators to take full advantage of validated protocols including advanced processing formulas.
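PRx, one of the reactivity indices mentioned, is commonly computed in the literature as a moving correlation between slow-wave averages of ABP and ICP. A minimal sketch of that calculation follows; the sampling rate, window sizes, and placeholder signals are typical published choices and assumptions, not necessarily the ICM+ defaults:

    import numpy as np

    def block_means(x: np.ndarray, block: int) -> np.ndarray:
        """Average a signal over non-overlapping blocks (e.g. 10-second means)."""
        n = len(x) // block
        return x[: n * block].reshape(n, block).mean(axis=1)

    def moving_correlation(x: np.ndarray, y: np.ndarray, window: int) -> np.ndarray:
        """Pearson correlation of x and y over a sliding window."""
        out = np.full(len(x), np.nan)
        for i in range(window, len(x) + 1):
            out[i - 1] = np.corrcoef(x[i - window : i], y[i - window : i])[0, 1]
        return out

    fs = 100                                  # assumed sampling rate, Hz
    abp = np.random.randn(fs * 3600)          # placeholder ABP waveform (1 hour)
    icp = np.random.randn(fs * 3600)          # placeholder ICP waveform
    abp10 = block_means(abp, 10 * fs)         # 10-second means isolate slow waves
    icp10 = block_means(icp, 10 * fs)
    prx = moving_correlation(abp10, icp10, window=30)  # ~5-minute window

The NIRS-derived indices (Cox, Tox, Thx) follow the same moving-correlation pattern with different input signals, which is why a configurable engine like ICM+ can compute the whole family from one framework.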
Abstract:
This study aimed to investigate the impact of a communication skills training (CST) in oncology on clinicians' linguistic strategies. Verbal communication analysis software (Logiciel d'Analyse de la Communication Verbale) was used to compare simulated patient interviews with oncology clinicians who participated in CST (N = 57) (pre/post, with a 6-month interval) with a control group of oncology clinicians who did not (N = 56) (T1/T2, with a 6-month interval). A significant improvement of linguistic strategies related to biomedical, psychological and social issues was observed. Analysis of linguistic aspects of videotaped interviews might in the future become part of individualised feedback in CST and be utilised as a marker for evaluating training.
Abstract:
After a rockfall event, a usual post-event survey includes qualitative volume estimation, trajectory mapping and determination of departing zones. However, quantitative measurements are not usually made. Additional relevant quantitative information could be useful in determining the spatial occurrence of rockfall events and help quantify their size. Seismic measurements could be suitable for detection purposes since they are non-invasive methods and are relatively inexpensive. Moreover, seismic techniques could provide important information on rockfall size and location of impacts. On 14 February 2007 the Avalanche Group of the University of Barcelona obtained the seismic data generated by an artificially triggered rockfall event at the Montserrat massif (near Barcelona, Spain) carried out in order to purge a slope. Two 3-component seismic stations were deployed in the area about 200 m from the explosion point that triggered the rockfall. Seismic signals and video images were simultaneously obtained. The initial volume of the rockfall was estimated to be 75 m³ by laser scanner data analysis. After the explosion, dozens of boulders ranging from 10⁻⁴ to 5 m³ in volume impacted on the ground at different locations. The blocks fell onto a terrace, 120 m below the release zone. The impact generated a small continuous mass movement composed of a mixture of rocks, sand and dust that ran down the slope and impacted on the road 60 m below. Time, time-frequency evolution and particle motion analysis of the seismic records and seismic energy estimation were performed. The results are as follows: (1) a rockfall event generates seismic signals with specific characteristics in the time domain; (2) the seismic signals generated by the mass movement show a time-frequency evolution different from that of other seismogenic sources (e.g. earthquakes, explosions or a single rock impact), a feature that could be used for detection purposes; (3) particle motion plot analysis shows that the procedure to locate the rock impact using two stations is feasible; (4) the feasibility and validity of seismic methods for the detection of rockfall events, their localization and size determination are confirmed.
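The time-frequency evolution used to discriminate rockfall signals from other seismogenic sources can be examined with a standard spectrogram. A sketch on a synthetic stand-in trace follows; the sampling rate and the decaying-sinusoid impact waveform are assumptions for illustration, not the study's data:

    import numpy as np
    from scipy import signal

    fs = 100.0                                  # assumed sampling rate, Hz
    t = np.arange(0, 60, 1 / fs)
    trace = 0.01 * np.random.randn(t.size)      # background noise
    for impact_time in (10.0, 14.0, 15.5, 22.0):   # synthetic block impacts
        i0, n = int(impact_time * fs), 200
        trace[i0 : i0 + n] += np.exp(-np.arange(n) / 40.0) * np.sin(
            2 * np.pi * 15.0 * np.arange(n) / fs
        )

    # Short-time spectral estimate: rows = frequency bins, columns = time bins.
    # A sequence of impacts shows up as repeated broadband bursts, unlike the
    # single transient of one rock impact or the dispersed arrivals of an earthquake.
    freqs, times, Sxx = signal.spectrogram(trace, fs=fs, nperseg=256, noverlap=192)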
Abstract:
The present study proposes a modification in one of the most frequently applied effect size procedures in single-case data analysis: the percent of nonoverlapping data. In contrast to other techniques, the calculus and interpretation of this procedure is straightforward and it can be easily complemented by visual inspection of the graphed data. Although the percent of nonoverlapping data has been found to perform reasonably well in N = 1 data, the magnitude of effect estimates it yields can be distorted by trend and autocorrelation. Therefore, the data correction procedure focuses on removing the baseline trend from the data prior to estimating the change produced in the behavior due to intervention. A simulation study was carried out in order to compare the original and the modified procedures in several experimental conditions. The results suggest that the new proposal is unaffected by trend and autocorrelation and can be used in cases of unstable baselines and sequentially related measurements.
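A minimal sketch of the percent of nonoverlapping data (PND), with an optional baseline-detrending step along the lines the abstract proposes, is given below; the exact correction used in the study may differ, and the data are illustrative:

    import numpy as np

    def pnd(baseline, treatment, increase_expected=True, remove_trend=False):
        """Percent of treatment points beyond the most extreme baseline point."""
        baseline = np.asarray(baseline, dtype=float)
        treatment = np.asarray(treatment, dtype=float)
        if remove_trend:
            # Fit a straight line to the baseline and subtract its
            # extrapolation from both phases before computing overlap.
            t_base = np.arange(len(baseline))
            slope, intercept = np.polyfit(t_base, baseline, 1)
            t_treat = np.arange(len(baseline), len(baseline) + len(treatment))
            baseline = baseline - (slope * t_base + intercept)
            treatment = treatment - (slope * t_treat + intercept)
        if increase_expected:
            return 100.0 * np.mean(treatment > baseline.max())
        return 100.0 * np.mean(treatment < baseline.min())

    series_a, series_b = [3, 4, 5, 6], [6, 7, 8, 9]
    print(pnd(series_a, series_b))                    # 75.0: overlap looks small
    print(pnd(series_a, series_b, remove_trend=True)) # 0.0: the "effect" was baseline trend

The worked example shows exactly the distortion the abstract targets: a rising baseline inflates uncorrected PND, while the detrended version attributes the change to trend rather than intervention.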
Abstract:
The present study focuses on single-case data analysis, specifically on two procedures for quantifying differences between baseline and treatment measurements. The first technique tested is based on generalized least squares regression analysis and is compared to a proposed non-regression technique, which yields similar information. The comparison is carried out in the context of generated data representing a variety of patterns (i.e., independent measurements, different serial dependence underlying processes, constant or phase-specific autocorrelation and data variability, different types of trend, and slope and level change). The results suggest that the two techniques perform adequately for a wide range of conditions and researchers can use both of them with reasonable confidence. The regression-based procedure offers more efficient estimates, whereas the proposed non-regression procedure is more sensitive to intervention effects. Considering current and previous findings, some tentative recommendations are offered to applied researchers in order to help them choose among the plurality of single-case data analysis techniques.
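A generalized least squares analysis of a two-phase (AB) single-case series can be sketched with statsmodels, using time, phase, and phase-specific time predictors to estimate trend, level change, and slope change under AR(1) errors. The design matrix below is one common parameterization and the data are simulated; neither is necessarily the study's own setup:

    import numpy as np
    import statsmodels.api as sm

    n_a, n_b = 10, 10                                 # baseline and treatment lengths
    time = np.arange(n_a + n_b)                       # overall trend
    phase = np.r_[np.zeros(n_a), np.ones(n_b)]        # level change indicator
    time_b = np.r_[np.zeros(n_a), np.arange(n_b)]     # slope change in phase B
    X = sm.add_constant(np.column_stack([time, phase, time_b]))

    rng = np.random.default_rng(0)                    # simulate an AB series
    y = 2.0 + 0.1 * time + 3.0 * phase + 0.2 * time_b + rng.normal(0, 0.5, n_a + n_b)

    model = sm.GLSAR(y, X, rho=1)                     # GLS with AR(1) errors
    results = model.iterative_fit(maxiter=10)         # alternate rho and beta estimation
    print(results.params)                             # intercept, trend, level, slope change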
Abstract:
The focus of my PhD research was the concept of modularity. In the last 15 years, modularity has become a classic term in different fields of biology. On the conceptual level, a module is a set of interacting elements that remain mostly independent from the elements outside of the module. I used modular analysis techniques to study gene expression evolution in vertebrates. In particular, I identified "natural" modules of gene expression in mouse and human, and I showed that expression of organ-specific and system-specific genes tends to be conserved between such distant vertebrates as mammals and fishes. Also with a modular approach, I studied patterns of developmental constraints on transcriptome evolution. I showed that neither of the two commonly accepted models of the evolution of embryonic development ("evo-devo") is exclusively valid. In particular, I found that the conservation of the sequences of regulatory regions is highest during mid-development of zebrafish, which supports the "hourglass model". In contrast, events of gene duplication and new gene introduction are rarest in early development, which supports the "early conservation model". In addition to the biological insights on transcriptome evolution, I have also discussed in detail the advantages of modular approaches in large-scale data analysis. Moreover, I re-analyzed several studies (published in high-ranking journals) and showed that their conclusions do not hold up under detailed analysis. This demonstrates that complex analysis of high-throughput data requires co-operation between biologists, bioinformaticians, and statisticians.
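The abstract does not name a specific algorithm for identifying expression modules; one common realization of the definition above (elements correlated within a module, weakly coupled outside it) is to cluster genes by co-expression, for example with average-linkage clustering on a correlation distance. A hypothetical sketch with synthetic data:

    import numpy as np
    from scipy.cluster.hierarchy import fcluster, linkage
    from scipy.spatial.distance import squareform

    rng = np.random.default_rng(1)
    expression = rng.normal(size=(100, 20))   # genes x conditions (synthetic stand-in)

    corr = np.corrcoef(expression)            # gene-gene co-expression matrix
    dist = 1.0 - corr                         # correlation distance
    np.fill_diagonal(dist, 0.0)               # guard against floating-point residue
    Z = linkage(squareform(dist, checks=False), method="average")
    modules = fcluster(Z, t=0.9, criterion="distance")   # module label per gene
    print(np.bincount(modules)[1:])           # module sizes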
Abstract:
This report is divided into two volumes. This volume (Volume I) summarizes a structural health monitoring (SHM) system that was developed for the Iowa DOT to remotely and continuously monitor fatigue-critical bridges (FCB) to aid in the detection of crack formation. The developed FCB SHM system enables bridge owners to remotely monitor FCB for gradual or sudden damage formation. The SHM system utilizes fiber Bragg grating (FBG) fiber optic sensors (FOSs) to measure strains at critical locations. The strain-based SHM system is trained with measured performance data to identify typical bridge response when subjected to ambient traffic loads, and that knowledge is used to evaluate newly collected data. At specified intervals, the SHM system autonomously generates evaluation reports that summarize the current behavior of the bridge. The evaluation reports are collected and distributed to the bridge owner for interpretation and decision making. Volume II summarizes the development and demonstration of an autonomous, continuous SHM system that can be used to monitor typical girder bridges. The developed SHM system can be grouped into two main categories: an office component and a field component. The office component is a structural analysis software program that can be used to generate thresholds which are used for identifying isolated events. The field component includes hardware and field monitoring software which performs data processing and evaluation. The hardware system consists of sensors, data acquisition equipment, and a communication system backbone. The field monitoring software has been developed such that, once started, it will operate autonomously with minimal user interaction. In general, the SHM system features two key uses. First, the system can be integrated into an active bridge management system that tracks usage and structural changes. Second, the system helps owners to identify damage and deterioration.
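Reduced to its simplest form, the field component's evaluation step flags strain excursions that exceed thresholds produced by the office structural analysis. A minimal sketch of such isolated-event detection follows; the threshold value and the strain record are placeholders, not measurements from the monitored bridges:

    import numpy as np

    def detect_events(strain: np.ndarray, threshold: float):
        """Return (start, stop) sample index pairs where |strain| exceeds the threshold."""
        exceed = np.r_[False, np.abs(strain) > threshold, False]
        starts = np.flatnonzero(~exceed[:-1] & exceed[1:])
        stops = np.flatnonzero(exceed[:-1] & ~exceed[1:])
        return list(zip(starts, stops))

    strain = np.random.default_rng(2).normal(0, 5, 10000)  # placeholder microstrain record
    strain[4000:4050] += 60                                 # injected isolated event
    print(detect_events(strain, threshold=40.0))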
Abstract:
This research was conducted in the context of the project IRIS 8A Health and Society (2002-2008) and financially supported by the University of Lausanne. It was aimed at developing a model based on elderly people's experience, and it allowed us to develop a "Portrait evaluation" of fear of falling using their examples and words. It is a very simple evaluation, which can be used by professionals but also by elderly people themselves. The "Portrait evaluation" and the user's guide are freely available, but we would very much appreciate knowing whether other people or scientists have used it, and we welcome their comments (contact: Chantal.Piot-Ziegler@unil.ch).

The purpose of this study is to create a model grounded in elderly people's experience allowing the development of an original instrument to evaluate FOF. In a previous study, 58 semi-structured interviews were conducted with community-dwelling elderly people. The qualitative thematic analysis showed that fear of falling was defined through the functional, social and psychological long-term consequences of falls (Piot-Ziegler et al., 2007). In order to reveal patterns in the expression of fear of falling, an original qualitative thematic pattern analysis (QUAlitative Pattern Analysis - QUAPA) was developed and applied to these interviews. The results of this analysis show an internal coherence across the three dimensions (functional, social and psychological). Four different patterns were found, corresponding to four degrees of fear of falling. They are formalized in a fear-of-falling intensity model. This model leads to a portrait evaluation for fallers and non-fallers. The evaluation must still be tested on large samples of elderly people living in different environments. It presents an original alternative to the concept of self-efficacy for evaluating fear of falling in older people. The model of FOF presented in this article is grounded in elderly people's experience. It gives an experiential description of the three dimensions constitutive of FOF and of their evolution as fear increases, and defines an evaluation tool using situations and wordings based on elderly people's discourse.
Abstract:
Geophysical techniques can help to bridge the inherent gap, with regard to spatial resolution and range of coverage, that plagues classical hydrological methods. This has led to the emergence of the new and rapidly growing field of hydrogeophysics. Given the differing sensitivities of various geophysical techniques to hydrologically relevant parameters, and their inherent trade-off between resolution and range, the fundamental usefulness of multi-method hydrogeophysical surveys for reducing uncertainties in data analysis and interpretation is widely accepted. A major challenge arising from such endeavors is the quantitative integration of the resulting vast and diverse database in order to obtain a unified model of the probed subsurface region that is internally consistent with all available data. To address this problem, we have developed a strategy for hydrogeophysical data integration based on Monte-Carlo-type conditional stochastic simulation that we consider particularly suitable for local-scale studies characterized by high-resolution and high-quality datasets. Monte-Carlo-based optimization techniques are flexible and versatile, allow a wide variety of data and constraints of differing resolution and hardness to be accounted for, and thus have the potential of providing, in a geostatistical sense, highly detailed and realistic models of the pertinent target parameter distributions. Compared to more conventional approaches of this kind, our approach provides significant advancements in the way that the larger-scale deterministic information resolved by the hydrogeophysical data can be accounted for, which represents an inherently problematic, and as yet unresolved, aspect of Monte-Carlo-type conditional simulation techniques. We present the results of applying our algorithm to the integration of porosity log and tomographic crosshole georadar data to generate stochastic realizations of the local-scale porosity structure. Our procedure is first tested on pertinent synthetic data and then applied to corresponding field data collected at the Boise Hydrogeophysical Research Site near Boise, Idaho, USA.
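As a toy illustration of Monte-Carlo-type conditional simulation, the 1-D sketch below perturbs a realization by swapping cell values (which preserves the histogram), never touches the conditioning porosity-log cells, and anneals toward an assumed variogram target. All numbers are illustrative placeholders, not Boise site data, and the actual algorithm in the study is considerably more elaborate:

    import numpy as np

    rng = np.random.default_rng(42)
    n = 200                                     # 1-D grid, for illustration only
    pool = rng.normal(0.25, 0.03, size=n)       # target porosity histogram
    model = rng.permutation(pool)               # initial realization
    hard = {10: 0.21, 50: 0.30, 120: 0.27}      # conditioning porosity-log data
    for cell, value in hard.items():
        model[cell] = value                     # hard data are honored throughout
    free = np.array([i for i in range(n) if i not in hard])

    def objective(m, lag=1, gamma_target=5e-4):
        """Mismatch of the lag-1 semivariogram against an assumed target value."""
        gamma = 0.5 * np.mean((m[lag:] - m[:-lag]) ** 2)
        return (gamma - gamma_target) ** 2

    temperature = 1e-6
    for _ in range(20000):
        i, j = rng.choice(free, size=2, replace=False)
        trial = model.copy()
        trial[i], trial[j] = trial[j], trial[i]     # swap preserves the histogram
        delta = objective(trial) - objective(model)
        if delta < 0 or rng.random() < np.exp(-delta / temperature):
            model = trial                           # accept improving (and some worsening) moves
        temperature *= 0.9995                       # cooling schedule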
Abstract:
The early-age thermal development of structural mass concrete elements has a significant impact on the future durability and longevity of the elements. If the heat of hydration is not controlled, the elements may be susceptible to thermal cracking and damage from delayed ettringite formation. In the Phase I study, the research team reviewed published literature and current specifications on mass concrete. In addition, the team observed construction and reviewed thermal data from the westbound (WB) I-80 Missouri River Bridge. Finally, the researchers conducted an initial investigation of the thermal analysis software programs ConcreteWorks and 4C-Temp&Stress. The Phase II study is aimed at developing guidelines for the design and construction of mass concrete placements associated with large bridge foundations. This phase included an additional review of published literature and a more in-depth investigation of current mass concrete specifications. In addition, the mass concrete construction of two bridges, the WB I-80 Missouri River Bridge and the US 34 Missouri River Bridge, was documented. An investigation was conducted of the theory and application of 4C-Temp&Stress. ConcreteWorks and 4C-Temp&Stress were calibrated with thermal data recorded for the WB I-80 Missouri River Bridge and the US 34 Missouri River Bridge. ConcreteWorks and 4C-Temp&Stress were further verified by means of a sensitivity study. Finally, conclusions and recommendations were developed, as included in this report.
Abstract:
The goal of this work was to move structural health monitoring (SHM) one step closer to being ready for mainstream use by the Iowa Department of Transportation (DOT) Office of Bridges and Structures. To meet this goal, the objective of this project was to implement a pilot multi-sensor continuous monitoring system on the Iowa Falls Arch Bridge such that autonomous data analysis, storage, and retrieval can be demonstrated. The challenge with this work was to develop open channels for communication, coordination, and cooperation among the various Iowa DOT offices that could make use of the data. In a way, the end product was to be something akin to a control system that would allow for real-time evaluation of the operational condition of a monitored bridge. Development and finalization of general hardware and software components for a bridge SHM system were investigated and completed. This development and finalization was framed around the demonstration installation on the Iowa Falls Arch Bridge. The hardware system focused on using off-the-shelf sensors that could be read in either "fast" or "slow" modes depending on the desired monitoring metric. As hoped, the installed system operated with very few problems. In terms of communications (in part due to the anticipated installation on the I-74 bridge over the Mississippi River), a hardline digital subscriber line (DSL) internet connection and grid power were used. During operation, this system would transmit data to a central server location where the data would be processed and then archived for future retrieval and use. The pilot monitoring system was developed for general performance evaluation purposes (construction, structural, environmental, etc.) such that it could be easily adapted to the Iowa DOT's bridges and other monitoring needs. The system was developed allowing easy access to near real-time data in a format usable to Iowa DOT engineers.