946 results for Research Data Management


Relevance:

100.00%

Publisher:

Abstract:

Problems due to a lack of data standardization and data management have led to work inefficiencies for the staff working with vision data for the Lifetime Surveillance of Astronaut Health. Data have been collected over 50 years in a variety of ways and then entered into software. A lack of communication between the electronic health record (EHR) form designer, epidemiologists, and optometrists has led to some confusion about the capabilities of the EHR system and how its forms can be designed to meet the needs of all relevant parties. EHR form customizations or redesigns were found to be critical for using NASA's EHR system in the way most beneficial to its patients, optometrists, and epidemiologists. To implement a protocol, the data being collected were examined to identify differences in data collection methods. Changes were implemented through the establishment of a process improvement team (PIT). Based on the findings of the PIT, suggestions have been made to improve the current EHR system. If implemented correctly, these suggestions will not only improve the efficiency of the staff at NASA and its contractors but also set guidelines for changes to other forms, such as the vision exam forms. Because NASA is at the forefront of such research and health surveillance, this management change could markedly improve the collection and adaptability of EHR data. Accurate data collection from this 50+ year study is ongoing and will help current and future generations understand the implications of space flight on human health. It is imperative that this vast amount of information be documented correctly.

Relevance:

100.00%

Publisher:

Abstract:

Clinical Research Data Quality Literature Review and Pooled Analysis
We present a literature review and secondary analysis of data accuracy in clinical research and related secondary data uses. A total of 93 papers meeting our inclusion criteria were categorized according to their data processing methods. Quantitative data accuracy information was abstracted from the articles and pooled. Our analysis demonstrates that the accuracy associated with data processing methods varies widely, with error rates ranging from 2 to 5,019 errors per 10,000 fields. Medical record abstraction was associated with the highest error rates (70–5,019 errors per 10,000 fields). Data entered and processed at healthcare facilities had error rates comparable to data processed at central data processing centers. Error rates for data processed with single entry in the presence of on-screen checks were comparable to double-entered data. While data processing and cleaning methods may explain a significant amount of the variability in data accuracy, additional factors not resolvable here likely exist.

Defining Data Quality for Clinical Research: A Concept Analysis
Despite notable previous attempts by experts to define data quality, the concept remains ambiguous and subject to the vagaries of natural language. This lack of clarity continues to hamper research related to data quality issues. We present a formal concept analysis of data quality, which builds on and synthesizes previously published work. We further posit that discipline-level specificity may be required to achieve the desired definitional clarity. To this end, we combine work from the clinical research domain with findings from the general data quality literature to produce a discipline-specific definition and operationalization of data quality in clinical research. While the results are helpful to clinical research, the methodology of concept analysis may be useful in other fields to clarify data quality attributes and to achieve operational definitions.

Medical Record Abstractors' Perceptions of Factors Impacting the Accuracy of Abstracted Data
Medical record abstraction (MRA) is known to be a significant source of data errors in secondary data uses. Factors impacting the accuracy of abstracted data are not reported consistently in the literature. Two Delphi processes were conducted with experienced medical record abstractors to assess their perceptions of these factors. The Delphi process identified 9 factors not found in the literature and differed from the literature on 5 factors in the top 25%. The Delphi results refuted seven factors reported in the literature as impacting the quality of abstracted data. The results provide insight into, and indicate content validity of, a significant number of the factors reported in the literature. Further, the results indicate general consistency between the perceptions of clinical research medical record abstractors and those of registry and quality improvement abstractors.

Distributed Cognition Artifacts on Clinical Research Data Collection Forms
Medical record abstraction, a primary mode of data collection in secondary data use, is associated with high error rates. Distributed cognition in medical record abstraction has not been studied as a possible explanation for abstraction errors. We employed the theory of distributed representation and representational analysis to systematically evaluate cognitive demands in medical record abstraction and the extent of external cognitive support employed in a sample of clinical research data collection forms. We show that the cognitive load required for abstraction was high for 61% of the sampled data elements, and exceedingly so for 9%. Further, the data collection forms did not support external cognition for the most complex data elements. High working memory demands are a possible explanation for the association of data errors with data elements requiring abstractor interpretation, comparison, mapping, or calculation. The representational analysis used here can be applied to identify data elements with high cognitive demands.
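The pooled error rates in the review are simple ratios of observed errors to fields inspected, scaled to errors per 10,000 fields. A minimal sketch of how such rates pool across studies, using entirely made-up error and field counts:

```python
# Pool per-study error counts into a single rate per 10,000 fields.
# The study counts below are invented for illustration only.
def pooled_error_rate(studies):
    """studies: list of (errors_found, fields_inspected) pairs."""
    total_errors = sum(errors for errors, fields in studies)
    total_fields = sum(fields for errors, fields in studies)
    return 10000 * total_errors / total_fields

studies = [(12, 30000), (250, 80000), (40, 10000)]
rate = pooled_error_rate(studies)
print(round(rate, 1))  # → 25.2 errors per 10,000 fields
```

Pooling by summing counts (rather than averaging the per-study rates) weights each study by the number of fields it inspected, which is why a few large abstraction studies can dominate the pooled figure.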

Relevance:

100.00%

Publisher:

Abstract:

In this paper, the authors introduce a novel mechanism for data management in a middleware for smart home control, where a relational database and semantic ontology storage are used at the same time in a data warehouse. An annotation system has been designed to specify the storage format and location, register new ontology concepts and, most importantly, guarantee data consistency between the two storage methods. To ease the data persistence process, the Data Access Object (DAO) pattern is applied and optimized to strengthen the assurance of data consistency. This mechanism also simplifies the development of applications and their integration with BATMP. Finally, an application named "Parameter Monitoring Service" is given as an example to assess the feasibility of the system.
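The dual-storage consistency idea can be sketched as a DAO that treats the two stores as a single write path, rolling back the first write if the second fails. A minimal sketch; the class and method names are illustrative, not taken from the middleware itself, and dictionaries stand in for the real stores:

```python
# Illustrative DAO keeping a relational store and an ontology store in step.
# Dicts stand in for the two real backends.
class DualStoreDAO:
    def __init__(self):
        self.relational = {}   # stands in for the relational database
        self.ontology = {}     # stands in for the semantic ontology store

    def save(self, key, value):
        # Write to both stores; undo the first write if the second fails,
        # so neither store is ever left ahead of the other.
        self.relational[key] = value
        try:
            self.ontology[key] = value
        except Exception:
            del self.relational[key]
            raise

    def load(self, key):
        rel = self.relational.get(key)
        ont = self.ontology.get(key)
        assert rel == ont, "stores out of sync"
        return rel

dao = DualStoreDAO()
dao.save("livingroom.temperature", 21.5)
print(dao.load("livingroom.temperature"))  # 21.5
```

A real implementation would need distributed-transaction or compensation logic rather than an in-memory rollback, but the single-write-path shape is the essence of using a DAO to guarantee consistency across two storage methods.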

Relevance:

100.00%

Publisher:

Abstract:

Empirical Software Engineering (ESE) replication researchers need to store and manipulate experimental data for several purposes, in particular analysis and reporting. Current research needs also call for the sharing and preservation of experimental data. In a previous work, we analyzed Replication Data Management (RDM) needs. A novel concept, called the Empirical Ecosystem, was proposed to solve current deficiencies in RDM approaches. The Empirical Ecosystem provides replication researchers with a common framework that transparently integrates local heterogeneous data sources. A typical situation where the Empirical Ecosystem is applicable is when several members of a research group, or several research groups collaborating together, need to share and access each other's experimental results. However, to apply the Empirical Ecosystem concept and deliver all its promised benefits, it is necessary to analyze the software architectures and tools that can properly support it.
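A rough sketch of the kind of facade such an ecosystem implies: each local data source exposes a common query interface, and the framework aggregates over all registered sources. All class and method names below are hypothetical, invented for illustration:

```python
# Hypothetical facade unifying heterogeneous local data sources behind one
# interface, in the spirit of the Empirical Ecosystem concept described above.
class CsvSource:
    """Stands in for a source backed by local CSV files."""
    def __init__(self, rows):
        self.rows = rows
    def experiments(self):
        return list(self.rows)

class DbSource:
    """Stands in for a source backed by a local database."""
    def __init__(self, records):
        self.records = records
    def experiments(self):
        return [dict(r) for r in self.records]

class EmpiricalEcosystem:
    """Aggregates every registered source into one transparent view."""
    def __init__(self, sources):
        self.sources = sources
    def all_experiments(self):
        merged = []
        for source in self.sources:
            merged.extend(source.experiments())
        return merged

eco = EmpiricalEcosystem([
    CsvSource([{"id": "E1", "group": "A"}]),
    DbSource([{"id": "E2", "group": "B"}]),
])
print(len(eco.all_experiments()))  # 2
```

The design point is that researchers query the ecosystem object, never the individual sources, so new heterogeneous sources can be registered without changing analysis code.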

Relevance:

100.00%

Publisher:

Abstract:

Uses research in a major UK company on the introduction of an electronic document management system to explore perceptions of, and attitudes to, risk. Phenomenological methods were used, with subsequent dialogue transcripts evaluated in the Winmax dialogue software using an adapted theoretical framework based upon an analysis of the literature. The paper identifies a number of factors, and builds a framework, that should support a greater understanding of risk assessment and project management by the academic community and practitioners.

Relevance:

100.00%

Publisher:

Abstract:

Coarse-resolution thematic maps derived from remotely sensed data and implemented in GIS play an important role in coastal and marine conservation, research and management. Here, we describe an approach for fine-resolution mapping of land-cover types using aerial photography and ancillary GIS and ground data in a large (100 x 35 km) subtropical estuarine system (Moreton Bay, Queensland, Australia). We have developed and implemented a classification scheme representing 24 coastal (subtidal, intertidal, mangrove, supratidal and terrestrial) cover types relevant to the ecology of estuarine animals, nekton and shorebirds. The accuracy of classifications of the intertidal and subtidal cover types, as indicated by the agreement between the mapped (predicted) and reference (ground) data, was 77-88%, depending on the zone and the level of generalization required. The variability and spatial distribution of habitat mosaics (landscape types) across the mapped environment were assessed using K-means clustering and validated with Classification and Regression Tree models. Seven broad landscape types could be distinguished, and ways of incorporating information on landscape composition into site-specific conservation and field research are discussed. This research illustrates the importance and potential applications of fine-resolution mapping for the conservation and management of estuarine habitats and their terrestrial and aquatic wildlife. (c) 2005 Elsevier Ltd. All rights reserved.
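K-means clustering of habitat mosaics, as used above, groups sites by their land-cover composition vectors. A toy, stdlib-only sketch with invented two-dimensional composition data (the actual study clustered 24 cover types and validated the result with Classification and Regression Tree models):

```python
import random

# Toy K-means over habitat-composition vectors; sites and k are invented.
def kmeans(points, k, iters=20, seed=0):
    rng = random.Random(seed)
    centroids = rng.sample(points, k)  # initialize from the data
    for _ in range(iters):
        # Assign each point to its nearest centroid (squared Euclidean).
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(
                range(k),
                key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centroids[c])),
            )
            clusters[nearest].append(p)
        # Recompute centroids as cluster means; keep old centroid if empty.
        centroids = [
            tuple(sum(dim) / len(cl) for dim in zip(*cl)) if cl else centroids[i]
            for i, cl in enumerate(clusters)
        ]
    return centroids, clusters

# Sites described by (proportion seagrass, proportion mangrove):
sites = [(0.9, 0.1), (0.8, 0.2), (0.1, 0.9), (0.2, 0.8)]
centroids, clusters = kmeans(sites, k=2)
print(sorted(len(c) for c in clusters))  # [2, 2]
```

With real mapping data each site vector would hold the proportions of all cover types within a window around the site, and the number of clusters (here seven landscape types in the study) would be chosen and then validated against independent models.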

Relevance:

100.00%

Publisher:

Abstract:

Background: There is some evidence from a Cochrane review that rehabilitation following spinal surgery may be beneficial. Methods: We conducted a survey of current post-operative practice amongst spinal surgeons in the United Kingdom in 2002 to determine whether such interventions are being included routinely in the post-operative management of spinal patients. The survey included all surgeons who were members of either the British Association of Spinal Surgeons (BASS) or the Society for Back Pain Research. Data on the characteristics of each surgeon and his or her current pattern of practice and post-operative care were collected via a reply-paid postal questionnaire. Results: Usable responses were provided by 57% of the 89 surgeons included in the survey. Most surgeons (79%) had a routine post-operative management regime, but only 35% had a written set of instructions that they gave to their patients concerning this. Over half (55%) of surgeons did not send their patients for any physiotherapy after discharge, with an average of fewer than two sessions of treatment organised by those who referred for physiotherapy at all. Restrictions on lifting, sitting and driving showed considerable inconsistency both between surgeons and within the recommendations given by individual surgeons. Conclusion: Demonstrable inconsistencies within and between spinal surgeons in their approaches to post-operative management can be interpreted as evidence of continuing and significant uncertainty across the sub-speciality as to what constitutes best care in these areas of practice. Conducting further large, rigorous, randomised controlled trials would be the best method for obtaining definitive answers to these questions.

Relevance:

100.00%

Publisher:

Abstract:

The continuous plankton recorder (CPR) survey is the largest multi-decadal plankton monitoring programme in the world. It was initiated in 1931 and by the end of 2004 had counted 207,619 samples and identified 437 phyto- and zoo-plankton taxa throughout the North Atlantic. CPR data are used extensively by the research community and in recent years have been used increasingly to underpin marine management. Here, we take a critical look at how best to use CPR data. We first describe the CPR itself, CPR sampling, and plankton counting procedures. We discuss the spatial and temporal biases in the Survey, summarise environmental data that have not previously been available, and describe the new data access policy. We supply information essential to using CPR data, including descriptions of each CPR taxonomic entity, the idiosyncrasies associated with counting many of the taxa, the logic behind taxonomic changes in the Survey, the semi-quantitative nature of CPR sampling, and recommendations on choosing the spatial and temporal scale of study. This forms the basis for a broader discussion on how to use CPR data for deriving ecologically meaningful indices based on size, functional groups and biomass that can be used to support research and management. This contribution should be useful for plankton ecologists, modellers and policy makers who actively use CPR data. (c) 2005 Elsevier Ltd. All rights reserved.

Relevance:

100.00%

Publisher:

Abstract:

Data Envelopment Analysis (DEA) is a nonparametric method for measuring the efficiency of a set of decision making units such as firms or public sector agencies, first introduced into the operational research and management science literature by Charnes, Cooper, and Rhodes (CCR) [Charnes, A., Cooper, W.W., Rhodes, E., 1978. Measuring the efficiency of decision making units. European Journal of Operational Research 2, 429–444]. The original DEA models were applicable only to technologies characterized by positive inputs/outputs. In subsequent literature there have been various approaches to enable DEA to deal with negative data. In this paper, we propose a semi-oriented radial measure, which permits the presence of variables which can take both negative and positive values. The model is applied to data on a notional effluent processing system to compare the results with those yielded by two alternative methods for dealing with negative data in DEA: The modified slacks-based model suggested by Sharp et al. [Sharp, J.A., Liu, W.B., Meng, W., 2006. A modified slacks-based measure model for data envelopment analysis with ‘natural’ negative outputs and inputs. Journal of Operational Research Society 57 (11) 1–6] and the range directional model developed by Portela et al. [Portela, M.C.A.S., Thanassoulis, E., Simpson, G., 2004. A directional distance approach to deal with negative data in DEA: An application to bank branches. Journal of Operational Research Society 55 (10) 1111–1121]. A further example explores the advantages of using the new model.
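The ratio intuition behind the original CCR model is easiest to see in the one-input, one-output case, where each decision making unit's efficiency is its output/input ratio relative to the best observed ratio. A minimal sketch with invented data; note that this simple form, like the original CCR model discussed above, requires strictly positive inputs and outputs, which is exactly the restriction the negative-data approaches relax:

```python
# One-input, one-output CCR-style efficiency: each DMU's output/input
# ratio is scaled by the best observed ratio, so efficient units score 1.0.
# The DMU data are invented for illustration.
def ccr_efficiency(dmus):
    """dmus: dict of name -> (input, output), both strictly positive."""
    ratios = {name: out / inp for name, (inp, out) in dmus.items()}
    best = max(ratios.values())
    return {name: r / best for name, r in ratios.items()}

dmus = {"A": (10.0, 20.0), "B": (5.0, 15.0), "C": (8.0, 8.0)}
scores = ccr_efficiency(dmus)
print(scores["B"])            # 1.0 (best output/input ratio)
print(round(scores["A"], 3))  # 0.667
```

With multiple inputs and outputs, the same idea becomes a linear program solved once per unit to find the most favourable weights, and a value taking both signs (as in the semi-oriented radial measure) breaks the ratio form, which is what motivates the specialised models compared in the paper.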

Relevance:

100.00%

Publisher:

Abstract:

This chapter provides the theoretical foundation and background on the data envelopment analysis (DEA) method. We first introduce the basic DEA models. The balance of the chapter focuses on evidence that DEA has been extensively applied for measuring the efficiency and productivity of services, including financial services (banking, insurance, securities, and fund management), professional services, health services, education services, environmental and public services, energy services, logistics, tourism, information technology, telecommunications, transport, distribution, audio-visual, media, entertainment, cultural and other business services. Finally, we provide information on the use of Performance Improvement Management Software (PIM-DEA). A free limited version of this software and the downloading procedure are also included in this chapter.

Relevance:

100.00%

Publisher:

Abstract:

A prominent theme emerging in Occupational Health and Safety (OSH) is the development of management systems. A range of interventions, according to a prescribed route detailed by one of the management systems, can be introduced into an organisation with some expectation of improved OSH performance. This thesis attempts to identify the key influencing factors that may impact upon the process of introducing interventions (according to BS 8800:1996, Guide to Implementing Occupational Health and Safety Management Systems) into an organisation. To help identify these influencing factors, a review of possible models from the sphere of Total Quality Management (TQM) was undertaken and the most suitable TQM model selected for development and use in OSH. By anchoring the OSH model's development in the reviewed literature, a range of core, medium and low-level influencing factors were identified. This model was developed in conjunction with the research data generated within the case study organisation (a rubber manufacturer) and applied to the organisation. The key finding was that the implementation of an OSH intervention was dependent upon three broad vectors of influence. The first is the Incentive to introduce change within an organisation, which refers to the drivers or motivators for OSH. The second, the Ability within the management team to actually implement the changes, refers to aspects such as leadership, commitment and perceptions of OSH, among others. Ability is in turn influenced by the environment within which change is being introduced. This third aspect, Receptivity, refers to the history of the plant and the characteristics of the workforce; aspects within Receptivity include the workforce profile and organisational policies, among others. It was found that the TQM model selected and developed for an OSH management system intervention did explain the core influencing factors and their impact upon OSH performance. It was also found that, within the organisation, the results that might have been expected from implementation of BS 8800:1996 were not realised. The OSH model highlighted that, given the organisation's starting point, a poor appreciation of the human factors of OSH gave little reward for implementation of an OSH management system. In addition, it was found that the general organisational culture can effectively suffocate any attempts to generate a proactive safety culture.

Relevance:

100.00%

Publisher:

Abstract:

Zambia and many other countries in Sub-Saharan Africa face a key challenge of sustaining high levels of coverage of AIDS treatment under prospects of dwindling global resources for HIV/AIDS treatment. Policy debate in HIV/AIDS is increasingly paying more focus to efficiency in the use of available resources. In this chapter, we apply Data Envelopment Analysis (DEA) to estimate short term technical efficiency of 34 HIV/AIDS treatment facilities in Zambia. The data consists of input variables such as human resources, medical equipment, building space, drugs, medical supplies, and other materials used in providing HIV/AIDS treatment. Two main outputs namely, numbers of ART-years (Anti-Retroviral Therapy-years) and pre-ART-years are included in the model. Results show the mean technical efficiency score to be 83%, with great variability in efficiency scores across the facilities. Scale inefficiency is also shown to be significant. About half of the facilities were on the efficiency frontier. We also construct bootstrap confidence intervals around the efficiency scores.
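The bootstrap confidence intervals mentioned can be sketched with a percentile bootstrap over resampled efficiency scores: resample the facilities with replacement many times, recompute the statistic each time, and read the interval off the percentiles. The scores below are invented for illustration, not the study's data for the 34 facilities:

```python
import random

# Percentile bootstrap CI around a mean efficiency score.
# The efficiency scores below are invented, not the study's data.
def bootstrap_ci(scores, n_boot=2000, alpha=0.05, seed=1):
    rng = random.Random(seed)
    means = sorted(
        sum(rng.choices(scores, k=len(scores))) / len(scores)
        for _ in range(n_boot)
    )
    lo = means[int(n_boot * alpha / 2)]
    hi = means[int(n_boot * (1 - alpha / 2)) - 1]
    return lo, hi

scores = [0.95, 0.83, 0.71, 1.0, 0.64, 0.88, 0.79, 0.92]
lo, hi = bootstrap_ci(scores)
print(lo <= sum(scores) / len(scores) <= hi)  # True
```

In DEA studies the resampling is usually done on the DEA scores themselves (or via smoothed bootstrap procedures that re-run the DEA model per resample); this sketch shows only the percentile-interval mechanics.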

Relevance:

100.00%

Publisher:

Abstract:

Non-parametric methods for efficiency evaluation were designed to analyse industries comprising multi-input, multi-output producers and lacking data on market prices. Education is a typical example. In this chapter, we review applications of DEA in secondary and tertiary education, focusing on the opportunities that these offer for benchmarking at the institutional level. At the secondary level, we also investigate the disaggregation of efficiency measures into pupil-level and school-level effects. For higher education, while many analyses concern overall institutional efficiency, we also examine studies that take a more disaggregated approach, centred either on the performance of specific functional areas or on that of individual employees.