33 results for "Optimistic data replication system"
in BORIS: Bern Open Repository and Information System - Bern - Switzerland
Abstract:
Historical, i.e. pre-1957, upper-air data are a valuable source of information on the state of the atmosphere, in some parts of the world dating back to the early 20th century. However, to date, reanalyses have made only partial use of these data, and only of observations made after 1948. Even for the period between 1948 (the starting year of the NCEP/NCAR (National Centers for Environmental Prediction/National Center for Atmospheric Research) reanalysis) and the International Geophysical Year in 1957 (the starting year of the ERA-40 reanalysis), when global upper-air coverage reached more or less its current status, many observations have not yet been digitised. The Comprehensive Historical Upper-Air Network (CHUAN) has already compiled a large collection of pre-1957 upper-air data. In the framework of the European project ERA-CLIM (European Reanalysis of Global Climate Observations), significant amounts of additional upper-air data have been catalogued (> 1.3 million station days), imaged (> 200 000 images) and digitised (> 700 000 station days) in order to prepare a new input data set for upcoming reanalyses. The records cover large parts of the globe, focussing on regions that have so far been less well covered, such as the tropics, the polar regions and the oceans, and on very early upper-air data from Europe and the US. The total number of digitised/inventoried records is 61/101 for moving upper-air data, i.e. data from ships, etc., and 735/1783 for fixed upper-air stations. Here, we give a detailed description of the resulting data set, including the metadata and the quality checking procedures applied. The data will be included in the next version of CHUAN. The data are available at doi:10.1594/PANGAEA.821222.
Abstract:
The paper showcases the field and lab documentation system developed for Kinneret Regional Project, an international archaeological expedition to the northwestern shore of the Sea of Galilee (Israel) under the auspices of the University of Bern, the University of Helsinki, Leiden University and Wofford College. The core of the data management system is a fully relational, server-based database framework, which also includes time-based and static GIS services, stratigraphic analysis tools and fully indexed document/digital image archives. Data collection in the field is based on mobile, hand-held devices equipped with a custom-tailored stand-alone application. Comprehensive three-dimensional documentation of all finds and findings is achieved by means of total stations and/or high-precision GPS devices. All archaeological information retrieved in the field – including tachymetric data – is synced with the core system on the fly and is thus immediately available for further processing in the field lab (within the local network) or for post-excavation analysis at remote institutions (via the WWW). Besides a short demonstration of the main functionalities, the paper also presents some of the key technologies used and illustrates usability aspects of the system's individual components.
Abstract:
This paper introduces an area- and power-efficient approach for compressive recording of cortical signals used in an implantable system prior to transmission. Recent research on compressive sensing has shown promising results for sub-Nyquist sampling of sparse biological signals. Still, any large-scale implementation of this technique faces critical issues caused by the increased hardware intensity. The cost of implementing compressive sensing in a multichannel system in terms of area usage can be significantly higher than that of a conventional data acquisition system without compression. To tackle this issue, a new multichannel compressive sensing scheme which exploits the spatial sparsity of the signals recorded from the electrodes of the sensor array is proposed. The analysis shows that using this method, the power efficiency is preserved to a great extent while the area overhead is significantly reduced, resulting in an improved power-area product. The proposed circuit architecture is implemented in a UMC 0.18 µm CMOS technology. Extensive performance analysis and design optimization have been done, resulting in a low-noise, compact and power-efficient implementation. The results of simulations and subsequent reconstructions show the possibility of recovering fourfold compressed intracranial EEG signals with an SNR as high as 21.8 dB, while consuming 10.5 µW of power within an effective area of 250 µm × 250 µm per channel.
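A minimal sketch of the compressive sensing step the abstract builds on: a sparse signal is measured through a random matrix at one quarter of its native dimension (fourfold compression, as in the paper) and recovered with a greedy solver. The Gaussian measurement matrix, the orthogonal matching pursuit (OMP) reconstruction, and all dimensions below are generic illustrations, not the paper's multichannel, spatially aware scheme or its circuit implementation.

import numpy as np

rng = np.random.default_rng(0)

n, m, k = 256, 64, 8                    # signal length, measurements (4x compression), sparsity
x = np.zeros(n)
support = rng.choice(n, k, replace=False)
x[support] = rng.standard_normal(k)     # synthetic k-sparse signal standing in for a neural recording

Phi = rng.standard_normal((m, n)) / np.sqrt(m)   # random measurement matrix
y = Phi @ x                                      # sub-Nyquist (compressed) measurements

def omp(Phi, y, k):
    """Greedy OMP: recover a k-sparse x from y = Phi x."""
    residual, idx = y.copy(), []
    for _ in range(k):
        idx.append(int(np.argmax(np.abs(Phi.T @ residual))))  # most correlated atom
        coef, *_ = np.linalg.lstsq(Phi[:, idx], y, rcond=None)
        residual = y - Phi[:, idx] @ coef                      # re-fit, update residual
    x_hat = np.zeros(Phi.shape[1])
    x_hat[idx] = coef
    return x_hat

x_hat = omp(Phi, y, k)
snr = 10 * np.log10(np.sum(x**2) / np.sum((x - x_hat) ** 2))
print(f"reconstruction SNR: {snr:.1f} dB")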
Abstract:
Project justification is regarded as one of the major methodological deficits in Data Warehousing practice. The special nature of Data Warehousing benefits and the large share of infrastructure-related activities are cited as reasons for applying inappropriate methods, performing incomplete evaluations, or even omitting justification entirely. In this paper, the economic justification of Data Warehousing projects is analyzed, and first results from a large academia-industry collaboration project in the field of non-technical issues of Data Warehousing are presented. As conceptual foundations, the role of the Data Warehouse system in corporate application architectures is analyzed, and the specific properties of Data Warehousing projects are discussed. Based on an applicability analysis of traditional approaches to economic IT project justification, basic steps and responsibilities for the justification of Data Warehousing projects are derived.
Abstract:
SMARTDIAB is a platform designed to support the monitoring, management, and treatment of patients with type 1 diabetes mellitus (T1DM) by combining state-of-the-art approaches in the fields of database (DB) technologies, communications, simulation algorithms, and data mining. SMARTDIAB consists mainly of two units: 1) the patient unit (PU); and 2) the patient management unit (PMU), which communicate with each other for data exchange. The PMU can be accessed by the PU through the internet using devices such as PCs/laptops with direct internet access or mobile phones via a Wi-Fi/General Packet Radio Service access network. The PU consists of an insulin pump for subcutaneous insulin infusion to the patient and a continuous glucose measurement system. These devices run a user-friendly application that gathers patient-related information and transmits it to the PMU. The PMU consists of a diabetes data management system (DDMS), a decision support system (DSS) that provides risk assessment for long-term diabetes complications, and an insulin infusion advisory system (IIAS), which reside on a Web server. The DDMS can be accessed by both medical personnel and patients, with appropriate security access rights and front-end interfaces. The DDMS, apart from being used for data storage/retrieval, also provides advanced tools for the intelligent processing of the patient's data, supporting the physician in decision making regarding the patient's treatment. The IIAS is used to close the loop between the insulin pump and the continuous glucose monitoring system by providing the pump with the appropriate insulin infusion rate in order to keep the patient's glucose levels within predefined limits. The pilot version of SMARTDIAB has already been implemented, and the platform's evaluation in a clinical environment is in progress.
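Illustrative only: the abstract describes the IIAS closing the loop between the glucose sensor and the insulin pump but does not specify its algorithm. The sketch below substitutes a toy proportional-derivative rule mapping glucose readings to a bounded infusion rate; the target, gains, and safety cap are hypothetical stand-ins.

from collections import deque

TARGET_MG_DL = 110.0          # hypothetical glucose target
KP, KD = 0.01, 0.05           # hypothetical controller gains (U/h per mg/dL)
MAX_RATE_U_H = 3.0            # hypothetical pump safety cap

history = deque(maxlen=2)     # last two CGM readings, for a trend term

def advised_rate(glucose_mg_dl: float) -> float:
    """Return a bounded insulin infusion rate (U/h) to send to the pump."""
    history.append(glucose_mg_dl)
    error = glucose_mg_dl - TARGET_MG_DL
    trend = history[-1] - history[0] if len(history) == 2 else 0.0
    rate = KP * error + KD * trend
    return min(max(rate, 0.0), MAX_RATE_U_H)   # never negative, never above cap

for reading in (95, 140, 180, 210, 190):
    print(reading, "mg/dL ->", round(advised_rate(reading), 2), "U/h")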
Abstract:
Background: Abstractor training is a key element in creating valid and reliable data collection procedures. The choice between in-person vs. remote or simultaneous vs. sequential abstractor training has considerable consequences for time and resource utilization. We conducted a web-based (webinar) abstractor training session to standardize training across six individual Cancer Research Network (CRN) sites for a study of breast cancer treatment effects in older women (BOWII). The goals of this manuscript are to describe the training session, its participants, and the participants' evaluation of webinar technology for abstraction training. Findings: A webinar was held for all six sites with the primary purpose of simultaneously training staff and ensuring consistent abstraction across sites. The training session involved sequential review of over 600 data elements outlined in the coding manual in conjunction with the display of data entry fields in the study's electronic data collection system. Post-training evaluation was conducted via Survey Monkey©. Inter-rater reliability measures for abstractors within each site were conducted three months after the commencement of data collection. Ten of the 16 people who participated in the training completed the online survey. Almost all (90%) of the 10 respondents had previous medical record abstraction experience, and nearly two-thirds reported over 10 years of experience. Half of the respondents had previously participated in a webinar, of whom three had participated in a webinar for training purposes. All rated the knowledge and information delivered through the webinar as useful and reported that it adequately prepared them for data collection. Moreover, all participants would recommend this platform for multi-site abstraction training. Consistent with participant-reported training effectiveness, within-site inter-rater agreement ranged from 89 to 98%, with a weighted average of 95% agreement across sites. Conclusions: Conducting training via web-based technology was an acceptable and effective approach to standardizing medical record review across multiple sites for this group of experienced abstractors. Given the substantial time and cost savings achieved with the webinar, coupled with participants' positive evaluation of the training session, researchers should consider this instructional method as part of training efforts to ensure high-quality data collection in multi-site studies.
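As a small worked example of the summary statistic quoted above, a weighted average of per-site inter-rater agreement can be computed by weighting each site's agreement by its number of double-abstracted records. The six site labels and record counts below are hypothetical; the abstract reports only the 89-98% range and the 95% weighted average.

# Hypothetical per-site agreement and record counts (the CRN sites are unnamed here)
site_agreement = {"A": 0.89, "B": 0.93, "C": 0.95, "D": 0.96, "E": 0.97, "F": 0.98}
records        = {"A": 40,   "B": 55,   "C": 70,   "D": 60,   "E": 80,   "F": 75}

weighted = sum(site_agreement[s] * records[s] for s in site_agreement) / sum(records.values())
print(f"weighted average agreement: {weighted:.1%}")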
Abstract:
This article reports on the internet-based second multicenter study (MCS II) of the spine study group (AG WS) of the German trauma association (DGU). It represents a continuation of the first study conducted between 1994 and 1996 (MCS I). For a common, centralised data capture methodology, a newly developed internet-based data collection system ( http://www.memdoc.org ) of the Institute for Evaluative Research in Orthopaedic Surgery of the University of Bern was used. The aim of this first publication on MCS II is to describe in detail the new method of data collection via the internet and the structure of the developed database system. The goal of the study was to assess the current state of treatment for fresh traumatic injuries of the thoracolumbar spine in the German-speaking part of Europe. For that reason, we intended to collect a large number of cases and representative, valid information about the radiographic, clinical and subjective treatment outcomes. Thanks to the new study design of MCS II, not only the common surgical treatment concepts but also the new and constantly broadening spectrum of spine surgery, i.e. vertebro-/kyphoplasty, computer-assisted surgery and navigation, and minimally invasive and endoscopic techniques, were documented and evaluated. We present a first statistical overview and preliminary analysis of the 18 centers from Germany and Austria that participated in MCS II. Real-time data capture at the source was made possible by the constant availability of the data collection system via internet access. Following the principle of an application service provider, software, questionnaires and validation routines are located on a central server, which is accessed from the periphery (hospitals) by means of standard internet browsers. Thereby, costly and time-consuming software installation and maintenance of local data repositories are avoided and, more importantly, cumbersome migration of data into one integrated database becomes obsolete. Finally, this set-up also replaces traditional systems in which paper questionnaires were mailed to the central study office and entered by hand, where incomplete or incorrect forms always represent a resource-consuming problem and source of error. With the new study concept and the expanded inclusion criteria of MCS II, 1251 case histories with admission and surgical data were collected. This remarkable number of interventions documented during 24 months represents an increase of 183% compared to the previously conducted MCS I. The concept and technical feasibility of the MEMdoc data collection system were proven, as the participants of MCS II succeeded in collecting data on the largest series of patients with spinal injuries treated within a 2-year period ever published.
Abstract:
Nonallergic hypersensitivity and allergic reactions are part of the many different types of adverse drug reactions (ADRs). Databases exist for the collection of ADRs. Spontaneous reporting makes up the core data-generating system of pharmacovigilance, but there is a large underestimation of allergy/hypersensitivity drug reactions. A specific database is therefore required for drug allergy and hypersensitivity using standard operating procedures (SOPs), as the diagnosis of drug allergy/hypersensitivity is difficult and current pharmacovigilance algorithms are insufficient. Although difficult, the diagnosis of drug allergy/hypersensitivity has been standardized by the European Network for Drug Allergy (ENDA) under the aegis of the European Academy of Allergology and Clinical Immunology, and SOPs have been published. Based on ENDA and Global Allergy and Asthma European Network (GA²LEN, EU Framework Programme 6) SOPs, a Drug Allergy and Hypersensitivity Database (DAHD®) has been established under FileMaker® Pro 9. It is already available online in many different languages and can be accessed using a personal login. GA²LEN is a European network of 27 partners (16 countries) and 59 collaborating centres (26 countries), which can coordinate and implement the DAHD across Europe. The GA²LEN-ENDA-DAHD platform interacting with a pharmacovigilance network appears to be of great interest for the reporting of allergy/hypersensitivity ADRs in conjunction with other pharmacovigilance instruments.
Abstract:
This article describes the structure and use of a computerized databank system for WHO mortality data. This system makes available "at one's fingertips" data which were previously published by WHO in its blue volumes, and the data can be handled much more flexibly. At the moment the system provides information on age-standardized rates (direct standardization), total numbers of cases, as well as coverage per age group and year for about a hundred countries. The time period covered is 1950-1985, with exceptions for data which are not available to WHO.
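A short worked sketch of direct standardization, the method behind the databank's age-standardized rates: each age band's observed rate is weighted by that band's share of a standard population, so rates become comparable across countries with different age structures. The age bands, standard population, and rates below are illustrative, not WHO figures.

# Directly age-standardized rate (illustrative numbers only)
age_bands = ["0-14", "15-44", "45-64", "65+"]
std_pop   = [30000, 45000, 18000, 7000]     # standard population (persons) per band
obs_rate  = [0.2, 0.5, 4.0, 30.0]           # observed deaths per 1000 per band

# Weight each band's observed rate by the standard population share, then sum.
total = sum(std_pop)
asr = sum(r * p for r, p in zip(obs_rate, std_pop)) / total
print(f"age-standardized rate: {asr:.2f} per 1000")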
Abstract:
INTRODUCTION: Despite the key role of hemodynamic goals, there are few data addressing the question of which hemodynamic variables are associated with outcome or should be targeted in cardiogenic shock patients. The aim of this study was to investigate the association between hemodynamic variables and cardiogenic shock mortality. METHODS: Medical records and the patient data management system of a multidisciplinary intensive care unit (ICU) were reviewed for patients admitted because of cardiogenic shock. In all patients, the hourly variable time integral of hemodynamic variables during the first 24 hours after ICU admission was calculated. If hemodynamic variables were associated with 28-day mortality, the hourly variable time integral of drops below clinically relevant threshold levels was computed. Regression models and receiver operating characteristic analyses were calculated. All statistical models were adjusted for age, admission year, mean catecholamine doses and the Simplified Acute Physiology Score II (excluding hemodynamic counts) in order to account for the influence of age, changes in therapies during the observation period, the severity of cardiovascular failure and the severity of the underlying disease on 28-day mortality. RESULTS: One hundred and nineteen patients were included. Cardiac index (CI) (P = 0.01) and cardiac power index (CPI) (P = 0.03) were the only hemodynamic variables separately associated with mortality. The hourly time integral of CI drops <3 and <2.75 (both P = 0.02) and <2.5 (P = 0.03) L/min/m2 was associated with death, but not that of CI drops <2 L/min/m2 or lower thresholds (all P > 0.05). The hourly time integral of CPI drops <0.5-0.8 W/m2 (all P = 0.04) was associated with 28-day mortality, but not that of CPI drops <0.4 W/m2 or lower thresholds (all P > 0.05). CONCLUSIONS: During the first 24 hours after intensive care unit admission, CI and CPI are the most important hemodynamic variables separately associated with 28-day mortality in patients with cardiogenic shock. A CI of 3 L/min/m2 and a CPI of 0.8 W/m2 were most predictive of 28-day mortality. Since our results must be considered hypothesis-generating, randomized controlled trials are required to evaluate whether targeting these levels as early resuscitation endpoints can improve mortality in cardiogenic shock.
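A minimal sketch of the "hourly time integral of drops below a threshold" statistic used in the study: for hourly cardiac index (CI) readings over the first 24 hours, the amount by which CI falls short of a threshold is accumulated over time. The CI series is synthetic, and the study's exact integration scheme is an assumption here, not quoted from the paper.

# Synthetic hourly CI readings (L/min/m2) for the first 24 h after ICU admission
ci_hourly = [3.2, 2.9, 2.6, 2.4, 2.7, 3.0] + [3.1] * 18   # 24 values

def time_integral_below(series, threshold, dt_hours=1.0):
    """Sum of (threshold - value) * dt over all samples below the threshold."""
    return sum((threshold - v) * dt_hours for v in series if v < threshold)

for thr in (3.0, 2.75, 2.5, 2.0):
    print(f"CI < {thr}: {time_integral_below(ci_hourly, thr):.2f} (L/min/m2)*h")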
Abstract:
A state-of-the-art inverse model, CarbonTracker Data Assimilation Shell (CTDAS), was used to optimize estimates of methane (CH4) surface fluxes using atmospheric observations of CH4 as a constraint. The model consists of the latest version of the TM5 atmospheric chemistry-transport model and an ensemble Kalman filter based data assimilation system. The model was constrained by atmospheric methane surface concentrations, obtained from the World Data Centre for Greenhouse Gases (WDCGG). Prior methane emissions were specified for five sources: biosphere, anthropogenic, fire, termites and ocean, of which biosphere and anthropogenic emissions were optimized. Atmospheric CH4 mole fractions for 2007 from northern Finland calculated from prior and optimized emissions were compared with observations. It was found that the root mean squared errors of the posterior estimates were more than halved. Furthermore, inclusion of NOAA observations of CH4 from weekly discrete air samples collected at Pallas improved agreement between posterior CH4 mole fraction estimates and continuous observations, and resulted in reduced optimized biosphere emissions and smaller uncertainties in northern Finland.
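A minimal sketch of the ensemble Kalman filter update at the heart of a CTDAS-style assimilation: an ensemble of prior flux scaling factors is adjusted toward observed mole fractions through ensemble-estimated covariances. The dimensions, the fixed linear observation operator H, and the error statistics below are illustrative assumptions; the real system propagates the ensemble through the TM5 transport model rather than a fixed matrix.

import numpy as np

rng = np.random.default_rng(1)

n_state, n_ens, n_obs = 10, 40, 4
X = 1.0 + 0.3 * rng.standard_normal((n_state, n_ens))    # prior flux-scaling ensemble
H = rng.random((n_obs, n_state)) / n_state               # stand-in observation operator
R = np.eye(n_obs) * 0.02**2                              # observation error covariance
y = H @ np.ones(n_state) * 1.1 + 0.02 * rng.standard_normal(n_obs)  # synthetic obs

# Kalman gain from ensemble anomalies
Xa = X - X.mean(axis=1, keepdims=True)
HX = H @ X
HXa = HX - HX.mean(axis=1, keepdims=True)
P_xy = Xa @ HXa.T / (n_ens - 1)
P_yy = HXa @ HXa.T / (n_ens - 1) + R
K = P_xy @ np.linalg.inv(P_yy)

# Perturbed-observation update of each ensemble member
Y_pert = y[:, None] + 0.02 * rng.standard_normal((n_obs, n_ens))
X_post = X + K @ (Y_pert - HX)
print("prior mean:", X.mean().round(3), "posterior mean:", X_post.mean().round(3))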
Abstract:
STUDY DESIGN: Single-centre retrospective study of prospectively collected data, nested within the Eurospine Spine Tango data acquisition system. OBJECTIVE: The aim of this study was to assess the patient-rated outcome and complication rates associated with lumbar fusion procedures in three different age groups. SUMMARY OF BACKGROUND DATA: There is a general reluctance to consider spinal fusion procedures in elderly patients due to the increased likelihood of complications. METHODS: Before and at 3, 12, and 24 months after surgery, patients completed the multidimensional Core Outcome Measures Index (COMI). At the 3-, 12-, and 24-month follow-ups they also rated the Global Treatment Outcome (GTO) and their satisfaction with care. Patients were divided into three age groups: younger (≥50 and <65 y; n = 317), older (≥65 and <80 y; n = 350), and geriatric (≥80 y; n = 40). RESULTS: 707 consecutive patients were included. The preoperative comorbidity status differed significantly (p < 0.0001) between the age groups, with the highest scores in the geriatric group. Medical complications during surgery were less frequent in the younger age group (7%) than in the older (13.4%; p = 0.006) and geriatric groups (17.5%; p = 0.007); surgical complications tended to be higher in the geriatric group (younger, 6.3%; older, 6.0%; geriatric, 15.0%; p = 0.09). There were no significant group differences (p > 0.05) in the scores on any of the COMI domains, GTO, or patient-rated satisfaction at the 3-, 12-, or 24-month follow-ups. CONCLUSIONS: Despite greater comorbidity and complication rates in geriatric patients, the patient-rated outcome was as good in the elderly as in the younger age groups up to two years after surgery. These data indicate that geriatric age requires careful consideration of the associated risks but is not per se a contraindication to fusion for lumbar degenerative disease. LEVEL OF EVIDENCE: 4.