31 results for downloading of data


Relevance: 100.00%

Publisher:

Abstract:

Project justification is regarded as one of the major methodological deficits in Data Warehousing practice. The special nature of Data Warehousing benefits and the large share of infrastructure-related activities are cited as reasons why inappropriate methods are applied, evaluations remain incomplete, or justifications are omitted entirely. In this paper, the economic justification of Data Warehousing projects is analyzed, and first results from a large academia-industry collaboration project on non-technical issues of Data Warehousing are presented. As conceptual foundations, the role of the Data Warehouse system in corporate application architectures is analyzed, and the specific properties of Data Warehousing projects are discussed. Based on an applicability analysis of traditional approaches to the economic justification of IT projects, basic steps and responsibilities for the justification of Data Warehousing projects are derived.

Relevance: 100.00%

Publisher:

Abstract:

The present study validated the accuracy of data from a self-reported questionnaire on smoking behaviour against exhaled carbon monoxide (CO) level measurements in two groups of patients. Group 1 included patients referred to an oral medicine unit, whereas group 2 was recruited from the daily outpatient service. All patients filled in a standardized questionnaire regarding their current and former smoking habits. Additionally, exhaled CO levels were measured using a monitor. A total of 121 patients were included in group 1, and 116 patients were included in group 2. The mean value of exhaled CO was 7.6 ppm in the first group and 9.2 ppm in the second group; the mean CO values did not differ significantly between the two groups. The two exhaled CO level measurements taken for each patient correlated very well (Spearman's coefficient of 0.9857). Exhaled CO values of smokers were on average 13.95 ppm higher than those of non-smokers (p < 0.001), adjusted for group. Each additional pack year increased CO values by 0.16 ppm (p = 0.003), and each additional cigarette per day elevated the CO measurements by 0.88 ppm (p < 0.001). Based on these results, the correlations between self-reported smoking habits and exhaled CO values are robust and highly reproducible. CO monitors may offer a non-invasive method to objectively assess current smoking behaviour and to monitor tobacco use cessation attempts in the dental setting.
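The reported effect sizes can be read as coefficients of a linear model for exhaled CO. The sketch below combines them into a single additive predictor; the baseline value and the additive combination of the coefficients are assumptions for illustration, not the study's fitted model.

```python
# Hypothetical sketch of the linear relationships reported above.
# The non-smoker baseline of 2.0 ppm and the combination of all three
# effects into one additive model are assumptions; the study reports the
# coefficients from its own adjusted analyses.

def predicted_co_ppm(is_smoker: bool, pack_years: float, cigs_per_day: float,
                     baseline: float = 2.0) -> float:
    """Predict exhaled CO (ppm) from the effect sizes quoted in the abstract."""
    co = baseline
    if is_smoker:
        co += 13.95               # mean smoker vs. non-smoker difference (p < 0.001)
    co += 0.16 * pack_years      # +0.16 ppm per additional pack year (p = 0.003)
    co += 0.88 * cigs_per_day    # +0.88 ppm per additional cigarette/day (p < 0.001)
    return co

# A non-smoker stays at the assumed baseline:
print(predicted_co_ppm(False, 0, 0))  # 2.0
```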

Relevance: 100.00%

Publisher:

Abstract:

Background The release of quality data from acute care hospitals to the general public aims to inform the public, to provide transparency and to foster quality-based competition among providers. Because of the expected mechanisms of action and the possible adverse consequences of public quality comparison, it is a controversial topic. The perspective of physicians and nurses is of particular importance in this context: they are mainly responsible for the collection of quality-control data, and they are directly confronted with the results of public comparison. The focus of this qualitative study was the views and opinions of Swiss physicians and nurses regarding these issues. We investigated how the two professional groups appraised the opportunities as well as the risks of the release of quality data in Switzerland. Methods A qualitative approach was chosen to answer the research question. For data collection, four focus groups were conducted with physicians and nurses who were employed in Swiss acute care hospitals. Qualitative content analysis was applied to the data. Results The results revealed that both occupational groups had a very critical and negative attitude towards the recent developments; the perceived risks dominated their view. In summary, their main concerns were: the reduction of complexity, the one-sided focus on measurable quality variables, risk selection, the threat of data manipulation and the abuse of published information by the media. An additional concern was that the impression is given that the complex construct of quality can be reduced to a few key figures, and that this conveys a false message which then influences society and politics.
This critical attitude is associated with the value system and professional self-concept that physicians and nurses share, which contrast with the underlying principles of a market-based economy and the economic orientation of the health care business. Conclusions The critical and negative attitude of Swiss physicians and nurses must be taken seriously and investigated regarding its impact on work motivation and identification with the profession. At the same time, the two professional groups are obligated to reflect upon their critical attitude and take a proactive role in the development of appropriate quality indicators for the publication of quality data in Switzerland.

Relevance: 100.00%

Publisher:

Abstract:

In Germany, hospitals can report data on patients with pelvic fractures to either or both of two trauma registries, the German Pelvic Injury Register (PIR) and the TraumaRegister DGU(®) (TR). Both registries are anonymous and differ in composition and content. We describe a methodological approach for linking these registries and re-identifying patients documented in both. The aim of the approach is to create an intersection set that benefits from the complementary data of each registry. Furthermore, the concordance of data entry for several clinical variables recorded in both registries was evaluated.
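A minimal sketch of the linkage idea, assuming that patients documented in both registries are found by exact matching on indirectly identifying variables shared by the two data sets; the variable set below (year, age, sex, fracture type) is hypothetical, as the abstract does not specify the actual linkage keys.

```python
# Hypothetical record-linkage sketch for two anonymous registries.
# The linkage keys are illustrative assumptions, not the PIR/TR method.

def link_records(registry_a, registry_b, keys=("year", "age", "sex", "fracture_type")):
    """Return pairs (i, j) of records that agree on all linkage keys."""
    index = {}
    for j, rec in enumerate(registry_b):
        index.setdefault(tuple(rec[k] for k in keys), []).append(j)
    pairs = []
    for i, rec in enumerate(registry_a):
        for j in index.get(tuple(rec[k] for k in keys), []):
            pairs.append((i, j))
    return pairs

pir = [{"year": 2008, "age": 34, "sex": "m", "fracture_type": "C1"}]
tr  = [{"year": 2008, "age": 34, "sex": "m", "fracture_type": "C1"},
       {"year": 2008, "age": 55, "sex": "f", "fracture_type": "B2"}]
print(link_records(pir, tr))  # [(0, 0)]
```

In practice such exact matching would be complemented by tolerance rules (e.g. admission dates differing by a day), since twofold-documented patients rarely agree perfectly on every field.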

Relevance: 100.00%

Publisher:

Abstract:

A patient-specific surface model of the proximal femur plays an important role in planning and supporting various computer-assisted surgical procedures, including total hip replacement, hip resurfacing, and osteotomy of the proximal femur. The common approach to deriving 3D models of the proximal femur is to use imaging techniques such as computed tomography (CT) or magnetic resonance imaging (MRI). However, the high logistic effort, the extra radiation exposure (in CT imaging), and the large quantity of data to be acquired and processed limit their practicality. In this paper, we present an integrated approach that uses a multi-level point distribution model (ML-PDM) to reconstruct a patient-specific model of the proximal femur from sparse data available intra-operatively. Results of experiments performed on dry cadaveric bones using dozens of 3D points are presented, as well as experiments using a limited number of 2D X-ray images, which demonstrate promising accuracy of the present approach.
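A point distribution model expresses every plausible shape as the mean shape plus a weighted sum of principal modes of variation, so a sparse set of digitized points suffices to estimate the weights. The sketch below illustrates this in 1-D with a single synthetic mode; the actual ML-PDM is multi-level and operates on 3-D femur surfaces.

```python
import numpy as np

# Hypothetical 1-D illustration of point-distribution-model fitting:
# build a shape model (mean + principal mode) from synthetic training
# shapes, then reconstruct a full shape from only 5 of its 20 landmarks.

rng = np.random.default_rng(0)
mean = np.linspace(0.0, 1.0, 20)           # mean "shape" of 20 landmarks
mode = np.sin(np.linspace(0, np.pi, 20))   # one synthetic mode of variation
training = np.array([mean + b * mode for b in rng.normal(0, 1, 50)])

# PCA on the training shapes yields the dominant mode of variation
centered = training - training.mean(axis=0)
_, _, vt = np.linalg.svd(centered, full_matrices=False)
phi = vt[0]                                 # unit-norm dominant mode

# Sparse observation: only 5 of 20 landmarks are "digitized" intra-operatively
idx = np.array([0, 4, 9, 14, 19])
true_b = 0.7
observed = (mean + true_b * mode)[idx]

# Least-squares estimate of the mode weight from the sparse points
A = phi[idx].reshape(-1, 1)
b_hat, *_ = np.linalg.lstsq(A, observed - mean[idx], rcond=None)
reconstructed = mean + b_hat[0] * phi      # full 20-landmark shape
```

Because the observed points lie exactly in the model subspace here, the reconstruction is essentially exact; with real, noisy digitized points the fit minimizes the residual instead.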

Relevance: 100.00%

Publisher:

Abstract:

PURPOSE: To describe the implementation and use of an electronic patient-referral system as an aid to the efficient referral of patients to a remote and specialized treatment center. METHODS AND MATERIALS: A system for the exchange of radiotherapy data between different commercial treatment planning systems (TPS) and a specially developed planning system for proton therapy has been developed through the use of the PAPYRUS diagnostic image standard as an intermediate format. To ensure the cooperation of the different TPS manufacturers, the number of data sets defined for transfer has been restricted to the three core data sets of CT images, volumes of interest (VOIs), and three-dimensional dose distributions. As a complement to the exchange of data, network-wide application-sharing (video-conferencing) technologies have been adopted to provide methods for the interactive discussion and assessment of treatment plans with one or more partner clinics. RESULTS: Through the use of evaluation plans based on the exchanged data, referring clinics can accurately assess the advantages offered by proton therapy on a patient-by-patient basis, while the practicality or otherwise of the proposed treatments can simultaneously be assessed by the proton therapy center. Such a system, along with the interactive capabilities provided by video-conferencing methods, has been found to be an efficient solution to the problem of patient assessment and selection at a specialized treatment center, and is a necessary first step toward the full electronic integration of such centers with their remotely situated referral centers.

Relevance: 100.00%

Publisher:

Abstract:

Absolute quantitation of clinical ¹H-MR spectra is virtually always incomplete for single subjects because the separate determination of spectrum, baseline, and transverse and longitudinal relaxation times in single subjects is prohibitively long. Integrated Processing and Acquisition of Data (IPAD), based on a combined 2-dimensional experimental and fitting strategy, is suggested to substantially improve the information content obtainable in a given measurement time. A series of localized saturation-recovery spectra was recorded and combined with 2-dimensional prior-knowledge fitting to simultaneously determine metabolite T1 (from analysis of the saturation-recovery time course), metabolite T2 (from lineshape analysis based on metabolite and water peak shapes), the macromolecular baseline (based on T1 differences and analysis of the saturation-recovery time course), and metabolite concentrations (using prior-knowledge fitting and conventional procedures of absolute standardization). The procedure was tested on metabolite solutions and applied in 25 subjects (15-78 years old). Metabolite content was comparable to previously reported values. Interindividual variation was larger than intraindividual variation in repeated spectra for metabolite content as well as for some relaxation times. Relaxation times differed between various metabolite groups. Part of the interindividual variation could be explained by a significant age dependence of relaxation times.
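The saturation-recovery time course underlying the T1 estimate follows S(t) = S0 * (1 - exp(-t/T1)). The sketch below recovers T1 from synthetic samples of that curve by a simple least-squares grid search; the real IPAD analysis fits whole spectra with 2-dimensional prior knowledge rather than a single signal curve.

```python
import numpy as np

# Synthetic saturation-recovery series: S(t) = S0 * (1 - exp(-t/T1)).
# Recovery times and parameter values are invented for illustration.
t = np.array([0.1, 0.3, 0.6, 1.0, 2.0, 4.0, 8.0])  # recovery times (s)
true_t1, true_s0 = 1.4, 100.0
signal = true_s0 * (1.0 - np.exp(-t / true_t1))

def fit_t1(t, s, t1_grid=np.linspace(0.1, 5.0, 4901)):
    """Grid-search T1; S0 has a closed-form least-squares solution per T1."""
    best = (np.inf, None, None)
    for t1 in t1_grid:
        model = 1.0 - np.exp(-t / t1)
        s0 = (model @ s) / (model @ model)   # optimal amplitude for this T1
        rss = float(((s0 * model - s) ** 2).sum())
        if rss < best[0]:
            best = (rss, t1, s0)
    return best[1], best[2]

t1_hat, s0_hat = fit_t1(t, signal)   # recovers T1 ≈ 1.4 s, S0 ≈ 100
```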

Relevance: 100.00%

Publisher:

Abstract:

The physics program of the NA61/SHINE (SPS Heavy Ion and Neutrino Experiment) experiment at the CERN SPS consists of three subjects. In the first stage of data taking (2007-2009), measurements of hadron production in hadron-nucleus interactions needed for neutrino (T2K) and cosmic-ray (Pierre Auger and KASCADE) experiments will be performed. In the second stage (2009-2010), hadron production in proton-proton and proton-nucleus interactions, needed as reference data for a better understanding of nucleus-nucleus reactions, will be studied. In the third stage (2009-2013), the energy dependence of hadron production properties will be measured in p+p and p+Pb interactions and in nucleus-nucleus collisions, with the aim of identifying the properties of the onset of deconfinement and finding evidence for the critical point of strongly interacting matter. The NA61 experiment was approved at CERN in June 2007. The first pilot run was performed in October 2007. Calibrations of all detector components have been performed successfully and preliminary uncorrected spectra have been obtained. High quality of track reconstruction and particle identification, similar to NA49, has been achieved. The data and new detailed simulations confirm that the NA61 detector acceptance and particle identification capabilities cover the phase space required by the T2K experiment. This document reports on the progress made in the calibration and analysis of the 2007 data.

Relevance: 100.00%

Publisher:

Abstract:

The variability of toxicity data contained within databases was investigated using the widely used US EPA ECOTOX database as an example. Fish acute lethality (LC50) values for 44 compounds (for which at least 10 data entries existed) were extracted from the ECOTOX database yielding a total of 4654 test records. Significant variability of LC50 test results was observed, exceeding several orders of magnitude. In an attempt to systematically explore potential causes of the data variability, the influence of biological factors (such as test species or life stages) and physical factors (such as water temperature, pH or water hardness) were examined. Even after eliminating the influence of these inherent factors, considerable data variability remained, suggesting an important role of factors relating to technical and measurement procedures. The analysis, however, was limited by pronounced gaps in the test documentation. Of the 4654 extracted test reports, 66.5% provided no information on the fish life stage used for testing. Likewise, water temperature, hardness or pH were not recorded in 19.6%, 48.2% and 41.2% of the data entries, respectively. From these findings, we recommend the rigorous control of data entries ensuring complete recording of testing conditions. A more consistent database will help to better discriminate between technical and natural variability of the test data, which is of importance in ecological risk assessment for extrapolation from laboratory tests to the field, and also might help to develop correction factors that account for systematic differences in test results caused by species, life stage or test conditions.
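The spread "exceeding several orders of magnitude" can be quantified per compound as log10 of the ratio between the largest and smallest LC50 entry. A sketch with invented values (the study analysed 4654 real ECOTOX records):

```python
import math
from collections import defaultdict

# Invented LC50 entries (mg/L) for two hypothetical compounds, standing in
# for the per-compound record sets extracted from ECOTOX.
records = [
    ("phenol", 12.0), ("phenol", 340.0), ("phenol", 29.0),
    ("lindane", 0.02), ("lindane", 0.9), ("lindane", 45.0),
]

def spread_orders_of_magnitude(records):
    """Per compound: log10(max LC50 / min LC50), i.e. the spread in decades."""
    by_compound = defaultdict(list)
    for compound, lc50 in records:
        by_compound[compound].append(lc50)
    return {c: math.log10(max(v) / min(v)) for c, v in by_compound.items()}

spread = spread_orders_of_magnitude(records)
# "lindane" spans 45 / 0.02 = 2250x, i.e. more than 3 orders of magnitude
```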

Relevance: 100.00%

Publisher:

Abstract:

High-quality data are essential for veterinary surveillance systems, and their quality can be affected by the source and the method of collection. Data recorded on farms could provide detailed information about the health of a population of animals, but the accuracy of the data recorded by farmers is uncertain. The aims of this study were to evaluate the quality of the data on animal health recorded on 97 Swiss dairy farms, to compare the quality of the data obtained by different recording systems, and to obtain baseline data on the health of the animals on the 97 farms. Data on animal health were collected from the farms for a year. Their quality was evaluated by assessing the completeness and accuracy of the recorded information, and by comparing farmers' and veterinarians' records. The quality of the data provided by the farmers was satisfactory, although electronic recording systems made it easier to trace the animals treated. The farmers tended to record more health-related events than the veterinarians, although this varied with the event considered, and some events were recorded only by the veterinarians. The farmers' attitude towards data collection was positive. Factors such as motivation, feedback, training, and simplicity and standardisation of data collection were important because they influenced the quality of the data.

Relevance: 100.00%

Publisher:

Abstract:

OBJECTIVE To investigate whether it is valid to combine follow-up and change data when conducting meta-analyses of continuous outcomes. STUDY DESIGN AND SETTING Meta-epidemiological study of randomized controlled trials in patients with osteoarthritis of the knee or hip that assessed patient-reported pain. We calculated standardized mean differences (SMDs) based on follow-up and change data and pooled within-trial differences in SMDs. We also derived pooled SMDs from the estimate indicating the largest treatment effect within a trial (optimistic selection of SMDs) and from the estimate indicating the smallest treatment effect within a trial (pessimistic selection of SMDs). RESULTS A total of 21 meta-analyses with 189 trials with 292 randomized comparisons in 41,256 patients were included. On average, SMDs were 0.04 standard deviation units more beneficial when follow-up values were used (difference in SMDs: -0.04; 95% confidence interval: -0.13, 0.06; P=0.44). In 13 meta-analyses (62%), there was a relevant difference in clinical relevance and/or significance level between optimistic and pessimistic pooled SMDs. CONCLUSION On average, there is no relevant difference between follow-up and change data SMDs, and combining these estimates in meta-analysis is generally valid. The decision on which type of data to use when both follow-up and change data are available should be prespecified in the meta-analysis protocol.
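For reference, a worked sketch of the effect-size arithmetic behind such meta-analyses: a standardized mean difference computed with a pooled standard deviation, and a fixed-effect (inverse-variance) pooling of per-trial SMDs. All numbers are invented; this is not the study's data.

```python
import math

def smd(mean_t, sd_t, n_t, mean_c, sd_c, n_c):
    """Standardized mean difference (Cohen's d) with a pooled SD."""
    pooled_sd = math.sqrt(((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2)
                          / (n_t + n_c - 2))
    return (mean_t - mean_c) / pooled_sd

def pool_fixed(smds_and_vars):
    """Fixed-effect pooling: inverse-variance weighted average of SMDs."""
    weights = [1.0 / v for _, v in smds_and_vars]
    return sum(w * d for (d, _), w in zip(smds_and_vars, weights)) / sum(weights)

# Invented trial: lower pain score means benefit, so a negative SMD favours treatment.
d = smd(mean_t=3.1, sd_t=2.0, n_t=50, mean_c=4.0, sd_c=2.2, n_c=50)
pooled = pool_fixed([(d, 0.04), (-0.20, 0.02)])   # two trials with known variances
```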

Relevance: 100.00%

Publisher:

Abstract:

The hadronic light-by-light contribution to the anomalous magnetic moment of the muon was recently analyzed in the framework of dispersion theory, providing a systematic formalism where all input quantities are expressed in terms of on-shell form factors and scattering amplitudes that are in principle accessible in experiment. We briefly review the main ideas behind this framework and discuss the various experimental ingredients needed for the evaluation of one- and two-pion intermediate states. In particular, we identify processes that in the absence of data for doubly-virtual pion–photon interactions can help constrain parameters in the dispersive reconstruction of the relevant input quantities, the pion transition form factor and the helicity partial waves for γ⁎γ⁎→ππ.

Relevance: 100.00%

Publisher:

Abstract:

BACKGROUND Complex pelvic traumas, i.e., pelvic fractures accompanied by pelvic soft tissue injuries, still have an unacceptably high mortality rate of about 18 %. PATIENTS AND METHODS We retrospectively evaluated an intersection set of data from the TraumaRegister DGU® and the German Pelvic Injury Register from 2004-2009. Patients with complex and noncomplex pelvic traumas were compared regarding their vital parameters, emergency management, stay in the ICU, and outcome. RESULTS Of a total of 344 patients with pelvic injuries, 21 % had a complex and 79 % a noncomplex trauma. Patients with complex trauma were significantly less likely to survive (mortality 16.7 % vs. 5.9 %). Whereas vital parameters and emergency treatment in the preclinical setting did not differ substantially, patients with complex traumas were more often in shock and showed acute traumatic coagulopathy on hospital arrival, and consequently received larger fluid volumes and more transfusions than patients with noncomplex traumas. Furthermore, patients with complex traumas had more complications and longer ICU stays. CONCLUSION Prevention of exsanguination and of complications such as multiple organ dysfunction syndrome still poses a major challenge in the management of complex pelvic traumas.

Relevance: 100.00%

Publisher:

Abstract:

In this work, we propose a novel network-coding-enabled NDN architecture for the delivery of scalable video. Our scheme uses network coding to address a problem that arises in the original NDN protocol, where optimal use of the bandwidth and caching resources necessitates the coordination of forwarding decisions. To optimize the performance of the proposed network-coding-based NDN protocol and render it appropriate for the transmission of scalable video, we devise a novel rate allocation algorithm that decides on the optimal rates of Interest messages sent by clients and intermediate nodes. This algorithm guarantees that the achieved flow of Data objects will maximize the average quality of the video delivered to the client population. To support the handling of Interest messages and Data objects when intermediate nodes perform network coding, we modify the standard NDN protocol and introduce the use of Bloom filters, which efficiently store additional information about the Interest messages and Data objects. The proposed architecture is evaluated for transmission of scalable video over PlanetLab topologies. The evaluation shows that the proposed scheme performs very close to the optimal performance.
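A Bloom filter answers set-membership queries compactly, with possible false positives but no false negatives. The minimal sketch below illustrates the kind of structure the modified protocol could use to summarize Interest messages and Data objects; the paper's actual filter parameters and hashing scheme are not given in the abstract, and the content names used here are hypothetical.

```python
import hashlib

class BloomFilter:
    """Minimal Bloom filter: k hash positions per item in a fixed bit array."""

    def __init__(self, size=1024, hashes=3):
        self.size, self.hashes = size, hashes
        self.bits = [False] * size

    def _positions(self, item: str):
        # Derive k positions by salting the item with the hash index.
        for i in range(self.hashes):
            digest = hashlib.sha256(f"{i}:{item}".encode()).hexdigest()
            yield int(digest, 16) % self.size

    def add(self, item: str):
        for pos in self._positions(item):
            self.bits[pos] = True

    def might_contain(self, item: str) -> bool:
        # False positives are possible; false negatives are not.
        return all(self.bits[pos] for pos in self._positions(item))

bf = BloomFilter()
bf.add("/video/layer0/segment42")          # hypothetical NDN content name
print(bf.might_contain("/video/layer0/segment42"))  # True
```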

Relevance: 100.00%

Publisher:

Abstract:

Systems for the identification and registration of cattle have gradually been receiving attention for use in syndromic surveillance, a relatively recent approach for the early detection of infectious disease outbreaks. Real-time or near real-time monitoring of deaths or stillbirths reported to these systems offers an opportunity to detect temporal or spatial clusters of increased mortality that could be caused by an infectious disease epidemic. In Switzerland, such data are recorded in the "Tierverkehrsdatenbank" (TVD). To investigate the potential of the Swiss TVD for syndromic surveillance, 3 years of data (2009-2011) were assessed in terms of data quality, including timeliness of reporting and completeness of geographic data. Two time series consisting of reported on-farm deaths and stillbirths were retrospectively analysed to define and quantify the temporal patterns that result from non-health-related factors. Geographic data were almost always present in the TVD data, often at different spatial scales. On-farm deaths were reported to the database by farmers in a timely fashion; stillbirths were reported less promptly. Timeliness and geographic coverage are two important features of disease surveillance systems, highlighting the suitability of the TVD for use in a syndromic surveillance system. Both time series exhibited distinct temporal patterns associated with non-health-related factors. To avoid false positive signals, these patterns need to be removed from the data or otherwise accounted for before aberration detection algorithms are applied in real time. Evaluating mortality data reported to systems for the identification and registration of cattle is of value for comparing national data systems and as a first step towards a Europe-wide early detection system for emerging and re-emerging cattle diseases.
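One simple way to remove a non-health-related temporal pattern, such as a day-of-week reporting effect, is to subtract a day-of-week baseline and monitor the residuals with an aberration-detection algorithm. A sketch with synthetic counts (not TVD data):

```python
import statistics

# Synthetic daily report counts over 8 weeks, with an assumed drop in
# weekend reporting that mimics a non-health-related weekly pattern.
counts = [30, 32, 31, 29, 33, 8, 7] * 8

def deseasonalize_weekly(counts):
    """Subtract the mean count for each day of the week from each observation."""
    baseline = [statistics.mean(counts[d::7]) for d in range(7)]
    return [c - baseline[i % 7] for i, c in enumerate(counts)]

residuals = deseasonalize_weekly(counts)
# With a perfectly repeating week, all residuals are zero; a real outbreak
# would show up as a run of positive residuals on top of the baseline.
```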