970 results for Semantics - Data processing


Relevance:

90.00%

Publisher:

Abstract:

BACKGROUND: Short-acting agents for neuromuscular block (NMB) require frequent dosing adjustments to meet individual patients' needs. In this study, we evaluated a new closed-loop controller for mivacurium dosing in clinical trials. METHODS: Fifteen patients were studied. T1% measured with electromyography was used as the input signal for the model-based controller. After induction of propofol/opiate anaesthesia, the baseline electromyography signal was allowed to stabilize, and a bolus of 0.3 mg kg-1 mivacurium was then administered to facilitate endotracheal intubation. Closed-loop infusion was started thereafter, targeting a neuromuscular block of 90%. Setpoint deviation, the number of manual interventions and surgeons' complaints were recorded. Drug use and its variability between and within patients were evaluated. RESULTS: Median time of closed-loop control for the 11 patients included in the data processing was 135 [89-336] min (median [range]). Four patients had to be excluded because of sensor problems. Mean absolute deviation from the setpoint was 1.8 +/- 0.9 T1%. Neither manual interventions nor complaints from the surgeons were recorded. The mean required mivacurium infusion rate was 7.0 +/- 2.2 microg kg-1 min-1. Intrapatient variability of mean infusion rates over 30-min intervals was high, with differences of up to a factor of 1.8 between the highest and lowest requirement in the same patient. CONCLUSIONS: Neuromuscular block can be controlled precisely with mivacurium using our model-based controller. The amount of mivacurium needed to maintain T1% at defined constant levels differed markedly between and within patients. Closed-loop control therefore seems advantageous for automatically maintaining neuromuscular block at constant levels.
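As an illustration of the closed-loop principle described in this abstract, the sketch below adjusts an infusion rate toward a T1% setpoint with a generic proportional-integral feedback law. The study itself used a model-based controller, so the controller form, gains and rate limits here are hypothetical.

# Illustrative sketch only: the study used a model-based controller; a generic
# PI feedback loop stands in here to show the closed-loop principle.
# All names and gains (kp, ki, update interval, rate limits) are hypothetical.

def pi_infusion_controller(t1_measured, state, setpoint_block=90.0,
                           kp=0.5, ki=0.05, dt_min=1.0,
                           min_rate=0.0, max_rate=30.0):
    """Return an updated mivacurium infusion rate (microg/kg/min).

    t1_measured : current T1% from electromyography (100% = no block)
    state       : dict carrying the integral term between calls
    """
    # Target T1% corresponding to the requested degree of block (90% block -> T1% = 10)
    target_t1 = 100.0 - setpoint_block
    error = t1_measured - target_t1          # positive error -> block too shallow
    state["integral"] = state.get("integral", 0.0) + error * dt_min
    rate = kp * error + ki * state["integral"]
    return max(min_rate, min(max_rate, rate))

state = {}
rate = pi_infusion_controller(t1_measured=15.0, state=state)  # e.g. 15% twitch height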

Relevance:

90.00%

Publisher:

Abstract:

The problem of re-sampling spatially distributed data organized into regular or irregular grids to finer or coarser resolution is a common task in data processing. This procedure is known as 'gridding' or 're-binning'. Depending on the quantity the data represent, the gridding algorithm has to meet different requirements. For example, histogrammed physical quantities such as mass or energy have to be re-binned in order to conserve the overall integral. Moreover, if the quantity is positive definite, negative sampling values should be avoided. The gridding process requires a re-distribution of the original data set to a user-requested grid according to a distribution function. The distribution function can be determined from the given data by interpolation methods. In general, accurate interpolation with respect to multiple boundary conditions of heavily fluctuating data requires polynomial interpolation functions of second or even higher order. However, this may result in unrealistic deviations (overshoots or undershoots) of the interpolation function from the data. Accordingly, the re-sampled data may overestimate or underestimate the given data by a significant amount. The gridding algorithm presented in this work was developed to overcome these problems. Instead of a straightforward interpolation of the given data using high-order polynomials, a parametrized Hermitian interpolation curve is used to approximate the integrated data set. A single parameter allows the user to control the behavior of the interpolation function, i.e. the amount of overshoot and undershoot. Furthermore, it is shown how the algorithm can be extended to multidimensional grids. The algorithm was compared to commonly used gridding algorithms based on linear and cubic interpolation functions. It is shown that such interpolation functions may overestimate or underestimate the source data by about 10-20%, while the new algorithm can be tuned to reduce these interpolation errors significantly. The accuracy of the new algorithm was tested on a series of x-ray CT images (head and neck, lung, pelvis). The new algorithm significantly improves the accuracy of the sampled images in terms of the mean square error and the quality index introduced by Wang and Bovik (2002 IEEE Signal Process. Lett. 9 81-4).
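The conservation requirement mentioned above can be illustrated with a simple overlap-weighted re-binning in one dimension. This is only a sketch assuming piecewise-constant source bins; the paper's own algorithm instead approximates the integrated data with a parametrized Hermitian interpolation curve.

# Minimal sketch of integral-conserving re-binning in 1-D, assuming
# piecewise-constant data on the source bins. Shown only to illustrate the
# conservation requirement, not the Hermitian scheme of the paper.

import numpy as np

def rebin_conservative(src_edges, src_values, dst_edges):
    """Re-bin histogrammed values (e.g. mass per bin) onto new bin edges,
    conserving the total integral."""
    dst_values = np.zeros(len(dst_edges) - 1)
    for i in range(len(src_edges) - 1):
        for j in range(len(dst_edges) - 1):
            # Overlap of source bin i with destination bin j
            overlap = (min(src_edges[i + 1], dst_edges[j + 1])
                       - max(src_edges[i], dst_edges[j]))
            if overlap > 0:
                frac = overlap / (src_edges[i + 1] - src_edges[i])
                dst_values[j] += src_values[i] * frac
    return dst_values

src_edges = np.array([0.0, 1.0, 2.0, 3.0])
src_values = np.array([4.0, 2.0, 1.0])           # total = 7
dst_values = rebin_conservative(src_edges, src_values, np.linspace(0.0, 3.0, 7))
assert np.isclose(dst_values.sum(), src_values.sum())  # integral is conserved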

Relevance:

90.00%

Publisher:

Abstract:

INTRODUCTION: A multi-centre study was conducted during 2005 by means of a questionnaire posted on the Italian Society of Emergency Medicine (SIMEU) web page. Our intention was to carry out an organisational and functional analysis of Italian Emergency Departments (ED) in order to identify some macro-indicators of the activities performed. Participation was good: 69 ED (3,285,440 admissions to emergency services) responded to the questionnaire. METHODS: The study was based on 18 questions: 3 regarding the personnel of the ED, 2 regarding organisational and functional aspects, 5 on the activity of the ED, 7 on triage and 1 on the assessment of the quality perceived by the users of the ED. RESULTS AND CONCLUSION: The replies revealed that 91.30% of the ED were equipped with data-processing software, which, in 96.83% of cases, tracked the entire itinerary of the patient. About 48,000 patients/year used each ED: 76.72% were discharged and 18.31% were hospitalised. Observation Units were active in 81.16% of the ED examined. Triage programmes were in place in 92.75% of ED: in 75.81% of these, triage was performed throughout the entire itinerary of the patient; in 16.13% it was only symptom-based, and in 8.06% it was performed only on call. Of the patients arriving at the ED, 24.19% were assigned a non-urgent triage code, 60.01% an urgent code, 14.30% an emergent code and 1.49% a life-threatening code. Waiting times were 52.39 min for non-urgent patients, 40.26 min for urgent, 12.08 min for emergent, and 1.19 min for life-threatening patients.

Relevance:

90.00%

Publisher:

Abstract:

Applying location-focused data protection law within a location-agnostic cloud computing framework is fraught with difficulties. While the Proposed EU Data Protection Regulation introduces many changes to the current data protection framework, the complexities of data processing in the cloud involve multiple layers of actors and intermediaries that have not been properly addressed. This leaves gaps in the Regulation when it is analyzed in cloud scenarios. This paper gives a brief overview of the provisions of the Regulation that will have an impact on cloud transactions and addresses the missing links. It is hoped that these loopholes will be reconsidered before the final version of the law is passed, in order to avoid unintended consequences.

Relevance:

90.00%

Publisher:

Abstract:

A joint reprocessing of GPS, GLONASS and SLR observations has been carried out at TU Dresden, TU Munich, AIUB and ETH Zurich. Common a priori models were applied in the processing of all observation types to ensure both consistent parameter estimates and a rigorous combination of microwave and optical measurements. Based on these reprocessing results, we evaluate the impact of adding GLONASS observations to standard GPS data processing. In particular, changes in station position time series and day-boundary overlaps of consecutive satellite arcs are analyzed. In addition, the GNSS orbits derived from microwave measurements are validated using independent SLR range measurements. Our SLR residuals indicate a significant improvement compared to previous results. Furthermore, we evaluate the performance of our high-rate (30 s) combined GNSS satellite clocks and discuss the associated zero-difference phase residuals.
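One of the metrics mentioned above, the day-boundary overlap of consecutive daily satellite arcs, can be illustrated by comparing the positions from two adjacent daily solutions at their common epoch. The data structures and numbers in this sketch are hypothetical and do not reflect the actual processing software.

# Hedged sketch of a day-boundary overlap check: positions of the same
# satellite from the arcs of day n and day n+1 are compared at the common
# midnight epoch. Dictionary layout and values are illustrative only.

import numpy as np

def day_boundary_overlap(arc_day_n, arc_day_n1):
    """Return the 3-D position difference (metres) at the common boundary epoch.

    arc_day_n, arc_day_n1 : dicts mapping epoch (seconds of day) to
    satellite position vectors [x, y, z] in metres.
    """
    common = sorted(set(arc_day_n) & set(arc_day_n1))
    if not common:
        raise ValueError("arcs share no common epoch at the day boundary")
    t = common[-1]
    return np.linalg.norm(np.asarray(arc_day_n[t]) - np.asarray(arc_day_n1[t]))

# Example with synthetic positions around midnight (values in metres)
arc_a = {86400.0: [26559000.000, 1000.000, 2000.000]}
arc_b = {86400.0: [26559000.015, 1000.010, 2000.005]}
print(f"day-boundary discontinuity: {day_boundary_overlap(arc_a, arc_b) * 100:.1f} cm")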

Relevance:

90.00%

Publisher:

Abstract:

Many observed time series of the global radiosonde or PILOT networks exist as fragments distributed over different archives. Identifying and merging these fragments can enhance their value for studies of the three-dimensional spatial structure of climate change. The Comprehensive Historical Upper-Air Network (CHUAN version 1.7), which was substantially extended in 2013, and the Integrated Global Radiosonde Archive (IGRA) are the most important collections of upper-air measurements taken before 1958. CHUAN (tracked) balloon data start in 1900, with higher numbers from the late 1920s onward, whereas IGRA data start in 1937. However, a substantial fraction of those measurements were not taken at synoptic times (preferably 00:00 or 12:00 GMT) and were reported on altitude levels instead of standard pressure levels. To make them comparable with more recent data, the records have been brought to synoptic times and standard pressure levels using state-of-the-art interpolation techniques, employing geopotential information from the National Oceanic and Atmospheric Administration (NOAA) 20th Century Reanalysis (NOAA 20CR). From 1958 onward, the European Re-Analysis archives (ERA-40 and ERA-Interim) available at the European Centre for Medium-Range Weather Forecasts (ECMWF) are the main data sources. These are easier to use, but pilot data still have to be interpolated to standard pressure levels. Fractions of the same records distributed over different archives have been merged where necessary, taking care that the data remain traceable back to their original sources. Where possible, station IDs assigned by the World Meteorological Organization (WMO) have been allocated to the station records. Records that have never been identified by a WMO ID have been assigned a local ID above 100 000. The merged data set contains 37 wind records longer than 70 years and 139 temperature records longer than 60 years. It can be seen as a useful basis for further data processing steps, most notably homogenization and gridding, after which it should be a valuable resource for climatological studies. Homogeneity adjustments for wind using the NOAA 20CR as a reference are described in Ramella Pralungo and Haimberger (2014). Reliable homogeneity adjustments for temperature beyond 1958 using a surface-data-only reanalysis such as the NOAA 20CR as a reference have yet to be created. All archives and metadata files are available in ASCII and netCDF format in the PANGAEA archive.
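A minimal sketch of the vertical part of this conversion is shown below: values reported on arbitrary levels are interpolated linearly in log-pressure onto standard pressure levels. It only illustrates the idea; the actual processing additionally uses NOAA 20CR geopotential information and interpolation to synoptic times, and the function name and level set here are assumptions.

# Minimal sketch: bring an upper-air profile reported on arbitrary pressure
# levels onto standard pressure levels by linear interpolation in log-pressure.
# Levels outside the observed range are left as NaN.

import numpy as np

STANDARD_LEVELS_HPA = np.array([850.0, 700.0, 500.0, 300.0, 200.0])

def to_standard_levels(p_obs_hpa, value_obs):
    """Interpolate an observed profile (e.g. temperature or a wind component)
    to standard pressure levels."""
    order = np.argsort(p_obs_hpa)[::-1]                 # sort from high to low pressure
    logp_obs = np.log(np.asarray(p_obs_hpa, dtype=float)[order])
    vals = np.asarray(value_obs, dtype=float)[order]
    logp_std = np.log(STANDARD_LEVELS_HPA)
    # np.interp needs increasing abscissae, hence the reversal
    return np.interp(logp_std, logp_obs[::-1], vals[::-1], left=np.nan, right=np.nan)

temp_on_std = to_standard_levels([1000.0, 820.0, 640.0, 480.0], [15.0, 8.0, -2.0, -15.0])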

Relevance:

90.00%

Publisher:

Abstract:

BACKGROUND: To investigate whether non-rigid image registration reduces motion artifacts in triggered and non-triggered diffusion tensor imaging (DTI) of native kidneys. A secondary aim was to determine whether improvements through registration allow respiratory triggering to be omitted. METHODS: Twenty volunteers underwent coronal DTI of the kidneys with nine b-values (10-700 s/mm2) at 3 Tesla. Image registration was performed using a multimodal non-rigid registration algorithm. Data processing yielded the apparent diffusion coefficient (ADC), the contribution of perfusion (FP), and the fractional anisotropy (FA). To compare data stability, the root mean square error (RMSE) of the fitting and the standard deviations within the regions of interest (SDROI) were evaluated. RESULTS: RMSEs decreased significantly after registration for triggered as well as non-triggered scans (P < 0.05). SDROI for ADC, FA, and FP were significantly lower after registration in both medulla and cortex of triggered scans (P < 0.01). Similarly, the SDROI of FA and FP decreased significantly in non-triggered scans after registration (P < 0.05). RMSEs were significantly lower in triggered than in non-triggered scans, both with and without registration (P < 0.05). CONCLUSION: Respiratory motion correction by registration of individual echo-planar images leads to clearly reduced signal variations in renal DTI for both triggered and, particularly, non-triggered scans. Secondarily, the results suggest that respiratory triggering still seems advantageous. J. Magn. Reson. Imaging 2014. (c) 2014 Wiley Periodicals, Inc.
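As a simplified illustration of the fitting whose RMSE serves above as a stability measure, the sketch below performs a voxel-wise mono-exponential ADC fit over the acquired b-values. The study's model additionally yields the perfusion contribution FP and the tensor-derived FA, which are not reproduced here; all numbers are synthetic.

# Simplified illustration: voxel-wise mono-exponential fit S(b) = S0 * exp(-b * ADC)
# over the acquired b-values, with the RMSE of the fit as a stability measure.

import numpy as np

def fit_adc(bvalues, signals):
    """Return (ADC, S0, RMSE) from a log-linear least-squares fit."""
    b = np.asarray(bvalues, dtype=float)
    s = np.asarray(signals, dtype=float)
    slope, intercept = np.polyfit(b, np.log(s), 1)    # log S = log S0 - b * ADC
    adc, s0 = -slope, np.exp(intercept)
    rmse = np.sqrt(np.mean((s0 * np.exp(-b * adc) - s) ** 2))
    return adc, s0, rmse

bvals = [10, 50, 100, 200, 300, 400, 500, 600, 700]   # s/mm^2
sig = [980, 940, 900, 820, 750, 690, 630, 580, 530]   # arbitrary units (synthetic)
adc, s0, rmse = fit_adc(bvals, sig)                   # ADC in mm^2/s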

Relevance:

90.00%

Publisher:

Abstract:

Several techniques have been proposed to exploit GNSS-derived kinematic orbit information for the determination of long-wavelength gravity field features. These methods include the (i) celestial mechanics approach, (ii) short-arc approach, (iii) point-wise acceleration approach, (iv) averaged acceleration approach, and (v) energy balance approach. Although there is a general consensus that, except for energy balance, these methods theoretically provide equivalent results, real-data gravity field solutions from kinematic orbit analysis have never been evaluated against each other within a consistent data processing environment. This contribution strives to close this gap. Target consistency criteria for our study are the input data sets, period of investigation, spherical harmonic resolution, a priori gravity field information, etc. We compare GOCE gravity field estimates based on the aforementioned approaches as computed at the Graz University of Technology, the University of Bern, the University of Stuttgart/Austrian Academy of Sciences, and by RHEA Systems for the European Space Agency. The involved research groups complied with most of the consistency criteria; deviations occur only where full compliance was technically unfeasible. Performance measures include formal errors, differences with respect to a state-of-the-art GRACE gravity field, (cumulative) geoid height differences, and SLR residuals from precise orbit determination of geodetic satellites. We found that for approaches (i) to (iv), the cumulative geoid height differences at spherical harmonic degree 100 differ by only ≈10%; in the absence of the polar data gap, SLR residuals agree at the ≈96% level. From our investigations, we conclude that the real-data analysis results are in agreement with the theoretical considerations concerning the (relative) performance of the different approaches.
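The cumulative geoid height difference used as a performance measure above can be computed from the spherical harmonic coefficient differences between two solutions, as in the hedged sketch below; the array layout, names and the reference radius value are assumptions for illustration only.

# Hedged sketch of the cumulative geoid height difference: with fully
# normalized coefficient differences dC[n, m], dS[n, m] between two solutions,
# the cumulative difference up to degree N is R * sqrt(sum of dC^2 + dS^2).

import numpy as np

R_EARTH_M = 6378136.3   # reference radius in metres (value is an assumption)

def cumulative_geoid_difference(dC, dS, n_max):
    """dC, dS: 2-D arrays indexed [degree, order] of normalized coefficient
    differences; returns the cumulative geoid height difference in metres."""
    total = 0.0
    for n in range(2, n_max + 1):
        for m in range(0, n + 1):
            total += dC[n, m] ** 2 + dS[n, m] ** 2
    return R_EARTH_M * np.sqrt(total)

# Example with synthetic (random) coefficient differences up to degree 100
rng = np.random.default_rng(0)
dC = rng.normal(scale=1e-11, size=(101, 101))
dS = rng.normal(scale=1e-11, size=(101, 101))
print(f"cumulative geoid height difference: {cumulative_geoid_difference(dC, dS, 100):.4f} m")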

Relevance:

90.00%

Publisher:

Abstract:

The Gravity field and steady-state Ocean Circulation Explorer (GOCE) was the first Earth Explorer core mission of the European Space Agency. It was launched on March 17, 2009 into a Sun-synchronous dusk-dawn orbit and re-entered the Earth's atmosphere on November 11, 2013. During the measurement phases the satellite altitude was between 225 and 255 km. The European GOCE Gravity Consortium is responsible for the Level 1b to Level 2 data processing in the frame of the GOCE High-level Processing Facility (HPF). The Precise Science Orbit (PSO) is one Level 2 product; it was produced under the responsibility of the Astronomical Institute of the University of Bern within the HPF and was delivered continuously during the entire mission. Regular checks guaranteed a high consistency and quality of the orbits. A correlation between solar activity, GPS data availability and orbit quality was found, from which the accuracy of the kinematic orbits suffers most. Improvements in modeling the range corrections at the retro-reflector array for the satellite laser ranging (SLR) measurements were made and implemented in the independent SLR validation of the GOCE PSO products. The SLR validation indicates an orbit accuracy of 2.42 cm for the kinematic and 1.84 cm for the reduced-dynamic orbits over the entire mission. The common-mode accelerations from the GOCE gradiometer were not used for the official PSO product, but in addition to the operational HPF work a study was performed to investigate to what extent common-mode accelerations improve the reduced-dynamic orbit determination results. The accelerometer data may be used to derive realistic constraints for the empirical accelerations estimated in the reduced-dynamic orbit determination, which already improves the orbit quality. Beyond that, the accelerometer data may further improve the orbit quality if realistic constraints and state-of-the-art background models, such as gravity field and ocean tide models, are used for the reduced-dynamic orbit determination.
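The basic quantity behind the SLR validation mentioned above is the residual between a measured laser range and the range computed from the orbit product and the station coordinates. The sketch below shows only this geometric core; retro-reflector offset, tropospheric and other corrections are omitted, and all coordinates are synthetic.

# Hedged sketch of an SLR range residual (observed minus computed).
# Real validation applies additional corrections that are omitted here;
# variable names and values are illustrative.

import numpy as np

def slr_residual(station_ecef_m, satellite_ecef_m, measured_range_m):
    """Return observed-minus-computed range in metres."""
    computed = np.linalg.norm(np.asarray(satellite_ecef_m) - np.asarray(station_ecef_m))
    return measured_range_m - computed

# Example with synthetic coordinates (metres); real inputs come from the PSO
# product and the SLR normal points.
residual = slr_residual([4331300.0, 567500.0, 4633100.0],
                        [5327400.0, 1254300.0, 3512200.0],
                        1649341.2)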

Relevance:

90.00%

Publisher:

Abstract:

As the amount of space debris in the geostationary ring increases, it becomes mandatory for any satellite operator to avoid collisions. Space debris in geosynchronous orbits may be observed with optical telescopes. Unlike radar, which requires very large dishes and transmission powers for sensing high-altitude objects, optical observations do not depend on active illumination from the ground and may be performed with notably smaller apertures. The minimum detectable object size depends on the aperture of the telescope, the sky background and the exposure time. With a telescope of 50 cm aperture, objects down to approximately 50 cm may be observed. This size is regarded as a threshold for the identification of hazardous objects and the prevention of potentially catastrophic collisions in geostationary orbits. In collaboration with the Astronomical Institute of the University of Bern (AIUB), the German Space Operations Center (GSOC) is building a small-aperture telescope to demonstrate the feasibility of optical surveillance of the geostationary ring. The telescope will be located in the southern hemisphere and complement an existing telescope in the northern hemisphere already operated by AIUB. Together, these two telescopes provide optimum coverage of European GEO satellites and enable continuous monitoring independent of seasonal limitations. The telescope will be operated completely automatically; the automated operations are to be demonstrated across the full range of activities, including scheduling of observations, telescope and camera control, and data processing.

Relevance:

90.00%

Publisher:

Abstract:

Space debris in geostationary orbits may be detected with optical telescopes when the objects are illuminated by the Sun. The advantage over radar lies in the illumination: radar must illuminate the objects itself, so its detection sensitivity decreases with the fourth power of the distance. The German Space Operations Center (GSOC), together with the Astronomical Institute of the University of Bern (AIUB), is setting up a telescope system called SMARTnet to demonstrate the capability of performing geostationary surveillance. The system will consist of two telescopes on one mount: a smaller telescope with an aperture of 20 cm will serve for fast surveys, while the larger one, with an aperture of 50 cm, will be used for follow-up observations. The telescopes will be operated by GSOC from Oberpfaffenhofen via the internal monitoring and control system SMARTnetMAC. The observation plan will be generated by SMARTnetPlanning seven days in advance using an optimized planning scheduler that takes into account downtime such as cloudy nights, object priorities, etc. In each image taken, stars are identified and everything that is not a star is treated as a possible object. If the same object can be identified in multiple images within a short time span, the resulting trace is called a tracklet. In the next step, several tracklets are correlated to identify individual objects, and ephemeris data for these objects are generated and catalogued. This will allow services such as collision avoidance to ensure safe operations for GSOC's satellites. The complete data processing chain is handled by BACARDI, the backbone catalogue of relational debris information, and is presented as a poster.
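The tracklet-building step described above can be sketched generically: detections that do not match catalogue stars are chained together when they occur within a short time window and move at a plausible, consistent angular rate. The code below is only such a generic sketch with made-up thresholds; it is not the SMARTnet/BACARDI implementation.

# Generic sketch of tracklet building from non-stellar detections.
# Thresholds and the linking strategy are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class Detection:
    t: float      # seconds since start of night
    ra: float     # degrees
    dec: float    # degrees

def build_tracklets(detections, max_gap_s=120.0, max_rate_deg_s=0.02):
    """Greedily chain time-ordered detections into tracklets."""
    tracklets = []
    for det in sorted(detections, key=lambda d: d.t):
        for tr in tracklets:
            last = tr[-1]
            dt = det.t - last.t
            if 0.0 < dt <= max_gap_s:
                # small-angle approximation, ignoring cos(dec)
                rate = ((det.ra - last.ra) ** 2 + (det.dec - last.dec) ** 2) ** 0.5 / dt
                if rate <= max_rate_deg_s:
                    tr.append(det)
                    break
        else:
            tracklets.append([det])            # start a new tracklet
    return [tr for tr in tracklets if len(tr) >= 2]

obs = [Detection(0.0, 150.000, -5.000), Detection(60.0, 150.010, -5.001),
       Detection(120.0, 150.020, -5.002)]
print(build_tracklets(obs))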

Relevance:

90.00%

Publisher:

Abstract:

A wide variety of spatial data collection efforts are ongoing throughout local, state and federal agencies, private firms and non-profit organizations. Each effort is established for a different purpose, but organizations and individuals often collect and maintain the same or similar information. The United States federal government has undertaken many initiatives, such as the National Spatial Data Infrastructure, the National Map and Geospatial One-Stop, to reduce duplicative spatial data collection and promote the coordinated use, sharing, and dissemination of spatial data nationwide. A key premise in most of these initiatives is that no national government will be able to gather and maintain more than a small percentage of the geographic data that users want and desire. Thus, national initiatives typically depend on the cooperation of those already gathering spatial data and those using GIS to meet specific needs to help construct and maintain these spatial data infrastructures and geo-libraries for their nations (Onsrud 2001). Some of the impediments to widespread spatial data sharing are well known from directly asking GIS data producers why they are not currently involved in creating datasets in common or compatible formats, documenting their datasets in a standardized metadata format, or making their datasets more readily available to others through Data Clearinghouses or geo-libraries. The research described in this thesis addresses the impediments to wide-scale spatial data sharing faced by GIS data producers and explores a new conceptual data-sharing approach, the Public Commons for Geospatial Data, that supports user-friendly metadata creation, open access licenses, archival services and documentation of the parent lineage of the contributors and value-adders of digital spatial data sets.

Relevance:

90.00%

Publisher:

Abstract:

Clinical Research Data Quality Literature Review and Pooled Analysis

We present a literature review and secondary analysis of data accuracy in clinical research and related secondary data uses. A total of 93 papers meeting our inclusion criteria were categorized according to the data processing methods. Quantitative data accuracy information was abstracted from the articles and pooled. Our analysis demonstrates that the accuracy associated with data processing methods varies widely, with error rates ranging from 2 errors per 10,000 fields to 5019 errors per 10,000 fields. Medical record abstraction was associated with the highest error rates (70-5019 errors per 10,000 fields). Data entered and processed at healthcare facilities had error rates comparable to data processed at central data processing centers. Error rates for data processed with single entry in the presence of on-screen checks were comparable to those for double-entered data. While data processing and cleaning methods may explain a significant amount of the variability in data accuracy, additional factors not resolvable here likely exist.

Defining Data Quality for Clinical Research: A Concept Analysis

Despite notable previous attempts by experts to define data quality, the concept remains ambiguous and subject to the vagaries of natural language. This lack of clarity continues to hamper research related to data quality issues. We present a formal concept analysis of data quality, which builds on and synthesizes previously published work. We further posit that discipline-level specificity may be required to achieve the desired definitional clarity. To this end, we combine work from the clinical research domain with findings from the general data quality literature to produce a discipline-specific definition and operationalization of data quality in clinical research. While the results are helpful to clinical research, the methodology of concept analysis may be useful in other fields to clarify data quality attributes and to achieve operational definitions.

Medical Record Abstractor’s Perceptions of Factors Impacting the Accuracy of Abstracted Data

Medical record abstraction (MRA) is known to be a significant source of data errors in secondary data uses. Factors impacting the accuracy of abstracted data are not reported consistently in the literature. Two Delphi processes were conducted with experienced medical record abstractors to assess abstractors' perceptions of these factors. The Delphi process identified 9 factors not found in the literature and differed from the literature on 5 of the factors in the top 25%. The Delphi results refuted seven factors reported in the literature as impacting the quality of abstracted data. The results provide insight into, and indicate content validity of, a significant number of the factors reported in the literature. Further, the results indicate general consistency between the perceptions of clinical research medical record abstractors and those of registry and quality improvement abstractors.

Distributed Cognition Artifacts on Clinical Research Data Collection Forms

Medical record abstraction, a primary mode of data collection in secondary data use, is associated with high error rates. Distributed cognition in medical record abstraction has not been studied as a possible explanation for abstraction errors. We employed the theory of distributed representation and representational analysis to systematically evaluate cognitive demands in medical record abstraction and the extent of external cognitive support employed in a sample of clinical research data collection forms. We show that the cognitive load required for abstraction in 61% of the sampled data elements was high, exceedingly so in 9%. Further, the data collection forms did not support external cognition for the most complex data elements. High working memory demands are a possible explanation for the association of data errors with data elements requiring abstractor interpretation, comparison, mapping or calculation. The representational analysis used here can be applied to identify data elements with high cognitive demands.
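The pooled error rates quoted above (errors per 10,000 fields) can be illustrated with a simple weighting by the number of fields inspected in each study; this sketch is only a generic illustration and not necessarily the pooling method used in the analysis.

# Illustrative sketch of pooling per-study error counts into an overall rate
# expressed as errors per 10,000 fields. Example values are hypothetical.

def pooled_error_rate(studies):
    """studies: iterable of (errors_found, fields_inspected) tuples.
    Returns pooled errors per 10,000 fields."""
    total_errors = sum(e for e, _ in studies)
    total_fields = sum(f for _, f in studies)
    return 10_000 * total_errors / total_fields

print(pooled_error_rate([(12, 25_000), (340, 18_000), (7, 60_000)]))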

Relevance:

90.00%

Publisher:

Abstract:

The curve number (CN) methodology is the most widely used approach for transforming total precipitation into effective precipitation. It is therefore a valuable tool for hydrological studies in watersheds, particularly when long and reliable records are lacking. The methodology requires knowledge of the soil type and land use of the watershed under study, together with pluviograph records. In the present work, LANDSAT image processing was applied to map vegetation and land use in the Arroyo Pillahuinco Grande basin (38° S, 61° 15' W), located in the La Ventana hill system in the southwest of Buenos Aires province, Argentina. The analysis of their interrelation yielded values of CN and of the runoff coefficient (CE). Digital processing of the georeferenced raster database was carried out with geographic information system tools (Idrisi Kilimanjaro). A multiple regression analysis of the variables yielded an R2 that explains 89.77% of the variability of CE (α < 0.01). The results are presented as a diagnosis and a CN zoning, in which the greatest influence on runoff is associated with the vegetation cover and land use variables.
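For reference, the core CN relation that converts total storm precipitation into effective precipitation (direct runoff) is sketched below in its standard SCS form, assuming the conventional initial-abstraction ratio of 0.2; the study's own parameter choices may differ.

# Minimal sketch of the standard SCS curve number (CN) relation, in millimetres.
# The initial-abstraction ratio of 0.2 is the conventional default assumption.

def effective_precipitation_mm(p_mm, cn, ia_ratio=0.2):
    """Return direct runoff Q (mm) for total precipitation p_mm and curve number cn."""
    s = 25400.0 / cn - 254.0           # potential maximum retention (mm)
    ia = ia_ratio * s                  # initial abstraction (mm)
    if p_mm <= ia:
        return 0.0
    return (p_mm - ia) ** 2 / (p_mm - ia + s)

print(effective_precipitation_mm(p_mm=80.0, cn=75))   # about 27 mm of runoff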