965 results for Diagnostic imaging - Data processing


Relevance: 100.00%

Abstract:

Owing to the demand for genuine mozzarella, some 330 water buffaloes are slaughtered every year in Switzerland, although a stunning procedure that meets animal welfare and occupational safety requirements has yet to be established. To provide a basis for improvements, we measured anatomical features of water buffaloes and cattle and assessed brain lesions after stunning with captive bolts or handguns using diagnostic imaging. In water buffaloes and cattle, the median distance from the frontal skin surface to the inner bone table was 74.0 mm (56.0–100.0 mm) vs 36.6 mm (29.3–44.3 mm), and from skin to thalamus 144.8 mm (117.1–172.0 mm) vs 102.0 mm (101.0–121.0 mm), respectively. Consequently, customary captive bolt stunners may be inadequate. Free bullets are potentially suitable for stunning buffaloes but involve occupational safety hazards. The results of the present study will be used to develop a device that allows effective and safe stunning of water buffaloes.

Relevance: 100.00%

Abstract:

A wide variety of spatial data collection efforts are ongoing throughout local, state and federal agencies, private firms and non-profit organizations. Each effort is established for a different purpose, but organizations and individuals often collect and maintain the same or similar information. The United States federal government has undertaken many initiatives, such as the National Spatial Data Infrastructure, the National Map and Geospatial One-Stop, to reduce duplicative spatial data collection and promote the coordinated use, sharing and dissemination of spatial data nationwide. A key premise in most of these initiatives is that no national government will be able to gather and maintain more than a small percentage of the geographic data that users want. Thus, national initiatives typically depend on the cooperation of those already gathering spatial data and those using GIS to meet specific needs to help construct and maintain these spatial data infrastructures and geo-libraries for their nations (Onsrud 2001). Some of the impediments to widespread spatial data sharing are well known from directly asking GIS data producers why they are not currently involved in creating datasets in common or compatible formats, documenting their datasets in a standardized metadata format, or making their datasets more readily available to others through data clearinghouses or geo-libraries. The research described in this thesis addresses the impediments to wide-scale spatial data sharing faced by GIS data producers and explores a new conceptual data-sharing approach, the Public Commons for Geospatial Data, that supports user-friendly metadata creation, open access licenses, archival services and documentation of the parent lineage of the contributors and value-adders of digital spatial data sets.
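One of the data-sharing supports named above is user-friendly metadata creation. As an illustrative sketch only — the field names below are loosely patterned on common geospatial metadata elements and are not any real standard's schema, and the record contents are hypothetical — a minimal metadata capture step might look like:

```python
# Minimal sketch of user-friendly metadata capture for a shared spatial
# dataset. Field names are illustrative, loosely patterned on common
# geospatial metadata elements; they are NOT a real standard's schema.

REQUIRED_FIELDS = {"title", "originator", "abstract", "license", "lineage"}

def make_record(**fields):
    """Return a metadata record, refusing records missing core fields."""
    missing = REQUIRED_FIELDS - fields.keys()
    if missing:
        raise ValueError(f"missing required metadata: {sorted(missing)}")
    return dict(fields)

record = make_record(
    title="County road centerlines",
    originator="Example County GIS Office",             # hypothetical contributor
    abstract="Road centerlines digitized from 2003 orthophotos.",
    license="open access",                              # commons-style license label
    lineage=["2003 orthophotos", "manual digitizing"],  # parent lineage
)
```

Enforcing a small required core at submission time is one way a public commons could make standardized documentation the path of least resistance rather than an afterthought.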

Relevance: 100.00%

Abstract:

Every x-ray attenuation curve inherently contains all the information necessary to extract the complete energy spectrum of a beam. To date, attempts to obtain accurate spectral information from attenuation data have been inadequate. This investigation presents a mathematical pair model, grounded in physical reality by the Laplace transformation, to describe the attenuation of a photon beam and the corresponding bremsstrahlung spectral distribution. In addition, the Laplace model has been mathematically extended to include characteristic radiation in a physically meaningful way. A method to determine the fraction of characteristic radiation in any diagnostic x-ray beam was introduced for use with the extended model. This work examined the reconstructive capability of the Laplace pair model for photon beams ranging from 50 kVp to 25 MV, using both theoretical and experimental methods. In the diagnostic region, excellent agreement between a wide variety of experimental spectra and those reconstructed with the Laplace model was obtained when the atomic composition of the attenuators was accurately known. The model successfully reproduced a 2 MV spectrum but had difficulty accurately reconstructing orthovoltage and 6 MV spectra. The 25 MV spectrum was successfully reconstructed, although poor agreement with the spectrum obtained by Levy was found. The analysis of errors, performed with diagnostic energy data, demonstrated the relative insensitivity of the model to typical experimental errors and confirmed that the model can be used to derive accurate spectral information from experimental attenuation data.
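The pair idea can be illustrated numerically: with attenuator thickness playing the role of the Laplace variable, the transmission curve is the Laplace transform of the spectrum expressed in attenuation-coefficient space. The gamma-shaped spectrum below is a textbook analytic transform pair chosen for illustration, not the dissertation's fitted model:

```python
import math

# T(t): transmission after thickness t, computed as the (discretized)
# Laplace transform of a spectrum expressed over attenuation coefficients.
def transmission(weights, mus, t):
    """T(t) = sum_i w_i * exp(-mu_i * t)."""
    return sum(w * math.exp(-mu * t) for w, mu in zip(weights, mus))

# Assumed gamma-shaped spectrum (illustrative Laplace pair, not the
# dissertation's model): weights ~ mu**(a-1) * exp(-b*mu).
a, b = 3.0, 2.0
mus = [i * 0.01 for i in range(1, 3000)]   # grid of mu values (1/cm)
raw = [mu ** (a - 1) * math.exp(-b * mu) for mu in mus]
total = sum(raw)
weights = [r / total for r in raw]         # normalized spectral weights

t = 1.0                                    # attenuator thickness (cm)
numeric = transmission(weights, mus, t)
closed = (b / (b + t)) ** a                # exact transform of the gamma shape
```

The numerical sum reproduces the closed-form attenuation curve; inverting the process — fitting the curve's parameters to measured attenuation data and reading off the implied spectrum — is the reconstruction step the pair model enables.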

Relevance: 100.00%

Abstract:

Nuclear imaging is used for non-invasive detection, staging and therapeutic monitoring of tumors through the use of radiolabeled probes. Generally, these probes provide passive, non-specific information about the target; there is therefore a significant need for actively targeted radioactive probes that provide functional information about the site of interest. This study examined endostatin, an endogenous inhibitor of tumor angiogenesis with affinity for tumor vasculature. The major objective was to develop radiolabeled analogues of endostatin through novel chemical and radiochemical syntheses, and to determine their usefulness for tumor imaging using in vitro and in vivo models of vascular, mammary and prostate tumor cells. I hypothesized that this binding would allow a non-invasive approach to the detection of tumor angiogenesis, and that such detection could be used for therapeutic monitoring to determine the efficacy of anti-angiogenic therapy. The data showed that endostatin could be successfully conjugated to the bifunctional chelator ethylenedicysteine (EC) and radiolabeled with technetium-99m and gallium-68, providing a unique opportunity to use a single precursor for both nuclear imaging modalities: 99mTc for single photon emission computed tomography and 68Ga for positron emission tomography. Both radiolabeled analogues showed increased binding as a function of time in human umbilical vein endothelial cells and in mammary and prostate tumor cells. Binding could be blocked in a dose-dependent manner by unlabeled endostatin, implying the presence of endostatin receptors on both vascular and tumor cells. Animal biodistribution studies demonstrated that both analogues were stable in vivo, showed typical reticuloendothelial and renal excretion, and produced favorable absorbed organ doses for application in humans. The imaging data provide evidence that the compounds quantitate tumor volumes with clinically useful tumor-to-nontumor ratios and can be used for treatment follow-up to depict changes occurring at the vascular and cellular levels. Two novel endostatin analogues were developed and demonstrated interaction with vascular and tumor cells. Both can be incorporated into existing nuclear imaging platforms, allowing for potential widespread clinical benefit, and can serve as a diagnostic tool for elucidating the mechanism of action of endostatin.

Relevance: 100.00%

Abstract:

Clinical Research Data Quality Literature Review and Pooled Analysis. We present a literature review and secondary analysis of data accuracy in clinical research and related secondary data uses. A total of 93 papers meeting our inclusion criteria were categorized according to the data processing methods used. Quantitative data accuracy information was abstracted from the articles and pooled. Our analysis demonstrates that the accuracy associated with data processing methods varies widely, with error rates ranging from 2 to 5,019 errors per 10,000 fields. Medical record abstraction was associated with the highest error rates (70–5,019 errors per 10,000 fields). Data entered and processed at healthcare facilities had error rates comparable to data processed at central data processing centers. Error rates for data processed with single entry in the presence of on-screen checks were comparable to those for double-entered data. While data processing and cleaning methods may explain a significant amount of the variability in data accuracy, additional factors not resolvable here likely exist.

Defining Data Quality for Clinical Research: A Concept Analysis. Despite notable previous attempts by experts to define data quality, the concept remains ambiguous and subject to the vagaries of natural language. This lack of clarity continues to hamper research related to data quality issues. We present a formal concept analysis of data quality, which builds on and synthesizes previously published work. We further posit that discipline-level specificity may be required to achieve the desired definitional clarity. To this end, we combine work from the clinical research domain with findings from the general data quality literature to produce a discipline-specific definition and operationalization of data quality in clinical research. While the results are helpful to clinical research, the methodology of concept analysis may be useful in other fields to clarify data quality attributes and to achieve operational definitions.

Medical Record Abstractors' Perceptions of Factors Impacting the Accuracy of Abstracted Data. Medical record abstraction (MRA) is known to be a significant source of data errors in secondary data uses, yet factors impacting the accuracy of abstracted data are not reported consistently in the literature. Two Delphi processes were conducted with experienced medical record abstractors to assess abstractors' perceptions of these factors. The Delphi process identified 9 factors not found in the literature and differed from the literature on 5 factors in the top 25%. The Delphi results refuted seven factors reported in the literature as impacting the quality of abstracted data. The results provide insight into, and indicate content validity of, a significant number of the factors reported in the literature. Further, the results indicate general consistency between the perceptions of clinical research medical record abstractors and those of registry and quality improvement abstractors.

Distributed Cognition Artifacts on Clinical Research Data Collection Forms. Medical record abstraction, a primary mode of data collection in secondary data use, is associated with high error rates. Distributed cognition in medical record abstraction has not been studied as a possible explanation for abstraction errors. We employed the theory of distributed representation and representational analysis to systematically evaluate cognitive demands in medical record abstraction and the extent of external cognitive support employed in a sample of clinical research data collection forms. We show that the cognitive load required for abstraction was high for 61% of the sampled data elements, and exceedingly so for 9%. Further, the data collection forms did not support external cognition for the most complex data elements. High working-memory demands are a possible explanation for the association of data errors with data elements requiring abstractor interpretation, comparison, mapping or calculation. The representational analysis used here can be applied to identify data elements with high cognitive demands.
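The pooled error-rate comparisons described above reduce to a simple normalization of study-level counts to errors per 10,000 fields. A minimal sketch, with made-up study counts rather than figures from the 93 reviewed papers:

```python
# Normalizing study-level counts to errors per 10,000 fields, the unit
# used in the pooled analysis. The study counts here are made up for
# illustration, not drawn from the 93 reviewed papers.

studies = [
    {"errors": 14, "fields_checked": 20_000},   # hypothetical study A
    {"errors": 250, "fields_checked": 5_000},   # hypothetical study B
]

def rate_per_10k(errors, fields):
    return 10_000 * errors / fields

per_study = [rate_per_10k(s["errors"], s["fields_checked"]) for s in studies]

# Pooling totals before normalizing weights each study by its denominator,
# so large studies dominate the pooled estimate.
pooled = rate_per_10k(sum(s["errors"] for s in studies),
                      sum(s["fields_checked"] for s in studies))
```

Note that the pooled rate is not the mean of the per-study rates; summing numerators and denominators first implicitly weights each study by the number of fields it checked.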

Relevance: 100.00%

Abstract:

The curve number (CN) methodology is the most widely used approach for transforming total precipitation into effective precipitation. It is therefore a valuable tool for hydrological studies of watersheds, above all where long, reliable records are lacking. The methodology requires knowledge of the soil type and land use of the basin under study, together with rain gauge records. In the present work, LANDSAT image processing was applied to map vegetation and land use in the Arroyo Pillahuinco Grande basin (38° S, 61° 15' W), located in the Sierra de la Ventana system in the southwest of Buenos Aires province, Argentina. Analysis of their interrelation yielded CN and runoff coefficient (CE) values. Digital processing of the georeferenced raster database was carried out with geographic information system tools (Idrisi Kilimanjaro). Multiple regression analysis of the variables produced an R² explaining 89.77% of the variability of CE (α < 0.01). The results are presented as a diagnosis and zonation of the CN, in which runoff is most strongly influenced by the vegetation cover and land use variables.
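The CN transformation of total to effective precipitation follows the standard SCS relations (S = 25400/CN − 254 in mm, Ia = 0.2S). A minimal sketch with an illustrative CN value, not one derived for the Pillahuinco Grande basin:

```python
# Standard SCS curve number transformation (units: mm). The CN and storm
# depth below are illustrative, not values derived for the basin.

def effective_precip(p_mm, cn):
    """Runoff depth Q from total precipitation p_mm for curve number cn."""
    s = 25400.0 / cn - 254.0        # potential maximum retention (mm)
    ia = 0.2 * s                    # initial abstraction
    if p_mm <= ia:
        return 0.0                  # all precipitation abstracted
    return (p_mm - ia) ** 2 / (p_mm - ia + s)

p, cn = 80.0, 75                    # hypothetical storm (mm) and land cover CN
q = effective_precip(p, cn)         # effective precipitation (mm)
ce = q / p                          # runoff coefficient
```

Mapping land cover and soil classes to CN values cell by cell, as done here from the LANDSAT classification, turns this point relation into a basin-wide zonation of runoff response.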

Relevance: 100.00%

Abstract:

Distribution, accumulation and diagenesis of surficial sediments in coastal and continental shelf systems follow complex chains of localized processes and form deposits of great spatial variability. Given the environmental and economic relevance of ocean margins, there is a growing need for innovative geophysical exploration methods that characterize seafloor sediments by more than acoustic properties. A newly conceptualized benthic profiling and data processing approach based on controlled source electromagnetic (CSEM) imaging permits coeval quantification of the magnetic susceptibility and the electric conductivity of shallow marine deposits. The two physical properties differ fundamentally insofar as magnetic susceptibility mostly assesses solid particle characteristics such as terrigenous or iron mineral content, redox state and contamination level, while electric conductivity primarily relates to the fluid-filled pore space and detects salinity, porosity and grain-size variations. We develop and validate a layered half-space inversion algorithm for submarine multifrequency CSEM with concentric sensor configuration. Guided by modeling results, we modified a commercial land CSEM sensor for submarine application and mounted it in a nonconductive, nonmagnetic bottom-towed sled. This benthic EM profiler, Neridis II, achieves 25 soundings per second at 3–4 knots over continuous profiles of up to a hundred kilometers. Magnetic susceptibility is determined from the 75 Hz in-phase response (90% of the signal originates from the top 50 cm), while electric conductivity is derived from the 5 kHz out-of-phase (quadrature) component (90% of the signal from the top 92 cm). Example survey data from the north-west Iberian margin underline the excellent sensitivity, functionality and robustness of the system in littoral (~0–50 m) and neritic (~50–300 m) environments. Susceptibility vs. porosity cross-plots successfully identify known lithofacies units and their transitions. All presently available data indicate an eminent potential of CSEM profiling for assessing the complex distribution of shallow marine surficial sediments and for revealing the climatic, hydrodynamic, diagenetic and anthropogenic factors governing their formation.
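The study's layered half-space inversion is beyond a short sketch, but the underlying idea that the quadrature response maps to conductivity can be illustrated with the textbook low-induction-number approximation for a small-loop EM sounder. The formula is a generic loop-loop relation, and the frequency, coil spacing and response value below are hypothetical, not Neridis II specifications:

```python
import math

# Textbook low-induction-number (LIN) relation for a small-loop EM
# sounder: the quadrature (out-of-phase) response ratio Q = Im(Hs/Hp)
# maps linearly to apparent conductivity. Frequency, coil spacing and
# response value are hypothetical, not Neridis II specifications.

MU0 = 4e-7 * math.pi                # vacuum permeability (H/m)

def apparent_conductivity(quad_ratio, freq_hz, spacing_m):
    """sigma_a = 4 * Q / (omega * mu0 * s**2), valid at low induction numbers."""
    omega = 2.0 * math.pi * freq_hz
    return 4.0 * quad_ratio / (omega * MU0 * spacing_m ** 2)

# e.g. a quadrature ratio of 1e-3 on a 5 kHz channel, 1 m coil spacing:
sigma = apparent_conductivity(quad_ratio=1.0e-3, freq_hz=5000.0, spacing_m=1.0)
```

In this linear regime the quadrature component is insensitive to susceptibility while the in-phase component carries it, which is the same separation of signals the profiler exploits across its frequency channels.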

Relevance: 100.00%

Abstract:

In recent years, profiling floats, which form the basis of the successful international Argo observatory, have also been considered as platforms for marine biogeochemical research. This study showcases the utility of floats as a novel tool for combined gas measurements of CO2 partial pressure (pCO2) and O2. The float prototypes were equipped with a small, submersible pCO2 sensor and an optode O2 sensor for high-resolution measurements in the surface ocean layer. Four consecutive deployments were carried out between November 2010 and June 2011 near the Cape Verde Ocean Observatory (CVOO) in the eastern tropical North Atlantic. The profiling float performed upcasts every 31 h while measuring pCO2, O2, salinity, temperature and hydrostatic pressure in the upper 200 m of the water column. To maintain accuracy, regular pCO2 sensor zeroings at depth and at the surface, as well as optode measurements in air, were performed for each profile. Through the application of data processing procedures (e.g., time-lag correction), the accuracy of float-borne pCO2 measurements was greatly improved (10–15 µatm for the water column and 5 µatm for surface measurements). O2 measurements yielded an accuracy of 2 µmol/kg. First results of this pilot study show the possibility of using profiling floats as a platform for detailed and unattended observation of marine carbon and oxygen cycle dynamics.
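A common form of the time-lag correction mentioned above treats a slow sensor as a first-order system and adds back the term τ·dx/dt. The response time and the synthetic series below are illustrative only, not the study's calibration values:

```python
import math

# First-order time-lag correction: for a sensor with response time tau,
# x_true ~= x_meas + tau * dx_meas/dt. tau and the synthetic series are
# illustrative, not the study's calibration values.

def lag_correct(series, dt_s, tau_s):
    """Apply the first-order correction with finite differences."""
    out = [series[0]]               # no derivative estimate for first sample
    for prev, cur in zip(series, series[1:]):
        out.append(cur + tau_s * (cur - prev) / dt_s)
    return out

# Synthetic slow sensor relaxing toward a true value of 400 uatm:
true_val, start, dt, tau = 400.0, 380.0, 10.0, 60.0
measured = [true_val + (start - true_val) * math.exp(-i * dt / tau)
            for i in range(20)]
corrected = lag_correct(measured, dt, tau)
```

For this exponential approach the corrected series sits within a fraction of a µatm of the true value well before the raw sensor has equilibrated, which is how such processing recovers accuracy from a sensor whose response time is long compared to the float's ascent.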