896 results for High-dimensional data visualization


Relevance:

40.00%

Publisher:

Abstract:

We propose a 3-D gravity model for the volcanic structure of the island of Maio (Cape Verde archipelago), with the objective of resolving some open questions concerning the geometry and depth of the intrusive Central Igneous Complex. A gravity survey was carried out covering almost the entire surface of the island. The gravity data were inverted through a non-linear 3-D approach, which provided a model constructed by a random growth process. The residual Bouguer gravity field shows a single positive anomaly with an elliptic shape and a NW-SE-trending long axis. This Bouguer gravity anomaly is slightly off-centre with respect to the island, but its outline is concordant with the surface exposure of the Central Igneous Complex. The gravimetric modelling shows a high-density volume whose centre of mass is about 4500 m deep. With increasing depth, and despite the restricted gravimetric resolution, the horizontal sections of the model suggest the presence of two distinct bodies, whose relative position accounts for the elongated shape of the high positive Bouguer gravity anomaly. These bodies are interpreted as magma chambers whose coeval volcanic counterparts are no longer preserved. The orientation defined by the two bodies is similar to that of other structures known in the southern group of the Cape Verde islands, suggesting a possible structural control constraining the location of the plutonic intrusions.
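The gravity signature of such a body can be illustrated with a simple forward model. The sketch below is not the paper's inversion (a non-linear random-growth scheme); it computes the vertical anomaly of an idealized buried sphere whose centre of mass sits at the reported 4500 m depth, with a hypothetical radius and density contrast.

```python
import math

G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def sphere_anomaly(x, depth, radius, drho):
    """Vertical gravity anomaly (in mGal) of a buried sphere with density
    contrast drho (kg/m^3), centre at the given depth, observed at
    horizontal offset x along a profile (all distances in metres)."""
    mass = (4.0 / 3.0) * math.pi * radius**3 * drho
    gz = G * mass * depth / (x**2 + depth**2) ** 1.5  # m/s^2
    return gz * 1e5  # 1 m/s^2 = 1e5 mGal

# Profile across a hypothetical intrusive body: centre of mass at 4500 m;
# radius (3000 m) and density contrast (300 kg/m^3) are assumptions.
profile = [sphere_anomaly(x, 4500.0, 3000.0, 300.0)
           for x in range(-10000, 10001, 1000)]
```

The anomaly peaks directly above the centre of mass and falls off symmetrically, which is why an elongated anomaly (as observed) hints at two offset bodies rather than one.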

Relevance:

40.00%

Publisher:

Abstract:

Dissertation submitted in partial fulfillment of the requirements for the Degree of Master of Science in Geospatial Technologies.

Relevance:

40.00%

Publisher:

Abstract:

BACKGROUND: High-grade gliomas are aggressive, incurable tumors characterized by extensive diffuse invasion of the normal brain parenchyma. Novel therapies at best prolong survival; their costs are formidable and their benefit is marginal. Economic restrictions thus require knowledge of the cost-effectiveness of treatments. Here, we show the cost-effectiveness of enhanced resections in malignant glioma surgery using a well-characterized tool for intraoperative tumor visualization, 5-aminolevulinic acid (5-ALA). OBJECTIVE: To evaluate the cost-effectiveness of 5-ALA fluorescence-guided neurosurgery compared with white-light surgery in adult patients with newly diagnosed high-grade glioma, adopting the perspective of the Portuguese National Health Service. METHODS: We used a Markov model (cohort simulation). Transition probabilities were estimated using data from 1 randomized clinical trial and 1 noninterventional prospective study. Utility values and resource use were obtained from the published literature and expert opinion. Unit costs were taken from official Portuguese reimbursement lists (2012 values). The health outcomes considered were quality-adjusted life-years, life-years, and progression-free life-years. Extensive 1-way and probabilistic sensitivity analyses were performed. RESULTS: The incremental cost-effectiveness ratios are below €10 000 for all evaluated outcomes: around €9100 per quality-adjusted life-year gained, €6700 per life-year gained, and €8800 per progression-free life-year gained. The probability that 5-ALA fluorescence-guided surgery is cost-effective at a threshold of €20 000 is 96.0% for quality-adjusted life-years, 99.6% for life-years, and 98.8% for progression-free life-years. CONCLUSION: 5-ALA fluorescence-guided surgery appears to be cost-effective in newly diagnosed high-grade glioma compared with white-light surgery. This example demonstrates that cost-effectiveness analyses for malignant glioma surgery are feasible on the basis of existing data.
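A Markov cohort simulation of the kind described can be sketched as follows. The states, transition probabilities, utilities, and costs below are illustrative placeholders, not the values of the published model; the structure (cohort distribution propagated through a transition matrix, accumulating QALYs and costs per cycle, then an ICER) is the point.

```python
# States: 0 = progression-free, 1 = progressed, 2 = dead.
# All numbers below are hypothetical placeholders.

def run_cohort(trans, utilities, cycle_costs, cycles):
    """Propagate a cohort through a Markov model.
    Returns (total QALYs, total cost) per person."""
    dist = [1.0, 0.0, 0.0]  # everyone starts progression-free
    qalys = cost = 0.0
    for _ in range(cycles):
        qalys += sum(d * u for d, u in zip(dist, utilities))
        cost += sum(d * c for d, c in zip(dist, cycle_costs))
        dist = [sum(dist[i] * trans[i][j] for i in range(3))
                for j in range(3)]
    return qalys, cost

# Hypothetical yearly transition matrices for the two arms.
surgery_5ala = [[0.70, 0.20, 0.10], [0.0, 0.60, 0.40], [0.0, 0.0, 1.0]]
white_light  = [[0.60, 0.25, 0.15], [0.0, 0.55, 0.45], [0.0, 0.0, 1.0]]
utilities    = [0.85, 0.60, 0.0]       # utility weight per state
costs_5ala   = [12000.0, 8000.0, 0.0]  # euros per cycle, per state
costs_wl     = [10000.0, 8000.0, 0.0]

q1, c1 = run_cohort(surgery_5ala, utilities, costs_5ala, 10)
q0, c0 = run_cohort(white_light, utilities, costs_wl, 10)
icer = (c1 - c0) / (q1 - q0)  # incremental cost per QALY gained
```

Probabilistic sensitivity analysis, as in the study, would repeat this with parameters drawn from distributions and report the fraction of draws with an ICER below the threshold.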

Relevance:

40.00%

Publisher:

Abstract:

Contains abstract

Relevance:

40.00%

Publisher:

Abstract:

Biofilm research is growing more diverse and more dependent on high-throughput technologies, and the large-scale production of results complicates data substantiation. In particular, experimental protocols are often adapted to meet the needs of a particular laboratory, with no statistical validation of the modified method provided. This paper discusses the impact of intra-laboratory adaptation and non-rigorous documentation of experimental protocols on biofilm data interchange and validation. The case study is a non-standard, but widely used, workflow for Pseudomonas aeruginosa biofilm development, considering three analysis assays: the crystal violet (CV) assay for biomass quantification, the XTT assay for respiratory activity assessment, and the colony forming units (CFU) assay for determination of cell viability. The ruggedness of the protocol was assessed by introducing small changes in the biofilm growth conditions, simulating minor protocol adaptations and non-rigorous protocol documentation. Results show that even minor variations in the biofilm growth conditions may affect the results considerably, and that the biofilm analysis assays lack repeatability. Intra-laboratory validation of non-standard protocols is therefore critical to ensure data quality and to enable the comparison of results within and among laboratories.
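A minimal repeatability check of the kind underlying a ruggedness assessment might compare the coefficient of variation of replicate readings under the baseline protocol and under a minor adaptation. The optical-density values below are hypothetical, as is the use of CV% as the metric.

```python
import statistics

def coefficient_of_variation(replicates):
    """Percent coefficient of variation: a simple repeatability metric
    (sample standard deviation relative to the mean)."""
    mean = statistics.mean(replicates)
    return 100.0 * statistics.stdev(replicates) / mean

# Hypothetical OD readings from a crystal violet assay: baseline protocol
# vs a minor adaptation (e.g. a small change in growth conditions).
baseline = [1.02, 0.98, 1.05, 0.99, 1.01]
adapted  = [0.71, 0.93, 0.55, 0.88, 0.64]

cv_base = coefficient_of_variation(baseline)
cv_adapt = coefficient_of_variation(adapted)
```

A large jump in CV% after a "minor" adaptation is exactly the kind of signal that intra-laboratory validation is meant to catch before results are compared across laboratories.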

Relevance:

40.00%

Publisher:

Abstract:

Currently, the quality of the Indonesian national road network is inadequate due to several constraints, including overcapacity and overloaded trucks. The high deterioration rate of road infrastructure in developing countries, along with major budgetary restrictions and high growth in traffic, has led to an emerging need to improve the performance of the highway maintenance system. However, the high number of intervening factors and their complex effects require advanced tools to solve this problem successfully. The high learning capabilities of Data Mining (DM) techniques make them a powerful solution: in the past, these tools have been successfully applied to complex and multi-dimensional problems in various scientific fields. Therefore, it is expected that DM can be used to analyze the large amount of data regarding pavement and traffic, identify the relationships between variables, and provide predictive information. In this paper, we present a new approach to predicting the International Roughness Index (IRI) of pavement based on DM techniques. DM was used to analyze the initial IRI data, including age, Equivalent Single Axle Load (ESAL), cracking, potholes, rutting, and long cracks. The model was developed and verified using data from the Integrated Indonesia Road Management System (IIRMS), measured with the National Association of Australian State Road Authorities (NAASRA) roughness meter. The results of the proposed approach are compared with the IIRMS analytical model adapted to the IRI, and the advantages of the new approach are highlighted. We show that the novel data-driven model is able to learn, with high accuracy, the complex relationships between the IRI and the contributing factors of overloaded trucks.
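The abstract does not specify which DM technique is used. As a minimal illustration of the data-driven idea (fit a predictor of IRI from traffic loading, then query it for an unseen section), the sketch below fits a single-feature least-squares model; the ESAL/IRI training pairs are made up.

```python
def fit_linear(xs, ys):
    """Closed-form ordinary least squares for y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    b = my - a * mx
    return a, b

# Hypothetical training pairs: cumulative ESAL (millions) vs IRI (m/km).
esal = [0.5, 1.0, 2.0, 3.5, 5.0, 7.0]
iri  = [2.1, 2.6, 3.4, 4.8, 6.1, 7.9]

a, b = fit_linear(esal, iri)
predicted = a * 4.0 + b  # predicted IRI for a 4-million-ESAL section
```

A real DM model would use all the listed distress variables (age, cracking, potholes, rutting) and a richer learner, but the train-then-predict workflow is the same.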

Relevance:

40.00%

Publisher:

Abstract:

ABSTRACT The spatial distribution of forest biomass in the Amazon is heterogeneous, with temporal and spatial variation, especially in relation to the different vegetation types of this biome. Biomass estimates in this region vary significantly depending on the approach applied and the data set used for modeling. In this context, this study aimed to evaluate three different geostatistical techniques for estimating the spatial distribution of aboveground biomass (AGB): 1) ordinary least-squares regression (OLS), 2) geographically weighted regression (GWR), and 3) geographically weighted regression - kriging (GWR-K). These techniques were applied to the same field dataset, using the same environmental variables derived from cartographic information and high-resolution remote sensing data (RapidEye). The study was carried out in the Amazon rainforest of Sucumbíos, Ecuador. The results showed that GWR-K, a hybrid technique, provided statistically satisfactory estimates with the lowest prediction error of the three techniques. Furthermore, we observed that 75% of the AGB was explained by the combination of remote sensing data and environmental variables, with forest type being the most important variable for estimating AGB. It should be noted that while the use of high-resolution images significantly improves the estimation of the spatial distribution of AGB, processing this information is computationally demanding.
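What distinguishes GWR (and the GWR-K hybrid) from global OLS is that each local regression weights observations by distance to the target location. That weighting can be sketched with a Gaussian distance kernel; the plot coordinates and bandwidth below are hypothetical.

```python
import math

def gwr_weights(points, target, bandwidth):
    """Gaussian kernel weights as used in geographically weighted
    regression: observations near the target location get weight ~1,
    distant ones decay towards 0."""
    weights = []
    for (x, y) in points:
        d = math.hypot(x - target[0], y - target[1])
        weights.append(math.exp(-0.5 * (d / bandwidth) ** 2))
    return weights

# Hypothetical field-plot coordinates (km) and a target location.
plots = [(0.0, 0.0), (1.0, 0.5), (5.0, 4.0)]
w = gwr_weights(plots, target=(0.0, 0.0), bandwidth=2.0)
```

In full GWR these weights enter a weighted least-squares fit at every prediction location; GWR-K then kriges the residuals of that fit, which is where its lower prediction error comes from.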

Relevance:

40.00%

Publisher:

Abstract:

Genome-scale metabolic models are valuable tools in the metabolic engineering process, based on the ability of these models to integrate diverse sources of data to produce global predictions of organism behavior. At the most basic level, these models require only a genome sequence to construct, and once built, they may be used to predict essential genes, culture conditions, pathway utilization, and the modifications required to enhance a desired organism behavior. In this chapter, we address two key challenges associated with the reconstruction of metabolic models: (a) leveraging existing knowledge of microbiology, biochemistry, and available omics data to produce the best possible model; and (b) applying available tools and data to automate the reconstruction process. We consider these challenges as we progress through the model reconstruction process, beginning with genome assembly, and culminating in the integration of constraints to capture the impact of transcriptional regulation. We divide the reconstruction process into ten distinct steps: (1) genome assembly from sequenced reads; (2) automated structural and functional annotation; (3) phylogenetic tree-based curation of genome annotations; (4) assembly and standardization of biochemistry database; (5) genome-scale metabolic reconstruction; (6) generation of core metabolic model; (7) generation of biomass composition reaction; (8) completion of draft metabolic model; (9) curation of metabolic model; and (10) integration of regulatory constraints. Each of these ten steps is documented in detail.
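As a small illustration of the curation steps (e.g. standardization of the biochemistry database and curation of the draft model, steps 4 and 9), the sketch below checks the elemental mass balance of a reaction. It is a generic checker written for this example, not part of any specific reconstruction tool.

```python
import re
from collections import Counter

def parse_formula(formula):
    """Count atoms in a simple molecular formula such as 'C6H12O6'."""
    counts = Counter()
    for elem, num in re.findall(r"([A-Z][a-z]?)(\d*)", formula):
        counts[elem] += int(num) if num else 1
    return counts

def is_mass_balanced(substrates, products):
    """True if both sides of a reaction carry the same atom totals.
    Each side is a list of (stoichiometric coefficient, formula) pairs."""
    def side_total(side):
        total = Counter()
        for coeff, formula in side:
            for elem, n in parse_formula(formula).items():
                total[elem] += coeff * n
        return total
    return side_total(substrates) == side_total(products)

# Glucose fermentation to ethanol: C6H12O6 -> 2 C2H6O + 2 CO2
balanced = is_mass_balanced([(1, "C6H12O6")], [(2, "C2H6O"), (2, "CO2")])
```

Running every reaction in a draft reconstruction through a check like this is one way unbalanced or mis-annotated entries are flagged before the model is used for flux predictions.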

Relevance:

40.00%

Publisher:

Abstract:

Magdeburg, Univ., Faculty of Mechanical Engineering, dissertation, 2009

Relevance:

40.00%

Publisher:

Abstract:

The potential of the 60 GHz frequency band for wireless communications is currently attracting considerable attention from academia and research teams. The 60 GHz band offers great possibilities for a wide variety of applications that are yet to be implemented, but these applications also imply major implementation challenges. One such example is building a high-data-rate transceiver that at the same time has very low power consumption. In this paper we present a prototype of a Single Carrier (SC) transceiver system, giving a brief overview of the baseband design and emphasizing the most important decisions that need to be made. A brief overview of the possible approaches to implementing the equalizer, the most complex module in the SC transceiver, is also presented. The main focus of this paper is to suggest a parallel architecture for the receiver in a Single Carrier communication system. This provides higher data rates than the communication system could otherwise achieve, at the price of higher power consumption. The suggested receiver architecture is illustrated in this paper, and the results of its implementation are given in comparison with the corresponding serial implementation.
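The equalizer singled out as the most complex module can be illustrated, in serial form, by a minimal single-carrier frequency-domain equalizer: transform the received block, apply one zero-forcing multiply per frequency bin, and transform back. The symbols, channel, and block length below are hypothetical, and a cyclic prefix is assumed so the channel acts as a circular convolution; this is a sketch of the principle, not the paper's parallel architecture.

```python
import cmath

def dft(x, inverse=False):
    """Naive O(n^2) DFT -- sufficient for a small block-equalizer sketch."""
    n = len(x)
    sign = 1 if inverse else -1
    out = [sum(x[k] * cmath.exp(sign * 2j * cmath.pi * j * k / n)
               for k in range(n)) for j in range(n)]
    return [v / n for v in out] if inverse else out

def sc_fde(received, channel):
    """Single-carrier frequency-domain equalization: one complex
    zero-forcing multiply per bin, then back to the time domain."""
    rx_f = dft(received)
    ch_f = dft(channel + [0.0] * (len(received) - len(channel)))
    eq_f = [r / h for r, h in zip(rx_f, ch_f)]
    return dft(eq_f, inverse=True)

# Hypothetical BPSK block distorted by a short two-tap channel
# (cyclic prefix assumed, so the convolution is circular).
symbols = [1.0, -1.0, -1.0, 1.0, 1.0, 1.0, -1.0, -1.0]
channel = [1.0, 0.4]
n = len(symbols)
received = [sum(channel[t] * symbols[(i - t) % n]
                for t in range(len(channel))) for i in range(n)]
recovered = sc_fde(received, channel)
```

The per-bin multiplies are independent of one another, which is exactly the property a parallel receiver architecture exploits to raise throughput at the cost of duplicated hardware and higher power consumption.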

Relevance:

40.00%

Publisher:

Abstract:

Magdeburg, Univ., Faculty of Computer Science, dissertation, 2014

Relevance:

40.00%

Publisher:

Abstract:

Magdeburg, Univ., Faculty of Computer Science, dissertation, 2014

Relevance:

40.00%

Publisher:

Abstract:

Magdeburg, Univ., Faculty of Computer Science, dissertation, 2015