13 results for High-dimensional data visualization
in the Biblioteca Digital da Produção Intelectual da Universidade de São Paulo
Abstract:
Dimensionality reduction is employed in visual data analysis either to obtain reduced spaces for high-dimensional data or to map data directly into 2D or 3D spaces. Although techniques have evolved to improve data segregation in reduced or visual spaces, they have limited capabilities for adjusting the results according to the user's knowledge. In this paper, we propose a novel approach to handling both dimensionality reduction and visualization of high-dimensional data that takes the user's input into account. It employs Partial Least Squares (PLS), a statistical tool for retrieving latent spaces that focus on the discriminability of the data. The method uses a training set to build a highly precise model that can then be applied very effectively to a much larger data set. The reduced data set can be displayed using various existing visualization techniques. The training data is important for encoding the user's knowledge into the loop. However, this work also devises a strategy for computing PLS reduced spaces when no training data is available. The approach produces increasingly precise visual mappings as the user feeds back his or her knowledge, and it is capable of working with small and unbalanced training sets.
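As a rough illustration of this kind of PLS-based supervised projection, here is a minimal sketch using scikit-learn's PLSRegression; it is not the authors' implementation, and the function name, synthetic data, and plotting are assumptions:

```python
# Minimal sketch: fit PLS on a small labeled training set, then project
# a much larger set into the 2D latent space for plotting.
import numpy as np
import matplotlib.pyplot as plt
from sklearn.cross_decomposition import PLSRegression

def pls_project(X_train, y_train, X_all, n_components=2):
    # One-hot encode class labels so PLS maximizes covariance between
    # features and class membership (i.e., focuses on discriminability).
    classes = np.unique(y_train)
    Y = (y_train[:, None] == classes[None, :]).astype(float)
    pls = PLSRegression(n_components=n_components)
    pls.fit(X_train, Y)
    return pls.transform(X_all)  # latent 2D coordinates for the full set

# Hypothetical usage: fit on the labeled subset, project everything.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 50))
y = (X[:, 0] + X[:, 1] > 0).astype(int)
coords = pls_project(X[:100], y[:100], X)
plt.scatter(coords[:, 0], coords[:, 1], c=y, s=8)
plt.show()
```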
Abstract:
Traditional supervised data classification considers only physical features (e.g., distance or similarity) of the input data. Here, this type of learning is called low level classification. The human (animal) brain, on the other hand, performs both low and high orders of learning, and it readily identifies patterns according to the semantic meaning of the input data. Data classification that considers not only physical attributes but also the pattern formation is referred to here as high level classification. In this paper, we propose a hybrid classification technique that combines both types of learning. The low level term can be implemented by any classification technique, while the high level term is realized by extracting features of the underlying network constructed from the input data. Thus, the former classifies the test instances by their physical features or class topologies, while the latter measures the compliance of the test instances with the pattern formation of the data. Our study shows that the proposed technique not only can perform classification according to the pattern formation, but is also able to improve the performance of traditional classification techniques. Furthermore, as the complexity of the class configuration increases, for instance through greater mixing among different classes, a larger weight on the high level term is required to obtain correct classification. This feature confirms that high level classification has special importance in complex classification scenarios. Finally, we show how the proposed technique can be employed in a real-world application, where it is capable of identifying variations and distortions of handwritten digit images. As a result, it improves the overall pattern recognition rate.
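The division of labor between the two terms can be sketched as follows. This is a hedged illustration, not the paper's exact formulation: the k-NN graph and the clustering-coefficient compliance measure stand in for the network features used by the authors, and lam plays the role of the mixing weight:

```python
# Sketch: mix a conventional classifier's probability (low level) with
# a graph-based "pattern compliance" term (high level).
import numpy as np
import networkx as nx
from sklearn.neighbors import KNeighborsClassifier, NearestNeighbors

def high_level_term(X_class, x, k=3):
    # Build a k-NN graph over one class, then measure how much a network
    # statistic (average clustering coefficient) shifts when x is
    # inserted; a small shift means x conforms to the class's pattern.
    nn = NearestNeighbors(n_neighbors=k).fit(X_class)
    G = nx.Graph()
    for i, nbrs in enumerate(nn.kneighbors(X_class, return_distance=False)):
        G.add_edges_from((i, j) for j in nbrs if j != i)
    before = nx.average_clustering(G)
    new = len(X_class)
    G.add_edges_from((new, j) for j in nn.kneighbors([x], return_distance=False)[0])
    after = nx.average_clustering(G)
    return 1.0 / (1.0 + abs(after - before))  # higher = more compliant

def hybrid_predict(X_train, y_train, x, lam=0.4):
    # Low level term: any off-the-shelf classifier's class probabilities.
    knn = KNeighborsClassifier(n_neighbors=5).fit(X_train, y_train)
    p_low = knn.predict_proba([x])[0]
    classes = knn.classes_
    # High level term: per-class pattern compliance, normalized.
    p_high = np.array([high_level_term(X_train[y_train == c], x) for c in classes])
    p_high /= p_high.sum()
    score = (1 - lam) * p_low + lam * p_high  # lam weights the high level term
    return classes[np.argmax(score)]
```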
Abstract:
Data visualization techniques are powerful tools for handling and analyzing multivariate systems. One such technique, known as parallel coordinates, was used to support the diagnosis of an event, detected by a neural network-based monitoring system, in a boiler at a Brazilian Kraft pulp mill. Its appeal is the possibility of visualizing several variables simultaneously. The diagnostic procedure was carried out step by step, moving through exploratory, explanatory, confirmatory, and communicative goals. The tool made it easier to visualize the boiler dynamics than the commonly used univariate trend plots. In addition, it facilitated the analysis of other aspects, namely relationships among process variables, distinct modes of operation, and discrepant data. The whole analysis revealed, first, that the period involving the detected event was associated with a transition between two distinct normal modes of operation and, second, that process variables underwent unusual changes at that time.
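A minimal parallel-coordinates sketch of this style of analysis, assuming a hypothetical log file and invented column names for the boiler's process variables:

```python
# Plot several process variables at once, colored by operating mode.
import pandas as pd
import matplotlib.pyplot as plt
from pandas.plotting import parallel_coordinates

df = pd.read_csv("boiler_log.csv")            # hypothetical log file
df["mode"] = pd.cut(df["steam_flow"], 2,      # crude stand-in for tagging
                    labels=["mode_A", "mode_B"])  # two operating modes
parallel_coordinates(df[["steam_flow", "drum_pressure",
                         "feedwater_temp", "o2_excess", "mode"]],
                     "mode", alpha=0.3)
plt.show()
```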
Abstract:
Accruing evidence indicates that connexin (Cx) channels in gap junctions (GJ) are involved in neurodegeneration after injury. However, studies using KO animal models have yielded apparently contradictory results regarding the role of coupling in neuroprotection. We analyzed the role of Cx-mediated communication in a focal lesion induced by mechanical trauma of the retina, a model that allows spatial and temporal definition of the lesion with high reproducibility, permitting visualization of the focus, penumbra, and adjacent areas. Cx36 and Cx43 exhibited distinct gene expression and protein levels throughout the progression of neurodegeneration. Cx36 was observed close to TUNEL-positive nuclei, revealing the presence of this protein surrounding apoptotic cells. The functional role of cell coupling was assessed employing GJ blockers and openers combined with the lactate dehydrogenase (LDH) assay, a direct method for evaluating cell death/viability. Carbenoxolone (CBX), a broad-spectrum GJ blocker, reduced LDH release after 4 hours, whereas quinine, a Cx36-channel-specific blocker, decreased LDH release as early as 1 hour after lesion. Furthermore, analysis of the distribution of dying cells confirmed that the use of GJ blockers reduced the spread of apoptosis. Accordingly, blockade of GJ communication during neurodegeneration with quinine, but not CBX, caused downregulation of initial and effector caspases. In summary, we observed specific changes in Cx gene expression and protein distribution during the progression of retinal degeneration, indicating the participation of these elements in acute neurodegeneration processes. More importantly, our results revealed that direct control of GJ channel permeability may form part of reliable neuroprotection strategies aimed at rapid treatment of mechanical trauma in the retina.
Abstract:
Thermal treatment (thermal rectification) is a process in which technological properties of wood are modified using thermal energy, often resulting in value-added wood. Thermally treated wood takes on color shades similar to those of tropical woods and offers considerable resistance to destructive microorganisms and climate action, in addition to having high dimensional stability and low hygroscopicity. Wood samples of Eucalyptus grandis were subjected to various thermal treatments, performed in the presence (140 °C; 160 °C; 180 °C) or absence (160 °C; 180 °C; 200 °C) of oxygen inside a thermal treatment chamber, and their chemical characteristics were then studied. Increasing the maximum treatment temperature led to a reduction in the holocellulose content of the samples as a result of the degradation and volatilization of hemicelluloses, and consequently to an increase in the relative lignin content. Except for glucose, all monosaccharide levels were found to decrease in samples after thermal treatment at a maximum temperature of 200 °C. Thermal treatment above 160 °C led to increased levels of total extractives in the wood samples, probably due to the emergence of low-molecular-weight substances resulting from thermal degradation. Overall, it was not possible to clearly determine the effect of the presence or absence of oxygen during thermal treatment on the chemical characteristics of the wood samples.
Abstract:
This article investigates the effect of product market liberalisation on employment, allowing for interactions between policies and institutions in product and labour markets. Using panel data for OECD countries over the period 1980–2002, we present evidence that product market deregulation is more effective at the margin when labour market regulation is high. The data also suggest that product market liberalisation may promote employment-enhancing labour market reforms.
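The kind of specification this suggests can be sketched as a two-way fixed-effects panel regression with an interaction term; all variable and file names below are hypothetical, not from the article:

```python
# Illustrative fixed-effects regression with an interaction between
# product market regulation (pmr) and employment protection (epl).
import pandas as pd
import statsmodels.formula.api as smf

panel = pd.read_csv("oecd_panel.csv")  # hypothetical country-year panel
model = smf.ols("employment_rate ~ pmr * epl + C(country) + C(year)",
                data=panel).fit(cov_type="cluster",
                                cov_kwds={"groups": panel["country"]})
print(model.summary())  # the pmr:epl coefficient captures the interaction
```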
Abstract:
Background: Oral squamous cell carcinoma (OSCC) is a frequent neoplasm, which is usually aggressive and has unpredictable biological behavior and an unfavorable prognosis. Understanding the molecular basis of this variability should lead to the development of targeted therapies as well as to improvements in the specificity and sensitivity of diagnosis. Results: Samples of primary OSCCs and their corresponding surgical margins were obtained from male patients during surgery, and their gene expression profiles were screened using whole-genome microarray technology. Hierarchical clustering and Principal Components Analysis were used for data visualization, and one-way Analysis of Variance was used to identify differentially expressed genes. Samples clustered mostly according to disease subsite, suggesting molecular heterogeneity within tumor stages. To corroborate our results, two publicly available microarray datasets were assessed. We found significant molecular differences between OSCC anatomic subsites in groups of genes presently or potentially important for drug development, including mRNA processing, cytoskeleton organization and biogenesis, metabolic process, cell cycle, and apoptosis. Conclusion: Our results corroborate literature data on the molecular heterogeneity of OSCCs. Differences between disease subsites and among samples belonging to the same TNM class highlight the importance of gene expression-based classification and challenge the development of targeted therapies.
Abstract:
In this paper we discuss the detection of glucose and triglycerides using information visualization methods to process impedance spectroscopy data. The sensing units contained either lipase or glucose oxidase immobilized in layer-by-layer (LbL) films deposited onto interdigitated electrodes. The optimization consisted of identifying which part of the electrical response and which combination of sensing units yielded the best distinguishing ability. It is shown that complete separation can be obtained for a range of concentrations of glucose and triglycerides when the interactive document map (IDMAP) technique is used to project the data into a two-dimensional plot. Most importantly, the optimization procedure can be extended to other types of biosensors, thus increasing the versatility of analysis provided by tailored molecular architectures exploited with various detection principles.
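IDMAP itself is not available in mainstream Python libraries; as a stand-in, the sketch below projects impedance feature vectors to 2D with classical MDS, which plays the same role of revealing whether analyte concentrations separate in the plot (file names are hypothetical):

```python
# Project impedance feature vectors to 2D and color by concentration.
import numpy as np
import matplotlib.pyplot as plt
from sklearn.manifold import MDS

Z = np.load("impedance_features.npy")   # hypothetical: one row per measurement
conc = np.load("concentrations.npy")    # hypothetical: label per row
coords = MDS(n_components=2, random_state=0).fit_transform(Z)
plt.scatter(coords[:, 0], coords[:, 1], c=conc, cmap="viridis")
plt.colorbar(label="glucose concentration")
plt.show()
```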
Abstract:
In [1], the authors proposed a framework for automated clustering and visualization of biological data sets named AUTO-HDS. This letter complements that framework by showing that it is possible to eliminate a user-defined parameter in such a way that the clustering stage can be implemented more accurately and with reduced computational complexity.
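AUTO-HDS is likewise not packaged in common libraries; HDBSCAN, a related density-based hierarchical method, illustrates the flavor of clustering with few user-tuned parameters (the data file below is a hypothetical stand-in):

```python
# Density-based hierarchical clustering with a single main parameter.
import numpy as np
from sklearn.cluster import HDBSCAN  # requires scikit-learn >= 1.3

X = np.load("expression_data.npy")                    # hypothetical matrix
labels = HDBSCAN(min_cluster_size=10).fit_predict(X)  # -1 marks noise points
```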
Abstract:
Polymorphonuclear leukocyte (PMNL) apoptosis is central to the successful resolution of inflammation. Since the Somatic Cell Count (SCC) is an indicator of the mammary gland's immune status, this study sought to clarify the influence that these factors have on each other and on the evolution of the inflammatory process. Milk samples were stained with annexin-V, propidium iodide (PI), and the primary antibody anti-CH138A. A negative correlation between SCC and PMNL apoptosis was found, and a statistical difference between the high-SCC and low-SCC groups was observed in the rates of viable, apoptotic, necrotic, and necrotic and/or apoptotic PMNL. Overall, the high-cellularity group presented lower proportions of CH138+ cells undergoing apoptosis and higher proportions of viable and necrotic CH138+ cells. Thus, it can be concluded that PMNL apoptosis and SCC are related factors and that in high-SCC milk apoptosis is delayed. Although there is a greater number of active phagocytes in this situation, the anti-inflammatory effects of apoptosis are decreased, while the pro-inflammatory effects of necrosis are increased, which can contribute to chronic inflammation.
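The two statistical claims, a correlation between SCC and PMNL apoptosis and a comparison between high- and low-SCC groups, can be sketched generically (column and file names are invented; the flow-cytometry gating itself is not reproduced):

```python
# Correlation plus nonparametric group comparison on tabulated results.
import pandas as pd
from scipy.stats import spearmanr, mannwhitneyu

milk = pd.read_csv("milk_cytometry.csv")   # hypothetical tabulated data
rho, p = spearmanr(milk["scc"], milk["pct_apoptotic_pmnl"])
hi = milk[milk["scc_group"] == "high"]["pct_apoptotic_pmnl"]
lo = milk[milk["scc_group"] == "low"]["pct_apoptotic_pmnl"]
stat, p_group = mannwhitneyu(hi, lo)       # high- vs low-SCC comparison
```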
Abstract:
Objective: To assess the fetal lumbosacral spine by three-dimensional (3D) ultrasonography using the volume contrast imaging (VCI) omni view method, and to compare the reproducibility of and agreement among three measurement techniques: standard mouse, high-definition mouse, and pen-tablet. Methods: A comparative, prospective study of 40 pregnant women between 20 and 34+6 weeks was carried out. 3D volume datasets of the fetal spine were acquired using a convex transabdominal transducer. The starting scan plane was the coronal section of the fetal lumbosacral spine obtained with the VCI-C function. The omni view manual trace was selected, and a plane parallel to the fetal spine was drawn to include the region of interest. The intraclass correlation coefficient (ICC) was used for the reproducibility analysis. The relative differences among the three techniques were compared by the chi-square test and Fisher's exact test. Results: The pen-tablet showed the best reliability (ICC = 0.987). The relative proportion of differences was significantly higher for the pen-tablet (82.14%; p < 0.01). In the paired comparison, the relative difference was significantly greater for the pen-tablet (p < 0.01). Conclusion: The pen-tablet proved to be the most reproducible and concordant method for measuring the vertebral body area of the fetal lumbosacral spine by 3D ultrasonography using VCI.
Abstract:
The wide variety of molecular architectures used in sensors and biosensors and the large amount of data generated by some detection principles have motivated the use of computational methods, such as information visualization techniques, not only to handle the data but also to optimize sensing performance. In this study, we combine projection techniques with micro-Raman scattering and atomic force microscopy (AFM) to address critical issues related to practical applications of electronic tongues (e-tongues) based on impedance spectroscopy. Experimentally, we used sensing units made with thin films of a perylene derivative (AzoPTCD), coating Pt interdigitated electrodes, to detect CuCl2 (Cu2+), methylene blue (MB), and saccharose in aqueous solutions, which were selected for their distinct molecular sizes and ionic character in solution. The AzoPTCD films were deposited at thicknesses from monolayers to 120 nm via the Langmuir-Blodgett (LB) and physical vapor deposition (PVD) techniques. Because the main aspects investigated were how the interdigitated electrodes are coated by the thin films (the architecture of the e-tongue) and the film thickness, we employed the same material for all sensing units. The capacitance data were projected into a 2D plot using the force scheme method, from which we could infer that at low analyte concentrations the electrical response of the units was determined by the film thickness. Concentrations of 10 μM or higher could be distinguished with thinner films (tens of nanometers at most), which could withstand the impedance measurements without significant changes in the Raman signal of the AzoPTCD film-forming molecules. The sensitivity to the analytes appears to be related to adsorption on the film surface, as inferred from the Raman spectroscopy data using MB as the analyte and from the multidimensional projections. The analysis presented here may serve as a new route for selecting materials and molecular architectures for novel sensors and biosensors, in addition to suggesting ways to unravel the mechanisms behind the high sensitivity obtained in various sensors.