967 results for multiview visualization
Abstract:
The polar winter stratospheric vortex is a coherent structure that undergoes different types of deformation, which can be revealed by geometric invariant moments. Three moments, the aspect ratio, the centroid latitude, and the area of the vortex, computed from stratospheric data of the 40-yr ECMWF Re-Analysis (ERA-40) project, are used to study sudden stratospheric warmings. Hierarchical clustering combined with data image visualization techniques is used as well. Using the gap statistic, three optimal clusters are obtained from the three geometric moments considered here. The 850-K potential vorticity field, as well as the vertical profiles of polar temperature and zonal wind, provides evidence that the clusters represent, respectively, the undisturbed (U), displaced (D), and split (S) states of the polar vortex. This objective, systematic method for identifying and characterizing the state of the polar vortex is useful both as a tool for analyzing observations and as a test of whether climate models can reproduce them. The method correctly identifies all previously identified major warmings and also flags significant minor warmings in which the atmosphere is substantially disturbed but does not quite meet the criteria for a major stratospheric warming.
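For readers who want to experiment with the clustering step, the sketch below combines hierarchical (Ward) clustering of the three moment series with a gap-statistic estimate of the cluster count. It is a minimal illustration under stated assumptions, not the authors' code: the input matrix `X` is a random placeholder for the standardized ERA-40 moments, and the simple argmax selection stands in for the full one-standard-error rule of Tibshirani et al. (2001).

```python
# Minimal sketch: hierarchical clustering + gap statistic on three moment series.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)

def within_dispersion(X, labels):
    """Sum over clusters of squared deviations from the cluster centroid (W_k)."""
    W = 0.0
    for k in np.unique(labels):
        C = X[labels == k]
        W += np.sum((C - C.mean(axis=0)) ** 2)
    return W

def gap_statistic(X, k_max=6, n_ref=20):
    """Gap(k) = E*[log W_k] under a uniform reference - log W_k of the data."""
    Z = linkage(X, method="ward")
    lo, hi = X.min(axis=0), X.max(axis=0)
    gaps = []
    for k in range(1, k_max + 1):
        labels = fcluster(Z, t=k, criterion="maxclust")
        logW = np.log(within_dispersion(X, labels))
        ref_logW = []
        for _ in range(n_ref):
            R = rng.uniform(lo, hi, size=X.shape)  # uniform reference dataset
            lr = fcluster(linkage(R, method="ward"), t=k, criterion="maxclust")
            ref_logW.append(np.log(within_dispersion(R, lr)))
        gaps.append(np.mean(ref_logW) - logW)
    return np.array(gaps)

# X: rows = daily vortex snapshots, columns = (aspect ratio, centroid latitude,
# area), standardized so no single moment dominates the distance metric.
X = rng.standard_normal((500, 3))          # placeholder for the ERA-40 moments
gaps = gap_statistic(X)
print("optimal k:", np.argmax(gaps) + 1)   # the paper obtains k = 3
```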
Abstract:
1. Apolipoprotein B-48, the transport protein for chylomicrons, is identical with apolipoprotein B-100 for the first 48% of its sequence. No antiserum has yet been reported that can recognize apolipoprotein B-48, but not apolipoprotein B-100. 2. In the present study an antiserum was raised to the C-terminal sequence of apolipoprotein B-48, using specific chemical reactions to ensure that the charged carboxyl group of the C-terminal isoleucine residue was free. In a Western blot the antiserum was shown to bind to a protein band having the characteristics of apolipoprotein B-48, but not to apolipoprotein B-100. 3. In the early evening 11 subjects were given a test meal which contained 40 g of mixed oil and retinyl palmitate. Blood samples were collected over 9 h. Chylomicron-enriched fractions were prepared and analysed for triacylglycerol, retinyl palmitate and apolipoprotein B-48, the latter after separation using SDS/PAGE and visualization by chemiluminescence on a Western blot. Both triacylglycerol and apolipoprotein B-48 showed an early peak at 1 h, which was not seen with retinyl palmitate. All three substances gave a broader peak between 5 and 6 h postprandially. Retinyl palmitate concentrations declined rapidly during the late (6-9 h) postprandial period, but apolipoprotein B-48 concentrations remained elevated. 4. This study has shown that an antiserum has been produced which is specific for apolipoprotein B-48. This has enabled measurement of postprandial concentrations of the protein that revealed features of chylomicron metabolism which have not been reported previously.
Abstract:
In the earth sciences, data are commonly cast on complex grids in order to model irregular domains such as coastlines, or to evenly distribute grid points over the globe. It is common for a scientist to wish to re-cast such data onto a grid that is more amenable to manipulation, visualization, or comparison with other data sources. The complexity of the grids presents a significant technical difficulty to the regridding process. In particular, the regridding of complex grids may suffer from severe performance issues, in the worst case scaling with the product of the sizes of the source and destination grids. We present a mechanism for the fast regridding of such datasets, based upon the construction of a spatial index that allows fast searching of the source grid. We discover that the most efficient spatial index under test (in terms of memory usage and query time) is a simple look-up table. A kd-tree implementation was found to be faster to build and to give similar query performance at the expense of a larger memory footprint. Using our approach, we demonstrate that regridding of complex data may proceed at speeds sufficient to permit regridding on-the-fly in an interactive visualization application, or in a Web Map Service implementation. For large datasets with complex grids the new mechanism is shown to significantly outperform algorithms used in many scientific visualization packages.
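The kd-tree variant described above is straightforward to reproduce with standard tooling. The sketch below is a minimal illustration rather than the paper's implementation: it builds a `scipy.spatial.cKDTree` over the source grid points, converted to 3-D Cartesian coordinates so that distances behave correctly near the poles and the dateline, and maps each target point to its nearest source value. The grids shown are hypothetical.

```python
# Minimal nearest-neighbour regridding of a curvilinear grid via a kd-tree.
import numpy as np
from scipy.spatial import cKDTree

def regrid_nearest(src_lon, src_lat, src_data, tgt_lon, tgt_lat):
    """Map each target point to the value at the nearest source point."""
    def to_xyz(lon, lat):
        # Unit-sphere Cartesian coordinates avoid lon/lat wrap-around issues.
        lon, lat = np.radians(lon), np.radians(lat)
        return np.column_stack((np.cos(lat) * np.cos(lon),
                                np.cos(lat) * np.sin(lon),
                                np.sin(lat)))

    tree = cKDTree(to_xyz(src_lon.ravel(), src_lat.ravel()))  # build once
    _, idx = tree.query(to_xyz(tgt_lon.ravel(), tgt_lat.ravel()))
    return src_data.ravel()[idx].reshape(tgt_lon.shape)

# Hypothetical curvilinear source grid and a regular 1-degree target grid.
src_lon = np.random.uniform(-180, 180, (200, 300))
src_lat = np.random.uniform(-90, 90, (200, 300))
src_data = np.hypot(src_lon, src_lat)
tgt_lon, tgt_lat = np.meshgrid(np.arange(-180, 180), np.arange(-90, 91))
out = regrid_nearest(src_lon, src_lat, src_data, tgt_lon, tgt_lat)
```

Building the index once and reusing it for every query is what breaks the source-times-destination scaling: construction costs roughly O(N log N) and each of the M queries O(log N), in place of the worst-case O(NM) search.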
Abstract:
The Self-Organizing Map (SOM) is a popular unsupervised neural network able to provide effective clustering and data visualization for multidimensional input datasets. In this paper, we present an application of the simulated annealing procedure to the SOM learning algorithm, with the aim of obtaining faster learning and better performance in terms of quantization error. The proposed learning algorithm is called the Fast Learning Self-Organized Map, and it does not compromise the simplicity of the standard SOM's basic learning algorithm. It also improves the quality of the resulting maps by providing better clustering quality and topology preservation of the multidimensional input data. Several experiments compare the proposed approach with the original algorithm and some of its modifications and speed-up techniques.
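The paper's Fast Learning Self-Organized Map is not reproduced here, but the generic SOM sketch below shows where an annealing-style schedule enters the update rule: both the learning rate and the neighbourhood radius decay over training. A quantization-error function of the kind used for such comparisons is included; all parameter values are illustrative.

```python
# Generic SOM with exponentially annealed learning rate and neighbourhood.
import numpy as np

def train_som(X, rows=10, cols=10, n_iter=5000, lr0=0.5, seed=0):
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((rows, cols, X.shape[1]))        # codebook vectors
    grid = np.stack(np.meshgrid(np.arange(rows), np.arange(cols),
                                indexing="ij"), axis=-1)     # map coordinates
    sigma0 = max(rows, cols) / 2.0
    for t in range(n_iter):
        frac = t / n_iter
        lr = lr0 * np.exp(-3.0 * frac)          # annealed learning rate
        sigma = sigma0 * np.exp(-3.0 * frac)    # shrinking neighbourhood
        x = X[rng.integers(len(X))]
        # Best-matching unit: codebook vector closest to the sample.
        bmu = np.unravel_index(np.argmin(((W - x) ** 2).sum(-1)), (rows, cols))
        # Gaussian neighbourhood kernel on the map grid.
        d2 = ((grid - np.array(bmu)) ** 2).sum(-1)
        h = np.exp(-d2 / (2 * sigma ** 2))
        W += lr * h[..., None] * (x - W)
    return W

def quantization_error(X, W):
    """Mean distance from each sample to its best-matching codebook vector."""
    flat = W.reshape(-1, W.shape[-1])
    return np.mean(np.min(np.linalg.norm(X[:, None] - flat[None], axis=-1), 1))

X = np.random.default_rng(1).random((1000, 4))   # hypothetical input data
W = train_som(X)
print("quantization error:", quantization_error(X, W))
```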
Abstract:
This paper introduces the Hilbert Analysis (HA), a novel digital signal processing technique, for the investigation of tremor. The HA comprises two complementary tools: the Empirical Mode Decomposition (EMD) and the Hilbert Spectrum (HS). In this work we show that the EMD can automatically detect and isolate tremulous and voluntary movements in experimental signals collected from 31 patients with different conditions. Our results also suggest that tremor may be described by a new class of mathematical functions defined in the HA framework. In a further study, the HS was employed to visualize the energy content of the signals. This tool introduces the concept of instantaneous frequency to the field of tremor research and can provide, in a time-frequency-energy plot, a clear visualization of local tremor energy over time. The HA proved very useful for objective measurement of any kind of tremor and can therefore be used for functional assessment.
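A minimal sketch of the two HA stages follows, assuming the third-party PyEMD package for the decomposition and SciPy for the Hilbert transform; it is not the authors' implementation, and the tremor recording is replaced by a synthetic stand-in.

```python
# Decompose a signal into intrinsic mode functions, then extract the
# instantaneous frequency and energy envelope of each mode.
import numpy as np
from scipy.signal import hilbert
from PyEMD import EMD   # third-party package, installable as EMD-signal

fs = 200.0                                   # hypothetical sampling rate, Hz
t = np.arange(0, 10, 1 / fs)
# Synthetic stand-in: slow voluntary drift plus a ~5 Hz tremor component.
s = 0.5 * np.sin(2 * np.pi * 0.3 * t) + 0.2 * np.sin(2 * np.pi * 5.0 * t)

imfs = EMD().emd(s)                          # intrinsic mode functions

for i, imf in enumerate(imfs):
    analytic = hilbert(imf)                  # analytic signal of the mode
    amplitude = np.abs(analytic)             # instantaneous energy envelope
    phase = np.unwrap(np.angle(analytic))
    inst_freq = np.diff(phase) * fs / (2 * np.pi)   # instantaneous frequency
    print(f"IMF {i}: median frequency {np.median(inst_freq):.2f} Hz")
```

The Hilbert Spectrum is then the time-frequency-energy surface traced by plotting `amplitude` against time and `inst_freq` for every mode.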
Abstract:
Climate-G is a large-scale distributed testbed devoted to climate change research. It is an unfunded effort, started in 2008, involving a wide community in both Europe and the US. The testbed is an interdisciplinary effort involving partners from several institutions and joining expertise in the fields of climate change and computational science. Its main goal is to allow scientists to carry out geographical and cross-institutional data discovery, access, analysis, visualization, and sharing of climate data. It represents an attempt to address, in a real environment, challenging data and metadata management issues. This paper presents a complete overview of the Climate-G testbed, highlighting the most important results achieved since the beginning of the project.
Abstract:
If acid-sensitive drugs or cells are administered orally, there is often a reduction in efficacy associated with gastric passage. Formulation into a polymer matrix is a potential method to improve their stability. Visualizing pH within these materials may help to better understand the action of these polymer systems and allow comparison of different formulations. We herein describe the development of a novel confocal laser-scanning microscopy (CLSM) method for visualizing pH changes within polymer matrices and demonstrate its applicability to an enteric formulation based on chitosan-coated alginate gels. The system in question is first shown to protect an acid-sensitive bacterial strain from low pH, before being studied by our technique. Prior to this study, it had been claimed that protection by these materials is a result of buffering, but this had not been demonstrated. Visualization of pH within these matrices during exposure to a pH 2.0 simulated gastric solution showed an encroachment of acid from the periphery of the capsule and the persistence of pH values above 2.0 within the matrix. This implies that the protective effect of the alginate-chitosan matrices is most likely due to a combination of buffering of acid as it enters the polymer matrix and the slowing of acid penetration.
Abstract:
How can organizations use digital infrastructure to realize physical outcomes? The design and construction of London Heathrow Terminal 5 is analyzed to build new theoretical understanding of visualization and materialization practices in the transition from digital design to physical realization. In the project studied, an integrated software solution is introduced as an infrastructure for delivery. The analyses articulate the work done to maintain this digital infrastructure and to move designs beyond the closed world of the computer into physical reality. In changing medium, engineers use heterogeneous trials to interrogate and address the limitations of an integrated digital model. The paper explains why such trials, which involve the reconciliation of digital and physical data through parallel and iterative forms of work, provide a robust practice for realizing goals that have physical outcomes. It argues that this practice is temporally different from, and at times in conflict with, building a comprehensive dataset within the digital medium. The paper concludes by discussing the implications for organizations that use digital infrastructures in seeking to accomplish goals in digital and physical media.
Abstract:
Pulsed terahertz imaging is being developed as a technique to image obscured mural paintings. Owing to significant advances in terahertz technology, portable systems are now capable of operating in unregulated environments, and this has prompted their use on archaeological excavations. August 2011 saw the first use of pulsed terahertz imaging at the archaeological site of Çatalhöyük, Turkey, where mural paintings dating from the Neolithic period are continuously being uncovered by archaeologists. In these particular paintings, the paint is applied onto an uneven surface and then covered by an equally uneven layer. Traditional terahertz data analysis has proven unsuccessful at sub-surface imaging of these paintings because of the effect of these uneven surfaces. For the first time, an image processing technique based on Gaussian beam-mode coupling is presented that enables visualization of the obscured painting.
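The abstract does not detail the algorithm, so the following is only a schematic sketch of the general idea of beam-mode coupling, under my own assumptions: treat the detected terahertz beam as a fundamental Gaussian mode and score the reflected field by its overlap (coupling) integral with that mode, so that phase errors from an uneven surface can be separated from genuine sub-surface reflectivity.

```python
# Coupling of a sampled complex field to a fundamental Gaussian beam mode.
import numpy as np

def coupling_coefficient(field, w0, dx):
    """|overlap|^2 of a complex field with a normalized fundamental Gaussian."""
    n = field.shape[0]
    x = (np.arange(n) - n / 2) * dx
    X, Y = np.meshgrid(x, x)
    g = np.exp(-(X**2 + Y**2) / w0**2)       # fundamental Gaussian mode
    g /= np.sqrt(np.sum(np.abs(g) ** 2))     # normalize mode power
    f = field / np.sqrt(np.sum(np.abs(field) ** 2))
    return np.abs(np.sum(np.conj(g) * f)) ** 2

# Hypothetical reflected field: Gaussian amplitude with a tilted-surface
# phase ramp, which reduces the coupling below unity.
n, dx, w0 = 64, 0.1e-3, 1.0e-3               # samples, 0.1 mm pitch, 1 mm waist
x = (np.arange(n) - n / 2) * dx
X, Y = np.meshgrid(x, x)
lam = 0.3e-3                                  # ~1 THz wavelength, in metres
field = np.exp(-(X**2 + Y**2) / w0**2) * np.exp(2j * np.pi * X / (10 * lam))
print("coupling:", coupling_coefficient(field, w0, dx))
```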
Abstract:
A novel diagnostic tool is presented, based on polar-cap temperature anomalies, for visualizing daily variability of the Arctic stratospheric polar vortex over multiple decades. This visualization illustrates the ubiquity of extended-time-scale recoveries from stratospheric sudden warmings, termed here polar-night jet oscillation (PJO) events. These are characterized by an anomalously warm polar lower stratosphere that persists for several months. Following the initial warming, a cold anomaly forms in the middle stratosphere, as does an anomalously high stratopause, both of which descend while the lower-stratospheric anomaly persists. These events are characterized in four datasets: Microwave Limb Sounder (MLS) temperature observations; the 40-yr ECMWF Re-Analysis (ERA-40) and Modern Era Retrospective Analysis for Research and Applications (MERRA) reanalyses; and an ensemble of three 150-yr simulations from the Canadian Middle Atmosphere Model. The statistics of PJO events in the model are found to agree very closely with those of the observations and reanalyses. The time scale for the recovery of the polar vortex following sudden warmings correlates strongly with the depth to which the warming initially descends. PJO events occur following roughly half of all major sudden warmings and are associated with an extended period of suppressed wave-activity fluxes entering the polar vortex. They follow vortex splits more frequently than they do vortex displacements. They are also related to weak vortex events as identified by the northern annular mode; in particular, those weak vortex events followed by a PJO event show a stronger tropospheric response. The long time scales, predominantly radiative dynamics, and tropospheric influence of PJO events suggest that they represent an important source of conditional skill in seasonal forecasting.
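The diagnostic the paper builds on reduces, in essence, to an area-weighted polar-cap mean temperature anomaly at each pressure level and day. The sketch below assumes a hypothetical array layout and, for brevity, removes a flat time mean rather than the calendar-day climatology a real analysis would use; it is not the authors' code.

```python
# Area-weighted polar-cap (70-90N) temperature anomaly per level and day.
import numpy as np

def polar_cap_anomaly(T, lats, cap_lat=70.0):
    """T: temperature of shape (time, level, lat, lon); lats in degrees.

    Returns the cos(lat)-weighted cap mean minus its time mean,
    shape (time, level) -- suitable for a time-height anomaly plot."""
    cap = lats >= cap_lat
    w = np.cos(np.radians(lats[cap]))            # area weights
    zonal = T[:, :, cap, :].mean(axis=-1)        # zonal mean: (time, level, lat)
    cap_mean = (zonal * w).sum(axis=-1) / w.sum()
    return cap_mean - cap_mean.mean(axis=0)      # anomaly about the time mean

# Hypothetical ERA-40-like slab: 1000 days, 23 levels, 2.5-degree grid.
lats = np.arange(-90, 91, 2.5)
T = 220 + 10 * np.random.randn(1000, 23, lats.size, 144)
anom = polar_cap_anomaly(T, lats)
print(anom.shape)                                # (1000, 23)
```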
Abstract:
Terahertz pulse imaging (TPI) is a novel noncontact, nondestructive technique for the examination of cultural heritage artifacts. It has the advantages of broadband spectral range, time-of-flight depth resolution, and penetration through optically opaque materials. Fiber-coupled, portable, time-domain terahertz systems have enabled this technique to move out of the laboratory and into the field. Much like the rings of a tree, stratified architectural materials record the chronology of their environmental and aesthetic history. This work concentrates on laboratory models of stratified mosaics and fresco paintings, specimens extracted from a Neolithic excavation site at Çatalhöyük, Turkey, and specimens measured at the medieval Eglise de Saint Jean-Baptiste in Vif, France. Preparatory spectroscopic studies of various composite materials, including lime, gypsum, and clay plasters, are presented to aid the interpretation of results and future computer simulations of the TPI of stratified architectural material. The breadth of the sample range demonstrates the cultural demand and public interest in the life history of buildings. The results illustrate the potential role of TPI in providing both a chronological history of buildings and the visualization of obscured wall paintings and mosaics.
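The time-of-flight depth resolution mentioned above rests on a simple relation: an echo delayed by dt relative to the surface reflection sits at depth d = c * dt / (2 * n), where n is the layer's refractive index. A worked example with illustrative values:

```python
# Convert a terahertz echo delay into a layer depth.
c = 2.998e8          # speed of light, m/s
n_plaster = 2.0      # assumed refractive index of a lime plaster layer
dt = 2.0e-12         # 2 ps delay measured between surface and buried echo
d = c * dt / (2 * n_plaster)                        # factor 2: round trip
print(f"layer depth ~ {d * 1e6:.0f} micrometres")   # ~150 um
```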
Abstract:
Background: The analysis of the Auditory Brainstem Response (ABR) is of fundamental importance to the investigation of auditory system behaviour, though its interpretation has a subjective nature because of the manual process employed in its study and the clinical experience required for its analysis. When analysing the ABR, clinicians are often interested in the identification of ABR signal components referred to as Jewett waves. In particular, the detection and study of the time at which these waves occur (i.e., the wave latency) is a practical tool for the diagnosis of disorders affecting the auditory system. Significant differences in inter-examiner results may lead to completely distinct clinical interpretations of the state of the auditory system. In this context, the aim of this research was to evaluate the inter-examiner agreement and variability in the manual classification of the ABR. Methods: A total of 160 ABR data samples were collected, at four different stimulus intensities (80, 60, 40 and 20 dB HL), from 10 normal-hearing subjects (5 men and 5 women, aged 20 to 52 years). Four examiners with expertise in the manual classification of ABR components participated in the study. The Bland-Altman statistical method was employed for the assessment of inter-examiner agreement and variability. The mean, standard deviation and error of the bias, which is the difference between examiners' annotations, were estimated for each pair of examiners. Scatter plots and histograms were employed for data visualization and analysis. Results: In most comparisons the differences between examiners' annotations were below 0.1 ms, which is clinically acceptable. In four cases, a large error and standard deviation (>0.1 ms) were found, indicating the presence of outliers and thus discrepancies between examiners. Conclusions: Our results quantify the inter-examiner agreement and variability of the manual analysis of ABR data, and they also allow for the determination of different patterns of manual ABR analysis.
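The Bland-Altman quantities the study relies on are easy to compute. The sketch below, a minimal illustration rather than the study's code, plots the paired differences between two examiners' latency annotations against their means, with the bias and its 95% limits of agreement; the annotation values are synthetic.

```python
# Bland-Altman comparison of two examiners' wave-latency annotations.
import numpy as np
import matplotlib.pyplot as plt

def bland_altman(a, b, ax):
    """a, b: latency annotations (ms) from two examiners on the same waves."""
    mean = (a + b) / 2.0
    diff = a - b
    bias = diff.mean()                       # mean paired difference
    sd = diff.std(ddof=1)
    ax.scatter(mean, diff, s=12)
    # Bias and 95% limits of agreement (bias +/- 1.96 SD).
    for y in (bias, bias + 1.96 * sd, bias - 1.96 * sd):
        ax.axhline(y, linestyle="--")
    ax.set_xlabel("mean latency (ms)")
    ax.set_ylabel("difference (ms)")
    return bias, sd

# Hypothetical annotations for 160 waves by two examiners.
rng = np.random.default_rng(1)
a = 5.6 + 0.3 * rng.standard_normal(160)
b = a + 0.02 + 0.05 * rng.standard_normal(160)
fig, ax = plt.subplots()
bias, sd = bland_altman(a, b, ax)
print(f"bias = {bias:.3f} ms, SD = {sd:.3f} ms")
```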
Abstract:
In the past decade, the analysis of data has faced the challenge of dealing with very large and complex datasets and the real-time generation of data. Technologies to store and access these complex and large datasets are in place. However, robust and scalable analysis technologies are needed to extract meaningful information from them. The research field of Information Visualization and Visual Data Analytics addresses this need. Information visualization and data mining are often used to complement each other, their common goal being the extraction of meaningful information from complex and possibly large data. However, whereas data mining relies on the processing power of silicon hardware, visualization techniques also aim to exploit the powerful image-processing capabilities of the human brain. This article surveys research on data visualization and visual analytics techniques, and highlights existing visual analytics techniques, systems, and applications, including a perspective on the field from the chemical process industry.
Abstract:
The aim of this article is to improve the communication of the probabilistic flood forecasts generated by hydrological ensemble prediction systems (HEPS) by understanding how different methods of visualizing probabilistic forecast information are perceived. This study focuses on inter-expert communication and accounts for differences in visualization requirements based on the information content necessary for individual users. The perceptions of the expert group addressed in this study are important because they are the designers and primary users of existing HEPS. Nevertheless, they have sometimes resisted the release of uncertainty information to the general public because of doubts about whether it can be successfully communicated in ways that are readily understood by nonexperts. In this article, we explore the strengths and weaknesses of existing HEPS visualization methods and thereby formulate some wider recommendations about best practice for HEPS visualization and communication. We suggest that specific training on probabilistic forecasting would foster the use of probabilistic forecasts in a wider range of applications. The results of a case-study exercise showed that there is no overarching agreement between experts on how to display probabilistic forecasts or on what they consider the essential information that should accompany plots and diagrams. We propose a list of minimum properties that, if consistently displayed with probabilistic forecasts, would make the products more easily understandable.