95 results for 3D scalar data
in Repositório Científico do Instituto Politécnico de Lisboa - Portugal
Abstract:
The LHC has found hints of a Higgs particle at 125 GeV. We investigate the possibility that such a particle is a mixture of scalar and pseudoscalar states. For definiteness, we concentrate on a two-Higgs-doublet model with explicit CP violation and soft Z₂ violation. Including all Higgs production mechanisms, we determine the current constraints obtained by comparing h → γγ with h → VV*, and comment on the information that can be gained from measurements of h → bb̄. We find the bound |s₂| ≲ 0.83 at one sigma, where |s₂| = 0 (|s₂| = 1) corresponds to a pure scalar (pure pseudoscalar) state.
Abstract:
A great number of low-temperature geothermal fields occur in northern Portugal, related to fractured rocks. The most important surface manifestations of these hydrothermal systems appear in pull-apart tectonic basins and are strongly conditioned by the orientation of the main fault systems in the region. This work presents the interpretation of gravity gradient maps and a 3D inversion model produced from a regional gravity survey. The horizontal gradients reveal a complex fault system. The obtained 3D model of density contrast highlights the main fault zone in the region and the depth distribution of the granitic bodies. Their relationship with the hydrothermal systems supports the conceptual models elaborated from hydrochemical and isotopic water analyses. This work emphasizes the role of the gravity method and its analysis in better understanding the connection between hydrothermal systems, the fractured-rock pattern, and the surrounding geology.
Abstract:
In this work, we present a neural network (NN) based method for 3D rigid-body registration of fMRI time series, which relies on a limited number of Fourier coefficients of the images to be aligned. These coefficients, comprised in a small cubic neighborhood located at the first octant of a 3D Fourier space (including the DC component), are fed into six NNs during the learning stage. Each NN yields the estimate of one registration parameter. The proposed method was assessed for 3D rigid-body transformations using DC neighborhoods of different sizes. The mean absolute registration errors are approximately 0.030 mm in translation and 0.030 deg in rotation for the typical motion amplitudes encountered in fMRI studies. The construction of the training set and the learning stage are fast, requiring 90 s and 1 to 12 s, respectively, depending on the number of input and hidden units of the NN. We believe that NN-based approaches to the problem of fMRI registration can be of great interest in the future. For instance, NNs relying on limited k-space data (possibly in navigator echoes) can be a valid solution to the problem of prospective (in-frame) fMRI registration.
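As a rough illustration of the kind of input such networks receive, the sketch below (not the authors' code; the neighborhood size, volume dimensions, and feature encoding are assumptions for illustration) extracts a small cube of low-frequency coefficients from the first octant of the 3D spectrum:

```python
import numpy as np

def fourier_neighborhood(volume, k=3):
    """Extract the k x k x k cube of lowest-frequency Fourier coefficients
    from the first octant of the 3D spectrum (DC term included), returned
    as a real-valued feature vector suitable as NN input."""
    spectrum = np.fft.fftn(volume)
    cube = spectrum[:k, :k, :k]  # first octant, lowest spatial frequencies
    # Split into real and imaginary parts to obtain a real-valued vector.
    return np.concatenate([cube.real.ravel(), cube.imag.ravel()])

vol = np.random.rand(64, 64, 32)           # one fMRI volume (size is arbitrary)
features = fourier_neighborhood(vol, k=3)  # 2 * 3**3 = 54 inputs per volume
```

In the paper's setup, six such feature vectors per volume pair would feed six separate networks, one per rigid-body parameter.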
Abstract:
Final Master's project submitted for the degree of Master in Civil Engineering
Abstract:
This paper addresses the estimation of surfaces from a set of 3D points using the unified framework described in [1]. This framework proposes the use of competitive learning for curve estimation, i.e., a set of points is defined on a deformable curve and they all compete to represent the available data. This paper extends the unified framework to surface estimation. It is shown that competitive learning performs better than snakes, improving the model's performance in the presence of concavities and allowing it to discriminate between close surfaces. The proposed model is evaluated using synthetic data and medical images (MRI and ultrasound).
Abstract:
This paper addresses the estimation of object boundaries from a set of 3D points. An extension of the constrained clustering algorithm developed by Abrantes and Marques in the context of edge linking is presented. The object surface is approximated using rectangular meshes and simplex nets. Centroid-based forces attract the model nodes towards the data, using competitive learning methods. It is shown that competitive learning improves the model's performance in the presence of concavities and allows it to discriminate between close surfaces. The proposed model is evaluated using synthetic data and medical images (MRI and ultrasound).
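The centroid-based competitive update at the heart of this family of methods can be sketched as follows (a minimal illustration, not the constrained clustering algorithm itself; the node layout, data, and learning rate are arbitrary):

```python
import numpy as np

def competitive_update(nodes, data, lr=0.5):
    """One competitive-learning pass: each data point attracts only the
    closest model node (the winner), pulling it toward the data with a
    centroid-like force."""
    nodes = nodes.copy()
    for x in data:
        winner = np.argmin(np.linalg.norm(nodes - x, axis=1))
        nodes[winner] += lr * (x - nodes[winner])
    return nodes

# Two coincident nodes split apart to represent two separate data points,
# which is the behavior that lets the model discriminate close surfaces.
nodes = competitive_update(np.zeros((2, 2)), np.array([[1.0, 0.0], [-1.0, 0.0]]))
```

Because only the winning node moves, distinct data clusters recruit distinct nodes instead of dragging the whole model, unlike a snake's global energy minimization.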
Abstract:
We investigate a mechanism that generates exact solutions of scalar field cosmologies in a unified way. The procedure investigated here permits the recovery of almost all known solutions, and allows one to derive new solutions as well. In particular, we derive and discuss one novel solution defined in terms of the Lambert function. The solutions are organised in a classification that depends on the choice of a generating function, denoted by x(φ), which reflects the underlying thermodynamics of the model. We also analyse and discuss the existence of form-invariance dualities between solutions. A general way of defining the latter in an appropriate fashion for scalar fields is put forward.
Abstract:
Lossless compression algorithms of the Lempel-Ziv (LZ) family are widely used nowadays. Regarding time and memory requirements, LZ encoding is much more demanding than decoding. In order to speed up the encoding process, efficient data structures, like suffix trees, have been used. In this paper, we explore the use of suffix arrays to hold the dictionary of the LZ encoder, and propose an algorithm to search over it. We show that the resulting encoder attains roughly the same compression ratios as those based on suffix trees. However, the amount of memory required by the suffix array is fixed, and much lower than the variable amount of memory used by encoders based on suffix trees (which depends on the text to encode). We conclude that suffix arrays, when compared to suffix trees in terms of the trade-off among time, memory, and compression ratio, may be preferable in scenarios (e.g., embedded systems) where memory is at a premium and high speed is not critical.
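A minimal sketch of the idea, assuming a naive O(n² log n) suffix-array construction and a prefix-by-prefix binary search (a production LZ encoder would use a faster construction and would not materialize the suffixes):

```python
import bisect

def build_suffix_array(text):
    """Suffix array: starting positions of all suffixes, in sorted order."""
    return sorted(range(len(text)), key=lambda i: text[i:])

def longest_match(text, sa, pattern):
    """Longest prefix of `pattern` occurring in `text`, found by binary
    search over the suffix array, as in an LZ dictionary lookup.
    Returns (match_length, position_in_text)."""
    suffixes = [text[i:] for i in sa]  # materialized only for clarity
    best_len, best_pos = 0, -1
    for m in range(1, len(pattern) + 1):
        p = pattern[:m]
        j = bisect.bisect_left(suffixes, p)
        if j < len(suffixes) and suffixes[j].startswith(p):
            best_len, best_pos = m, sa[j]
        else:
            break
    return best_len, best_pos

text = "abracadabra"
sa = build_suffix_array(text)
match = longest_match(text, sa, "abrax")  # → (4, 7): "abra" found at position 7
```

The key memory property mentioned above is visible here: the suffix array is a fixed-size integer array of length n, whereas a suffix tree's node count (and hence memory) varies with the text being encoded.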
Abstract:
We analyze generalized CP symmetries of two-Higgs doublet models, extending them from the scalar to the fermion sector of the theory. We show that, other than the usual CP transformation, there is only one of those symmetries which does not imply massless charged fermions. That single model which accommodates a fermionic mass spectrum compatible with experimental data possesses a remarkable feature. Through a soft breaking of the symmetry it displays a new type of spontaneous CP violation, which does not occur in the scalar sector responsible for the symmetry breaking mechanism but, rather, in the fermion sector.
Abstract:
Opposite enantiomers exhibit different NMR properties in the presence of an external common chiral element, and a chiral molecule exhibits different NMR properties in the presence of external enantiomeric chiral elements. Automatic prediction of such differences, and comparison with experimental values, leads to the assignment of the absolute configuration. Here two cases are reported, one using a dataset of 80 chiral secondary alcohols esterified with (R)-MTPA and the corresponding 1H NMR chemical shifts, and the other with 94 13C NMR chemical shifts of chiral secondary alcohols in two enantiomeric chiral solvents. For the first application, counterpropagation neural networks were trained to predict the sign of the difference between chemical shifts of opposite stereoisomers. The neural networks were trained to process the chirality code of the alcohol as the input, and to give the NMR property as the output. In the second application, similar neural networks were employed, but the property to predict was the difference of chemical shifts in the two enantiomeric solvents. For independent test sets of 20 objects, 100% correct predictions were obtained in both applications concerning the sign of the chemical shift differences. Additionally, with the second dataset, the difference of chemical shifts in the two enantiomeric solvents was quantitatively predicted, yielding r² = 0.936 between the predicted and experimental values for the test set.
Abstract:
This paper presents an investigation of cloud-to-ground lightning activity over the continental territory of Portugal, using data collected by the national Lightning Location System. The Lightning Location System in Portugal is first presented. Analyses of the geographical, seasonal, and polarity distribution of cloud-to-ground lightning activity, and of the cumulative probability of peak current, are carried out. An overall ground flash density map is constructed from the database, which contains more than five years of information and almost four million records. This map is compared with the thunderstorm-days map produced by the Portuguese Institute of Meteorology and with the orographic map of Portugal. Finally, conclusions are drawn.
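A ground flash density map of this kind amounts to 2D binning of located flashes, normalized by cell area and observation period. A minimal sketch (the grid, data, and constant-cell-area simplification are assumptions, not the paper's actual processing):

```python
import numpy as np

def flash_density(lats, lons, lat_edges, lon_edges, years, cell_km2):
    """Ground flash density (flashes per km^2 per year) by 2D binning of
    located cloud-to-ground flashes over an observation period.
    Assumes a constant cell area for simplicity."""
    counts, _, _ = np.histogram2d(lats, lons, bins=[lat_edges, lon_edges])
    return counts / (years * cell_km2)

# Hypothetical flash locations roughly spanning continental Portugal.
rng = np.random.default_rng(1)
lats, lons = rng.uniform(37, 42, 1000), rng.uniform(-9.5, -6.5, 1000)
density = flash_density(lats, lons, np.linspace(37, 42, 11),
                        np.linspace(-9.5, -6.5, 7), years=5.0, cell_km2=100.0)
```

A real map would compute each cell's area from its latitude span rather than using a single constant.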
Abstract:
We present a study of the magnetic properties of a group of basalt samples from the Saldanha Massif (Mid-Atlantic Ridge, MAR; 36°33′54″N, 33°26′W), and we set out to interpret these properties within the tectono-magmatic framework of this sector of the MAR. Most samples have low magnetic anisotropy and magnetic minerals of single-domain grain size, typical of rapid cooling. The thermomagnetic study mostly shows two different susceptibility peaks. The high-temperature peak is related to mineralogical alteration due to heating. The low-temperature peak distinguishes three different stages of low-temperature oxidation: the presence of titanomagnetite; of titanomagnetite and titanomaghemite; and exclusively of titanomaghemite. Based on established empirical relationships between Curie temperature and degree of oxidation, the latter is tentatively deduced for all samples. Finally, swath bathymetry and sidescan sonar data combined with dive observations show that the Saldanha Massif is located over an exposed section of upper-mantle rocks, interpreted to be the result of detachment tectonics. Basalt samples inside the detachment zone often have higher-than-expected oxidation rates; this effect can be explained by the higher permeability caused by the detachment fault activity.
Abstract:
The 27 December 1722 Algarve earthquake destroyed a large area in southern Portugal, generating a local tsunami that inundated the shallow areas of Tavira. It is unclear whether its source was located onshore or offshore and, in either case, which tectonic source was responsible for the event. We analyze the available historical information concerning macroseismicity and the tsunami to discuss the most probable location of the source. We also review the available seismotectonic knowledge of the offshore region close to the probable epicenter, selecting a set of four candidate sources. We simulate the tsunamis produced by these candidate sources, assuming that the sea-bottom displacement is caused by a compressive dislocation over a rectangular fault, as given by the homogeneous elastic half-space approach, and we use numerical modeling to study wave propagation and run-up. We conclude that the 27 December 1722 Tavira earthquake and tsunami were probably generated offshore, close to 37°01′N, 7°49′W.
Abstract:
Although stock prices fluctuate, the variations are relatively small and are frequently assumed to be normally distributed on a large time scale. But sometimes these fluctuations can become determinant, especially when unforeseen large drops in asset prices are observed that could result in huge losses or even in market crashes. The evidence shows that these events happen far more often than would be expected under the generalized assumption of normally distributed financial returns. Thus it is crucial to properly model the distribution tails so as to be able to predict the frequency and magnitude of extreme stock price returns. In this paper we follow the approach suggested by McNeil and Frey (2000) and combine GARCH-type models with Extreme Value Theory (EVT) to estimate the tails of three financial index returns (DJI, FTSE 100, and NIKKEI 225), representing three important financial areas of the world. Our results indicate that EVT-based conditional quantile estimates are much more accurate than those from conventional AR-GARCH models assuming normal or Student's t-distributed innovations when doing out-of-sample estimation (within in-sample estimation, this holds for the right tail of the distribution of returns).
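The EVT step above is a peaks-over-threshold fit: exceedances over a high threshold are modeled by a Generalized Pareto Distribution (GPD), whose inverted tail formula gives the extreme quantile. The sketch below uses a method-of-moments GPD fit for self-containedness (McNeil and Frey use maximum likelihood and first filter returns through a GARCH model, both omitted here), on synthetic heavy-tailed data:

```python
import numpy as np

def evt_var(losses, threshold_q=0.95, alpha=0.99):
    """Peaks-over-threshold tail estimate: fit a GPD to the exceedances
    above a high threshold by the method of moments, then invert the
    tail formula for the alpha-quantile (Value-at-Risk)."""
    u = np.quantile(losses, threshold_q)
    y = losses[losses > u] - u                # exceedances over threshold u
    m, v = y.mean(), y.var()
    xi = 0.5 * (1.0 - m * m / v)              # GPD shape (method of moments)
    beta = 0.5 * m * (m * m / v + 1.0)        # GPD scale (method of moments)
    n, n_u = len(losses), len(y)
    # VaR_alpha = u + (beta / xi) * (((n / n_u) * (1 - alpha)) ** (-xi) - 1)
    return u + (beta / xi) * (((n / n_u) * (1 - alpha)) ** (-xi) - 1.0)

rng = np.random.default_rng(0)
losses = rng.standard_t(df=5, size=100_000)   # heavy-tailed synthetic returns
var99 = evt_var(losses)                       # estimated 99% tail quantile
```

The point of the construction is that the GPD extrapolates beyond the threshold, so quantiles far in the tail are estimated from the tail's own shape rather than from a normality assumption.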
Abstract:
Background: With the decrease in DNA sequencing costs, sequence-based typing methods are rapidly becoming the gold standard for epidemiological surveillance. These methods provide the reproducible and comparable results needed for a global-scale bacterial population analysis, while retaining their usefulness for local epidemiological surveys. Online databases that collect the generated allelic profiles and associated epidemiological data are available, but this wealth of data remains underused and is frequently poorly annotated, since no user-friendly tool exists to analyze and explore it. Results: PHYLOViZ is platform-independent Java software that allows the integrated analysis of sequence-based typing methods, including SNP data generated from whole-genome sequencing approaches, and associated epidemiological data. goeBURST and its Minimum Spanning Tree expansion are used for visualizing the possible evolutionary relationships between isolates. The results can be displayed as an annotated graph overlaying the query results of any other epidemiological data available. Conclusions: PHYLOViZ is user-friendly software that allows the combined analysis of multiple data sources for microbial epidemiological and population studies. It is freely available at http://www.phyloviz.net.
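The minimum-spanning-tree idea can be illustrated with a toy example: plain Prim's algorithm over Hamming distances between hypothetical allelic profiles (goeBURST additionally applies specific tie-breaking rules on variant counts, which this sketch omits):

```python
def hamming(a, b):
    """Number of loci at which two allelic profiles differ."""
    return sum(x != y for x, y in zip(a, b))

def minimum_spanning_tree(profiles):
    """Prim's algorithm over pairwise Hamming distances: repeatedly attach
    the out-of-tree profile closest to any in-tree profile."""
    in_tree, edges = {0}, []
    while len(in_tree) < len(profiles):
        d, i, j = min((hamming(profiles[i], profiles[j]), i, j)
                      for i in in_tree
                      for j in range(len(profiles)) if j not in in_tree)
        edges.append((i, j, d))
        in_tree.add(j)
    return edges

# Hypothetical 7-locus (MLST-like) allelic profiles.
profiles = [(1, 1, 1, 1, 1, 1, 1),
            (1, 1, 1, 1, 1, 1, 2),   # single-locus variant of the first
            (1, 2, 3, 1, 1, 1, 2)]
tree = minimum_spanning_tree(profiles)  # edges as (from, to, distance)
```

Each edge links an isolate to its closest already-placed relative, so single-locus variants end up directly connected, which is the evolutionary signal the visualization exposes.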