933 results for non-trivial data structures
Abstract:
We propose a modification of standard linear electrodynamics in four dimensions, where effective non-trivial interactions of the electromagnetic field with itself and with matter fields induce Lorentz-violating Chern-Simons terms. This has two consequences: it provides a more realistic and general scenario for the breakdown of Lorentz symmetry in electromagnetism, and it may explain the effective behavior of the electromagnetic field in certain planar phenomena (for instance, the Hall effect). A number of proposals for non-linear electrodynamics are discussed throughout the paper. Important physical implications of the breaking of Lorentz symmetry, such as optical birefringence and the possibility of conductance in the vacuum, are commented on.
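For orientation, the kind of Lorentz-violating Chern-Simons term usually discussed in this context (the Carroll-Field-Jackiw term) has the generic four-dimensional form sketched below; whether the effective term induced by the interactions proposed in the paper takes exactly this form is an assumption made here for illustration.

    \begin{equation}
      \mathcal{L}_{\mathrm{CS}}
        = \tfrac{1}{2}\, k_{\mu}\, \epsilon^{\mu\nu\rho\sigma} A_{\nu} F_{\rho\sigma},
      \qquad
      F_{\rho\sigma} = \partial_{\rho} A_{\sigma} - \partial_{\sigma} A_{\rho},
    \end{equation}

where k_mu is a fixed background four-vector selecting a preferred spacetime direction; it is such a background term that is generically responsible for effects like the vacuum birefringence mentioned above.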
Abstract:
It is quite difficult to obtain non-trivial chiral symmetry breaking solutions for the quark gap equation in the presence of dynamically generated gluon masses. An effective confining propagator has recently been proposed by Cornwall in order to solve this problem. We study phenomenological consequences of this approach, showing its compatibility with the experimental data. We argue that this confining propagator should be restricted to a small region of momenta, leading to effective four-fermion interactions at low energy. © 2013 American Institute of Physics.
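For context, the quark gap equation referred to above is the Schwinger-Dyson equation for the quark self-energy, shown here schematically in the rainbow approximation; the specific effective confining propagator proposed by Cornwall is not reproduced, so the form of D_munu below is left generic.

    \begin{equation}
      \Sigma(p) \;=\; C_F\, g^{2} \int \frac{d^{4}k}{(2\pi)^{4}}\;
        \gamma^{\mu}\, S(k)\, \gamma^{\nu}\, D_{\mu\nu}(p-k),
      \qquad
      S(k)^{-1} = \gamma \cdot k \,-\, \Sigma(k),
    \end{equation}

where a non-trivial, chiral-symmetry-breaking solution is one with Sigma different from zero in the chiral limit.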
Abstract:
The design of a network is a solution to several engineering and science problems. Several network design problems are known to be NP-hard, and population-based metaheuristics like evolutionary algorithms (EAs) have been widely investigated for such problems. Such optimization methods simultaneously generate a large number of potential solutions in order to explore the search space in breadth and, consequently, to avoid local optima. Obtaining a potential solution usually involves the construction and maintenance of several spanning trees or, more generally, spanning forests. To efficiently explore the search space, special data structures have been developed to provide operations that manipulate a set of spanning trees (a population). For a tree with n nodes, the most efficient data structures available in the literature require time O(n) to generate a new spanning tree that modifies an existing one and to store the new solution. We propose a new data structure, called the node-depth-degree representation (NDDR), and we demonstrate that, using this encoding, generating a new spanning forest requires average time O(√n). Experiments with an EA based on the NDDR applied to large-scale instances of the degree-constrained minimum spanning tree problem have shown that the implementation adds only small constants and lower-order terms to the theoretical bound.
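As a rough illustration of the kind of encoding involved, the sketch below implements a simplified node-depth representation in Python and a prune-and-graft move that turns one spanning tree into another. It is not the authors' NDDR: it keeps no degree information, the helper names are hypothetical, and its list slicing is O(n), whereas the NDDR achieves O(√n) on average with additional indexing.

    # Simplified node-depth encoding: a preorder list of (node, depth) pairs.
    from typing import List, Tuple

    NodeDepth = List[Tuple[int, int]]

    def subtree_range(tree: NodeDepth, i: int) -> Tuple[int, int]:
        """Half-open range [i, j) of the subtree rooted at position i."""
        root_depth = tree[i][1]
        j = i + 1
        while j < len(tree) and tree[j][1] > root_depth:
            j += 1
        return i, j

    def prune_and_graft(tree: NodeDepth, src: int, dest: int) -> NodeDepth:
        """Cut the subtree at position src and re-attach it below the node at position dest."""
        lo, hi = subtree_range(tree, src)
        if lo <= dest < hi:
            raise ValueError("cannot graft a subtree below one of its own nodes")
        subtree = tree[lo:hi]
        rest = tree[:lo] + tree[hi:]
        new_dest = dest if dest < lo else dest - (hi - lo)   # destination index after the cut
        shift = rest[new_dest][1] + 1 - subtree[0][1]        # hang the subtree one level below dest
        shifted = [(node, depth + shift) for node, depth in subtree]
        return rest[:new_dest + 1] + shifted + rest[new_dest + 1:]

    # Tree 0 -> {1 -> {3}, 2}; move the subtree rooted at node 1 below node 2.
    t = [(0, 0), (1, 1), (3, 2), (2, 1)]
    print(prune_and_graft(t, 1, 3))   # [(0, 0), (2, 1), (1, 2), (3, 3)]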
Abstract:
The objective of this study was to estimate, in a population of crossbred cattle, the non-additive genetic effects for the traits weight at 205 days, weight at 390 days, and scrotal circumference, and to evaluate the inclusion of these effects in the prediction of breeding values of sires using different estimation methodologies. In method 1, the data were pre-adjusted for the non-additive effects, obtained by the least squares means method in a model that considered the direct additive, maternal, and fixed non-additive genetic effects, the direct and total maternal heterozygosities, and epistasis. In method 2, the non-additive effects were included as covariates in the genetic model. Breeding values for the adjusted and non-adjusted data were predicted considering direct additive and maternal effects and, for weight at 205 days, also the permanent environmental effect, as random effects in the model. The breeding values of the sire categories for weight at 205 days were organized into files in order to verify changes in the magnitude of the predictions and in the ranking of animals under the two methods of correcting the data for the non-additive effects. The non-additive effects were not similar in magnitude and direction between the two estimation methods, nor across the traits evaluated. Pearson and Spearman correlations between breeding values were higher than 0.94, and the use of different methods does not imply changes in the selection of animals.
Abstract:
The main scope of my PhD is the reconstruction of the large-scale bivalve phylogeny on the basis of four mitochondrial genes, with samples taken from all major groups of the class. To my knowledge, this is the first attempt of such breadth in Bivalvia. I decided to focus on both ribosomal and protein-coding DNA sequences (two ribosomal RNA genes, 12S and 16S, and two protein-coding genes, cytochrome c oxidase I and cytochrome b), since both the literature and my preliminary results confirmed the importance of combined gene signals in resolving the evolutionary pathways of the group. Moreover, I wanted to propose a methodological pipeline that proved useful for obtaining robust results in bivalve phylogeny. Best-performing taxon sampling and alignment strategies were tested, and several data partitioning schemes and molecular evolution models were analyzed, demonstrating the importance of shaping and implementing non-trivial evolutionary models. In line with a more rigorous approach to data analysis, I also proposed a new method to assess taxon sampling, developed from Clarke and Warwick's statistics: taxon sampling is a major concern in phylogenetic studies, and incomplete, biased, or improper taxon assemblies can lead to misleading results in reconstructing evolutionary trees. Theoretical methods are already available to optimize taxon choice in phylogenetic analyses, but most involve some knowledge of the genetic relationships of the group of interest, or even a well-established phylogeny itself; these data are not always available in general phylogenetic applications. The method I propose measures the "phylogenetic representativeness" of a given sample or set of samples and is based entirely on the pre-existing taxonomy of the ingroup, which is commonly known to investigators. Moreover, it also accounts for instability and discordance in taxonomies. A Python-based script suite, called PhyRe, has been developed to implement all analyses.
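As a minimal sketch of the kind of statistic involved, the Python snippet below computes an average taxonomic distinctness in the spirit of Clarke and Warwick from a rank-based classification of the sampled taxa; the function names and the toy classification are illustrative and do not reflect the PhyRe API.

    from itertools import combinations

    def pairwise_distinctness(lineage_a, lineage_b):
        """Path-length weight: number of ranks climbed before the two lineages meet.
        Lineages are tuples ordered from lowest rank (genus) upwards."""
        for level, (a, b) in enumerate(zip(lineage_a, lineage_b)):
            if a == b:
                return level            # the two taxa meet at this rank
        return len(lineage_a)           # only joined above the listed ranks

    def average_taxonomic_distinctness(sample):
        """Mean pairwise distinctness over all pairs of taxa in the sample."""
        pairs = list(combinations(sample.values(), 2))
        return sum(pairwise_distinctness(a, b) for a, b in pairs) / len(pairs)

    # toy ingroup classification: species -> (genus, family, order)
    bivalves = {
        "Mytilus edulis":            ("Mytilus", "Mytilidae", "Mytilida"),
        "Mytilus galloprovincialis": ("Mytilus", "Mytilidae", "Mytilida"),
        "Ostrea edulis":             ("Ostrea",  "Ostreidae", "Ostreida"),
        "Pecten maximus":            ("Pecten",  "Pectinidae", "Pectinida"),
    }
    print(average_taxonomic_distinctness(bivalves))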
Abstract:
The heavy-fermion compound UNi2Al3 exhibits the coexistence of superconductivity and magnetic order at low temperatures, stimulating speculation about a possible exotic Cooper-pairing interaction in this superconductor. However, the preparation of good-quality bulk single crystals of UNi2Al3 has proven to be a non-trivial task due to metallurgical problems, which result in the formation of a UAl2 impurity phase and hence strongly reduced sample purity. The present work concentrates on the preparation, characterization, and investigation of the electronic properties of UNi2Al3 single-crystalline thin-film samples. The thin films were prepared in a molecular beam epitaxy (MBE) system. (100)-oriented epitaxial thin films of UNi2Al3 were grown on single-crystalline YAlO3 substrates cut in the (010) or (112) direction. The high crystallographic quality of the samples was confirmed by several characterization methods, such as X-ray analysis, RHEED and TEM. To study the magnetic structure of the epitaxial thin films, resonant magnetic x-ray scattering was employed. The magnetic order of the thin-film samples, the formation of magnetic domains with different moment directions, and the magnetic correlation length are discussed. The electronic properties of the UNi2Al3 thin films in the normal and superconducting states were investigated by means of transport measurements. A pronounced anisotropy of the temperature-dependent resistivity ρ(T) was observed. Moreover, it was found that the temperature of the resistive superconducting transition depends on the current direction, providing evidence for multiband superconductivity in UNi2Al3. The initial slope of the upper critical field H′c2(T) of the thin-film samples suggests an unconventional spin-singlet superconducting state, as opposed to bulk single-crystal data. To probe the superconducting gap of UNi2Al3 directly by means of tunneling spectroscopy, many planar junctions of different designs were prepared employing different techniques. Although the junctions were in the tunneling regime, no features of the superconducting density of states of UNi2Al3 were ever observed. It is assumed that the absence of UNi2Al3 gap features in the tunneling spectra was caused by imperfections of the tunneling contacts. The superconductivity of UNi2Al3 was probably suppressed in a degraded surface layer only, resulting in tunneling into non-superconducting UNi2Al3. However, alternative explanations, such as intrinsic pair-breaking effects at the interface to the barrier, are also possible.
Abstract:
This work presents exact algorithms for Resource Allocation and Cyclic Scheduling Problems (RA&CSPs). Cyclic scheduling problems arise in a number of application areas, such as hoist scheduling, mass production, compiler design (implementing scheduling loops on parallel architectures), software pipelining, and embedded system design. The RA&CS problem concerns the assignment of times and resources to a set of activities, to be repeated indefinitely, subject to precedence and resource capacity constraints. In this work we present two constraint programming frameworks addressing two different types of cyclic problems. First, we consider the disjunctive RA&CSP, where the allocation problem involves unary resources. Instances are described through the Synchronous Data-flow (SDF) Model of Computation. The key problem of finding a maximum-throughput allocation and scheduling of Synchronous Data-Flow graphs onto a multi-core architecture is NP-hard and has traditionally been solved by means of heuristic (incomplete) algorithms. We propose an exact (complete) algorithm for the computation of a maximum-throughput mapping of applications, specified as SDFGs, onto multi-core architectures. Results show that the approach can handle realistic instances in terms of size and complexity. Next, we tackle the Cyclic Resource-Constrained Scheduling Problem (CRCSP). We propose a constraint programming approach based on modular arithmetic: in particular, we introduce a modular precedence constraint and a global cumulative constraint along with their filtering algorithms. Many traditional approaches to cyclic scheduling operate by fixing the period value and then solving a linear problem in a generate-and-test fashion. Conversely, our technique is based on a non-linear model and tackles the problem as a whole: the period value is inferred from the scheduling decisions. The proposed approaches have been tested on a number of non-trivial synthetic instances and on a set of realistic industrial instances, achieving good results on problems of practical size.
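As an illustration of the modular-arithmetic view of cyclic scheduling, the sketch below checks a candidate periodic schedule against precedence arcs of the form (a, b, delta), meaning that activity b of iteration k + delta must start after activity a of iteration k has finished; this mirrors only the idea behind a modular precedence constraint, not the thesis's constraint model or filtering algorithms, and all names are illustrative.

    def modular_precedence_ok(start_a, dur_a, start_b, delta, period):
        """True if activity b, delayed by `delta` iterations, starts after a ends."""
        return start_b + delta * period >= start_a + dur_a

    def schedule_feasible(starts, durations, precedences, period):
        """Check every (a, b, delta) precedence arc for a candidate periodic schedule."""
        return all(
            modular_precedence_ok(starts[a], durations[a], starts[b], delta, period)
            for a, b, delta in precedences
        )

    # toy instance: three activities repeated with period 10
    starts = {"A": 0, "B": 4, "C": 7}
    durations = {"A": 4, "B": 3, "C": 3}
    precedences = [("A", "B", 0), ("B", "C", 0), ("C", "A", 1)]  # C feeds A of the next iteration
    print(schedule_feasible(starts, durations, precedences, 10))  # True

In a generate-and-test scheme the period would be fixed before such checks; in the approach described above the period is itself a decision variable inferred from the scheduling decisions.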
Abstract:
In this thesis, the evolution of methods for the analysis of techno-social systems will be reported through the various research experiences directly undertaken. The first case presented is a study based on data mining of a word-association dataset named Human Brain Cloud: its validation will be addressed and, also through non-trivial modeling, a better understanding of language properties will be presented. Then, a real complex-system experiment will be introduced: the WideNoise experiment in the context of the EveryAware European project. The project and the course of the experiment will be illustrated and the data analysis will be presented. Next, the Experimental Tribe platform for social computation will be introduced. It has been conceived to help researchers implement web experiments, and it also aims to catalyze the cumulative growth of experimental methodologies and the standardization of the tools cited above. In the last part, three other research experiences which already took place on the Experimental Tribe platform will be discussed in detail, from the design of the experiments to the analysis of the results and, eventually, to the modeling of the systems involved. The experiments are: CityRace, about the measurement of human strategies in dealing with traffic; laPENSOcosì, aiming to unveil the structure of political opinion; and AirProbe, implemented again within the EveryAware project framework, which consisted of monitoring the shift in air-quality opinion of a community informed about local air pollution. In the end, the evolution of the methods for investigating techno-social systems shall emerge, together with the opportunities and the threats offered by this new scientific path.
Abstract:
In the present thesis, we study quantization of classical systems with non-trivial phase spaces using the group-theoretical quantization technique proposed by Isham. Our main goal is a better understanding of global and topological aspects of quantum theory. In practice, the group-theoretical approach enables direct quantization of systems subject to constraints and boundary conditions in a natural and physically transparent manner -- cases for which the canonical quantization method of Dirac fails. First, we provide a clarification of the quantization formalism. In contrast to prior treatments, we introduce a sharp distinction between the two group structures that are involved and explain their physical meaning. The benefit is a consistent and conceptually much clearer construction of the Canonical Group. In particular, we shed light upon the 'pathological' case for which the Canonical Group must be defined via a central Lie algebra extension and emphasise the role of the central extension in general. In addition, we study direct quantization of a particle restricted to a half-line with 'hard wall' boundary condition. Despite the apparent simplicity of this example, we show that a naive quantization attempt based on the cotangent bundle over the half-line as classical phase space leads to an incomplete quantum theory; the reflection which is a characteristic aspect of the 'hard wall' is not reproduced. Instead, we propose a different phase space that realises the necessary boundary condition as a topological feature and demonstrate that quantization yields a suitable quantum theory for the half-line model. The insights gained in the present special case improve our understanding of the relation between classical and quantum theory and illustrate how contact interactions may be incorporated.
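As a standard illustration of group-theoretical quantization on a non-trivial configuration space (not the thesis's half-line construction with the 'hard wall'), the half-line x > 0 is commonly quantized by replacing the canonical pair (x, p), whose momentum operator fails to be self-adjoint there, by the affine pair (x, d = xp), which closes under the Poisson bracket and exponentiates to the affine group:

    \begin{align}
      \{x, d\} &= x, \qquad x > 0, \\
      [\,\hat{x}, \hat{d}\,] &= i\hbar\, \hat{x}, \qquad
      \hat{d} = \tfrac{1}{2}\bigl(\hat{x}\hat{p} + \hat{p}\hat{x}\bigr).
    \end{align}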
Abstract:
Volumetric data at micrometer-level resolution can be acquired within a few minutes using synchrotron-radiation-based tomographic microscopy. The field of view along the rotation axis of the sample can easily be increased by stacking several tomograms, allowing the investigation of long and thin objects at high resolution. By contrast, extending the field of view in the perpendicular direction is non-trivial. This paper presents an acquisition protocol which increases the field of view of the tomographic dataset perpendicular to its rotation axis. The acquisition protocol can be tuned as a function of the reconstruction quality and the scanning time. Since the scanning time is proportional to the radiation dose imparted to the sample, this method can be used to increase the field of view of tomographic microscopy instruments while optimizing the radiation dose for radiation-sensitive samples and keeping the quality of the tomographic dataset at the required level. This approach, dubbed wide-field synchrotron radiation tomographic microscopy, can increase the lateral field of view up to five times. The method has been successfully applied to the three-dimensional imaging of entire rat lung acini with a diameter of 4.1 mm at a voxel size of 1.48 μm.
Abstract:
To support development tools like debuggers, runtime systems need to provide a meta-programming interface to alter their semantics and access internal data. Reflective capabilities are typically fixed by the Virtual Machine (VM). Unanticipated reflective features must either be simulated by complex program transformations, or they require the development of a specially tailored VM. We propose a novel approach to behavioral reflection that eliminates the barrier between applications and the VM by manipulating an explicit tower of first-class interpreters. Pinocchio is a proof-of-concept implementation of our approach which enables radical changes to the interpretation of programs by explicitly instantiating subclasses of the base interpreter. We illustrate the design of Pinocchio through non-trivial examples that extend runtime semantics to support debugging, parallel debugging, and back-in-time object-flow debugging. Although performance is not yet addressed, we also discuss numerous opportunities for optimization, which we believe will lead to a practical approach to behavioral reflection.
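To make the idea of a tower of first-class interpreters concrete, the sketch below (written in Python purely for brevity; none of these names come from Pinocchio's API) evaluates a tiny expression language with a base interpreter and obtains a simple "debugger" by instantiating a subclass that intercepts every evaluation step.

    class Interpreter:
        """Base interpreter for a tiny expression language."""
        def eval(self, expr, env):
            kind = expr[0]
            if kind == "const":
                return expr[1]
            if kind == "var":
                return env[expr[1]]
            if kind == "add":
                return self.eval(expr[1], env) + self.eval(expr[2], env)
            raise ValueError(f"unknown expression: {expr!r}")

    class TracingInterpreter(Interpreter):
        """A 'debugger' obtained by subclassing the base interpreter."""
        def eval(self, expr, env):
            print(f"step: {expr!r}")              # intercept every evaluation step
            return super().eval(expr, env)

    program = ("add", ("var", "x"), ("const", 2))
    print(Interpreter().eval(program, {"x": 40}))         # 42, no tracing
    print(TracingInterpreter().eval(program, {"x": 40}))  # 42, with a trace of each step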
Abstract:
Limitations associated with the visual information provided to surgeons during laparoscopic surgery increase the difficulty of procedures and thus reduce clinical indications and increase training time. This work presents a novel augmented reality visualization approach that aims to improve the visual data supplied for the targeting of non-visible anatomical structures in laparoscopic visceral surgery. The approach aims to facilitate the localization of hidden structures with minimal damage to surrounding structures and with minimal training requirements. The proposed augmented reality visualization approach overlays endoscopic images with virtual 3D models of underlying critical structures, in addition to targeting and depth information pertaining to the targeted structures. Image overlay was achieved through the implementation of camera calibration techniques and the integration of the optically tracked endoscope into an existing image guidance system for liver surgery. The approach was validated in accuracy, clinical integration, and targeting experiments. The accuracy of the overlay was found to have a mean value of 3.5 mm ± 1.9 mm, and 92.7% of targets within a liver phantom were successfully located laparoscopically by untrained subjects using the approach.
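The geometric core of such an overlay is the projection of points of a tracked 3D model into the calibrated endoscopic image. The sketch below shows a plain pinhole projection with illustrative intrinsics and pose; it is an assumption-laden simplification for orientation, not the validated pipeline described above.

    import numpy as np

    def project_point(point_world, K, R, t):
        """Project a 3D point (world/tracker coordinates) to pixel coordinates and depth."""
        p_cam = R @ point_world + t            # world -> camera frame (pose from endoscope tracking)
        uvw = K @ p_cam                        # pinhole projection with intrinsics K
        return uvw[:2] / uvw[2], p_cam[2]      # overlay position and distance along the optical axis

    K = np.array([[800.0,   0.0, 320.0],
                  [  0.0, 800.0, 240.0],
                  [  0.0,   0.0,   1.0]])      # illustrative intrinsics from camera calibration
    R, t = np.eye(3), np.array([0.0, 0.0, 0.0])  # illustrative pose

    pixel, depth = project_point(np.array([0.01, -0.02, 0.12]), K, R, t)
    print(pixel, depth)   # where to draw the hidden target and how far it lies from the camera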