62 results for Comparison of nucleotide sequences
Abstract:
"See the abstract at the beginning of the document in the attached file."
Abstract:
Ghosh's model is discussed in this paper under two alternative scenarios. In an open version we compare it with Leontief's model and prove that they reduce to each other under some specific productive conditions. We then move on to reconsider the alleged implausibility of Ghosh's model, reformulating it to incorporate a closure rule. The closure solves, to some extent, the implausibility problem so clearly pointed out by Oosterhaven, since value-added is then correctly computed and responsive to allocation changes resulting from supply shocks.
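For reference, the two textbook formulations being compared can be written as follows (standard notation, not necessarily the paper's):

```latex
% Leontief (demand-driven): gross output x generated by final demand f,
% with A the matrix of technical (input) coefficients
\[ x = (I - A)^{-1} f \]
% Ghosh (supply-driven): gross output x' generated by primary inputs v',
% with B the matrix of allocation (output) coefficients
\[ x' = v' (I - B)^{-1} \]
```

The equivalence result in the abstract concerns conditions under which these two descriptions determine the same gross outputs.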
Abstract:
The purpose of this paper is to provide a comparative analysis of pork value chains in Catalonia, Spain and Manitoba, Canada. Intensive hog production models were implemented in Catalonia in the 1960s as a result of agricultural crises and fostered by feedstuffs factories. The expansion of the hog sector in Manitoba is more recent (in the 1990s) and was brought about in large part by the opening of the Maple Leaf Meats processing plant in Brandon, Manitoba. This plant is capable of processing 90,000 hogs per week. Both hog production models ‐ the ‘older’ one in Catalonia (Spain) and the ‘newer’ one in Manitoba ‐ have been, until recently, examples of success. Inventories and production have been increasing substantially and both regions have proven to have great export potential. Recently, however, tensions have been developing with the hog production models of both regions, particularly as they relate to environmental concerns. The purpose of the paper is to compare the value chains with respect to their origins (e.g. supplying a growing demand for pork, ensuring farm profitability) and present states (e.g. environmental concerns, profitability). Keywords: pork value chain, hog farms, agri‐food studies. JEL: Q10, Q13, O57
Abstract:
This article examines the governance structures for managing the location and operation of Intensive Livestock Farming Operations (ILFOs). The article focuses on the hog sector and compares two very different jurisdictions: the Province of Manitoba, Canada and the Autonomous Community of Catalonia, Spain. Both are regions that have witnessed recent increases in hog production, including increasing spatial concentration of ILFOs and increasing size of those ILFOs. Policy has both fostered and sought to manage the increased production. Following a brief background description of restructuring, the changing legislative frameworks for Manitoba and Catalonia are described. Keywords: environmental regulations, hog farms, manure management, animal feeding operations. JEL: Q15, Q58, R52, O57
Abstract:
An observational study of SLE patients seen at University College London Hospital between 1976 and 2005 was carried out to review the differences between men and women with lupus in terms of clinical features, serology and outcomes. 439 women and 45 men were identified. The mean age at diagnosis was 29.3 years (12.6), with no significant differences between men and women. Female sex was significantly associated with the presence of oral ulcers and IgM ACA. There were no significant differences in the comparison of the other variables. Over this thirty-year follow-up period, relatively few differences have emerged when comparing the frequencies of clinical and serological features in men and women with lupus.
Abstract:
This letter presents a comparison between three Fourier-based motion compensation (MoCo) algorithms for airborne synthetic aperture radar (SAR) systems. These algorithms circumvent the limitations of conventional MoCo, namely the assumption of a reference height and the beam-center approximation. All these approaches rely on the inherent time–frequency relation in SAR systems but exploit it differently, with the consequent differences in accuracy and computational burden. After a brief overview of the three approaches, the performance of each algorithm is analyzed with respect to azimuthal topography accommodation, angle accommodation, and maximum frequency of track deviations with which the algorithm can cope. Also, an analysis on the computational complexity is presented. Quantitative results are shown using real data acquired by the Experimental SAR system of the German Aerospace Center (DLR).
Abstract:
In a seminal paper, Aitchison and Lauder (1985) introduced classical kernel density estimation techniques in the context of compositional data analysis. Indeed, they gave two options for the choice of the kernel to be used in the kernel estimator. One of these kernels is based on the use of the alr transformation on the simplex S^D jointly with the normal distribution on R^(D-1). However, these authors themselves recognized that this method has some deficiencies. A method for overcoming these difficulties based on recent developments for compositional data analysis and multivariate kernel estimation theory, combining the ilr transformation with the use of the normal density with a full bandwidth matrix, was recently proposed in Martín-Fernández, Chacón and Mateu-Figueras (2006). Here we present an extensive simulation study that compares both methods in practice, thus exploring the finite-sample behaviour of both estimators.
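A minimal sketch of the ilr-plus-full-bandwidth-matrix estimator described above (function names and the pivot basis choice are illustrative assumptions, not taken from the cited papers):

```python
import numpy as np

def ilr(X):
    """Isometric log-ratio transform of compositions (rows of X, parts > 0)
    using the standard pivot (Helmert-type) orthonormal basis."""
    L = np.log(X)
    D = X.shape[1]
    Z = np.empty((X.shape[0], D - 1))
    for j in range(1, D):
        # balance of the first j parts against part j+1
        Z[:, j - 1] = np.sqrt(j / (j + 1)) * (L[:, :j].mean(axis=1) - L[:, j])
    return Z

def kde_ilr(train, query, H):
    """Gaussian kernel density estimate in ilr coordinates with a full
    bandwidth matrix H; returns densities in the transformed space."""
    Zt, Zq = ilr(train), ilr(query)
    n, d = Zt.shape
    Hinv = np.linalg.inv(H)
    const = 1.0 / (n * np.sqrt(((2 * np.pi) ** d) * np.linalg.det(H)))
    diff = Zq[:, None, :] - Zt[None, :, :]           # (n_query, n_train, d)
    quad = np.einsum('qnd,de,qne->qn', diff, Hinv, diff)
    return const * np.exp(-0.5 * quad).sum(axis=1)
```

Values here are densities on the ilr-transformed space R^(D-1); expressing them as densities on the simplex itself would additionally require the Jacobian of the ilr map.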
Abstract:
This contribution compares existing and newly developed techniques for geometrically representing mean-variance-skewness portfolio frontiers, based on the rather widely adopted methodology of polynomial goal programming (PGP) on the one hand and the more recent approach based on the shortage function on the other hand. Moreover, we explain the working of these different methodologies in detail and provide graphical illustrations. Inspired by these illustrations, we prove a generalization of the well-known two-fund separation theorem from traditional mean-variance portfolio theory.
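A common PGP formulation for the mean-variance-skewness problem runs along these lines (a sketch in standard notation; the aspiration levels M*, V*, S* come from optimizing each moment separately, the weights α, β, γ encode investor preferences, and details vary across the PGP literature):

```latex
\begin{aligned}
\min_{w,\,d}\quad & d_1^{\alpha} + d_2^{\beta} + d_3^{\gamma} \\
\text{s.t.}\quad  & w^{\top}\mu + d_1 = M^{*}, \\
                  & w^{\top}\Sigma\, w - d_2 = V^{*}, \\
                  & s(w) + d_3 = S^{*}, \\
                  & w^{\top}\mathbf{1} = 1, \qquad d_1, d_2, d_3 \ge 0,
\end{aligned}
```

where w is the portfolio weight vector, s(w) its skewness, and the d_i measure shortfalls from the single-objective optima.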
Abstract:
Business process designers take into account the resources that the processes will need but, due to the variable cost of certain parameters (such as energy) or other circumstances, this scheduling must be done at business process enactment time. In this report we formalize the energy-aware resource cost, including time- and usage-dependent rates. We also present a constraint programming approach and an auction-based approach to solve this problem, including a comparison of the two approaches and of the proposed algorithms for solving them.
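A minimal sketch of a time- and usage-dependent resource cost of the kind formalized here (the function name, tariff structure and figures are illustrative assumptions, not the report's model):

```python
def energy_cost(start_hour, duration_hours, power_kw, rate_per_hour, usage_fee=0.0):
    """Cost of one activity: a time-dependent energy tariff (rate_per_hour is
    a 24-entry list indexed by hour of day, cost per kWh) plus a flat
    usage-dependent fee charged per hour of resource occupation."""
    total = 0.0
    for t in range(start_hour, start_hour + duration_hours):
        total += rate_per_hour[t % 24] * power_kw + usage_fee
    return total

# At enactment time, a scheduler can compare candidate start slots:
night_tariff = [0.05] * 8 + [0.20] * 16   # cheap energy from 00:00 to 08:00
best_start = min(range(24), key=lambda s: energy_cost(s, 3, 2.0, night_tariff))
```

The constraint-programming and auction approaches in the report would search over such start slots (and resource assignments) rather than enumerating them exhaustively.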
Abstract:
Background: The cooperative interaction between transcription factors has a decisive role in the control of the fate of the eukaryotic cell. Computational approaches for characterizing cooperative transcription factors in yeast, however, are based on different rationales and provide a low overlap between their results. Because the wealth of information contained in protein interaction networks and regulatory networks has proven highly effective in elucidating functional relationships between proteins, we compared different sets of cooperative transcription factor pairs (predicted by four different computational methods) within the frame of those networks. Results: Our results show that the overlap between the sets of cooperative transcription factors predicted by the different methods is low yet significant. Cooperative transcription factors predicted by all methods are closer and more clustered in the protein interaction network than expected by chance. On the other hand, members of a cooperative transcription factor pair neither seemed to regulate each other nor shared similar regulatory inputs, although they do regulate similar groups of target genes. Conclusion: Despite the different definitions of transcriptional cooperativity and the different computational approaches used to characterize cooperativity between transcription factors, the analysis of their roles in the framework of the protein interaction network and the regulatory network indicates a common denominator for the predictions under study. The knowledge of the shared topological properties of cooperative transcription factor pairs in both networks can be useful not only for designing better prediction methods but also for better understanding the complexities of transcriptional control in eukaryotes.
Abstract:
In the last few years, there has been a growing focus on faster computational methods to support clinicians in planning stenting procedures. This study investigates the possibility of introducing computational approximations in modelling stent deployment in aneurysmatic cerebral vessels to achieve simulations compatible with the constraints of real clinical workflows. The release of a self-expandable stent in a simplified aneurysmatic vessel was modelled in four different initial positions. Six progressively simplified modelling approaches (based on the Finite Element method and Fast Virtual Stenting – FVS) have been used. Comparing the accuracy of the results, the final configuration of the stent is more affected by neglecting the mechanical properties of materials (FVS) than by adopting 1D instead of 3D stent models. Nevertheless, the differences shown are acceptable compared to those achieved by considering different stent initial positions. Regarding computational costs, simulations involving 1D stent features are the only ones feasible in a clinical context.
Abstract:
Multiple-input multiple-output (MIMO) techniques have become an essential part of broadband wireless communications systems. For example, the recently developed IEEE 802.16e specifications for broadband wireless access include three MIMO profiles employing 2×2 space-time codes (STCs), and two of these MIMO schemes are mandatory on the downlink of Mobile WiMAX systems. One of these has full rate, and the other has full diversity, but neither of them has both of the desired features. The third profile, namely, Matrix C, which is not mandatory, is both a full-rate and a full-diversity code, but it has a high decoder complexity. Recently, attention has turned to the decoder complexity issue and, including this in the design criteria, several full-rate STCs have been proposed as alternatives to Matrix C. In this paper, we review these different alternatives and compare them to Matrix C in terms of performance and the corresponding receiver complexities.
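For context, the full-diversity mandatory profile (Matrix A in IEEE 802.16e) is the Alamouti scheme, whose 2×2 codeword is (rows indexing transmit antennas, columns indexing symbol periods):

```latex
\[ \mathbf{X} \;=\; \begin{pmatrix} s_1 & -s_2^{*} \\ s_2 & s_1^{*} \end{pmatrix} \]
```

It transmits two symbols over two channel uses (rate 1, hence not full rate for a 2×2 system) but achieves full transmit diversity with simple linear decoding; the full-rate mandatory profile (Matrix B) is plain spatial multiplexing, which lacks transmit diversity.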
Abstract:
Sequential randomized prediction of an arbitrary binary sequence is investigated. No assumption is made on the mechanism of generating the bit sequence. The goal of the predictor is to minimize its relative loss, i.e., to make (almost) as few mistakes as the best ``expert'' in a fixed, possibly infinite, set of experts. We point out a surprising connection between this prediction problem and empirical process theory. First, in the special case of static (memoryless) experts, we completely characterize the minimax relative loss in terms of the maximum of an associated Rademacher process. Then we show general upper and lower bounds on the minimax relative loss in terms of the geometry of the class of experts. As main examples, we determine the exact order of magnitude of the minimax relative loss for the class of autoregressive linear predictors and for the class of Markov experts.
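The minimax relative loss studied above can be written, in standard notation (assumed here, possibly differing from the paper's), as

```latex
\[ V_n(\mathcal{F}) \;=\; \inf_{p}\; \sup_{y^n \in \{0,1\}^n}
   \Bigl( \mathbb{E}\,L_p(y^n) \;-\; \min_{f \in \mathcal{F}} L_f(y^n) \Bigr), \]
```

where L_p(y^n) counts the predictor's mistakes on y_1, ..., y_n, L_f(y^n) those of expert f, and the expectation is over the predictor's internal randomization.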
Abstract:
This paper studies the relationship between the amount of public information that stock market prices incorporate and the equilibrium behavior of market participants. The analysis is framed in a static, NREE setup where traders exchange vectors of assets, accessing multidimensional information under two alternative market structures. In the first (the unrestricted system), both informed and uninformed speculators can condition their demands for each traded asset on all equilibrium prices; in the second (the restricted system), they are restricted to condition their demand on the price of the asset they want to trade. I show that informed traders' incentives to exploit multidimensional private information depend on the number of prices they can condition upon when submitting their demand schedules, and on the specific price formation process one considers. Building on this insight, I then give conditions under which the restricted system is more efficient than the unrestricted system.