51 results for EXACT RESULTS
Abstract:
In this paper we present a new, accurate form of the heat balance integral method, termed the Combined Integral Method (CIM). The application of this method to Stefan problems is discussed. For simple test cases the results are compared with exact and asymptotic limits. In particular, it is shown that the CIM is more accurate than the second-order, large-Stefan-number perturbation solution for a wide range of Stefan numbers. In the initial examples it is shown that the CIM reduces the standard problem, consisting of a PDE defined over a domain specified by an ODE, to the solution of one or two algebraic equations. The latter examples, where the boundary temperature varies with time, reduce to a set of three first-order ODEs.
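For reference, the exact solution for the simplest test case is the classical Neumann similarity solution of the one-phase Stefan problem, with front position s(t) = 2λ√(αt) and λ fixed by a single transcendental equation in the Stefan number. A minimal Python sketch of that benchmark (not of the CIM itself):

```python
# Minimal sketch: classical Neumann solution of the one-phase Stefan problem.
# The melt front is s(t) = 2*lam*sqrt(alpha*t), where lam solves
#     sqrt(pi) * lam * exp(lam**2) * erf(lam) = Ste,
# with Ste the Stefan number. This is the exact benchmark such methods are
# compared against; the CIM itself is not reproduced here.
import numpy as np
from scipy.special import erf
from scipy.optimize import brentq

def neumann_lambda(stefan):
    """Root lam of sqrt(pi)*lam*exp(lam^2)*erf(lam) = Ste."""
    f = lambda lam: np.sqrt(np.pi) * lam * np.exp(lam**2) * erf(lam) - stefan
    return brentq(f, 1e-9, 10.0)

for ste in (0.1, 1.0, 10.0):
    lam = neumann_lambda(ste)
    # Small-Ste asymptotics: lam ~ sqrt(Ste/2), a first-order sanity check.
    print(f"Ste={ste:5.1f}  lam={lam:.4f}  sqrt(Ste/2)={np.sqrt(ste/2):.4f}")
```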
Abstract:
Hypergraph width measures are a class of hypergraph invariants important in studying the complexity of constraint satisfaction problems (CSPs). We present a general exact exponential algorithm for a large variety of these measures. A connection between these measures and tree decompositions is established. This enables us to almost seamlessly adapt the combinatorial and algorithmic results known for tree decompositions of graphs to the case of hypergraphs and obtain fast exact algorithms. As a consequence, we provide algorithms which, given a hypergraph H on n vertices and m hyperedges, compute the generalized hypertree-width of H in time O*(2^n) and compute the fractional hypertree-width of H in time O(1.734601^n · m).
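Fractional hypertree-width is built on the fractional edge cover number ρ*(S) of vertex sets S, which is the optimum of a small linear program. A minimal sketch of that LP subroutine, with a made-up hypergraph encoding; it covers only the cost measure, not the exponential-time decomposition search:

```python
# Minimal sketch: the fractional edge cover number rho*(S), the cost measure
# inside fractional hypertree-width. Minimize sum_e x_e subject to
# sum_{e : v in e} x_e >= 1 for every vertex v in S, with x_e >= 0.
import numpy as np
from scipy.optimize import linprog

def fractional_edge_cover(vertices, hyperedges):
    """vertices: list of vertex labels; hyperedges: list of sets of labels."""
    m = len(hyperedges)
    # One covering constraint per vertex: -sum_{e covering v} x_e <= -1.
    A = np.array([[-1.0 if v in e else 0.0 for e in hyperedges] for v in vertices])
    b = -np.ones(len(vertices))
    res = linprog(c=np.ones(m), A_ub=A, b_ub=b, bounds=(0, None))
    return res.fun

# Example: the triangle {a,b},{b,c},{a,c} has rho* = 3/2 (weight 1/2 per edge).
print(fractional_edge_cover(['a', 'b', 'c'],
                            [{'a', 'b'}, {'b', 'c'}, {'a', 'c'}]))
```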
Abstract:
Systematic asymptotic methods are used to formulate a model for the extensional flow of a thin sheet of nematic liquid crystal. With no external body forces applied, the model is found to be equivalent to the so-called Trouton model for Newtonian sheets (and fibers), albeit with a modified "Trouton ratio". However, with a symmetry-breaking electric field gradient applied, behavior deviates from the Newtonian case, and the sheet can undergo finite-time breakup if a suitable destabilizing field is applied. Some simple exact solutions are presented to illustrate the results in certain idealized limits, as well as sample numerical results for the full model equations.
Abstract:
Given positive integers n and m, we consider dynamical systems in which n copies of a topological space are homeomorphic to m copies of that same space. The universal such system is shown to arise naturally from the study of a C*-algebra we denote by O_{m,n}, which in turn is obtained as a quotient of the well-known Leavitt C*-algebra L_{m,n}, a process meant to transform the generating set of partial isometries of L_{m,n} into a tame set. Describing O_{m,n} as the crossed product of the universal (m,n)-dynamical system by a partial action of the free group F_{m+n}, we show that O_{m,n} is not exact when n and m are both greater than or equal to 2, but the corresponding reduced crossed product, denoted O^r_{m,n}, is shown to be exact and non-nuclear. Still under the assumption that m, n ≥ 2, we prove that the partial action of F_{m+n} is topologically free and that O^r_{m,n} satisfies property (SP) (small projections). We also show that O^r_{m,n} admits no finite-dimensional representations. The techniques developed to treat this system include several new results pertaining to the theory of Fell bundles over discrete groups.
Abstract:
Background: The incidence of cardiovascular events in HIV patients has fallen. Methods: We identified 81 patients with a history of coronary events from 2 hospitals in Spain to evaluate the management of cardiovascular risk factors (CVRF) before and after the event. Results: The prevalence of coronary events was 2.15%. At the time of the coronary event, CVRF were highly prevalent. A decrease in total cholesterol (P=0.025) and LDLc (P=0.004) was observed. LDLc and HDLc were determined, and the percentage of patients with LDLc <100 mg/dL remained stable at the last visit. Conclusions: The prevalence of coronary disease in our cohort was low. Although CVRF were highly prevalent […].
Abstract:
Scarcities of environmental services are no longer merely a remote hypothesis. Consequently, analysis of their inequalities between nations becomes of paramount importance for the achievement of sustainability, in terms either of international policy or of universalist ethical principles of equity. This paper aims, on the one hand, to revise methodological aspects of inequality measurement for certain environmental data and, on the other, to extend the scarce empirical evidence on the international distribution of the Ecological Footprint (EF) by using a longer EF time series. Most of the techniques currently important in the literature are reviewed and then tested on EF data, with interesting results. We look in depth at Lorenz dominance analyses and consider the underlying properties of different inequality indices. The indices that best fit environmental inequality measurement are CV² and GE(2), because of their neutrality property; however, a trade-off may occur when subgroup decompositions are performed. A weighting-factor decomposition method is proposed in order to isolate weighting-factor changes in inequality growth rates. Finally, the only non-ambiguous way of decomposing inequality by source is the natural decomposition of CV², which additionally allows the interpretation of marginal term contributions. Empirically, this paper contributes to the environmental inequality measurement of the EF: this inequality has been quite stable, and its change over time is due to per capita vector changes rather than population changes. Almost the entirety of EF inequality is explainable by differences in means between the World Bank country groups. This finding suggests that international environmental agreements should be pursued on a regional basis so as to achieve greater consensus between the parties involved. Additionally, the source decomposition warns of the dangers of confining CO2 emissions reduction to crop-based energies, because of the implications for basic needs satisfaction.
Keywords: ecological footprint; ecological inequality measurement; inequality decomposition.
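The natural decomposition of CV² mentioned above rests on the identity var(X) = Σ_k cov(X_k, X) for X = Σ_k X_k, so each source contributes cov(X_k, X)/mean(X)². A minimal sketch with made-up footprint components (the two sources are hypothetical stand-ins, not the paper's data):

```python
# Minimal sketch: natural decomposition of the squared coefficient of
# variation, CV^2 = var(X)/mean(X)^2, by source. If X = sum_k X_k, then
# var(X) = sum_k cov(X_k, X), so source k contributes
#     s_k = cov(X_k, X) / mean(X)^2,
# and the shares s_k / CV^2 sum to one. Data below are made up.
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical per-capita footprint components for 50 countries.
cropland = rng.lognormal(0.0, 0.5, 50)
carbon = rng.lognormal(0.5, 0.8, 50)
total = cropland + carbon

cv2 = total.var(ddof=1) / total.mean() ** 2
for name, src in (("cropland", cropland), ("carbon", carbon)):
    share = np.cov(src, total)[0, 1] / total.mean() ** 2 / cv2
    print(f"{name}: contributes {share:.1%} of CV^2")
```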
Abstract:
The front speed problem for nonuniform reaction rate and diffusion coefficient is studied by using singular perturbation analysis, the geometric approach of Hamilton-Jacobi dynamics, and the local speed approach. Exact and perturbed expressions for the front speed are obtained in the limit of large times. For linear and fractal heterogeneities, the analytic results are compared with numerical results, showing good agreement. Finally, we derive a general expression for the speed of the front in the case of smooth, weak heterogeneities.
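For orientation, in the homogeneous Fisher-KPP case u_t = D u_xx + r u(1 − u) the asymptotic front speed is v = 2√(rD); the local speed approach evaluates the same expression with the spatially varying coefficients. A minimal sketch, with made-up heterogeneity profiles:

```python
# Minimal sketch of the "local speed" idea for Fisher-KPP fronts:
# in the homogeneous case u_t = D*u_xx + r*u*(1-u) the pulled front travels
# at v = 2*sqrt(r*D); for smooth, weak heterogeneities one evaluates the
# same formula with the local coefficients. The profiles r(x) and D(x)
# below are made-up illustrations.
import numpy as np

x = np.linspace(0.0, 10.0, 201)
r = 1.0 + 0.1 * np.sin(2 * np.pi * x / 10.0)   # weakly varying reaction rate
D = 1.0 + 0.05 * x / 10.0                      # weakly varying diffusivity

v_local = 2.0 * np.sqrt(r * D)
print(f"local speed ranges over [{v_local.min():.3f}, {v_local.max():.3f}]"
      f" vs homogeneous v = 2.000")
```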
Abstract:
Background: Cancer is a major medical problem in modern societies. However, the incidence of this disease in non-human primates is very low. To study whether genetic differences between human and chimpanzee could contribute to their distinct cancer susceptibility, we have examined in the chimpanzee genome the orthologous genes of a set of 333 human cancer genes. Results: This analysis revealed that all examined human cancer genes are present in the chimpanzee, contain intact open reading frames, and show a high degree of conservation between both species. However, detailed analysis of this set of genes has shown some differences in genes of special relevance for human cancer. Thus, the chimpanzee gene encoding p53 contains a Pro residue at codon 72, while this codon is polymorphic in humans and can code for Arg or Pro, generating isoforms with different abilities to induce apoptosis or interact with p73. Moreover, sequencing of the BRCA1 gene revealed an 8 kb deletion in the chimpanzee sequence that prematurely truncates the co-regulated NBR2 gene. Conclusion: These data suggest that small differences in cancer genes, such as those found in tumor suppressor genes, might contribute to the differences in cancer susceptibility between human and chimpanzee. Nevertheless, further analysis will be required to determine the exact contribution of the genetic changes identified in this study to the different cancer incidence in non-human primates.
Abstract:
Purpose: The objective of this study is to investigate the feasibility of detecting and quantifying 3D cerebrovascular wall motion from a single 3D rotational x-ray angiography (3DRA) acquisition within a clinically acceptable time, and of computing, from the estimated motion field, quantities needed for further biomechanical modeling of the cerebrovascular wall. Methods: The whole motion cycle of the cerebral vasculature is modeled using a 4D B-spline transformation, which is estimated within a 4D to 2D + t image registration framework. The registration is performed by optimizing a single similarity metric between the entire 2D + t measured projection sequence and the corresponding forward projections of the deformed volume at their exact time instants. The joint use of two acceleration strategies, together with their implementation on graphics processing units, is also proposed so as to bring computation times close to clinical requirements. To further characterize vessel wall properties, an approximation of the wall thickness changes is obtained through a strain calculation. Results: Evaluation on in silico and in vitro pulsating phantom aneurysms demonstrated accurate estimation of wall motion curves. In general, the error was below 10% of the maximum pulsation, even when a substantially inhomogeneous intensity pattern was present. Experiments on in vivo data provided realistic aneurysm and vessel wall motion estimates, whereas in regions where motion was neither visible nor anatomically possible, no motion was detected. The use of the acceleration strategies enabled completing the estimation process for one entire cycle in 5-10 min without degrading the overall performance. The strain map extracted from our motion estimation provided a realistic deformation measure of the vessel wall. Conclusions: The authors' technique has demonstrated that it can provide accurate and robust 4D estimates of cerebrovascular wall motion within a clinically acceptable time, although it has to be applied to a larger patient population prior to possible wide application in routine endovascular procedures. In particular, for the first time, this feasibility study has shown that in vivo cerebrovascular motion can be obtained intraprocedurally from a 3DRA acquisition. Results have also shown the potential of performing strain analysis using this imaging modality, thus making possible the future modeling of biomechanical properties of the vascular wall.
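As a rough illustration of the strain step, strain can be approximated from a sampled displacement field through its spatial gradient, E = (FᵀF − I)/2 with F = I + ∇u. A minimal sketch on a synthetic field (this stands in for the registration output and is not the authors' B-spline pipeline):

```python
# Minimal sketch: approximating strain from a displacement field u(x), as in
# the Green-Lagrange strain E = (F^T F - I)/2 with F = I + grad(u).
# A synthetic 3D displacement field stands in for the registration output.
import numpy as np

n = 16
u = np.zeros((3, n, n, n))
u[0] = 0.01 * np.linspace(0, 1, n)[:, None, None]  # small stretch along x

# grad_u[i, j] = d u_i / d x_j, estimated with finite differences.
grad_u = np.stack([np.stack(np.gradient(u[i], 1.0 / (n - 1)), axis=0)
                   for i in range(3)], axis=0)  # shape (3, 3, n, n, n)

I = np.eye(3)[:, :, None, None, None]
F = I + grad_u
# Green-Lagrange strain E = (F^T F - I) / 2, computed pointwise.
E = 0.5 * (np.einsum('ki...,kj...->ij...', F, F) - I)
print("mean E_xx:", E[0, 0].mean())  # ~0.01 for this small uniform stretch
```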
Abstract:
Exact closed-form expressions are obtained for the outage probability of maximal ratio combining in η-μ fading channels with antenna correlation and co-channel interference. The scenario considered in this work assumes the joint presence of background white Gaussian noise and independent Rayleigh-faded interferers with arbitrary powers. Outage probability results are obtained through an appropriate generalization of the moment-generating function of the η-μ fading distribution, for which new closed-form expressions are provided.
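Such closed forms can be sanity-checked by Monte Carlo simulation. A minimal sketch for the far simpler special case of uncorrelated Rayleigh branches, with interference treated as additional noise after combining; all parameters are illustrative, and this is not the paper's η-μ derivation:

```python
# Minimal sketch: Monte Carlo outage probability of L-branch MRC with
# background Gaussian noise and independent Rayleigh-faded interferers.
# This is the simpler uncorrelated Rayleigh special case, not the correlated
# eta-mu closed form of the paper; all numbers are illustrative.
import numpy as np

rng = np.random.default_rng(1)
L, trials = 2, 200_000
snr, gamma_th = 10.0, 1.0          # mean branch SNR and SINR threshold
int_powers = np.array([0.5, 0.2])  # mean interferer powers (arbitrary)

h = (rng.normal(size=(trials, L)) + 1j * rng.normal(size=(trials, L))) / np.sqrt(2)
desired = snr * np.sum(np.abs(h) ** 2, axis=1)   # post-MRC signal power
interf = rng.exponential(int_powers, size=(trials, 2)).sum(axis=1)
sinr = desired / (1.0 + interf)                  # noise power normalized to 1
print("outage ~", np.mean(sinr < gamma_th))
```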
Abstract:
We present a new method for constructing exact distribution-free tests (and confidence intervals) for variables that can generate more than two possible outcomes. This method separates the search for an exact test from the goal of creating a non-randomized test. Randomization is used to extend any exact test relating to means of variables with finitely many outcomes to variables with outcomes belonging to a given bounded set. Tests in terms of variance and covariance are reduced to tests relating to means. Randomness is then eliminated in a separate step. This method is used to create confidence intervals for the difference between two means (or variances) and tests of stochastic inequality and correlation.
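The randomization step can be illustrated with the classical reduction for a bounded outcome X ∈ [0, 1]: drawing B ~ Bernoulli(X) preserves the mean and maps the problem to binary outcomes, where exact tests exist. A minimal sketch in which the exact binomial test stands in for "any exact test" (the paper's de-randomization step is not reproduced):

```python
# Minimal sketch of the randomization reduction: for X in [0, 1], draw
# B ~ Bernoulli(X). Then E[B] = E[X], so an exact test for a Bernoulli mean
# (here, the exact binomial test) applies to the bounded variable.
import numpy as np
from scipy.stats import binomtest

rng = np.random.default_rng(2)
x = rng.beta(2.0, 3.0, size=100)   # bounded outcomes in [0, 1], mean 0.4
b = rng.random(100) < x            # Bernoulli(x_i) draws, E[b_i] = E[x_i]

# Exact binomial test of H0: E[X] = 0.5 against the two-sided alternative.
print(binomtest(int(b.sum()), n=100, p=0.5).pvalue)
```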
Abstract:
How much would output increase if underdeveloped economies were to increase their levels of schooling? We contribute to the development accounting literature by describing a non-parametric upper bound on the increase in output that can be generated by more schooling. The advantage of our approach is that the upper bound is valid for any number of schooling levels with arbitrary patterns of substitution/complementarity. Another advantage is that the upper bound is robust to certain forms of endogenous technology response to changes in schooling. We also quantify the upper bound for all economies with the necessary data, compare our results with the standard development accounting approach, and provide an update on the results using the standard approach for a large sample of countries.
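For contrast, the standard development accounting approach referred to above maps schooling into human capital through a Mincerian return, h = exp(φ(s)), so output per worker scales by exp(φ(s₁) − φ(s₀)) holding everything else fixed. A minimal sketch with an illustrative 10% return (not the paper's estimates):

```python
# Minimal sketch of the standard development accounting step for schooling:
# human capital h = exp(phi(s)) with a Mincerian return phi(s) = 0.10 * s,
# so raising schooling from s0 to s1 scales output per worker (holding
# technology and capital intensity fixed) by exp(0.10 * (s1 - s0)).
# The 10% return and the years chosen are illustrative, not the paper's.
import numpy as np

def output_gain(s0, s1, mincer_return=0.10):
    return np.exp(mincer_return * (s1 - s0))

# Example: going from 4 to 8 years of schooling.
print(f"output rises by a factor of {output_gain(4, 8):.2f}")  # ~1.49
```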
Abstract:
Our work attempts to investigate the influence of credit tightness or expansion on activity and relative prices in a multimarket set-up. We report on some double-auction, two-market experiments where subjects had to satisfy an inequality involving the use of credit. The experiments display two regimes, characterized by high and low credit availability. The critical value of credit at the common boundary of the two regimes has a compelling interpretation as the maximal credit use at the Arrow-Debreu equilibrium of the abstract economy naturally associated with our experimental environment. Our main results are that changes in the availability of credit (a) have minor and unsystematic effects on quantities and relative prices in the high-credit regime, and (b) have substantial effects, both on quantities and relative prices, in the low-credit regime.
Abstract:
I consider a general specification of criminals' objective function and argue that, when general non-expected utility theory is substituted for the traditional expected utility theory, the high-fine-low-probability result (Becker, 1968) only holds under specific and strong restrictions.
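The fragility of the result can be seen by comparing a fine-probability pair (p, f) with the "Beckerian" pair (p/2, 2f) under expected utility and under a rank-dependent specification that underweights the detection probability. A minimal sketch with illustrative utility and weighting functions (not the paper's specification):

```python
# Minimal sketch: under expected utility with concave u, doubling the fine
# and halving the detection probability lowers the utility of offending
# (more deterrence). Under a rank-dependent specification that underweights
# p (here w(p) = p**2, an illustrative choice), the ranking reverses,
# illustrating why Becker's high-fine-low-probability result needs extra
# restrictions outside expected utility.
import numpy as np

W = 100.0                      # criminal's wealth
u = np.sqrt                    # concave (risk-averse) utility

def eu(p, f):                  # expected utility of offending
    return p * u(W - f) + (1 - p) * u(W)

def rdu(p, f, w=lambda q: q ** 2):   # rank-dependent utility of offending
    return w(p) * u(W - f) + (1 - w(p)) * u(W)

for label, v in (("EU ", eu), ("RDU", rdu)):
    print(f"{label}: (p=.50,f=50) -> {v(0.50, 50.0):.3f}   "
          f"(p=.25,f=100) -> {v(0.25, 100.0):.3f}")
# EU : offending is less attractive under (p/2, 2f) -> Becker's result holds.
# RDU: offending is MORE attractive under (p/2, 2f) -> the result fails.
```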