947 results for Asymptotic Formulas
Abstract:
In this paper, we study the average inter-crossing number between two random walks and between two random polygons in three-dimensional space. The random walks and polygons considered here are the so-called equilateral random walks and polygons, in which each segment of the walk or polygon is of unit length. We show that the mean average inter-crossing number (ICN) between two equilateral random walks of the same length n is approximately linear in n, and we determine the prefactor of the linear term, which is a = (3 ln 2)/8 ≈ 0.2599. In the case of two random polygons of length n, the mean average ICN is also linear in n, but the prefactor of the linear term differs from that of the random walks. These approximations apply when the starting points of the random walks and polygons are a distance p apart and p is small compared to n. We propose a fitting model that captures the theoretical asymptotic behaviour of the mean average ICN for large values of p. Our simulation results show that the model in fact works very well over the entire range of p. We also study the mean average ICN between two equilateral random walks and polygons of different lengths. An interesting result is that even if one random walk (polygon) has a fixed length, the mean average ICN between the two random walks (polygons) still approaches infinity as the length of the other random walk (polygon) approaches infinity. The data provided by our simulations match our theoretical predictions very well.
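As a quick check of the quoted constant, the short Python sketch below (function name ours, for illustration only) evaluates a = (3 ln 2)/8 and the corresponding linear approximation of the mean average ICN for two walks of equal length n.

```python
import math

# Prefactor of the linear term reported for two equilateral random walks
# of equal length n: mean average ICN ~ a * n, with a = (3 ln 2) / 8.
a = 3 * math.log(2) / 8
print(f"a = {a:.4f}")  # ~0.2599, matching the value quoted in the abstract

def mean_icn_walks(n: int) -> float:
    """Leading-order (linear) approximation of the mean average
    inter-crossing number between two equilateral random walks of length n,
    valid when the separation of the starting points is small compared to n."""
    return a * n

print(mean_icn_walks(100))  # ~26.0
```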
Abstract:
In this paper, we address the problem of mitigating seismically induced structural vibrations through the design of a semiactive controller based on mixed H2/H∞ control theory. The vibrations caused by the seismic motion are mitigated by a semiactive damper installed at the bottom of the structure; by semiactive damper we mean a device that can absorb, but cannot inject, energy into the system. Sufficient conditions for the design of the desired control are given in terms of linear matrix inequalities (LMIs). A controller that guarantees asymptotic stability and a mixed H2/H∞ performance is then developed. An algorithm is proposed to handle the semiactive nature of the actuator. The performance of the controller is experimentally evaluated in a real-time hybrid testing facility that consists of a physical specimen (a small-scale magnetorheological damper) and a numerical model (a large-scale three-story building).
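The abstract states that the damper can absorb but not inject energy and that an algorithm handles this semiactive constraint. The sketch below is not the paper's algorithm; it is a minimal, commonly used clipping rule (the function name, sign convention and saturation limit f_max are assumptions for illustration) that applies the nominal H2/H∞ force only when it is dissipative with respect to the measured relative velocity.

```python
def clip_semiactive_force(f_desired: float, rel_velocity: float,
                          f_max: float) -> float:
    """Return the force a purely dissipative damper can actually deliver.

    Sign convention (an assumption of this sketch): f_desired is the force the
    nominal controller asks the damper to exert on the structure, and
    rel_velocity is the relative velocity across the device. A device that can
    absorb but not inject energy can only exert forces that oppose the relative
    velocity, so any non-dissipative request is clipped to zero and the
    magnitude is saturated at the device capacity f_max.
    """
    if f_desired * rel_velocity >= 0.0:
        # Requested force would feed energy into the structure: not realisable.
        return 0.0
    # Dissipative request: saturate at the damper's maximum force.
    return max(-f_max, min(f_max, f_desired))


# Example: the nominal controller requests -1200 N while the damper end moves
# at +0.05 m/s; the request is dissipative and (after saturation) applied.
print(clip_semiactive_force(-1200.0, 0.05, f_max=1000.0))  # -> -1000.0
print(clip_semiactive_force(+800.0, 0.05, f_max=1000.0))   # -> 0.0
```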
Abstract:
Natural populations are of finite size and organisms carry multilocus genotypes. There are, nevertheless, few results on multilocus models in which both random genetic drift and natural selection affect the evolutionary dynamics. In this paper we describe a formalism to calculate systematic perturbation expansions of moments of allelic states around neutrality in populations of constant size. This allows us to evaluate multilocus fixation probabilities (long-term limits of the moments) under arbitrary strength of selection and gene action. We show that such fixation probabilities can be expressed in terms of selection coefficients weighted by mean first passage times of ancestral gene lineages within a single ancestor. These passage times extend the coalescence times that weight selection coefficients in one-locus perturbation formulas for fixation probabilities. We then apply these results to investigate the Hill-Robertson effect and the coevolution of helping and punishment. Finally, we discuss limitations and strengths of the perturbation approach. In particular, it provides accurate approximations for fixation probabilities only in weak-selection regimes (Ns ≤ 1), but it generally gives good predictions for the direction of selection under frequency-dependent selection.
Abstract:
In this paper a one-phase supercooled Stefan problem, with a nonlinear relation between the phase-change temperature and the front velocity, is analysed. The model with the standard linear approximation, valid for small supercooling, is first examined asymptotically. The nonlinear case is more difficult to analyse, and only two simple asymptotic results are found. We then apply an accurate heat balance integral method to make further progress. Finally, we compare the results with numerical solutions. The results show that for large supercooling the linear model may be highly inaccurate and even qualitatively incorrect. Similarly, as the Stefan number β → 1⁺, the classical Neumann solution, which exists down to β = 1, is far from the linear and nonlinear supercooled solutions and can significantly overpredict the solidification rate.
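For orientation, the following is a minimal non-dimensional sketch of a one-phase supercooled Stefan problem with a velocity-dependent phase-change temperature; the scaling, sign conventions and the specific nonlinear relation f used in the paper may differ.

```latex
% Minimal sketch (assumed non-dimensionalisation and sign conventions):
% heat diffuses in the supercooled liquid ahead of the moving front x = s(t).
\begin{align*}
  \frac{\partial T}{\partial t} &= \frac{\partial^2 T}{\partial x^2},
      \qquad x > s(t), \\
  T &\to -1 \quad \text{as } x \to \infty
      \qquad \text{(far-field supercooling)}, \\
  \beta\,\dot{s}(t) &= -\left.\frac{\partial T}{\partial x}\right|_{x=s(t)}
      \qquad \text{(Stefan condition, $\beta$ the Stefan number)}, \\
  T\bigl(s(t),t\bigr) &= f\bigl(\dot{s}(t)\bigr)
      \qquad \text{(phase-change temperature depends on the front speed)}.
\end{align*}
% The standard linear approximation takes f(\dot{s}) = -\epsilon\,\dot{s};
% the paper considers a nonlinear relation between the front temperature and \dot{s}.
```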
Abstract:
High-resolution tomographic imaging of the shallow subsurface is becoming increasingly important for a wide range of environmental, hydrological and engineering applications. Because of their superior resolution power, their sensitivity to pertinent petrophysical parameters, and their far-reaching complementarity, both seismic and georadar crosshole imaging are of particular importance. To date, corresponding approaches have largely relied on asymptotic, ray-based techniques, which account for only a very small part of the observed wavefields, inherently suffer from limited resolution and, in complex environments, may prove inadequate. These problems can potentially be alleviated through waveform inversion. We have developed an acoustic waveform inversion approach for crosshole seismic data whose kernel is based on a finite-difference time-domain (FDTD) solution of the 2-D acoustic wave equations. This algorithm is tested on, and applied to, synthetic data from seismic velocity models of increasing complexity and realism, and the results are compared with those obtained using state-of-the-art ray-based traveltime tomography. Regardless of the heterogeneity of the underlying models, the waveform inversion approach has the potential to reliably resolve both the geometry and the acoustic properties of features smaller than half a dominant wavelength. Our results do, however, also indicate that, within their inherent resolution limits, ray-based approaches provide an effective and efficient means of obtaining satisfactory tomographic reconstructions of the seismic velocity structure in the presence of mild to moderate heterogeneity and in the absence of strong scattering. Conversely, the extra effort of waveform inversion provides the greatest benefits in the most heterogeneous, and arguably most realistic, environments, where multiple scattering effects tend to be prevalent and ray-based methods lose most of their effectiveness.
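The inversion kernel is described as an FDTD solution of the 2-D acoustic wave equation. As a minimal illustration of such a forward solver (constant density, pressure formulation, rigid boundaries, no absorbing layer; all names are ours and this is not the authors' code), consider the following Python sketch.

```python
import numpy as np

def acoustic_fdtd_2d(vel, dx, dt, nt, src, src_pos):
    """Minimal 2-D acoustic FDTD forward-modelling sketch.

    vel     : 2-D array of acoustic velocities [m/s]
    dx      : grid spacing [m] (same in both directions)
    dt      : time step [s] (must satisfy the CFL condition)
    nt      : number of time steps
    src     : source wavelet, length nt
    src_pos : (iz, ix) grid indices of the source

    Returns the final pressure field. Second-order in time and space;
    boundaries are simply left rigid (no absorbing layer).
    """
    p_old = np.zeros_like(vel)
    p = np.zeros_like(vel)
    c2dt2 = (vel * dt) ** 2

    for it in range(nt):
        # 5-point Laplacian on the interior of the grid
        lap = np.zeros_like(p)
        lap[1:-1, 1:-1] = (p[2:, 1:-1] + p[:-2, 1:-1] +
                           p[1:-1, 2:] + p[1:-1, :-2] -
                           4.0 * p[1:-1, 1:-1]) / dx ** 2
        # Leapfrog update of the pressure field
        p_new = 2.0 * p - p_old + c2dt2 * lap
        # Inject the source wavelet at a single grid point
        p_new[src_pos] += src[it] * dt ** 2
        p_old, p = p, p_new
    return p

# Tiny usage example: homogeneous 2000 m/s model, Ricker-like source.
nz = nx = 101
vel = np.full((nz, nx), 2000.0)
dx, dt, nt = 5.0, 0.001, 300          # CFL number: 2000*0.001/5 = 0.4
t = np.arange(nt) * dt - 0.05
f0 = 25.0
ricker = (1 - 2 * (np.pi * f0 * t) ** 2) * np.exp(-(np.pi * f0 * t) ** 2)
p_final = acoustic_fdtd_2d(vel, dx, dt, nt, ricker, (nz // 2, nx // 2))
print(p_final.shape)
```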
Abstract:
Discrete data arise in various research fields, typically when the observations are count data. I propose a robust and efficient parametric procedure for the estimation of discrete distributions. The estimation is done in two phases. First, a very robust, but possibly inefficient, estimate of the model parameters is computed and used to identify outliers. Then the outliers are either removed from the sample or given low weights, and a weighted maximum likelihood estimate (WML) is computed. The weights are determined via an adaptive process such that, if the data follow the model, asymptotically no observation is downweighted. I prove that the final estimator inherits the breakdown point of the initial one, and that its influence function at the model is the same as that of the maximum likelihood estimator, which strongly suggests that it is asymptotically fully efficient. The initial estimator is a minimum disparity estimator (MDE). MDEs can be shown to have full asymptotic efficiency, and some MDEs have very high breakdown points and very low bias under contamination. Several initial estimators are considered, and the performance of the WMLs based on each of them is studied. In a great variety of situations the WML substantially improves on the initial estimator, both in terms of finite-sample mean square error and in terms of bias under contamination. Moreover, the performance of the WML is rather stable under a change of the MDE, even if the MDEs have very different behaviours. Two examples of application of the WML to real data are considered. In both of them, the need for a robust estimator is clear: the maximum likelihood estimator is badly corrupted by the presence of a few outliers. The procedure is particularly natural in the discrete-distribution setting, but could be extended to the continuous case, for which a possible procedure is sketched.
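As a concrete, simplified illustration of the two-phase idea (not the minimum-disparity-based procedure of the paper), the Python sketch below fits a Poisson model: a median-based initial estimate flags observations that are very unlikely under the fitted model, and a weighted (here hard-rejection, 0/1-weight) maximum likelihood estimate is then computed. The median-based initial fit and the probability cutoff are assumptions made for this sketch.

```python
import numpy as np
from scipy.stats import poisson

def weighted_ml_poisson(x, cutoff=1e-3):
    """Two-phase robust fit of a Poisson mean (simplified illustration).

    Phase 1: a crude but robust initial estimate (the sample median) is used
             to flag observations that are very unlikely under the fitted
             model (probability below `cutoff`).
    Phase 2: a weighted ML estimate is computed; here the weights are 0/1,
             i.e. flagged observations are simply removed. For the Poisson
             model the (weighted) MLE of the mean is the (weighted) average.
    """
    x = np.asarray(x, dtype=float)
    lam0 = max(np.median(x), 1e-8)          # robust initial estimate
    weights = (poisson.pmf(np.round(x), lam0) >= cutoff).astype(float)
    lam_wml = np.sum(weights * x) / np.sum(weights)
    return lam_wml, weights

# Example: Poisson(2) data contaminated by a few gross outliers.
rng = np.random.default_rng(0)
data = np.concatenate([rng.poisson(2.0, size=97), [50, 60, 70]])
lam_hat, w = weighted_ml_poisson(data)
print(round(data.mean(), 2), round(lam_hat, 2))   # plain MLE vs robust WML
```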
Abstract:
The effect of introducing an infective-infectious period on the geographic spread of epidemics is considered in two different models. The classical evolution equations arising in the literature are generalized, and the existence of epidemic wave fronts is revisited. The asymptotic speed is obtained and improves on previous results for the Black Death plague.
Abstract:
The asymptotic speed problem for front solutions to hyperbolic reaction-diffusion (HRD) equations is studied in detail. We perform linear and variational analyses to obtain bounds for the speed. In contrast to previous work, here we derive upper bounds in addition to lower ones, in such a way that improved bounds are obtained. For some functions it is possible to determine the speed without any uncertainty. This is also achieved for some systems of HRD equations (i.e., time-delayed Lotka-Volterra equations) that take into account the interaction among different species. An analytical study is performed for several systems of biological interest, and we find good agreement with the results of numerical simulations as well as with available observations for a recently discussed system.
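For context, one standard single-species HRD model and its marginal-stability (linear) speed are sketched below; this particular form is an assumption for illustration, and the multi-species systems treated in the paper (e.g., time-delayed Lotka-Volterra) generalise it.

```latex
% Assumed single-species HRD model with delay time \tau, diffusivity D and
% reaction term f(u); the paper studies systems generalising this form.
\begin{equation*}
  \tau\,\frac{\partial^2 u}{\partial t^2} + \frac{\partial u}{\partial t}
    = D\,\frac{\partial^2 u}{\partial x^2} + f(u)
      + \tau\,\frac{\partial f(u)}{\partial t}.
\end{equation*}
% Linearising about u = 0 and requiring a real decay rate of the front tail
% yields the marginal-stability (linear, lower-bound) speed
\begin{equation*}
  v_{\mathrm{L}} = \frac{2\sqrt{D\,f'(0)}}{1 + \tau f'(0)},
\end{equation*}
% which recovers the classical Fisher-KPP value 2\sqrt{D f'(0)} as \tau \to 0.
% Variational analysis supplies the complementary upper bounds mentioned above.
```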
Abstract:
This paper investigates a simple procedure for robustly estimating the mean of an asymmetric distribution. The procedure removes the observations that are larger or smaller than certain limits and takes the arithmetic mean of the remaining observations, the limits being determined with the help of a parametric model, e.g., the Gamma, the Weibull or the Lognormal distribution. The breakdown point, the influence function, the (asymptotic) variance, and the contamination bias of this estimator are explored and compared numerically with those of competing estimators.
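A simplified Python illustration of this procedure is sketched below; since the abstract does not specify how the limits are derived from the fitted model, the quantile cutoffs (1st and 99th percentiles of a fitted Gamma distribution) are assumptions made here.

```python
import numpy as np
from scipy.stats import gamma

def trimmed_mean_gamma(x, lower_q=0.01, upper_q=0.99):
    """Robust mean of a (right-skewed) sample via model-based trimming.

    A Gamma model is fitted to the data, observations outside the fitted
    [lower_q, upper_q] quantile range are removed, and the arithmetic mean
    of the remaining observations is returned. The quantile cutoffs are
    illustrative choices, not the tuning used in the paper.
    """
    x = np.asarray(x, dtype=float)
    shape, loc, scale = gamma.fit(x, floc=0.0)       # ML fit with fixed location
    lo = gamma.ppf(lower_q, shape, loc=loc, scale=scale)
    hi = gamma.ppf(upper_q, shape, loc=loc, scale=scale)
    kept = x[(x >= lo) & (x <= hi)]
    return kept.mean()

# Example: Gamma(2, scale=3) data (true mean 6) with a few gross outliers.
rng = np.random.default_rng(1)
data = np.concatenate([rng.gamma(2.0, 3.0, size=200), [150.0, 200.0]])
print(round(data.mean(), 2), round(trimmed_mean_gamma(data), 2))
```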
Abstract:
In the context of fading channels it is well established that, with a constrained transmit power, the bit rates achievable by signals that are not peaky vanish as the bandwidth grows without bound. Stepping back from the limit, we characterize the highest bit rate achievable by such non-peaky signals and the approximate bandwidth where that apex occurs. As it turns out, the gap between the highest rate achievable without peakedness and the infinite-bandwidth capacity (with unconstrained peakedness) is small for virtually all settings of interest to wireless communications. Thus, although strictly achieving capacity in wideband fading channels does require signal peakedness, bit rates not far from capacity can be achieved with conventional signaling formats that do not exhibit the serious practical drawbacks associated with peakedness. In addition, we show that the asymptotic decay of bit rate in the absence of peakedness usually takes hold at bandwidths so large that wideband fading models are called into question. Rather, ultrawideband models ought to be used.
Abstract:
In this chapter, after pointing out the different logics that lie behind the familiar ideas of democracy and federalism, I deal with the case of plurinational federal democracies. Having put forward a double criterion of an empirical nature with which to identify the existence of minority nations within plurinational democracies (section 2), I suggest three theoretical criteria for the political accommodation of these democracies. In the following section, I show the agonistic nature of the normative discussion of the political accommodation of this kind of democracy, which brings monist and pluralist versions of the demos of the polity into conflict (section 3.1), as well as a number of conclusions resulting from a comparative study of 19 federal and regional democracies along four analytical axes: the uninational/plurinational axis, the unitarianism-federalism axis, the centralisation-decentralisation axis, and the symmetry-asymmetry axis (section 3.2). This analysis reveals shortcomings in the constitutional recognition of national pluralism in federal and regional cases with a large number of federated units/regions with political autonomy, a lower degree of constitutional federalism, and a greater asymmetry in the federated entities or regions of plurinational democracies. It also reveals difficulties in establishing clear formulas in these democracies for encouraging a "federalism of trust" based on the participation and protection of national minorities in the shared government of plurinational federations/regional states. In fact, there is a federal deficit in this kind of polity, according both to normative liberal-democratic patterns and to what the comparative analysis shows. Finally, this chapter advocates the need for greater normative and institutional refinement in plurinational federal democracies. In order to achieve this, it is necessary to introduce a deeper form of "ethical" pluralism, which displays normative agonistic trends, as well as a more "confederal/asymmetrical" perspective, congruent with the national pluralism of this kind of polity.
Abstract:
Using data from the Spanish household budget survey, we investigate life-cycle effects on several product expenditures. A latent-variable model approach is adopted to evaluate the impact of income on expenditures, controlling for the number of members in the family. Two latent factors underlying repeated measures of monetary and non-monetary income are used as explanatory variables in the expenditure regression equations, thus avoiding possible bias associated with measurement error in income. The proposed methodology also handles the case in which product expenditures exhibit a pattern of infrequent purchases. Multiple-group analysis is used to assess the variation of key parameters of the model across various household life-cycle typologies. The analysis discloses significant life-cycle effects on the mean levels of expenditures; it also detects significant life-cycle effects on the way expenditures are affected by income and family size. Asymptotic robust methods are used to account for possible non-normality of the data.
Abstract:
Computed tomography (CT) is used increasingly to measure liver volume in patients undergoing evaluation for transplantation or resection. This study is designed to determine a formula predicting total liver volume (TLV) based on body surface area (BSA) or body weight in Western adults. TLV was measured in 292 patients from four Western centers. Liver volumes were calculated from helical computed tomographic scans obtained for conditions unrelated to the hepatobiliary system. BSA was calculated based on height and weight. Each center used a different established method of three-dimensional volume reconstruction. Using regression analysis, measurements were compared, and formulas correlating BSA or body weight to TLV were established. A linear regression formula to estimate TLV based on BSA was obtained: TLV = -794.41 + 1,267.28 × BSA (square meters; r² = 0.46; P < .0001). A formula based on patient weight also was derived: TLV = 191.80 + 18.51 × weight (kilograms; r² = 0.49; P < .0001). The newly derived TLV formula based on BSA was compared with previously reported formulas. The application of a formula obtained from healthy Japanese individuals underestimated TLV. Two formulas derived from autopsy data for Western populations were similar to the newly derived BSA formula, with a slight overestimation of TLV. In conclusion, hepatic three-dimensional volume reconstruction based on helical CT predicts TLV based on BSA or body weight. The new formulas derived from this correlation should contribute to the estimation of TLV before liver transplantation or major hepatic resection.
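The two reported regression formulas translate directly into code; the small helper below (function names are ours) evaluates them for a given body surface area or body weight, with volumes in the units of the original regressions (typically cm³/ml).

```python
def tlv_from_bsa(bsa_m2: float) -> float:
    """Total liver volume from body surface area (m^2), using the reported
    regression TLV = -794.41 + 1267.28 * BSA (volume units as in the study)."""
    return -794.41 + 1267.28 * bsa_m2

def tlv_from_weight(weight_kg: float) -> float:
    """Total liver volume from body weight (kg), using the reported
    regression TLV = 191.80 + 18.51 * weight (volume units as in the study)."""
    return 191.80 + 18.51 * weight_kg

# Example: a patient with BSA 1.8 m^2 weighing 75 kg.
print(round(tlv_from_bsa(1.8)))     # ~1487
print(round(tlv_from_weight(75)))   # ~1580
```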
Abstract:
Introduction: Imatinib, a first-line drug for chronic myeloid leukaemia (CML), has been increasingly proposed for therapeutic drug monitoring (TDM), as trough concentrations ≥ 1000 ng/ml (Cmin) have been associated with improved molecular and complete cytogenetic response (CCyR). The pharmacological monitoring project of EUTOS (European Treatment and Outcome Study) was launched to validate retrospectively the correlation between Cmin and response in a large population of patients followed by central TDM in Bordeaux.
Methods: 1898 CML patients with a first TDM 0-9 years after imatinib initiation, providing cytogenetic data along with demographic and comedication (37%) information, were included. Individual Cmin, estimated by non-linear regression (NONMEM), was adjusted to the initial standard dose (400 mg/day) and stratified at 1000 ng/ml. Kaplan-Meier estimates of overall cumulative CCyR rates (stratified by sex, age, comedication and Cmin) were compared using an asymptotic logrank k-sample test for interval-censored data. Differences in Cmin were assessed by the Wilcoxon test.
Results: There were no significant differences in overall cumulative CCyR rates between Cmin strata, sex and comedication with P-glycoprotein inhibitors/inducers or CYP3A4 inhibitors (p > 0.05). Lower rates were observed in 113 young patients <30 years (p = 0.037; 1-year rates: 43% vs 60% in older patients), as well as in 29 patients with CYP3A4 inducers (p = 0.001; 1-year rates: 40% vs 66% without). Higher rates were observed in 108 patients on organic-cation-transporter-1 (hOCT-1) inhibitors (p = 0.034; 1-year rates: 83% vs 56% without). Considering 1-year CCyR rates, a trend towards better response for Cmin above 1000 ng/ml was observed: 64% (95% CI: 60-69%) vs 59% (95% CI: 56-61%). Median Cmin (400 mg/day) was significantly reduced in male patients (732 vs 899 ng/ml, p < 0.001), young patients <30 years (734 vs 802 ng/ml, p = 0.037) and under CYP3A4 inducers (758 vs 859 ng/ml, p = 0.022). Under hOCT-1 inhibitors, Cmin was increased (939 vs 827 ng/ml, p = 0.038).
Conclusion: Based on observational TDM data, the impact of imatinib Cmin > 1000 ng/ml on CCyR was not salient. Young CML patients (<30 years) and patients taking CYP3A4 inducers probably need close monitoring and possibly higher imatinib doses, owing to lower Cmin along with lower CCyR rates. Patients taking hOCT-1 inhibitors, in contrast, seem to have improved CCyR response rates. The precise role of imatinib TDM remains to be established prospectively.