958 results for Classical orthogonal polynomials of a discrete variable


Relevance: 100.00%

Abstract:

Bound and resonance states of HO2 have been calculated quantum mechanically by the Lanczos homogeneous filter diagonalization method [Zhang and Smith, Phys. Chem. Chem. Phys. 3, 2282 (2001); J. Chem. Phys. 115, 5751 (2001)] for nonzero total angular momentum J = 1, 2, 3. For the lower bound states, agreement between the results in this paper and previous work is quite satisfactory, while for the high-lying bound states and resonances these are the first reported results. A helicity quantum number V assignment (within the helicity-conserving approximation) is performed, and the results indicate that for the lower bound states it is possible to assign the V quantum numbers unambiguously, but for resonances it is impossible to assign the V helicity quantum numbers due to strong mixing. In fact, for the high-lying bound states, the mixing has already appeared. These results indicate that the helicity-conserving approximation is not adequate for resonance state calculations and that exact quantum calculations are needed to accurately describe the reaction dynamics of the HO2 system. Analysis of the resonance widths shows that most of the resonances are overlapping, and the interference between them leads to large fluctuations from one resonance to another. In accord with the conclusions from earlier J = 0 calculations, this indicates that the dissociation of HO2 is essentially irregular. (C) 2003 American Institute of Physics.
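
The core building block of such calculations is the Lanczos iteration, which tridiagonalizes the Hamiltonian in a Krylov subspace so that its extreme eigenvalues (bound-state energies) can be extracted cheaply. The following is a minimal, self-contained sketch of that plain iteration, with a random symmetric matrix standing in for the Hamiltonian; the homogeneous filter diagonalization of Zhang and Smith adds a filtering step that is not reproduced here, and all names (H, n_iter) are illustrative.

```python
# Plain Lanczos iteration: tridiagonalize H, then diagonalize the small
# tridiagonal matrix; the extreme Ritz values approximate extreme eigenvalues.
import numpy as np

def lanczos_eigenvalues(H, n_iter=60, seed=0):
    rng = np.random.default_rng(seed)
    n = H.shape[0]
    v = rng.standard_normal(n)
    v /= np.linalg.norm(v)
    V = [v]
    alphas, betas = [], []
    beta, v_prev = 0.0, np.zeros(n)
    for _ in range(n_iter):
        w = H @ v - beta * v_prev
        alpha = v @ w
        w -= alpha * v
        # full re-orthogonalization keeps this toy example numerically clean
        for u in V:
            w -= (u @ w) * u
        alphas.append(alpha)
        beta = np.linalg.norm(w)
        if beta < 1e-12:
            break
        v_prev, v = v, w / beta
        V.append(v)
        betas.append(beta)
    T = (np.diag(alphas)
         + np.diag(betas[:len(alphas) - 1], 1)
         + np.diag(betas[:len(alphas) - 1], -1))
    return np.linalg.eigvalsh(T)      # Ritz values

# toy "Hamiltonian": the lowest Ritz values converge toward the lowest eigenvalues
A = np.random.default_rng(1).standard_normal((300, 300))
H = (A + A.T) / 2
print(lanczos_eigenvalues(H)[:5])
print(np.linalg.eigvalsh(H)[:5])
```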

Relevance: 100.00%

Abstract:

Recently, operational matrices were adapted for solving several kinds of fractional differential equations (FDEs). The use of numerical techniques in conjunction with operational matrices of some orthogonal polynomials, for the solution of FDEs on finite and infinite intervals, produced highly accurate solutions for such equations. This article discusses spectral techniques based on operational matrices of fractional derivatives and integrals for solving several kinds of linear and nonlinear FDEs. More precisely, we present the operational matrices of fractional derivatives and integrals, for several polynomials on bounded domains, such as the Legendre, Chebyshev, Jacobi and Bernstein polynomials, and we use them with different spectral techniques for solving the aforementioned equations on bounded domains. The operational matrices of fractional derivatives and integrals are also presented for orthogonal Laguerre and modified generalized Laguerre polynomials, and their use with numerical techniques for solving FDEs on a semi-infinite interval is discussed. Several examples are presented to illustrate the numerical and theoretical properties of various spectral techniques for solving FDEs on finite and semi-infinite intervals.
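
As a concrete illustration of what an operational matrix is, the sketch below builds the matrix of first-order differentiation in the Legendre basis with numpy: row n holds the Legendre coefficients of P_n', so differentiating a truncated expansion reduces to a single matrix product. This is only the integer-order analogue, used here for simplicity; the fractional-order matrices discussed in the article are built entry by entry from the corresponding Caputo/Riemann-Liouville formulas.

```python
import numpy as np
from numpy.polynomial import legendre as L

def legendre_derivative_matrix(N):
    """Return D such that d/dx [P_0,...,P_N]^T = D [P_0,...,P_N]^T on [-1, 1]."""
    D = np.zeros((N + 1, N + 1))
    for n in range(N + 1):
        e = np.zeros(N + 1)
        e[n] = 1.0                # coefficients of P_n in the Legendre basis
        dcoef = L.legder(e)       # Legendre coefficients of P_n'
        D[n, :len(dcoef)] = dcoef
    return D

D = legendre_derivative_matrix(4)
print(D)   # e.g. the row for P_3: P_3' = P_0 + 5*P_2  ->  [1, 0, 5, 0, 0]
```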

Relevance: 100.00%

Abstract:

This paper proposes a novel way of testing exogeneity of an explanatory variable without any parametric assumptions in the presence of a "conditional" instrumental variable. A testable implication is derived that if an explanatory variable is endogenous, the conditional distribution of the outcome given the endogenous variable is not independent of its instrumental variable(s). The test rejects the null hypothesis with probability one if the explanatory variable is endogenous and it detects alternatives converging to the null at a rate n^{-1/2}. We propose a consistent nonparametric bootstrap test to implement this testable implication. We show that the proposed bootstrap test can be asymptotically justified in the sense that it produces asymptotically correct size under the null of exogeneity, and it has unit power asymptotically. Our nonparametric test can be applied to the cases in which the outcome is generated by an additively non-separable structural relation or in which the outcome is discrete, which has not been studied in the literature.
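
To make the bootstrap idea concrete, here is a generic sketch of a bootstrap test of independence between two observed variables, using a Kolmogorov-Smirnov-type distance between the joint empirical CDF and the product of the marginals, with the null imposed by resampling the two variables separately. The statistic, the variable names and the resampling scheme are illustrative assumptions; the paper's test conditions on the endogenous regressor and uses a more elaborate statistic.

```python
import numpy as np

def ks_independence_stat(u, z):
    """sup over the sample points of |F_{U,Z} - F_U * F_Z|."""
    Fu = (u[:, None] <= u[None, :]).mean(0)     # marginal ECDF of u at each u_j
    Fz = (z[:, None] <= z[None, :]).mean(0)
    joint = ((u[:, None] <= u[None, :]) & (z[:, None] <= z[None, :])).mean(0)
    return np.max(np.abs(joint - Fu * Fz))

def bootstrap_independence_test(u, z, n_boot=499, seed=0):
    rng = np.random.default_rng(seed)
    t_obs = ks_independence_stat(u, z)
    n = len(u)
    t_boot = np.empty(n_boot)
    for b in range(n_boot):
        # resampling u and z separately imposes the null of independence
        t_boot[b] = ks_independence_stat(rng.choice(u, n), rng.choice(z, n))
    return t_obs, (t_boot >= t_obs).mean()      # statistic and bootstrap p-value

rng = np.random.default_rng(1)
z = rng.standard_normal(200)
u = 0.8 * z + rng.standard_normal(200)          # dependent -> small p-value
print(bootstrap_independence_test(u, z))
```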

Relevance: 100.00%

Abstract:

When continuous data are coded to categorical variables, two types of coding are possible: crisp coding in the form of indicator, or dummy, variables with values either 0 or 1; or fuzzy coding, where each observation is transformed to a set of "degrees of membership" between 0 and 1, using so-called membership functions. It is well known that the correspondence analysis of crisp coded data, namely multiple correspondence analysis, yields principal inertias (eigenvalues) that considerably underestimate the quality of the solution in a low-dimensional space. Since the crisp data only code the categories to which each individual case belongs, an alternative measure of fit is simply to count how well these categories are predicted by the solution. Another approach is to consider multiple correspondence analysis equivalently as the analysis of the Burt matrix (i.e., the matrix of all two-way cross-tabulations of the categorical variables), and then perform a joint correspondence analysis to fit just the off-diagonal tables of the Burt matrix; the measure of fit is then computed as the quality of explaining these tables only. The correspondence analysis of fuzzy coded data, called "fuzzy multiple correspondence analysis", suffers from the same problem, albeit attenuated. Again, one can count how many correct predictions are made of the categories with the highest degree of membership. But here one can also defuzzify the results of the analysis to obtain estimated values of the original data, and then calculate a measure of fit in the familiar percentage form, thanks to the resultant orthogonal decomposition of variance. Furthermore, if one thinks of fuzzy multiple correspondence analysis as explaining the two-way associations between variables, a fuzzy Burt matrix can be computed and the same strategy as in the crisp case can be applied to analyse the off-diagonal part of this matrix. In this paper these alternative measures of fit are defined and applied to a data set of continuous meteorological variables, which are coded crisply and fuzzily into three categories. Measuring the fit is further discussed when the data set consists of a mixture of discrete and continuous variables.
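
For readers unfamiliar with the coding step itself, the following is a minimal sketch of fuzzy coding of a continuous variable into three categories using triangular membership functions; the choice of hinges at the minimum, median and maximum is an assumption made here for illustration, not something prescribed by the paper.

```python
import numpy as np

def fuzzy_code_three(x, hinges=None):
    """Return an (n, 3) matrix of memberships in 'low', 'medium', 'high'."""
    lo, mid, hi = hinges if hinges is not None else (np.min(x), np.median(x), np.max(x))
    F = np.zeros((len(x), 3))
    for i, v in enumerate(x):
        if v <= mid:
            t = (v - lo) / (mid - lo)
            F[i] = [1 - t, t, 0.0]     # shared between 'low' and 'medium'
        else:
            t = (v - mid) / (hi - mid)
            F[i] = [0.0, 1 - t, t]     # shared between 'medium' and 'high'
    return F

x = np.array([2.0, 5.0, 9.0, 13.0, 20.0])
print(fuzzy_code_three(x))   # each row sums to 1; crisp coding would be 0/1
```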

Relevance: 100.00%

Abstract:

For the standard kernel density estimate, it is known that one can tune the bandwidth such that the expected L1 error is within a constant factor of the optimal L1 error (obtained when one is allowed to choose the bandwidth with knowledge of the density). In this paper, we pose the same problem for variable bandwidth kernel estimates where the bandwidths are allowed to depend upon the location. We show in particular that for positive kernels on the real line, for any data-based bandwidth, there exists a density for which the ratio of expected L1 error over optimal L1 error tends to infinity. Thus, the problem of tuning the variable bandwidth in an optimal manner is "too hard". Moreover, from the class of counterexamples exhibited in the paper, it appears that placing conditions on the densities (monotonicity, convexity, smoothness) does not help.
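
For concreteness, this is what a variable-bandwidth (sample-point) kernel estimate with a positive kernel looks like; the Abramson-style pilot rule below is a standard textbook choice used purely for illustration, and the paper's point is precisely that no data-based rule of this kind can be within a constant factor of the optimal L1 error for every density.

```python
import numpy as np

def variable_bandwidth_kde(x_grid, data, h0=0.5, alpha=0.5):
    """Sample-point estimate with local bandwidths h_i = h0 * (pilot(X_i)/g)^(-alpha)."""
    # fixed-bandwidth Gaussian pilot estimate evaluated at the data points
    pilot = (np.exp(-0.5 * ((data[:, None] - data[None, :]) / h0) ** 2).mean(1)
             / (h0 * np.sqrt(2 * np.pi)))
    g = np.exp(np.mean(np.log(pilot)))          # geometric mean of the pilot
    h = h0 * (pilot / g) ** (-alpha)            # location-dependent bandwidths
    k = np.exp(-0.5 * ((x_grid[:, None] - data[None, :]) / h[None, :]) ** 2)
    return (k / (h[None, :] * np.sqrt(2 * np.pi))).mean(1)

data = np.random.default_rng(0).standard_normal(500)
grid = np.linspace(-4.0, 4.0, 401)
f_hat = variable_bandwidth_kde(grid, data)
print(f_hat.sum() * (grid[1] - grid[0]))        # integrates to approximately 1
```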

Relevance: 100.00%

Abstract:

This paper analyzes the nature of health care provider choice in the case of patient-initiated contacts, with special reference to a National Health Service setting, where monetary prices are zero and general practitioners act as gatekeepers to publicly financed specialized care. We focus our attention on the factors that may explain the continuously increasing use of hospital emergency visits as opposed to other provider alternatives. An extended version of a discrete choice model of demand for patient-initiated contacts is presented, allowing for individual and town residence size differences in perceived quality (preferences) between alternative providers and including travel and waiting time as non-monetary costs. Results of a nested multinomial logit model of provider choice are presented. Individual choice between alternatives considers, in a repeated nested structure, self-care, primary care, hospital and clinic emergency services. Welfare implications and income effects are analyzed by computing compensating variations, and by simulating the effects of user fees by levels of income. Results indicate that compensating variation per visit is higher than the direct marginal cost of emergency visits, and consequently, emergency visits do not appear as an inefficient alternative even for non-urgent conditions.
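
As a reminder of the mechanics behind such a model, the sketch below computes two-level nested-logit choice probabilities from alternative utilities, nest memberships and nest dissimilarity parameters; the particular nesting of the four alternatives and all numerical values are assumptions made for illustration and are not taken from the paper.

```python
import numpy as np

def nested_logit_probs(utilities, nests, lambdas):
    """utilities: alt -> V_j; nests: nest -> list of alts; lambdas: nest -> lambda_m."""
    # inclusive value of each nest: IV_m = lambda_m * log sum_j exp(V_j / lambda_m)
    inclusive = {m: lambdas[m] * np.log(sum(np.exp(utilities[j] / lambdas[m]) for j in alts))
                 for m, alts in nests.items()}
    denom = sum(np.exp(iv) for iv in inclusive.values())
    probs = {}
    for m, alts in nests.items():
        p_nest = np.exp(inclusive[m]) / denom                       # P(nest m)
        within = sum(np.exp(utilities[j] / lambdas[m]) for j in alts)
        for j in alts:
            probs[j] = p_nest * np.exp(utilities[j] / lambdas[m]) / within  # P(j | m)
    return probs

V = {"self_care": 0.0, "primary_care": 0.8, "hospital_er": 1.1, "clinic_er": 0.9}
nests = {"no_visit": ["self_care"], "gp": ["primary_care"],
         "emergency": ["hospital_er", "clinic_er"]}
lam = {"no_visit": 1.0, "gp": 1.0, "emergency": 0.6}
p = nested_logit_probs(V, nests, lam)
print(p, sum(p.values()))   # probabilities sum to 1
```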

Relevance: 100.00%

Abstract:

A biplot, which is the multivariate generalization of the two-variable scatterplot, can be used to visualize the results of many multivariate techniques, especially those that are based on the singular value decomposition. We consider data sets consisting of continuous-scale measurements, their fuzzy coding and the biplots that visualize them, using a fuzzy version of multiple correspondence analysis. Of special interest is the way quality of fit of the biplot is measured, since it is well-known that regular (i.e., crisp) multiple correspondence analysis seriously under-estimates this measure. We show how the results of fuzzy multiple correspondence analysis can be defuzzified to obtain estimated values of the original data, and prove that this implies an orthogonal decomposition of variance. This permits a measure of fit to be calculated in the familiar form of a percentage of explained variance, which is directly comparable to the corresponding fit measure used in principal component analysis of the original data. The approach is motivated initially by its application to a simulated data set, showing how the fuzzy approach can lead to diagnosing nonlinear relationships, and finally it is applied to a real set of meteorological data.
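
The defuzzification step mentioned here is simple to state: with triangular membership functions anchored at hinge values, the estimated value of an observation is the membership-weighted average of the hinges, which exactly inverts the fuzzy coding for in-range values. A minimal sketch, with illustrative hinges and memberships (matching the fuzzy-coding sketch given earlier):

```python
import numpy as np

hinges = np.array([2.0, 9.0, 20.0])        # low / medium / high anchor values
F = np.array([[1.0, 0.0, 0.0],             # fuzzy-coded rows (memberships sum to 1)
              [4/7, 3/7, 0.0],
              [0.0, 1.0, 0.0],
              [0.0, 7/11, 4/11],
              [0.0, 0.0, 1.0]])
x_hat = F @ hinges                         # defuzzified (estimated) values
print(x_hat)                               # -> [ 2.  5.  9. 13. 20.]
```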

Relevance: 100.00%

Abstract:

Many dynamic revenue management models divide the sale period into a finite number of periods T and assume, invoking a fine-enough grid of time, that each period sees at most one booking request. These Poisson-type assumptions restrict the variability of the demand in the model, but researchers and practitioners have been willing to overlook this for the benefit of tractability of the models. In this paper, we criticize this model from another angle. Estimating the discrete finite-period model poses problems of indeterminacy and non-robustness: arbitrarily fixing T leads to arbitrary control values, while estimating T from data adds an additional layer of indeterminacy. To counter this, we first propose an alternate finite-population model that avoids this problem of fixing T and allows a wider range of demand distributions, while retaining the useful marginal-value properties of the finite-period model. The finite-population model still requires jointly estimating market size and the parameters of the customer purchase model without observing no-purchases. Estimation of market size when no-purchases are unobservable has rarely been attempted in the marketing or revenue management literature. Indeed, we point out that it is akin to the classical statistical problem of estimating the parameters of a binomial distribution with unknown population size and success probability, and hence likely to be challenging. However, when the purchase probabilities are given by a functional form such as a multinomial-logit model, we propose an estimation heuristic that exploits the specification of the functional form, the variety of the offer sets in a typical RM setting, and qualitative knowledge of arrival rates. Finally, we perform simulations to show that the estimator is very promising in obtaining unbiased estimates of population size and the model parameters.
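
To see why the authors call this estimation problem challenging, consider its stripped-down statistical core: jointly estimating a binomial population size N and success probability p from observed counts. The hedged sketch below does this with a simple profile likelihood over N and is meant only to illustrate how flat that profile typically is; it is not the heuristic proposed in the paper, which additionally exploits the multinomial-logit structure, the variety of offer sets and qualitative knowledge of arrival rates.

```python
import numpy as np
from scipy.stats import binom

rng = np.random.default_rng(0)
x = rng.binomial(n=50, p=0.3, size=25)        # observed counts, true N = 50

def profile_loglik(N, x):
    p_hat = x.mean() / N                      # MLE of p for this candidate N
    return binom.logpmf(x, N, p_hat).sum()

candidates = np.arange(x.max(), 400)
ll = np.array([profile_loglik(N, x) for N in candidates])
N_hat = candidates[ll.argmax()]
# N_hat can land far from the true value of 50 (sometimes at the edge of the
# candidate grid) because the profile likelihood is nearly flat in N.
print(N_hat, x.mean() / N_hat)
```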

Relevance: 100.00%

Abstract:

Protein electrophoresis was used to assess the phylogenetic relationships of populations of the phenotypically variable Asian house shrew Suncus murinus. These populations represent a sample of both commensal and wild forms. They were compared to another taxon, S. montanus, which was formerly considered conspecific with S. murinus. Suncus dayi was used as an outgroup in all phylogenetic reconstructions. Within the S. murinus lineage, the allozyme data show very low levels of genetic differentiation among both wild and commensal Southeast Asian and Japanese samples when compared to the Indian populations. This pattern is consistent with the classical hypothesis of a recent introduction by man in Eastern Asia. The higher genetic diversity found within S. murinus from India, as well as previous mitochondrial and karyological results suggest that this area is the probable centre of origin for the species. Although the lack of gene flow between S. murinus and S. montanus is clearly established in an area of sympatry in Southern India, one Asian house shrew sampled in Nepal was more closely related to S. montanus. This could either reflect the retention of an ancestral polymorphism, or result from a hybridization episode between S. murinus and S. montanus. Similar conclusions were also suggested in mitochondrial DNA studies dealing with animals sampled in the Northern parts of the Indian subcontinent. Clearly, further data on Suncus from this area are needed in order to assess these hypotheses. (C) 1995 The Linnean Society of London

Relevance: 100.00%

Abstract:

The purpose of this thesis is twofold. The first and major part is devoted to sensitivity analysis of various discrete optimization problems, while the second part addresses methods applied for calculating measures of solution stability and solving multicriteria discrete optimization problems. Despite numerous approaches to stability analysis of discrete optimization problems, two major directions can be singled out: quantitative and qualitative. Qualitative sensitivity analysis is conducted for multicriteria discrete optimization problems with minisum, minimax and minimin partial criteria. The main results obtained here are necessary and sufficient conditions for different stability types of optimal solutions (or a set of optimal solutions) of the considered problems. Within the framework of the quantitative direction, various measures of solution stability are investigated. A formula for a quantitative characteristic called the stability radius is obtained for the generalized equilibrium situation invariant to changes of game parameters in the case of the Hölder metric. Quality of the problem solution can also be described in terms of robustness analysis. In this work the concepts of accuracy and robustness tolerances are presented for a strategic game with a finite number of players where initial coefficients (costs) of linear payoff functions are subject to perturbations. Investigation of the stability radius also aims to devise methods for its calculation. A new metaheuristic approach is derived for calculation of the stability radius of an optimal solution to the shortest path problem. The main advantage of the developed method is that it is potentially applicable for calculating stability radii of NP-hard problems. The last chapter of the thesis focuses on deriving innovative methods based on an interactive optimization approach for solving multicriteria combinatorial optimization problems. The key idea of the proposed approach is to utilize a parameterized achievement scalarizing function for solution calculation and to direct the interactive procedure by changing the weighting coefficients of this function. In order to illustrate the introduced ideas, a decision-making process is simulated for a three-objective median location problem. The concepts, models and ideas collected and analyzed in this thesis create good and relevant grounds for developing more complicated and integrated models of postoptimal analysis and for solving the most computationally challenging problems related to it.
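
As an illustration of the kind of parameterized achievement scalarizing function meant here, the sketch below implements the common Wierzbicki-type form for minimized objectives, s(f; z_ref, w) = max_i w_i (f_i - z_ref_i) + rho * sum_i w_i (f_i - z_ref_i); the reference point, weights and objective vectors are made-up numbers, and the interactive procedure described in the thesis varies the weights w between iterations.

```python
import numpy as np

def achievement_scalarizing(f, z_ref, w, rho=1e-4):
    """Wierzbicki-type ASF for objectives to be minimized."""
    d = np.asarray(w) * (np.asarray(f) - np.asarray(z_ref))
    return d.max() + rho * d.sum()

# two candidate solutions of a three-objective (e.g. median location) problem
f_a = np.array([3.0, 5.0, 2.0])
f_b = np.array([3.5, 3.5, 2.5])
z_ref = np.array([2.0, 3.0, 2.0])          # decision maker's reference point
w = np.array([1.0, 1.0, 1.0])
print(achievement_scalarizing(f_a, z_ref, w),
      achievement_scalarizing(f_b, z_ref, w))
# f_b has the smaller ASF value and is preferred for these weights; changing w
# steers the interactive search toward other Pareto-optimal solutions.
```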

Relevance: 100.00%

Abstract:

A single electroconvulsive shock (ECS) or a sham ECS was administered to male 3-4-month-old Wistar rats 1, 2, and 4 h before training in an inhibitory avoidance test and in cued classical fear conditioning (measured by means of freezing time in a new environment). ECS impaired inhibitory avoidance at all times and, at 1 or 2 h before training, reduced freezing time before and after re-presentation of the ECS. These results are interpreted as a transient conditioned stimulus (CS)-induced anxiolytic or analgesic effect lasting about 2 h after a single treatment, in addition to the known amnesic effect of the stimulus. This suggests that the effect of anterograde learning impairment is demonstrated unequivocally only when the analgesic/anxiolytic effect is over (about 4 h after ECS administration) and that this impairment of learning is selective, affecting inhibitory avoidance but not classical fear conditioning to a discrete stimulus.

Relevance: 100.00%

Abstract:

Early stimulation has been shown to produce long-lasting effects in many species. Prenatal exposure to some strong stressors may affect development of the nervous system leading to behavioral impairment in adult life. The purpose of the present work was to study the postnatal harmful effects of exposure to variable mild stresses in rats during pregnancy. Female Holtzman rats were submitted daily to one session of a chronic variable stress (CVS) during pregnancy (prenatal stress; PS group). Control pregnant rats (C group) were undisturbed. The pups of PS and C dams were weighed and separated into two groups 48 h after delivery. One group was maintained with their own dams (PS group, N = 70; C group, N = 36) while the other PS pups were cross-fostered with C dams (PSF group, N = 47) and the other C pups were cross-fostered with PS dams (CF group, N = 58). Pups were undisturbed until weaning (postnatal day 28). The male offspring underwent motor activity tests (day 28), enriched environment tests (day 37) and social interaction tests (day 42) in an animal activity monitor. Body weight was recorded on days 2, 28 and 60. The PS pups showed lower birth weight than C pups (Duncan's test, P<0.05). The PS pups suckling with their stressed mothers displayed greater preweaning mortality (C: 23%, PS: 60%; χ² test, P<0.05) and lower body weight than controls at days 28 and 60 (Duncan's test, P<0.05 and P<0.01, respectively). The PS, PSF and CF groups showed lower motor activity scores than controls when tested at day 28 (Duncan's test, P<0.01 for PS group and P<0.05 for CF and PSF groups). In the enriched environment test performed on day 37, between-group differences in total motor activity were not detected; however, the PS, CF and PSF groups displayed less exploration time than controls (Duncan's test, P<0.05). Only the PS group showed impaired motor activity and impaired social behavior at day 42 (Duncan's test, P<0.05). In fact, CVS treatment during gestation plus suckling with a previously stressed mother caused long-lasting physical and behavioral changes in rats. Cross-fostering PS-exposed pups to a dam which was not submitted to stress counteracted most of the harmful effects of the treatment. It is probable that prenatal stress plus suckling from a previously stressed mother can induce long-lasting changes in the neurotransmitter systems involved in emotional regulation. Further experiments using neurochemical and pharmacological approaches would be interesting in this model.

Relevance: 100.00%

Abstract:

In this paper, we introduce a new approach for volatility modeling in discrete and continuous time. We follow the stochastic volatility literature by assuming that the variance is a function of a state variable. However, instead of assuming that the loading function is ad hoc (e.g., exponential or affine), we assume that it is a linear combination of the eigenfunctions of the conditional expectation (resp. infinitesimal generator) operator associated to the state variable in discrete (resp. continuous) time. Special examples are the popular log-normal and square-root models, where the eigenfunctions are the Hermite and Laguerre polynomials, respectively. The eigenfunction approach has at least six advantages: i) it is general since any square integrable function may be written as a linear combination of the eigenfunctions; ii) the orthogonality of the eigenfunctions leads to the traditional interpretations of the linear principal components analysis; iii) the implied dynamics of the variance and squared return processes are ARMA and, hence, simple for forecasting and inference purposes; iv) more importantly, this generates fat tails for the variance and returns processes; v) in contrast to popular models, the variance of the variance is a flexible function of the variance; vi) these models are closed under temporal aggregation.
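
A quick numerical check of the eigenfunction property in the log-normal (Gaussian AR(1) / Ornstein-Uhlenbeck) case may help: the probabilists' Hermite polynomials satisfy E[He_n(X_{t+1}) | X_t = x] = rho^n He_n(x), so they are eigenfunctions of the conditional expectation operator with eigenvalues rho^n. The sketch below verifies this by Monte Carlo with illustrative parameter values.

```python
import numpy as np
from numpy.polynomial.hermite_e import hermeval

rho, n, x0 = 0.9, 3, 0.7
rng = np.random.default_rng(0)
# simulate many one-step transitions of the stationary Gaussian AR(1) from x0
x1 = rho * x0 + np.sqrt(1 - rho**2) * rng.standard_normal(1_000_000)

coeffs = np.zeros(n + 1)
coeffs[n] = 1.0                          # coefficients selecting He_n
lhs = hermeval(x1, coeffs).mean()        # Monte Carlo estimate of E[He_n(X_1) | X_0 = x0]
rhs = rho**n * hermeval(x0, coeffs)      # rho^n * He_n(x0)
print(lhs, rhs)                          # the two numbers agree closely
```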

Relevance: 100.00%

Abstract:

A random variable is a mathematical function that assigns numerical values to each of the possible outcomes of a random event. If the number of these outcomes can be counted, the set is discrete; conversely, when the number of outcomes is infinite and cannot be counted, the set is continuous. The purpose of the random variable is to enable probabilistic and statistical studies by establishing a numerical assignment through which each of the outcomes that may arise in a given event can be identified. The expected value and the variance are the parameters by which the behaviour of the data gathered in an experimental situation can be characterized: the expected value establishes the value around which the probability distribution is centred, while the variance provides information about how the data obtained are spread. In addition, probability distributions are numerical functions associated with the random variable that describe the assignment of probability to each element of the sample space; they are characterized by a set of parameters that determine their functional behaviour, that is, each of the distribution's parameters supplies information about the random experiment with which it is associated. The document closes with an application of the random variable to decision-making processes involving conditions of risk and uncertainty.
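
A minimal worked example of the two parameters described above, for the discrete random variable "outcome of one roll of a fair six-sided die":

```python
import numpy as np

values = np.arange(1, 7)          # possible outcomes of the random variable
probs = np.full(6, 1 / 6)         # probability assigned to each outcome

expected_value = np.sum(values * probs)                       # E[X] = sum x_i p_i
variance = np.sum((values - expected_value) ** 2 * probs)     # Var(X) = E[(X - E[X])^2]
print(expected_value, variance)                               # 3.5 and 35/12 ~ 2.9167
```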

Relevance: 100.00%

Abstract:

Estimation of a population size by means of capture-recapture techniques is an important problem occurring in many areas of the life and social sciences. We consider the frequencies-of-frequencies situation, where a count variable is used to summarize how often a unit has been identified in the target population of interest. The distribution of this count variable is zero-truncated, since zero identifications do not occur in the sample. As an application we consider the surveillance of scrapie in Great Britain. In this case study, holdings with scrapie that are not identified (zero counts) do not enter the surveillance database. The count variable of interest is the number of scrapie cases per holding. For count distributions a common model is the Poisson distribution and, to adjust for potential heterogeneity, a discrete mixture of Poisson distributions is used. Mixtures of Poissons usually provide an excellent fit, as will be demonstrated in the application of interest. However, as has recently been demonstrated, mixtures also suffer from the so-called boundary problem, resulting in overestimation of population size. It is suggested here to select the mixture model on the basis of the Bayesian Information Criterion. This strategy is further refined by employing a bagging procedure leading to a series of estimates of population size. Using the median of this series, highly influential size estimates are avoided. In limited simulation studies it is shown that the procedure leads to estimates with remarkably small bias.
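
For orientation, the simplest one-component version of this estimation problem can be sketched in a few lines: fit a zero-truncated Poisson to the observed counts and estimate the number of unobserved zero-count holdings with a Horvitz-Thompson-type correction, N_hat = n_observed / (1 - exp(-lambda_hat)). The data below are made up, and the paper's actual procedure fits discrete mixtures of Poissons selected by BIC and adds a bagging step.

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import poisson

# illustrative counts of cases per identified holding (zeros are unobserved)
counts = np.array([1]*60 + [2]*20 + [3]*8 + [4]*3 + [5]*1)

def neg_loglik(lam):
    # zero-truncated Poisson: P(X = k | X > 0) = pois(k; lam) / (1 - e^(-lam))
    return -(poisson.logpmf(counts, lam) - np.log1p(-np.exp(-lam))).sum()

lam_hat = minimize_scalar(neg_loglik, bounds=(1e-6, 20), method="bounded").x
n_obs = len(counts)
N_hat = n_obs / (1 - np.exp(-lam_hat))    # estimated total: observed + unobserved units
print(lam_hat, N_hat)
```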