989 results for Progressive Asymptotic Approach


Relevance: 30.00%

Abstract:

Several authors have recently discussed the limited dependent variable regression model with serial correlation between residuals. The pseudo-maximum likelihood estimators obtained by ignoring serial correlation altogether have been shown to be consistent. We present alternative pseudo-maximum likelihood estimators which are obtained by ignoring serial correlation only selectively. Monte Carlo experiments on a model with first-order serial correlation suggest that our alternative estimators have substantially lower mean-squared errors in medium and small samples, especially when the serial correlation coefficient is high. The same experiments also suggest that the true level of the confidence intervals established with our estimators by assuming asymptotic normality is somewhat lower than the intended level. Although the paper focuses on models with only first-order serial correlation, the generalization of the proposed approach to serial correlation of higher order is also discussed briefly.
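As a hedged illustration of the baseline estimator discussed here, the sketch below (our own construction, not the paper's design; all parameter values are arbitrary) simulates a probit model whose latent errors follow an AR(1) process and fits it by pseudo-maximum likelihood that ignores the serial correlation entirely. The point estimates remain consistent, as the abstract notes, even though the dependence is misspecified; the paper's selective variants would instead retain part of the correlation structure in the likelihood.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n, rho = 500, 0.8
beta = np.array([0.5, 1.0])

x = rng.normal(size=n)
u = np.empty(n)
u[0] = rng.normal()                                # stationary start
e = rng.normal(size=n) * np.sqrt(1 - rho**2)       # unit marginal variance
for t in range(1, n):
    u[t] = rho * u[t - 1] + e[t]                   # AR(1) disturbances

y = (beta[0] + beta[1] * x + u > 0).astype(float)  # observed binary outcome

# Pseudo-ML ignoring serial correlation altogether: an ordinary probit.
fit = sm.Probit(y, sm.add_constant(x)).fit(disp=0)
print(fit.params)   # close to beta: the point estimates remain consistent
```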

Relevance: 30.00%

Abstract:

The technique of Monte Carlo (MC) tests [Dwass (1957), Barnard (1963)] provides an attractive method of building exact tests from statistics whose finite-sample distribution is intractable but can be simulated (provided it does not involve nuisance parameters). We extend this method in two ways: first, by allowing for MC tests based on exchangeable, possibly discrete, test statistics; second, by generalizing the method to statistics whose null distributions involve nuisance parameters (maximized MC tests, MMC). Simplified, asymptotically justified versions of the MMC method are also proposed, and it is shown that they provide a simple way of improving standard asymptotics and dealing with nonstandard asymptotics (e.g., unit-root asymptotics). Parametric bootstrap tests may be interpreted as a simplified version of the MMC method (without the general validity properties of the latter).
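For concreteness, here is a minimal sketch of the basic Dwass/Barnard MC test that the paper takes as its starting point (our code; the sample-maximum statistic and all names are illustrative, and no nuisance parameters are involved):

```python
import numpy as np

def mc_pvalue(stat_obs, simulate_stat, B=999, rng=None):
    """Exact Monte Carlo p-value: rank the observed statistic among B
    simulated null replicates (Dwass 1957; Barnard 1963)."""
    rng = rng or np.random.default_rng()
    sims = np.array([simulate_stat(rng) for _ in range(B)])
    return (1 + np.sum(sims >= stat_obs)) / (B + 1)

# Example: test H0 "data are N(0,1)" with the sample maximum as statistic.
rng = np.random.default_rng(1)
data = rng.normal(size=30)
p = mc_pvalue(data.max(), lambda r: r.normal(size=30).max(), rng=rng)
print(p)   # for a continuous statistic the test has exact level by construction
```

The MMC extension would maximize this p-value over the nuisance-parameter space, while the parametric bootstrap corresponds to evaluating it at a point estimate instead.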

Relevance: 30.00%

Abstract:

Quantum information theory studies the fundamental limits that the laws of physics impose on data-processing tasks such as data compression and the transmission of data over a noisy channel. This thesis presents general techniques that allow several fundamental problems of quantum information theory to be solved within a single common framework. The central theorem of this thesis establishes the existence of a protocol for transmitting quantum data that the receiver already partially knows, using a single use of a noisy quantum channel. Several central theorems of quantum information theory follow from it as immediate corollaries. The subsequent chapters use this theorem to prove the existence of new protocols for two other types of quantum channels, namely quantum broadcast channels and quantum channels with side information available at the transmitter. These protocols also deal with the transmission of quantum data partially known to the receiver using a single use of the channel, and they have as corollaries asymptotic versions with and without auxiliary entanglement. In both cases, the asymptotic versions with auxiliary entanglement can be regarded as quantum versions of the best coding theorems known for the classical versions of these problems. The last chapter deals with a purely quantum phenomenon called locking: it is possible to encode a classical message into a quantum state such that, by removing from it a subsystem whose size is logarithmic in the total size, one can guarantee that no measurement has any significant correlation with the message. The message is thus "locked" by a key of logarithmic size. This thesis presents the first locking protocol whose success criterion is that the trace distance between the joint distribution of the message and the measurement outcome and the product of their marginals be sufficiently small.
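In symbols, the success criterion of the locking protocol described in the last chapter can be written as follows (notation ours, inferred from the abstract):

```latex
% Success criterion of the locking protocol (notation ours): M is the
% classical message, X the outcome of an arbitrary measurement performed
% after the logarithmic-size key subsystem has been removed.
\[
  \bigl\| \, p_{MX} - p_M \otimes p_X \, \bigr\|_1 \;\le\; \epsilon ,
\]
% i.e. the trace distance between the joint distribution of message and
% measurement outcome and the product of their marginals is small, so no
% measurement is significantly correlated with the message.
```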

Relevance: 30.00%

Abstract:

We derive a universal model for atom pairs interacting with non-resonant light via the polarizability anisotropy, based on the long-range properties of the scattering. The corresponding dynamics can be obtained using a nodal-line technique to solve the asymptotic Schrödinger equation: physical boundary conditions are imposed at long range, and the wavefunction is required to vanish at a position separating the inner zone from the asymptotic region. We show that nodal lines which depend on the intensity of the non-resonant light can satisfactorily account for the effect of the polarizability at short range. The approach allows us to determine the resonance structure (energy, width, channel mixing and hybridization) even for narrow resonances.
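A minimal numerical sketch of the nodal-line idea (our own illustration with arbitrary units and parameter values, not the authors' code): the unknown short-range physics is replaced by the boundary condition u(r0) = 0, the asymptotic radial equation with a -C6/r^6 tail is integrated outward with the Numerov scheme, and the scattering phase shift is read off at long range.

```python
import numpy as np

C6, L, E = 1.0, 0, 1e-4          # -C6/r^6 tail, s-wave, collision energy
r0, h, N = 0.5, 1e-3, 40_000     # nodal-line position r0 and radial grid

def f(r):
    # radial equation u'' = f(r) u in units where 2m/hbar^2 = 1
    return -C6 / r**6 - E + L * (L + 1) / r**2

r = r0 + h * np.arange(N)
u = np.zeros(N)
u[0], u[1] = 0.0, 1e-6           # u(r0) = 0 imposes the nodal line
h2 = h * h
for n in range(1, N - 1):        # Numerov propagation
    u[n + 1] = (2 * u[n] * (1 + 5 * h2 * f(r[n]) / 12)
                - u[n - 1] * (1 - h2 * f(r[n - 1]) / 12)) / (1 - h2 * f(r[n + 1]) / 12)

# At long range u(r) ~ sin(k r + delta); extract delta from two points.
k = np.sqrt(E)
i1, i2 = N - 20_000, N - 1
u1, u2, r1, r2 = u[i1], u[i2], r[i1], r[i2]
delta = np.arctan2(u1 * np.sin(k * r2) - u2 * np.sin(k * r1),
                   u2 * np.cos(k * r1) - u1 * np.cos(k * r2))
print("s-wave phase shift:", delta)
```

An intensity-dependent nodal line would shift r0 with the field; scanning E for rapid phase-shift variation locates the resonances.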

Relevance: 30.00%

Abstract:

Non-resonant light interacting with diatomics via the polarizability anisotropy couples different rotational states and may lead to strong hybridization of the motion. The modification of shape resonances and low-energy scattering states due to this interaction can be fully captured by an asymptotic model based on the long-range properties of the scattering (Crubellier et al 2015 New J. Phys. 17 045020). Remarkably, the properties of the field-dressed shape resonances in this asymptotic multi-channel description are found to be approximately linear in the field intensity up to fairly large intensities. This suggests that a perturbative single-channel approach should be sufficient to study the control of such resonances by the non-resonant field. The multi-channel results furthermore indicate that the dependence on field intensity presents, at least approximately, universal characteristics. Here we combine the nodal-line technique for solving the asymptotic Schrödinger equation with perturbation theory. Comparing our single-channel results to those obtained with the full interaction potential, we find that nodal lines depending only on the field-free scattering length of the diatom yield an approximate but universal description of the field-dressed molecule, confirming the universal behavior.
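The claimed near-linearity in intensity is just what first-order perturbation theory predicts. The toy check below (ours, using a generic symmetric matrix model rather than the molecular Hamiltonian) compares exact eigenvalues of H0 + I*V with the first-order estimate E0 + I*<psi0|V|psi0> over a range of "intensities" I:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50
H0 = np.diag(np.sort(rng.normal(size=n)))          # unperturbed levels
V = rng.normal(size=(n, n))
V = (V + V.T) / 20.0                               # weak symmetric coupling

E0 = H0[0, 0]                                      # lowest unperturbed level
psi0 = np.eye(n)[:, 0]                             # its eigenvector
slope = psi0 @ V @ psi0                            # first-order coefficient

for I in (0.0, 0.1, 0.2, 0.4):                     # field "intensity"
    exact = np.linalg.eigvalsh(H0 + I * V)[0]
    print(f"I={I:.1f}  exact={exact:+.4f}  linear={E0 + I * slope:+.4f}")
```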

Relevance: 30.00%

Abstract:

The crisis that broke out in the United States mortgage market in 2008, and which managed to spread throughout the entire financial system, exposed the degree of interconnection that currently exists among the institutions of the sector and their relationships with the productive sector, making evident the need to identify and characterize the systemic risk inherent in the system, so that regulators can pursue the stability both of individual institutions and of the system as a whole. This document shows, through a model that combines the informative power of networks with a spatial autoregressive (panel-type) model, the importance of adding to the micro-prudential approach (proposed in Basel II) a variable that captures the effect of being connected to other institutions, thereby carrying out a macro-prudential analysis (proposed in Basel III).
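A schematic sketch of such a model (our construction on synthetic data; the network, variables and parameter values are hypothetical) fits a spatial autoregressive specification y = rho*W*y + X*beta + eps, where W is a row-normalized adjacency matrix of the interbank network, so rho captures the effect of being connected to other institutions:

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(0)
n = 50
A = (rng.random((n, n)) < 0.1).astype(float)           # hypothetical exposures
np.fill_diagonal(A, 0.0)
W = A / np.maximum(A.sum(axis=1, keepdims=True), 1.0)  # row-normalized network

X = np.column_stack([np.ones(n), rng.normal(size=n)])  # institution controls
beta_true, rho_true = np.array([1.0, 0.5]), 0.4
y = np.linalg.solve(np.eye(n) - rho_true * W,
                    X @ beta_true + rng.normal(size=n))

def neg_loglik(r):
    # concentrated SAR log-likelihood in rho (beta and sigma^2 profiled out)
    Ay = (np.eye(n) - r * W) @ y
    b = np.linalg.lstsq(X, Ay, rcond=None)[0]
    e = Ay - X @ b
    _, logdet = np.linalg.slogdet(np.eye(n) - r * W)
    return 0.5 * n * np.log(e @ e / n) - logdet

res = minimize_scalar(neg_loglik, bounds=(-0.9, 0.9), method="bounded")
print("estimated network effect rho:", res.x)
```

The panel version in the document would add time and entity dimensions; this cross-sectional sketch only shows how the network term enters the regression.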

Relevance: 30.00%

Abstract:

The use of cryopreserved sperm in the artificial insemination (AI) of livestock species allows, among other advantages, better sanitary control and the creation of germplasm banks of high genetic value. In the pig market, most inseminations are still performed with chilled semen, owing to the success of long-term extenders and also to the sensitivity of boar sperm to cryopreservation. Although this sensitivity stems from particular features of sperm physiology in this species, some ejaculates maintain their sperm-quality parameters after cryopreservation (good "freezability" ejaculates, GFEs), whereas others do not survive the process (poor "freezability" ejaculates, PFEs). The first objective of the study was to compare the two groups in terms of in vivo fertility. The second objective was to test the efficiency of post-cervical insemination (post-CAI) with cryopreserved sperm. The third objective was to look for predictors of ejaculate freezability, both in GFEs and in PFEs and at three steps of the cryopreservation process (at 17°C, at 5°C and at 240 min post-thawing). This objective was pursued by evaluating conventional sperm-quality parameters and by studying, under the microscope, the localization and reactivity of three proteins (GLUT3, HSP90AA1 and Cu/ZnSOD) related to sperm physiology and with possible roles in freezability. The fourth objective was to quantify the expression of the three proteins by western blot, in spermatozoa from both GFE and PFE ejaculates and at the three steps mentioned above, in order to determine their potential as predictors of freezability. For the first and second objectives, 86 sows were inseminated by post-CAI with 26 ejaculates from Piétrain boars, each divided into a portion chilled at 17°C (control treatment) and a cryopreserved portion, both portions classified in turn as GFEs or PFEs. The most relevant results showed that the probability of pregnancy was two times lower in inseminations with cryopreserved sperm from PFE ejaculates (P < 0.05) than in inseminations with cryopreserved sperm from GFE ejaculates, indicating that ejaculates with high percentages of progressively motile spermatozoa and of membrane integrity (above 40% in GFEs) are more likely to produce pregnancies than ejaculates with poor in vitro sperm function (PFEs). Neither the number of sows that farrowed, nor the litter size, nor the risk of sperm backflow differed significantly between inseminations with cryopreserved sperm from GFE ejaculates and control inseminations with chilled semen, demonstrating the good applicability of post-CAI with cryopreserved sperm. Finally, for the third and fourth objectives, 29 and 11 ejaculates from Piétrain boars, respectively, were cryopreserved. Two sperm kinetic parameters, linearity (LIN) and straightness (STR), showed greater hyperactivation of motility in PFE than in GFE ejaculates after 30 min at 5°C during cryopreservation. Moreover, the combination of the two parameters predicted the freezability of boar ejaculates with a reliability close to 72%. Although it was not possible to predict freezability by evaluating the three proteins under the microscope, the western blot results revealed differences in HSP90AA1 expression in sperm at 17°C, most likely related to the better survival of spermatozoa from GFE ejaculates after cryopreservation. These results suggest that promoting the cryopreservation of boar sperm for use in AI requires the development of tests for predicting freezability in chilled semen.

Relevance: 30.00%

Abstract:

The problem of estimating the individual probabilities of a discrete distribution is considered. The true distribution of the independent observations is a mixture of a family of power series distributions. First, we ensure identifiability of the mixing distribution under mild conditions. Next, the mixing distribution is estimated by non-parametric maximum likelihood, and an estimator for the individual probabilities is obtained from the corresponding marginal mixture density. We establish asymptotic normality for the estimator of the individual probabilities by showing that, under certain conditions, the difference between this estimator and the empirical proportions is asymptotically negligible. Our framework includes Poisson, negative binomial and logarithmic series as well as binomial mixture models. Simulations highlight the benefit of achieving normality when using the proposed marginal mixture density approach instead of the empirical one, especially for small sample sizes and/or when interest is in the tail areas. A real-data example is given to illustrate the use of the methodology.
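As a rough sketch of the estimator (ours; a fixed-grid EM stands in for the full non-parametric MLE), for a Poisson mixture one can estimate the mixing weights and then read the individual probabilities off the marginal mixture density rather than from the empirical proportions:

```python
import numpy as np
from scipy.stats import poisson

rng = np.random.default_rng(0)
data = np.concatenate([rng.poisson(1.0, 300), rng.poisson(5.0, 200)])

grid = np.linspace(0.1, 10.0, 60)               # fixed support for mixing dist.
w = np.full(grid.size, 1.0 / grid.size)         # initial mixing weights
like = poisson.pmf(data[:, None], grid[None, :])

for _ in range(500):                            # EM updates of the weights
    post = like * w
    post /= post.sum(axis=1, keepdims=True)
    w = post.mean(axis=0)

k = np.arange(15)
p_mix = poisson.pmf(k[:, None], grid[None, :]) @ w     # marginal mixture pmf
p_emp = np.array([(data == kk).mean() for kk in k])    # empirical proportions
print(np.column_stack([k, p_mix, p_emp]))
```

The smoothing effect of the mixture estimate is most visible in the tail cells, where the empirical proportions are noisy or zero, which is the regime the abstract highlights.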

Relevance: 30.00%

Abstract:

The climate belongs to the class of non-equilibrium forced and dissipative systems, for which most results of quasi-equilibrium statistical mechanics, including the fluctuation-dissipation theorem, do not apply. In this paper we show for the first time how the Ruelle linear response theory, developed for studying rigorously the impact of perturbations on general observables of non-equilibrium statistical mechanical systems, can be applied with great success to analyze the climatic response to general forcings. The crucial value of the Ruelle theory lies in the fact that it allows one to compute the response of the system in terms of expectation values of explicit and computable functions of the phase space, averaged over the invariant measure of the unperturbed state. We choose as test bed a classical version of the Lorenz 96 model, which, in spite of its simplicity, has a well-recognized prototypical value, as it is a spatially extended one-dimensional model and presents the basic ingredients of the actual atmosphere, such as dissipation, advection and the presence of an external forcing. We recapitulate the main aspects of the general response theory and propose some new general results. We then analyze the frequency dependence of the response of both local and global observables to perturbations having localized as well as global spatial patterns. We derive analytically several properties of the corresponding susceptibilities, such as asymptotic behavior, validity of Kramers-Kronig relations, and sum rules, whose main ingredient is the causality principle. We show that all the coefficients of the leading asymptotic expansions as well as the integral constraints can be written as linear functions of parameters that describe the unperturbed properties of the system, such as its average energy. Some newly obtained empirical closure equations for such parameters allow us to express these properties as explicit functions of the unperturbed forcing parameter alone for a general class of chaotic Lorenz 96 models. We then verify the theoretical predictions against the outputs of the simulations to a high degree of precision. The theory is used to explain differences in the response of local and global observables, to define the intensive properties of the system, which do not depend on the spatial resolution of the Lorenz 96 model, and to generalize the concept of climate sensitivity to all time scales. We also show how to reconstruct the linear Green function, which maps perturbations with general time patterns into changes in the expectation value of the considered observable, for finite as well as infinite time. Finally, we propose a simple yet general methodology for studying general climate change problems on virtually any time scale, by resorting only to well-selected simulations and by taking full advantage of ensemble methods. The specific case of the response of the globally averaged surface temperature to a general pattern of change of the CO2 concentration is discussed. We believe that the proposed approach may constitute a mathematically rigorous and practically very effective way to approach the problem of climate sensitivity, climate prediction, and climate change from a radically new perspective.
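For orientation, the sketch below (our simplified stand-in, not the paper's response computations) integrates the Lorenz 96 model dx_i/dt = (x_{i+1} - x_{i-2}) x_{i-1} - x_i + F with RK4 and estimates the static sensitivity of the mean energy to the forcing F by a finite difference of long time averages; the paper instead obtains such responses rigorously from the Ruelle theory.

```python
import numpy as np

def l96(x, F):
    # dx_i/dt = (x_{i+1} - x_{i-2}) * x_{i-1} - x_i + F (periodic indices)
    return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + F

def mean_energy(F, n=36, dt=0.01, steps=50_000, seed=0):
    rng = np.random.default_rng(seed)
    x = F + 0.01 * rng.normal(size=n)      # small perturbation of the fixed point
    burn = steps // 10
    total = 0.0
    for k in range(steps):                 # fourth-order Runge-Kutta
        k1 = l96(x, F)
        k2 = l96(x + 0.5 * dt * k1, F)
        k3 = l96(x + 0.5 * dt * k2, F)
        k4 = l96(x + dt * k3, F)
        x = x + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)
        if k >= burn:                      # discard the transient
            total += 0.5 * np.sum(x**2)
    return total / (steps - burn)

F, dF = 8.0, 0.5
print("d<E>/dF ~", (mean_energy(F + dF) - mean_energy(F)) / dF)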

Relevance: 30.00%

Abstract:

During Oceanic Anoxic Event 1a (OAE 1a, 120 Ma; Li et al., 2008), organic carbon-rich layers were deposited in marine environments under anoxic conditions on a global scale. In this study, the palaeoenvironmental conditions leading to this event are characterised by studying the Upper Barremian to Lower Aptian succession of the Gorgo a Cerbara section (central Italy). For this, an integrated multi-proxy approach (δ13Ccarb; δ13Corg; δ18O; phosphorus; Total Organic Carbon, TOC; bulk-rock mineralogy; and redox-sensitive trace elements, RSTEs) has been applied. During the Late Barremian, thin organic-rich layers occur episodically, and the associated Corg:Ptot ratios indicate the presence of intermittent dysoxic to anoxic conditions. Coarse correlations are observed between TOC, P and biogenic silica contents, indicating links between P availability, productivity, and TOC preservation. However, the corresponding δ13Ccarb and δ18O records remain quite stable, indicating that these brief periods of enhanced TOC preservation did not have sufficient impact on the marine carbon reservoir to deflect the δ13C records. Around the Barremian-Aptian boundary, TOC-enriched layers become more frequent. These layers correlate with negative excursions in the δ13Ccarb and δ13Corg records, possibly due to a warming period as indicated by the δ18O record. During the earliest Aptian, this warming trend is reversed into a cooling trend, which is then followed by an important warming step near the onset of OAE 1a. During this time period, organic-rich intervals occur, which are characterised by a progressive increase in RSTEs. The warming step prior to the onset of OAE 1a is associated with the well-known negative spike in the δ13Ccarb and δ13Corg records, an important peak in P accumulation, RSTE enrichments, and Corg:Ptot ratios indicating the prevalence of anoxic conditions. The Selli Level itself may document a cooling phase. RSTE enrichments and Corg:Ptot ratios confirm the importance of anoxic conditions during OAE 1a at this site. The Gorgo a Cerbara section is interpreted to reflect the progressive impact of palaeoenvironmental change related to the formation of the Ontong Java flood-basalt plateau, which had already started around the Barremian-Aptian boundary and culminated in OAE 1a.

Relevance: 30.00%

Abstract:

Alzheimer's disease (AD) is a progressive neurodegenerative disease characterized in the brain by the formation of amyloid-beta (Aβ)-containing plaques and neurofibrillary tangles containing the microtubule-associated protein tau. Neuroinflammation is another feature of AD and astrocytes are receiving increasing attention as key contributors. Although some progress has been made, the molecular mechanisms underlying the pathophysiology of AD remain unclear. Interestingly, some of the main proteins involved in AD, including amyloid precursor protein (APP) and tau, have recently been shown to be SUMOylated. The post-translational modification by SUMO (small ubiquitin-like modifier) has been shown to regulate APP and tau and may modulate other proteins implicated in AD. Here we present an overview of recent studies suggesting that protein SUMOylation might be involved in the underlying pathogenic mechanisms of AD and discuss how this could be exploited for therapeutic intervention.

Relevance: 30.00%

Abstract:

In this paper we present a novel approach for multispectral image contextual classification by combining iterative combinatorial optimization algorithms. The pixel-wise decision rule is defined using a Bayesian approach to combine two MRF models: a Gaussian Markov Random Field (GMRF) for the observations (likelihood) and a Potts model for the a priori knowledge, to regularize the solution in the presence of noisy data. Hence, the classification problem is stated according to a Maximum a Posteriori (MAP) framework. In order to approximate the MAP solution we apply several combinatorial optimization methods using multiple simultaneous initializations, making the solution less sensitive to the initial conditions and reducing both computational cost and time in comparison to Simulated Annealing, which is often infeasible in many real image processing applications. Markov Random Field model parameters are estimated by the Maximum Pseudo-Likelihood (MPL) approach, avoiding manual adjustments in the choice of the regularization parameters. Asymptotic evaluations assess the accuracy of the proposed parameter estimation procedure. To test and evaluate the proposed classification method, we adopt metrics for quantitative performance assessment (Cohen's Kappa coefficient), allowing a robust and accurate statistical analysis. The obtained results clearly show that combining sub-optimal contextual algorithms significantly improves the classification performance, indicating the effectiveness of the proposed methodology.
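A schematic example of the MAP energy being optimized (ours; ICM is used as a representative of the class of combinatorial optimizers the paper combines, and the image, class means and parameter values are toy choices):

```python
import numpy as np

rng = np.random.default_rng(0)
K, beta = 3, 1.5                       # classes and Potts regularization weight
means = np.array([0.0, 1.0, 2.0])      # toy class means, unit-variance noise

truth = np.repeat(np.arange(K), 400).reshape(30, 40)
img = means[truth] + 0.6 * rng.normal(size=truth.shape)

def energy(lab):
    data = 0.5 * (img - means[lab]) ** 2                 # Gaussian data term
    pairs = (np.sum(lab[1:, :] != lab[:-1, :])
             + np.sum(lab[:, 1:] != lab[:, :-1]))        # Potts disagreements
    return data.sum() + beta * pairs

def icm(lab, sweeps=10):
    # Iterated Conditional Modes: greedy per-pixel minimization of the energy.
    H, W = lab.shape
    for _ in range(sweeps):
        for i in range(H):
            for j in range(W):
                costs = []
                for c in range(K):
                    e = 0.5 * (img[i, j] - means[c]) ** 2
                    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ni, nj = i + di, j + dj
                        if 0 <= ni < H and 0 <= nj < W and lab[ni, nj] != c:
                            e += beta
                    costs.append(e)
                lab[i, j] = int(np.argmin(costs))
    return lab

starts = [rng.integers(0, K, truth.shape) for _ in range(3)]   # multi-start
best = min((icm(s) for s in starts), key=energy)
print("agreement with ground truth:", (best == truth).mean())
```

In the paper the class parameters come from the data and the regularization weight from MPL estimation; here both are fixed by hand for brevity.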

Relevance: 30.00%

Abstract:

We discuss the connection between information and copula theories by showing that a copula can be employed to decompose the information content of a multivariate distribution into marginal and dependence components, with the latter quantified by the mutual information. We define the information excess as a measure of deviation from a maximum-entropy distribution. The idea of marginal-invariant dependence measures is also discussed and used to show that the empirical linear correlation underestimates the amplitude of the actual correlation in the case of non-Gaussian marginals. The mutual information is shown to provide an upper bound for the asymptotic empirical log-likelihood of a copula. An analytical expression for the information excess of T-copulas is provided, allowing for simple model identification within this family. We illustrate the framework with a financial data set.
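As a small numerical illustration (ours; it uses the Gaussian copula, for which the dependence term is simplest, rather than the T-copulas treated analytically in the paper): the mutual information of a Gaussian copula depends only on its correlation matrix R via MI = -(1/2) log det R, and the linear correlation computed on heavy-tailed marginals typically understates the copula-level dependence.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
r = 0.8
R = np.array([[1.0, r], [r, 1.0]])

z = rng.multivariate_normal(np.zeros(2), R, size=100_000)  # Gaussian scores
u = stats.norm.cdf(z)                                      # copula sample
x = stats.t.ppf(u, df=3)                                   # heavy-tailed marginals

mi = -0.5 * np.log(np.linalg.det(R))   # dependence component of the entropy
print("mutual information of the copula:", mi)
print("correlation of the Gaussian scores:", np.corrcoef(z.T)[0, 1])
print("linear correlation with t(3) marginals:", np.corrcoef(x.T)[0, 1])
```

The last line comes out below 0.8, the attenuation the abstract attributes to measuring linear correlation on non-Gaussian marginals.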

Relevance: 30.00%

Abstract:

Parkinson's disease is a clinical syndrome manifesting with slowness and instability. As it is a progressive disease with varying symptoms, repeated assessments are necessary to determine the outcome of treatment changes in the patient. In the recent past, a computer-based method was developed to rate impairment in spiral drawings. The downside of this method is that it cannot separate bradykinetic from dyskinetic spiral drawings. This work aims to construct a computer method that can overcome this weakness by using the Hilbert-Huang Transform (HHT) of the tangential velocity. The work is done under supervised learning, so a target class is used, acquired from a neurologist through a web interface. After reducing the dimension of the HHT features using PCA, classification is performed with a C4.5 classifier. The classification results are close to random guessing, which shows that the computer method is unsuccessful at assessing the cause of drawing impairment in spirals when evaluated against human ratings. One possible reason is that there is no real difference between the two classes of spiral drawings. Another is that the patients' self-ratings were displayed alongside the spirals in the web application, so the neurologist may have relied too heavily on them in his own ratings.
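The pipeline described can be sketched as follows (our code; random placeholders stand in for the HHT features and the neurologist's ratings, and scikit-learn's entropy-based decision tree stands in for C4.5):

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(80, 200))    # placeholder for per-spiral HHT features
y = rng.integers(0, 2, size=80)   # placeholder for the neurologist's ratings

clf = make_pipeline(PCA(n_components=10),
                    DecisionTreeClassifier(criterion="entropy"))
scores = cross_val_score(clf, X, y, cv=5)
print("accuracy: %.2f +/- %.2f" % (scores.mean(), scores.std()))
# With uninformative features/labels the accuracy stays near 0.5, the
# "close to random guessing" level the abstract reports for the real data.
```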