955 results for Conditional entropy


Relevance: 20.00%

Abstract:

Epithelial sodium channels (ENaC) are members of the degenerin/ENaC superfamily of non-voltage-gated, highly amiloride-sensitive cation channels that are composed of three subunits (alpha-, beta-, and gamma-ENaC). Since complete inactivation of the beta- and gamma-ENaC subunit genes (Scnn1b and Scnn1g) leads to early postnatal death, we generated conditional alleles and obtained mice harboring floxed and null alleles for both gene loci. Using quantitative RT-PCR analysis, we showed that the introduction of the loxP sites did not interfere with the mRNA transcript expression levels of the Scnn1b and Scnn1g gene loci. On both a regular and a salt-deficient diet, beta- and gamma-ENaC floxed mice showed no differences in mRNA transcript expression levels, plasma electrolytes, aldosterone concentrations, or weight changes compared with control animals. These mice can now be utilized to dissect the role of ENaC function in classical and nonclassical target organs/tissues.

Relevance: 20.00%

Abstract:

We propose new methods for evaluating predictive densities. The methods include Kolmogorov-Smirnov and Cramér-von Mises-type tests for the correct specification of predictive densities, robust to dynamic mis-specification. The novelty is that the tests can detect mis-specification in the predictive densities even if it appears only over a fraction of the sample, due to the presence of instabilities. Our results indicate that our tests are well sized and have good power in detecting mis-specification in predictive densities, even when it is time-varying. An application to density forecasts of the Survey of Professional Forecasters demonstrates the usefulness of the proposed methodologies.
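The core idea behind such tests can be sketched with the probability integral transform (PIT): under a correctly specified predictive density the PITs are i.i.d. uniform on [0, 1], and a Kolmogorov-Smirnov statistic detects departures. A minimal illustration (the Gaussian predictive density and sample size are assumptions for the sketch, not the paper's setup):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Simulated out-of-sample data: if the predictive density N(0, 1) is
# correctly specified, the PITs below are i.i.d. uniform on [0, 1].
y = rng.normal(0.0, 1.0, size=500)

# Probability integral transform under the candidate predictive density.
pit = stats.norm.cdf(y, loc=0.0, scale=1.0)

# Kolmogorov-Smirnov test of uniformity of the PITs.
ks_stat, p_value = stats.kstest(pit, "uniform")
print(ks_stat, p_value)

# A mis-specified density (wrong scale) is flagged by the same test.
pit_bad = stats.norm.cdf(y, loc=0.0, scale=2.0)
ks_bad, p_bad = stats.kstest(pit_bad, "uniform")
print(ks_bad, p_bad)
```

The paper's contribution is beyond this textbook test: robustness to dynamic mis-specification and power against instabilities confined to a sample fraction.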

Relevance: 20.00%

Abstract:

Many governments in developing countries implement programs that aim to address nutritional failures in early childhood, yet evidence on the effectiveness of these interventions is scant. This paper evaluates the impact of a conditional food supplementation program on child mortality in Ecuador. The Programa de Alimentación y Nutrición Nacional (PANN) 2000 was implemented by regular staff at local public health posts and consisted of offering a free micronutrient-fortified food, Mi Papilla, for children aged 6 to 24 months in exchange for routine health check-ups for the children. Our regression discontinuity design exploits the fact that at its inception, the PANN 2000 was running for about 8 months only in the poorest communities (parroquias) of certain provinces. Our main result is that the presence of the program reduced child mortality in cohorts with 8 months of differential exposure from a level of about 2.5 percent by 1 to 1.5 percentage points.
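The identification strategy is a regression discontinuity at a poverty cutoff: the jump in the outcome at the cutoff estimates the program effect. A minimal sketch of a local-linear RD estimate on synthetic data (the cutoff, bandwidth, effect size, and noise level are all illustrative assumptions, not the paper's numbers):

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic running variable (e.g. a community poverty index) and outcome.
# The true discontinuity at the cutoff is -0.015 (a 1.5-point mortality drop).
n = 4000
poverty = rng.uniform(-1.0, 1.0, size=n)
treated = (poverty >= 0.0).astype(float)     # program offered above the cutoff
mortality = 0.025 - 0.015 * treated + 0.01 * poverty + rng.normal(0, 0.005, n)

# Local-linear fit on each side of the cutoff within a bandwidth h.
h = 0.3
left = (poverty < 0) & (poverty > -h)
right = (poverty >= 0) & (poverty < h)
b_left = np.polyfit(poverty[left], mortality[left], 1)
b_right = np.polyfit(poverty[right], mortality[right], 1)

# RD estimate: difference of the two fitted values at the cutoff.
rd_effect = np.polyval(b_right, 0.0) - np.polyval(b_left, 0.0)
print(rd_effect)
```

The estimate recovers the built-in discontinuity; in the paper the same logic is applied to the parroquia-level eligibility threshold.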

Relevance: 20.00%

Abstract:

Geophysical techniques can help to bridge the inherent gap in spatial resolution and range of coverage that plagues classical hydrological methods, which has led to the emergence of the new and rapidly growing field of hydrogeophysics. Given the differing sensitivities of various geophysical techniques to hydrologically relevant parameters, and their inherent trade-off between resolution and range, the fundamental usefulness of multi-method hydrogeophysical surveys for reducing uncertainties in data analysis and interpretation is widely accepted. A major challenge arising from such endeavors is the quantitative integration of the resulting vast and diverse database into a unified model of the probed subsurface region that is internally consistent with all available data. To address this problem, we have developed a strategy for hydrogeophysical data integration based on Monte-Carlo-type conditional stochastic simulation, which we consider particularly suitable for local-scale studies characterized by high-resolution, high-quality datasets. Monte-Carlo-based optimization techniques are flexible and versatile, can account for a wide variety of data and constraints of differing resolution and hardness, and thus have the potential to provide, in a geostatistical sense, highly detailed and realistic models of the pertinent target parameter distributions. Compared to more conventional approaches of this kind, our approach significantly advances the way in which the larger-scale deterministic information resolved by the hydrogeophysical data is accounted for, an inherently problematic and as yet unresolved aspect of Monte-Carlo-type conditional simulation techniques. We present the results of applying our algorithm to the integration of porosity log and tomographic crosshole georadar data to generate stochastic realizations of the local-scale porosity structure.
Our procedure is first tested on pertinent synthetic data and then applied to corresponding field data collected at the Boise Hydrogeophysical Research Site near Boise, Idaho, USA.
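One building block of conditional stochastic simulation is generating realizations that honor point data (e.g. a porosity log) exactly while following a prescribed covariance model elsewhere. A minimal 1-D Gaussian sketch (the grid, covariance model, and data values are illustrative, not the site-specific ones):

```python
import numpy as np

rng = np.random.default_rng(5)

# 1-D depth grid and an exponential covariance model (illustrative stand-in
# for the site-specific porosity geostatistics).
x = np.linspace(0.0, 10.0, 100)
C = 0.01 * np.exp(-np.abs(x[:, None] - x[None, :]) / 2.0)

# "Log" data: porosity known at a few depths (conditioning points).
idx = np.array([10, 50, 90])
vals = np.array([0.25, 0.32, 0.28])

# Conditional Gaussian simulation: kriging mean plus a simulated residual
# drawn from the conditional covariance.
C_dd = C[np.ix_(idx, idx)]
C_gd = C[:, idx]
mean = 0.30 + C_gd @ np.linalg.solve(C_dd, vals - 0.30)
cov_c = C - C_gd @ np.linalg.solve(C_dd, C_gd.T)
L = np.linalg.cholesky(cov_c + 1e-8 * np.eye(len(x)))  # jitter for stability
realization = mean + L @ rng.normal(size=len(x))

print(realization[idx])  # honors the conditioning data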

Relevance: 20.00%

Abstract:

Asthma prevalence in children and adolescents in Spain is 10-17%. It is the most common chronic illness during childhood. Prevalence has been increasing over the last 40 years, and there is considerable evidence that, among other factors, continued exposure to cigarette smoke results in asthma in children. No statistical or simulation model exists to forecast the evolution of childhood asthma in Europe. Such a model needs to incorporate the main risk factors that can be managed by medical authorities, such as tobacco (OR = 1.44), to establish how they affect the present generation of children. A simulation model for childhood asthma using conditional probability and discrete event simulation was developed and validated by simulating a realistic scenario. The parameters used for the model (input data) were those found in the literature, especially those related to the incidence of smoking in Spain. We also used data from a panel of experts at the Hospital del Mar (Barcelona) related to actual evolution and asthma phenotypes. The results obtained from the simulation established a threshold of a 15-20% smoking population for a reduction in the prevalence of asthma. This is still far from the current level in Spain, where 24% of people smoke. We conclude that more effort must be made to combat smoking and other childhood asthma risk factors in order to significantly reduce the number of cases. Once completed, this simulation methodology can realistically be used to forecast the evolution of childhood asthma as a function of variation in different risk factors.
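The conditional-probability core of such a simulation can be sketched directly from the quoted odds ratio. A toy Monte Carlo (only OR = 1.44 and the 24% vs. 15% smoking rates come from the abstract; the baseline prevalence and cohort size are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(2)

def asthma_prevalence(smoking_rate, p_base=0.10, odds_ratio=1.44, n=1_000_000):
    """Monte Carlo estimate of asthma prevalence for a given smoking rate.

    p_base (asthma probability without smoke exposure) is an illustrative
    assumption; the exposed probability is derived from the tobacco odds
    ratio OR = 1.44 quoted in the abstract.
    """
    odds_exposed = odds_ratio * p_base / (1.0 - p_base)
    p_exposed = odds_exposed / (1.0 + odds_exposed)
    exposed = rng.random(n) < smoking_rate
    p = np.where(exposed, p_exposed, p_base)
    return float(np.mean(rng.random(n) < p))

# Prevalence falls as the smoking population drops from the current 24%
# toward the 15-20% threshold identified by the simulation study.
print(asthma_prevalence(0.24), asthma_prevalence(0.15))
```

The full model additionally uses discrete event simulation to track individual children over time and across asthma phenotypes, which this sketch omits.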

Relevance: 20.00%

Abstract:

Simulated-annealing-based conditional simulations provide a flexible means of quantitatively integrating diverse types of subsurface data. Although such techniques are being increasingly used in hydrocarbon reservoir characterization studies, their potential in environmental, engineering, and hydrological investigations is still largely unexploited. Here, we introduce a novel simulated annealing (SA) algorithm geared towards the integration of high-resolution geophysical and hydrological data which, compared to more conventional approaches, provides significant advancements in the way that large-scale structural information in the geophysical data is accounted for. Model perturbations in the annealing procedure are made by drawing from a probability distribution for the target parameter conditioned to the geophysical data. This is the only place where geophysical information is utilized in our algorithm, in marked contrast to other approaches where model perturbations are made by swapping values in the simulation grid and agreement with soft data is enforced through a correlation coefficient constraint. Another major feature of our algorithm is the way in which available geostatistical information is utilized. Instead of constraining realizations to match a parametric target covariance model over a wide range of spatial lags, we constrain the realizations only at smaller lags, where the available geophysical data cannot provide enough information. Thus we allow the larger-scale subsurface features resolved by the geophysical data to have much more direct control over the output realizations. Further, since the only component of the SA objective function required in our approach is a covariance constraint at small lags, our method has improved convergence and computational efficiency over more traditional methods.
Here, we present the results of applying our algorithm to the integration of porosity log and tomographic crosshole georadar data to generate stochastic realizations of the local-scale porosity structure. Our procedure is first tested on a synthetic data set, and then applied to data collected at the Boise Hydrogeophysical Research Site.
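The essentials of the procedure described above — single-cell perturbations drawn from a distribution conditioned to the soft data, Metropolis acceptance, and a covariance constraint only at small lags — can be sketched in 1-D as follows (the grid, trend, covariance model, and cooling schedule are all illustrative assumptions, not the paper's implementation):

```python
import numpy as np

rng = np.random.default_rng(3)

n = 200
lags = np.arange(1, 6)                    # constrain the covariance only at small lags
target_cov = np.exp(-lags / 3.0)          # assumed target covariance model

# Stand-in for the large-scale structure resolved by geophysical data:
# perturbations are drawn conditioned to this soft information.
trend = np.sin(np.linspace(0.0, 2.0 * np.pi, n))

def small_lag_cov(v):
    vc = v - v.mean()
    return np.array([np.mean(vc[:-k] * vc[k:]) for k in lags]) / vc.var()

def objective(v):
    return float(np.sum((small_lag_cov(v) - target_cov) ** 2))

x = trend + rng.normal(0.0, 0.5, n)       # initial realization
obj = obj_start = objective(x)
temp = 1.0
for _ in range(20000):
    i = rng.integers(n)
    cand = x.copy()
    cand[i] = trend[i] + rng.normal(0.0, 0.5)  # redraw one cell, conditioned to soft data
    obj_cand = objective(cand)
    if obj_cand < obj or rng.random() < np.exp((obj - obj_cand) / temp):
        x, obj = cand, obj_cand                # Metropolis acceptance
    temp *= 0.9997                             # geometric cooling schedule

print(obj_start, obj)
```

Because the objective involves only a handful of small lags, each evaluation is cheap, which is the source of the computational advantage the abstract claims over full-covariance SA.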

Relevance: 20.00%

Abstract:

We answer the following question: given any n ∈ ℕ, what is the minimum number of endpoints e_n of a tree admitting a zero-entropy map f with a periodic orbit of period n? We prove that e_n = s_1 s_2 ⋯ s_k − ∑_{i=2}^{k} s_i s_{i+1} ⋯ s_k, where n = s_1 s_2 ⋯ s_k is the decomposition of n into a product of primes such that s_i ≤ s_{i+1} for 1 ≤ i < k. This gives a criterion for positive topological entropy: if a map f of a tree with e endpoints has a periodic orbit of period m with e_m > e, then the topological entropy of f is positive.
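The formula is easy to evaluate numerically. The helper below (illustrative, not from the paper) computes e_n from the non-decreasing prime decomposition, exactly as stated above:

```python
def prime_factors(n):
    """Non-decreasing list of prime factors of n, with multiplicity."""
    factors, d = [], 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)
    return factors

def endpoints(n):
    """e_n = s_1*...*s_k - sum_{i=2}^{k} s_i*...*s_k for n = s_1*...*s_k."""
    s = prime_factors(n)
    total = 1
    for p in s:
        total *= p
    tail = 0
    for i in range(1, len(s)):      # i = 2..k in the paper's 1-based indexing
        prod = 1
        for p in s[i:]:
            prod *= p
        tail += prod
    return total - tail

# e.g. n = 6 = 2*3 gives e_6 = 6 - 3 = 3; a prime p gives e_p = p.
print(endpoints(6), endpoints(12), endpoints(7))
```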

Relevance: 20.00%

Abstract:

Control of a chaotic system by homogeneous nonlinear driving, when a conditional Lyapunov exponent is zero, may give rise to special and interesting synchronization-like behaviors in which the response evolves in perfect correlation with the drive. Among them are the amplification of the drive attractor and its shift to a different region of phase space. In this paper, these synchronization-like behaviors are discussed and demonstrated by computer simulation of the Lorenz model [E. N. Lorenz, J. Atmos. Sci. 20, 130 (1963)] and the double scroll [T. Matsumoto, L. O. Chua, and M. Komuro, IEEE Trans. CAS CAS-32, 798 (1985)].
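The underlying drive-response construction can be illustrated with the classic x-driven Lorenz subsystem: with Lorenz's standard parameters its conditional Lyapunov exponents are negative, so the response locks onto the drive exactly (the zero-exponent amplification and shift behaviors the paper studies arise in the same framework). The integration scheme and step size below are illustrative:

```python
# Lorenz parameters from Lorenz (1963).
sigma, rho, beta = 10.0, 28.0, 8.0 / 3.0
dt, steps = 0.001, 60000

# Drive system (x, y, z) and response subsystem (yr, zr) driven by x.
x, y, z = 1.0, 1.0, 1.0
yr, zr = -5.0, 20.0                  # response starts far from the drive
for _ in range(steps):
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    dyr = x * (rho - zr) - yr        # same equations, fed the drive's x
    dzr = x * yr - beta * zr
    x, y, z = x + dt * dx, y + dt * dy, z + dt * dz
    yr, zr = yr + dt * dyr, zr + dt * dzr

sync_error = abs(y - yr) + abs(z - zr)
print(sync_error)                    # shrinks toward zero: response synchronizes
```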

Relevance: 20.00%

Abstract:

This paper is concerned with the derivation of new estimators and performance bounds for the problem of timing estimation of (linearly) digitally modulated signals. The conditional maximum likelihood (CML) method is adopted, in contrast to the classical low-SNR unconditional ML (UML) formulation that is systematically applied in the literature for the derivation of non-data-aided (NDA) timing-error detectors (TEDs). A new CML TED is derived and proved to be self-noise free, in contrast to the conventional low-SNR-UML TED. In addition, the paper provides a derivation of the conditional Cramér–Rao bound (CRB), which is higher (less optimistic) than the modified CRB (MCRB) [which is only reached by decision-directed (DD) methods]. It is shown that the CRB is a lower bound on the asymptotic statistical accuracy of the set of consistent estimators that are quadratic with respect to the received signal. Although the obtained bound is not general, it applies to most NDA synchronizers proposed in the literature. A closed-form expression of the conditional CRB is obtained, and numerical results confirm that the CML TED attains the new bound for moderate to high Eg/No.

Relevance: 20.00%

Abstract:

The continuous wavelet transform is obtained as a maximum entropy solution of the corresponding inverse problem. It is well known that although a signal can be reconstructed from its wavelet transform, the expansion is not unique due to the redundancy of continuous wavelets. Hence, the inverse problem has no unique solution. If we want to recognize one solution as "optimal", then an appropriate decision criterion has to be adopted. We show here that the continuous wavelet transform is an "optimal" solution in a maximum entropy sense.

Relevance: 20.00%

Abstract:

A detailed mathematical analysis of the q = 1/2 non-extensive maximum entropy distribution of Tsallis is undertaken. The analysis is based upon the splitting of such a distribution into two orthogonal components. One of the components corresponds to the minimum norm solution of the problem posed by the fulfillment of the a priori conditions on the given expectation values. The remaining component takes care of the normalization constraint and is the projection of a constant onto the null space of the "expectation-values transformation".

Relevance: 20.00%

Abstract:

A regularization method based on the non-extensive maximum entropy principle is devised, with special emphasis on the q = 1/2 case. We show that, when the residual principle is considered as a constraint, the q = 1/2 generalized distribution of Tsallis yields a regularized solution for ill-conditioned problems. The regularized distribution so devised is endowed with a component which corresponds to the well-known regularized solution of Tikhonov (1977).
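For reference, Tikhonov's classical regularized solution mentioned above can be sketched in a few lines (the forward operator, noise level, and regularization parameter are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(4)

# Ill-conditioned forward operator (a smoothing kernel) and noisy data.
n = 50
t = np.linspace(0.0, 1.0, n)
A = np.exp(-80.0 * (t[:, None] - t[None, :]) ** 2)   # nearly rank-deficient
x_true = np.sin(2.0 * np.pi * t)
b = A @ x_true + rng.normal(0.0, 1e-3, n)

# Naive least squares amplifies the noise; Tikhonov damps the unstable
# components:  x_reg = argmin ||A x - b||^2 + lam * ||x||^2
lam = 1e-3
x_naive = np.linalg.lstsq(A, b, rcond=None)[0]
x_reg = np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)

err_naive = np.linalg.norm(x_naive - x_true)
err_reg = np.linalg.norm(x_reg - x_true)
print(err_naive, err_reg)
```

The non-extensive q = 1/2 construction of the abstract contains a component of exactly this Tikhonov form.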

Relevance: 20.00%

Abstract:

A maximum entropy statistical treatment of an inverse problem concerning frame theory is presented. The problem arises from the fact that a frame is an overcomplete set of vectors that defines a mapping with no unique inverse. Although any vector in the concomitant space can be expressed as a linear combination of frame elements, the coefficients of the expansion are not unique. Frame theory guarantees the existence of a set of coefficients which is “optimal” in a minimum norm sense. We show here that these coefficients are also “optimal” from a maximum entropy viewpoint.
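The minimum-norm coefficients guaranteed by frame theory are given by the Moore-Penrose pseudoinverse (analysis with the canonical dual frame). A small numerical illustration using the standard Mercedes-Benz frame for R² (a textbook example, not from the paper):

```python
import numpy as np

# An overcomplete frame for R^2: three unit vectors at 120 degrees.
# Columns of F are the frame vectors; they sum to zero.
F = np.array([[0.0, 1.0],
              [-np.sqrt(3) / 2, -0.5],
              [np.sqrt(3) / 2, -0.5]]).T       # shape (2, 3)

v = np.array([1.0, 2.0])

# Any c with F @ c == v expands v in the frame; the pseudoinverse picks
# the unique minimum-norm coefficient vector.
c_min = np.linalg.pinv(F) @ v

# Another valid expansion: add a null-space vector of F (here [1,1,1],
# since the frame vectors sum to zero). Its norm is strictly larger.
null = np.array([1.0, 1.0, 1.0]) / np.sqrt(3.0)
c_alt = c_min + 0.7 * null

print(np.linalg.norm(c_min), np.linalg.norm(c_alt))
```

The paper's point is that this minimum-norm choice is also the maximum entropy one for the associated inverse problem.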

Relevance: 20.00%

Abstract:

We propose new methods for evaluating predictive densities that focus on the models' actual predictive ability in finite samples. The tests offer a simple way of evaluating the correct specification of predictive densities, either parametric or non-parametric. The results indicate that our tests are well sized and have good power in detecting mis-specification in predictive densities. An empirical application to the Survey of Professional Forecasters and a baseline Dynamic Stochastic General Equilibrium model shows the usefulness of our methodology.