968 results for Poisson-Boltzmann
Abstract:
We include solvation effects in tight-binding Hamiltonians for hole states in DNA. The corresponding linear-response parameters are derived from accurate estimates of solvation energy calculated for several hole charge distributions in DNA stacks. Two models are considered: (A) the correction to a diagonal Hamiltonian matrix element depends only on the charge localized on the corresponding site and (B) in addition to this term, the reaction field due to adjacent base pairs is accounted for. We show that both schemes give very similar results. The effects of the polar medium on the hole distribution in DNA are studied. We conclude that the effects of polar surroundings essentially suppress charge delocalization in DNA, and hole states in (GC)n sequences are localized on individual guanines.
Abstract:
Early detection of breast cancer (BC) with mammography may cause overdiagnosis and overtreatment, detecting tumors which would remain undiagnosed during a lifetime. The aims of this study were: first, to model invasive BC incidence trends in Catalonia (Spain) taking into account reproductive and screening data; and second, to quantify the extent of BC overdiagnosis. We modeled the incidence of invasive BC using a Poisson regression model. Explanatory variables were: age at diagnosis and cohort characteristics (completed fertility rate, percentage of women that use mammography at age 50, and year of birth). This model was also used to estimate the background incidence in the absence of screening. We used a probabilistic model to estimate the expected BC incidence if women in the population used mammography as reported in health surveys. The difference between the observed and expected cumulative incidences provided an estimate of overdiagnosis. Incidence of invasive BC increased, especially in cohorts born from 1940 to 1955. The biggest increase was observed in these cohorts between the ages of 50 and 65 years, where the final BC incidence rates more than doubled the initial ones. Dissemination of mammography was significantly associated with BC incidence and overdiagnosis. Our estimates of overdiagnosis ranged from 0.4% to 46.6% for women born around 1935 and 1950, respectively. Our results support the existence of overdiagnosis in Catalonia attributed to mammography usage, and the limited malignant potential of some tumors may play an important role. Women should be better informed about this risk. Research should be oriented towards personalized screening and risk assessment tools.
Abstract:
In recent years, some epidemiologic studies have attributed adverse effects of air pollutants on health not only to particles and sulfur dioxide but also to photochemical air pollutants (nitrogen dioxide and ozone). The effects are usually small, leading to some inconsistencies in the results of the studies. Furthermore, the different methodologic approaches used across studies have made it difficult to derive generic conclusions. We provide here a quantitative summary of the short-term effects of photochemical air pollutants on mortality in seven Spanish cities involved in the EMECAM project, using generalized additive models from analyses of single and multiple pollutants. Nitrogen dioxide and ozone data were provided by seven EMECAM cities (Barcelona, Gijón, Huelva, Madrid, Oviedo, Seville, and Valencia). Mortality indicators included daily total mortality from all causes excluding external causes, daily cardiovascular mortality, and daily respiratory mortality. Individual estimates, obtained from city-specific generalized additive Poisson autoregressive models, were combined by means of fixed effects models and, if significant heterogeneity among local estimates was found, also by random effects models. Significant positive associations were found between daily mortality (all causes and cardiovascular) and NO2, once the rest of the air pollutants were taken into account. A 10 μg/m3 increase in the 24-hr average 1-day NO2 level was associated with an increase in the daily number of deaths of 0.43% [95% confidence interval (CI), −0.003% to 0.86%] for all causes excluding external. Where relationships were significant, relative risks for cause-specific mortality were nearly twice those for total mortality for all the photochemical pollutants. Ozone was independently related only to cardiovascular daily mortality. No independent statistically significant relationship between photochemical air pollutants and respiratory mortality was found.
The results in this study suggest that, given the present levels of photochemical pollutants, people living in Spanish cities are exposed to health risks derived from air pollution.
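Abstracts like the ones above report mortality increases per 10 μg/m3 derived from the slope of a log-linear Poisson model. As a minimal sketch (not the EMECAM model itself, which also includes smooth covariate and autoregressive terms), the following fits a Poisson regression by iteratively reweighted least squares and converts the pollutant coefficient into a percent increase per 10 μg/m3; the data and all variable names are invented for illustration.

```python
import numpy as np

def fit_poisson(X, y, iters=50):
    """Fit a log-linear Poisson regression by iteratively reweighted
    least squares (IRLS) and return the coefficient vector beta."""
    mu = y + 0.5                     # safe starting values for the means
    eta = np.log(mu)                 # linear predictor on the log scale
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        z = eta + (y - mu) / mu      # working response
        W = mu                       # IRLS weights for Poisson / log link
        beta = np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (W * z))
        eta = X @ beta
        mu = np.exp(eta)
    return beta

# Hypothetical daily series: NO2 levels and death counts whose log-rate
# rises by log(1.0043) per 10 ug/m3, i.e. a 0.43% increase per 10 ug/m3.
rng = np.random.default_rng(42)
n = 20_000
no2 = rng.uniform(20.0, 80.0, n)
true_slope = np.log(1.0043) / 10.0
deaths = rng.poisson(np.exp(3.0 + true_slope * no2)).astype(float)

X = np.column_stack([np.ones(n), no2])
beta = fit_poisson(X, deaths)
pct_per_10 = (np.exp(10.0 * beta[1]) - 1.0) * 100.0   # % increase per 10 ug/m3
```

The conversion in the last line is the general rule: a Poisson slope β on the log scale corresponds to a 100·(exp(10β) − 1) percent change in the rate per 10-unit increase in the exposure.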
Abstract:
The objective of this paper is to introduce a different approach, called the ecological-longitudinal, to carrying out pooled analysis in time series ecological studies. Because it provides a larger number of data points and hence increases the statistical power of the analysis, this approach, unlike conventional ones, accommodates features such as random-effects models, lags, and interactions between pollutants and between pollutants and meteorological variables, which are hard to implement in conventional approaches. Design—The approach is illustrated by providing quantitative estimates of the short-term effects of air pollution on mortality in three Spanish cities, Barcelona, Valencia and Vigo, for the period 1992–1994. Because the dependent variable was a count, a Poisson generalised linear model was first specified. Several modelling issues are worth mentioning. Firstly, because the relations between mortality and explanatory variables were nonlinear, cubic splines were used for covariate control, leading to a generalised additive model (GAM). Secondly, the effects of the predictors on the response were allowed to occur with some lag. Thirdly, the residual autocorrelation, due to imperfect control, was accounted for by means of an autoregressive Poisson GAM. Finally, the longitudinal design demanded consideration of individual heterogeneity, requiring the use of mixed models. Main results—The estimates of the relative risks obtained from the individual analyses varied across cities, particularly those associated with sulphur dioxide. The highest relative risks corresponded to black smoke in Valencia. These estimates were higher than those obtained from the ecological-longitudinal analysis.
Relative risks estimated from this latter analysis were practically identical across cities, 1.00638 (95% confidence interval 1.0002, 1.0011) for a black smoke increase of 10 μg/m3 and 1.00415 (95% CI 1.0001, 1.0007) for an increase of 10 μg/m3 of sulphur dioxide. Because the statistical power is higher than in the individual analyses, more interactions were statistically significant, especially those between air pollutants and meteorological variables. Conclusions—Air pollutant levels were related to mortality in the three cities of the study, Barcelona, Valencia and Vigo. These results were consistent with similar studies in other cities and with other multicentric studies, and coherent with both the previous individual analyses for each city and the multicentric studies for all three cities.
Abstract:
During the last part of the 1990s the chance of surviving breast cancer increased. Changes in survival functions reflect a mixture of effects. Both the introduction of adjuvant treatments and early screening with mammography played a role in the decline in mortality. Evaluating the contribution of these interventions using mathematical models requires survival functions before and after their introduction. Furthermore, the required survival functions may differ by age group and are related to disease stage at diagnosis. Sometimes detailed information is not available, as was the case for the region of Catalonia (Spain). One may then derive the functions using information from other geographical areas. This work presents the methodology used to estimate age- and stage-specific Catalan breast cancer survival functions from scarce Catalan survival data by adapting the age- and stage-specific US functions. Methods: Cubic splines were used to smooth the data and obtain continuous hazard rate functions. Afterwards, we fitted a Poisson model to derive hazard ratios. The model included time as a covariate. The hazard ratios were then applied to US survival functions detailed by age and stage to obtain Catalan estimates. Results: We first estimated the hazard ratios for Catalonia versus the USA before and after the introduction of screening. The hazard ratios were then multiplied by the age- and stage-specific breast cancer hazard rates from the USA to obtain the Catalan hazard rates. We also compared breast cancer survival in Catalonia and the USA in two time periods, before cancer control interventions (USA 1975–79, Catalonia 1980–89) and after (USA and Catalonia 1990–2001). Survival in Catalonia in the 1980–89 period was worse than in the USA during 1975–79, but the differences disappeared in 1990–2001. Conclusion: Our results suggest that access to better treatments and quality of care contributed to large improvements in survival in Catalonia.
In addition, we obtained detailed breast cancer survival functions that will be used for modeling the effect of screening and adjuvant treatments in Catalonia.
Abstract:
The clustering in time (seriality) of extratropical cyclones is responsible for large cumulative insured losses in western Europe, though surprisingly little scientific attention has been given to this important property. This study investigates and quantifies the seriality of extratropical cyclones in the Northern Hemisphere using a point-process approach. A possible mechanism for serial clustering is the time-varying effect of the large-scale flow on individual cyclone tracks. Another mechanism is the generation by one parent cyclone of one or more offspring through secondary cyclogenesis. A long cyclone-track database was constructed for extended October–March winters from 1950 to 2003 using 6-h analyses of 850-mb relative vorticity derived from the NCEP–NCAR reanalysis. A dispersion statistic based on the variance-to-mean ratio of monthly cyclone counts was used as a measure of clustering. It reveals extensive regions of statistically significant clustering in the European exit region of the North Atlantic storm track and over the central North Pacific. Monthly cyclone counts were regressed on time-varying teleconnection indices with a log-linear Poisson model. Five independent teleconnection patterns were found to be significant factors over Europe: the North Atlantic Oscillation (NAO), the east Atlantic pattern, the Scandinavian pattern, the east Atlantic–western Russia pattern, and the polar–Eurasian pattern. The NAO alone is not sufficient for explaining the variability of cyclone counts in the North Atlantic region and western Europe. Rate dependence on time-varying teleconnection indices accounts for the variability in monthly cyclone counts, and a cluster process did not need to be invoked.
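The dispersion statistic used above is simple to compute: it is the variance-to-mean ratio of the monthly counts, which is 1 in expectation for a homogeneous Poisson process and exceeds 1 when events cluster. A minimal sketch with invented monthly counts:

```python
import numpy as np

def dispersion(counts, ddof=1):
    """Variance-to-mean ratio of a sequence of event counts.
    Values near 1 are consistent with a homogeneous Poisson process;
    values well above 1 indicate serial clustering (overdispersion)."""
    counts = np.asarray(counts, dtype=float)
    return counts.var(ddof=ddof) / counts.mean()

# Hypothetical monthly cyclone counts at two grid points:
regular   = [4, 5, 4, 5, 4, 5]      # steady month-to-month activity
clustered = [0, 12, 1, 11, 0, 12]   # bursts of cyclones in some months
```

With these toy series, `dispersion(regular)` falls well below 1 while `dispersion(clustered)` is well above it; in the study, significance of the observed ratio would additionally have to be assessed against the Poisson null.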
Abstract:
White clover (Trifolium repens) is an important pasture legume but is often difficult to sustain in a mixed sward because of, among other things, the damage to roots caused by the soil-dwelling larval stages of S. lepidus. Locating the root nodules on the white clover roots is crucial for the survival of the newly hatched larvae. This paper presents a numerical model to simulate the movement of newly hatched S. lepidus larvae towards the root nodules, guided by a chemical signal released by the nodules. The model is based on the diffusion-chemotaxis equation. Experimental observations showed that the average speed of the larvae remained approximately constant, so the diffusion-chemotaxis model was modified so that the larvae respond only to the gradient direction of the chemical signal but not its magnitude. An individual-based lattice Boltzmann method was used to simulate the movement of individual larvae, and the parameters required for the model were estimated from measurements of larval movement towards nodules in soil scanned using X-ray microtomography. The model was used to investigate the effects of nodule density, the rate of release of the chemical signal, the sensitivity of the larvae to the signal, and the random foraging of the larvae on the movement and subsequent survival of the larvae. The simulations showed that the most significant factors for larval survival were nodule density and the sensitivity of the larvae to the signal. The dependence of larval survival rate on nodule density was well fitted by Michaelis-Menten kinetics. (c) 2005 Elsevier B.V. All rights reserved.
Abstract:
Laboratory measurements of the attenuation and velocity dispersion of compressional and shear waves at appropriate frequencies, pressures, and temperatures can aid interpretation of seismic and well-log surveys as well as indicate absorption mechanisms in rocks. We constructed and calibrated resonant-bar equipment to measure the velocities and attenuations of standing shear and extensional waves in copper-jacketed right cylinders of rock (30 cm in length, 2.54 cm in diameter) in the sonic frequency range and at differential pressures up to 65 MPa. We also measured ultrasonic velocities and attenuations of compressional and shear waves in 50-mm-diameter samples of the rocks at identical pressures. Extensional-mode velocities determined from the resonant bar are systematically too low, yielding unreliable Poisson's ratios. Poisson's ratios determined from the ultrasonic data are frequency corrected and used to calculate the sonic-frequency compressional-wave velocities and attenuations from the shear- and extensional-mode data. We calculate the bulk-modulus loss. The accuracies of the attenuation data (expressed as 1000/Q, where Q is the quality factor) are ±1 for compressional and shear waves at ultrasonic frequency, ±1 for shear waves, and ±3 for compressional waves at sonic frequency. Example sonic-frequency data show that the energy absorption in a limestone is small (Q_P greater than 200 and stress independent) and is primarily due to poroelasticity, whereas that in the two sandstones is variable in magnitude (Q_P ranges from less than 50 to greater than 300 at reservoir pressures) and arises from a combination of poroelasticity and viscoelasticity. A graph of compressional-wave attenuation versus compressional-wave velocity at reservoir pressures differentiates high-permeability (>100 mD, 9.87 × 10^-14 m^2) brine-saturated sandstones from low-permeability (<100 mD, 9.87 × 10^-14 m^2) sandstones and shales.
Abstract:
The International System of Units (SI) is founded on seven base units, the metre, kilogram, second, ampere, kelvin, mole and candela corresponding to the seven base quantities of length, mass, time, electric current, thermodynamic temperature, amount of substance and luminous intensity. At its 94th meeting in October 2005, the International Committee for Weights and Measures (CIPM) adopted a recommendation on preparative steps towards redefining the kilogram, ampere, kelvin and mole so that these units are linked to exactly known values of fundamental constants. We propose here that these four base units should be given new definitions linking them to exactly defined values of the Planck constant h, elementary charge e, Boltzmann constant k and Avogadro constant NA, respectively. This would mean that six of the seven base units of the SI would be defined in terms of true invariants of nature. In addition, not only would these four fundamental constants have exactly defined values but also the uncertainties of many of the other fundamental constants of physics would be either eliminated or appreciably reduced. In this paper we present the background and discuss the merits of these proposed changes, and we also present possible wordings for the four new definitions. We also suggest a novel way to define the entire SI explicitly using such definitions without making any distinction between base units and derived units. We list a number of key points that should be addressed when the new definitions are adopted by the General Conference on Weights and Measures (CGPM), possibly by the 24th CGPM in 2011, and we discuss the implications of these changes for other aspects of metrology.
Abstract:
Two simple and frequently used capture–recapture estimates of the population size are compared: Chao's lower-bound estimate and Zelterman's estimate allowing for contaminated distributions. In the Poisson case it is shown that if there are only counts of ones and twos, the Zelterman estimator is always bounded above by Chao's estimator. If counts larger than two exist, the Zelterman estimator becomes larger than Chao's only if the ratio of the frequency of twos to the frequency of ones is small enough. A similar analysis is provided for the binomial case. For a two-component mixture of Poisson distributions the asymptotic bias of both estimators is derived, and it is shown that the Zelterman estimator can have a large overestimation bias. A modified Zelterman estimator is suggested, and the bias-corrected version of Chao's estimator is also considered. All four estimators are compared in a simulation study.
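For reference, the two estimators being compared have simple closed forms, stated here as commonly given in the capture–recapture literature; the frequencies f1 (individuals seen exactly once) and f2 (seen exactly twice) in the example are hypothetical:

```python
import math

def chao_lower_bound(f1, f2, n):
    """Chao's lower-bound estimate of population size:
    the observed count n plus f1^2 / (2 * f2)."""
    return n + f1 ** 2 / (2 * f2)

def zelterman(f1, f2, n):
    """Zelterman's estimator: the Poisson rate is estimated robustly
    from ones and twos as lam = 2 * f2 / f1, and the zero-truncated
    observed count n is inflated by 1 / (1 - exp(-lam))."""
    lam = 2 * f2 / f1
    return n / (1 - math.exp(-lam))

# Hypothetical frequencies: 10 singletons, 5 doubletons, 2 triples -> n = 17
f1, f2, n = 10, 5, 17
```

With only ones and twos (n = f1 + f2 = 15 here), Zelterman's estimate stays below Chao's, matching the bound stated in the abstract; adding the 2 individuals seen three times can reverse the ordering.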
Abstract:
In this paper we consider the estimation of population size from one-source capture–recapture data, that is, a list in which individuals can potentially be found repeatedly and where the question is how many individuals are missed by the list. As a typical example, we provide data from a 2001 drug user study in Bangkok where the list consists of drug users who repeatedly contact treatment institutions. Drug users with 1, 2, 3, . . . contacts occur, but drug users with zero contacts are not present, requiring the size of this group to be estimated. Statistically, these data can be considered as stemming from a zero-truncated count distribution. We revisit an estimator for the population size suggested by Zelterman that is known to be robust under potential unobserved heterogeneity. We demonstrate that the Zelterman estimator can be viewed as a maximum likelihood estimator for a locally truncated Poisson likelihood which is equivalent to a binomial likelihood. This result allows the extension of the Zelterman estimator by means of logistic regression to include observed heterogeneity in the form of covariates. We also review an estimator proposed by Chao and explain why we are not able to obtain similar results for this estimator. The Zelterman estimator is applied in two case studies, the first a drug user study from Bangkok, the second an illegal immigrant study in the Netherlands. Our results suggest that the new estimator should be used, in particular, if substantial unobserved heterogeneity is present.
Abstract:
None of the current surveillance streams monitoring the presence of scrapie in Great Britain provide a comprehensive and unbiased estimate of the prevalence of the disease at the holding level. Previous work to estimate the under-ascertainment-adjusted prevalence of scrapie in Great Britain applied multiple-list capture–recapture methods. The enforcement of new control measures on scrapie-affected holdings in 2004 has stopped the overlap between surveillance sources and, hence, the application of multiple-list capture–recapture models. Alternative methods, still within the capture–recapture methodology, relying on repeated entries in one single list have been suggested for these situations. In this article, we apply one-list capture–recapture approaches to data held on the Scrapie Notifications Database to estimate the undetected population of scrapie-affected holdings with clinical disease in Great Britain for the years 2002, 2003, and 2004. To do so, we develop a new diagnostic tool for the indication of heterogeneity as well as a new understanding of the Zelterman and Chao lower-bound estimators to account for potential unobserved heterogeneity. We demonstrate that the Zelterman estimator can be viewed as a maximum likelihood estimator for a special, locally truncated Poisson likelihood equivalent to a binomial likelihood. This understanding allows the extension of the Zelterman approach by means of logistic regression to include observed heterogeneity in the form of covariates—in the case studied here, holding size and country of origin. Our results confirm the presence of substantial unobserved heterogeneity, supporting the application of our two estimators. The total scrapie-affected holding population in Great Britain is around 300 holdings per year. None of the covariates appear to inform the model significantly.
Abstract:
The paper concerns the design and analysis of serial dilution assays to estimate the infectivity of a sample of tissue when it is assumed that the sample contains a finite number of indivisible infectious units such that a subsample will be infectious if it contains one or more of these units. The aim of the study is to estimate the number of infectious units in the original sample. The standard approach to the analysis of data from such a study is based on the assumption of independence of aliquots both at the same dilution level and at different dilution levels, so that the numbers of infectious units in the aliquots follow independent Poisson distributions. An alternative approach is based on calculation of the expected value of the total number of samples tested that are not infectious. We derive the likelihood for the data on the basis of the discrete number of infectious units, enabling calculation of the maximum likelihood estimate and likelihood-based confidence intervals. We use the exact probabilities that are obtained to compare the maximum likelihood estimate with those given by the other methods in terms of bias and standard error and to compare the coverage of the confidence intervals. We show that the methods have very similar properties and conclude that for practical use the method that is based on the Poisson assumption is to be recommended, since it can be implemented by using standard statistical software. Finally we consider the design of serial dilution assays, concluding that it is important that neither the dilution factor nor the number of samples that remain untested should be too large.
Abstract:
The problem of estimating the individual probabilities of a discrete distribution is considered. The true distribution of the independent observations is a mixture of a family of power series distributions. First, we ensure identifiability of the mixing distribution assuming mild conditions. Next, the mixing distribution is estimated by non-parametric maximum likelihood and an estimator for individual probabilities is obtained from the corresponding marginal mixture density. We establish asymptotic normality for the estimator of individual probabilities by showing that, under certain conditions, the difference between this estimator and the empirical proportions is asymptotically negligible. Our framework includes Poisson, negative binomial and logarithmic series as well as binomial mixture models. Simulations highlight the benefit in achieving normality when using the proposed marginal mixture density approach instead of the empirical one, especially for small sample sizes and/or when interest is in the tail areas. A real data example is given to illustrate the use of the methodology.