794 results for Estimations
Abstract:
ABSTRACT - Mainland Portugal, like other European countries, was affected by a heat wave of great intensity in the summer of 2003, with effects on the mortality of the population. The excess of deaths associated with the heat wave was estimated by comparing the number of deaths observed between 30 July and 15 August 2003 with the number of deaths expected had the population been exposed to the average mortality rates of the 2000-2001 biennium over the same period. Expected deaths were calculated with adjustment for age. The number of observed deaths (O) exceeded the number expected (E) on every day of the study period, and the overall excess was estimated at 1953 deaths (a relative excess of 43%), of which 1317 (61%) occurred in women and 1742 (89%) in the group aged 75 years and over. At district level, Portalegre had the largest relative increase in deaths (+89%) and Aveiro the smallest (+18%). In a contiguous inland area of the territory (Guarda, Castelo Branco, Portalegre and Évora), relative increases exceeded 80%. In absolute terms, the largest excess of deaths occurred in the district of Lisboa (about 396 more) and in Porto (about 183 more). The causes of death "heat stroke" and "dehydration and other metabolic disorders" showed the highest relative increases (O/E ratios of 70 and 8.65, respectively). The largest absolute increases in the number of deaths occurred in "diseases of the circulatory system" (758 more), "diseases of the respiratory system" (255 more) and "all malignant neoplasms" combined (131 more). In the heat-wave period and the comparison period, the percentages of deaths occurring in hospitals (52% and 56%), at home (32% and 33%) and in "other places" were similar.
The discussion of the factors that conditioned the values presented, concerning excess deaths by sex, age group, district, cause and place of death, supports the conclusion that they are adequate for measuring the order of magnitude and characterizing the effect of the heat wave on mortality. Random error, measured by confidence intervals, and some possible systematic errors associated with the chosen comparison period should not materially affect the estimates.
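The observed-vs-expected calculation described above can be sketched in a few lines: expected deaths (E) come from applying age-specific reference-period death rates to the population at risk, and the excess is O minus E. All figures below are invented for illustration, not the study's data.

```python
# hypothetical age groups: (population at risk, reference death rate for the period)
age_groups = {
    "0-64":  (6_000_000, 0.00010),
    "65-74": (800_000,   0.00150),
    "75+":   (700_000,   0.00450),
}

observed_deaths = 6_500  # hypothetical deaths observed during the heat-wave window

# age-adjusted expected deaths: sum over age groups of population * rate
expected_deaths = sum(pop * rate for pop, rate in age_groups.values())

excess = observed_deaths - expected_deaths
relative_excess = 100 * excess / expected_deaths

print(f"E = {expected_deaths:.0f}, O - E = {excess:.0f}, "
      f"relative excess = {relative_excess:.1f}%")
```

The same arithmetic, run per day, per district, or per cause-of-death group, yields the stratified excess figures reported in the abstract.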
Abstract:
Early visual processing stages have been demonstrated to be impaired in schizophrenia patients and their first-degree relatives. Both the amplitude and the topography of the P1 component of the visual evoked potential (VEP) are affected, the latter indicating alterations in the active brain networks between populations. At least two issues remain unresolved. First, the specificity of this deficit (and its suitability as an endophenotype) has yet to be established, given evidence for impaired P1 responses in other clinical populations. Second, it remains unknown whether schizophrenia patients exhibit intact functional modulation of the P1 VEP component, an aspect that may help distinguish effects specific to schizophrenia. We applied electrical neuroimaging analyses to VEPs from chronic schizophrenia patients and healthy controls in response to variation in the parafoveal spatial extent of stimuli. Healthy controls demonstrated robust modulation of VEP strength and topography as a function of the spatial extent of stimuli during the P1 component. By contrast, no such modulations were evident at early latencies in the responses from patients with schizophrenia. Source estimations localized these deficits to the left precuneus and medial inferior parietal cortex. These findings provide insights into potential underlying low-level impairments in schizophrenia.
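The strength-vs-topography distinction drawn above is typically quantified with two reference-free measures: global field power (GFP) and global map dissimilarity. A minimal sketch of both, with invented electrode values (not data from the study):

```python
import numpy as np

def gfp(v):
    # Global field power: spatial standard deviation of the average-referenced map
    v = v - v.mean()
    return np.sqrt(np.mean(v ** 2))

def diss(u, v):
    # Global map dissimilarity between two GFP-normalized scalp maps (range 0..2);
    # nonzero values indicate different configurations of underlying generators
    u = (u - u.mean()) / gfp(u)
    v = (v - v.mean()) / gfp(v)
    return np.sqrt(np.mean((u - v) ** 2))

# toy "scalp maps" at one latency for two conditions (values invented)
map_a = np.array([1.0, -0.5, 0.2, -0.7, 0.0])
map_b = 2.0 * map_a          # same topography, double the strength
map_c = -map_a               # inverted topography

print(gfp(map_a), gfp(map_b))   # strength differs by a factor of 2
print(diss(map_a, map_b))       # 0: identical topography despite different strength
print(diss(map_a, map_c))       # ~2: maximally different topography
```

Because GFP is blind to topography and dissimilarity is blind to strength, the pair separates "how much" from "where", which is the logic behind localizing the patients' missing modulation to specific generators.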
Abstract:
Accurate perception of the order of occurrence of sensory information is critical for building coherent representations of the external world from ongoing flows of sensory inputs. While some psychophysical evidence indicates that performance on temporal perception can improve, the underlying neural mechanisms remain unresolved. Using electrical neuroimaging analyses of auditory evoked potentials (AEPs), we identified the brain dynamics and mechanism supporting improvements in auditory temporal order judgment (TOJ) during the first vs. the latter half of the experiment. Training-induced changes in brain activity were first evident 43-76 ms post stimulus onset and followed from topographic, rather than pure strength, AEP modulations. Improvements in auditory TOJ accuracy thus followed from changes in the configuration of the underlying brain networks during the initial stages of sensory processing. Source estimations revealed an increase in the lateralization of initially bilateral posterior sylvian region (PSR) responses at the beginning of the experiment to left-hemisphere dominance at its end. Further supporting the critical role of the left and right PSR in auditory TOJ proficiency, responses in the two regions went from being correlated to uncorrelated as the experiment progressed. These collective findings provide insights into the neurophysiologic mechanism and plasticity of temporal processing of sounds and are consistent with models based on spike-timing-dependent plasticity.
Abstract:
An age-structured population model was used to estimate the abundance, biomass, spawning biomass and mean recruitment of the north-central stock of the Peruvian anchoveta between the biological years (October to September) 1962-63 and 2007-08. The model, based on a forward approach, was optimized by minimizing the differences between the model estimates and independent observations of biomass, landings and the age structure of the landings. The results show that there have been three productivity regimes for this stock: the first, between 1962-63 and 1970-71, with the highest mean abundance, biomass, spawning biomass and recruitment; the second, between 1971-72 and 1990-91, with the lowest population levels; and the third, between 1991-92 and 2007-08, with intermediate levels. It seems clear that after the collapse of the 1970s and 1980s the stock recovered significantly, although without reaching the levels of the 1960s. Since 2001-02 the spawning biomass has remained above five million tonnes, and fishing mortality has shown a decreasing trend. The model proved capable of capturing the population dynamics of the north-central anchoveta stock, validating its usefulness for the assessment and monitoring of the anchoveta population.
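The forward approach mentioned above projects numbers-at-age through time from recruitment and mortality. A minimal sketch of such a projection, with invented parameter values (mortalities, weights-at-age, recruitment), not the anchoveta estimates:

```python
import numpy as np

ages, years = 4, 6
M, F = 0.8, 0.6                          # hypothetical natural and fishing mortality (per year)
w = np.array([5.0, 15.0, 25.0, 35.0])    # hypothetical weight-at-age, grams
mean_recruitment = 1.0e9                 # hypothetical age-0 fish entering each year

N = np.zeros((years, ages))              # numbers-at-age matrix
N[0] = mean_recruitment * np.exp(-M * np.arange(ages))  # rough starting state

for t in range(1, years):
    N[t, 0] = mean_recruitment                   # recruitment each year
    N[t, 1:] = N[t - 1, :-1] * np.exp(-(M + F))  # survivors advance one age class

biomass = (N * w).sum(axis=1) / 1e6      # grams -> tonnes
print(f"final-year biomass: {biomass[-1]:,.0f} t")
```

In the actual assessment, parameters like F and recruitment would be free quantities tuned by an optimizer so that model biomass, landings, and catch-at-age match the independent observations.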
Abstract:
Next-generation sequencing (NGS) technologies have become the standard for data generation in population genomics studies, such as the 1000 Genomes Project (1000G). However, these techniques are known to be problematic when applied to highly polymorphic genomic regions, such as the human leukocyte antigen (HLA) genes. Because accurate genotype calls and allele frequency estimations are crucial to population genomics analyses, it is important to assess the reliability of NGS data. Here, we evaluate the reliability of genotype calls and allele frequency estimates of the single-nucleotide polymorphisms (SNPs) reported by 1000G (phase I) at five HLA genes (HLA-A, -B, -C, -DRB1, and -DQB1). We take advantage of the availability of Sanger sequencing of the HLA genes for 930 of the 1092 1000G samples and use it as a gold standard to benchmark the 1000G data. We document that 18.6% of SNP genotype calls in HLA genes are incorrect and that allele frequencies are estimated with an error greater than ±0.1 at approximately 25% of the SNPs in HLA genes. We found a bias toward overestimation of the reference allele frequency in the 1000G data, indicating that mapping bias is an important cause of error in frequency estimation in this dataset. We provide a list of sites with poor allele frequency estimates and discuss the consequences of including those sites in different kinds of analyses. Because the HLA region is the most polymorphic in the human genome, our results provide insights into the challenges of using NGS data at other genomic regions of high diversity.
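The benchmarking logic above reduces to comparing NGS genotype calls against gold-standard genotypes at each SNP, yielding a per-site genotype error rate and a reference-allele frequency error. A sketch with invented genotypes and SNP names:

```python
# gold-standard (Sanger-derived) vs NGS genotype calls per SNP (invented data)
gold = {"snpA": ["AA", "AG", "GG", "AG"], "snpB": ["CC", "CC", "CT", "CT"]}
ngs  = {"snpA": ["AA", "AA", "GG", "AG"], "snpB": ["CC", "CT", "CT", "CT"]}
ref  = {"snpA": "A", "snpB": "C"}        # reference allele at each site

def ref_allele_freq(genotypes, ref_allele):
    # pool all alleles across samples and count the reference allele
    alleles = "".join(genotypes)
    return alleles.count(ref_allele) / len(alleles)

results = {}
for snp in gold:
    # fraction of samples whose NGS genotype disagrees with the gold standard
    error_rate = sum(g != n for g, n in zip(gold[snp], ngs[snp])) / len(gold[snp])
    # signed allele-frequency error; positive = reference allele overestimated
    freq_error = (ref_allele_freq(ngs[snp], ref[snp])
                  - ref_allele_freq(gold[snp], ref[snp]))
    results[snp] = (error_rate, freq_error)
    print(snp, results[snp])
```

A positive `freq_error` at many sites is the signature of the mapping bias the abstract describes: reads carrying the reference allele align more readily, inflating its estimated frequency.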
Abstract:
QUESTIONS UNDER STUDY: Since tumour burden consumes substantial healthcare resources, precise cancer incidence estimations are pivotal for defining the future needs of national healthcare. This study aimed to estimate incidence and mortality rates of oesophageal, gastric, pancreatic, hepatic and colorectal cancers up to 2030 in Switzerland. METHODS: Swiss Statistics provides national incidence and mortality rates of various cancers, together with models of future developments of the Swiss population. Cancer incidence and mortality rates from 1985 to 2009 were analysed to estimate trends and to predict incidence and mortality rates up to 2029. Linear regressions and Joinpoint analyses were performed to estimate future trends in incidence and mortality. RESULTS: Crude incidences of oesophageal, pancreatic, liver and colorectal cancers have steadily increased since 1985 and will continue to increase. Gastric cancer incidence and mortality rates show an ongoing decrease. Crude mortality rates for pancreatic and liver cancer will keep increasing, whereas colorectal cancer mortality will fall. Mortality from oesophageal cancer will plateau or increase minimally. In terms of European population-standardised rates, oesophageal, pancreatic and colorectal cancer incidences are steady, gastric cancer incidence is diminishing, and liver cancer incidence will follow an increasing trend. Standardised mortality rates show a decrease for all but liver cancer. CONCLUSIONS: The oncological burden of gastrointestinal cancer will increase significantly in Switzerland over the next two decades. Crude mortality rates show an ongoing increase overall, except for gastric and colorectal cancers. Expanded healthcare resources will be needed to care properly for these complex patient groups.
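The linear-regression part of the projection method can be sketched simply: fit a linear trend to historical crude rates and extrapolate to the target year. The incidence series below is made up; the study additionally uses Joinpoint analyses to detect breaks in trend, which this sketch omits.

```python
import numpy as np

years = np.arange(1985, 2010)             # historical window, as in the study
rate = 10.0 + 0.15 * (years - 1985)       # invented incidence per 100,000

# ordinary least-squares linear trend
slope, intercept = np.polyfit(years, rate, 1)

# extrapolate the fitted trend to 2029
projected_2029 = intercept + slope * 2029
print(f"projected 2029 incidence: {projected_2029:.1f} per 100,000")
```

In practice the same fit is repeated per cancer site and separately for crude and population-standardised rates, which is why crude and standardised projections can point in different directions when the population is ageing.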
Abstract:
We examined three different diffusion Monte Carlo (DMC) algorithms to study their precision and accuracy in predicting properties of isolated atoms, namely the H atom ground state, the Be atom ground state and the H atom first excited state. All three algorithms (basic DMC, minimal stochastic reconfiguration DMC, and pure DMC, each with future-walking) were successfully implemented for ground-state energy and simple moment calculations, with satisfactory results. Pure diffusion Monte Carlo with the future-walking algorithm proved to be the simplest approach with the least variance. Polarizabilities for the Be atom ground state and the H atom first excited state were not satisfactorily estimated with the infinitesimal differentiation approach. Likewise, an approach using the finite field approximation with an unperturbed wavefunction for the latter system also failed. However, accurate estimates of the α-polarizabilities were obtained by using wavefunctions derived from time-independent perturbation theory. This suggests that the flaw in our approach to polarizability estimation for these difficult cases rests with our having assumed the trial function to be unaffected by infinitesimal perturbations in the Hamiltonian.
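The basic DMC scheme referred to above can be illustrated on a toy system. The sketch below runs basic DMC (no importance sampling, no future-walking) on the 1-D harmonic oscillator, whose exact ground-state energy is 0.5 a.u.; walkers diffuse, branch with weight exp(-dt*(V - E_ref)), and E_ref is steered to keep the population stable. Parameters are illustrative, not those of the atomic calculations.

```python
import numpy as np

rng = np.random.default_rng(0)

def V(x):
    return 0.5 * x ** 2   # harmonic potential (a.u.)

n_target = 2000           # target walker population
dt = 0.01                 # imaginary-time step
x = rng.normal(0.0, 1.0, n_target)   # initial walker positions
E_ref = 0.5
samples = []

for step in range(2000):
    x = x + rng.normal(0.0, np.sqrt(dt), x.size)      # diffusion step
    w = np.exp(-dt * (V(x) - E_ref))                  # branching weights
    copies = (w + rng.random(x.size)).astype(int)     # unbiased stochastic rounding
    x = np.repeat(x, copies)                          # birth/death of walkers
    E_ref += 0.1 * np.log(n_target / x.size)          # population-control feedback
    if step >= 1000:                                  # discard equilibration
        samples.append(E_ref)

E0 = float(np.mean(samples))
print(f"DMC estimate of E0: {E0:.3f} (exact 0.5)")
```

The growth estimator (the average of E_ref) converges to the ground-state energy up to a time-step bias of order dt; the reconfiguration and pure-DMC variants studied in the thesis differ mainly in how they handle this branching step and its variance.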
Abstract:
The present study was carried out to test the hypothesis that photosynthetic bacteria contribute a large portion of the food of filter-feeding zooplankton populations in Crawford Lake, Ontario. The temporal and spatial variations of both groups of organisms are strongly dependent on one another. By using 14C-labelled photosynthetic bacteria, the ingestion and clearance rates of Daphnia pulex, D. rosea, and Keratella spp. were estimated during the summer and fall of 1982. These quantitative estimates of zooplankton ingestion and clearance rates on photosynthetic bacteria are an original addition to the literature. Photosynthetic bacteria comprised a substantial portion of the diet of all four dominant zooplankton species, as evidenced by their ingestion and clearance rates. Ingestion rates of D. pulex and D. rosea ranged from 8.3×10⁵ to 14.6×10⁵ cells·ind⁻¹·hr⁻¹ and from 8.1×10⁵ to 13.9×10⁵ cells·ind⁻¹·hr⁻¹, respectively. Their clearance rates ranged from 0.400 to 1.000 ml·ind⁻¹·hr⁻¹ and from 0.380 to 0.930 ml·ind⁻¹·hr⁻¹. The ingestion and clearance rates of Keratella spp. were 600 cells·ind⁻¹·hr⁻¹ and 0.40 µl·ind⁻¹·hr⁻¹, respectively. Clearance rates were inversely proportional to the concentration of food cells and directly proportional to the body size of the animals. It is believed that despite the very short regeneration times of photosynthetic bacteria (3-8 hours), their population densities were controlled in part by the feeding rates of the dominant zooplankton in Crawford Lake. By considering the regeneration times of photosynthetic bacteria and the population clearance rates of zooplankton, it was estimated that between 16 and 52% and between 11 and 35% of the photosynthetic bacteria were consumed per day by Daphnia pulex and D. rosea, respectively.
The temporal and spatial distribution of Daphnia pulex, D. rosea, Keratella quadrata, K. cochlearis and photosynthetic bacteria in Crawford Lake was also investigated during the period October 1981 to December 1982. The photosynthetic bacteria in the lake constituted a major food source only for those zooplankton which tolerate anaerobic conditions. Changes in temperature and food appeared to correlate with the seasonal changes in zooplankton density. All four dominant species of zooplankton were abundant at the lake's surface (0-4 m) during winter and spring and moved downwards with the thermocline as summer stratification proceeded. Photosynthetic bacteria formed a 2 m thick layer at the chemocline. The position of this photosynthetic bacterial layer changed seasonally: in the summer the bacterial plate moved upwards, and following fall mixing it moved downwards. A vertical shift of 0.8 m (14.5 to 15.3 m) was recorded during the period June to December. The upper limit of the photosynthetic bacteria in the water column was controlled by dissolved oxygen and sulfide concentrations, while their lower limit was controlled by light intensity. A maximum bacteriochlorophyll concentration of 81 mg Bchl·l⁻¹ was recorded on August 9, 1981. The seasonal distribution of photosynthetic bacteria was controlled in part by the grazing of zooplankton; other factors associated with zooplankton grazing were oxygen and sulfide concentrations.
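The filter-feeding bookkeeping behind such estimates is compact: clearance rate is ingestion rate divided by the ambient food-cell concentration, and the community's daily grazing impact is clearance times animal density times 24 h. The sketch below uses the low end of the reported D. pulex ingestion range, but the food concentration and animal density are invented.

```python
ingestion_rate = 8.3e5      # cells per individual per hour (low end of the D. pulex range)
food_conc = 1.0e6           # hypothetical food-cell concentration, cells per ml
clearance = ingestion_rate / food_conc          # ml cleared per individual per hour

density = 0.01              # hypothetical zooplankton density, animals per ml
fraction_grazed_per_day = clearance * density * 24   # fraction of the layer cleared daily

print(f"clearance = {clearance:.2f} ml/ind/h; "
      f"~{100 * fraction_grazed_per_day:.0f}% of the bacterial layer grazed per day")
```

With these illustrative inputs the daily grazing fraction lands around 20%, inside the 16-52% range the study reports for D. pulex; comparing that fraction with the bacteria's 3-8 hour regeneration time is what supports the grazing-control conclusion.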
Abstract:
Chick brain growth factor (CBGF) is a mitogen isolated from embryonic chick brains, thought to have a potential role as a trophic factor involved in nerve-dependent amphibian limb regeneration. In addition, CBGF stimulates 3H-thymidine incorporation in chick embryo brain astrocytes in vitro. In this study, cultured chick embryo brain non-neuronal cells were employed in a bioassay to monitor CBGF activity throughout various stages of its purification. Cell culture and assay conditions were optimized. Non-neuronal cells grew best on collagen-coated culture dishes in complete medium, were most responsive to a growth stimulus [10% fetal bovine serum (FBS)] at the second and third subcultures, and were healthiest when rendered "quiescent" in medium supplemented with 1% FBS. The most effective bioassay conditions consisted of a minimum 14.5-hour "quiescence" time (24 hours was used), a 6-hour "prestimulation" time, and a 24-hour 3H-thymidine labeling time. Four-day subconfluent primary non-neuronal cells consisted of 6.63% GFAP-positive cells; as a result, cultures were thought to be mainly composed of astroblasts. CBGF was purified from 18-day chick embryo brains by ultrafiltration through Amicon PM-30 and YM-2 membranes, size-exclusion chromatography through a Biogel P6 column, and analytical reverse-phase high-performance liquid chromatography (rp-HPLC). The greatest activity resided in rp-HPLC fraction #7 (10 ng/ml), which was as effective as 10% FBS at stimulating 3H-thymidine incorporation in chick embryo brain non-neuronal cells. Although other researchers report the isolation of a mitogenic fraction consisting of 5'-GMP from the embryonic chick brain, UV absorbance spectra, rp-HPLC elution profiles, and fast atom bombardment (FAB) mass spectra indicated that CBGF is neither 5'-GMP nor 5'-AMP. Moreover, commercially available 5'-GMP was inhibitory to 3H-thymidine incorporation in the chick non-neuronal cells, while 5'-AMP had no effect.
Upon treatment with pronase, the biological activity of fraction P6-3 increased; this increase was nearly 30% greater than would be expected from a simple additive effect of any mitogenic activity of pronase alone together with P6-3 alone. This may suggest the presence of an inhibitor protein. The bioactive component may be a protein protected by a nucleoside/nucleotide, or simply a nucleoside/nucleotide acting alone. While the FAB mass spectrum of rp-HPLC fraction #7 did not reveal molecular weight or sequence information, the ion of highest molecular weight was observed at m/z 1610; this is consistent with previous estimations of CBGF's size.
Abstract:
Research on self-controlled KR practice has revealed that giving participants the opportunity to control their KR schedule is superior for motor learning compared to a yoked condition, in which participants replicate the KR schedule of a self-control counterpart without the choice. The purpose of the present experiment was two-fold: first, to examine the utility of a self-controlled KR schedule for learning a spatial motor task in younger and older adults; and second, to determine whether a self-controlled KR schedule facilitates an increased ability to estimate one's performance in retention and transfer. Twenty younger adults and 20 older adults practiced in either the self-control or yoked condition and were required to push and release a slide along a confined pathway with their non-dominant hand to a target distance. The retention data revealed that, as a function of age, a self-controlled KR schedule facilitated superior retention performance and performance estimations in younger adults compared to their yoked counterparts.
Abstract:
Recent studies have shown that providing learners with knowledge of results (KR) after "good trials" rather than "poor trials" is superior for learning. The present study examined whether requiring participants to estimate their three best or three worst trials in a series of six-trial blocks before receiving KR would prove superior for learning compared to not estimating their performance. Participants were required to push and release a slide along a confined pathway with their non-dominant hand to a target distance (133 cm). The retention and transfer data suggest that participants who received KR after good trials demonstrated superior learning and performance estimations compared to those receiving KR after poor trials. The results of the present experiment offer an important theoretical extension of our understanding of the role of KR content and performance estimation in motor skill learning.
Abstract:
This paper studies seemingly unrelated linear models with integrated regressors and stationary errors. By adding leads and lags of the first differences of the regressors and estimating this augmented dynamic regression model by feasible generalized least squares using the long-run covariance matrix, we obtain an efficient estimator of the cointegrating vector that has a limiting mixed normal distribution. Simulation results suggest that this new estimator compares favorably with others already proposed in the literature. We apply these new estimators to the testing of purchasing power parity (PPP) among the G-7 countries. The test based on the efficient estimates rejects the PPP hypothesis for most countries.
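The leads-and-lags augmentation described above can be sketched for a single equation: regress y_t on the integrated regressor x_t plus leads and lags of its first differences, so that the error's correlation with the differences is absorbed. The paper's estimator applies feasible GLS across several such equations using the long-run covariance matrix; plain OLS is used here, and the data are simulated.

```python
import numpy as np

rng = np.random.default_rng(1)
T, k = 400, 2                          # sample size; number of leads/lags
beta = 2.0                             # true cointegrating coefficient

x = np.cumsum(rng.normal(size=T))      # I(1) regressor
dx = np.diff(x)                        # first differences, length T-1
e = rng.normal(size=T)
y = beta * x + e
y[1:] += 0.5 * dx                      # make the error correlated with dx

idx = np.arange(k, T - k - 1)          # sample usable after taking leads and lags
X = np.column_stack([x[idx]] + [dx[idx + j] for j in range(-k, k + 1)])
coef, *_ = np.linalg.lstsq(X, y[idx], rcond=None)
beta_hat = coef[0]
print(f"beta_hat = {beta_hat:.4f} (true 2.0)")
```

Because x is integrated, the estimate of the cointegrating coefficient is super-consistent; the augmentation terms remove the endogeneity that would otherwise bias inference, which is what makes the limiting distribution mixed normal.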
Abstract:
In this paper, we review some recent developments in econometrics that may be of interest to researchers in fields other than economics, and we highlight the particular light that econometrics can shed on certain general themes in methodology and the philosophy of science, such as falsifiability as a criterion of the scientific character of a theory (Popper), the underdetermination of theories by data (Quine), and instrumentalism. In particular, we underscore the contrast between two styles of modelling, the parsimonious approach and the statistical-descriptive approach, and we discuss the links between the theory of statistical testing and the philosophy of science.
Abstract:
This paper develops and estimates a game-theoretical model of inflation targeting where the central banker's preferences are asymmetric around the targeted rate. In particular, positive deviations from the target can be weighted more, or less, severely than negative ones in the central banker's loss function. It is shown that some of the previous results derived under the assumption of symmetry are not robust to the generalization of preferences. Estimates of the central banker's preference parameters for Canada, Sweden, and the United Kingdom are statistically different from the ones implied by the commonly used quadratic loss function. Econometric results are robust to different forecasting models for the rate of unemployment but not to the use of measures of inflation broader than the one targeted.
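Asymmetric central-bank preferences of the kind described above are commonly modelled with a linex-type loss, under which positive deviations of inflation from target are penalized more (for a > 0) than negative ones and the quadratic loss is recovered as a -> 0. The functional form below is a standard illustration, not necessarily the paper's exact specification.

```python
import math

def linex_loss(inflation, target, a):
    # linex loss: (exp(a*d) - a*d - 1) / a^2, with d the deviation from target;
    # a > 0 weights positive deviations more heavily, a < 0 the reverse
    d = inflation - target
    return (math.exp(a * d) - a * d - 1) / a ** 2

above = linex_loss(3.0, 2.0, a=1.0)   # inflation one point above target
below = linex_loss(1.0, 2.0, a=1.0)   # inflation one point below target
print(f"loss above = {above:.3f}, loss below = {below:.3f}")
```

With a = 1, a one-point overshoot costs roughly twice a one-point undershoot, which is the asymmetry that overturns some results derived under the quadratic (symmetric) benchmark.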
Abstract:
A wide range of tests for heteroskedasticity has been proposed in the econometric and statistics literature. Although a few exact homoskedasticity tests are available, the commonly employed procedures are generally based on asymptotic approximations which may not provide good size control in finite samples. A number of recent studies have sought to improve the reliability of common heteroskedasticity tests using Edgeworth, Bartlett, jackknife and bootstrap methods, yet these remain approximate. In this paper, we describe a solution to the problem of controlling the size of homoskedasticity tests in linear regression contexts. We study procedures based on the standard test statistics [e.g., the Goldfeld-Quandt, Glejser, Bartlett, Cochran, Hartley, Breusch-Pagan-Godfrey, White and Szroeter criteria] as well as tests for autoregressive conditional heteroskedasticity (ARCH-type models). We also suggest several extensions of the existing procedures (sup-type and combined test statistics) to allow for unknown breakpoints in the error variance. We exploit the technique of Monte Carlo tests to obtain provably exact p-values for both the standard and the newly suggested tests. We show that the MC test procedure conveniently solves the intractable null distribution problem, in particular the problems raised by the sup-type and combined test statistics as well as (where relevant) unidentified nuisance parameters under the null hypothesis. The proposed method works in exactly the same way with Gaussian and non-Gaussian disturbance distributions [such as heavy-tailed or stable distributions]. The performance of the procedures is examined by simulation.
The Monte Carlo experiments conducted focus on: (1) ARCH, GARCH, and ARCH-in-mean alternatives; (2) the case where the variance increases monotonically with (i) one exogenous variable or (ii) the mean of the dependent variable; (3) grouped heteroskedasticity; and (4) breaks in variance at unknown points. We find that the proposed tests achieve perfect size control and have good power.
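The Monte Carlo test technique at the heart of the paper is simple to state: when the test statistic is pivotal under the null, simulate N independent replications under the null and compute p = (1 + #{S_i >= S_0}) / (N + 1), which yields an exact test for any finite N. A sketch with a toy variance-ratio statistic (not one of the paper's criteria):

```python
import numpy as np

rng = np.random.default_rng(42)

def stat(e):
    # crude two-group statistic: ratio of second-half to first-half residual variance
    h = e.size // 2
    return e[h:].var() / e[:h].var()

n, N = 50, 99                          # sample size; number of MC replications
e_obs = rng.normal(size=n)             # "observed" residuals (null is true here)
S0 = stat(e_obs)

# simulate the null distribution of the statistic
sims = np.array([stat(rng.normal(size=n)) for _ in range(N)])

# exact Monte Carlo p-value
p_value = (1 + np.sum(sims >= S0)) / (N + 1)
print(f"MC p-value = {p_value:.2f}")
```

With N = 99, rejecting when p <= 0.05 has exactly 5% size under the null; because only the ability to simulate the statistic is required, the same recipe handles sup-type statistics and non-Gaussian errors, as emphasized in the abstract.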