855 results for quantifying heteroskedasticity
Abstract:
Counts of Pick bodies (PB), Pick cells (PC), senile plaques (SP), and neurofibrillary tangles (NFT) were made in the frontal and temporal cortex of patients with Pick's disease (PD). Lesions were stained histologically with hematoxylin and eosin (HE) and the Bielschowsky silver impregnation method, and labeled immunohistochemically with antibodies raised to ubiquitin and tau. The greatest numbers of PB were revealed by immunohistochemistry. Counts of PB revealed by ubiquitin and tau were highly positively correlated, which suggested that the two antibodies recognized virtually identical populations of PB. The greatest numbers of PC were revealed by HE, followed by the anti-ubiquitin antibody. However, the correlation between counts was poor, suggesting that HE and ubiquitin revealed different populations of PC. The greatest numbers of SP and NFT were revealed by the Bielschowsky method, indicating the presence of Alzheimer-type lesions not revealed by immunohistochemistry. In addition, more NFT were revealed by the anti-ubiquitin than by the anti-tau antibody. The data suggested that in PD: (i) the anti-ubiquitin and anti-tau antibodies were equally effective at labeling PB; (ii) both HE and anti-ubiquitin should be used to quantitate PC; and (iii) the Bielschowsky method should be used to quantitate SP and NFT.
Abstract:
The pH and counter-ion response of a microphase-separated poly(methyl methacrylate)-block-poly(2-(diethylamino)ethyl methacrylate)-block-poly(methyl methacrylate) hydrogel has been investigated using laser light scattering from an imprinted micron-scale topography. A quartz diffraction grating was used to create a micron-sized periodic structure on the surface of a thin film of the polymer, and the resulting diffraction pattern was used to calculate the swelling ratio of the polymer film in situ. A potentiometric titration and a sequence of counter-ion species, taken from the Hofmeister series, were used to compare the results obtained with this novel technique against small-angle X-ray scattering (nanoscopic) and gravimetric studies of bulk gel pieces (macroscopic). For the first time, the technique has been proven to be an inexpensive and effective analytical tool for measuring hydrogel response on the microscopic scale.
Abstract:
Background: Parkinson's disease (PD) is an incurable neurological disease with approximately 0.3% prevalence. The hallmark symptom is gradual movement deterioration. Current scientific consensus about disease progression holds that symptoms will worsen smoothly over time unless treated. Accurate information about symptom dynamics is of critical importance to patients, caregivers, and the scientific community for the design of new treatments, clinical decision making, and individual disease management. Long-term studies characterize the typical time course of the disease as an early linear progression gradually reaching a plateau in later stages. However, symptom dynamics over durations of days to weeks remain unquantified. Currently, there is a scarcity of objective clinical information about symptom dynamics at intervals shorter than 3 months stretching over several years, but Internet-based patient self-report platforms may change this. Objective: To assess the clinical value of online self-reported PD symptom data recorded by users of the health-focused Internet social research platform PatientsLikeMe (PLM), in which patients quantify their symptoms on a regular basis on a subset of the Unified Parkinson's Disease Rating Scale (UPDRS). By analyzing these data, we aim for a scientific window on the nature of symptom dynamics for assessment intervals shorter than 3 months over durations of several years. Methods: Online self-reported data were validated against the gold-standard Parkinson's Disease Data and Organizing Center (PD-DOC) database, which contains clinical symptom data at intervals greater than 3 months. The data were compared visually using quantile-quantile plots, and numerically using the Kolmogorov-Smirnov test. Using a simple piecewise linear trend estimation algorithm, the PLM data were smoothed to separate random fluctuations from continuous symptom dynamics.
Subtracting the trends from the original data revealed random fluctuations in symptom severity. The average magnitude of fluctuations versus time since diagnosis was modeled using a gamma generalized linear model. Results: Distributions of age at diagnosis and UPDRS in the PLM and PD-DOC databases were broadly consistent. The PLM patients were systematically younger than the PD-DOC patients and showed increased symptom severity in the PD off state. The average fluctuation in symptoms (UPDRS Parts I and II) was 2.6 points at the time of diagnosis, rising to 5.9 points 16 years after diagnosis. These fluctuations exceed the estimated minimal and moderate clinically important differences, respectively. Not all patients conformed to the current clinical picture of gradual, smooth changes: many patients had regimes in which symptom severity varied in an unpredictable manner, or underwent large rapid changes in an otherwise more stable progression. Conclusions: This information about short-term PD symptom dynamics contributes new scientific understanding of disease progression that is currently very costly to obtain without self-administered Internet-based reporting. This understanding should have implications for the optimization of clinical trials of new treatments and for the choice of timescales for treatment decisions.
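The detrending step described above can be sketched as follows. This is a minimal illustration on synthetic data with a known breakpoint, not the authors' algorithm, which must also estimate the breakpoints; the fluctuation magnitude is summarized here as the mean absolute residual.

```python
import numpy as np

def piecewise_linear_detrend(t, y, breakpoints):
    """Fit an independent least-squares line on each segment and
    return the residuals (y minus the fitted piecewise trend)."""
    residuals = np.empty_like(y, dtype=float)
    edges = [t.min()] + list(breakpoints) + [t.max() + 1]
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (t >= lo) & (t < hi)
        coeffs = np.polyfit(t[mask], y[mask], 1)
        residuals[mask] = y[mask] - np.polyval(coeffs, t[mask])
    return residuals

# Synthetic UPDRS-like series: two linear regimes plus random fluctuations
rng = np.random.default_rng(0)
t = np.arange(100.0)
trend = np.where(t < 50, 0.2 * t, 10 + 0.05 * (t - 50))
y = trend + rng.normal(0, 2.0, size=t.size)

resid = piecewise_linear_detrend(t, y, breakpoints=[50.0])
fluctuation = np.mean(np.abs(resid))  # average fluctuation magnitude
```

With Gaussian noise of standard deviation 2, the recovered average fluctuation is close to the theoretical mean absolute deviation of about 1.6.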
Abstract:
The simulated classical dynamics of a small molecule exhibiting self-organizing behavior via a fast transition between two states is analyzed by calculating the statistical complexity of the system. It is shown that the complexity of molecular descriptors such as atom coordinates and dihedral angles has different values before and after the transition. This provides a new tool for identifying metastable states during molecular self-organization. The highly concerted collective motion of the molecule is revealed. The dynamics of low-dimensional subspaces is found to be sensitive to the processes in the whole, high-dimensional phase space of the system. © 2004 Wiley Periodicals, Inc.
Abstract:
The sheer volume of citizen weather data collected and uploaded to online data hubs is immense. However, as with any citizen data, it is difficult to assess the accuracy of the measurements. Within this project we quantify just how much data is available, where it comes from, the frequency at which it is collected, and the types of automatic weather stations being used. We also list the numerous possible sources of error and uncertainty within citizen weather observations before showing evidence of such effects in real data. A thorough intercomparison field study was conducted, testing popular models of citizen weather stations. From this study we were able to parameterise key sources of bias. Most significantly, the project develops a complete quality control system through which citizen air temperature observations can be passed. The structure of this system was heavily informed by the results of the field study. Using a Bayesian framework, the system learns and updates its estimates of the calibration and radiation-induced biases inherent to each station. We then show the benefit of correcting for these learnt biases over using the original uncorrected data. The system also attaches an uncertainty estimate to each observation, giving real-world applications that choose to incorporate such observations a measure on which to base their confidence in the data. The system relies on interpolated temperature and radiation observations from neighbouring professional weather stations, for which a Bayesian regression model is used. We recognise some of the assumptions and flaws of the developed system and suggest further work that needs to be done to bring it to an operational setting. Such a system will hopefully allow applications to leverage the additional value citizen weather data brings to longstanding professional observing networks.
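The bias-learning idea can be sketched with a conjugate-normal Bayesian update. The station readings, reference values, and variances below are hypothetical, and the real system described above uses a fuller Bayesian regression over neighbouring professional stations; this only illustrates how a per-station calibration bias estimate is updated and how an uncertainty is attached to each corrected observation.

```python
import math

class BiasEstimator:
    """Conjugate-normal estimate of a station's additive calibration bias.

    Each observation is compared to a reference temperature (e.g. an
    interpolation from nearby professional stations); the difference is
    modeled as bias + Gaussian noise with known variance.
    """
    def __init__(self, prior_mean=0.0, prior_var=4.0, noise_var=1.0):
        self.mean = prior_mean        # posterior mean of the bias (degC)
        self.var = prior_var          # posterior variance of the bias
        self.noise_var = noise_var    # observation noise variance

    def update(self, observed, reference):
        # Standard Gaussian conjugate update on the observed difference
        diff = observed - reference
        precision = 1.0 / self.var + 1.0 / self.noise_var
        self.mean = (self.mean / self.var + diff / self.noise_var) / precision
        self.var = 1.0 / precision

    def correct(self, observed):
        """Bias-corrected value and its standard uncertainty."""
        return observed - self.mean, math.sqrt(self.var + self.noise_var)

# Hypothetical citizen readings paired with interpolated reference values
est = BiasEstimator()
for obs, ref in [(21.8, 20.0), (22.1, 20.2), (23.0, 21.1)]:
    est.update(obs, ref)
corrected, sigma = est.correct(22.4)
```

As more observations arrive the posterior variance of the bias shrinks, so the attached uncertainty approaches the irreducible observation noise.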
Abstract:
This dissertation research project addressed the question of how hydrologic restoration of the Everglades is impacting the nutrient dynamics of marsh ecosystems in the southern Everglades. These effects were analyzed by quantifying nitrogen (N) cycle dynamics in the region. I utilized stable isotope tracer techniques to investigate nitrogen uptake and cycling between the major ecosystem components of the freshwater marsh system. I recorded the natural isotopic signatures (δ15N and δ13C) for major ecosystem components from the three major watersheds of the Everglades: Shark River Slough, Taylor Slough, and C-111 basin. Analysis of δ15N and δ13C natural abundance data was used to demonstrate the spatial extent to which nitrogen from anthropogenic or naturally enriched sources is entering the marshes of the Everglades. In addition, I measured the fluxes of N between various ecosystem components at both near-canal and estuarine ecotone locations. Lastly, I investigated the effect of three phosphorus load treatments (0.00 mg P m^-2, 6.66 mg P m^-2, and 66.6 mg P m^-2) on the rate and magnitude of ecosystem N-uptake and N-cycling. The δ15N and δ13C natural abundance data supported the hypothesis that ecosystem components from near-canal sites have heavier, more enriched δ15N isotopic signatures than downstream sites. The natural abundance data also showed that the marshes of the southern Everglades are acting as a sink for isotopically heavier, canal-borne dissolved inorganic nitrogen (DIN) and a source for "new" marsh-derived dissolved organic nitrogen (DON). In addition, the 15N mesocosm data showed the rapid assimilation of the 15N tracer by the periphyton component and the delayed N uptake by soil and macrophyte components in the southern Everglades.
Abstract:
A deep understanding of protein folding dynamics can be gained by quantifying the folding landscape, that is, by calculating how the number of microscopic configurations (entropy) varies with the energy of the chain, Ω = Ω(E). Because of the incredibly large number of microstates available to a protein, direct enumeration of Ω(E) is not possible in realistic computer simulations. An estimate of Ω(E) can be obtained through a combination of statistical mechanics and thermodynamics. By combining different definitions of entropy that are valid for a system whose probability of occupying a state is given by the canonical Boltzmann probability, computers allow the determination of Ω(E). The energy landscapes of two similar, but not identical, model proteins were studied. One protein contains no kinetic traps; results for it show a smooth folding funnel, allowing the contours of the funnel to be determined. Results were also presented for the folding landscape of a modified protein with kinetic traps. The final results show that the computational approach is able to distinguish and explore regions of the folding landscape that are due to kinetic traps, as opposed to the native-state folding funnel.
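One standard route from canonical sampling to Ω(E), which may differ in detail from the approach taken in this dissertation, is single-histogram reweighting: the energy histogram collected at temperature T is proportional to the density of states times the Boltzmann factor, so the logarithm of Ω(E) follows directly.

```latex
% Canonical sampling at temperature T yields an energy histogram
%   H_T(E) \propto \Omega(E)\, e^{-E/k_B T},
% so, up to an additive constant,
\ln \Omega(E) = \ln H_T(E) + \frac{E}{k_B T} + \mathrm{const}.
```

Histograms from several overlapping temperatures can then be stitched together to cover the full energy range of the landscape.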
Abstract:
Groundwater systems of different densities are often mathematically modeled to understand and predict environmental behavior such as seawater intrusion or submarine groundwater discharge. Additional data collection may be justified if it will cost-effectively aid in reducing the uncertainty of a model's prediction. The collection of salinity as well as temperature data could aid in reducing predictive uncertainty in a variable-density model. However, before numerical models can be created, rigorous testing of the modeling code needs to be completed. This research documents the benchmark testing of a new modeling code, SEAWAT Version 4. The benchmark problems include various combinations of density-dependent flow resulting from variations in concentration and temperature. The verified code, SEAWAT, was then applied to two different hydrological analyses to explore the capacity of a variable-density model to guide data collection. The first analysis tested a linear method to guide data collection by quantifying the contribution of different data types and locations toward reducing predictive uncertainty in a nonlinear variable-density flow and transport model. The relative contributions of temperature and concentration measurements, at different locations within a simulated carbonate platform, for predicting movement of the saltwater interface were assessed. Results from the method showed that concentration data had greater worth than temperature data in reducing predictive uncertainty in this case. Results also indicated that a linear method could be used to quantify data worth in a nonlinear model. The second hydrological analysis utilized a model to identify the transient response of the salinity, temperature, age, and amount of submarine groundwater discharge to changes in tidal ocean stage, seasonal temperature variations, and different types of geology.
The model was compared to multiple kinds of data to (1) calibrate and verify the model, and (2) explore the potential for the model to be used to guide the collection of data using techniques such as electromagnetic resistivity, thermal imagery, and seepage meters. Results indicated that the model can be used to give insight into submarine groundwater discharge and to guide data collection.
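The linear data-worth idea can be sketched with a first-order (FOSM-style) computation: the worth of a candidate measurement is the reduction in predictive variance obtained by adding its sensitivity row to the observation Jacobian. The Jacobian, sensitivity vector, and variances below are hypothetical, not taken from the SEAWAT analyses.

```python
import numpy as np

def predictive_variance(J, obs_var, prior_var, y):
    """First-order predictive variance, given the Jacobian J of the
    observations w.r.t. the parameters, observation variances, diagonal
    prior parameter variances, and the prediction sensitivity vector y."""
    Q = np.diag(1.0 / np.asarray(obs_var))          # observation weights
    Cp_inv = np.diag(1.0 / np.asarray(prior_var))   # prior precision
    post_cov = np.linalg.inv(J.T @ Q @ J + Cp_inv)  # posterior parameter cov.
    return float(y @ post_cov @ y)

# Hypothetical 2-parameter model with a base network of 3 observations
J_base = np.array([[1.0, 0.2], [0.5, 0.8], [0.1, 1.0]])
y = np.array([1.0, -0.5])  # sensitivity of the interface-position prediction
base = predictive_variance(J_base, [1.0] * 3, [10.0, 10.0], y)

# Worth of a candidate measurement = predictive-variance reduction
J_new = np.vstack([J_base, [0.9, 0.1]])
with_new = predictive_variance(J_new, [1.0] * 4, [10.0, 10.0], y)
worth = base - with_new
```

Ranking candidate concentration and temperature measurements by this variance reduction is what allows the linear method to prioritize data collection even around a nonlinear model.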
Abstract:
We examined the high-resolution temporal dynamics of recovery of dried periphyton crusts following rapid rehydration in a phosphorus (P)-limited, short-hydroperiod Everglades wetland. Crusts were incubated in a greenhouse in tubs containing water with no P or exogenous algae to mimic the onset of the wet season in the natural marsh, when heavy downpours containing very low P flood the dry wetland. Algal and bacterial productivity were tracked for 20 days and related to compositional changes and P dynamics in the water. A portion of the original crusts was also used to determine how much TP could be released if no biotic recovery occurred. Composition was volumetrically dominated by cyanobacteria (90%), containing morphotypes typical of xeric environments. Algal and bacterial production recovered immediately upon rehydration, but there was a net TP loss from the crusts to the water in the first 2 days. By day 5, however, cyanobacteria and other bacteria had re-absorbed 90% of the released P. Subsequently, water TP concentration reached a steady-state level of 6.6 μg TP/L despite concentration of the water through evaporation. Phosphomonoesterase (PMEase) activity was very high during the first day after rehydration due to the release of a large pre-existing pool of extracellular PMEase. Thereafter, the activity dropped by 90% and then increased gradually from this low level. The fast recovery of desiccated crusts upon rehydration required no exogenous P or allogenous algae/bacteria additions, and periphyton largely controlled the P concentration in the water.
Abstract:
The purpose of this study is to explore the accuracy of the Input-Output model in quantifying the impacts of the 2007 economic crisis on a local tourism industry and economy. Though the model has been used in tourism impact analysis, its estimation accuracy is rarely verified empirically. The Metro Orlando area in Florida is investigated as an empirical study, and the negative change in visitor expenditure between 2007 and 2008 is taken as the direct shock. The total impacts are assessed in terms of output and employment and are compared with the actual data. This study finds that there are surprisingly large discrepancies between the estimated and actual results, and the Input-Output model appears to overestimate the negative impacts. By investigating the local economic activities during the study period, this study makes some exploratory efforts at explaining such discrepancies. Theoretical and practical implications are then suggested.
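The Input-Output accounting behind such an impact estimate can be sketched with a Leontief inverse. The technical-coefficient matrix and shock below are illustrative, not Metro Orlando data; the point is that the model mechanically propagates a direct expenditure shock into a larger total impact, which is why it can overestimate when real economies adjust.

```python
import numpy as np

# Hypothetical 3-sector technical-coefficient matrix A (tourism, retail,
# services); A[i, j] = input from sector i needed per unit output of j.
A = np.array([
    [0.10, 0.05, 0.02],
    [0.15, 0.20, 0.10],
    [0.05, 0.10, 0.15],
])

# Direct shock: a drop in visitor expenditure hitting the tourism sector
direct = np.array([-100.0, 0.0, 0.0])  # $ millions, illustrative

# Total output impact via the Leontief inverse (I - A)^-1
leontief = np.linalg.inv(np.eye(3) - A)
total = leontief @ direct
multiplier = total.sum() / direct.sum()  # output multiplier > 1
```

The total impact always exceeds the direct shock in magnitude because the Leontief inverse adds every round of inter-industry spending with fixed coefficients and no substitution or price response.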
Abstract:
Ships and offshore structures that encounter ice floes tend to experience loads with varying pressure distributions within the contact patch. The surrounding ice adjacent to that which is involved in the contact zone has an influence on the effective strength; this effect has come to be called confinement. A methodology for quantifying ice sample confinement is developed, and the confinement is defined using two non-dimensional terms: a ratio of geometries and an angle. Together these terms are used to modify force predictions to account for increased fracturing and spalling at lower confinement levels. Data developed through laboratory experimentation are studied using dimensional analysis. The characteristics of dimensional analysis allow for easy comparison between many different load cases, provided the impact scenario is consistent. In all, a methodology is developed for analyzing ice impact testing considering confinement effects on force levels, with the potential for extrapolating these tests to full-size collision events.
Abstract:
This doctoral thesis was conceived with the aim of understanding, analyzing, and above all modeling the statistical behavior of financial time series. In this respect, the models that best capture the special characteristics of these series are conditional heteroskedasticity models: in discrete time, when the intervals at which the data are collected permit it, and in continuous time when daily or intraday data are available. To this end, this thesis proposes several Bayesian estimators for the parameters of the discrete-time GARCH model (Bollerslev (1986)) and the continuous-time COGARCH model (Kluppelberg et al. (2004)). Chapter 1 introduces the characteristics of financial series and presents the ARCH, GARCH, and COGARCH models, together with their main properties. Mandelbrot (1963) noted that financial series are not stationary and that their increments show no autocorrelation, although their squares are correlated. He also pointed out that their volatility is not constant and that volatility clusters appear. He observed the lack of normality of financial series, due mainly to their leptokurtic behavior, and also highlighted the seasonal effects these series exhibit, analyzing how they are affected by the time of year or the day of the week. Later, Black (1976) completed the list of special characteristics by including the so-called leverage effects, related to how positive and negative fluctuations in asset prices affect the volatility of the series differently.
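The GARCH(1,1) recursion of Bollerslev (1986) and the stylized facts noted by Mandelbrot (nearly uncorrelated returns, but positively autocorrelated squared returns, i.e. volatility clustering) can be illustrated with a short simulation; the parameter values below are illustrative, not estimates from the thesis.

```python
import numpy as np

def simulate_garch11(n, omega=0.05, alpha=0.1, beta=0.85, seed=0):
    """Simulate a GARCH(1,1) process:
        r_t = sigma_t * z_t,  z_t ~ N(0, 1)
        sigma_t^2 = omega + alpha * r_{t-1}^2 + beta * sigma_{t-1}^2
    """
    rng = np.random.default_rng(seed)
    r = np.zeros(n)
    # Start at the unconditional variance omega / (1 - alpha - beta)
    sigma2 = np.full(n, omega / (1 - alpha - beta))
    for t in range(1, n):
        sigma2[t] = omega + alpha * r[t - 1] ** 2 + beta * sigma2[t - 1]
        r[t] = np.sqrt(sigma2[t]) * rng.standard_normal()
    return r

def lag1_autocorr(x):
    x = x - x.mean()
    return float(np.dot(x[:-1], x[1:]) / np.dot(x, x))

r = simulate_garch11(5000)
ac_r = lag1_autocorr(r)        # close to zero: returns uncorrelated
ac_r2 = lag1_autocorr(r ** 2)  # clearly positive: volatility clusters
```

The contrast between the two autocorrelations is exactly the feature that motivates conditional-heteroskedasticity modeling of financial series.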
Abstract:
Acknowledgement: This work is funded by the National Science Center Poland based on decision number DEC-2015/16/T/ST8/00516. PB is supported by the Foundation for Polish Science (FNP).
Abstract:
10 pages, 5 figures. Acknowledgments: LK and JCS were supported by the Blue Brain Project. P.D. and R.L. were supported in part by the Blue Brain Project and by the start-up grant of KH. Partial support for P.D. has been provided by the Advanced Grant of the European Research Council GUDHI (Geometric Understanding in Higher Dimensions). MS was supported by the SNF NCCR "Synapsy".