926 results for Multivariate Statistical Process Monitoring
Abstract:
In this work we analyze how patchy distributions of CO2 and brine within sand reservoirs may lead to significant attenuation and velocity dispersion effects, which in turn may have a profound impact on surface seismic data. The ultimate goal of this paper is to contribute to the understanding of these processes within the framework of the seismic monitoring of CO2 sequestration, a key strategy to mitigate global warming. We first carry out a Monte Carlo analysis to study the statistical behavior of attenuation and velocity dispersion of compressional waves traveling through rocks with properties similar to those of the Utsira Sand, Sleipner field, containing quasi-fractal patchy distributions of CO2 and brine. These results show that the mean patch size and CO2 saturation play key roles in the observed wave-induced fluid flow effects. The latter can be remarkably important when CO2 concentrations are low and mean patch sizes are relatively large. To analyze these effects on the corresponding surface seismic data, we perform numerical simulations of wave propagation considering reservoir models and CO2 accumulation patterns similar to the CO2 injection site in the Sleipner field. These numerical experiments suggest that wave-induced fluid flow effects may produce changes in the reservoir's seismic response, significantly modifying the main seismic attributes usually employed in the characterization of these environments. Consequently, the determination of the nature of the fluid distributions, as well as the proper modeling of the seismic data, constitute important aspects that should not be ignored in the seismic monitoring of CO2 sequestration problems.
Abstract:
One of the most important statistical tools for monitoring and analyzing the evolution of economic activity in the short term is the availability of estimates of the quarterly evolution of the components of GDP, on both the supply and the demand side. The need to have this information available with a short time lag makes it essential to use temporal disaggregation methods that break annual data down into quarterly figures. The most widely applied method, since it solves this problem very elegantly under a statistical optimal-estimator approach, is the Chow-Lin method. However, this method does not guarantee that the quarterly GDP estimates on the supply side and on the demand side coincide, making it necessary to subsequently apply some reconciliation method. This paper develops a multivariate extension of the Chow-Lin method that solves the problem of estimating the quarterly values optimally, subject to a set of constraints. One of the potential applications of this method, which we have called the restricted Chow-Lin method, is precisely the joint estimation of quarterly values for each of the GDP components on both the demand and the supply side, conditioned on both quarterly GDP estimates being equal, thus avoiding the need to subsequently apply reconciliation methods.
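As context for the multivariate extension described in this abstract, a minimal sketch of the classic univariate Chow-Lin distribution step, assuming a fixed AR(1) parameter rho rather than the maximum-likelihood estimate used in practice; all variable names are illustrative:

```python
import numpy as np

def chow_lin(y_annual, x_quarterly, rho=0.9):
    """Distribute an annual series to quarters using a quarterly
    indicator (classic Chow-Lin with a fixed AR(1) parameter rho).
    Returns quarterly estimates whose annual sums match y_annual."""
    n = len(y_annual)
    m = 4 * n
    X = np.column_stack([np.ones(m), x_quarterly])   # intercept + indicator
    # Aggregation matrix: each row sums the four quarters of one year
    C = np.kron(np.eye(n), np.ones((1, 4)))
    # AR(1) covariance of the quarterly disturbances
    idx = np.arange(m)
    V = rho ** np.abs(idx[:, None] - idx[None, :])
    W = np.linalg.inv(C @ V @ C.T)
    CX = C @ X
    # GLS estimate of beta from the aggregated (annual) model
    beta = np.linalg.solve(CX.T @ W @ CX, CX.T @ W @ y_annual)
    resid_annual = y_annual - CX @ beta
    # Distribute annual residuals to quarters via the covariance structure
    return X @ beta + V @ C.T @ W @ resid_annual
```

Note that the aggregation constraint holds exactly: summing the returned quarterly estimates over each year reproduces the annual series, which is the property the restricted multivariate version extends to cross-sectional (supply = demand) constraints.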
Abstract:
The Iowa EHDI High-Risk Monitoring Protocol is based on the Joint Committee on Infant Hearing 2007 position statement. Emphasis is placed on follow-up as deemed appropriate by the primary health care provider and audiologist. The Iowa protocol describes the follow-up process for children with risk factors.
Abstract:
The fast simultaneous hadronization and chemical freeze-out of supercooled quark-gluon plasma, created in relativistic heavy ion collisions, can lead to the reheating of the expanding matter and to the change in a collective flow profile. We use the assumption of statistical nature of the hadronization process, and study quantitatively the freeze-out in the framework of hydrodynamical Bjorken model with different simple quark-gluon plasma equations of state.
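As a numerical reference point for the hydrodynamical framework mentioned above, the baseline Bjorken evolution with an ultrarelativistic ideal-gas equation of state (not the modified equations of state studied in the paper) can be sketched as follows; the initial values are illustrative only:

```python
import numpy as np

def bjorken_temperature(tau, tau0=1.0, T0=300.0):
    """Temperature evolution in ideal boost-invariant Bjorken flow
    with an ultrarelativistic ideal-gas equation of state (p = eps/3),
    where entropy conservation s*tau = const gives T ~ tau^(-1/3).
    tau and tau0 in fm/c, T0 in MeV (illustrative values only)."""
    tau = np.asarray(tau, dtype=float)
    return T0 * (tau0 / tau) ** (1.0 / 3.0)
```

For example, with these illustrative values the plasma cools from 300 MeV at tau = 1 fm/c to half that temperature by tau = 8 fm/c, since (1/8)^(1/3) = 1/2.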
Roadway Lighting and Safety: Phase II – Monitoring Quality, Durability and Efficiency, November 2011
Abstract:
This Phase II project follows a previous project titled Strategies to Address Nighttime Crashes at Rural, Unsignalized Intersections. Based on the results of the previous study, the Iowa Highway Research Board (IHRB) indicated interest in pursuing further research to address the quality of lighting, rather than just the presence of light, with respect to safety. The research team supplemented the literature review from the previous study, specifically addressing lighting level in terms of measurement, the relationship between light levels and safety, and lamp durability and efficiency. The Center for Transportation Research and Education (CTRE) teamed with a national research leader in roadway lighting, Virginia Tech Transportation Institute (VTTI) to collect the data. An integral instrument to the data collection efforts was the creation of the Roadway Monitoring System (RMS). The RMS allowed the research team to collect lighting data and approach information for each rural intersection identified in the previous phase. After data cleanup, the final data set contained illuminance data for 101 lighted intersections (of 137 lighted intersections in the first study). Data analysis included a robust statistical analysis based on Bayesian techniques. Average illuminance, average glare, and average uniformity ratio values were used to classify quality of lighting at the intersections.
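As an illustration of one of the lighting-quality measures mentioned above, a minimal sketch of an average-to-minimum uniformity ratio computed from illuminance readings; the study's actual classification thresholds are not reproduced here:

```python
import numpy as np

def uniformity_ratio(illuminance):
    """Average-to-minimum uniformity ratio for a set of illuminance
    readings (lux) at one intersection; values closer to 1 indicate
    more uniform lighting."""
    e = np.asarray(illuminance, dtype=float)
    return e.mean() / e.min()
```

For readings of 10, 20 and 30 lux the ratio is 20/10 = 2.0; a single dark spot among otherwise bright readings drives the ratio up, which is why uniformity complements average illuminance as a quality measure.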
Abstract:
Peroxisome proliferator-activated receptors (PPARs) are a potential target for neuroprotection in focal ischemic stroke. These nuclear receptors have major effects in lipid metabolism, but they are also involved in inflammatory processes. Three PPAR isotypes have been identified: alpha, beta (or delta) and gamma. The development of PPAR transgenic mice offers a promising tool for prospective therapeutic studies. This study used MRI to assess the role of PPARalpha and PPARbeta in the development of stroke. Permanent middle cerebral artery occlusion induced focal ischemia in wild-type, PPARalpha-null and PPARbeta-null mice. T(2)-weighted MRI was performed on a 7 T scanner on days 0, 1, 3, 7 and 14 to monitor lesion growth in the various genotypes. General Linear Model statistical analysis found a significant difference in lesion volume between wild-type and PPAR-null mice for both the alpha and beta isotypes. These data validate high-resolution MRI for monitoring cerebral ischemic lesions, and confirm the neuroprotective role of PPARalpha and PPARbeta in the brain.
Abstract:
The dynamics of N losses in fertilizer by ammonia volatilization is affected by several factors, making investigation of these dynamics more complex. Moreover, some features of the behavior of the variable can lead to deviation from normal distribution, making the main commonly adopted statistical strategies inadequate for data analysis. Thus, the purpose of this study was to evaluate the patterns of cumulative N losses from urea through ammonia volatilization in order to find a more adequate and detailed way of assessing the behavior of the variable. For that reason, changes in patterns of ammonia volatilization losses as a result of applying different combinations of two soil classes [Planossolo and Chernossolo (Typic Albaqualf and Vertic Argiaquolls)] and different rates of urea (50, 100 and 150 kg ha-1 N), in the presence or absence of a urease inhibitor, were evaluated, adopting a 2 × 3 × 2 factorial design with four replications. Univariate and multivariate analysis of variance were performed using the adjusted parameter values of a logistic function as a response variable. The results obtained from multivariate analysis indicated a prominent effect of the soil class factor on the set of parameters, indicating greater relevance of soil adsorption potential on ammonia volatilization losses. Univariate analysis showed that the parameters related to total N losses and rate of volatilization were more affected by soil class and the rate of urea applied. The urease inhibitor affected only the rate and inflection point parameters, decreasing the rate of losses and delaying the beginning of the process, but had no effect on total ammonia losses. Patterns of ammonia volatilization losses provide details on behavior of the variable, details which can be used to develop and adopt more accurate techniques for more efficient use of urea.
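A minimal sketch of the logistic response function whose adjusted parameters were used in the analyses above, together with a crude grid-search fit standing in for the nonlinear regression actually used in the study; all values are illustrative:

```python
import numpy as np

def logistic(t, A, k, t0):
    """Cumulative N loss at time t: A = total loss (asymptote),
    k = rate of volatilization, t0 = inflection point (time of
    fastest loss)."""
    return A / (1.0 + np.exp(-k * (t - t0)))

def fit_logistic(t, y, A_grid, k_grid, t0_grid):
    """Crude least-squares fit by exhaustive grid search (a simple
    stand-in for proper nonlinear regression)."""
    best, best_sse = None, np.inf
    for A in A_grid:
        for k in k_grid:
            for t0 in t0_grid:
                sse = np.sum((y - logistic(t, A, k, t0)) ** 2)
                if sse < best_sse:
                    best, best_sse = (A, k, t0), sse
    return best
```

The mapping to the abstract's findings is direct: soil class and urea rate mainly shift A (total loss) and k (rate), while the urease inhibitor shifts k and t0 (delaying and slowing the process) without changing A.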
Abstract:
We develop a method to obtain first-passage-time statistics for non-Markovian processes driven by dichotomous fluctuations. The fluctuations themselves need not be Markovian. We calculate analytic first-passage-time distributions and mean first-passage times for exponential, rectangular, and long-tail temporal distributions of the fluctuations.
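As a numerical companion to the analytic results above, a Monte Carlo sketch of first-passage times for a process driven by symmetric Markovian dichotomous noise (the simplest case; the paper also covers non-Markovian fluctuations); all parameters are illustrative:

```python
import numpy as np

def first_passage_times(n_paths, threshold=1.0, drift=1.0, amp=0.5,
                        switch_rate=2.0, dt=1e-3, seed=0):
    """Monte Carlo first-passage times for dx/dt = drift + amp*xi(t),
    where xi(t) is symmetric Markovian dichotomous noise (+1/-1 with
    exponential switching at rate switch_rate). With amp < drift the
    velocity stays positive, so every path reaches the threshold."""
    rng = np.random.default_rng(seed)
    times = np.empty(n_paths)
    p_switch = switch_rate * dt            # switching probability per step
    for i in range(n_paths):
        x, t, xi = 0.0, 0.0, rng.choice([-1.0, 1.0])
        while x < threshold:
            x += (drift + amp * xi) * dt   # Euler step
            t += dt
            if rng.random() < p_switch:
                xi = -xi                   # dichotomous switch
            times[i] = t
    return times
```

Every sampled first-passage time is bounded between threshold/(drift + amp) and threshold/(drift - amp), which gives a quick sanity check on the simulation; histograms of `times` approximate the first-passage-time distributions derived analytically in the paper.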
Abstract:
The protein shells, or capsids, of nearly all spherelike viruses adopt icosahedral symmetry. In the present Letter, we propose a statistical thermodynamic model for viral self-assembly. We find that icosahedral symmetry is not expected for viral capsids constructed from structurally identical protein subunits and that this symmetry requires (at least) two internal switching configurations of the protein. Our results indicate that icosahedral symmetry is not a generic consequence of free energy minimization but requires optimization of internal structural parameters of the capsid proteins.
Abstract:
This paper deals with the recruitment strategies of employers in the low-skilled segment of the labour market. We focus on low-skilled workers because they are overrepresented among jobless people and constitute the bulk of the clientele included in various activation and labour market programmes. A better understanding of the constraints and opportunities of interventions in this labour market segment may help improve their quality and effectiveness. On the basis of qualitative interviews with 41 employers in six European countries, we find that the traditional signals known to be used as statistical discrimination devices (old age, immigrant status and unemployment) play a somewhat reduced role, since these profiles are overrepresented among applicants for low skill positions. However, we find that other signals, mostly considered to be indicators of motivation, have a bigger impact in the selection process. These tend to concern the channel through which the contact with a prospective candidate is made. Unsolicited applications and recommendations from already employed workers emit a positive signal, whereas the fact of being referred by the public employment office is associated with the likelihood of lower motivation.
Abstract:
Introduction: The aim of this study is to compare the walking activity of a cohort of individuals before and after total ankle arthroplasty (TAA). Methods: Nineteen consecutive patients (ten males and nine females) with a mean age of 58.72 years, selected for TAA between January and June 2006, were prospectively reviewed with the use of a dedicated ambulatory activity-monitoring device to assess their natural ambulatory activity. Patients were tested in the community for a two-week period, one month prior to and at least eighteen months after surgery. The ambulatory parameters were assessed through measurement of the number of steps at different cadences and the time spent walking at different walking paces. Data were analyzed using appropriate statistical methods. Results: Comparing the periods before and after TAA, this study revealed that the number of steps walked at normal cadence increased significantly (b = 331.63, p = .00), while the numbers of steps at low cadence (b = -402.52, p = .00) and medium cadence (b = -386.29, p = .00) decreased significantly. However, there was no significant difference between the two assessment phases in terms of time spent walking. Conclusion: These quantitative data allow a clear comparative assessment of walking ability following TAA and demonstrate that this intervention improves patients' walking pace.
Abstract:
Research in autophagy continues to accelerate,(1) and as a result many new scientists are entering the field. Accordingly, it is important to establish a standard set of criteria for monitoring macroautophagy in different organisms. Recent reviews have described the range of assays that have been used for this purpose.(2,3) There are many useful and convenient methods that can be used to monitor macroautophagy in yeast, but relatively few in other model systems, and there is much confusion regarding acceptable methods to measure macroautophagy in higher eukaryotes. A key point that needs to be emphasized is that there is a difference between measurements that monitor the numbers of autophagosomes versus those that measure flux through the autophagy pathway; thus, a block in macroautophagy that results in autophagosome accumulation needs to be differentiated from fully functional autophagy that includes delivery to, and degradation within, lysosomes (in most higher eukaryotes) or the vacuole (in plants and fungi). Here, we present a set of guidelines for the selection and interpretation of the methods that can be used by investigators who are attempting to examine macroautophagy and related processes, as well as by reviewers who need to provide realistic and reasonable critiques of papers that investigate these processes. This set of guidelines is not meant to be a formulaic set of rules, because the appropriate assays depend in part on the question being asked and the system being used. In addition, we emphasize that no individual assay is guaranteed to be the most appropriate one in every situation, and we strongly recommend the use of multiple assays to verify an autophagic response.
Abstract:
In this work, a previously-developed, statistical-based, damage-detection approach was validated for its ability to autonomously detect damage in bridges. The damage-detection approach uses statistical differences in the actual and predicted behavior of the bridge caused under a subset of ambient trucks. The predicted behavior is derived from a statistics-based model trained with field data from the undamaged bridge (not a finite element model). The differences between actual and predicted responses, called residuals, are then used to construct control charts, which compare undamaged and damaged structure data. Validation of the damage-detection approach was achieved by using sacrificial specimens that were mounted to the bridge and exposed to ambient traffic loads and which simulated actual damage-sensitive locations. Different damage types and levels were introduced to the sacrificial specimens to study the sensitivity and applicability. The damage-detection algorithm was able to identify damage, but it also had a high false-positive rate. An evaluation of the sub-components of the damage-detection methodology and methods was completed for the purpose of improving the approach. Several of the underlying assumptions within the algorithm were being violated, which was the source of the false-positives. Furthermore, the lack of an automatic evaluation process was thought to potentially be an impediment to widespread use. Recommendations for the improvement of the methodology were developed and preliminarily evaluated. These recommendations are believed to improve the efficacy of the damage-detection approach.
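The residual-based control-chart step described above can be sketched, in its simplest Shewhart form, as follows (the study's actual chart construction may differ); limits are set from residuals of the undamaged training period:

```python
import numpy as np

def control_chart_flags(train_residuals, new_residuals, n_sigma=3.0):
    """Flag potentially damage-indicating residuals with a simple
    Shewhart control chart: control limits are mean +/- n_sigma*std
    of the residuals from the undamaged (training) period."""
    mu = np.mean(train_residuals)
    sigma = np.std(train_residuals, ddof=1)
    lo, hi = mu - n_sigma * sigma, mu + n_sigma * sigma
    return (new_residuals < lo) | (new_residuals > hi)
```

This simple form also illustrates the false-positive issue noted in the abstract: the +/- 3-sigma limits assume independent, identically distributed residuals, and violations of those assumptions (e.g., temperature-driven drift or correlated truck loads) inflate the out-of-control rate even on an undamaged structure.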