903 results for Generalization of Ehrenfest's urn Model


Relevance:

100.00%

Abstract:

The public primary school system in the State of Geneva, Switzerland, is characterized by centrally evaluated pupil performance measured with standardized tests. As a result, consistent data are collected across the system. The 2010-2011 dataset is used to develop a two-stage data envelopment analysis (DEA) of school efficiency. In the first stage, DEA is employed to calculate an individual efficiency score for each school. It shows that, on average, each school could reduce its inputs by 7% whilst maintaining the same quality of pupil performance. The cause of inefficiency lies in perfectible management. In the second stage, efficiency is regressed on school characteristics and environmental variables, i.e. external factors outside the control of headteachers. The model is tested for multicollinearity, heteroskedasticity and endogeneity. Four variables are identified as statistically significant. School efficiency is negatively influenced by (1) the provision of special education, (2) the proportion of disadvantaged pupils enrolled at the school and (3) operations being held on multiple sites, but positively influenced by school size (captured by the number of pupils). The proportion of allophone pupils, location in an urban area and the provision of reception classes for immigrant pupils are not significant. Although the significant variables influencing school efficiency are outside the control of headteachers, it is still possible to either boost the positive impact or curb the negative impact.

In the canton of Geneva (Switzerland), public primary schools are characterized by funding provided by the public authorities (canton and municipalities) and by the assessment of pupils with standardized tests at three distinct points in their schooling. This makes it possible to collect consistent statistical information. The 2010-2011 dataset is used in a two-stage analysis of school efficiency. In the first stage, data envelopment analysis (DEA) is used to calculate an efficiency score for each school. This analysis shows that the average efficiency of the schools amounts to 93%. On average, each school could reduce its resources by 7% while keeping its pupils' results on the standardized tests constant. The source of the inefficiency lies in perfectible school management. In the second stage, the efficiency scores are regressed on school characteristics and environmental variables. These variables are not under the control (or influence) of headteachers. The model is tested for multicollinearity, heteroskedasticity and endogeneity. Four variables are statistically significant. School efficiency is negatively influenced by (1) the provision of special education in separate classes, (2) the proportion of disadvantaged pupils and (3) operating on several different sites. School efficiency is positively influenced by school size, measured by the number of pupils. The proportion of allophone pupils, location in an urban area and the provision of reception classes for immigrant pupils are not significant. The fact that the variables influencing school efficiency are not under the control of headteachers does not mean that one should give in to fatalism. Several avenues are proposed either to reduce the negative impact or to take advantage of the positive impact of the significant variables.
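As a rough sketch of the first-stage computation described above, the following shows an input-oriented, constant-returns-to-scale DEA efficiency score solved as a linear program; the school inputs, outputs and figures are invented placeholders, not the paper's variables or data.

```python
# Minimal sketch of first-stage, input-oriented DEA under constant returns to
# scale (CCR model), solved as a linear program with SciPy. The data below are
# made up for illustration; the paper's inputs and outputs differ.
import numpy as np
from scipy.optimize import linprog

# rows = schools, columns = inputs (e.g. teaching hours, budget); hypothetical
X = np.array([[120.0, 4.2], [150.0, 5.0], [100.0, 3.1], [130.0, 4.8]])  # inputs
Y = np.array([[78.0], [80.0], [70.0], [82.0]])                          # outputs

def dea_input_efficiency(o, X, Y):
    """Radial input efficiency of unit o: the largest feasible proportional
    input contraction while still producing at least unit o's outputs."""
    n, m = X.shape          # units, inputs
    s = Y.shape[1]          # outputs
    # decision vector z = [theta, lambda_1, ..., lambda_n]; minimise theta
    c = np.zeros(1 + n)
    c[0] = 1.0
    # inputs: sum_j lambda_j * x_ij - theta * x_io <= 0
    A_in = np.hstack([-X[[o]].T, X.T])
    b_in = np.zeros(m)
    # outputs: -sum_j lambda_j * y_rj <= -y_ro
    A_out = np.hstack([np.zeros((s, 1)), -Y.T])
    b_out = -Y[o]
    res = linprog(c,
                  A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.concatenate([b_in, b_out]),
                  bounds=[(0, None)] * (1 + n),
                  method="highs")
    return res.x[0]

scores = [dea_input_efficiency(o, X, Y) for o in range(len(X))]
print(scores)  # a score of e.g. 0.93 means inputs could shrink by 7%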

Relevance:

100.00%

Abstract:

We propose a method to analyse the 2009 outbreak in the region of Botucatu in the state of São Paulo (SP), Brazil, when 28 yellow fever (YF) cases were confirmed, including 11 deaths. At the time of the outbreak, the Secretary of Health of the State of São Paulo vaccinated one million people, causing the death of five individuals, an unprecedented number of YF vaccine-induced fatalities. We apply a mathematical model described previously to optimise the proportion of people who should be vaccinated to minimise the total number of deaths. The model was used to calculate the optimum proportion that should be vaccinated in the remaining, vaccine-free regions of SP, considering the risk of vaccine-induced fatalities and the risk of YF outbreaks in these regions.
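The underlying transmission model is referenced but not reproduced in this abstract. The sketch below only illustrates the kind of trade-off being optimised, namely vaccine-induced deaths rising with coverage against outbreak deaths falling with it; every parameter value and the functional forms are hypothetical stand-ins, not the study's model or figures.

```python
# Toy illustration of the coverage trade-off: vaccinating more people raises
# the (small) risk of vaccine-induced fatalities but lowers expected deaths
# if a yellow fever outbreak occurs. All values below are hypothetical.
import numpy as np

N = 1_000_000        # population of a vaccine-free region (hypothetical)
r_vac = 5e-7         # probability of a fatal vaccine-induced adverse event (hypothetical)
p_out = 0.01         # baseline probability of a YF outbreak (hypothetical)
attack = 0.05        # fraction of susceptibles infected in an outbreak (hypothetical)
cfr = 0.4            # case fatality ratio among the infected (hypothetical)

def outbreak_prob(p, p_crit=0.8):
    """Hypothetical outbreak probability, shrinking as coverage p approaches a
    critical threshold (a crude stand-in for herd-immunity effects)."""
    return p_out * max(0.0, 1.0 - p / p_crit)

def expected_deaths(p):
    """Expected deaths as a function of the vaccinated proportion p."""
    vaccine_deaths = p * N * r_vac
    outbreak_deaths = outbreak_prob(p) * (1.0 - p) * N * attack * cfr
    return vaccine_deaths + outbreak_deaths

p_grid = np.linspace(0.0, 1.0, 1001)
deaths = np.array([expected_deaths(p) for p in p_grid])
print(f"coverage minimising expected deaths: {p_grid[deaths.argmin()]:.2f}")
```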

Relevance:

100.00%

Abstract:

The development of model observers for mimicking human detection strategies has progressed from symmetric signals in simple noise to increasingly complex backgrounds. In this study we implement different model observers for the complex task of detecting a signal in a 3D image stack. The backgrounds come from real breast tomosynthesis acquisitions, and the signals were simulated and reconstructed within the volume. Two tasks relevant to the early detection of breast cancer were considered: detecting an 8 mm mass and detecting a cluster of microcalcifications. The model observers were calculated using a channelized Hotelling observer (CHO) with dense difference-of-Gaussian channels, and a modified partial prewhitening (PPW) observer adapted to realistic signals that are not circularly symmetric. The sustained temporal sensitivity function was used to filter the images before applying the spatial templates. For a frame rate of five frames per second, the only CHO that we calculated performed worse than the humans in a 4-AFC experiment. The other observers were variations of PPW and outperformed human observers in every single case. This initial frame rate was rather low, and the temporal filtering did not affect the results compared with a data set in which no human temporal effects were taken into account. We subsequently investigated higher speeds of 15 and 30 frames per second. We observed that for large masses, the two types of model observers investigated outperformed the human observers and would be suitable with the appropriate addition of internal noise. However, for microcalcifications, only the PPW observer consistently outperformed the humans. The study demonstrated the possibility of using a model observer that takes into account the temporal effects of scrolling through an image stack while being able to effectively detect a range of mass sizes and distributions.
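As a minimal illustration of the channelized Hotelling observer named above, the sketch below builds a Hotelling template from channelized training images; the channel matrix, signal and noise are random placeholders standing in for the dense difference-of-Gaussian channels and tomosynthesis data of the study.

```python
# Minimal sketch of a channelized Hotelling observer (CHO): project images
# onto a small set of channels, estimate the channel covariance, and build a
# linear template. Channels and data are random placeholders for illustration.
import numpy as np

rng = np.random.default_rng(0)
n_pix, n_ch, n_train = 64 * 64, 10, 200

U = rng.standard_normal((n_pix, n_ch))        # placeholder channel matrix
signal = rng.standard_normal(n_pix) * 0.5     # placeholder signal profile

bg = rng.standard_normal((n_train, n_pix))    # signal-absent images
sp = bg + signal                              # signal-present images

v_bg, v_sp = bg @ U, sp @ U                   # channel outputs
S = 0.5 * (np.cov(v_bg, rowvar=False) + np.cov(v_sp, rowvar=False))
w = np.linalg.solve(S, v_sp.mean(axis=0) - v_bg.mean(axis=0))  # CHO template

# decision variable for an image x is w . (U^T x); score both training sets
t_bg, t_sp = v_bg @ w, v_sp @ w
dprime = (t_sp.mean() - t_bg.mean()) / np.sqrt(0.5 * (t_sp.var() + t_bg.var()))
print(f"d' on training data: {dprime:.2f}")
```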

Relevance:

100.00%

Abstract:

Richer and healthier agents tend to hold riskier portfolios and spend proportionally less on health expenditures. Potential explanations include health and wealth effects on preferences, expected longevity or disposable total wealth. Using HRS data, we perform a structural estimation of a dynamic model of consumption, portfolio and health expenditure choices with recursive utility, as well as health-dependent income and mortality risk. Our estimates of the deep parameters highlight the importance of health capital, mortality risk control, convex health and mortality adjustment costs and binding liquidity constraints to rationalize the stylized facts. They also provide new perspectives on expected longevity and on the values of life and health.
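For orientation, the recursive utility referred to above is commonly written in the Epstein-Zin form shown below; this is the textbook formulation, not the paper's full specification with health-dependent income, mortality risk and adjustment costs.

```latex
% Standard Epstein-Zin recursive utility (textbook form, not the paper's exact
% specification): c_t is consumption, \beta the discount factor, \psi the
% elasticity of intertemporal substitution, \gamma the risk-aversion parameter.
\[
V_t \;=\; \Bigl[(1-\beta)\,c_t^{\,1-1/\psi}
  \;+\; \beta\,\bigl(\mathbb{E}_t\!\left[V_{t+1}^{\,1-\gamma}\right]\bigr)^{\frac{1-1/\psi}{1-\gamma}}
\Bigr]^{\frac{1}{1-1/\psi}}
\]
```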

Relevance:

100.00%

Abstract:

In a series of three experiments, participants made inferences about which of two objects scored higher on a criterion. The first experiment was designed to contrast the prediction of Probabilistic Mental Model theory (Gigerenzer, Hoffrage, & Kleinbölting, 1991) concerning sampling procedure with the hard-easy effect. The experiment failed to support the theory's prediction that a particular pair of randomly sampled item sets would differ in percentage correct; but the observation that German participants performed practically as well on comparisons between U.S. cities (many of which they did not even recognize) as on comparisons between German cities (about which they knew much more) ultimately led to the formulation of the recognition heuristic. Experiment 2 was a second, this time successful, attempt to unconfound item difficulty and sampling procedure. In Experiment 3, participants' knowledge and recognition of each city were elicited, and how often this could be used to make an inference was manipulated. Choices were consistent with the recognition heuristic in about 80% of the cases when it discriminated and people had no additional knowledge about the recognized city (and in about 90% when they had such knowledge). The frequency with which the heuristic could be used affected the percentage correct, mean confidence, and overconfidence as predicted. The size of the reference class, which was also manipulated, modified these effects in meaningful and theoretically important ways.
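The recognition heuristic that emerged from these experiments is a simple decision rule, which a short sketch can make concrete; the function below is a generic rendering of the rule, not the experimental procedure, and the example cities are arbitrary.

```python
# Generic sketch of the recognition heuristic for a two-alternative inference
# ("which of two cities is larger?"): if exactly one object is recognized,
# infer that it scores higher; otherwise fall back on knowledge or guessing.
import random

def recognition_heuristic(a, b, recognized, knowledge=None):
    """Return the object inferred to score higher on the criterion.

    recognized: set of recognized object names.
    knowledge:  optional function (a, b) -> choice, used when the heuristic
                does not discriminate (both or neither object recognized).
    """
    a_rec, b_rec = a in recognized, b in recognized
    if a_rec and not b_rec:
        return a                    # heuristic discriminates: pick recognized
    if b_rec and not a_rec:
        return b
    if knowledge is not None:
        return knowledge(a, b)      # use further knowledge if available
    return random.choice([a, b])    # otherwise guess

# Example: a participant comparing two U.S. cities, recognizing only one
recognized_cities = {"San Diego", "Dallas"}
print(recognition_heuristic("San Diego", "San Antonio", recognized_cities))
```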

Relevance:

100.00%

Abstract:

The diagnosis of idiopathic Parkinson's disease (IPD) is entirely clinical. The fact that neuronal damage begins 5-10 years before the occurrence of sub-clinical signs underlines the importance of preclinical diagnosis. A new approach for in-vivo pathophysiological assessment of IPD-related neurodegeneration was implemented based on recently developed neuroimaging methods. It relies on non-invasive magnetic resonance data sensitive to brain tissue property changes that precede macroscopic atrophy in the early stages of IPD. This research aims to determine the brain tissue property changes induced by neurodegeneration that can be linked to clinical phenotypes, which will allow us to create a predictive model for early diagnosis of IPD. We hypothesized that the degree of disease progression in IPD patients would have a differential and specific impact on the brain tissue properties used to create a predictive model of motor and non-motor impairment in IPD. We studied the potential of in-vivo quantitative imaging sensitive to neurodegeneration-related brain tissue characteristics to detect changes in patients with IPD. We carried out methodological work within the well-established SPM8 framework to estimate the sensitivity of tissue probability maps for automated tissue classification in the detection of early IPD. We performed whole-brain multi-parameter mapping at high resolution followed by voxel-based morphometric (VBM) analysis and voxel-based quantification (VBQ) comparing healthy subjects to IPD patients. We found a trend demonstrating non-significant tissue property changes in the olfactory bulb area for the MT and R1 parameters at p<0.001. Compared with the IPD patients, the healthy group presented bilaterally higher MT and R1 intensities in this specific functional region. These results did not correlate with age, severity or duration of disease. We failed to demonstrate any changes with the R2* parameter. We interpreted our findings as demyelination of the olfactory tract, which is clinically manifested as anosmia. However, the lack of correlation with duration or severity complicates its implications for the creation of a predictive model of impairment in IPD.

Relevance:

100.00%

Abstract:

The mathematical representation of Brunswik's lens model has been used extensively to study human judgment and provides a unique opportunity to conduct a meta-analysis of studies that covers roughly five decades. Specifically, we analyze statistics of the lens model equation (Tucker, 1964) associated with 259 different task environments obtained from 78 papers. In short, we find on average fairly high levels of judgmental achievement and note that people can achieve similar levels of cognitive performance in both noisy and predictable environments. Although overall performance varies little between laboratory and field studies, both differ in terms of components of performance and types of environments (numbers of cues and redundancy). An analysis of learning studies reveals that the most effective form of feedback is information about the task. We also analyze empirically when bootstrapping is more likely to occur. We conclude by indicating shortcomings of the kinds of studies conducted to date, limitations in the lens model methodology, and possibilities for future research.
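For reference, the lens model equation (Tucker, 1964) on which these statistics are based is usually stated as follows; the notation is the conventional one and is given here for orientation only, not quoted from the paper.

```latex
% Lens model equation (Tucker, 1964): r_a is achievement (judgment-criterion
% correlation), R_e environmental predictability, R_s cognitive control
% (consistency of the judge), G knowledge (matching of the two linear models),
% and C the correlation between the residuals of the two models.
\[
r_a \;=\; G\,R_e\,R_s \;+\; C\,\sqrt{1-R_e^{2}}\;\sqrt{1-R_s^{2}}
\]
```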

Relevance:

100.00%

Abstract:

Using a suitable Hull and White type formula we develop a methodology to obtain a second-order approximation to the implied volatility for very short maturities. Using this approximation we accurately calibrate the full set of parameters of the Heston model. One of the reasons that makes our calibration for short maturities so accurate is that we also take into account the term-structure for large maturities. We may say that calibration is not "memoryless", in the sense that the option's behavior far away from maturity does influence calibration when the option gets close to expiration. Our results provide a way to perform a quick calibration of a closed-form approximation to vanilla options that can then be used to price exotic derivatives. The methodology is simple, accurate, fast, and it requires a minimal computational cost.
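For reference, the Heston model whose parameters are calibrated above is conventionally specified as below; this is the standard formulation of the dynamics, not the paper's Hull and White type representation or its short-maturity expansion.

```latex
% Standard Heston dynamics under the pricing measure: S_t is the asset price,
% v_t the instantaneous variance; the calibrated parameter set is typically
% (v_0, kappa, theta, sigma, rho).
\[
\begin{aligned}
dS_t &= r\,S_t\,dt + \sqrt{v_t}\,S_t\,dW_t^{S},\\
dv_t &= \kappa(\theta - v_t)\,dt + \sigma\sqrt{v_t}\,dW_t^{v},
\qquad d\langle W^{S},W^{v}\rangle_t = \rho\,dt .
\end{aligned}
\]
```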

Relevance:

100.00%

Abstract:

We present a model of conglomeration motivated by technology synergies and strategic reductions in variable costs in the face of competitive pressures. The resulting firm integration is neither horizontal nor vertical but rather congeneric integration of firms in related industries. We endogenize the industrial conglomeration structure and examine the effects of competition between conglomerates, and between a conglomerate and independent firms. We show that there is an equilibrium synergy trap in which conglomerates are formed to exploit economies of scope, but resulting profits are lower than under the status quo. We also show that strategic firm integration can occur even in the presence of diseconomies of scope. The model helps to explain features of recent mergers and acquisitions experience.

Relevance:

100.00%

Abstract:

Determining the time since deposition of fingermarks may prove necessary to assess their relevance to criminal investigations. The crucial factor is the initial composition of fingermarks, because it represents the starting point of any aging model. This study mainly aimed to characterize the initial composition of fingerprints, which shows a high variability between donors (inter-variability), but also to investigate the variations among fingerprints from the same donor (intra-variability). Solutions to reduce this initial variability using squalene and cholesterol as target compounds are proposed and should be further investigated. The influence of substrates was also evaluated, and the initial composition was observed to be larger on porous than on nonporous surfaces. Preliminary aging of fingerprints over 30 days was finally studied on a porous and a nonporous substrate to evaluate the potential for dating fingermarks. Squalene was observed to decrease at a faster rate on a nonporous substrate.

Relevance:

100.00%

Abstract:

We are interested in the development, implementation and testing of an orthotropic model for cardiac contraction based on an active strain decomposition. Our model addresses the coupling of a transversely isotropic mechanical description at the cell level, with an orthotropic constitutive law for incompressible tissue at the macroscopic level. The main differences with the active stress model are addressed in detail, and a finite element discretization using Taylor-Hood and MINI elements is proposed and illustrated with numerical examples.
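As background, the active strain approach mentioned above rests on a multiplicative split of the deformation gradient; the form below is a commonly used statement of that split, with an illustrative orthotropic activation tensor that is not necessarily the one adopted in the paper.

```latex
% Multiplicative active strain decomposition of the deformation gradient F:
% the activation tensor F_A encodes the inelastic active contraction, while
% the elastic part F_E = F F_A^{-1} enters the strain-energy function.
% f_0, s_0, n_0: reference fibre, sheet and normal directions;
% gamma_f, gamma_s, gamma_n: activation parameters (illustrative orthotropic
% choice, not necessarily the paper's).
\[
F = F_E\,F_A,
\qquad
F_A = I
  + \gamma_f\,\mathbf{f}_0\otimes\mathbf{f}_0
  + \gamma_s\,\mathbf{s}_0\otimes\mathbf{s}_0
  + \gamma_n\,\mathbf{n}_0\otimes\mathbf{n}_0 .
\]
```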

Relevance:

100.00%

Abstract:

Intensification of agricultural production without sound management and regulation can lead to severe environmental problems, as in Western Santa Catarina State, Brazil, where intensive swine production has caused large accumulations of manure and, consequently, water pollution. Natural resource scientists are asked by decision-makers for advice on management and regulatory decisions. Distributed environmental models are useful tools, since they can be used to explore the consequences of various management practices. However, in many areas of the world, quantitative data for model calibration and validation are lacking. The data-intensive distributed environmental model AgNPS was applied in a data-poor environment, the upper catchment (2,520 ha) of the Ariranhazinho River, near the city of Seara, in Santa Catarina State. Steps included data preparation, cell size selection, sensitivity analysis, model calibration and application to different management scenarios. The model was calibrated based on a best guess for model parameters and on a pragmatic sensitivity analysis. The parameters were adjusted to match model outputs (runoff volume, peak runoff rate and sediment concentration) closely with the sparse observed data. A modelling grid cell resolution of 150 m yielded appropriate results at an acceptable computational cost. The rainfall-runoff response of the AgNPS model was calibrated using three separate rainfall ranges (< 25, 25-60, > 60 mm). Predicted sediment concentrations were consistently six to ten times higher than observed, probably due to sediment trapping along vegetated channel banks. Predicted N and P concentrations in stream water ranged from just below to well above regulatory norms. Expert knowledge of the area, in addition to experience reported in the literature, was able to compensate in part for limited calibration data. Several scenarios (actual, recommended and excessive manure applications, and point source pollution from swine operations) could be compared by the model, using a relative ranking rather than quantitative predictions.

Relevance:

100.00%

Abstract:

We consider a general class of non-Markovian processes defined by stochastic differential equations with Ornstein-Uhlenbeck noise. We present a general formalism to evaluate relaxation times associated with correlation functions in the steady state. This formalism is a generalization of a previous approach for Markovian processes. The theoretical results are shown to be in satisfactory agreement both with experimental data for a cubic bistable system and with a computer simulation of the Stratonovich model. We comment on the dynamical role of non-Markovianity in different situations.
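A minimal sketch of the kind of process considered, assuming a cubic bistable drift and illustrative parameter values, is given below: an Euler integration of a system driven by Ornstein-Uhlenbeck (exponentially correlated) noise, together with a crude relaxation-time estimate from the autocorrelation function. The paper's formalism itself is not reproduced here.

```python
# Euler integration of a cubic bistable system driven by Ornstein-Uhlenbeck
# (exponentially correlated) noise: x alone is non-Markovian, but the pair
# (x, eta) is Markovian. All parameter values are illustrative.
import numpy as np

rng = np.random.default_rng(1)

tau, D = 0.5, 0.1            # OU correlation time and noise intensity (illustrative)
dt, n_steps = 1e-3, 200_000
noise = rng.standard_normal(n_steps) * np.sqrt(dt)

x, eta = 1.0, 0.0
xs = np.empty(n_steps)
for k in range(n_steps):
    x += (x - x**3 + eta) * dt                   # cubic bistable drift plus coloured noise
    eta += -eta / tau * dt + np.sqrt(2 * D) / tau * noise[k]
    xs[k] = x

# crude relaxation-time estimate: integral of the normalised autocorrelation of x
xc = xs - xs.mean()
lags = np.arange(0, 5000, 50)                    # lags up to 5 time units, every 0.05
acf = np.array([xc[: xs.size - l] @ xc[l:] / ((xs.size - l) * xc.var()) for l in lags])
tau_rel = (acf.clip(min=0) * 50 * dt).sum()      # rectangle-rule integral over the lags
print(f"estimated relaxation time: {tau_rel:.2f}")
```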

Relevance:

100.00%

Abstract:

Rock slope instabilities such as rock slides, rock avalanches or deep-seated gravitational slope deformations are widespread in Alpine valleys. These phenomena are at the same time a main factor controlling the erosion of mountain belts and a significant natural hazard that causes important losses to mountain communities. However, the potential geometrical and dynamic connections linking outcrop- and slope-scale instabilities are often unknown. A more detailed definition of these potential links is essential to improve the comprehension of the destabilization processes and to obtain a more complete hazard characterization of rock instabilities at different spatial scales. In order to propose an integrated approach to the study of rock slope instabilities, three main themes were analysed in this PhD thesis: (1) the inventory and the spatial distribution of rock slope deformations at the regional scale and their influence on landscape evolution, (2) the influence of brittle and ductile tectonic structures on the development of rock slope instabilities and (3) the characterization of the hazard posed by potential rock slope instabilities through the development of conceptual instability models. Several techniques were adopted to address these topics in an integrated way. In particular, high-resolution digital elevation models proved to be fundamental tools that were employed during the different stages of the rock slope instability assessment. Special attention was devoted to the application of digital elevation models for the detailed geometrical modelling of past and potential instabilities and for rock slope monitoring at different spatial scales. Detailed field analyses and numerical models were performed to complete and verify the remote sensing approach. In the first part of this thesis, large slope instabilities in the Rhone valley (Switzerland) were mapped in order to obtain a first overview of the tectonic and climatic factors influencing their distribution and characteristics. Our analyses demonstrate the key influence of neotectonic activity and of glacial conditioning on the spatial distribution of rock slope deformations. In addition, the volumes of the rock instabilities identified along the main Rhone valley were used to propose a first estimate of the postglacial denudation and filling of the Rhone valley associated with large gravitational movements. In the second part of the thesis, detailed structural analyses of the Frank slide and the Sierre rock avalanche were performed to characterize the influence of brittle and ductile tectonic structures on the geometry and failure mechanism of large instabilities. Our observations indicate that the geometric characteristics and the variation of rock mass quality associated with ductile tectonic structures, which are often ignored in landslide studies, represent important factors that can drastically influence the extent and the failure mechanism of rock slope instabilities. In the last part of the thesis, the failure mechanisms and the hazard associated with five potential instabilities were analysed in detail. These case studies clearly highlight the importance of combining different analysis and monitoring techniques in order to obtain reliable hazard scenarios. This information, together with the development of a conceptual instability model, represents the primary input for an integrated risk management of rock slope instabilities.
Slope movements such as rockfalls, rock slides or slower phenomena such as deep-seated gravitational slope deformations are common occurrences in mountainous regions. Slope movements are both one of the main factors controlling the progressive destruction of orogenic belts and a concrete natural hazard that can cause significant damage. Yet gravitational phenomena are rarely analysed in their entirety, and the geometric and mechanical relationships linking slope-scale instabilities to local instabilities remain poorly defined. A better characterization of these links could nevertheless make a substantial contribution to the understanding of slope destabilization processes and improve the characterization of gravitational hazards at all spatial scales. In order to propose a more global approach to the problem of gravitational movements, this thesis pursues three main lines of research: (1) the inventory and analysis of the spatial distribution of large rock instabilities at the regional scale, (2) the analysis of brittle and ductile tectonic structures in relation to the failure mechanisms of large rock instabilities and (3) the characterization of rock hazards through a multidisciplinary approach aimed at developing a conceptual model of the instability and a better appreciation of the danger. Different techniques were used to analyse the issues addressed in this thesis. In particular, the digital elevation model proved to be an indispensable tool for most of the analyses carried out, from the identification of the instability to the monitoring of movements. Field analyses and numerical modelling then made it possible to complement the information derived from the digital elevation models. In the first part of this thesis, gravitational rock movements in the Rhone valley (Switzerland) were mapped in order to study their distribution as a function of regional geological and morphological variables. In particular, the analyses highlighted the influence of neotectonic activity and glacial phases on the distribution of areas with a high density of rock instabilities. The volumes of the rock instabilities identified along the main valley were then used to estimate the postglacial denudation rate and the filling of the Rhone valley associated with large gravitational movements. In the second part, the study of the structural setting of the Sierre (Switzerland) and Frank (Canada) rock avalanches allowed a better characterization of the passive influence of tectonic structures on the geometry of the instabilities. In particular, structures inherited from ductile tectonics, often ignored in the study of gravitational instabilities, were identified as very important structures that control the failure mechanisms of instabilities at different scales. In the last part of the thesis, five different rock instabilities were studied through a multidisciplinary approach aimed at better characterizing the hazard and developing a three-dimensional conceptual model of these instabilities. These analyses highlighted the need to combine different analysis and monitoring techniques for a more objective management of the risk associated with large rock instabilities.

Relevance:

100.00%

Abstract:

The spared nerve injury (SNI) model mimics human neuropathic pain related to peripheral nerve injury and is based upon an invasive but simple surgical procedure. Since its first description in 2000, it has undergone a remarkable development. It produces robust, reliable and long-lasting neuropathic pain-like behaviour (allodynia and hyperalgesia) and offers the possibility of studying both injured and non-injured neuronal populations in the same spinal ganglion. In addition, variants of the SNI model have been developed in rats, mice and neonatal/young rodents, resulting in several possible angles of analysis. The purpose of this chapter is therefore to provide detailed guidance regarding the SNI model and its variants, highlighting their surgical and behavioural testing specificities.