917 results for model-based clustering
Abstract:
Following major reforms of the British National Health Service (NHS) in 1990, the roles of purchasing and providing health services were separated, with the relationship between purchasers and providers governed by contracts. Using a mixed multinomial logit analysis, we show how this policy shift led to a selection of contracts that is consistent with the predictions of a simple model, based on contract theory, in which the characteristics of the health services being purchased and of the contracting parties influence the choice of contract form. The paper thus provides evidence in support of the practical relevance of theory in understanding health care market reform.
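The contract-form choice described above can be sketched with the core of a multinomial logit: each contract form gets a linear utility in the characteristics of the transaction and the contracting parties, and the softmax of these utilities gives the choice probabilities. A minimal illustration (the covariates, coefficient values, and contract-form labels are hypothetical, not taken from the paper):

```python
import numpy as np

def mnl_probabilities(X, betas):
    """Multinomial logit choice probabilities.

    X      : (n, k) covariates for n purchaser-provider pairs
    betas  : (j, k) one coefficient vector per contract form
    returns: (n, j) probability of each contract form per pair
    """
    utilities = X @ betas.T                              # systematic utilities
    utilities -= utilities.max(axis=1, keepdims=True)    # numerical stability
    expu = np.exp(utilities)
    return expu / expu.sum(axis=1, keepdims=True)

# Hypothetical example: 2 covariates (service complexity, purchaser size)
# and 3 contract forms (block, cost-and-volume, cost-per-case).
X = np.array([[0.2, 1.0],
              [1.5, 0.3]])
betas = np.array([[0.0, 0.0],
                  [0.8, -0.2],
                  [1.4, 0.1]])
P = mnl_probabilities(X, betas)
```

Each row of `P` is a probability distribution over contract forms for one purchaser-provider pair.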
Abstract:
1. Model-based approaches have been used increasingly in conservation biology over recent years. Species presence data used for predictive species distribution modelling are abundant in natural history collections, whereas reliable absence data are sparse, most notably for vagrant species such as butterflies and snakes. As predictive methods such as generalized linear models (GLM) require absence data, various strategies have been proposed to select pseudo-absence data. However, only a few studies exist that compare different approaches to generating these pseudo-absence data. 2. Natural history collection data are usually available for long periods of time (decades or even centuries), thus allowing historical considerations. However, this historical dimension has rarely been assessed in studies of species distribution, although there is great potential for understanding current patterns, i.e. the past is the key to the present. 3. We used GLM to model the distributions of three 'target' butterfly species, Melitaea didyma, Coenonympha tullia and Maculinea teleius, in Switzerland. We developed and compared four strategies for defining pools of pseudo-absence data and applied them to natural history collection data from the last 10, 30 and 100 years. Pools included: (i) sites without target species records; (ii) sites where butterfly species other than the target species were present; (iii) sites without butterfly species but with habitat characteristics similar to those required by the target species; and (iv) a combination of the second and third strategies. Models were evaluated and compared by the total deviance explained, the maximized Kappa and the area under the curve (AUC). 4. Among the four strategies, model performance was best for strategy 3. Contrary to expectations, strategy 2 resulted in even lower model performance compared with models with pseudo-absence data simulated totally at random (strategy 1). 5. 
Independent of the strategy, model performance was enhanced when sites with historical species presence data were not considered as pseudo-absence data. Therefore, the combination of strategy 3 with species records from the last 100 years achieved the highest model performance. 6. Synthesis and applications. The protection of suitable habitat for species survival or reintroduction in rapidly changing landscapes is a high priority among conservationists. Model-based approaches offer planning authorities the possibility of delimiting priority areas for species detection or habitat protection. The performance of these models can be enhanced by fitting them with pseudo-absence data relying on large archives of natural history collection species presence data rather than using randomly sampled pseudo-absence data.
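A minimal sketch of the modelling workflow summarized above: presences plus a pool of pseudo-absences feed a GLM (logistic regression), which is then scored by AUC. This toy version implements only strategy 1 (random pseudo-absences from sites without presence records) on synthetic data; the single habitat covariate and all numbers are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic landscape: one habitat covariate per site; the hypothetical
# target species prefers high habitat values.
habitat = rng.uniform(0, 1, 1000)
present = rng.uniform(size=1000) < 1 / (1 + np.exp(-(6 * habitat - 4)))

presences = habitat[present]
# Strategy 1: pseudo-absences drawn at random from sites without records.
pa_random = rng.choice(habitat[~present], size=presences.size, replace=False)

def fit_logistic(x, y, iters=2000, lr=0.5):
    """Minimal GLM (logistic regression) fit by batch gradient descent."""
    w, b = 0.0, 0.0
    for _ in range(iters):
        p = 1 / (1 + np.exp(-(w * x + b)))
        w -= lr * np.mean((p - y) * x)
        b -= lr * np.mean(p - y)
    return w, b

def auc(scores_pos, scores_neg):
    """Area under the ROC curve via the rank-sum (Mann-Whitney) statistic."""
    diff = scores_pos[:, None] - scores_neg[None, :]
    return (diff > 0).mean() + 0.5 * (diff == 0).mean()

x = np.concatenate([presences, pa_random])
y = np.concatenate([np.ones(presences.size), np.zeros(pa_random.size)])
w, b = fit_logistic(x, y)

def score(v):
    return 1 / (1 + np.exp(-(w * v + b)))

model_auc = auc(score(presences), score(pa_random))
```

Swapping the `pa_random` pool for a habitat-filtered pool would correspond to strategy 3; the AUC comparison between pools mirrors the study's model evaluation.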
Abstract:
This work focused on developing a methodology for using the chemical characteristics of tire traces to help answer the following question: "Is the offending tire at the origin of the trace found at the crime scene?". The methodology spans trace sampling on the road through to statistical analysis of the trace's chemical characteristics. Knowledge about the composition and manufacture of tire treads, together with a review of instrumental techniques used for the analysis of polymeric materials, led to the selection of pyrolysis coupled to a gas chromatograph with a mass spectrometry detector (Py-GC/MS) as the analytical technique for this research. An analytical method was developed and optimized to obtain the lowest variability between replicates of the same sample. The within-tread variability was evaluated across the width and circumference using several samples taken from twelve tires of different brands and/or models. The variability within each tread (within-variability) and between treads (between-variability) could thus be quantified. Different statistical methods showed that the within-variability is lower than the between-variability, which made it possible to differentiate these tires. Ten tire traces were produced by braking tests with tires of different brands and/or models. These traces were adequately sampled using sheets of gelatine. Particles of each trace were analysed using the same methodology as for the tires at their origin. The general chemical profile of a trace or of a tire was characterized by eighty-six compounds. Based on a statistical comparison of the chemical profiles obtained, it was shown that a tire trace is not differentiable from the tire at its origin but is generally differentiable from tires that are not at its origin. Thereafter, a sample of sixty tires was analysed to assess the discrimination potential of the developed methodology.
The statistical results showed that most tires of different brands and models are differentiable. However, tires of the same brand and model with identical characteristics, such as country of manufacture, size and DOT number, are not differentiable. A model based on a likelihood ratio approach was chosen to evaluate the results of the comparisons between the chemical profiles of the traces and tires. The methodology developed was finally blind-tested using three simulated scenarios. Each scenario involved a trace of an unknown tire as well as two tires possibly at its origin. The correct results obtained for the three scenarios validated the developed methodology. The different steps of this work made it possible to collect the information required to test and validate the underlying assumption that a statistical comparison of their chemical profiles can help determine whether an offending tire is or is not at the origin of a trace. This aid was formalized by a measure of the probative value of the evidence, represented by the chemical profile of the tire trace.
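The likelihood-ratio evaluation described in the abstract can be sketched as a score-based LR: a distance between chemical profiles is converted into the ratio of its density under the same-source and different-source hypotheses. The profiles, distance measure, and normal score distributions below are hypothetical stand-ins for the actual 86-compound profiles and reference comparison data.

```python
import numpy as np
from math import exp, pi, sqrt

def normal_pdf(x, mu, sigma):
    return exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * sqrt(2 * pi))

def profile_distance(trace, tire):
    """Euclidean distance between normalized chemical profiles
    (e.g. relative abundances of the pyrolysis compounds)."""
    t = np.asarray(trace) / np.sum(trace)
    p = np.asarray(tire) / np.sum(tire)
    return float(np.linalg.norm(t - p))

# Hypothetical score distributions, as if estimated from reference data:
# distances trace-vs-source tire (within) and trace-vs-other tires (between).
WITHIN_MU, WITHIN_SD = 0.02, 0.01
BETWEEN_MU, BETWEEN_SD = 0.10, 0.04

def likelihood_ratio(trace, tire):
    """LR = P(score | same source) / P(score | different source)."""
    d = profile_distance(trace, tire)
    return normal_pdf(d, WITHIN_MU, WITHIN_SD) / normal_pdf(d, BETWEEN_MU, BETWEEN_SD)

trace = [5.0, 2.0, 1.0, 0.5]       # toy 4-compound profile
same_tire = [5.1, 2.0, 0.9, 0.5]   # close to the trace
other_tire = [3.0, 3.0, 2.0, 1.5]  # clearly different
```

An LR above 1 supports the same-source hypothesis, below 1 the different-source hypothesis, matching the probative-value framing in the abstract.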
Abstract:
The project presented here aims to define and implement a simulation model based on the coordination and assignment of emergency services in traffic accidents. The model was defined using Coloured Petri Nets and implemented with the Rockwell Arena 7.0 software. The first simulation models a theoretical queue-based system, while the second is more complete and realistic thanks to a connection, via the CORBA platform, to a database with geographic information on the fleets and routes. As a result of the study, and with the help of GoogleEarth, we can run graphical simulations to view the generated accidents, the service fleets, and the movement of vehicles from their bases to the accidents.
Abstract:
These notes try to clarify some discussions on the formulation of individual intertemporal behavior under adaptive learning in representative agent models. First, we discuss two suggested approaches and related issues in the context of a simple consumption-saving model. Second, we show that the analysis of learning in the New Keynesian monetary policy model based on “Euler equations” provides a consistent and valid approach.
Abstract:
INTRODUCTION: Therapeutic hypothermia (TH) is often used to treat out-of-hospital cardiac arrest (OHCA) patients, who also often simultaneously receive insulin for stress-induced hyperglycaemia. However, the impact of TH on systemic metabolism and insulin resistance in critical illness is unknown. This study analyses the impact of TH on metabolism, including the evolution of insulin sensitivity (SI) and its variability, in patients with coma after OHCA. METHODS: This study uses a clinically validated, model-based measure of SI. Insulin sensitivity was identified hourly using retrospective data from 200 post-cardiac arrest patients (8,522 hours) treated with TH, shortly after admission to the intensive care unit (ICU). Blood glucose and body temperature readings were taken every one to two hours. Data were divided into three periods: 1) cool (T <35°C); 2) an idle period of two hours as normothermia was re-established; and 3) warm (T >37°C). A maximum of 24 hours each was considered for the cool and warm periods. The impact of each condition on SI was analysed per cohort and per patient for both level and hour-to-hour variability, between periods and in six-hour blocks. RESULTS: Cohort and per-patient median SI levels increased consistently by 35% to 70% and 26% to 59% (P <0.001), respectively, from cool to warm. Conversely, cohort and per-patient SI variability decreased by 11.1% to 33.6% (P <0.001) for the first 12 hours of treatment. However, SI variability increased between the 18th and 30th hours over the cool-to-warm transition, before continuing to decrease afterward. CONCLUSIONS: OHCA patients treated with TH have significantly lower and more variable SI during the cool period compared to the later warm period. As treatment continues, SI level rises and variability decreases consistently, except for a large, significant increase during the cool-to-warm transition. These results demonstrate increased resistance to insulin during mild induced hypothermia.
Our study might have important implications for glycaemic control during targeted temperature management.
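The hourly, model-based identification of SI can be illustrated with a deliberately minimal glucose balance (a hypothetical stand-in for the clinically validated model the study actually uses): over each interval, the glucose change, insulin level, and glucose appearance are combined to solve for a single SI value.

```python
def identify_si(g_start, g_end, insulin, feed, dt=1.0):
    """Identify insulin sensitivity over one interval from the minimal
    glucose model  dG/dt = -SI * G * I + P  (illustrative stand-in only).

    g_start, g_end : blood glucose at interval start/end [mmol/L]
    insulin        : mean plasma insulin over the interval [mU/L]
    feed           : net glucose appearance P [mmol/L/h]
    dt             : interval length [h]
    returns        : SI [L/mU/h]
    """
    # Forward-Euler balance  g_end = g_start + dt*(-SI*g_start*insulin + feed)
    # solved for SI:
    return (g_start - g_end + feed * dt) / (dt * g_start * insulin)

# Hourly SI trace over a cool-to-warm transition (synthetic numbers):
readings = [(8.0, 7.6, 20.0, 0.5),
            (7.6, 7.0, 20.0, 0.5),
            (7.0, 6.2, 20.0, 0.5)]
si_trace = [identify_si(*r) for r in readings]
```

In this synthetic trace SI rises hour by hour, the qualitative pattern the study reports over the cool-to-warm transition.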
Abstract:
South Peak is a 7-Mm3 potentially unstable rock mass located adjacent to the 1903 Frank Slide on Turtle Mountain, Alberta. This paper presents three-dimensional numerical rock slope stability models and compares them with a previous conceptual slope instability model based on discontinuity surfaces identified using an airborne LiDAR digital elevation model (DEM). Rock mass conditions at South Peak are described using the Geological Strength Index and point load tests, whilst the mean discontinuity set orientations and characteristics are based on approximately 500 field measurements. A kinematic analysis was first conducted to evaluate probable simple discontinuity-controlled failure modes. The potential for wedge failure was further assessed by considering the orientation of wedge intersections over the airborne LiDAR DEM and through a limit equilibrium combination analysis. Block theory was used to evaluate the finiteness and removability of blocks in the rock mass. Finally, the complex interaction between discontinuity sets and the topography within South Peak was investigated through three-dimensional distinct element models using the code 3DEC. The influence of individual discontinuity sets, scale effects, friction angle and persistence along the discontinuity surfaces on slope stability conditions was also investigated using this code.
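The kinematic screening step mentioned above can be sketched for the simplest discontinuity-controlled mode, dry planar sliding: the plane must daylight in the slope face, and the limit-equilibrium factor of safety for a dry, cohesionless plane reduces to tan(phi)/tan(psi). The orientations below are illustrative values, not South Peak measurements.

```python
from math import tan, radians

def planar_fs(friction_deg, plane_dip_deg):
    """Limit-equilibrium factor of safety for dry, cohesionless planar
    sliding: FS = tan(phi) / tan(psi_p)."""
    return tan(radians(friction_deg)) / tan(radians(plane_dip_deg))

def kinematically_feasible(slope_dip, plane_dip, friction_deg):
    """Daylighting condition for planar failure: the plane must dip less
    steeply than the face but more steeply than the friction angle."""
    return friction_deg < plane_dip < slope_dip

# Hypothetical discontinuity set on a steep face (illustrative only):
slope_dip, plane_dip, phi = 70.0, 45.0, 35.0
feasible = kinematically_feasible(slope_dip, plane_dip, phi)
fs = planar_fs(phi, plane_dip)   # < 1 means sliding is predicted
```

Wedge and block-theory analyses generalize this check to intersections of two or more discontinuity sets.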
Abstract:
An African oxalogenic tree, the iroko tree (Milicia excelsa), can enhance carbonate precipitation in tropical oxisols, where such accumulations are not expected given the acidic conditions of these soils. This uncommon process is linked to the oxalate-carbonate pathway, which increases soil pH through oxalate oxidation. In order to investigate the oxalate-carbonate pathway in the iroko system, fluxes of matter were identified, described, and evaluated from the field scale to the microscopic scale. In the first centimeters of the soil profile, decay of the organic matter releases whewellite crystals, mainly through the action of termites and saprophytic fungi. In addition, carbonate formed in wood tissues contributes a concomitant flux, identified as a direct consequence of wood feeding by termites. Nevertheless, calcite biomineralization of the tree is not a consequence of in situ oxalate consumption, but rather related to oxalate oxidation in the upper part of the soil. The consequence of this oxidation is the presence of carbonate ions in the soil solution pumped through the roots, leading to preferential mineralization of the roots and the trunk base. An ideal scenario for iroko biomineralization and soil carbonate accumulation starts with oxalatization: as the iroko tree grows, the organic matter flux to the soil constitutes the litter, and an oxalate pool is formed on the forest ground. Then, wood-rotting agents (mainly termites, saprophytic fungi, and bacteria) release significant amounts of oxalate crystals from decaying plant tissues. In addition, some of these agents are themselves producers of oxalate (e.g. fungi). Both processes contribute to a soil pool of "available" oxalate crystals. Oxalate consumption by oxalotrophic bacteria can then start. Carbonate and calcium ions present in the soil solution represent the end products of the oxalate-carbonate pathway.
The solution is pumped through the roots, leading to carbonate precipitation. The main pools of carbon are clearly identified as the organic matter (the tree and its organic products), the oxalate crystals, and the various carbonate features. A functional model based on field observations and diagenetic investigations with δ13C signatures of the various compartments involved in the local carbon cycle is proposed. It suggests that the iroko ecosystem can act as a long-term carbon sink, as long as the calcium source is related to non-carbonate rocks. Consequently, this carbon sink, driven by the oxalate carbonate pathway around an iroko tree, constitutes a true carbon trapping ecosystem as defined by ecological theory.
Abstract:
This paper explains the strategies used and the results obtained in the first exhibition of the new museographic scheme of the Natural History Museum in London, conceived by Roger Miles, Head of the Department of Public Services of that prestigious institution. The initiative sought to attract a larger number of visitors through exhibitions based on models and interactive modules, which relegated the objects of the collections to a secondary role. The exhibition, entitled Human Biology, opened on 24 May 1977. Its subject was human biology, but as this paper argues, Human Biology also served as a means to legitimize the modernizing discourse of human biology as a more rigorous discipline, equipped with tools and techniques more precise than those used by traditional physical anthropology. It also sought to build an audience to reinforce the interdisciplinary field of cognitive science, and artificial intelligence in particular. The exhibition's team of scientific advisers included figures who played a leading role in the development of those disciplines and who needed to demonstrate their validity and usefulness to non-specialists and the general public.
Abstract:
The adaptation of university studies to the European Higher Education Area (EHEA) aims to achieve a new educational model based on active student learning. In this regard, Information and Communication Technologies (ICTs) can play an important role in renewing teaching methodology, especially in subjects where the iconographic load is fundamental, as is the case in the morphological sciences and some clinical subjects. In the Veterinary degree programme at the UAB, students' classroom load is very high, leaving little time for active self-learning and autonomous study. To mitigate this problem, over recent years our programme has produced several atlases and other virtual documents whose didactic content relates to subjects such as Anatomy, Parasitology, Radiology and Pathological Anatomy. These materials, some of which are already published online on the Veterinària Virtual platform (http://quiro.uab.es) and available to students, make it possible to reduce part of the classroom load, support the teaching and learning process, facilitate non-classroom, autonomous and active learning, and allow continuous assessment. They thereby increase the student's role in the educational process, which is one of the goals of adaptation to the EHEA. Students rate the online publication of educational material very positively, as it represents a readily available teaching resource with permanent access and low cost. The project lasted two years.
Abstract:
Recently, Revil & Florsch proposed a novel mechanistic model, based on the polarization of the Stern layer, relating the permeability of granular media to their spectral induced polarization (SIP) characteristics through the formation of polarized cells around individual grains. To explore the practical validity of this model, we compare it to pertinent laboratory measurements on samples of quartz sands with a wide range of granulometric characteristics. In particular, we measure the hydraulic and SIP characteristics of all samples in both their loose, non-compacted and compacted states, which might allow for the detection of polarization processes that are independent of the grain size. We first verify the underlying grain size/permeability relationship upon which the model of Revil & Florsch is based and then compare the observed and predicted permeability values for our samples by substituting the grain size characteristics with corresponding SIP parameters, notably the so-called Cole-Cole time constant. In doing so, we also assess the quantitative impact of an observed shift in the Cole-Cole time constant related to textural variations in the samples and observe that changes related to the compaction of the samples are not relevant for the corresponding permeability predictions. We find that the proposed model does indeed provide an adequate prediction of the overall trend of the observed permeability values, but underestimates their actual values by approximately one order of magnitude. This discrepancy in turn points to the potential importance of phenomena that are currently not accounted for in the model and that tend to reduce the characteristic size of the prevailing polarization cells, such as membrane polarization, contacts between the double layers of neighbouring grains, and incorrect estimation of the size of the polarized cells owing to the irregularity of natural sand grains.
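The substitution of grain size by the Cole-Cole time constant can be sketched as a two-step calculation: a Schwarz-type relaxation law converts the time constant into a grain radius, and a grain-size/permeability law converts that radius into permeability. The Kozeny-Carman form below is an illustrative stand-in for the specific Revil & Florsch relationship, and the diffusion coefficient, time constant, and porosity are assumed values.

```python
from math import sqrt

D_STERN = 1.3e-9   # counter-ion diffusion coefficient [m^2/s] (assumed value)

def grain_radius_from_tau(tau):
    """Schwarz-type relaxation: tau = a^2 / (2 D)  =>  a = sqrt(2 D tau)."""
    return sqrt(2 * D_STERN * tau)

def kozeny_carman(diameter, porosity):
    """Illustrative grain-size/permeability link (Kozeny-Carman), standing
    in for the relationship underlying the Revil & Florsch model."""
    return diameter ** 2 * porosity ** 3 / (180 * (1 - porosity) ** 2)

tau = 0.1                            # Cole-Cole time constant [s] (assumed)
a = grain_radius_from_tau(tau)       # grain radius [m]
k = kozeny_carman(2 * a, 0.4)        # permeability [m^2], porosity assumed
```

With these assumed inputs the chain yields a grain radius of a few tens of micrometres and a permeability in the darcy range; the study's observed order-of-magnitude underestimation would correspond to polarization cells smaller than the geometric grain size.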
Abstract:
Objectives: Imatinib has been increasingly proposed for therapeutic drug monitoring (TDM), as trough concentrations (Cmin) correlate with response rates in CML patients. This analysis aimed to evaluate the impact of imatinib exposure on optimal molecular response rates in a large European cohort of patients followed by centralized TDM. Methods: Sequential PK/PD analysis was performed in NONMEM 7 on 2230 plasma (PK) samples obtained along with molecular response (PD) data from 1299 CML patients. Model-based individual Bayesian estimates of exposure, parameterized as initial dose-adjusted and log-normalized Cmin (log-Cmin) or clearance (CL), were investigated as potential predictors of optimal molecular response, while accounting for time under treatment (stratified at 3 years), gender, CML phase, age, potentially interacting comedication, and TDM frequency. The PK/PD analysis used mixed-effect logistic regression (iterative two-stage method) to account for intra-patient correlation. Results: In univariate analyses, CL, log-Cmin, time under treatment, TDM frequency, gender (all p<0.01) and CML phase (p=0.02) were significant predictors of the outcome. In multivariate analyses, all but log-Cmin remained significant (p<0.05). Our model estimates a 54.1% probability of optimal molecular response in a female patient with a median CL of 14.4 L/h, increasing by 4.7% with a 35% decrease in CL (10th percentile of the CL distribution) and decreasing by 6% with a 45% increase in CL (90th percentile). Male patients were less likely than female patients to be in optimal response (odds ratio: 0.62, p<0.001), with an estimated probability of 42.3%. Conclusions: Beyond CML phase and time on treatment, both expectedly correlated with the outcome, an effect of initial imatinib exposure on the probability of achieving optimal molecular response was confirmed under field conditions by this multivariate analysis.
Interestingly, male patients had a higher risk of suboptimal response, which might not exclusively derive from their 18.5% higher CL, but also from reported lower adherence to the treatment. A prospective longitudinal study would be desirable to confirm the clinical importance of identified covariates and to exclude biases possibly affecting this observational survey.
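The reported probabilities and odds ratio are mutually consistent, which can be checked directly: converting the female response probability to odds, applying the male-vs-female odds ratio, and converting back reproduces the male probability.

```python
def odds(p):
    """Convert a probability to odds."""
    return p / (1 - p)

def prob_from_odds(o):
    """Convert odds back to a probability."""
    return o / (1 + o)

p_female = 0.541   # reported probability of optimal response (female)
or_male = 0.62     # reported male-vs-female odds ratio

# Apply the odds ratio on the odds scale, then convert back:
p_male = prob_from_odds(or_male * odds(p_female))
# p_male comes out near 0.42, consistent with the reported 42.3%
```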
Abstract:
We derive necessary and sufficient conditions under which a set of variables is informationally sufficient, i.e. it contains enough information to estimate the structural shocks with a VAR model. Based on such conditions, we suggest a procedure to test for informational sufficiency. Moreover, we show how to amend the VAR if informational sufficiency is rejected. We apply our procedure to a VAR including TFP, unemployment and per-capita hours worked. We find that the three variables are not informationally sufficient. When adding missing information, the effects of technology shocks change dramatically.
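The flavour of the sufficiency test can be sketched with a Granger-style check: if the lag of a candidate omitted variable significantly improves the prediction of the variables already in the VAR, the original set does not contain enough information to recover the structural shocks. The bivariate system and all coefficients below are synthetic, not the TFP/unemployment/hours application.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate a system in which z drives x, but z is omitted from the VAR:
T = 500
z = np.zeros(T)
x = np.zeros(T)
for t in range(1, T):
    z[t] = 0.8 * z[t - 1] + rng.normal()
    x[t] = 0.5 * x[t - 1] + 0.9 * z[t - 1] + rng.normal()

def ssr(y, X):
    """Sum of squared residuals from an OLS regression of y on X."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return resid @ resid

# Restricted model: x on its own lag only (the "VAR" in x alone).
y = x[1:]
X_r = np.column_stack([np.ones(T - 1), x[:-1]])
# Unrestricted model: add the candidate omitted variable's lag.
X_u = np.column_stack([X_r, z[:-1]])

# Granger-style F statistic for one restriction:
ssr_r, ssr_u = ssr(y, X_r), ssr(y, X_u)
F = (ssr_r - ssr_u) / (ssr_u / (T - 1 - 3))
# A large F rejects informational sufficiency of {x} alone,
# suggesting the VAR should be amended with z.
```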
Abstract:
This paper analyses the impact of using different correlation assumptions between lines of business when estimating the risk-based capital reserve, the Solvency Capital Requirement (SCR), under Solvency II regulations. A case study is presented and the SCR is calculated according to the Standard Model approach. Alternatively, the requirement is then calculated using an Internal Model based on a Monte Carlo simulation of the net underwriting result at a one-year horizon, with copulas being used to model the dependence between lines of business. To address the impact of these model assumptions on the SCR we conduct a sensitivity analysis. We examine changes in the correlation matrix between lines of business and address the choice of copulas. Drawing on aggregate historical data from the Spanish non-life insurance market between 2000 and 2009, we conclude that modifications of the correlation and dependence assumptions have a significant impact on SCR estimation.
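The Internal Model calculation described above can be sketched in a few lines: draw correlated losses for the lines of business through a Gaussian copula, aggregate them, and take the 99.5% quantile over the mean as the SCR. The two-line setup, lognormal marginals, and all parameter values are hypothetical, not the Spanish market data.

```python
import numpy as np

rng = np.random.default_rng(42)

# Two lines of business with lognormal underwriting losses linked by a
# Gaussian copula (illustrative parameters only).
corr = np.array([[1.0, 0.5],
                 [0.5, 1.0]])
mu = np.array([10.0, 9.5])     # log-scale location of annual losses
sigma = np.array([0.3, 0.5])   # log-scale dispersion

n = 100_000
L = np.linalg.cholesky(corr)
normals = rng.standard_normal((n, 2)) @ L.T          # correlated N(0,1) draws
losses = np.exp(mu + sigma * normals).sum(axis=1)    # aggregate annual loss

# SCR as the 99.5% one-year Value-at-Risk in excess of the expected loss:
scr = np.quantile(losses, 0.995) - losses.mean()
```

Re-running with a modified `corr` matrix, or replacing the Gaussian copula with a heavier-tailed one, is exactly the kind of sensitivity analysis the paper performs.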
Abstract:
This document presents a research study on detecting the factors that affect performance in multicore environments. Given the great diversity of multicore architectures, a working framework was defined, consisting of the adoption of a specific architecture, a programming model based on data parallelism, and Single Program Multiple Data applications. Once the framework was defined, the performance factors were evaluated, with special attention to the programming model. For this reason, the threads library and the OpenMP API were analysed to identify functions amenable to tuning, which allow the application to adapt its behaviour to the environment and which, when used appropriately, should improve application performance.
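The kind of tunable data-parallel parameter the study targets (e.g. OpenMP's schedule and chunk size) can be illustrated, here in Python for consistency with the other sketches, by a chunked SPMD-style map whose chunk size is an adjustable knob; the kernel, data, and sizes are invented for illustration.

```python
from concurrent.futures import ThreadPoolExecutor

def process(block):
    """Simulated SPMD kernel: the same program applied to each data block."""
    return sum(v * v for v in block)

def parallel_sum_squares(data, workers=4, chunk=None):
    """Data-parallel map with a tunable chunk size, the kind of parameter
    (like OpenMP's schedule/chunk) that can be adapted at run time."""
    chunk = chunk or max(1, len(data) // workers)
    blocks = [data[i:i + chunk] for i in range(0, len(data), chunk)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(process, blocks))

data = list(range(10_000))
total = parallel_sum_squares(data, workers=4, chunk=500)
```

Varying `workers` and `chunk` while measuring runtime mirrors the tuning of scheduling parameters discussed in the study; the result is independent of the chosen values.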