948 results for Random Coefficient Autoregressive Model (RCAR(1))
Abstract:
Meta-analysis of predictive values is usually discouraged because these values are directly affected by disease prevalence, but sensitivity and specificity sometimes show substantial heterogeneity as well. We propose a bivariate random-effects logitnormal model for the meta-analysis of the positive predictive value (PPV) and negative predictive value (NPV) of diagnostic tests.
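The prevalence dependence the authors mention follows directly from Bayes' theorem. As a minimal illustration (the standard textbook relations, not the paper's bivariate random-effects logitnormal model), predictive values can be computed from sensitivity, specificity, and prevalence as:

```python
def ppv_npv(sens, spec, prev):
    """Positive and negative predictive values from sensitivity,
    specificity and disease prevalence, via Bayes' theorem."""
    ppv = sens * prev / (sens * prev + (1 - spec) * (1 - prev))
    npv = spec * (1 - prev) / (spec * (1 - prev) + (1 - sens) * prev)
    return ppv, npv

# The same test yields very different predictive values as prevalence
# changes, which is why naive meta-analysis of PPV/NPV across settings
# with different prevalences is usually discouraged.
balanced = ppv_npv(0.9, 0.9, 0.5)   # balanced population
rare = ppv_npv(0.9, 0.9, 0.01)      # rare disease: PPV collapses
```
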
Abstract:
BACKGROUND: Bone morphogenetic protein (BMP) is a potent differentiating agent for cells of the osteoblastic lineage. It has been used in the oral cavity under a variety of indications and with different carriers. However, the optimal carrier for each indication is not known. This study examined a synthetic bioabsorbable carrier for BMP used in osseous defects around dental implants in the canine mandible. METHODS: Twelve dogs had their four mandibular premolars and first molars extracted bilaterally. After 5 months, four implants were placed with standardized circumferential defects around the coronal 4 mm of each implant. One-half of the defects received a polylactide/glycolide (PLGA) polymer carrier with or without recombinant human BMP-2 (rhBMP-2), and the other half received a collagen carrier with or without rhBMP-2. Additionally, one-half of the implants were covered with a non-resorbable (expanded polytetrafluoroethylene [ePTFE]) membrane to exclude soft tissues. Animals were sacrificed either 4 or 12 weeks later. Histomorphometric analysis included the percentage of new bone contact with the implant, the area of new bone, and the percentage of defect fill. This article describes results with the PLGA carrier. RESULTS: All implants demonstrated clinical and radiographic success, with the amount of new bone formed dependent on healing time, the presence or absence of rhBMP-2, and the presence or absence of a membrane. The percentage of bone-to-implant contact was greater with rhBMP-2; after 12 weeks of healing, approximately one-third of the implant surface in the defect site was in contact with bone. After 4 weeks, the presence of a membrane appeared to slow new bone area formation. In membrane-treated sites with rhBMP-2, the percentage of fill rose from 24% at 4 weeks to 42% at 12 weeks; without rhBMP-2, it rose from 14% to 36% over the same intervals.
CONCLUSIONS: After 4 weeks, the rhBMP-2-treated sites had a significantly higher percentage of contact, more new bone area, and a higher percentage of defect fill than the sites without rhBMP-2. After 12 weeks, there was no significant difference between sites with or without rhBMP-2 in percentage of contact, new bone area, or percentage of defect fill. For these three outcomes, when the results with this carrier were compared to those reported earlier in this study with a collagen carrier, only the area of new bone differed significantly, with the collagen carrier producing more bone than the PLGA carrier. Thus, the PLGA carrier for rhBMP-2 significantly stimulated bone formation around dental implants in this model after 1 month but not after 3 months of healing. This growth factor and carrier combination appears to stimulate early bone healing events around the implants, though not quite to the same degree as a collagen carrier.
Abstract:
Objective: Impaired cognition is an important dimension in psychosis and its at-risk states. Research on the value of impaired cognition for psychosis prediction in at-risk samples, however, mainly relies on study-specific sample means of neurocognitive tests, which, unlike widely available general test norms, are difficult to translate into clinical practice. The aim of this study was to explore the combined predictive value of at-risk criteria and neurocognitive deficits according to test norms with a risk stratification approach. Method: Potential predictors of psychosis (neurocognitive deficits and at-risk criteria) over 24 months were investigated in 97 at-risk patients. Results: The final prediction model included (1) at-risk criteria (attenuated psychotic symptoms plus subjective cognitive disturbances) and (2) a processing speed deficit (digit symbol test). The model was stratified into 4 risk classes with hazard rates between 0.0 (both predictors absent) and 1.29 (both predictors present). Conclusions: The combination of a processing speed deficit and at-risk criteria provides an optimized stratified risk assessment. Because it is based on neurocognitive test norms, the validity of our proposed 3 risk classes could easily be examined in independent at-risk samples and, pending positive validation results, our approach could readily be applied in clinical practice.
Abstract:
Previous studies have either exclusively used annual tree-ring data or have combined tree-ring series with other, lower temporal resolution proxy series. Both approaches can lead to significant uncertainties, as tree-rings may underestimate the amplitude of past temperature variations, and the validity of non-annual records cannot be clearly assessed. In this study, we assembled 45 published Northern Hemisphere (NH) temperature proxy records covering the past millennium, each of which satisfied 3 essential criteria: the series must be of annual resolution, span at least a thousand years, and represent an explicit temperature signal. Suitable climate archives included ice cores, varved lake sediments, tree-rings and speleothems. We reconstructed the average annual land temperature series for the NH over the last millennium by applying 3 different reconstruction techniques: (1) principal components (PC) plus second-order autoregressive model (AR2), (2) composite plus scale (CPS) and (3) regularized errors-in-variables approach (EIV). Our reconstruction is in excellent agreement with 6 climate model simulations, including the first 5 models derived from the fifth phase of the Coupled Model Intercomparison Project (CMIP5) and an earth system model of intermediate complexity (LOVECLIM), showing similar temperatures at multi-decadal timescales; however, all simulations appear to underestimate the temperature during the Medieval Warm Period (MWP). A comparison with other NH reconstructions shows that our results are consistent with earlier studies. These results indicate that well-validated annual proxy series should be used to minimize proxy-based artifacts, and that these proxy series contain sufficient information to reconstruct the low-frequency climate variability over the past millennium.
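Of the three techniques listed, composite-plus-scale (CPS) is the simplest to illustrate. The sketch below is a generic textbook version under simplifying assumptions (equal proxy weights, one calibration window), not the authors' exact implementation:

```python
import numpy as np

def composite_plus_scale(proxies, target, calib):
    """Composite-plus-scale (CPS) reconstruction: standardize each proxy,
    average into a composite, then rescale the composite so its mean and
    variance match the instrumental target over the calibration window.

    proxies: (n_series, n_years) array of annual proxy values
    target:  (n_years,) instrumental series (NaN outside the calibration era)
    calib:   (n_years,) boolean mask of calibration years
    """
    z = (proxies - proxies.mean(axis=1, keepdims=True)) \
        / proxies.std(axis=1, keepdims=True)
    comp = z.mean(axis=0)                          # unweighted composite
    scale = target[calib].std() / comp[calib].std()
    return (comp - comp[calib].mean()) * scale + target[calib].mean()
```

By construction the reconstruction reproduces the instrumental mean and variance over the calibration period; its skill outside that window is what verification statistics must establish.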
Stable oxygen isotope record and relative abundances of planktonic foraminifera of ODP Hole 117-728A
Abstract:
High resolution stratigraphy based on oxygen isotope ratios of the planktonic foraminifers Neogloboquadrina dutertrei (d'Orbigny), Globigerinoides ruber (d'Orbigny), and Globigerina bulloides (d'Orbigny), magnetic susceptibility, and calcium carbonate content covers the sedimentary record of ODP Hole 728A drilled on the Oman Margin from approximately 10 k.y. to 525 k.y., comprising isotopic stages 1-13. Below stage 13, isotopic stage boundaries cannot be defined with certainty in our data. Sediment accumulation rates were calculated from the isotopic record of N. dutertrei by matching it with the SPECMAP age-model curve. During glacial periods sediment accumulation rates were higher than during interglacial periods, reflecting increased input from the shelf during low stands of sea level and increased eolian input. Periodograms for the past 524 k.y. of the oxygen isotope records of N. dutertrei, G. ruber, and G. bulloides, of calcium carbonate content, magnetic susceptibility, and of a foraminiferal fragmentation record show powers matching the Milankovitch periodicities. High powers are concentrated around 103 k.y. In the spectra of the oxygen isotope ratios of N. dutertrei, magnetic susceptibility, and foraminiferal fragmentation, these are significant at the 80% confidence level with respect to a first-order autoregressive model. Power concentrations near 43 k.y., matching obliquity, are present but subdued in all spectra. Power concentrations near 23 k.y., matching precession, are significant in the spectra of the oxygen isotope record of N. dutertrei, magnetic susceptibility, and the calcium carbonate content record. Fragmentation of planktonic foraminifers increased during interglacial periods.
This is attributed to dissolution of the tests in an expanded oxygen minimum zone (OMZ), where undersaturation of calcium carbonate is caused by enhanced production in the euphotic zone, suggesting stronger monsoon-induced upwelling during interglacial periods. Extension of the OMZ could also be increased by outflow of low-oxygen marginal basin bottom water.
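The significance testing described above compares spectral peaks against a first-order autoregressive (red-noise) continuum. A minimal sketch, assuming a unit sampling interval and omitting the windowing and confidence-band details a real analysis would add:

```python
import numpy as np

def periodogram(x):
    """Raw periodogram of a (mean-removed) series; frequencies are in
    cycles per sample. The zero frequency is dropped."""
    x = np.asarray(x, float) - np.mean(x)
    n = len(x)
    power = np.abs(np.fft.rfft(x)) ** 2 / n
    freqs = np.fft.rfftfreq(n)
    return freqs[1:], power[1:]

def ar1_spectrum(freqs, r, var):
    """Theoretical power spectrum of a first-order autoregressive
    (red-noise) process with lag-1 autocorrelation r and variance var;
    this is the null continuum against which peaks are judged."""
    return var * (1 - r ** 2) / (1 - 2 * r * np.cos(2 * np.pi * freqs) + r ** 2)
```

In practice r is estimated from the record's lag-1 autocorrelation, and a peak is called significant when its power exceeds the scaled AR(1) continuum at the chosen confidence level (80% in the abstract above).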
Abstract:
Diabetes mellitus is a group of metabolic diseases characterized by high blood glucose, caused by absent or insufficient insulin production, or by resistance of the body to insulin. Insulin is a hormone that helps glucose (for example, that obtained from food) reach peripheral tissues and the nervous system to supply energy. Current technology makes it possible to develop the so-called "artificial endocrine pancreas," which consists of a continuous subcutaneous glucose sensor, a subcutaneous insulin infusion pump, and a closed-loop control algorithm that computes the insulin dose required by the patient at every moment, based on the glucose measurement from the sensor and on a set of targets. The main problem of closed-loop systems is delays: the subcutaneous sensor measures glucose in the interstitial fluid, which reflects blood glucose some time earlier, so a change in blood glucose (after a meal, for example) takes time to be detected by the sensor. In addition, an insulin dose administered to the patient takes approximately 20-30 minutes to reach the bloodstream. To work around these delays as far as possible, the aim is to predict the glucose level in the near future using a subcutaneous glucose predictor fed with the available glucose and insulin information. The goal of this project is to design a methodology to estimate the future value of glucose levels obtained from a subcutaneous sensor, based on recursive identification of the glucoregulatory system through linear models, determining an optimal prediction horizon and analyzing the influence of insulin on the prediction results.
A parametric predictor based on an ARX autoregressive model was implemented; it predicts with better precision and lower RMSE than a zero-order-hold (ZOH) predictor at a prediction horizon of thirty minutes. Using insulin information has no effect on the prediction. Preprocessing, postprocessing, and the handling of stability all have a very beneficial effect on the prediction.
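The ARX predictor compared against the zero-order hold can be sketched as follows. The model orders and the one-shot batch least-squares fit are illustrative assumptions; the thesis identifies the model recursively:

```python
import numpy as np

def fit_arx(y, u, na=2, nb=1):
    """Least-squares fit of an ARX model
    y[t] = a1*y[t-1] + ... + a_na*y[t-na] + b1*u[t-1] + ... + b_nb*u[t-nb],
    where y is glucose and u the insulin input. Orders na, nb are
    hypothetical; returns the stacked coefficient vector theta."""
    k = max(na, nb)
    rows = [np.concatenate([y[t - na:t][::-1], u[t - nb:t][::-1]])
            for t in range(k, len(y))]
    theta, *_ = np.linalg.lstsq(np.array(rows), y[k:], rcond=None)
    return theta

def predict_arx(theta, y_hist, u_hist, na=2, nb=1, horizon=6):
    """Iterate the fitted model `horizon` steps ahead (e.g. 6 samples of
    5 minutes = a 30-minute horizon), holding the input at its last value."""
    y_hist, u_hist = list(y_hist), list(u_hist)
    for _ in range(horizon):
        reg = np.concatenate([y_hist[-na:][::-1], u_hist[-nb:][::-1]])
        y_hist.append(float(reg @ theta))
        u_hist.append(u_hist[-1])
    return y_hist[-1]
```

The ZOH baseline simply outputs the last observed glucose value (`y_hist[-1]`), so any useful dynamic model should beat it at a 30-minute horizon, which is what the thesis reports for the ARX predictor.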
Abstract:
This work addresses the problem of modeling real dynamical systems from the study of their time series, using a standard formulation intended as a universal abstraction of dynamical systems, irrespective of their deterministic, stochastic, or hybrid nature. Deterministic and stochastic systems are first modeled separately, and the two approaches then converge in a hybrid model for generic mixed systems, that is, systems exhibiting a combination of deterministic and random behavior. The model has two components: a deterministic one, consisting of a difference equation obtained from an autocorrelation study, and a stochastic one that models the error made by the first. The stochastic component is a universal generator of probability distributions, based on a compound process of random variables uniformly distributed over a time-varying interval. This universal generator is derived in the thesis from a new theory of supply and demand for a generic resource. The resulting model can be formulated conceptually as an entity with three fundamental elements: an engine generating deterministic dynamics, an internal noise source generating uncertainty, and an exposure to the environment representing the real system's interactions with the outside world. In applications, these three elements are fitted to the history of the dynamical system's time series. Once its components have been fitted, the model behaves adaptively, taking the system's new time series values as inputs and computing predictions of its future behavior. Each prediction is given as an interval within which every value is equiprobable, while any value outside the interval has zero probability.
In this way the model computes future behavior, and its level of uncertainty, from the current state of the system. The model has been applied in this thesis to very different systems and has proved very flexible for studying fields of disparate nature. The exchange of telephone traffic between telephony operators, the evolution of financial markets, and the flow of information between Internet servers are studied in depth. All of these systems are modeled successfully in the same language, despite being physically very different. The study of telephony networks shows that telephone traffic patterns exhibit a strong weekly pseudo-periodicity contaminated with a large amount of noise, especially for international calls. The study of financial markets shows that their fundamental nature is random, with a relatively bounded range of behavior. Part of the thesis is devoted to explaining some of the most important empirical observations in financial markets, such as "fat tails", "power laws", and "volatility clustering". Finally, it is shown that communication between Internet servers has, like financial markets, a fully stochastic underlying component, but one with fairly "docile" behavior, this docility becoming more marked as the distance between servers increases. Two aspects of the model stand out: its adaptability and its universality. The first is due to the fact that, once the general parameters are fitted, the model "feeds" on the system's observable values and uses them to compute future behavior. Despite having fixed parameters, the variability of the observables that serve as inputs leads to a great richness of possible outputs.
The second aspect stems from the generic formulation of the hybrid model and from the fact that its parameters are fitted from external manifestations of the system under study rather than from its physical characteristics. These factors make the model applicable to a wide variety of fields. Finally, the thesis proposes other fields in which very promising preliminary results have been obtained, such as financial risk modeling, routing algorithms for telecommunication networks, and climate change.
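The two-component idea, a deterministic difference equation plus a uniform noise band whose width varies in time, can be sketched minimally. The AR(1) engine and the max-residual band width below are illustrative assumptions, not the thesis's actual generator:

```python
import numpy as np

def hybrid_interval_forecast(x, window=50):
    """One-step interval forecast combining (1) a deterministic engine,
    here an AR(1)-type difference equation fitted by least squares from
    the series' autocorrelation structure, and (2) a uniform noise band
    whose half-width is taken from the engine's recent residuals.
    Returns (lo, hi): every value inside is treated as equiprobable,
    every value outside as having zero probability."""
    x = np.asarray(x, float)
    x0, x1 = x[:-1], x[1:]
    a = np.cov(x0, x1, bias=True)[0, 1] / np.var(x0)   # AR(1) coefficient
    b = x1.mean() - a * x0.mean()                      # intercept
    resid = x1 - (a * x0 + b)                          # engine's errors
    w = np.abs(resid[-window:]).max()                  # uniform half-width
    center = a * x[-1] + b
    return center - w, center + w
```

As in the thesis's formulation, the interval collapses when the deterministic engine explains the data well and widens when recent residuals, the system's "exposure to the environment", grow.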
Abstract:
• Premise of the study: The presence of compatible fungi is necessary for epiphytic orchid recruitment. Thus, identifying associated mycorrhizal fungi at the population level is essential for orchid conservation. Recruitment patterns may also be conditioned by factors such as seed dispersal range and specific environmental characteristics. • Methods: In a forest plot, all trees with a diameter at breast height >1 cm and all individuals of the epiphytic orchid Epidendrum rhopalostele were identified and mapped. Additionally, one flowering individual of E. rhopalostele per host tree was randomly selected for root sampling and DNA extraction. • Key results: A total of 239 E. rhopalostele individuals were located in 25 of the 714 potential host trees. Light microscopy of sampled roots showed mycorrhizal fungi in 22 of the 25 sampled orchids. Phylogenetic analysis of ITS1-5.8S-ITS2 sequences yielded two Tulasnella clades. In four cases, plants were found to be associated with both clades. The difference between univariate and bivariate K functions was consistent with the random labeling null model at all spatial scales, indicating that trees hosting clades A and B of Tulasnella are not spatially segregated. Analysis of the inhomogeneous K function showed that host trees are not clustered, suggesting no limitations to population-scale dispersal. χ² analysis of contingency tables showed that E. rhopalostele is more frequent on dead trees than expected. • Conclusions: Epidendrum rhopalostele establishes mycorrhizal associations with at least two different Tulasnella species. The distribution patterns of this orchid suggest a microsite preference for dead trees and no seed dispersal limitation.
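The contingency-table test in the key results can be illustrated with a hand-rolled Pearson χ² statistic for a 2×2 table; the counts below are hypothetical, not the study's data:

```python
def chi2_2x2(a, b, c, d):
    """Pearson chi-square statistic (no continuity correction) for the
    2x2 contingency table [[a, b], [c, d]], e.g. orchid-occupied vs.
    empty trees cross-classified as dead vs. live."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Hypothetical counts: (occupied, empty) on dead trees vs. live trees.
stat = chi2_2x2(10, 15, 15, 674)
# Compare stat against the chi-square distribution with 1 degree of
# freedom to judge whether occupancy is independent of tree status.
```
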
Abstract:
With only two different cell types, the haploid green alga Volvox represents the simplest multicellular model system. To facilitate genetic investigations in this organism, the occurrence of homologous recombination events was investigated with the intent of developing methods for gene replacement and gene disruption. First, homologous recombination between two plasmids was demonstrated by using overlapping nonfunctional fragments of a recombinant arylsulfatase gene (tubulin promoter/arylsulfatase gene). After bombardment of Volvox reproductive cells with DNA-coated gold microprojectiles, transformants expressing arylsulfatase constitutively were recovered, indicating the presence of the machinery for homologous recombination in Volvox. Second, a well-characterized loss-of-function mutation in the nuclear nitrate reductase gene (nitA), with a single G → A nucleotide exchange in a 5′-splice site, was chosen as a target for gene replacement. Gene replacement by homologous recombination was observed at a reasonably high frequency only if the replacement vector, which contained parts of the functional nitrate reductase gene, differed by only a few nucleotide exchanges. The ratio of homologous to random integration events ranged between 1:10 and 1:50, i.e., homologous recombination occurs frequently enough in Volvox to apply the powerful tool of gene disruption for functional studies of novel genes.
Abstract:
The aim of this work is to propose a sequential, hierarchical decision process followed by vacation tourists in four stages: (1) whether or not to go on vacation; (2) choice of a domestic vs. international trip; (3) choice of particular geographic areas; and (4) choice of trip type, multi-destination or fixed-destination, within those areas. This analysis makes it possible to examine the successive phases a tourist goes through before selecting a given trip type in a specific geographic area, and to observe the factors that influence each stage. The empirical application uses a sample of 3,781 individuals and estimates a Random Coefficients Logit Model by Bayesian procedures. The results reveal the nested, non-independent character of these decisions, confirming the proposed sequential, hierarchical process.
Abstract:
The tourist destination choice literature has paid considerable attention to the direct impact of the "destination price" attribute but has not reached a consensus on it. Our work instead takes as its starting point the relationship between tourist motivations and the benefits a tourist seeks in a destination, leading us to propose that the effect of price on destination choice is moderated by the tourist's motivations. To this end, we argue several research hypotheses that explain this decision through the interaction between the price attribute and individuals' personal motivations. The methodology estimates Random Coefficients Logit Models, which control for possible correlations among destinations and capture tourist heterogeneity. The empirical application, carried out in Spain on a sample of 2,127 individuals, shows that motivations moderate the effect of prices on the choice of within-country tourist destinations.
Abstract:
The aim of this work is to propose and test a nested, hierarchical decision process followed by vacation tourists in four stages: (1) whether or not to go on vacation; (2) choice of a domestic vs. international trip; (3) choice of particular geographic areas; and (4) choice of trip type, multi-destination or fixed-destination, within those areas. This analysis makes it possible to examine the successive phases a tourist goes through before selecting a given trip type in a specific geographic area, and to observe the factors that influence each stage. The empirical application uses a sample of 3,781 individuals and estimates a Random Coefficients Logit Model by Bayesian procedures. The results reveal the nested, non-independent character of these decisions, confirming the proposed nested, hierarchical process.
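The random-coefficients (mixed) logit underlying these studies can be sketched by simulation. This single-attribute version with a normal taste coefficient and classical simulated probabilities is a simplification of the richer models the papers estimate by Bayesian procedures:

```python
import numpy as np

def mixed_logit_probs(X, mu, sigma, n_draws=2000, seed=0):
    """Simulated choice probabilities for a random-coefficients (mixed)
    logit with one attribute: the taste coefficient beta is drawn from
    N(mu, sigma), and plain logit probabilities are averaged over draws.
    X: (n_alternatives,) attribute values (e.g. destination price)."""
    rng = np.random.default_rng(seed)
    beta = rng.normal(mu, sigma, n_draws)        # (n_draws,)
    util = np.outer(beta, X)                     # (n_draws, n_alternatives)
    p = np.exp(util)
    p /= p.sum(axis=1, keepdims=True)            # logit per draw
    return p.mean(axis=0)                        # average over heterogeneity
```

With sigma = 0 this collapses to the plain multinomial logit; a positive sigma captures the tourist heterogeneity (and the induced correlation across alternatives) that motivates the random-coefficients specification.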
Abstract:
This paper explores the effects of non-standard monetary policies on international yield relationships. Based on a descriptive analysis of international long-term yields, we find evidence that long-term rates followed a global downward trend prior to as well as during the financial crisis. Comparing interest rate developments in the US and the eurozone, it is difficult to detect a distinct impact of the first round of the Fed's quantitative easing programme (QE1) on US interest rates for which the global environment – the global downward trend in interest rates – does not account. Motivated by these findings, we analyse the impact of the Fed's QE1 programme on the stability of the US-euro long-term interest rate relationship by using a CVAR (cointegrated vector autoregressive) model and, in particular, recursive estimation methods. Using data gathered between 2002 and 2014, we find limited evidence that QE1 caused a break-up of, or destabilised, the transatlantic interest rate relationship. Taking global interest rate developments into account, we thus find no significant evidence that QE had any independent, distinct impact on US interest rates.
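The recursive-estimation idea can be caricatured with a rolling OLS slope between the two yield series. This is a deliberately simplified stand-in for the paper's CVAR analysis, which tests cointegration rank and parameter constancy rather than a bivariate slope:

```python
import numpy as np

def rolling_slope(y_us, y_eu, window=250):
    """Rolling OLS slope of US on euro-area long-term yields over a
    moving window (e.g. ~250 trading days). Instability of this slope
    over time is the kind of symptom recursive CVAR estimation is used
    to detect in the transatlantic interest rate relationship."""
    betas = []
    for end in range(window, len(y_us) + 1):
        x = y_eu[end - window:end]
        y = y_us[end - window:end]
        betas.append(np.cov(x, y, bias=True)[0, 1] / np.var(x))
    return np.array(betas)
```

A roughly constant slope across the QE1 episode would be consistent with the paper's finding of limited evidence for a break-up of the relationship.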
Abstract:
This paper uses the opening of the US textile/apparel market to China at the end of the Multifibre Arrangement in 2005 as a natural experiment to provide evidence for positive assortative matching of Mexican exporting firms and US importing firms by their capability. We identify three findings for liberalized products by comparing them to other textile/apparel products: (1) US importers switched their Mexican partners to those making greater preshock exports, whereas Mexican exporters switched their US partners to those making fewer preshock imports; (2) for firms that switched partners, the trade volumes of the old and new partners are positively correlated; (3) small Mexican exporters stopped exporting. We develop a model combining Becker-type matching of final producers and suppliers with the standard Melitz-type model to show that these findings are consistent with positive assortative matching but not with negative assortative matching or purely random matching. The model indicates that the findings are evidence for a new mechanism of gains from trade.