998 results for IV Estimation
Abstract:
This paper considers the general problem of Feasible Generalized Least Squares Instrumental Variables (FGLS IV) estimation using optimal instruments. First we summarize the sufficient conditions for the FGLS IV estimator to be asymptotically equivalent to an optimal GLS IV estimator. Then we specialize to stationary dynamic systems with stationary VAR errors, and use the sufficient conditions to derive new moment conditions for these models. These moment conditions produce useful IVs from the lagged endogenous variables, despite the correlation between errors and endogenous variables. This use of the information contained in the lagged endogenous variables expands the class of IV estimators under consideration and thereby potentially improves both asymptotic and small-sample efficiency of the optimal IV estimator in the class. Some Monte Carlo experiments compare the new methods with those of Hatanaka [1976]. For the DGP used in the Monte Carlo experiments, asymptotic efficiency is strictly improved by the new IVs, and experimental small-sample efficiency is improved as well.
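For intuition, the core IV mechanics behind this class of estimators can be sketched in the just-identified, single-regressor case, where the IV slope is the ratio of the instrument-outcome covariance to the instrument-regressor covariance. This toy sketch, with made-up data, is not the paper's FGLS IV estimator with optimal instruments; it only illustrates the basic idea.

```python
# Minimal sketch of the just-identified IV estimator: beta = cov(z, y) / cov(z, x).
# The data below are invented for illustration; this is not the paper's
# FGLS IV procedure with optimal instruments.

def iv_estimate(y, x, z):
    """Simple IV slope estimate using instrument z for regressor x."""
    n = len(y)
    mz, mx, my = sum(z) / n, sum(x) / n, sum(y) / n
    cov_zy = sum((zi - mz) * (yi - my) for zi, yi in zip(z, y)) / n
    cov_zx = sum((zi - mz) * (xi - mx) for zi, xi in zip(z, x)) / n
    return cov_zy / cov_zx

z = [1.0, 2.0, 3.0, 4.0, 5.0]          # instrument
x = [2.1, 3.9, 6.2, 7.8, 10.1]         # endogenous regressor, roughly 2*z
y = [5.0, 9.1, 13.0, 16.8, 21.2]       # outcome, roughly 1 + 2*x
beta = iv_estimate(y, x, z)            # close to the true slope of 2
```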
Abstract:
The presence of a large informal sector in developing economies poses the question of whether informal activity produces agglomeration externalities. This paper uses data on all the nonfarm establishments and enterprises in Cambodia to estimate the impact of informal agglomeration on the regional economic performance of formal and informal firms. We develop a Bayesian approach for a spatial autoregressive model with an endogenous explanatory variable to address endogeneity and spatial dependence. We find a significantly positive effect of informal agglomeration, where informal firms gain more strongly than formal firms. Calculating the spatial marginal effects of increased agglomeration, we demonstrate that more accessible regions are more likely than less accessible regions to benefit strongly from informal agglomeration.
Abstract:
Estimating glomerular filtration in the elderly, while accounting for the added difficulty of assessing their muscle mass, is challenging and particularly important for drug prescription. The plasma creatinine level depends both on the renal and extra-renal elimination fractions and on muscle mass. Currently, various formulas based mainly on the creatinine value are used to estimate glomerular filtration. Nevertheless, because of the fraction eliminated through the tubular and intestinal routes, creatinine clearance generally overestimates the glomerular filtration rate (GFR). The aim of this study is to verify the reliability of certain renal function markers and algorithms currently in use, and to assess the additional benefit of taking into account muscle mass measured by bioimpedance, in an elderly population (>70 years) with chronically impaired renal function based on MDRD eGFR (CKD stages III-IV). In this study we compare five equations developed to estimate renal function, based respectively on serum creatinine (Cockcroft and MDRD), cystatin C (Larsson), creatinine combined with beta-trace protein (White), and creatinine adjusted for muscle mass obtained by bioimpedance analysis (MacDonald). Bioimpedance is a widely used method for estimating body composition based on the passive electrical properties and the geometry of biological tissues. It allows the relative volumes of different tissues or fluids in the body to be estimated, such as total body water, muscle (lean) mass, and body fat mass.
In an elderly internal medicine population, using single-shot inulin clearance as the gold standard, we evaluated the algorithms of Cockcroft (GFR CKC), MDRD, Larsson (cystatin C, GFR CYS), White (beta-trace protein, GFR BTP) and MacDonald (GFR ALM, muscle mass by bioimpedance). The results showed that GFR (mean ± SD) measured with inulin and calculated with the algorithms was, respectively: 34.9±20 ml/min for inulin, 46.7±18.5 ml/min for CKC, 47.2±23 ml/min for CYS, 54.4±18.2 ml/min for BTP, 49±15.9 ml/min for MDRD and 32.9±27.2 ml/min for ALM. The ROC curves comparing sensitivity and specificity, with area under the curve (AUC) and 95% confidence interval, gave respectively: CKC 0.68 (0.55-0.81), MDRD 0.76 (0.64-0.87), cystatin C 0.82 (0.72-0.92), BTP 0.75 (0.63-0.87), ALM 0.65 (0.52-0.78). In conclusion, the algorithms compared in this study overestimate GFR in this elderly, hospitalized population with multiple comorbidities and CKD stages III-IV. Using bioelectrical impedance to reduce the error of creatinine-based GFR estimation made no significant contribution; on the contrary, it performed worse than the other equations. Indeed, in this study 75% of patients changed CKD class with MacDonald (creatinine and muscle mass), versus 49% with CYS (cystatin C), 56% with MDRD, 52% with Cockcroft and 65% with BTP. The best results were obtained with Larsson (CYS C) and the Cockcroft formula.
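Of the five equations compared, the Cockcroft-Gault formula is the simplest to state. A sketch of the standard form (age in years, weight in kg, serum creatinine in mg/dL; the 0.85 factor applies to women) is:

```python
# The Cockcroft-Gault creatinine clearance estimate, one of the equations
# compared in the study (age in years, weight in kg, serum creatinine in
# mg/dL; multiply by 0.85 for women). Standard textbook form, shown here
# only to make the creatinine-based estimation concrete.

def cockcroft_gault(age, weight_kg, serum_creatinine, female=False):
    crcl = (140 - age) * weight_kg / (72 * serum_creatinine)
    return 0.85 * crcl if female else crcl

crcl = cockcroft_gault(age=75, weight_kg=70, serum_creatinine=1.5)
# about 42 ml/min for this example patient
```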
Abstract:
Therapeutic drug monitoring is recommended for dose adjustment of immunosuppressive agents. A growing number of studies support the relevance of using the area under the curve (AUC) as a biomarker for therapeutic monitoring of cyclosporine (CsA) in hematopoietic stem cell transplantation. However, for reasons intrinsic to the way the AUC is calculated, its use in the clinical setting is impractical. Limited sampling strategies based on regression approaches (R-LSS) or Bayesian approaches (B-LSS) are practical alternatives for satisfactory estimation of the AUC. For these methodologies to be applied effectively, however, their design must accommodate clinical reality, notably by requiring a minimal number of concentrations spread over a short sampling period. Moreover, particular attention should be paid to their adequate development and validation. It is also important to note that irregularity in the timing of blood sample collection can have a non-negligible impact on the predictive performance of R-LSS; to date, this impact had not been studied. This doctoral thesis addresses these issues in order to allow precise and practical estimation of the AUC. The studies were carried out in the context of CsA use in pediatric patients who had undergone hematopoietic stem cell transplantation. First, multiple regression and population pharmacokinetic (Pop-PK) approaches were used constructively to develop and adequately validate LSS. Then, several Pop-PK models were evaluated, keeping in mind their intended use in the context of AUC estimation.
The performance of B-LSS targeting different versions of the AUC was also studied. Finally, the impact of deviations between actual blood sampling times and the planned nominal times on the predictive performance of R-LSS was quantified using a simulation approach covering diverse, realistic scenarios of potential errors in the blood sampling schedule. This work first led to the development of R-LSS and B-LSS with satisfactory clinical performance that are also practical, as they involve 4 or fewer sampling points obtained within 4 hours post-dose. Once the Pop-PK analysis was performed, a two-compartment structural model with a lag time was retained. However, the final model, notably with covariates, did not improve B-LSS performance compared with the structural models (without covariates). Furthermore, we showed that B-LSS perform better for the AUC derived from simulated concentrations that exclude residual errors, which we termed the "underlying AUC", than for the observed AUC calculated directly from measured concentrations. Finally, our results showed that irregularity in blood sampling times has an important impact on the predictive performance of R-LSS; this impact depends on the number of samples required, and even more on the duration of the sampling process involved. We also showed that sampling-time errors made at moments when the concentration changes rapidly are those that most affect the predictive power of R-LSS. More interestingly, we highlighted that even though different R-LSS may perform similarly when based on nominal times, their tolerance to sampling-time errors can differ widely.
Indeed, adequate consideration of the impact of these errors can lead to more reliable selection and use of R-LSS. Through an in-depth investigation of the various aspects underlying limited sampling strategies, this thesis provides notable methodological improvements and proposes new ways to ensure their reliable and informed use, while promoting their fit with clinical practice.
Abstract:
Urbanization refers to the process in which an increasing proportion of a population lives in cities and suburbs. Urbanization fuels the alteration of the land use/land cover pattern of a region, including an increase in built-up area, leading to imperviousness of the ground surface. With increasing urbanization and population pressures, the impervious areas in cities are growing fast. An impervious surface is an anthropogenically modified surface that prevents water from infiltrating into the soil. Surface imperviousness mapping is important for studies related to water cycling, water quality, soil erosion, flood water drainage, non-point source pollution, the urban heat island effect and urban hydrology. The present study estimates the Total Impervious Area (TIA) of the city of Kochi using a high resolution satellite image (LISS IV, 5 m resolution). Additionally, the study maps the Effective Impervious Area (EIA) by coupling the capabilities of GIS and remote sensing. A land use/land cover map of the study area was prepared from the LISS IV image acquired for the year 2012. The classes were merged to prepare a map showing pervious and impervious areas. Supervised Maximum Likelihood Classification (supervised MLC), a simple but accurate method for image classification, was used in calculating TIA, and an overall classification accuracy of 86.33% was obtained. Water bodies are 100% pervious, whereas urban built-up areas are 100% impervious. Further, based on the percentage of imperviousness, the Total Impervious Area is categorized into various classes.
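The class-merging step behind a TIA figure can be sketched as a simple lookup: each land use/land cover class is tagged pervious or impervious, and TIA is the impervious share of the classified area. The class names and pixel counts below are invented for illustration; real classes would come from the supervised MLC of the LISS IV image.

```python
# Toy sketch of the pervious/impervious merge behind a TIA estimate.
# Class names and pixel counts are invented.

IMPERVIOUS_CLASSES = {"built-up", "roads"}

def tia_percent(pixel_counts):
    """Total Impervious Area as a percentage of the classified area."""
    impervious = sum(n for cls, n in pixel_counts.items()
                     if cls in IMPERVIOUS_CLASSES)
    return 100.0 * impervious / sum(pixel_counts.values())

tia = tia_percent({"built-up": 300, "roads": 100,
                   "water": 200, "vegetation": 400})   # -> 40.0
```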
Abstract:
The research activity characterizing the present thesis was mainly centered on the design, development and validation of methodologies for the estimation of stationary and time-varying connectivity between different regions of the human brain during specific complex cognitive tasks. This activity involved two main aspects: i) the development of a stable, consistent and reproducible procedure for functional connectivity estimation with a high impact on the neuroscience field, and ii) its application to real data from healthy volunteers eliciting specific cognitive processes (attention and memory). In particular, the methodological issues addressed in the present thesis consisted in finding an approach, applicable in the neuroscience field, able to: i) include all the cerebral sources in the connectivity estimation process; ii) accurately describe the temporal evolution of connectivity networks; iii) assess the significance of connectivity patterns; and iv) consistently describe relevant properties of brain networks. The advancements provided in this thesis allowed the identification of quantifiable descriptors of cognitive processes during a high-resolution EEG experiment involving subjects performing complex cognitive tasks.
Abstract:
BACKGROUND The role of surgery for patients with metastatic esophagogastric adenocarcinoma (EGC) is not defined. The purpose of this study was to define selection criteria for patients who may benefit from resection following systemic chemotherapy. METHODS From 1987 to 2007, 160 patients presenting with synchronous metastatic EGC (cT3/4 cNany cM0/1, finally pM1) were treated with chemotherapy followed by resection of the primary tumor and metastases. Clinical and histopathological data and the site and number of metastases were analyzed. A prognostic score was established and validated in a second cohort from another academic center (n = 32). RESULTS The median survival (MS) in cohort 1 was 13.6 months. Significant prognostic factors were grading (p = 0.046), ypT- (p = 0.001), ypN- (p = 0.011) and R-category (p = 0.015), lymphangiosis (p = 0.021), and clinical (p = 0.004) and histopathological response (p = 0.006), but not the localization or number of metastases. The sum of grading (G1/2: 0 points; G3/4: 1 point), clinical response (responder: 0; nonresponder: 1) and R-category (complete: 0; R1: 1; R2: 2) defines two groups of patients with significantly different survival (p = 0.001): a low risk group (score 0/1, n = 22: MS 35.3 months, 3-year survival 47.6%) and a high risk group (score 2/3/4, n = 126: MS 12.0 months, 3-year survival 14.2%). The score showed a strong trend in the validation cohort (p = 0.063): low risk group MS not reached, 3-year survival 57.1%; high risk group MS 19.9 months, 3-year survival 6.7%. CONCLUSION We observed long-term survival after resection of metastatic EGC. A simple clinical score may help to identify a subgroup of patients with a high chance of benefiting from resection. However, the accurate estimation of achieving a complete resection, which is an integral element of the score, remains challenging.
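The three-component score can be written down directly from the point values given in the abstract; the function and variable names below are ours, not the authors'.

```python
# Sketch of the prognostic score from the abstract: grading (G1/2: 0,
# G3/4: 1) + clinical response (responder: 0, nonresponder: 1) +
# R-category (complete/R0: 0, R1: 1, R2: 2). Names are ours.

def egc_score(grading, responder, r_category):
    score = 0 if grading in (1, 2) else 1
    score += 0 if responder else 1
    score += {"R0": 0, "R1": 1, "R2": 2}[r_category]
    return score

def risk_group(score):
    # Score 0/1 -> low risk; score 2/3/4 -> high risk
    return "low" if score <= 1 else "high"

best = risk_group(egc_score(grading=2, responder=True, r_category="R0"))
worst = risk_group(egc_score(grading=4, responder=False, r_category="R2"))
```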
Abstract:
The vestibular system contributes to the control of posture and eye movements and is also involved in various cognitive functions including spatial navigation and memory. These functions are subtended by projections to a vestibular cortex, whose exact location in the human brain is still a matter of debate (Lopez and Blanke, 2011). The vestibular cortex can be defined as the network of all cortical areas receiving inputs from the vestibular system, including areas where vestibular signals influence the processing of other sensory (e.g. somatosensory and visual) and motor signals. Previous neuroimaging studies used caloric vestibular stimulation (CVS), galvanic vestibular stimulation (GVS), and auditory stimulation (clicks and short-tone bursts) to activate the vestibular receptors and localize the vestibular cortex. However, these three methods differ regarding the receptors stimulated (otoliths, semicircular canals) and the concurrent activation of the tactile, thermal, nociceptive and auditory systems. To evaluate the convergence between these methods and provide a statistical analysis of the localization of the human vestibular cortex, we performed an activation likelihood estimation (ALE) meta-analysis of neuroimaging studies using CVS, GVS, and auditory stimuli. We analyzed a total of 352 activation foci reported in 16 studies carried out in a total of 192 healthy participants. The results reveal that the main regions activated by CVS, GVS, or auditory stimuli were located in the Sylvian fissure, insula, retroinsular cortex, fronto-parietal operculum, superior temporal gyrus, and cingulate cortex. Conjunction analysis indicated that regions showing convergence between two stimulation methods were located in the median (short gyrus III) and posterior (long gyrus IV) insula, parietal operculum and retroinsular cortex (Ri). The only area of convergence between all three methods of stimulation was located in Ri. 
The data indicate that Ri, parietal operculum and posterior insula are vestibular regions where afferents converge from otoliths and semicircular canals, and may thus be involved in the processing of signals informing about body rotations, translations and tilts. Results from the meta-analysis are in agreement with electrophysiological recordings in monkeys showing main vestibular projections in the transitional zone between Ri, the insular granular field (Ig), and SII.
Abstract:
A new radiolarian-based transfer function for sea surface temperature (SST) estimation has been developed from 23 taxa and taxa groups in 53 surface sediment samples recovered between 35° and 72°S in the Atlantic sector of the Southern Ocean. Ecological information from water column studies was considered in the selection of taxa and taxa groups. The transfer function allows the estimation of austral summer SST (December-March) ranging between -1 and 18°C with a standard error of estimate of 1.2°C. SST estimates from selected late Pleistocene sequences were successfully compared with independent paleotemperature estimates derived from a diatom transfer function. This shows that radiolarians provide an excellent tool for paleotemperature reconstructions in Pleistocene sediments of the Southern Ocean.
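As a rough illustration of the error metric reported above, the standard error of estimate for a calibration is the root mean square residual adjusted for degrees of freedom. The one-taxon linear toy below is only a sketch; the actual transfer function is multivariate (23 taxa/taxa groups, Imbrie and Kipp factor-analysis method), and all numbers here are invented.

```python
# One-predictor toy calibration and its standard error of estimate (SEE).
# The abundances and temperatures below are invented; the real transfer
# function uses the Imbrie & Kipp method on 23 taxa/taxa groups.

def fit_line(x, y):
    n = len(x)
    xbar, ybar = sum(x) / n, sum(y) / n
    slope = (sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
             / sum((xi - xbar) ** 2 for xi in x))
    return ybar - slope * xbar, slope

def std_error_of_estimate(x, y):
    a, b = fit_line(x, y)
    residuals = [yi - (a + b * xi) for xi, yi in zip(x, y)]
    return (sum(r * r for r in residuals) / (len(x) - 2)) ** 0.5

abundance = [5, 10, 20, 30, 40]              # % of a warm-water taxon
summer_sst = [1.0, 4.2, 8.9, 13.1, 17.0]     # degrees C
see = std_error_of_estimate(abundance, summer_sst)
```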
Abstract:
Vols. 3-9 edited by W.A. Davis and Samuel S. Sadtler.
Abstract:
In this paper, we describe an algorithm that automatically detects and labels peaks I-VII of the normal, suprathreshold auditory brainstem response (ABR). The algorithm proceeds in three stages, with the option of a fourth: (1) all candidate peaks and troughs in the ABR waveform are identified using zero crossings of the first derivative, (2) peaks I-VII are identified from these candidate peaks based on their latency and morphology, (3) if required, peaks II and IV are identified as points of inflection using zero crossings of the second derivative and (4) interpeak troughs are identified before peak latencies and amplitudes are measured. The performance of the algorithm was estimated on a set of 240 normal ABR waveforms recorded using a stimulus intensity of 90 dBnHL. When compared to an expert audiologist, the algorithm correctly identified the major ABR peaks (I, III and V) in 96-98% of the waveforms and the minor ABR peaks (II, IV, VI and VII) in 45-83% of waveforms. Whilst peak II was correctly identified in only 83% and peak IV in 77% of waveforms, it was shown that 5% of the peak II identifications and 31% of the peak IV identifications came as a direct result of allowing these peaks to be found as points of inflection. Copyright (C) 2005 S. Karger AG, Basel.
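Stage (1) of such an algorithm can be sketched by finding sign changes of the first difference, a discrete stand-in for zero crossings of the first derivative. This is a generic sketch, not the authors' implementation, and the latency/morphology rules of stages (2)-(4) are omitted.

```python
# Generic sketch of stage (1): candidate peaks and troughs located at zero
# crossings of the (discrete) first derivative. Not the authors' code.

def candidate_peaks_troughs(w):
    """Return (peak_indices, trough_indices) for waveform samples w."""
    d = [w[i + 1] - w[i] for i in range(len(w) - 1)]   # first difference
    peaks, troughs = [], []
    for i in range(1, len(d)):
        if d[i - 1] > 0 and d[i] <= 0:      # slope turns + to -: peak
            peaks.append(i)
        elif d[i - 1] < 0 and d[i] >= 0:    # slope turns - to +: trough
            troughs.append(i)
    return peaks, troughs

peaks, troughs = candidate_peaks_troughs([0, 1, 2, 1, 0, -1, 0, 2, 1])
```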
Abstract:
Technology changes rapidly over the years, continuously providing more options for computer alternatives and making life easier for economic, interpersonal or any other transactions. However, the introduction of new technology "pushes" old Information and Communication Technology (ICT) products out of use. E-waste is defined as the quantity of ICT products no longer in use, and is a bivariate function of the quantities sold and the probability that a specific quantity of computers will be regarded as obsolete. In this paper, an e-waste generation model is presented and applied to the following regions: Western and Eastern Europe, Asia/Pacific, Japan/Australia/New Zealand, and North and South America. Furthermore, cumulative computer sales were retrieved for selected countries of these regions so as to compute obsolete computer quantities. In order to provide robust results for the forecasted quantities, a selection of forecasting models, namely (i) Bass, (ii) Gompertz, (iii) Logistic, (iv) Trend model, (v) Level model, (vi) AutoRegressive Moving Average (ARMA), and (vii) Exponential Smoothing, were applied, selecting for each country the model that provided the best results in terms of minimum error indices (Mean Absolute Error and Mean Square Error) for the in-sample estimation. As new technology does not diffuse in all regions of the world at the same speed, due to different socio-economic factors, the lifespan distribution, which provides the probability of a certain quantity of computers being considered obsolete, is not adequately modeled in the literature. The time horizon for the forecasted quantities is 2014-2030, and the results show a very sharp increase in the USA and the United Kingdom, due to decreasing computer lifespans and increasing sales.
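The "keep the model with the lowest in-sample error index" step can be sketched with two toy candidates; the actual study fits Bass, Gompertz, logistic, trend, level, ARMA and exponential-smoothing models, and the sales series below is invented.

```python
# Toy sketch of per-country model selection by minimum in-sample MAE.
# Only level and linear-trend candidates are shown; the sales data are
# invented.

def mae(actual, fitted):
    return sum(abs(a - f) for a, f in zip(actual, fitted)) / len(actual)

def level_fit(y):                      # constant-level model
    m = sum(y) / len(y)
    return [m] * len(y)

def trend_fit(y):                      # ordinary least squares on time
    n = len(y)
    t = range(n)
    tbar, ybar = sum(t) / n, sum(y) / n
    b = (sum((ti - tbar) * (yi - ybar) for ti, yi in zip(t, y))
         / sum((ti - tbar) ** 2 for ti in t))
    a = ybar - b * tbar
    return [a + b * ti for ti in t]

sales = [10, 12, 14, 16, 18]           # invented cumulative sales
errors = {"level": mae(sales, level_fit(sales)),
          "trend": mae(sales, trend_fit(sales))}
best = min(errors, key=errors.get)     # "trend" for this linear series
```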
Abstract:
Limited literature regarding parameter estimation of dynamic systems has been identified as the central reason for the absence of parametric bounds in chaotic time series. The literature shows that a chaotic system displays a sensitive dependence on initial conditions, and our study reveals that the behavior of a chaotic system is also sensitive to changes in parameter values. Therefore, parameter estimation techniques could make it possible to establish parametric bounds on a nonlinear dynamic system underlying a given time series, which in turn can improve predictability. By extracting the relationship between parametric bounds and predictability, we implemented chaos-based models for improving prediction in time series. This study describes work done to establish bounds on a set of unknown parameters. Our results reveal that by establishing parametric bounds, it is possible to improve the predictability of any time series, even when the dynamics or the mathematical model of that series is not known a priori. In our attempt to improve the predictability of various time series, we have established bounds for a set of unknown parameters. These are: (i) the embedding dimension used to unfold a set of observations in the phase space, (ii) the time delay to use for a series, (iii) the number of neighborhood points to use to avoid the detection of false neighbors, and (iv) the order of the local polynomial used to build numerical interpolation functions from one region to another. Using these bounds, we are able to obtain better predictability in chaotic time series than previously reported. In addition, the developments of this dissertation establish a theoretical framework for investigating predictability in time series from the system-dynamics point of view. In closing, our procedure significantly reduces computer resource usage, as the search method is refined and efficient.
Finally, the uniqueness of our method lies in its ability to extract the chaotic dynamics inherent in nonlinear time series by observing its values.
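The embedding dimension and time delay in (i) and (ii) parameterize the standard time-delay embedding, which can be sketched as:

```python
# Sketch of time-delay embedding: each observation is unfolded into a
# vector of dim lagged values spaced tau steps apart. This is the generic
# construction, not the dissertation's specific bound-search procedure.

def delay_embed(series, dim, tau):
    """Return vectors (x[t], x[t+tau], ..., x[t+(dim-1)*tau])."""
    span = (dim - 1) * tau
    return [tuple(series[t + k * tau] for k in range(dim))
            for t in range(len(series) - span)]

vectors = delay_embed([1, 2, 3, 4, 5, 6], dim=3, tau=2)
# -> [(1, 3, 5), (2, 4, 6)]
```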
Abstract:
The quantitative diatom analysis of 218 surface sediment samples recovered in the Atlantic and western Indian sectors of the Southern Ocean is used to define a reference database for paleotemperature estimation from diatom assemblages using the Imbrie and Kipp transfer function method. The criteria that justify the exclusion of samples and species from the raw data set in order to define a reference database are outlined and discussed. Sensitivity tests with eight data sets were carried out, evaluating the effects on the estimated paleotemperatures of the overall dominance of single species, of different methods of species abundance ranking, and of no-analog conditions (e.g., Eucampia antarctica). The defined transfer functions were applied to a sediment core from the northern Antarctic zone. Overall dominance of Fragilariopsis kerguelensis in the diatom assemblages resulted in a close affinity between the paleotemperature curve and the relative abundance pattern of this species downcore. Logarithmic conversion of the counting data, applied together with other ranking methods to compensate for the dominance of F. kerguelensis, yielded the best statistical results. A reliable diatom transfer function for future paleotemperature estimations is presented.
Abstract:
Type IV secretion systems (T4SSs) are multiprotein complexes that transport effector proteins and protein-DNA complexes through bacterial membranes to the extracellular milieu or directly into the cytoplasm of other cells. Many bacteria of the family Xanthomonadaceae, which occupy diverse environmental niches, carry a T4SS with unknown function but with several characteristics that distinguish it from other T4SSs. Here we show that the Xanthomonas citri T4SS provides these cells with the capacity to kill other Gram-negative bacterial species in a contact-dependent manner. The secretion of one type IV bacterial effector protein is shown to require a conserved C-terminal domain, and its bacteriolytic activity is neutralized by a cognate immunity protein whose 3D structure is similar to that of peptidoglycan hydrolase inhibitors. This is the first demonstration of the involvement of a T4SS in bacterial killing and points to this special class of T4SS as a mediator of both antagonistic and cooperative interbacterial interactions.