861 results for Overall Likelihood and Posterior


Relevance: 100.00%

Abstract:

Roads represent a new source of mortality: the risk of animal-vehicle collisions threatens the long-term viability of populations. The risk of road mortality for each species depends on the characteristics of roads and on the bioecological and life-history traits of the species. In this study we aimed to assess the importance of climatic parameters (temperature and precipitation) together with traffic and life-history traits, and to understand the role of drought in barn owl population viability, which is also affected by road mortality, under three scenarios: high mobility, high population density, and the combination of the two (mixed) (Manuscript). For the first objective, we correlated the candidate parameters (climate, traffic and life-history traits) with road-kill records and used the most strongly correlated variables to build a predictive generalized linear mixed model (GLMM). Using a population model, we then evaluated barn owl population viability under all three scenarios. The GLMM indicated that precipitation, traffic and dispersal were negatively related to road-kills, although the relationships were not significant. The scenarios produced different results: the high-mobility scenario showed greater population depletion, more fluctuations over time and a greater risk of extinction; the high-density scenario showed a more stable population with a lower risk of extinction; and the mixed scenario behaved much like the high-mobility scenario. Climate appears to play an indirect role in barn owl road-kills: it may influence prey availability, which in turn affects barn owl reproductive success and activity. The high-mobility scenario also had the greatest negative impact on population viability, which may reduce resilience to other stochastic events. Future research should take into account climate and how it may influence species' life cycles and activity periods, for a more complete picture of road-kills; it is also important to make sound mitigation decisions, which might include improving prey habitat quality.
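The modelling step described above can be illustrated with a simplified sketch. The abstract specifies a GLMM; the fragment below instead fits a plain Poisson GLM with statsmodels (random-effect terms omitted), on synthetic data with hypothetical column names, purely to show the shape of the analysis.

```python
# Minimal sketch: relating road-kill counts to climate and traffic predictors.
# The study fit a GLMM; this simplified version is a plain Poisson GLM on
# synthetic data. All variable names are hypothetical placeholders.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 120  # hypothetical monthly survey records
df = pd.DataFrame({
    "precipitation": rng.gamma(2.0, 30.0, n),   # mm per month
    "traffic": rng.normal(5000, 1500, n),       # vehicles per day
    "dispersal": rng.uniform(0, 1, n),          # proportion of dispersers
})
df["roadkills"] = rng.poisson(3, n)             # counts per survey

model = smf.glm("roadkills ~ precipitation + traffic + dispersal",
                data=df, family=sm.families.Poisson()).fit()
# Negative coefficients would correspond to the reported (non-significant)
# negative relationships with precipitation, traffic and dispersal.
print(model.summary())
```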

Relevance: 100.00%

Abstract:

The Aitchison vector space structure for the simplex is generalized to a Hilbert space structure A2(P) for distributions and likelihoods on arbitrary spaces. Central notions of statistics, such as information and likelihood, can be identified within the algebraic structure of A2(P) and matched with their counterparts in compositional data analysis, such as the Aitchison distance and the centered log-ratio (clr) transform. In this way, quite elaborate aspects of mathematical statistics can be understood easily in the light of a simple vector space structure and of compositional data analysis. For example, combinations of statistical information, such as Bayesian updating, the combination of likelihoods, and robust M-estimation functions, are simple additions/perturbations in A2(Pprior). Weighting observations corresponds to a weighted addition of the corresponding evidence. Likelihood-based statistics for general exponential families turn out to have a particularly easy interpretation in terms of A2(P). Regular exponential families form finite-dimensional linear subspaces of A2(P), corresponding to finite-dimensional subspaces formed by their posteriors in the dual information space A2(Pprior). The Aitchison norm can be identified with mean Fisher information. The closing constant itself is identified with a generalization of the cumulant function and is shown to be the Kullback-Leibler directed information. Fisher information is the local geometry of the manifold induced by the A2(P) derivative of the Kullback-Leibler information, and the space A2(P) can therefore be seen as the tangential geometry of statistical inference at the distribution P. The discussion of A2(P)-valued random variables, such as estimating functions and likelihoods, gives a further interpretation of Fisher information as the expected squared norm of evidence and a scale-free understanding of unbiased reasoning.
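The claim that Bayesian updating is a perturbation (a vector-space addition) can be checked concretely on a finite parameter space. Below is a minimal numpy sketch, not from the paper, verifying that the posterior equals the Aitchison perturbation of the prior by the (closed) likelihood, and that the clr transform turns this perturbation into ordinary addition.

```python
# Bayesian updating as vector addition in Aitchison geometry (finite case).
# Not from the paper; a minimal numerical check of the stated identity.
import numpy as np

def closure(v):
    """Normalize a positive vector to the simplex."""
    return v / v.sum()

def perturb(p, q):
    """Aitchison perturbation: the simplex analogue of vector addition."""
    return closure(p * q)

def clr(p):
    """Centered log-ratio transform: an isometry onto a real vector space."""
    logp = np.log(p)
    return logp - logp.mean()

prior = closure(np.array([0.5, 0.3, 0.2]))
likelihood = closure(np.array([0.1, 0.6, 0.3]))  # closed likelihood values

posterior = closure(prior * likelihood)          # Bayes' rule
assert np.allclose(posterior, perturb(prior, likelihood))

# In clr coordinates, the Bayes update is a plain addition:
assert np.allclose(clr(posterior), clr(prior) + clr(likelihood))
print("Bayes update = perturbation; clr turns it into addition.")
```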

Relevance: 100.00%

Abstract:

The application of custom classification techniques and posterior probability modeling (PPM) using Worldview-2 multispectral imagery to archaeological field survey is presented in this paper. Research is focused on the identification of Neolithic felsite stone tool workshops in the North Mavine region of the Shetland Islands in Northern Scotland. Sample data from known workshops surveyed using differential GPS are used alongside known non-sites to train a linear discriminant analysis (LDA) classifier based on a combination of datasets including Worldview-2 bands, band difference ratios (BDR) and topographical derivatives. Principal components analysis is further used to test and reduce the dimensionality caused by redundant datasets. Probability models were generated by LDA using principal components and tested with sites identified through geological field survey. Testing demonstrates the prospective ability of this technique, with significance between 0.05 and 0.01 and gain statistics between 0.90 and 0.94, higher than those obtained using maximum-likelihood and random-forest classifiers. Results suggest that this approach is best suited to relatively homogeneous site types, and performs better with correlated data sources. Finally, by combining posterior probability models and least-cost analysis, a survey least-cost efficacy model is generated, showing the utility of such approaches to archaeological field survey.
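The PCA-then-LDA posterior probability step described above can be sketched with scikit-learn. This is a generic illustration on synthetic stand-in data, not the paper's actual pipeline; the feature matrix merely mimics the role of the Worldview-2 bands, band-difference ratios and topographic derivatives.

```python
# Minimal sketch of the LDA-based posterior probability step using
# scikit-learn (the paper's exact pipeline and data are not reproduced).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
# Hypothetical stand-ins: rows = survey cells, columns = spectral bands,
# band-difference ratios and topographic derivatives.
X = rng.normal(size=(200, 12))
y = rng.integers(0, 2, size=200)  # 1 = known workshop, 0 = known non-site

# Reduce redundancy among correlated inputs, then classify.
pca = PCA(n_components=5).fit(X)
lda = LinearDiscriminantAnalysis().fit(pca.transform(X), y)

# Posterior probability surface: P(site | data) for each cell.
posterior = lda.predict_proba(pca.transform(X))[:, 1]
print(posterior[:5])
```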

Relevance: 100.00%

Abstract:

Risks and uncertainties are inevitable in engineering projects and infrastructure investments. Decisions about investment in infrastructure, such as for maintenance, rehabilitation and construction works, can pose risks and may generate significant impacts on social, cultural, environmental and other related issues. This report presents the results of a literature review of current practice in identifying, quantifying and managing risks and predicting impacts as part of the planning and assessment process for infrastructure investment proposals. In assessing proposals for investment in infrastructure, it is necessary to consider social, cultural and environmental risks and impacts to the overall community, as well as financial risks to the investor. The report defines and explains the concepts of risk and uncertainty, and describes the three main methodological approaches to the analysis of risk and uncertainty in investment planning for infrastructure, viz. examining a range of scenarios or options, sensitivity analysis, and a statistical probability approach, listed here in order of increasing merit and complexity. Forecasts of costs, benefits and community impacts of infrastructure are recognised as central aspects of developing and assessing investment proposals, and increasingly complex modelling techniques are being used for investment evaluation. The literature review identified forecasting errors as the major cause of risk. The report contains a summary of the broad nature of decision-making tools used by governments and other organisations in Australia, New Zealand, Europe and North America, and shows their overall approach to risk assessment in assessing public infrastructure proposals. While there are established techniques to quantify financial and economic risks, quantification is far less developed for political, social and environmental risks and impacts. For risks that cannot be readily quantified, assessment techniques commonly include classification or rating systems for likelihood and consequence. The report outlines the system used by the Australian Defence Organisation and in the Australian Standard on risk management. After each risk is identified and quantified or rated, consideration can be given to reducing the risk, and managing any remaining risk as part of the scope of the project. The literature review identified the use of risk mapping techniques by a North American chemical company and by the Australian Defence Organisation. This literature review has enabled a risk assessment strategy to be developed, and will underpin an examination of the feasibility of developing a risk assessment capability using a probability approach.
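The likelihood/consequence rating approach mentioned above is straightforward to illustrate. The sketch below is a generic example of such a matrix, not the Australian Defence Organisation's or the Australian Standard's actual scheme: ordinal likelihood and consequence ratings are combined into a qualitative risk level.

```python
# Generic likelihood x consequence risk matrix, in the spirit of the rating
# systems described above (illustrative only, not the ADO's actual scheme).
LIKELIHOOD = ["rare", "unlikely", "possible", "likely", "almost certain"]
CONSEQUENCE = ["insignificant", "minor", "moderate", "major", "catastrophic"]

def risk_rating(likelihood: str, consequence: str) -> str:
    """Map ordinal ratings to a qualitative risk level."""
    score = LIKELIHOOD.index(likelihood) + CONSEQUENCE.index(consequence)
    if score <= 2:
        return "low"
    if score <= 4:
        return "medium"
    if score <= 6:
        return "high"
    return "extreme"

print(risk_rating("possible", "major"))  # -> "high"
```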

Relevance: 100.00%

Abstract:

Exceeding the speed limit and driving too fast for the conditions are regularly cited as significant contributing factors in traffic crashes, particularly fatal and serious injury crashes. Despite an extensive body of research highlighting the relationship between increased vehicle speeds and crash risk and severity, speeding remains a pervasive behaviour on Australian roads. The development of effective countermeasures designed to reduce the prevalence of speeding behaviour requires that this behaviour is well understood. The primary aim of this program of research was to develop a better understanding of the influence of drivers’ perceptions and attitudes toward police speed enforcement on speeding behaviour. Study 1 employed focus group discussions with 39 licensed drivers to explore the influence of perceptions relating to specific characteristics of speed enforcement policies and practices on drivers’ attitudes towards speed enforcement. Three primary factors were identified as being most influential: site selection; visibility; and automaticity (i.e., whether the enforcement approach is automated/camera-based or manually operated). Perceptions regarding these enforcement characteristics were found to influence attitudes regarding the perceived legitimacy and transparency of speed enforcement. Moreover, misperceptions regarding speed enforcement policies and practices appeared to also have a substantial impact on attitudes toward speed enforcement, typically in a negative direction. These findings have important implications for road safety given that prior research has suggested that the effectiveness of speed enforcement approaches may be reduced if efforts are perceived by drivers as being illegitimate, such that they do little to encourage voluntary compliance. Study 1 also examined the impact of speed enforcement approaches varying in the degree of visibility and automaticity on self-reported willingness to comply with speed limits. These discussions suggested that all of the examined speed enforcement approaches (see Section 1.5 for more details) generally showed potential to reduce vehicle speeds and encourage compliance with posted speed limits. Nonetheless, participant responses suggested a greater willingness to comply with approaches operated in a highly visible manner, irrespective of the corresponding level of automaticity of the approach. While less visible approaches were typically associated with poorer rates of driver acceptance (e.g., perceived as “sneaky” and “unfair”), participants reported that such approaches would likely encourage long-term and network-wide impacts on their own speeding behaviour, as a function of the increased unpredictability of operations and increased direct (specific deterrence) and vicarious (general deterrence) experiences with punishment. Participants in Study 1 suggested that automated approaches, particularly when operated in a highly visible manner, do little to encourage compliance with speed limits except in the immediate vicinity of the enforcement location. While speed cameras have been criticised on such grounds in the past, such approaches can still have substantial road safety benefits if implemented in high-risk settings. Moreover, site-learning effects associated with automated approaches can also be argued to be a beneficial by-product of enforcement, such that behavioural modifications are achieved even in the absence of actual enforcement. 
Conversely, manually operated approaches were reported to be associated with more network-wide impacts on behaviour. In addition, the reported acceptance of such methods was high, due to the increased swiftness of punishment, the ability for additional illegal driving behaviours to be policed, and the salutary influence associated with increased face-to-face contact with authority. Study 2 involved a quantitative survey conducted with 718 licensed Queensland drivers from metropolitan and regional areas. The survey sought to further examine the influence of the visibility and automaticity of operations on self-reported likelihood and duration of compliance. Overall, the results from Study 2 corroborated those of Study 1. All examined approaches were again found to encourage compliance with speed limits, such that all approaches could be considered "effective". Nonetheless, significantly greater self-reported likelihood and duration of compliance were associated with visibly operated approaches, irrespective of the corresponding automaticity of the approach. In addition, the impact of automaticity was influenced by visibility, such that significantly greater self-reported likelihood of compliance was associated with manually operated approaches, but only when they were operated in a less visible fashion. Conversely, manually operated approaches were associated with significantly greater durations of self-reported compliance, but only when they were operated in a highly visible manner. Taken together, the findings from Studies 1 and 2 suggest that enforcement efforts, irrespective of their visibility or automaticity, generally encourage compliance with speed limits. However, the duration of these effects on behaviour upon removal of the enforcement efforts remains questionable and represents an area where current speed enforcement practices could possibly be improved. Overall, it appears that identifying the optimal mix of enforcement operations, implementing them at a sufficient intensity and increasing the unpredictability of enforcement efforts (e.g., greater use of less visible approaches, random scheduling) are critical elements of success. Hierarchical multiple regression analyses were also performed in Study 2 to investigate the punishment-related and attitudinal constructs that influence self-reported frequency of speeding behaviour. The research was based on the theoretical framework of expanded deterrence theory, augmented with three particular attitudinal constructs. Specifically, previous research examining the influence of attitudes on speeding behaviour has typically focussed only on attitudes toward speeding behaviour in general. This research sought to more comprehensively explore the influence of attitudes by also individually measuring and analysing the influence on speeding behaviour of attitudes toward speed enforcement and attitudes toward the appropriateness of speed limits. Consistent with previous research, a number of classical and expanded deterrence theory variables were found to significantly predict self-reported frequency of speeding behaviour. Significantly greater speeding behaviour was typically reported by those participants who perceived punishment associated with speeding to be less certain, who reported more frequent use of punishment avoidance strategies and who reported greater direct experiences with punishment. A number of interesting differences in the significant predictors among males and females, as well as younger and older drivers, were reported.
Specifically, classical deterrence theory variables appeared most influential on the speeding behaviour of males and younger drivers, while expanded deterrence theory constructs appeared more influential for females. These findings have important implications for the development and implementation of speeding countermeasures. Of the attitudinal factors, significantly greater self-reported frequency of speeding behaviour was reported among participants who held more favourable attitudes toward speeding and who perceived speed limits to be set inappropriately low. Disappointingly, attitudes toward speed enforcement were found to have little influence on reported speeding behaviour over and above the other deterrence theory and attitudinal constructs. Indeed, the relationship between attitudes toward speed enforcement and self-reported speeding behaviour was completely accounted for by attitudes toward speeding. Nonetheless, the complexity of attitudes toward speed enforcement is not yet fully understood, and future research should more comprehensively explore the measurement of this construct. Finally, given the wealth of evidence (both in general and emerging from this program of research) highlighting the association between punishment avoidance and speeding behaviour, Study 2 also sought to investigate the factors that influence the self-reported propensity to use punishment avoidance strategies. A standard multiple regression analysis was conducted for exploratory purposes only. The results revealed that punishment-related and attitudinal factors accounted for approximately one fifth of the variance in the dependent variable. The perceived ability to avoid punishment, vicarious punishment experience, vicarious punishment avoidance and attitudes toward speeding were all significant predictors. Future research should examine these relationships more thoroughly and identify additional influential factors. In summary, the current program of research has a number of implications for road safety and speed enforcement policy and practice decision-making. The research highlights a number of potential avenues for the improvement of public education regarding enforcement efforts and provides a number of insights into punishment avoidance behaviours. In addition, the research adds strength to the argument that enforcement approaches should not only demonstrate effectiveness in achieving key road safety objectives, such as reduced vehicle speeds and associated crashes, but also strive to be transparent and legitimate, such that voluntary compliance is encouraged. A number of potential strategies are discussed (e.g., point-to-point speed cameras, intelligent speed adaptation). The correct mix and intensity of enforcement approaches, together with enhanced unpredictability of operations and swiftness of punishment, appear critical for achieving optimum effectiveness from enforcement efforts. Achievement of these goals should increase both the general and specific deterrent effects associated with enforcement through an increased perceived risk of detection and a more balanced exposure to punishment and punishment avoidance experiences.
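The hierarchical (blockwise) regression strategy described above, entering predictor blocks in stages and examining the incremental variance explained, can be sketched as follows. The variable names and block composition are hypothetical simplifications, and the data are synthetic, not the study's.

```python
# Sketch of a hierarchical (blockwise) multiple regression: classical
# deterrence variables first, then expanded-deterrence and attitude blocks.
# Variable names are hypothetical placeholders, not the study's codebook.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 718  # matching the Study 2 sample size
cols = ["certainty", "severity", "swiftness", "punish_avoidance",
        "direct_punishment", "att_speeding", "att_enforcement", "att_limits"]
df = pd.DataFrame(rng.normal(size=(n, len(cols))), columns=cols)
df["speeding_frequency"] = (df["att_speeding"] - df["certainty"]
                            + rng.normal(size=n))

blocks = [
    "certainty + severity + swiftness",            # classical deterrence
    "punish_avoidance + direct_punishment",        # expanded deterrence
    "att_speeding + att_enforcement + att_limits", # attitudinal block
]
terms, prev = [], 0.0
for block in blocks:
    terms.append(block)
    fit = smf.ols("speeding_frequency ~ " + " + ".join(terms), data=df).fit()
    print(f"+ {block}: R2={fit.rsquared:.3f} (dR2={fit.rsquared - prev:.3f})")
    prev = fit.rsquared
```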

Relevance: 100.00%

Abstract:

We show that the sensor self-localization problem can be cast as a static parameter estimation problem for Hidden Markov Models and we implement fully decentralized versions of the Recursive Maximum Likelihood and on-line Expectation-Maximization algorithms to localize the sensor network simultaneously with target tracking. For linear Gaussian models, our algorithms can be implemented exactly using a distributed version of the Kalman filter and a novel message passing algorithm. The latter allows each node to compute the local derivatives of the likelihood or the sufficient statistics needed for Expectation-Maximization. In the non-linear case, a solution based on local linearization in the spirit of the Extended Kalman Filter is proposed. In numerical examples we demonstrate that the developed algorithms are able to learn the localization parameters. © 2012 IEEE.
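For the linear Gaussian case mentioned above, the building block is the Kalman filter; the paper distributes it across nodes with message passing, but the core recursion at a single node looks like the minimal sketch below. The model matrices here are illustrative, not the paper's sensor-network model.

```python
# Minimal single-node Kalman filter (predict/update recursion) for a linear
# Gaussian state-space model. The paper runs a distributed version with
# message passing; this sketch shows only the core recursion.
import numpy as np

def kalman_step(x, P, z, F, Q, H, R):
    """One predict/update cycle: returns the filtered mean and covariance."""
    # Predict
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # Update with measurement z
    S = H @ P_pred @ H.T + R             # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)  # Kalman gain
    x_new = x_pred + K @ (z - H @ x_pred)
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new

# Illustrative 1-D constant-velocity target, position-only measurements.
F = np.array([[1.0, 1.0], [0.0, 1.0]])   # state transition
H = np.array([[1.0, 0.0]])               # measurement model
Q, R = 0.01 * np.eye(2), np.array([[0.5]])
x, P = np.zeros(2), np.eye(2)
for z in [np.array([0.9]), np.array([2.1]), np.array([2.8])]:
    x, P = kalman_step(x, P, z, F, Q, H, R)
print(x)  # filtered position/velocity estimate
```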

Relevance: 100.00%

Abstract:

Partial sequences of the mitochondrial 16S rRNA gene were obtained by PCR amplification for comparisons among nine species of glyptosternoid fishes and six species of non-glyptosternoids representing 10 sisorid genera. There are compositional biases in the A-rich unpaired regions and G-rich paired regions. A-G transitions are primarily responsible for the Ts/Tv bias in unpaired regions. The overall substitution rate in unpaired regions is almost twice that in paired regions. Saturation plots at comparable levels of sequence divergence demonstrate no saturation effects. Phylogenetic analyses using both maximum likelihood and Bayesian methods support the monophyly of Sisoridae. Chinese sisorid catfishes comprise two major lineages, one represented by (Gagata (Bagarius, Glyptothorax)) and the other by "glyptosternoids + Pseudecheneis". The glyptosternoids may not be a monophyletic group. A previous hypothesis, based on morphological evidence, that Pseudecheneis is the sister group of a monophyletic glyptosternoid clade is not supported by the molecular data. Pseudecheneis is shown to be the sister taxon of Glaridoglanis. Pareuchiloglanis might be paraphyletic with respect to Pseudexostoma and Euchiloglanis. Our results also support the hypotheses that Pareuchiloglanis anteanalis might be considered a synonym of Pareuchiloglanis sinensis, and that the genus Euchiloglanis might contain only one valid species, Euchiloglanis davidi.
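The transition/transversion (Ts/Tv) bias reported above comes from pairwise comparisons of aligned sequences. A minimal illustrative counter, not the software actually used in the study, is sketched below.

```python
# Count transitions (A<->G, C<->T) and transversions between two aligned
# sequences; illustrative only, not the software used in the study.
PURINES, PYRIMIDINES = {"A", "G"}, {"C", "T"}

def ts_tv(seq1: str, seq2: str) -> tuple[int, int]:
    ts = tv = 0
    for a, b in zip(seq1.upper(), seq2.upper()):
        if a == b or "-" in (a, b):
            continue  # identical site or alignment gap
        if {a, b} <= PURINES or {a, b} <= PYRIMIDINES:
            ts += 1   # transition: purine<->purine or pyrimidine<->pyrimidine
        else:
            tv += 1   # transversion: purine<->pyrimidine
    return ts, tv

ts, tv = ts_tv("ATGGCGTAAC", "ACGGTGTAAA")  # toy aligned fragments
print(f"Ts={ts}, Tv={tv}, Ts/Tv={ts / tv if tv else float('inf'):.2f}")
```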

Relevance: 100.00%

Abstract:

This study explores the role of livestock insurance in complementing the existing risk management strategies adopted by smallholder farmers. Using survey data, first, it provides insights into farmers' risk perceptions of livestock farming, in terms of likelihood and severity of risk, attitude to risk and their determinants. Second, it examines farmers' risk management strategies and their determinants. Third, it investigates farmers' potential engagement with a hypothetical cattle insurance decision and their intensity of participation. Factor analysis is used to analyse risk sources and risk management, multiple regressions are used to identify the determinants, and a Heckman model is used to investigate cattle insurance participation and intensity of participation. The findings show that different groups of farmers display different risk attitudes in their decision-making related to livestock farming. Production risk (especially livestock disease) was perceived as the most likely and most severe source of risk. Disease control was perceived as the best strategy to manage risk overall. Disease control and feed management were important strategies to mitigate production risks, while disease control and participation in a safety-net program were found to be important in countering households' financial risks. With regard to the hypothetical cattle insurance scheme, 94.38% of households were interested in participating in cattle insurance. Of those households that accepted cattle insurance, 77.38% were willing to pay the benchmark annual premium of 4% of the animal value, while for the remaining households this was not affordable. The average number of cattle that farmers were willing to insure at this benchmark was 2.71. Results revealed that income (log income) and education levels positively and significantly influenced farmers' participation in cattle insurance and the number of cattle insured. The findings prompt policy makers to consider livestock insurance as a complement to existing risk management strategies to reduce poverty in the long run.
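The Heckman approach mentioned above models participation and intensity jointly. A compact two-step version (probit selection equation, then an outcome regression augmented with the inverse Mills ratio) can be sketched with statsmodels; the study's exact specification may differ, and all variable names and data below are hypothetical.

```python
# Two-step Heckman selection sketch: probit for insurance participation,
# then OLS for number of cattle insured, corrected with the inverse Mills
# ratio. Synthetic data; variable names are hypothetical placeholders.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from scipy.stats import norm

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "log_income": rng.normal(8, 1, n),
    "education": rng.integers(0, 12, n).astype(float),
})
latent = 0.5 * (df["log_income"] - 8) + 0.1 * df["education"] + rng.normal(size=n)
df["participate"] = (latent > 0).astype(int)
df["cattle_insured"] = np.where(
    df["participate"] == 1,
    np.maximum(1, np.round(1 + 0.3 * df["education"] + rng.normal(size=n))),
    np.nan)

# Step 1: probit selection equation for participation.
Z = sm.add_constant(df[["log_income", "education"]])
probit = sm.Probit(df["participate"], Z).fit(disp=0)
xb = probit.fittedvalues                    # linear predictor
df["mills"] = norm.pdf(xb) / norm.cdf(xb)   # inverse Mills ratio

# Step 2: intensity equation on participants only, with the correction term.
part = df[df["participate"] == 1]
X = sm.add_constant(part[["log_income", "education", "mills"]])
print(sm.OLS(part["cattle_insured"], X).fit().params)
```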

Relevance: 100.00%

Abstract:

INTRODUCTION: Platinum agents can cause the formation of DNA adducts and induce apoptosis to eliminate tumor cells. The aim of the present study was to investigate the influence of genetic variants of MDM2 on chemotherapy-related toxicities and clinical outcomes in patients with advanced non-small-cell lung cancer (NSCLC). MATERIALS AND METHODS: We recruited 663 patients with advanced NSCLC who had been treated with first-line platinum-based chemotherapy. Five tagging single nucleotide polymorphisms (SNPs) in MDM2 were genotyped in these patients. The associations of these SNPs with clinical toxicities and outcomes were evaluated using logistic regression and Cox regression analyses. RESULTS: Two SNPs (rs1470383 and rs1690924) showed significant associations with chemotherapy-related toxicities (i.e., overall, hematologic, and gastrointestinal toxicity). Compared with carriers of the wild-type AA genotype, patients with the GG genotype of rs1470383 had an increased risk of overall toxicity (odds ratio [OR], 3.28; 95% confidence interval [CI], 1.34-8.02; P = .009) and hematologic toxicity (OR, 4.10; 95% CI, 1.73-9.71; P = .001). Likewise, patients with the AG genotype of rs1690924 were more susceptible to gastrointestinal toxicity than those with the wild-type homozygous GG genotype (OR, 2.32; 95% CI, 1.30-4.14; P = .004). Stratified survival analysis revealed significant associations between rs1470383 genotypes and overall survival in patients without overall or hematologic toxicity (P = .007 and P = .0009, respectively). CONCLUSION: The results of our study suggest that SNPs in MDM2 might be used to predict the toxicities of platinum-based chemotherapy and overall survival in patients with advanced NSCLC. Additional validation of these associations is warranted.
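Odds ratios like those reported above come from exponentiating logistic-regression coefficients. The minimal statsmodels sketch below illustrates the computation on synthetic data with hypothetical variable names; it is not the study's analysis.

```python
# Sketch: odds ratio and 95% CI for a genotype effect on toxicity via
# logistic regression. Synthetic data; hypothetical variable names.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 663  # matching the reported cohort size
df = pd.DataFrame({"gg_genotype": rng.integers(0, 2, n)})
p = 1 / (1 + np.exp(-(-1.0 + 1.2 * df["gg_genotype"])))  # true OR = e^1.2
df["toxicity"] = rng.binomial(1, p)

fit = smf.logit("toxicity ~ gg_genotype", data=df).fit(disp=0)
or_est = np.exp(fit.params["gg_genotype"])
lo, hi = np.exp(fit.conf_int().loc["gg_genotype"])
print(f"OR = {or_est:.2f}, 95% CI {lo:.2f}-{hi:.2f}")
```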

Relevance: 100.00%

Abstract:

The HSP90 chaperone and immunophilin FKBPL is an estrogen-responsive gene that interacts with estrogen receptor α (ERα) and regulates its levels. In this study, we explored the effects of FKBPL on breast cancer proliferation. Breast cancer cells stably overexpressing FKBPL became dependent on estrogen for their growth and were dramatically more sensitive to the antiestrogens tamoxifen and fulvestrant, whereas FKBPL knockdown reversed this phenotype. FKBPL knockdown also decreased the levels of the cell cycle inhibitor p21WAF1 and increased ERα phosphorylation on Ser118 in response to 17β-estradiol and tamoxifen. In support of the likelihood that these effects explain FKBPL-mediated cell growth inhibition and sensitivity to endocrine therapies, FKBPL expression correlated with increased overall survival and distant metastasis-free survival in breast cancer patients. Our findings suggest that FKBPL may have prognostic value based on its impact on tumor proliferative capacity and sensitivity to endocrine therapies, which improve outcome.

Relevance: 100.00%

Abstract:

Purpose: To describe sequential phacoemulsification-intraocular lens (IOL) implantation-posterior capsulorhexis-anterior vitrectomy in the management of phakic malignant glaucoma. Methods: Twenty consecutive patients (25 eyes) with phakic malignant glaucoma were enrolled at the Zhongshan Ophthalmic Center, Sun Yat-sen University. All patients underwent phacoemulsification, IOL implantation and posterior capsulorhexis together with anterior vitrectomy via a clear corneal paracentesis. Visual acuity, intraocular pressure (IOP), anterior chamber depth (ACD), surgical complications and medications required after the surgery were recorded. Results: After surgery, the mean logMAR visual acuity and ACD increased significantly (visual acuity from -1.56 ± 1.17 to -0.54 ± 0.81, p < 0.001; ACD from 0.367 ± 0.397 mm to 2.390 ± 0.575 mm, p < 0.001), and the mean IOP decreased significantly (from 39.6 ± 10.6 mmHg to 14.5 ± 4.1 mmHg, p < 0.001). No serious perioperative complications occurred, and only five eyes required topical glaucoma medications after surgery. Conclusion: Combined phacoemulsification-IOL implantation-posterior capsulorhexis-anterior vitrectomy surgery is a safe and effective method for treating patients with phakic malignant glaucoma. © 2012 The Authors. Acta Ophthalmologica © 2012 Acta Ophthalmologica Scandinavica Foundation.

Relevance: 100.00%

Abstract:

Cycliophora is a recently described animal phylum that accommodates only two species: Symbion pandora Funch and Kristensen, 1995 and S. americanus Obst, Funch and Kristensen, 2006. The phylum is characterized by a remarkably complex life cycle, and its phylogenetic position has been debated since its discovery. This dissertation aims to deepen the general knowledge of these enigmatic and little-explored metazoans. Several aspects of the morphology and ecology of cycliophorans were studied through in vivo observations, microscopy techniques and three-dimensional reconstruction. The myoanatomy of several life-cycle stages is described for S. pandora and S. americanus. Our results reveal a striking similarity between the musculature of the two species. The general myoanatomy of Symbion is also compared with that of other metazoans. The expression of several immunoreactive substances, such as serotonin and synapsins, is investigated in various life-cycle stages. Compared with other representatives of Spiralia, the general neuroanatomy of cycliophorans resembles that of larval forms more closely than that of adults. Although they possess a sophisticated body plan, with extensive ciliated areas and a complex myoanatomy, we found that the male of both Symbion species is composed of only a few dozen cells. Based on these observations, we infer that metazoan complexity is related neither to body size nor to the number of cells of an organism. Studies of the ultrastructure of the female revealed, among other structures, a putative genital pore, cytoplasmic extensions of the oocyte and posterior glands. The morphology and functional implications of these structures are discussed. The anatomy of the protonephridium of the chordoid larva is described; the architecture of this organ diverges from that of other representatives of Nephrozoa, particularly in the filtration area of the terminal cell, and our observations are discussed in phylogenetic terms. Sexual maturation in cycliophorans is also investigated. Our results suggest that the transition from asexual to sexual reproduction is related to the age of the sessile form, the "feeding stage". The presence of a Prometheus larva settled on its trunk may also influence the process, although further studies are needed to confirm this. Our results are discussed in an integrative and comparative manner against prior knowledge of Cycliophora. The accumulation of this knowledge will be essential for understanding the evolution and phylogeny of this enigmatic phylum.

Relevance: 100.00%

Abstract:

The past decade has seen growing interest in the econometric literature in the problems posed by weak instrumental variables, that is, situations where the instruments are weakly correlated with the variable to be instrumented. It is well known that when instruments are weak, the distributions of the Student, Wald, likelihood-ratio and Lagrange-multiplier statistics are no longer standard and often depend on nuisance parameters. Several empirical studies, notably on returns-to-education models [Angrist and Krueger (1991, 1995), Angrist et al. (1999), Bound et al. (1995), Dufour and Taamouti (2007)] and consumption-based asset pricing (C-CAPM) [Hansen and Singleton (1982, 1983), Stock and Wright (2000)], in which the instrumental variables are weakly correlated with the variable to be instrumented, have shown that using these statistics often leads to unreliable results. One remedy is to use identification-robust tests [Anderson and Rubin (1949), Moreira (2002), Kleibergen (2003), Dufour and Taamouti (2007)]. However, there is no econometric literature on the quality of identification-robust procedures when the available instruments are endogenous, or both endogenous and weak. This raises the question of what happens to identification-robust inference procedures when some instrumental variables assumed to be exogenous are in fact not. More precisely, what happens if an invalid instrument is added to a set of valid instruments? Do these procedures behave differently? If the endogeneity of instrumental variables poses major difficulties for statistical inference, can test procedures be proposed that select instruments that are both strong and valid? Is it possible to propose instrument-selection procedures that remain valid even under weak identification? This thesis focuses on structural models (simultaneous-equations models) and answers these questions through four essays. The first essay was published in the Journal of Statistical Planning and Inference 138 (2008) 2649-2661. In it, we analyse the effects of instrument endogeneity on two identification-robust test statistics, the Anderson-Rubin statistic (AR, 1949) and the Kleibergen statistic (K, 2003), with and without weak instruments. First, when the parameter controlling instrument endogeneity is fixed (does not depend on the sample size), we show that all these procedures are generally consistent against the presence of invalid instruments (that is, they detect invalid instruments) regardless of instrument quality (strong or weak). We also describe cases where this consistency may fail, but where the asymptotic distribution is modified in a way that could lead to size distortions even in large samples. This includes, in particular, cases where the two-stage least squares estimator remains consistent but the tests are asymptotically invalid.
Second, when the instruments are locally exogenous (that is, the endogeneity parameter converges to zero as the sample size grows), we show that these tests converge to noncentral chi-square distributions, whether the instruments are strong or weak. We also characterize the situations where the noncentrality parameter is zero and the asymptotic distribution of the statistics remains the same as with valid instruments (despite the presence of invalid instruments). The second essay studies the impact of weak instruments on Durbin-Wu-Hausman (DWH) specification tests and the Revankar and Hartley (1973) test. We provide a finite-sample and large-sample analysis of the distribution of these tests under the null hypothesis (size) and the alternative (power), including cases where identification is deficient or weak (weak instruments). Our finite-sample analysis offers several insights as well as extensions of earlier procedures. In particular, the finite-sample characterization of the distribution of these statistics allows the construction of exact Monte Carlo exogeneity tests even with non-Gaussian errors. We show that these tests are typically robust to weak instruments (size is controlled). Moreover, we provide a characterization of the power of the tests that clearly exhibits the factors determining power. We show that the tests have no power when all instruments are weak [similar to Guggenberger (2008)], but that power exists as long as at least one instrument is strong. Guggenberger's (2008) conclusion concerns the case where all instruments are weak (a case of minor practical interest). Our asymptotic theory under weakened assumptions confirms the finite-sample theory. Furthermore, we present a Monte Carlo analysis indicating that: (1) the ordinary least squares estimator is more efficient than two-stage least squares when instruments are weak and endogeneity is moderate [a conclusion similar to that of Kiviet and Niemczyk (2007)]; (2) pre-test estimators based on exogeneity tests perform very well relative to two-stage least squares. This suggests that the instrumental variables method should be applied only when strong instruments are clearly available; Guggenberger's (2008) conclusions are therefore mixed and could be misleading. We illustrate our theoretical results with simulation experiments and two empirical applications: the relationship between trade openness and economic growth, and the well-known returns-to-education problem. The third essay extends the Wald-type exogeneity test proposed by Dufour (1987) to cases where the regression errors have a non-normal distribution. We propose a new version of this test that remains valid with non-Gaussian errors. Unlike the usual exogeneity test procedures (the Durbin-Wu-Hausman and Revankar-Hartley tests), the Wald test makes it possible to address a problem common in empirical work: testing the partial exogeneity of a subset of variables.
We propose two new pre-test estimators based on the Wald test that perform better (in terms of mean squared error) than the usual IV estimator when the instrumental variables are weak and endogeneity is moderate. We also show that this test can serve as an instrument-selection procedure. We illustrate the theoretical results with two empirical applications: the well-known wage-equation model [Angrist and Krueger (1991, 1999)] and returns to scale [Nerlove (1963)]. Our results suggest that the mother's education explains her son's dropping out of school, that output is an endogenous variable in estimating the firm's cost function, and that the fuel price is a valid instrument for output. The fourth essay addresses two very important problems in the econometric literature. First, although the initial or extended Wald test can be used to build confidence regions and test linear restrictions on covariances, it assumes that the model parameters are identified. When identification is weak (instruments weakly correlated with the variable to be instrumented), this test is in general no longer valid. This essay develops an identification-robust (weak-instrument) inference procedure for building confidence regions for the covariance matrix between the regression errors and the (possibly endogenous) explanatory variables. We provide analytical expressions for the confidence regions and characterize necessary and sufficient conditions under which they are bounded. The proposed procedure remains valid even in small samples and is also asymptotically robust to heteroskedasticity and autocorrelation of the errors. The results are then used to develop identification-robust partial exogeneity tests. Monte Carlo simulations indicate that these tests control size and have power even when instruments are weak. This allows us to propose an instrument-selection procedure that is valid even when identification is problematic. The selection procedure is based on two new pre-test estimators that combine the usual IV estimator with partial IV estimators. Our simulations show that: (1) like the ordinary least squares estimator, the partial IV estimators are more efficient than the usual IV estimator when instruments are weak and endogeneity is moderate; (2) the pre-test estimators perform very well overall compared with the usual IV estimator. We illustrate our theoretical results with two empirical applications: the relationship between trade openness and economic growth, and the returns-to-education model. In the first application, earlier studies concluded that the instruments were not too weak [Dufour and Taamouti (2007)], whereas in the second they are very weak [Bound (1995), Doko and Dufour (2009)]. In line with our theoretical results, we find unbounded confidence regions for the covariance when the instruments are quite weak.
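Identification-robust inference of the kind developed in these essays builds on the Anderson-Rubin construction: under H0: β = β0, the residual y − Xβ0 should be uncorrelated with the instruments. A minimal sketch on synthetic data (not the thesis's own applications) follows.

```python
# Minimal Anderson-Rubin (1949) test sketch: under H0: beta = beta0 in
# y = X*beta + u, regress y - X*beta0 on the instruments Z and F-test
# the joint nullity of the Z coefficients. Synthetic data for illustration.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n, beta_true = 500, 1.0
Z = rng.normal(size=(n, 2))             # two instruments
v = rng.normal(size=n)
u = 0.5 * v + rng.normal(size=n)        # endogeneity: corr(u, v) != 0
X = Z @ np.array([0.8, 0.4]) + v        # first stage (strong instruments)
y = beta_true * X + u

def anderson_rubin(beta0):
    resid = y - beta0 * X
    fit = sm.OLS(resid, sm.add_constant(Z)).fit()
    # F-test that both instrument coefficients are zero (skip the constant).
    return fit.f_test(np.eye(3)[1:])

print(anderson_rubin(1.0))   # H0 true: large p-value expected
print(anderson_rubin(0.0))   # H0 false: small p-value expected
```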

Relevance: 100.00%

Abstract:

Previous research has shown that people's evaluations of explanations about medication and their intention to comply with the prescription are detrimentally affected by the inclusion of information about adverse side effects of the medication. The present study (Experiment 1) examined which particular aspects of information about side effects (their number, likelihood of occurrence, or severity) are likely to have the greatest effect on people's satisfaction, perception of risk, and intention to comply, as well as how the information about side effects interacts with information about the severity of the illness for which the medication was prescribed. Across all measures, it was found that manipulations of side effect severity had the greatest impact on people's judgements, followed by manipulations of side effect likelihood and then number. Experiments 2 and 3 examined how the severity of the diagnosed illness and information about negative side effects interact with two other factors suggested by Social Cognition models of health behaviour to affect people's intention to comply: namely, perceived benefit of taking the prescribed drug, and the perceived level of control over preventing or alleviating the side effects. It was found that providing people with a statement about the positive benefit of taking the medication had relatively little effect on judgements, whereas informing them about how to reduce the chances of experiencing the side effects had an overall beneficial effect on ratings.

Relevance: 100.00%

Abstract:

The amygdala plays a critical role in determining the emotional significance of sensory stimuli and the production of fear-related responses. Large amygdalar lesions have been shown to practically abolish innate defensiveness to a predator; however, it is not clear how the different amygdalar systems participate in the defensive response to a live predator. Our first aim was to provide a comprehensive analysis of the amygdalar activation pattern during exposure to a live cat and to a predator-associated context. Accordingly, exposure to a live predator up-regulated Fos expression in the medial amygdalar nucleus (MEA) and in the lateral and posterior basomedial nuclei, the former responding to predator-related pheromonal information and the latter two nuclei likely to integrate a wider array of predatory sensory information, ranging from olfactory to non-olfactory ones, such as visual and auditory sensory inputs. Next, we tested how the amygdalar nuclei most responsive to predator exposure (i.e. the medial, posterior basomedial and lateral amygdalar nuclei) and the central amygdalar nucleus (CEA) influence both unconditioned and contextual conditioned anti-predatory defensive behavior. Medial amygdalar nucleus lesions practically abolished defensive responses during cat exposure, whereas lesions of the posterior basomedial or lateral amygdalar nuclei reduced freezing and increased risk assessment displays (i.e. crouch sniff and stretch postures), a pattern of responses compatible with decreased defensiveness to predator stimuli. Moreover, the present findings suggest a role for the posterior basomedial and lateral amygdalar nuclei in the conditioning responses to a predator-related context. We have further shown that the CEA does not seem to be involved in either unconditioned or contextual conditioned anti-predatory responses. Overall, the present results help to clarify the amygdalar systems involved in processing predator-related sensory stimuli and how they influence the expression of unconditioned and contextual conditioned anti-predatory responses. (C) 2011 IBRO. Published by Elsevier Ltd. All rights reserved.