892 results for Coverage bias


Relevance:

20.00%

Publisher:

Abstract:

This thesis develops bootstrap methods for the factor models that have been widely used to generate forecasts since the pioneering diffusion-index article of Stock and Watson (2002). These models accommodate a large number of macroeconomic and financial variables as predictors, a useful feature for incorporating the diverse information available to economic agents. The thesis therefore proposes econometric tools that improve inference in factor models that use latent factors extracted from a large panel of observed predictors. It is divided into three complementary chapters, the first two written in collaboration with Sílvia Gonçalves and Benoit Perron.

In the first article, we study how bootstrap methods can be used to conduct inference in models that forecast h periods into the future. To this end, the chapter examines bootstrap inference in a factor-augmented regression setting in which the errors may be autocorrelated. It generalizes the results of Gonçalves and Perron (2014) and proposes and justifies two residual-based approaches: the block wild bootstrap and the dependent wild bootstrap. Our simulations show improved coverage rates for the confidence intervals of the estimated coefficients under these approaches, relative to asymptotic theory and to the wild bootstrap, when the regression errors are serially correlated.

The second chapter proposes bootstrap methods for constructing forecast intervals that relax the assumption of Gaussian innovations. We propose bootstrap prediction intervals for an observation h periods ahead and for its conditional mean. We assume that these forecasts are made using a set of factors extracted from a large panel of variables. Because we treat the factors as latent, our forecasts depend on both the estimated factors and the estimated regression coefficients. Under regularity conditions, Bai and Ng (2006) proposed constructing asymptotic intervals under the assumption of Gaussian innovations. The bootstrap allows us to relax this assumption and to construct prediction intervals that are valid under more general conditions. Moreover, even under Gaussianity, the bootstrap yields more accurate intervals when the cross-sectional dimension is relatively small, because it accounts for the bias of the ordinary least squares estimator, as shown in a recent study by Gonçalves and Perron (2014).

In the third chapter, we propose consistent selection procedures for factor-augmented regressions in finite samples. We first show that the usual cross-validation method is inconsistent, but that its generalization, leave-d-out cross-validation, selects the smallest set of estimated factors spanning the space generated by the true factors. The second criterion whose validity we establish generalizes the bootstrap approximation of Shao (1996) to factor-augmented regressions. Simulations show an improved probability of selecting the estimated factors parsimoniously compared with the available selection methods. The empirical application revisits the relationship between macroeconomic and financial factors and excess returns on the US stock market. Among the factors estimated from a large panel of US macroeconomic and financial data, the factors strongly correlated with interest rate spreads and the Fama-French factors have good predictive power for excess returns.
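As a rough illustration of the residual-based resampling idea described above, the sketch below applies a block wild bootstrap to the slope coefficients of a simple factor-augmented regression with serially correlated errors. It is a minimal toy version, not the thesis's procedure: the factors are treated as observed, the block length and data-generating process are arbitrary, and all function names are invented for this example.

```python
# Minimal sketch of a (block) wild bootstrap for the slope coefficients of a
# factor-augmented regression y_t = alpha' F_t + eps_t.  Illustrative only:
# the factors are taken as given here, whereas the thesis also accounts for
# factor estimation from a large panel of predictors.
import numpy as np

rng = np.random.default_rng(0)

def ols(F, y):
    """OLS coefficients of y on F (F includes a constant column)."""
    return np.linalg.lstsq(F, y, rcond=None)[0]

def block_wild_bootstrap(F, y, block_len=4, n_boot=999):
    """Bootstrap draws of the OLS coefficients: residuals are multiplied by
    N(0,1) multipliers held constant within blocks of length `block_len`
    (block_len=1 gives the ordinary wild bootstrap)."""
    T = len(y)
    beta_hat = ols(F, y)
    resid = y - F @ beta_hat
    fitted = F @ beta_hat
    draws = np.empty((n_boot, F.shape[1]))
    for b in range(n_boot):
        # one multiplier per block, repeated within the block
        v = np.repeat(rng.standard_normal(int(np.ceil(T / block_len))), block_len)[:T]
        y_star = fitted + resid * v
        draws[b] = ols(F, y_star)
    return beta_hat, draws

# Toy data with AR(1) regression errors
T = 200
F = np.column_stack([np.ones(T), rng.standard_normal((T, 2))])
eps = np.zeros(T)
for t in range(1, T):
    eps[t] = 0.5 * eps[t - 1] + rng.standard_normal()
y = F @ np.array([1.0, 0.5, -0.3]) + eps

beta_hat, draws = block_wild_bootstrap(F, y)
ci = np.percentile(draws, [2.5, 97.5], axis=0)   # equal-tailed 95% intervals
print(beta_hat, ci.T)
```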

Relevance:

20.00%

Publisher:

Abstract:

Sensor networks consist of a set of devices that can individually take measurements of a particular environment and exchange information in order to obtain a high-level representation of the activities taking place in the area of interest. Such distributed sensing, with many devices located close to the phenomena of interest, is relevant in fields such as surveillance, agriculture, environmental observation and industrial monitoring. In this thesis we propose several approaches for optimizing the spatio-temporal operation of these devices, determining where to place them in the environment and how to control them over time in order to detect the moving targets of interest. The first contribution is a realistic detection model representing the coverage of a sensor network in its environment. For this purpose, we propose a probabilistic 3D model of a sensor's detection capability in its surroundings. The model also incorporates information about the environment through a visibility assessment based on the field of view. Starting from this detection model, spatial optimization is performed by searching for the best placement and orientation of each sensor in the network. To this end, we propose a new gradient descent-based algorithm that compares favourably with other generic black-box optimization methods in terms of terrain coverage, while being more computationally efficient. Once the sensors are placed in the environment, temporal optimization consists in properly covering a group of moving targets in the environment. First, the future positions of the moving targets detected by the sensors are predicted. Prediction is performed either using the history of other targets that have crossed the same environment (long-term prediction) or using only the previous movements of the same target (short-term prediction). We propose new algorithms in each category that perform better than, or produce results comparable to, existing methods. Once the future target locations are predicted, the sensor parameters are optimized so that the targets remain well covered for some time according to the predictions. For this, we propose a heuristic sensor-control method based on the probabilistic trajectory forecasts of the targets and on the probabilistic coverage of the targets by the sensors. Finally, the proposed spatial and temporal optimization methods were integrated and successfully applied, demonstrating a complete and effective approach to the spatio-temporal optimization of sensor networks.
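As a loose illustration of the spatial-optimization step described above, the sketch below maximizes a probabilistic coverage objective over sensor positions by gradient ascent. It is a deliberately simplified 2D stand-in for the thesis's 3D, visibility-aware detection model; the Gaussian detection profile, terrain samples and parameters are all assumptions of this example.

```python
# Minimal sketch of gradient-based sensor placement with a probabilistic
# coverage model.  The detection probability used here (isotropic Gaussian
# fall-off with distance) is a stand-in for a realistic 3D model.
import numpy as np

rng = np.random.default_rng(1)
points = rng.uniform(0, 10, size=(400, 2))     # terrain samples to cover
sensors = rng.uniform(0, 10, size=(4, 2))      # initial sensor positions
SIGMA = 1.5                                    # effective sensing range

def coverage(sensors):
    """Mean probability that each terrain point is detected by >= 1 sensor."""
    d2 = ((points[:, None, :] - sensors[None, :, :]) ** 2).sum(-1)
    p_detect = np.exp(-d2 / (2 * SIGMA ** 2))        # per-sensor detection prob.
    p_miss_all = np.prod(1.0 - p_detect, axis=1)     # missed by every sensor
    return (1.0 - p_miss_all).mean()

def numerical_grad(f, x, h=1e-4):
    """Central-difference gradient, kept simple instead of an analytic form."""
    g = np.zeros_like(x)
    for idx in np.ndindex(x.shape):
        e = np.zeros_like(x); e[idx] = h
        g[idx] = (f(x + e) - f(x - e)) / (2 * h)
    return g

step = 0.5
for it in range(200):                 # plain gradient ascent on coverage
    sensors += step * numerical_grad(coverage, sensors)
    sensors = np.clip(sensors, 0, 10)

print(f"final mean coverage: {coverage(sensors):.3f}")
```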

Relevance:

20.00%

Publisher:

Abstract:

In late 2014, a series of highly publicized police killings of unarmed Black male civilians in the United States prompted large-scale social turmoil. In the current review, we dissect the psychological antecedents of these killings and explain how the nature of police work may attract officers with distinct characteristics that may make them especially well-primed for negative interactions with Black male civilians. We use media reports to contextualize the precipitating events of the social unrest as we ground our explanations in theory and empirical research from social psychology and industrial and organizational (I/O) psychology. To isolate some of the key mechanisms at play, we disentangle racial bias (e.g., stereotyping processes) from common characteristics of law enforcement agents (e.g., social dominance orientation), while also addressing the interaction between racial bias and policing. By separating the moving parts of the phenomenon, we provide a more fine-grained analysis of the factors that may have contributed to the killings. In doing so, we endeavor to more effectively identify and develop solutions to eradicate excessive use of force during interactions between “Black” (unarmed Black male civilians) and “Blue” (law enforcement).

Relevance:

20.00%

Publisher:

Abstract:

Background: Timely assessment of the burden of HIV/AIDS is essential for policy setting and programme evaluation. In this report from the Global Burden of Disease Study 2015 (GBD 2015), we provide national estimates of levels and trends of HIV/AIDS incidence, prevalence, coverage of antiretroviral therapy (ART), and mortality for 195 countries and territories from 1980 to 2015.

Methods: For countries without high-quality vital registration data, we estimated prevalence and incidence with data from antenatal care clinics and population-based seroprevalence surveys, and with assumptions by age and sex on initial CD4 distribution at infection, CD4 progression rates (probability of progression from higher to lower CD4 cell-count category), on and off antiretroviral therapy (ART) mortality, and mortality from all other causes. Our estimation strategy links the GBD 2015 assessment of all-cause mortality and estimation of incidence and prevalence so that for each draw from the uncertainty distribution all assumptions used in each step are internally consistent. We estimated incidence, prevalence, and death with GBD versions of the Estimation and Projection Package (EPP) and Spectrum software originally developed by the Joint United Nations Programme on HIV/AIDS (UNAIDS). We used an open-source version of EPP and recoded Spectrum for speed, and used updated assumptions from systematic reviews of the literature and GBD demographic data. For countries with high-quality vital registration data, we developed the cohort incidence bias adjustment model to estimate HIV incidence and prevalence largely from the number of deaths caused by HIV recorded in cause-of-death statistics. We corrected these statistics for garbage coding and HIV misclassification.

Findings: Global HIV incidence reached its peak in 1997, at 3·3 million new infections (95% uncertainty interval [UI] 3·1–3·4 million). Annual incidence has stayed relatively constant at about 2·6 million per year (range 2·5–2·8 million) since 2005, after a period of fast decline between 1997 and 2005. The number of people living with HIV/AIDS has been steadily increasing and reached 38·8 million (95% UI 37·6–40·4 million) in 2015. At the same time, HIV/AIDS mortality has been declining at a steady pace, from a peak of 1·8 million deaths (95% UI 1·7–1·9 million) in 2005, to 1·2 million deaths (1·1–1·3 million) in 2015. We recorded substantial heterogeneity in the levels and trends of HIV/AIDS across countries. Although many countries have experienced decreases in HIV/AIDS mortality and in annual new infections, other countries have had slowdowns or increases in rates of change in annual new infections.

Interpretation: Scale-up of ART and prevention of mother-to-child transmission has been one of the great successes of global health in the past two decades. However, in the past decade, progress in reducing new infections has been slow, development assistance for health devoted to HIV has stagnated, and resources for health in low-income countries have grown slowly. Achievement of the new ambitious goals for HIV enshrined in Sustainable Development Goal 3 and the 90-90-90 UNAIDS targets will be challenging, and will need continued efforts from governments and international agencies in the next 15 years to end AIDS by 2030.
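The estimation strategy above keeps all quantities internally consistent within each draw of the uncertainty distribution. The toy sketch below illustrates only that bookkeeping pattern: rates are drawn, downstream quantities are computed draw by draw, and 95% uncertainty intervals are read off the draw-level results. The numbers are hypothetical and are unrelated to the GBD 2015 EPP/Spectrum machinery.

```python
# Toy illustration of "internally consistent draws": incidence, prevalence and
# deaths are computed within the same draw, and UIs come from the draw-level
# results.  All rates below are made-up placeholders, not GBD estimates.
import numpy as np

rng = np.random.default_rng(2)
N_DRAWS = 1000

incidence = rng.normal(2.6e6, 0.08e6, N_DRAWS)   # new infections per year (toy)
survival = rng.normal(11.0, 0.5, N_DRAWS)        # mean years lived with HIV (toy)
cfr = rng.normal(0.032, 0.002, N_DRAWS)          # deaths per person-year (toy)

prevalence = incidence * survival                 # crude steady-state approximation
deaths = prevalence * cfr

def ui(x):
    lo, hi = np.percentile(x, [2.5, 97.5])
    return f"{x.mean():.3g} (95% UI {lo:.3g}-{hi:.3g})"

for name, x in [("prevalence", prevalence), ("deaths", deaths)]:
    print(name, ui(x))
```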

Relevance:

20.00%

Publisher:

Abstract:

Thesis (Master's)--University of Washington, 2016-08

Relevance:

20.00%

Publisher:

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-08

Relevance:

20.00%

Publisher:

Abstract:

This Master’s thesis examines how management system standard requirements can be implemented in an integrated way within an organization. The aim is to determine how requirements from the management system standards ISO 14001:2015 and ISO 9001:2015 can be integrated and implemented into an existing ISO 9001:2008-compliant management system. The research was carried out as action research, applying an operating model for the integrated use of management system standards created by the International Organization for Standardization. The phases of the operating model were applied to the target organization. The similarity and integration potential of the relevant standards were assessed using comparative matrices. Allocation of the requirements and conformity assessment of the processes were carried out through gap analysis. The main results indicate that the requirements of the relevant standards are largely equivalent or serve the same purpose. The results also identify the most important processes of the target organization in terms of requirement compliance, as well as the requirements that affect each process the most. Prioritizing compliance for the most important processes and implementing the requirements with the greatest effect give organizations an opportunity to implement the integrated requirements effectively.
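A gap analysis of the kind described above can be reduced to a matrix that maps integrated requirements to processes and flags non-conforming cells. The sketch below shows only that structure; the clause numbers, process names and conformity flags are illustrative placeholders rather than the target organization's data.

```python
# Minimal gap-analysis sketch: map integrated standard requirements to the
# processes they affect and flag where conformity is still missing.
# All clauses, processes and flags below are illustrative placeholders.
requirements = {
    "ISO 9001:2015 / ISO 14001:2015 - 6.1 Risks and opportunities": ["Management review"],
    "ISO 14001:2015 - 8.1 Operational planning and control": ["Production", "Purchasing"],
    "ISO 9001:2015 - 9.1 Monitoring and measurement": ["Quality control"],
}
conformity = {   # current state of each process against the new clauses
    ("Management review", "6.1"): False,
    ("Production", "8.1"): True,
    ("Purchasing", "8.1"): False,
    ("Quality control", "9.1"): True,
}

for req, processes in requirements.items():
    clause = req.split(" - ")[1].split(" ")[0]
    for proc in processes:
        ok = conformity.get((proc, clause), False)
        status = "conforms" if ok else "GAP - action needed"
        print(f"{req:60s} -> {proc:18s}: {status}")
```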

Relevance:

20.00%

Publisher:

Abstract:

Immunization is the most cost-effective intervention for infectious diseases, which are the major cause of morbidity and mortality worldwide. Vaccines not only protect the individual who is vaccinated but also reduce the burden of infectious vaccine-preventable diseases for the entire community.1 Adult vaccination is very important given that >25% of mortality is due to infectious diseases.2 There is a scarcity of information on the vaccination status of young adults and the role of socioeconomic conditions in India.

Relevance:

20.00%

Publisher:

Abstract:

According to the Declaration of Helsinki, as well as the World Health Organization's Statement on Public Disclosure of Clinical Trial Results, every researcher has the ethical obligation to publish the results of all trials with human participants completely and accurately within 12 months after the end of the trial.1,2 Nevertheless, for several reasons, not all research results are published accurately, if they are published at all. This phenomenon of publication bias may not only create a false impression of the reliability of the clinical research enterprise, but it may also distort the evidence behind clinical conclusions about the best treatments, which are mostly based on published data and results.

Relevance:

20.00%

Publisher:

Abstract:

Atomic layer deposition (ALD) has been recognized as a promising method to deposit conformal and uniform thin film of copper for future electronic devices. However, many aspects of the reaction mechanism and the surface chemistry of copper ALD remain unclear. In this paper, we employ plane wave density functional theory (DFT) to study the transmetalation ALD reaction of copper dimethylamino-2-propoxide [Cu(dmap)2] and diethylzinc [Et2Zn] that was realized experimentally by Lee et al. [ Angew. Chem., Int. Ed. 2009, 48, 4536−4539]. We find that the Cu(dmap)2 molecule adsorbs and dissociates through the scission of one or two Cu–O bonds into surface-bound dmap and Cu(dmap) fragments during the copper pulse. As Et2Zn adsorbs on the surface covered with Cu(dmap) and dmap fragments, butane formation and desorption was found to be facilitated by the surrounding ligands, which leads to one reaction mechanism, while the migration of ethyl groups to the surface leads to another reaction mechanism. During both reaction mechanisms, ligand diffusion and reordering are generally endothermic processes, which may result in residual ligands blocking the surface sites at the end of the Et2Zn pulse, and in residual Zn being reduced and incorporated as an impurity. We also find that the nearby ligands play a cooperative role in lowering the activation energy for formation and desorption of byproducts, which explains the advantage of using organometallic precursors and reducing agents in Cu ALD. The ALD growth rate estimated for the mechanism is consistent with the experimental value of 0.2 Å/cycle. The proposed reaction mechanisms provide insight into ALD processes for copper and other transition metals.
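For orientation, the reported growth rate can be turned into a cycle count for a given film thickness; the target thickness in the short sketch below is an arbitrary illustrative value, not one from the paper.

```python
# Back-of-envelope use of the reported growth rate: at ~0.2 A per ALD cycle,
# how many Cu(dmap)2 / Et2Zn cycles would a target film thickness require?
GROWTH_PER_CYCLE_A = 0.2          # angstrom per cycle (value quoted above)
target_nm = 10.0                  # hypothetical target thickness

cycles = target_nm * 10.0 / GROWTH_PER_CYCLE_A   # 1 nm = 10 angstrom
print(f"{cycles:.0f} cycles for a {target_nm} nm film")   # -> 500 cycles
```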

Relevance:

20.00%

Publisher:

Abstract:

This index is designed to inform map users of the various series of maps produced and distributed by the U.S. Geological Survey, and to assist users in selecting and purchasing maps.

Relevance:

20.00%

Publisher:

Abstract:

Context. Recent observations of brown dwarf spectroscopic variability in the infrared infer the presence of patchy cloud cover.

Aims. This paper proposes a mechanism for producing inhomogeneous cloud coverage due to the depletion of cloud particles through the Coulomb explosion of dust in atmospheric plasma regions. Charged dust grains Coulomb-explode when the electrostatic stress of the grain exceeds its mechanical tensile stress, which results in grains below a critical radius a < a_crit^Coul being broken up.

Methods. This work outlines the criteria required for the Coulomb explosion of dust clouds in substellar atmospheres, the effect on the dust particle size distribution function, and the resulting radiative properties of the atmospheric regions.

Results. Our results show that for an atmospheric plasma region with an electron temperature of Te = 10 eV (≈10^5 K), the critical grain radius varies from 10^-7 to 10^-4 cm, depending on the grains' tensile strength. Higher critical radii up to 10^-3 cm are attainable for higher electron temperatures. We find that the process produces a bimodal particle size distribution composed of stable nanoscale seed particles and dust particles with a ≥ a_crit^Coul, with the intervening particle sizes defining a region devoid of dust. As a result, the dust population is depleted, and the clouds become optically thin in the wavelength range 0.1–10 μm, with a characteristic peak that shifts to higher wavelengths as more sub-micrometer particles are destroyed.

Conclusions. In an atmosphere populated with a distribution of plasma volumes, this will yield regions of contrasting radiative properties, thereby giving a source of inhomogeneous cloud coverage. The results presented here may also be relevant for dust in supernova remnants and protoplanetary disks.
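To see roughly where the quoted range of critical radii comes from, the sketch below evaluates a simplified version of the Coulomb-explosion criterion: the grain is assumed to charge to a surface potential of a few kT_e/e, and it fragments when the electrostatic stress ε0φ²/(2a²) exceeds the tensile strength Σ, so a_crit ≈ φ√(ε0/(2Σ)). The charging factor and tensile-strength values are assumptions of this example, not the paper's detailed model.

```python
# Minimal numerical sketch of the Coulomb-explosion criterion.  Assumptions
# (not the paper's full model): surface potential phi ~ 2.5 * T_e[eV] volts,
# fragmentation when eps0*phi^2 / (2*a^2) > Sigma, i.e.
# a_crit = phi * sqrt(eps0 / (2*Sigma)).
import numpy as np

EPS0 = 8.854e-12            # vacuum permittivity, F/m
T_e_eV = 10.0               # electron temperature quoted in the abstract
phi = 2.5 * T_e_eV          # volts; floating-potential-like estimate (assumption)

for sigma in (1e3, 1e6, 1e9):            # tensile strength in Pa (weak -> strong grains)
    a_crit_m = phi * np.sqrt(EPS0 / (2.0 * sigma))
    print(f"Sigma = {sigma:.0e} Pa  ->  a_crit ~ {a_crit_m * 100:.1e} cm")
# Spans roughly 1e-7 to 1e-4 cm, consistent with the range quoted above.
```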

Relevance:

20.00%

Publisher:

Abstract:

Recent evidence indicates that tRNA modifications and tRNA modifying enzymes may play important roles in complex human diseases such as cancer, neurological disorders and mitochondrial-linked diseases. We postulate that expression deregulation of tRNA modifying enzymes affects the level of tRNA modifications and, consequently, their function and the translation efficiency of the codons read by the corresponding tRNAs. Due to the degeneracy of the genetic code, most amino acids are encoded by two to six synonymous codons. This degeneracy and the biased usage of synonymous codons cause effects that range from altered protein folding to enhanced translation efficiency of a select group of genes. In this work, we focused on cancer and performed a meta-analysis comparing microarray gene expression profiles reported by previous studies and evaluating the codon usage in different types of cancer in which tRNA modifying enzymes were found to be deregulated. A total of 36 different tRNA modifying enzymes were found to be deregulated in most of the cancer datasets analyzed. The codon usage analysis revealed a preference for codons ending in A or U in the up-regulated genes, while the down-regulated genes showed a preference for codons ending in G or C. A PCA biplot analysis showed the same tendency. We also analyzed the codon usage of the datasets in which the CTU2 tRNA modifying enzyme was found deregulated, as this enzyme affects the wobble position (position 34) of specific tRNAs. Our data point to a distinct codon usage pattern between up- and down-regulated genes in cancer, which might be caused by the deregulation of specific tRNA modifying enzymes. This codon usage bias may increase the transcription and translation efficiency of some genes that would otherwise, under normal conditions, be translated less efficiently.
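The codon-usage tabulation described above boils down to counting codons and classifying them by the wobble (third) position. The sketch below shows that counting step on an arbitrary illustrative coding sequence; it is not the meta-analysis pipeline.

```python
# Minimal sketch of a codon-usage tally: count codons in a coding sequence and
# compare the fraction ending in A/U versus G/C (the wobble position affected
# by enzymes such as CTU2).  The CDS below is an arbitrary illustrative string.
from collections import Counter

def codon_usage(cds):
    cds = cds.upper().replace("T", "U")
    codons = [cds[i:i + 3] for i in range(0, len(cds) - len(cds) % 3, 3)]
    counts = Counter(codons)
    total = sum(counts.values())
    au_ending = sum(n for c, n in counts.items() if c[2] in "AU")
    return counts, au_ending / total, 1 - au_ending / total

cds = "ATGGCTAAAGGTTTACGATCCGAAGCTTAA"
counts, frac_au, frac_gc = codon_usage(cds)
print(counts)
print(f"A/U-ending: {frac_au:.2f}   G/C-ending: {frac_gc:.2f}")
```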

Relevance:

20.00%

Publisher:

Abstract:

The informational properties of biological systems are the subject of much debate and research. I present a general argument in favor of the existence and central importance of information in organisms, followed by a case study of the genetic code (specifically, codon bias) and the translation system from the perspective of information. The codon biases of 831 Bacteria and Archaea are analyzed and modeled as points in a 64-dimensional statistical space. The major results are that (1) codon bias evolution does not follow canonical patterns, and (2) the use of coding space in organisms is a subset of the total possible coding space. These findings imply that codon bias is a unique adaptive mechanism that owes its existence to organisms' use of information in representing genes, and that there is a particularly biological character to the resulting biased coding and information use.
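The 64-dimensional representation mentioned above simply maps each genome to a vector of relative codon frequencies. The sketch below builds such a vector for one toy coding sequence; a real analysis would aggregate all annotated genes of each of the 831 genomes.

```python
# Minimal sketch of the "point in 64-dimensional space" representation: a
# coding sequence is mapped to the relative frequencies of the 64 codons.
from itertools import product
from collections import Counter

ALL_CODONS = ["".join(c) for c in product("ACGU", repeat=3)]   # fixed 64-dim axes

def codon_bias_vector(cds):
    cds = cds.upper().replace("T", "U")
    codons = Counter(cds[i:i + 3] for i in range(0, len(cds) - len(cds) % 3, 3))
    total = sum(codons.values()) or 1
    return [codons.get(c, 0) / total for c in ALL_CODONS]

vec = codon_bias_vector("ATGGCTGCCGCAAAGAAATAA")   # arbitrary toy CDS
print(len(vec), sum(vec))                          # 64 dimensions, sums to 1
```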