898 results for 340402 Econometric and Statistical Methods
Abstract:
The Earth we know today was not always so. Over millions of years it has undergone significant changes brought about by numerous geological phenomena that tend towards its equilibrium: some internal in origin, creating new geological formations, and others external, smoothing formations previously created. From the tectonic standpoint, Angola is located in a relatively stable area, which gives it a certain privilege compared with some Asian or American countries where earthquakes and volcanic eruptions occur quite often. However, the same cannot be said of external geodynamic phenomena such as ravines, which in recent years have taken shape in many provinces, especially owing to anthropogenic activity, giving rise to geological hazards: increased risk of damage to buildings and other infrastructure, direct or indirect losses in economic activities, and loss of human lives. We understand that reducing these risks starts, in particular, with their identification, so that preventive measures can then be taken. This work is the result of research carried out by the authors through courses on soil erosion and on the stabilization of soils subject to erosion phenomena, run by the Engineering Laboratory of Angola (LEA). For its realization, we consulted cartographic data and the literature, listened to some of the provincial representatives and local residents, and observed some affected areas in loco. The results allow us to infer that the main provinces affected by the ravine phenomenon are located in the Central and Northern highlands, as well as in the eastern region and, more recently, in Cuando-Cubango province, without however ruling out other regions, such as Luanda and Cabinda [1]. As for the causes, we can say that the ravines in Angola are primarily due to the combination of three natural factors: climate, topography and soil type [2].
When anthropogenic activity is added, namely construction works, obstruction of drainage systems, mineral exploitation, agriculture and fires, the phenomenon intensifies, often requiring immediate action. These interventions can take the form of structural or engineering measures, or of stabilization measures applied to the degraded soil cover [3]. We present an example of stabilization through the planting of a local vegetation species, Pennisetum purpureum. It is expected that the results may contribute to a better understanding of the causes of the ravine phenomenon in Angola, and that the adopted stabilization method can be adapted in other affected provinces in order to prevent and contain the ravines.
Abstract:
The book Worldwide Wound Healing - Innovation in Natural and Conventional Methods develops a set of themes on the healing and treatment of complex wounds through evidence-based practice, with innovations in the use of natural and conventional methods. It takes an innovative approach that promotes the integration of conventional and natural perspectives in wound healing, with a unique focus on the patient's quality of life.
Abstract:
This study aimed to compare four establishment methods of mixed swards of Tangolagrass and forage peanut (Arachis pintoi).
Abstract:
Crash prediction models are used for a variety of purposes, including forecasting the expected future performance of various transportation system segments with similar traits. The influence of intersection features on safety has been examined extensively because intersections experience a relatively large proportion of motor vehicle conflicts and crashes compared to other segments in the transportation system. The effects of left-turn lanes at intersections in particular have seen mixed results in the literature: some researchers have found that left-turn lanes are beneficial to safety, while others have reported detrimental effects. This inconsistency is not surprising given that the installation of left-turn lanes is often endogenous, that is, influenced by crash counts and/or traffic volumes. Endogeneity creates problems in econometric and statistical models and is likely to account for the inconsistencies reported in the literature. This paper reports on a limited-information maximum likelihood (LIML) estimation approach to compensate for endogeneity between left-turn lane presence and angle crashes. The effects of endogeneity are mitigated using the approach, revealing the unbiased effect of left-turn lanes on crash frequency for a dataset of Georgia intersections. The research shows that without accounting for endogeneity, left-turn lanes ‘appear’ to contribute to crashes; however, when endogeneity is accounted for in the model, left-turn lanes reduce angle crash frequencies, as expected by engineering judgment. Other endogenous variables may lurk in crash models as well, suggesting that the method may be used to correct simultaneity problems with other variables and in other transportation modeling contexts.
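The endogeneity problem described above can be made concrete with a small simulation. The abstract's LIML count-data model is not reproduced here; instead, this sketch uses a simple linear instrumental-variables estimator on synthetic data (all variable names and numbers are invented) to show how an endogenous regressor biases the naive estimate while an instrument recovers the true effect.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Instrument z (exogenous), plus correlated errors u, v that create endogeneity:
z = rng.normal(size=n)
v = rng.normal(size=n)
u = 0.8 * v + rng.normal(size=n)   # u correlated with v => x is endogenous

x = z + v          # regressor partly driven by the same shock v as y's error
y = 2.0 * x + u    # true causal effect is 2.0

# Naive OLS slope: biased upward because cov(x, u) != 0
ols = np.cov(x, y)[0, 1] / np.var(x)

# IV estimator using z: consistent despite the endogeneity
iv = np.cov(z, y)[0, 1] / np.cov(z, x)[0, 1]

print(f"OLS: {ols:.3f}  IV: {iv:.3f}  (truth: 2.0)")
```

With this setup the theoretical OLS bias is cov(x, u)/var(x) = 0.4, so OLS lands near 2.4 while the IV estimate stays near the true 2.0, mirroring the sign and size distortions the paper attributes to endogeneity.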
Abstract:
Genome-wide association studies (GWAS) have identified around 60 common variants associated with multiple sclerosis (MS), but these loci only explain a fraction of the heritability of MS. Some missing heritability may be caused by rare variants, which have been suggested to play an important role in the aetiology of complex diseases such as MS. However, current genetic and statistical methods for detecting rare variants are expensive and time-consuming. 'Population-based linkage analysis' (PBLA), or so-called identity-by-descent (IBD) mapping, is a novel way to detect rare variants in extant GWAS datasets. We employed BEAGLE fastIBD to search for rare MS variants utilising IBD mapping in a large GWAS dataset of 3,543 cases and 5,898 controls. We identified a genome-wide significant linkage signal on chromosome 19 (LOD = 4.65; p = 1.9×10⁻⁶). Network analysis of cases and controls sharing haplotypes on chromosome 19 further strengthened the association, as there are more large networks of cases sharing haplotypes than of controls. This linkage region includes a cluster of zinc finger genes of unknown function. Analysis of genome-wide transcriptome data suggests that genes in this zinc finger cluster may be involved in very early developmental regulation of the CNS. Our study also indicates that BEAGLE fastIBD allows identification of rare variants in large unrelated populations with moderate computational intensity. Even with the development of whole-genome sequencing, IBD mapping may still be a promising way to narrow down the region of interest for sequencing priority. © 2013 Lin et al.
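The quoted significance level can be checked against the LOD score. Assuming the standard large-sample conversion for a one-degree-of-freedom linkage test (χ² = 2 ln 10 · LOD, evaluated one-sided), a few lines of Python reproduce the reported p ≈ 1.9×10⁻⁶ from LOD = 4.65:

```python
import math

def lod_to_p(lod: float) -> float:
    """One-sided p-value for a linkage LOD score.

    Uses the standard large-sample equivalence chi2 = 2*ln(10)*LOD with one
    degree of freedom, i.e. p = P(Z > sqrt(chi2)) for standard normal Z.
    """
    chi2 = 2.0 * math.log(10.0) * lod
    z = math.sqrt(chi2)
    return 0.5 * math.erfc(z / math.sqrt(2.0))   # normal upper-tail probability

p = lod_to_p(4.65)
print(f"LOD 4.65 -> p = {p:.2g}")
```

The result (≈1.9×10⁻⁶) matches the abstract's figure, confirming the two reported statistics are consistent under this conversion.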
Abstract:
The aim of the thesis was to study the extent of spatial concentration of immigrant population in Helsinki and to analyse the impact of housing policy on ethnic residential segregation in 1992-2005. For the purpose of the study, immigrant population was defined based on the language spoken at home. The theory of residential segregation by Andersson and Molina formed the main theoretical framework for the study. According to Andersson and Molina, ethnic residential segregation results from different dynamic intra-urban migration processes. Institutionally generated migration, i.e. migration patterns generated by various housing and immigrant policies and procedures, is one of the central factors in the development of ethnic segregation. The data of the study consisted of population and housing statistics and housing and immigrant policy documents of Helsinki municipality. Spatial concentration of immigrant population was studied both at district and building levels using GIS methods and statistical methods. The housing policy of Helsinki municipality was analysed using a method created by Musterd et al., who categorise two types of policy approaches to residential segregation: spatial dispersion policy and compensating policy. The housing policy of Helsinki has a strong focus on social mixing and spatial dispersion of housing stock. Ethnic segregation is regarded as a threat. The importance of ethnic communities and networks is, however, acknowledged, and small-scale concentration is therefore not considered harmful. Despite the spatial dispersion policy, the immigrant population is concentrated in the eastern, north-eastern and north-western suburbs of Helsinki. The spatial pattern of concentration was formed already at the beginning of the 1990s, when immigration to Finland suddenly peaked. New immigrant groups were housed in the neighbourhoods where public housing was available at the time.
Housing policy, namely the location of new residential areas and public housing blocks and the policies of public housing allocation, comprised key factors influencing the residential patterns of immigrant population in the 1990s. The immigration and refugee policies of the state have also had an impact on the development. The concentration of immigrant population has continued in the same areas in the beginning of the 2000s. Dispersion to new areas has mainly taken place within the eastern and north-eastern parts of the city or in the adjacent areas. The migration patterns of the native population and the reasonably rapid changes in the housing market have emerged as new factors generating and influencing ethnic residential segregation in Helsinki in the 2000s. Due to social mixing and spatial dispersion policies, ethnic segregation in Helsinki has so far been fairly small-scale, concentrated in particular housing blocks. The number of residential buildings with a high share of immigrant population is very modest. However, the number of such buildings doubled between 1996 and 2002. The concentration of immigrant population concerns mainly the public housing sector. The difference in the level of concentration between the public housing sector and privately owned housing companies is remarkable.
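Concentration of the kind measured in the thesis is often summarised with the index of dissimilarity, the share of either group that would have to relocate for the two spatial distributions to match. This is a generic sketch on invented district counts, not the Helsinki data:

```python
import numpy as np

def dissimilarity_index(group_a: np.ndarray, group_b: np.ndarray) -> float:
    """Index of dissimilarity D between two groups across areal units.

    0 = the groups are identically distributed; 1 = complete segregation.
    """
    pa = group_a / group_a.sum()   # each district's share of group A
    pb = group_b / group_b.sum()   # each district's share of group B
    return 0.5 * np.abs(pa - pb).sum()

# Toy district-level counts (hypothetical, for illustration only):
immigrants = np.array([10.0, 80.0, 10.0])
natives    = np.array([40.0, 20.0, 40.0])
print(round(dissimilarity_index(immigrants, natives), 3))
```

Here 60% of either group would need to move to equalise the distributions, a strongly segregated toy pattern.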
Abstract:
Marked changes in Brazil's electoral geography with respect to presidential elections took place in 2006. Candidates of the Workers' Party (Partido dos Trabalhadores, PT), who had previously polled well in the more developed regions of the country, began to win elections by wide margins in the poorest regions, namely the North and Northeast. This work traces the history of the electoral relationship between Brazil's Northeast region and the PT's presidential candidates since redemocratisation. The objective is to understand the relation between the choice to vote for this party's candidates and the socioeconomic development of the region over time, also taking the Bolsa Família programme into account in an attempt to demonstrate the rationality of that vote. In addition, this work presents a case study of voting in the city of Viçosa do Ceará to illustrate that the steering of votes for federal-level candidates by local representatives is no longer the rule in the region but the exception. The investigation uses bibliographic and historical analysis and statistical methods, chiefly spatial analysis techniques.
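The spatial analysis techniques mentioned at the end typically include tests for spatial autocorrelation of vote shares. As an illustration only (the study's actual methods and data are not reproduced), Moran's I can be computed with a few lines of numpy on a toy set of hypothetical municipalities:

```python
import numpy as np

def morans_i(x: np.ndarray, w: np.ndarray) -> float:
    """Moran's I statistic of spatial autocorrelation.

    x: values observed in each areal unit (e.g. municipal vote shares).
    w: spatial weights matrix (w[i, j] > 0 when units i and j are neighbours).
    """
    n = len(x)
    d = x - x.mean()
    num = (w * np.outer(d, d)).sum()   # cross-products between neighbours
    return (n / w.sum()) * num / (d @ d)

# Toy example: six municipalities on a line; neighbours share an edge.
# Vote shares are spatially clustered (high in the first three units).
votes = np.array([1.0, 1.0, 1.0, 0.0, 0.0, 0.0])
w = np.zeros((6, 6))
for i in range(5):
    w[i, i + 1] = w[i + 1, i] = 1.0

print(morans_i(votes, w))   # positive: a spatially clustered pattern
```

A value well above the expectation under spatial randomness (roughly −1/(n−1)) signals the kind of regional vote clustering the abstract describes.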
Abstract:
Nolan and Temple Lang argue that “the ability to express statistical computations is an essential skill.” A key related capacity is the ability to conduct and present data analysis in a way that another person can understand and replicate. The copy-and-paste workflow that is an artifact of antiquated user-interface design makes reproducibility of statistical analysis more difficult, especially as data become increasingly complex and statistical methods become increasingly sophisticated. R Markdown is a new technology that makes creating fully-reproducible statistical analysis simple and painless. It provides a solution suitable not only for cutting-edge research, but also for use in an introductory statistics course. We present experiential and statistical evidence that R Markdown can be used effectively in introductory statistics courses, and discuss its role in the rapidly-changing world of statistical computation.
Abstract:
A design methodology based on numerical modelling, integrated with optimisation techniques and statistical methods, to aid the process control of micro- and nano-electronics based manufacturing processes is presented in this paper. The design methodology is demonstrated for a micro-machining process called Focused Ion Beam (FIB). This process has been modelled to help understand how a pre-defined geometry of micro- and nano-structures can be achieved using this technology. The process performance is characterised on the basis of Reduced Order Models (ROMs), generated using results from a mathematical model of the Focused Ion Beam and Design of Experiments (DoE) methods. Two ion beam sources, Argon and Gallium ions, have been used to compare and quantify the process variable uncertainties that can be observed during the milling process. The evaluation of the process performance takes into account the uncertainties and variations of the process variables and is used to identify their impact on the reliability and quality of the fabricated structure. The optimisation-based design task is to identify the optimal process conditions, by varying the process variables, so that certain quality objectives and requirements are achieved and imposed constraints are satisfied. The software tools used and developed to demonstrate the design methodology are also presented.
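The ROM-plus-DoE workflow described above can be sketched generically. The response function below is an invented stand-in for the paper's FIB simulation; the point is the flow: run a factorial design, fit a cheap surrogate by least squares, then query the surrogate instead of the expensive model.

```python
import numpy as np
from itertools import product

# Hypothetical process response (a stand-in for expensive FIB simulation runs):
# milled depth as a function of coded beam current i and dwell time t.
def simulate_depth(i, t):
    return 1.2 * i + 0.8 * t + 0.5 * i * t   # assumed ground truth

# 1) Design of Experiments: full-factorial grid over coded variables in [-1, 1].
levels = [-1.0, 0.0, 1.0]
X = np.array(list(product(levels, levels)))          # 9 design points
y = np.array([simulate_depth(i, t) for i, t in X])   # "experiment" results

# 2) Reduced Order Model: least-squares fit of a bilinear response surface
#    depth ~ b0 + b1*i + b2*t + b3*i*t
A = np.column_stack([np.ones(len(X)), X[:, 0], X[:, 1], X[:, 0] * X[:, 1]])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

# 3) Query the cheap ROM at arbitrary setpoints instead of the full model:
def rom(i, t):
    return coef @ np.array([1.0, i, t, i * t])

print(f"ROM prediction at (0.3, -0.2): {rom(0.3, -0.2):.4f}")
```

Because the toy response is itself bilinear, the surrogate here is exact; in practice the fitted surface only approximates the full model, and the DoE spread of design points controls that approximation error.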
Abstract:
This paper presents a design methodology based on numerical modelling, integrated with optimisation techniques and statistical methods, to aid the development of new advanced technologies in the area of micro and nano systems. The design methodology is demonstrated for a micro-machining process called Focused Ion Beam (FIB). This process has been modelled to provide knowledge of how a pre-defined geometry can be achieved through this direct milling. The geometry characterisation is obtained using a Reduced Order Model (ROM), generated from the results of a mathematical model of the Focused Ion Beam, and Design of Experiments (DoE) methods. In this work, the focus is on the design flow methodology, which includes an approach for incorporating process parameter uncertainties into the process optimisation modelling framework. A discussion of the impact of the process parameters, and their variations, on the quality and performance of the fabricated structure is also presented. The design task is to identify the optimal process conditions, by altering the process parameters, so that a certain reliability and confidence of the application is achieved and the imposed constraints are satisfied. The software tools used and developed to demonstrate the design methodology are also presented.
Abstract:
Mechanistic models such as those based on dynamic energy budget (DEB) theory are emergent ecomechanics tools to investigate the extent of fitness in organisms through changes in life history traits as explained by bioenergetic principles. The rapid growth in interest around this approach originates from the mechanistic characteristics of DEB, which are based on a number of rules dictating the use of mass and energy flow through organisms. One apparent bottleneck in DEB applications is the estimation of DEB parameters, which is based on mathematical and statistical methods (the covariation method). The parameterisation process begins with knowledge of some functional traits of a target organism (e.g. embryo, sexual maturity and ultimate body size, feeding and assimilation rates, maintenance costs), identified from the literature or laboratory experiments. However, considering the prominent role of the mechanistic approach in ecology, the reduction of possible uncertainties is an important objective. We propose a re-evaluation of the laboratory procedures commonly used in ecological studies to estimate DEB parameters in marine bivalves. Our experimental organism was Brachidontes pharaonis. We supported our proposal with a validation exercise which compared life history traits as obtained by DEB models (implemented with parameters obtained using classical laboratory methods) with the actual set of species traits obtained in the field. Correspondence between the two approaches was very high (>95%) with respect to estimating both size and fitness. Our results demonstrate good agreement between field data and model output for the effect of temperature and food density on the age-size curve, maximum body size and total gamete production per life span. The mechanistic approach is a promising method of providing accurate predictions in a world that is under increasing anthropogenic pressure.
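As an illustration of covariation-style parameter fitting (a simplified sketch, not the authors' procedure), one can fit the von Bertalanffy growth curve L(t) = L∞(1 − e^(−kt)), which standard DEB theory predicts for length growth at constant food density, to synthetic age-length data:

```python
import numpy as np

# Synthetic age-length observations (hypothetical, noiseless for clarity):
t = np.linspace(0.5, 10.0, 20)
L_inf_true, k_true = 30.0, 0.5
L = L_inf_true * (1.0 - np.exp(-k_true * t))   # von Bertalanffy growth

# Estimate (L_inf, k): grid search over k; for each k, L_inf has a
# closed-form least-squares solution, so the problem is one-dimensional.
best = None
for k in np.linspace(0.05, 1.5, 291):
    x = 1.0 - np.exp(-k * t)
    L_inf = (x @ L) / (x @ x)              # least-squares L_inf given k
    sse = ((L_inf * x - L) ** 2).sum()
    if best is None or sse < best[0]:
        best = (sse, k, L_inf)

sse, k_hat, L_inf_hat = best
print(f"k = {k_hat:.3f}, L_inf = {L_inf_hat:.2f}")
```

With noiseless data the grid search recovers the generating parameters; with real measurements the same scheme gives a quick initial estimate before a proper nonlinear fit.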
Abstract:
The objective of this study is to capture a picture of the explicitly recognisable elements of communication research visible in the scholarly journals Canadian Journal of Communication and Communication from 1974 to 2005. It is a bibliometric analysis of the articles published by researchers at Canadian institutions and of their bibliographic references. Bibliometrics is "the application of statistical methods to books and other media of communication" (Pritchard, 1969: 348-349). This is the first time an analysis of this type has been attempted on this particular corpus. We rely on theoretical premises drawn from the sociology of science and from studies of scholarly communication. The guiding idea is the following: scientific activity is a "continuum of creation of new knowledge" (Vassallo, 1999), whose organisation is based on the exchange of information (Price, 1963; Crane, 1972), which translates into social recognition and scientific authority and constitutes an investment in the acquisition of credibility (Merton, 1938; Hagstrom, 1965; Bourdieu, 1975; Latour and Woolgar, 1986). From the analysis of the articles, we identify whether they are the result of empirical or fundamental research, or the product of critical reflection. We also detect the methodological approaches and investigative techniques used, as well as the subjects addressed by the researchers, and identify the main sites of research (universities and types of departments). We further analyse the articles' themes. Finally, we analyse the articles' bibliographic references in order to identify the sources of ideas detectable in them. Our main corpus comprises 1,154 articles and 12,840 referenced document titles.
The bibliometric analysis of the articles reveals Canadian communication research to be primarily qualitative, interested in historical specificities, social context and the understanding of the interrelations underlying communication phenomena, particularly in Canada and Quebec. At the heart of these studies stands, above all, the application of qualitative content analysis to the media in general. From 1980 onwards, however, the exploration of cinema, audiovisual media and new information and communication technologies, together with the multiplication of research topics, signals a shift in the order of interests. Communication and the CJC are nevertheless distinguished by the linguistic origin of the researchers who publish in them and by their themes. The analysis of the bibliographic references and their authors highlights the shared interest of researchers at Canadian universities in regulatory agencies and Canadian government policy, often relying on the analysis of legislative documents and reports of various Canadian government commissions of inquiry. The analysis also reveals the researchers' main theoretical and methodological inspirations. Among the most cited are Innis, McLuhan, Habermas, Tuchman, Bourdieu, Foucault, Raboy and Rogers, although these references evolve over time. A relatively clear distinction also emerges between the sources cited by francophone and anglophone research.
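The core bibliometric tally behind identifying the most-cited authors is a straightforward citation count over the reference list. A minimal sketch on an invented toy set of (article, cited author) pairs, not the study's actual 12,840-reference corpus:

```python
from collections import Counter

# Hypothetical reference data: (citing article id, cited author) pairs.
references = [
    ("a1", "Innis"), ("a1", "McLuhan"), ("a2", "McLuhan"),
    ("a2", "Habermas"), ("a3", "Innis"), ("a3", "Foucault"),
    ("a4", "Innis"),
]

# Tally citations per author and rank them:
counts = Counter(author for _, author in references)
print(counts.most_common(3))
```

Real bibliometric work adds author-name disambiguation and per-period breakdowns, which is how trends such as "these references evolve over time" are detected.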
Abstract:
In this thesis we propose a system for determining, from data posted on microblogs, the events that attract users' interest over a given period and the salient dates of each event. Given its high usage rate and the accessibility of its data, we used the Twitter platform as our data source. In this work we deal with tweets about Tunisia, most of them written by Tunisians. The first task of our system was to extract tweets automatically and continuously over 67 days (from 8 February to 15 April 2012). We assumed that an event is represented by several terms whose frequency rises sharply at one or more points during the analysed period. The lack of resources for determining which terms (notably hashtags) refer to the same topic obliged us to propose methods for grouping similar terms. To do so, we drew on phonetic methods, adapted to the writing style used by Tunisians, as well as statistical methods. To assess the validity of our methods, we asked experts, native speakers of the Tunisian dialect, to evaluate the results they returned. These term groups were then used to determine the topic of each tweet and/or to expand the tweets with new terms. Finally, to select the set of events, we relied on three criteria: frequency, variation and TF-IDF. The results obtained demonstrate the robustness of our system.
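The phonetic grouping of spelling variants can be illustrated with a simplified key function. The digit-to-letter substitutions below are common Arabizi conventions assumed for illustration; the thesis's actual adapted rules are not reproduced here:

```python
import re

# Assumed Arabizi digit-to-letter substitutions (a simplification):
ARABIZI = {"2": "a", "3": "a", "5": "kh", "7": "h", "9": "q"}

def phonetic_key(term: str) -> str:
    """Crude phonetic key: normalise Arabizi digits, drop vowels, collapse
    repeated letters, so that spelling variants share one key."""
    s = term.lower().lstrip("#")
    for digit, letter in ARABIZI.items():
        s = s.replace(digit, letter)
    s = re.sub(r"[aeiouy]", "", s)       # vowels vary most across spellings
    s = re.sub(r"(.)\1+", r"\1", s)      # collapse doubled consonants
    return s

# Spelling variants of the same word land in the same group:
print(phonetic_key("7ourria"), phonetic_key("hourriya"), phonetic_key("horeya"))
```

Terms are then bucketed by key, which is the grouping step the frequency and TF-IDF criteria operate on.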
Abstract:
The growing human population will require a significant increase in agricultural production. This challenge is made more difficult by the fact that changes in the climatic and environmental conditions under which crops are grown have resulted in the appearance of new diseases, whereas genetic changes within the pathogen have resulted in the loss of previously effective sources of resistance. To help meet this challenge, advanced genetic and statistical methods of analysis have been used to identify new resistance genes through global screens, and studies of plant-pathogen interactions have been undertaken to uncover the mechanisms by which disease resistance is achieved. The informed deployment of major, race-specific and partial, race-nonspecific resistance, either by conventional breeding or transgenic approaches, will enable the production of crop varieties with effective resistance without impacting on other agronomically important crop traits. Here, we review these recent advances and progress towards the ultimate goal of developing disease-resistant crops.
Abstract:
The Compact Muon Solenoid (CMS) detector is described. The detector operates at the Large Hadron Collider (LHC) at CERN. It was conceived to study proton-proton (and lead-lead) collisions at a centre-of-mass energy of 14 TeV (5.5 TeV nucleon-nucleon) and at luminosities up to 10^34 cm^-2 s^-1 (10^27 cm^-2 s^-1). At the core of the CMS detector sits a high-magnetic-field and large-bore superconducting solenoid surrounding an all-silicon pixel and strip tracker, a lead-tungstate scintillating-crystal electromagnetic calorimeter, and a brass-scintillator sampling hadron calorimeter. The iron yoke of the flux-return is instrumented with four stations of muon detectors covering most of the 4π solid angle. Forward sampling calorimeters extend the pseudorapidity coverage to high values (|η| ≤ 5), assuring very good hermeticity. The overall dimensions of the CMS detector are a length of 21.6 m, a diameter of 14.6 m and a total weight of 12,500 t.
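The forward coverage figure |η| ≤ 5 can be translated into a polar angle using the definition of pseudorapidity, η = −ln tan(θ/2). A minimal sketch:

```python
import math

def theta_from_eta(eta: float) -> float:
    """Polar angle theta (radians) for pseudorapidity eta = -ln(tan(theta/2))."""
    return 2.0 * math.atan(math.exp(-eta))

# eta = 0 is perpendicular to the beam; eta = 5 is extremely forward.
for eta in (0.0, 2.5, 5.0):
    print(f"eta = {eta:>3}: theta = {math.degrees(theta_from_eta(eta)):6.2f} deg")
```

At η = 5 the polar angle is under a degree from the beam axis, which is why covering it requires the dedicated forward sampling calorimeters mentioned above.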