864 results for fate and effect modelling
Abstract:
This paper presents general problems and approaches for spatial data analysis using machine learning algorithms. Machine learning is a very powerful approach to adaptive data analysis, modelling and visualisation. The key feature of machine learning algorithms is that they learn from empirical data and can be used in cases when the modelled environmental phenomena are hidden, nonlinear, noisy and highly variable in space and in time. Most machine learning algorithms are universal and adaptive modelling tools developed to solve the basic problems of learning from data: classification/pattern recognition, regression/mapping and probability density modelling. In the present report some of the widely used machine learning algorithms, namely artificial neural networks (ANN) of different architectures and Support Vector Machines (SVM), are adapted to the analysis and modelling of geo-spatial data. Machine learning algorithms have an important advantage over traditional models of spatial statistics when problems are considered in high-dimensional geo-feature spaces, i.e. when the dimension of the space exceeds 5. Such features are usually generated, for example, from digital elevation models, remote sensing images, etc. An important extension of these models concerns the incorporation of real-space constraints such as geomorphology, networks, and other natural structures. Recent developments in semi-supervised learning can improve the modelling of environmental phenomena by taking geo-manifolds into account. An important part of the study deals with the analysis of relevant variables and model inputs. This problem is approached using different nonlinear feature selection/feature extraction tools. To demonstrate the application of machine learning algorithms, several case studies are considered: digital soil mapping using SVM; automatic mapping of soil and water system pollution using ANN; natural hazard risk analysis (avalanches, landslides); and assessment of renewable resources (wind fields) with SVM and ANN models. The dimensionality of the spaces considered varies from 2 to more than 30. Figures 1, 2 and 3 demonstrate some results of the studies and their outputs. Finally, the results of environmental mapping are discussed and compared with traditional models of geostatistics.
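For illustration only, the following is a minimal sketch of the kind of SVM regression used for mapping a hidden, nonlinear, noisy environmental variable in a geo-feature space built from coordinates plus terrain-derived features. It assumes scikit-learn and uses synthetic placeholder data; the feature set and parameters are illustrative, not those of the study.

```python
# Hypothetical sketch: SVM regression (RBF kernel) for spatial mapping in a
# geo-feature space of coordinates plus DEM-derived features. Data are synthetic.
import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 500
X = np.column_stack([
    rng.uniform(0, 100, n),     # x coordinate (km)
    rng.uniform(0, 100, n),     # y coordinate (km)
    rng.uniform(200, 2000, n),  # elevation (m), e.g. from a digital elevation model
    rng.uniform(0, 45, n),      # slope (degrees)
])
# Hidden, nonlinear, noisy environmental response (placeholder)
y = np.sin(X[:, 0] / 15) + 0.001 * X[:, 2] + rng.normal(0, 0.2, n)

model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.1))
scores = cross_val_score(model, X, y, cv=5, scoring="r2")
print("cross-validated R^2:", round(scores.mean(), 3))
```

In a real comparison the same pipeline would be refit on all data and evaluated against geostatistical baselines such as kriging.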
Abstract:
Sexual reproduction is a fundamental aspect of life. Sex-determination mechanisms are responsible for the sexual fate and development of sexual characteristics in an organism, be it a unicellular alga, a plant, or an animal. Surprisingly, sex-determination mechanisms are not evolutionarily conserved but are bewilderingly diverse and appear to have had rapid turnover rates during evolution. Evolutionary biologists continue to seek a solution to this conundrum. What drives the surprising dynamics of such a fundamental process that always leads to the same outcome: two sex types, male and female? The answer is complex but the ongoing genomic revolution has already greatly increased our knowledge of sex-determination systems and sex chromosomes in recent years. This novel book presents and synthesizes our current understanding, and clearly shows that sex-determination evolution will remain a dynamic field of future research. The Evolution of Sex Determination is an advanced, research level text suitable for graduate students and researchers in genetics, developmental biology, and evolution.
Abstract:
In Switzerland, as in most industrialized countries, work-related stress and the exhaustion it produces have become an ever-increasing reality in recent decades. Since the middle of the last century, various scientific disciplines have tried to account for the difficulties individuals encounter at work, with a marked predominance of cause-and-effect analyses. For this doctoral research, we studied the case of a regional unemployment office, but from a noticeably different perspective. The psychodynamic framework used gives access to the meaning of work situations and opens up an original understanding of the mechanisms behind mental health problems at work. This approach makes it possible to understand the complex relationship individuals maintain with their work as it is structured and organized, and to analyze their experience in terms of pleasure, suffering, defenses against suffering and repercussions on health. To this end, we used a methodology based on collective interviews in order to encourage workers to express themselves freely. The inquiry took place in two stages: a first series of group interviews allowed the collection of empirical data, and a second series, called restitution interviews, gave participants the opportunity to react to the researcher's interpretation of their words and to validate the analysis.
Our results show that work, as organized within this public-service institution, appears considerably pathogenic, but is fortunately counterbalanced by the structuring power of the helping relationship with the claimants. They also show that the main sources of suffering in the participants' subjective experience of work are the unpleasant perception of a lack of recognition, of autonomy and of power over their own actions.
Abstract:
A cause-and-effect relationship between arterial hypertension and decline of cognitive function has long been suspected. Indeed, in middle-aged subjects, abnormally high blood pressure is a risk factor for the long-term development of dementia. At present, it seems crucial to treat hypertensive patients in order to better protect them against cognitive decline. However, in elderly patients the risk of mental deterioration may also be increased when diastolic pressure becomes too low, for example below 70 mmHg. Further studies are required to better define the antihypertensive drug regimen and target blood pressure that would be optimal for the prevention of cerebral small vessel disease.
Genetic basis of adaptation in Arabidopsis thaliana: local adaptation at the seed dormancy QTL DOG1.
Abstract:
Local adaptation provides an opportunity to study the genetic basis of adaptation and to investigate the allelic architecture of adaptive genes. We study delay of germination 1 (DOG1), a gene controlling natural variation in seed dormancy in Arabidopsis thaliana, and investigate the evolution of dormancy in 41 populations distributed across four regions separated by natural barriers. Using FST and QST comparisons, we compare variation at DOG1 with neutral markers and with quantitative variation in seed dormancy. Patterns of genetic differentiation among populations suggest that DOG1 contributes to local adaptation. Although QST for seed dormancy is not different from FST for neutral markers, a correlation with variation in summer precipitation supports the view that seed dormancy is adaptive. We characterize dormancy variation in several F2 populations and show that a series of functionally distinct alleles segregate at the DOG1 locus. Theoretical models have shown that the number and effect of alleles segregating at quantitative trait loci (QTL) have important consequences for adaptation. Our results provide support for models postulating a large number of alleles at quantitative trait loci involved in adaptation.
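As a worked illustration of the QST-FST comparison mentioned above, here is a minimal sketch using the standard textbook definitions; all numbers are toy values, not data from the study.

```python
# Hypothetical QST-FST comparison with the usual definitions:
#   QST = V_between / (V_between + 2 * V_within)   (quantitative trait, e.g. dormancy)
#   FST = Var(p) / (p_bar * (1 - p_bar))           (neutral marker allele frequencies)
# Toy values only.
import numpy as np

# Between- and within-population additive variance components for seed dormancy
v_between, v_within = 0.8, 1.5
qst = v_between / (v_between + 2 * v_within)

# Allele frequencies of one neutral marker across populations
p = np.array([0.10, 0.35, 0.55, 0.80])
p_bar = p.mean()
fst = p.var(ddof=0) / (p_bar * (1 - p_bar))

print(f"QST = {qst:.2f}, FST = {fst:.2f}")
# QST similar to FST is consistent with drift alone; QST much larger than FST
# would point to divergent selection on the trait.
```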
Abstract:
Our work is concerned with user modelling in open environments. Our proposal is in the line of contributions to advances in user modelling in open environments through Agent Technology, in what has been called the Smart User Model (SUM). Our research contains a holistic study of user modelling across several research areas related to users. We have developed a conceptualization of user modelling by means of examples from a broad range of research areas, with the aim of improving our understanding of user modelling and its role in the next generation of open and distributed service environments. This report is organized as follows. In chapter 1 we introduce our motivation and objectives. Chapters 2, 3, 4 and 5 provide the state of the art on user modelling: in chapter 2 we give the main definitions of the elements described in the report; in chapter 3 we present a historical perspective on user models; in chapter 4 we review user models from the perspective of different research areas, with special emphasis on the give-and-take relationship between Agent Technology and user modelling; and in chapter 5 we describe the main challenges that, from our point of view, need to be tackled by researchers wanting to contribute to advances in user modelling. The study of the state of the art is followed by exploratory work in chapter 6, where we define a SUM and a methodology to deal with it, and present some case studies to illustrate the methodology. Finally, we present the thesis proposal for continuing the work, together with the corresponding work schedule and timeline.
Abstract:
Poor understanding of the spliceosomal mechanisms to select intronic 3' ends (3'ss) is a major obstacle to deciphering eukaryotic genomes. Here, we discern the rules for global 3'ss selection in yeast. We show that, in contrast to the uniformity of yeast splicing, the spliceosome uses all available 3'ss within a distance window from the intronic branch site (BS), and that in 70% of all possible 3'ss this is likely to be mediated by pre-mRNA structures. Our results reveal that one of these RNA folds acts as an RNA thermosensor, modulating alternative splicing in response to heat shock by controlling alternate 3'ss availability. Thus, our data point to a deeper role for the pre-mRNA in the control of its own fate, and to a simple mechanism for some alternative splicing.
Abstract:
Meta-analysis of genome-wide association studies (GWASs) has led to the discoveries of many common variants associated with complex human diseases. There is a growing recognition that identifying "causal" rare variants also requires large-scale meta-analysis. The fact that association tests with rare variants are performed at the gene level rather than at the variant level poses unprecedented challenges in the meta-analysis. First, different studies may adopt different gene-level tests, so the results are not compatible. Second, gene-level tests require multivariate statistics (i.e., components of the test statistic and their covariance matrix), which are difficult to obtain. To overcome these challenges, we propose to perform gene-level tests for rare variants by combining the results of single-variant analysis (i.e., p values of association tests and effect estimates) from participating studies. This simple strategy is possible because of an insight that multivariate statistics can be recovered from single-variant statistics, together with the correlation matrix of the single-variant test statistics, which can be estimated from one of the participating studies or from a publicly available database. We show both theoretically and numerically that the proposed meta-analysis approach provides accurate control of the type I error and is as powerful as joint analysis of individual participant data. This approach accommodates any disease phenotype and any study design and produces all commonly used gene-level tests. An application to the GWAS summary results of the Genetic Investigation of ANthropometric Traits (GIANT) consortium reveals rare and low-frequency variants associated with human height. The relevant software is freely available.
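To make the central insight concrete, here is a minimal hedged sketch of how a gene-level burden-style statistic can be recovered from single-variant summary results plus the correlation matrix of the single-variant test statistics. The z-scores, weights and correlation matrix are toy values; this is not the consortium's actual software or data.

```python
# Hypothetical sketch: gene-level burden test reconstructed from single-variant
# summary statistics and their correlation matrix (LD), as described in the text.
import numpy as np
from scipy import stats

# Per-variant signed z-scores for one gene, derived from meta-analysed
# p values and effect directions.
z = np.array([1.8, -0.4, 2.1, 0.9])

# Correlation matrix of the single-variant test statistics, estimable from one
# participating study or a publicly available reference panel.
R = np.array([
    [1.0, 0.2, 0.1, 0.0],
    [0.2, 1.0, 0.3, 0.1],
    [0.1, 0.3, 1.0, 0.2],
    [0.0, 0.1, 0.2, 1.0],
])

w = np.ones_like(z)                      # burden weights (equal weights here)
z_gene = w @ z / np.sqrt(w @ R @ w)      # gene-level burden z-score
p_gene = 2 * stats.norm.sf(abs(z_gene))  # two-sided gene-level p value
print(f"gene-level z = {z_gene:.2f}, p = {p_gene:.3f}")
```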
Abstract:
This work, titled "The Implementation of ABC (Activity-Based Costing) in a company that provides health services", was carried out as a requirement for obtaining a degree in Accounting and Administration. Its main objective is the implementation of the ABC method in a small or medium-sized enterprise providing health services, as an instrument to support management decision making. To introduce management accounting in a company, it is necessary to choose a costing method/system that reflects the reality of the company, and in this sense ABC is the ideal method for determining results without distortion. ABC (Activity-Based Costing) determines results through cause-and-effect relationships, on the premise that activities generate costs and cost objects consume activities. It is applicable both in industrial companies and in service providers, although it was initially designed for industrial companies, that is, for large companies, because of the considerable financial and human resources as well as the time required for its implementation. However, the matrix model presented by Roztcki et al. (1999) allows the application of this method in SMEs with limited financial resources and time, using an Excel spreadsheet. This is the model proposed here, and it can be implemented in the clinic. The model was tested in a case study carried out in a private clinic. During testing, some difficulties and limitations were detected; the main difficulties encountered were the identification of the activities and of the cost drivers, owing to the complexity of the sector. The implementation was completed successfully, providing detailed cost information for the products and services delivered throughout the clinic.
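As a hedged illustration of the matrix-based ABC calculation in the spirit of the spreadsheet model the text attributes to Roztcki et al. (1999), the sketch below allocates resource costs to activities and then activity costs to services via two dependence matrices. All figures, activities and service names are illustrative, not the clinic's data.

```python
# Hypothetical matrix ABC sketch: resource costs -> activity costs -> service costs.
import numpy as np

resource_costs = np.array([50_000.0, 20_000.0, 12_000.0])  # e.g. staff, rent, consumables

# Resource-to-activity dependence matrix (rows: activities, cols: resources);
# each column gives the share of that resource consumed by each activity.
R2A = np.array([
    [0.6, 0.3, 0.5],   # consultations
    [0.3, 0.4, 0.3],   # diagnostic exams
    [0.1, 0.3, 0.2],   # administration
])
activity_costs = R2A @ resource_costs

# Activity-to-service dependence matrix (rows: services, cols: activities),
# derived from cost drivers such as number of consultations or exams.
A2S = np.array([
    [0.7, 0.2, 0.5],   # general practice service
    [0.3, 0.8, 0.5],   # imaging service
])
service_costs = A2S @ activity_costs
print(dict(zip(["general practice", "imaging"], service_costs.round(2))))
```

The same two matrix products can be kept in an Excel sheet, which is what makes the approach feasible for an SME with limited resources.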
Abstract:
Recent reports have indicated that 23.5% of the nation's highway bridges are structurally deficient and 17.7% are functionally obsolete. A significant number of these bridges are on the Iowa secondary road system, where over 86% of the rural bridge management responsibilities are assigned to the counties. Some of the bridges can be strengthened or otherwise rehabilitated, but many more are in need of immediate replacement. In a recent investigation (HR-365, "Evaluation of Bridge Replacement Alternatives for the County Bridge System"), several types of replacement bridges that are currently being used on low-volume roads were identified. It was also determined that a large number of counties (69%) have the ability and are interested in utilizing their own forces to design and construct short-span bridges. In reviewing the results from HR-365, the research team developed one "new" bridge replacement concept and a modification of a replacement system currently in use. Both of these bridge replacement alternatives were investigated in this study, the results of which are presented in two volumes. This volume (Volume 1) presents the results for Concept 1, Steel Beam Precast Units; Concept 2, Modification of the Beam-in-Slab Bridge, is presented in Volume 2. Concept 1 involves the fabrication of precast units (two steel beams connected by a concrete slab) by county work forces. Deck thickness is limited so that the units can be fabricated at one site and then transported to the bridge site, where they are connected and the remaining portion of the deck is placed. Since the Concept 1 bridge is primarily intended for use on low-volume roads, the precast units can be constructed with new or used beams. In the experimental part of the investigation, there were three types of static load tests: small-scale connector tests, "handling strength" tests, and service and overload tests of a model bridge. Three finite element models for analyzing the bridge in various states of construction were also developed. Small-scale connector tests were completed to determine the best method of connecting the precast double-T (PCDT) units. "Handling strength" tests on an individual PCDT unit were performed to determine the strength and behavior of the precast unit in this configuration. The majority of the testing was completed on the model bridge [L = 9,750 mm (32 ft), W = 6,400 mm (21 ft)], which was fabricated using the precast units developed. Some of the variables investigated in the model bridge tests were the number of connectors required to connect adjacent precast units, the contribution of diaphragms to load distribution, the influence of the position of diaphragms on bridge strength and load distribution, and the effect of the cast-in-place portion of the deck on load distribution. In addition to the service load tests, the bridge was also subjected to overload conditions. Using the finite element models developed, one can predict the behavior and strength of bridges similar to the laboratory model as well as design them. Concept 1 has successfully passed all laboratory testing; the next step is to field test it.
Abstract:
MVA is a candidate vector for vaccination against pathogens and tumors. Little is known about its behaviour in mucosal tissues. We investigated the fate and biosafety of MVA when inoculated by different routes in C57BL/6 mice. Intranasal inoculation targeted the virus to the nasal-associated lymphoid tissue and the lungs, whereas systemic inoculation led to distribution of MVA in almost all lymphoid organs, the lungs and the ovaries. Intravaginal, intrarectal and intragastric inoculations failed to induce efficient infection. After 48 h, virus was no longer detectable in the organs analyzed. Upon intranasal inoculation, no inflammatory reactions were detected in the central nervous system or in the upper and lower airways. These results show the tropism of MVA and indicate that high doses of recombinant MVA are safe when administered nasally, a vaccination route known to elicit strong cellular and humoral immune responses in the female genital tract.
Abstract:
The problems of laboratory compaction procedures, the effect of gradation and mineralogy on shearing strength, and the effect of stabilizing agents on the shearing strength of granular base course mixes are discussed. For the materials tested, a suitable laboratory compaction procedure was developed which involves the use of a vibratory table to prepare triaxial test specimens. A computer program has been developed to facilitate the analysis of the test data on the effect of gradation and mineralogy on the shearing strength of soils. The following materials have been selected for evaluation as stabilizing agents: portland cement, sodium and calcium chloride, lime, organic cationic waterproofer, and asphaltic materials.
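The abstract does not specify the analysis performed by the computer program; purely as a hedged illustration, the sketch below fits a Mohr-Coulomb strength envelope (cohesion c and friction angle phi) to hypothetical triaxial test results, which is a standard way of summarizing the shearing strength of such mixes.

```python
# Hypothetical sketch: fit a Mohr-Coulomb envelope to triaxial test results.
# For each specimen the confining pressure and deviator stress at failure are
# recorded; shear strength is summarized by cohesion c and friction angle phi.
# All values are illustrative, not data from the HR-99 study.
import numpy as np

sigma3 = np.array([50.0, 100.0, 200.0, 400.0])       # confining pressures (kPa)
deviator = np.array([180.0, 310.0, 560.0, 1050.0])   # deviator stress at failure (kPa)
sigma1 = sigma3 + deviator                            # major principal stress at failure

# Centre (p) and radius (q) of each Mohr circle at failure
p = (sigma1 + sigma3) / 2.0
q = (sigma1 - sigma3) / 2.0

# Fit the failure line q = a + p * tan(alpha), then convert to c and phi:
# sin(phi) = tan(alpha), c = a / cos(phi)
tan_alpha, a = np.polyfit(p, q, 1)
phi = np.degrees(np.arcsin(tan_alpha))
c = a / np.cos(np.radians(phi))
print(f"friction angle phi = {phi:.1f} deg, cohesion c = {c:.1f} kPa")
```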
Abstract:
The problems of laboratory compaction procedures, the effect of gradation and mineralogy on shearing strength, and the effect of stabilizing agents on the shearing strength of granular base course mixes are discussed. For the materials tested, a suitable laboratory compaction procedure was developed which involves the use of a vibratory table to prepare triaxial test specimens. A computer program has been developed to facilitate the analysis of the test data on the effect of gradation and mineralogy on the shearing strength of soils. The following materials have been selected for evaluation as stabilizing agents: portland cement, sodium and calcium chloride, lime, organic cationic waterproofer, and asphaltic materials.
Factors Influencing Stability of Granular Base Course Mixes, Progress Report, HR-99, 1964 (November)
Abstract:
The problems of laboratory compaction procedures, the effect of gradation and mineralogy on shearing strength, and the effect of stabilizing agents on the shearing strength of granular base course mixes are discussed. For the materials tested, a suitable laboratory compaction procedure was developed which involves the use of a vibratory table to prepare triaxial test specimens. A computer program has been developed to facilitate the analysis of the test data on the effect of gradation and mineralogy on the shearing strength of soils. The following materials have been selected for evaluation as stabilizing agents: portland cement, sodium and calcium chloride, lime, organic cationic waterproofer, and asphaltic materials.
Abstract:
Risk maps summarizing landscape suitability of novel areas for invading species can be valuable tools for preventing species' invasions or controlling their spread, but methods employed for development of such maps remain variable and unstandardized. We discuss several considerations in development of such models, including types of distributional information that should be used, the nature of explanatory variables that should be incorporated, and caveats regarding model testing and evaluation. We highlight that, in the case of invasive species, such distributional predictions should aim to derive the best hypothesis of the potential distribution of the species by using (1) all distributional information available, including information from both the native range and other invaded regions; (2) predictors linked as directly as is feasible to the physiological requirements of the species; and (3) modelling procedures that carefully avoid overfitting to the training data. Finally, model testing and evaluation should focus on well-predicted presences, and less on efficient prediction of absences; a k-fold regional cross-validation test is discussed.
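For illustration only, the following is a minimal sketch of the k-fold regional cross-validation idea: occurrence records are grouped by region and each fold holds out one region, so the model is evaluated on spatially independent presences. The data, regions, classifier choice and metric are illustrative assumptions, not the paper's setup.

```python
# Hypothetical sketch: regional k-fold cross-validation of a distribution model,
# evaluated on how well presences are predicted (sensitivity), not absences.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GroupKFold
from sklearn.metrics import recall_score

rng = np.random.default_rng(1)
n = 400
X = rng.normal(size=(n, 3))                     # climate predictors (e.g. temperature, precipitation)
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 0.5, n) > 0).astype(int)  # presence/absence
region = rng.integers(0, 4, n)                  # native range plus invaded regions

cv = GroupKFold(n_splits=4)
sensitivities = []
for train, test in cv.split(X, y, groups=region):
    clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X[train], y[train])
    # Emphasis on well-predicted presences, as the text recommends.
    sensitivities.append(recall_score(y[test], clf.predict(X[test])))
print("per-region sensitivity:", np.round(sensitivities, 2))
```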