58 results for Competency-Based Approach

at Université de Lausanne, Switzerland


Relevance: 100.00%

Abstract:

Habitat destruction and fragmentation are known to strongly affect dispersal by altering the quality of the environment between populations. As a consequence, lower landscape connectivity is expected to enhance extinction risks through a decrease in gene flow and the resulting negative effects of genetic drift, accumulation of deleterious mutations and inbreeding depression. Such phenomena are particularly harmful for amphibian species, which are characterized by disjunct breeding habitats. Because the dispersal behaviour of amphibians is poorly understood, it is crucial to develop new tools that allow us to determine the influence of landscape connectivity on the persistence of populations. In this study, we developed a new landscape genetics approach that aims to identify land-uses affecting genetic differentiation, without a priori assumptions about associated ecological costs. We surveyed genetic variation at seven microsatellite loci for 19 Alpine newt (Mesotriton alpestris) populations in western Switzerland. Using strips of varying widths that define a dispersal corridor between pairs of populations, we were able to identify land-uses that act as dispersal barriers (i.e. urban areas) and corridors (i.e. forests). Our results suggest that habitat destruction and landscape fragmentation might in the near future affect common species such as M. alpestris. In addition, by identifying relevant landscape variables influencing population structure without unrealistic assumptions about dispersal, our method offers a simple and flexible investigative tool as an alternative to least-cost models and other approaches.
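
A minimal sketch of the corridor-strip idea described above (an illustration, not the authors' implementation): buffer the straight line between two populations into a strip of a given width, compute the proportion of that strip covered by a land-use class, and correlate those proportions with pairwise genetic differentiation such as FST. The function names and the use of shapely are assumptions.

```python
import numpy as np
from shapely.geometry import LineString
from shapely.ops import unary_union

def landuse_share_in_strip(pop_a, pop_b, landuse_polygons, width):
    """Fraction of the corridor strip between two populations covered by one land-use class."""
    strip = LineString([pop_a, pop_b]).buffer(width / 2.0)        # corridor of the chosen width
    covered = unary_union(landuse_polygons).intersection(strip)   # land-use area inside the strip
    return covered.area / strip.area

def correlate_with_differentiation(shares, fst):
    """Pearson correlation between land-use share and pairwise genetic differentiation."""
    return np.corrcoef(np.asarray(shares), np.asarray(fst))[0, 1]
```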

Relevance: 100.00%

Abstract:

Methods like Event History Analysis can show the existence of diffusion and part of its nature, but do not study the process itself. Nowadays, thanks to the increasing performance of computers, processes can be studied using computational modeling. This thesis presents an agent-based model of policy diffusion inspired mainly by the model developed by Braun and Gilardi (2006). I first develop a theoretical framework of policy diffusion that presents the main internal drivers of policy diffusion - such as the preference for the policy, the effectiveness of the policy, the institutional constraints, and the ideology - and its main mechanisms, namely learning, competition, emulation, and coercion. Diffusion, expressed through these interdependencies, is thus a complex process that needs to be studied with computational agent-based modeling. In a second step, computational agent-based modeling is defined along with its most significant concepts: complexity and emergence. Using computational agent-based modeling implies the development of an algorithm and its programming. Once the algorithm has been programmed, the different agents are allowed to interact. Consequently, a phenomenon of diffusion, derived from learning, emerges, meaning that the choice made by an agent is conditional on the choices made by its neighbors. As a result, learning follows an inverted S-curve, which leads to partial convergence - global divergence and local convergence - that triggers the emergence of political clusters, i.e. the creation of regions with the same policy. Furthermore, the average effectiveness in this computational world tends to follow a J-shaped curve, meaning that a policy not only needs time to deploy its effects, but that it also takes time for a country to find the best-suited policy. To conclude, diffusion is an emergent phenomenon arising from complex interactions, and the outcomes of my model are in line with the theoretical expectations and the empirical evidence.
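
A minimal, self-contained sketch of the learning mechanism described above (an illustration under simplifying assumptions, not the thesis model): agents on a ring adopt a policy with a probability that increases with the share of adopting neighbours, which is enough to reproduce an S-shaped adoption curve and local clusters of identical choices.

```python
import random

def simulate(n_agents=200, n_steps=60, base_p=0.01, social_weight=0.5, seed=1):
    random.seed(seed)
    adopted = [False] * n_agents
    adopted[0] = True                      # a single initial adopter
    curve = []
    for _ in range(n_steps):
        new = adopted[:]
        for i in range(n_agents):
            if adopted[i]:
                continue
            neighbours = [adopted[(i - 1) % n_agents], adopted[(i + 1) % n_agents]]
            p = base_p + social_weight * (sum(neighbours) / len(neighbours))
            if random.random() < p:        # learning from neighbours
                new[i] = True
        adopted = new
        curve.append(sum(adopted) / n_agents)
    return curve                           # fraction of adopters per step

print(simulate()[::10])                    # adoption rises along an S-shaped path
```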

Relevance: 100.00%

Abstract:

Adolescence is a time of great change and, therefore, potentially of great vulnerability. Thus, physical and psychological changes induced by pubertal processes are fertile ground for the emergence of an eating disorder (ED). Family therapy according to Maudsley, or "family based treatment" (FBT), has emerged in parallel with neurobiological advances confirming a multifactorial origin of eating disorders. This therapy places parents at the centre of care for adolescents with EDs. Its great asset is the evidence-based approach underpinning the therapy.

Relevance: 100.00%

Abstract:

BACKGROUND: Profiling sperm DNA present on vaginal swabs taken from rape victims often contributes to identifying and incarcerating rapists. Large amounts of the victim's epithelial cells contaminate the sperm present on swabs, however, and complicate this process. The standard method for obtaining relatively pure sperm DNA from a vaginal swab is to digest the epithelial cells with Proteinase K in order to solubilize the victim's DNA, and to then physically separate the soluble DNA from the intact sperm by pelleting the sperm, removing the victim's fraction, and repeatedly washing the sperm pellet. An alternative approach that does not require washing steps is to digest with Proteinase K, pellet the sperm, remove the victim's fraction, and then digest the residual victim's DNA with a nuclease. METHODS: The nuclease approach has been commercialized in a product, the Erase Sperm Isolation Kit (PTC Labs, Columbia, MO, USA), and five crime laboratories have tested it on semen-spiked female buccal swabs in a direct comparison with their standard methods. Comparisons have also been performed on timed post-coital vaginal swabs and evidence collected from sexual assault cases. RESULTS: For the semen-spiked buccal swabs, Erase outperformed the standard methods in all five laboratories and in most cases was able to provide a clean male profile from buccal swabs spiked with only 1,500 sperm. The vaginal swabs taken after consensual sex and the evidence collected from rape victims showed a similar pattern, with Erase providing superior profiles. CONCLUSIONS: In all samples tested, STR profiles of the male DNA fractions obtained with Erase were as good as or better than those obtained using the standard methods.

Relevance: 100.00%

Abstract:

Measurement of total energy expenditure may be crucial to understanding the relation between physical activity and disease and to framing public health interventions. To devise a self-administered physical activity frequency questionnaire (PAFQ), the following data-based approach was used. A 24-hour recall was administered to a random sample of 919 adult residents of Geneva, Switzerland. The data obtained were used to establish the list of activities (and their median duration) that contributed to 95% of the energy expended, separately for men and women. Activities that were trivial for the whole sample but that contributed to ≥10% of an individual's energy expenditure were also selected. The final PAFQ lists 70 activities or groups of activities with their typical duration. About 20 minutes are required for respondents to indicate the number of days and the number of hours per day that they performed each activity. The PAFQ was validated against a heart rate monitor, a more objective method. The total energy estimated by the PAFQ in 41 volunteers correlated well (r = 0.76) with estimates using a heart rate monitor. The authors conclude that their self-administered physical activity frequency questionnaire, designed from 24-hour recall data, appears to estimate energy expenditure accurately.
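
As a rough illustration of how such a questionnaire translates into an energy estimate (the MET values and activity names below are placeholders, not the PAFQ's actual coefficients), total expenditure can be approximated as MET × body mass × reported hours, summed over activities:

```python
# Illustrative MET values (kcal per kg per hour); not taken from the PAFQ.
MET = {"walking": 3.5, "cycling": 6.0, "office work": 1.5, "sleeping": 0.95}

def weekly_energy_kcal(reports, body_mass_kg):
    """reports: {activity: (days_per_week, hours_per_day)} as a respondent would declare."""
    total = 0.0
    for activity, (days, hours) in reports.items():
        total += MET[activity] * body_mass_kg * days * hours
    return total

print(weekly_energy_kcal({"walking": (5, 0.5), "cycling": (2, 1.0)}, 70))
```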

Relevance: 100.00%

Abstract:

The high complexity of cortical convolutions in humans is very challenging: engineers struggle to measure and compare it, and biologists and physicians to understand it. In this paper, we propose a surface-based method for the quantification of cortical gyrification. Our method uses accurate 3-D cortical reconstruction and computes local measurements of gyrification at thousands of points over the whole cortical surface. The potential of our method to identify and precisely localize gyral abnormalities is illustrated by a clinical study on a group of children affected by 22q11 Deletion Syndrome, compared to control individuals.
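
One common way to express such a local measurement, shown here only as an assumed illustration rather than the paper's exact algorithm, is the ratio of cortical (pial) surface area to outer-hull area contained within a sphere centred on a point of the hull:

```python
import numpy as np

def local_gyrification(center, radius, pial_centroids, pial_areas,
                       hull_centroids, hull_areas):
    """Ratio of cortical surface area to outer-hull area near `center`.

    *_centroids: (N, 3) arrays of triangle centroids; *_areas: (N,) triangle areas.
    """
    in_pial = np.linalg.norm(pial_centroids - center, axis=1) <= radius
    in_hull = np.linalg.norm(hull_centroids - center, axis=1) <= radius
    return pial_areas[in_pial].sum() / hull_areas[in_hull].sum()
```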

Relevance: 100.00%

Abstract:

DNA is nowadays swabbed routinely to investigate serious and volume crimes, but research remains scarce when it comes to determining the criteria that may impact the success rate of DNA swabs taken from different surfaces and in different situations. To investigate these criteria in fully operational conditions, the DNA analysis results of 4772 swabs taken by the forensic unit of a police department in Western Switzerland over a 2.5-year period (2012-2014) in volume crime cases were considered. A representative and random sample of 1236 swab analyses was extensively examined and codified, describing several criteria such as whether the swabbing was performed at the scene or in the lab, the zone of the scene where it was performed, the kind of object or surface that was swabbed, whether the target specimen was a touch surface or a biological fluid, and whether the swab targeted a single surface or combined different surfaces. The impact of each criterion and of their combination was assessed with regard to the success rate of DNA analysis, measured through the quality of the resulting profile and whether the profile resulted in a hit in the national database or not. Results show that some situations - such as swabs taken on door and window handles - have a higher success rate than the average swab. Conversely, other situations lead to a marked decrease in the success rate, which should discourage further analyses of such swabs. Results also confirm that targeting a DNA swab on a single surface is preferable to swabbing different surfaces with the intent of aggregating cells deposited by the offender. Such results assist in predicting the chance that the analysis of a swab taken in a given situation will lead to a positive result. The study could therefore inform an evidence-based approach to decision-making at the crime scene (what to swab or not) and at the triage step (what to analyse or not), thus helping to save resources and increase the efficiency of forensic science efforts.
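
The kind of tabulation behind these success rates can be sketched as a simple group-by over the coded swabs (column names and values below are illustrative assumptions, not the study's coding scheme):

```python
import pandas as pd

swabs = pd.DataFrame({
    "surface": ["door handle", "window handle", "steering wheel", "door handle"],
    "specimen": ["touch", "touch", "touch", "blood"],
    "profile_ok": [True, True, False, True],       # interpretable profile obtained
    "database_hit": [True, False, False, True],    # hit in the national database
})

# Success rates per swabbed surface (share of interpretable profiles and of hits).
success_by_surface = swabs.groupby("surface")[["profile_ok", "database_hit"]].mean()
print(success_by_surface)
```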

Relevance: 100.00%

Abstract:

AIM: The aim of this study was to evaluate a new pedagogical approach to teaching fluid, electrolyte and acid-base pathophysiology to undergraduate students. METHODS: This approach comprises traditional lectures, the study of clinical cases on the web and a final interactive discussion of these cases in the classroom. When on the web, the students are asked to select the laboratory tests that seem most appropriate for understanding the pathophysiological condition underlying the clinical case. The percentage of students having chosen a given test is made available to the teacher, who uses it in an interactive session to stimulate discussion with the whole class. The same teacher used the same case studies for 2 consecutive years during the third year of the curriculum. RESULTS: The majority of students answered the questions on the web as requested and evaluated their experience with this form of teaching and learning positively. CONCLUSIONS: Complementing traditional lectures with online case-based studies and interactive group discussions therefore represents a simple means of promoting the learning and understanding of complex pathophysiological mechanisms. This simple problem-based approach to teaching and learning may be implemented to cover all fields of medicine.

Relevance: 100.00%

Abstract:

Aim: Conservation strategies are in need of predictions that capture spatial community composition and structure. Currently, the methods used to generate these predictions generally focus on deterministic processes and omit important stochastic processes and other unexplained variation in model outputs. Here we test a novel approach to community models that accounts for this variation and determine how well it reproduces observed properties of alpine butterfly communities. Location: The western Swiss Alps. Methods: We propose a new approach to processing probabilistic predictions derived from stacked species distribution models (S-SDMs) in order to predict and assess the uncertainty in the predictions of community properties. We test the utility of our novel approach against a traditional threshold-based approach. We used mountain butterfly communities spanning a large elevation gradient as a case study and evaluated the ability of our approach to model species richness and phylogenetic diversity of communities. Results: S-SDMs reproduced the observed decrease in phylogenetic diversity and species richness with elevation, syndromes of environmental filtering. The prediction accuracy of community properties varies along the environmental gradient: variability in predictions of species richness was higher at low elevation, while it was lower for phylogenetic diversity. Our approach allowed us to map the variability in species richness and phylogenetic diversity projections. Main conclusion: Using our probabilistic approach to process species distribution model outputs to reconstruct communities furnishes an improved picture of the range of possible assemblage realisations under similar environmental conditions given stochastic processes, and helps inform managers of the uncertainty in the modelling results.
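
A small sketch of the contrast between threshold-based and probabilistic stacking at a single site (illustrative numbers, not the authors' data): binary stacking counts species whose predicted probability exceeds a cut-off, whereas the probabilistic approach keeps the probabilities, giving an expected richness and a spread over possible assemblage realisations.

```python
import numpy as np

p = np.array([0.9, 0.6, 0.35, 0.1, 0.05])      # per-species occurrence probabilities at one site

richness_threshold = int((p >= 0.5).sum())      # classic binary stacking with a 0.5 cut-off
richness_expected = p.sum()                     # probabilistic expectation of richness

rng = np.random.default_rng(0)
draws = rng.random((10_000, p.size)) < p        # Bernoulli realisations of the assemblage
richness_spread = draws.sum(axis=1)             # distribution of possible richness values

print(richness_threshold, richness_expected, richness_spread.std())
```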

Relevance: 100.00%

Abstract:

BACKGROUND: Clinical guidelines are essential in implementing and maintaining nationwide stage-specific diagnostic and therapeutic standards. In 2011, the first German expert consensus guideline defined the evidence for diagnosis and treatment of early and locally advanced esophagogastric cancers. Here, we compare this guideline with other national guidelines as well as current literature. METHODS: The German S3-guideline used an approved development process with de novo literature research, international guideline adaptation, or good clinical practice. Other recent evidence-based national guidelines and current references were compared with German recommendations. RESULTS: In the German S3 and other Western guidelines, adenocarcinomas of the esophagogastric junction (AEG) are classified according to formerly defined AEG I-III subgroups due to the high surgical impact. To stage local disease, computed tomography of the chest and abdomen and endosonography are reinforced. In contrast, laparoscopy is optional for staging. Mucosal cancers (T1a) should be endoscopically resected "en-bloc" to allow complete histological evaluation of lateral and basal margins. For locally advanced cancers of the stomach or esophagogastric junction (≥T3N+), preferred treatment is preoperative and postoperative chemotherapy. Preoperative radiochemotherapy is an evidence-based alternative for large AEG type I-II tumors (≥T3N+). Additionally, some experts recommend treating T2 tumors with a similar approach, mainly because pretherapeutic staging is often considered to be unreliable. CONCLUSIONS: The German S3 guideline represents an up-to-date European position with regard to diagnosis, staging, and treatment recommendations for patients with locally advanced esophagogastric cancer. Effects of perioperative chemotherapy versus chemoradiotherapy are still to be investigated for adenocarcinoma of the cardia and the lower esophagus.

Relevance: 90.00%

Abstract:

In recent years, the potential of type-2 fuzzy sets for managing high levels of uncertainty in the subjective knowledge of experts or in numerical information has been explored mainly in control and pattern classification systems. One of the main challenges in designing a type-2 fuzzy logic system is how to estimate the parameters of the type-2 fuzzy membership function (T2MF) and the Footprint of Uncertainty (FOU) from imperfect and noisy datasets. This paper presents an automatic approach for learning and tuning Gaussian interval type-2 membership functions (IT2MFs) with application to multi-dimensional pattern classification problems. T2MFs and their FOUs are tuned according to the uncertainties in the training dataset by a combination of genetic algorithm (GA) and cross-validation techniques. In our GA-based approach, the structure of the chromosome has fewer genes than in other GA methods, and chromosome initialization is more precise. The proposed approach addresses the application of the interval type-2 fuzzy logic system (IT2FLS) to the problem of nodule classification in a lung Computer Aided Detection (CAD) system. The designed IT2FLS is compared with its type-1 fuzzy logic system (T1FLS) counterpart. The results demonstrate that the IT2FLS outperforms the T1FLS by more than 30% in terms of classification accuracy.
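
As background for the membership functions being tuned, here is a minimal sketch (with assumed parameter values) of a Gaussian interval type-2 membership function with an uncertain mean: the gap between the lower and upper membership curves is the footprint of uncertainty that the GA adjusts from the training data.

```python
import math

def gaussian(x, m, sigma):
    return math.exp(-0.5 * ((x - m) / sigma) ** 2)

def it2mf_uncertain_mean(x, m1, m2, sigma):
    """Return (lower, upper) membership grades for a Gaussian IT2MF with mean in [m1, m2]."""
    if x < m1:
        upper = gaussian(x, m1, sigma)
    elif x > m2:
        upper = gaussian(x, m2, sigma)
    else:
        upper = 1.0                                   # plateau between the two means
    lower = min(gaussian(x, m1, sigma), gaussian(x, m2, sigma))
    return lower, upper

print(it2mf_uncertain_mean(0.3, m1=0.4, m2=0.6, sigma=0.1))
```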

Relevance: 90.00%

Abstract:

The n-octanol/water partition coefficient (log Po/w) is a key physicochemical parameter for drug discovery, design, and development. Here, we present a physics-based approach that shows a strong linear correlation between the computed solvation free energy in implicit solvents and the experimental log Po/w on a cleansed data set of more than 17,500 molecules. After internal validation by five-fold cross-validation and data randomization, the predictive power of the most interesting multiple linear model, based solely on two GB/SA parameters, was tested on two different external sets of molecules. On the Martel druglike test set, the predictive power of the best model (N = 706, r = 0.64, MAE = 1.18, and RMSE = 1.40) is similar to that of six well-established empirical methods. On the 17-drug test set, our model outperformed all compared empirical methodologies (N = 17, r = 0.94, MAE = 0.38, and RMSE = 0.52). The physical basis of our original GB/SA approach, together with its predictive capacity, computational efficiency (1 to 2 s per molecule), and three-dimensional molecular graphics capability, lays the foundations for a promising predictor, the implicit log P method (iLOGP), to complement the portfolio of drug design tools developed and provided by the SIB Swiss Institute of Bioinformatics.
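
A hedged sketch of the type of two-descriptor linear model described above: fitting log Po/w against two computed solvation terms by least squares. The descriptor values and the resulting coefficients here are toy placeholders, not the published iLOGP parameters.

```python
import numpy as np

# Columns: two computed GB/SA solvation terms per molecule (illustrative values, kcal/mol).
X = np.array([[-9.2, 1.1], [-5.1, 0.7], [-12.4, 1.6], [-3.0, 0.4]])
y = np.array([1.2, 2.5, -0.3, 3.1])             # experimental log Po/w (toy values)

A = np.hstack([X, np.ones((len(X), 1))])         # add an intercept column
coef, *_ = np.linalg.lstsq(A, y, rcond=None)     # [a, b, intercept] of the linear model

predicted = A @ coef                             # predicted log Po/w for the training molecules
print(coef, predicted)
```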